Job Details
GCP Data Engineer - 6 Month Contract - Inside IR35 - Hybrid (Contract)
Location: Cardiff/Remote
Country: UK
Rate: Up to £550 Per Day (Inside IR35)
Are you ready to take your expertise in data engineering to the next level? Join us for a thrilling opportunity as a GCP Data Engineer, working on a 6-month contract (Inside IR35) with the potential for extension beyond 12 months. This role offers a chance to collaborate with a market-leading banking client, shaping the future of data management.
As a GCP Data Engineer, you'll be instrumental in designing, implementing, and optimizing data solutions on Google Cloud Platform (GCP). Your role will involve working onsite for 3 days per week at our Cardiff-based office, collaborating closely with our banking client to deliver exceptional results.
Key Responsibilities:
- Utilize expertise in batch, microbatch, event-based, and data streaming architectures to design solutions for replicating, transforming, and processing data effectively.
- Hands-on configuration of Kafka connectors for both batch processing and streaming, including source and sink connectors.
- Optimize the number of connectors to enhance performance and efficiency.
- Configure Kafka brokers to ensure optimal performance and reliability.
- Implement security measures and schema governance practices to maintain data integrity and compliance.
- Apply strong database modelling concepts and SQL skills to design and optimize database structures.
- Develop and execute SQL queries to extract, transform, and load data efficiently.
- Collaborate with cross-functional teams to gather requirements and translate them into technical solutions.
- Document architectural designs, configurations, and best practices for knowledge sharing and future reference.
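To give a flavour of the connector work described above: source and sink connectors are typically registered by POSTing a JSON configuration to the Kafka Connect REST API. The sketch below is illustrative only, assuming Confluent's JDBC sink connector; the connector name, topic, JDBC URL, and Connect host are placeholders, not details of this role.

```python
import json
from urllib import request


def build_sink_connector_config(name: str, topics: list, jdbc_url: str) -> dict:
    """Build a Kafka Connect sink connector config.

    Property names follow Confluent's JDBC sink connector; the
    topic list and JDBC URL are placeholder assumptions.
    """
    return {
        "name": name,
        "config": {
            "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
            "tasks.max": "2",            # parallelism across topic partitions
            "topics": ",".join(topics),
            "connection.url": jdbc_url,
            "insert.mode": "upsert",     # idempotent writes, safe to replay
            "pk.mode": "record_key",     # derive the primary key from the record key
        },
    }


def register_connector(connect_url: str, config: dict) -> None:
    """POST the config to the Connect REST API (POST /connectors)."""
    req = request.Request(
        f"{connect_url}/connectors",
        data=json.dumps(config).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    request.urlopen(req)  # raises on HTTP errors; add retries in real use


cfg = build_sink_connector_config(
    "orders-sink", ["orders"], "jdbc:postgresql://db:5432/warehouse"
)
```

Tuning `tasks.max` against the topic's partition count is one of the main levers when optimizing connector throughput.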
What You Will Ideally Bring:
- Proficiency in Google Cloud Platform (GCP) DataFlow, including Apache Beam, for building and executing data processing pipelines.
- Strong coding skills in Python for developing custom data processing logic and transformations within DataFlow pipelines.
- Experience with log-based Change Data Capture (CDC) using Confluent Kafka connectors for low-latency data replication.
- In-depth knowledge of Kafka broker configuration, including topics, partitions, replication, and optimization for performance and reliability.
- Proficiency in implementing security measures such as SSL/TLS encryption, SASL authentication, and ACL-based authorization to secure Kafka clusters.
- Hands-on experience with Confluent Control Center for monitoring and managing Kafka clusters, topics, and consumer groups.
- Proficiency in using tools such as GitHub for version control, Confluence for documentation, and Jenkins for continuous integration and deployment (CI/CD) processes.
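Several of the skills above concern securing Kafka clusters with SSL/TLS, SASL, and ACLs. As a minimal sketch, assuming a SASL/SCRAM-over-TLS cluster, the client side of that setup looks like the following; property names follow librdkafka (as used by confluent-kafka-python), while the broker address, credentials, and CA path are placeholders. Server-side ACLs granting this principal access would be managed separately.

```python
def secure_client_config(bootstrap: str, username: str, password: str,
                         ca_path: str) -> dict:
    """Client-side settings for a SASL/SCRAM-over-TLS Kafka cluster.

    Property names follow librdkafka; the broker address, service
    account credentials, and CA certificate path are placeholders.
    """
    return {
        "bootstrap.servers": bootstrap,
        "security.protocol": "SASL_SSL",    # TLS for transport encryption
        "sasl.mechanism": "SCRAM-SHA-512",  # authenticated identity for ACL checks
        "sasl.username": username,
        "sasl.password": password,
        "ssl.ca.location": ca_path,         # trust anchor for broker certificates
    }


conf = secure_client_config(
    "broker1:9093", "data-eng-svc", "change-me", "/etc/kafka/ca.pem"
)
```

The same dictionary can be passed unchanged to a confluent-kafka Producer or Consumer, which is why keeping it in one well-reviewed helper aids schema governance and compliance.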
Contract Details:
- Duration: 6 months
- Location: Cardiff (onsite 3 days per week, hybrid)
- Day Rate: Up to £550 Per Day (Inside IR35)
Posted Date: 13 May 2024
Reference: JSTP
Employment Business: Hamilton Barnes
Contact: Daniel Bennett