Role Description
Data Engineer - Contract Inside IR35
UST is recruiting a Data Engineer with strong technical ability who can communicate well with non-technical audiences, to join our UK team on a contract basis.
- Location: Leeds or London
- Duration: 3 months
- Contract: Inside IR35
About The Role
The successful candidate will engage with external Clients and internal customers, understand their needs, and design, build, and maintain data pipelines and infrastructure using Google Cloud Platform (GCP). This will involve the design and implementation of scalable data architectures, ETL processes, and data warehousing solutions on GCP.
You will have expertise in big data technologies, cloud computing, and data integration, as well as the ability to optimize data systems for performance and reliability. This requires a blend of skills including programming, database management, cloud infrastructure, and data pipeline development.
Responsibilities
- Design, build, and maintain scalable data pipelines and ETL processes using GCP services such as Cloud Dataflow, Cloud Dataproc, and BigQuery.
- Implement and optimize data storage solutions using GCP technologies like Cloud Storage, Cloud SQL, and Cloud Spanner.
- Develop and maintain data warehouses and data lakes on GCP, ensuring data quality, accessibility, and security.
- Collaborate with data scientists and analysts to understand data requirements and provide efficient data access solutions.
- Implement data governance and security measures to ensure compliance with regulations and best practices.
- Automate data workflows and implement monitoring and alerting systems for data pipelines.
- Share data engineering knowledge with the wider functions and develop reusable data integration patterns and best practices.
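The pipeline responsibilities above follow the standard extract-transform-load shape. A minimal sketch in plain Python of that shape (illustrative only and GCP-agnostic: in practice the extract step would read from Cloud Storage or Pub/Sub, the transform would run at scale on Dataflow or Dataproc, and the load step would write to BigQuery via its client library; all names and data here are hypothetical):

```python
from dataclasses import dataclass
from typing import Iterable, Iterator

@dataclass
class Record:
    user_id: str
    amount: float

def extract(rows: Iterable[dict]) -> Iterator[Record]:
    """Extract: parse raw rows into typed records (stand-in for a Cloud Storage/Pub/Sub read)."""
    for row in rows:
        yield Record(user_id=str(row["user_id"]), amount=float(row["amount"]))

def transform(records: Iterable[Record]) -> dict:
    """Transform: aggregate spend per user (the step Dataflow/Spark would parallelise)."""
    totals: dict = {}
    for r in records:
        totals[r.user_id] = totals.get(r.user_id, 0.0) + r.amount
    return totals

def load(totals: dict) -> list:
    """Load: emit warehouse-shaped rows (stand-in for a BigQuery load job)."""
    return [{"user_id": u, "total": t} for u, t in sorted(totals.items())]

raw = [
    {"user_id": "a", "amount": "10.5"},
    {"user_id": "b", "amount": "3.0"},
    {"user_id": "a", "amount": "4.5"},
]
rows = load(transform(extract(raw)))
print(rows)  # [{'user_id': 'a', 'total': 15.0}, {'user_id': 'b', 'total': 3.0}]
```

Keeping the three stages as separate functions, as above, is what makes pipelines testable and the integration patterns reusable across sources and sinks.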
Skills/Experience
- BSc/MSc in Computer Science, Information Systems, or related field, or equivalent work experience.
- Proven experience as a Data Engineer or in a similar role, preferably with GCP expertise.
- Strong proficiency in SQL and experience with NoSQL databases.
- Expertise in data modeling, ETL processes, and data warehousing concepts.
- Significant experience with GCP services such as BigQuery, Dataflow, Dataproc, Cloud Storage, and Pub/Sub.
- Proficiency in at least one programming language (e.g., Python, Java, or Scala) for data pipeline development.
- Experience with big data technologies such as Hadoop, Spark, and Kafka.
- Knowledge of data governance, security, and compliance best practices.
- GCP certifications (e.g., Professional Data Engineer) are highly advantageous.
- Effective communication skills to collaborate with cross-functional teams and explain technical concepts to non-technical stakeholders.
Skills
BigQuery, ETL, GCP, Data Management