Domestic & General

Senior Data Engineer

Location
London, England, United Kingdom
Posted At
12/6/2024
Description
This role requires senior experience of data engineering and of building automated data pipelines on IBM DataStage & DB2, AWS and Databricks, from source systems through operational databases to the curation layer, using modern cloud technologies. Experience of delivering complex pipelines will be significantly valuable to how D&G maintains and delivers world-class data pipelines.

Job Summary

D+G is transforming into a technology-powered product business serving customers around the world. Our products and services rely heavily on compelling digital experiences and data-led journeys for our B2B and B2C clients and customers.

This is a key lead engineering role within D&G’s technology team. It presents a challenging and exciting opportunity, requiring real enthusiasm and modern data engineering experience to stabilise, enhance and transform D&G’s operational customer databases as they move from legacy systems to new, scalable cloud solutions across the UK, EU and US. The role requires an experienced data engineer with good knowledge of IBM DataStage & DB2, AWS and Databricks pipelines, who is able to excel in challenging environments and has the confidence to help the teams steer the right course in the development of the data platform, alongside supporting any required tooling decisions.

The role will enable D+G to deliver a modern data services layer, delivered as a product, which can then be consumed by key service channels and stakeholders on demand.

Strategic Impact

Quality Customer Data is the lifeblood of D&G’s operations which allows us to serve our customers with outstanding propositions and outcomes. This role will be integral to supporting this through the following areas of delivery:

On-Prem Customer Data Platform Stabilisation

This role will initially help stabilise existing on-prem customer data platforms to help serve our customers and protect the one-billion-pound revenue across the UK and EU. Targets will be to reduce the merge and compliance incident backlog, promote more automation, and support the onboarding of a third party to provide a managed break/fix service.

Support Data Growth in UK, EU and US Markets

This role will support further growth in UK and EU markets by enhancing the on-prem IBM customer platforms, ensuring they remain available, robust and secure for growing data demands, whilst leading the delivery of cloud-based solutions for the US pipelines and data platform.

Knowledge, Expertise, Complexity And Scope

Knowledge in the following areas is essential:

Data Engineering Experience

  • Databricks: Expertise in managing and scaling Databricks environments for ETL, data science, and analytics use cases.
  • AWS Cloud: Extensive experience with AWS services such as S3, Glue, Lambda, RDS, and IAM.
  • IBM Skills: DB2, DataStage, Tivoli Workload Scheduler, UrbanCode
  • Programming Languages: Proficiency in Python, SQL.
  • Data Warehousing & ETL: Experience with modern ETL frameworks and data warehousing techniques.
  • DevOps & CI/CD: Familiarity with DevOps practices for data engineering, including infrastructure-as-code (e.g., Terraform, CloudFormation), CI/CD pipelines, and monitoring (e.g., CloudWatch, Datadog).
  • Familiarity with big data technologies like Apache Spark, Hadoop, or similar.
  • Test Automation: Skills in automated testing of data pipelines.
  • ETL/ELT Tools: Experience creating common data sets across on-prem (IBM DataStage ETL) and cloud data stores.
  • Leadership & Strategy: Lead Data Engineering team(s) in designing, developing, and maintaining highly scalable and performant data infrastructures.
  • Customer Data Platform Development: Architect and manage our data platforms using IBM (legacy platform) & Databricks on AWS technologies (e.g., S3, Lambda, Glacier, Glue, EventBridge, RDS) to support real-time and batch data processing needs.
  • Data Governance & Best Practices: Implement best practices for data governance, security, and data quality across our data platform. Ensure data is well-documented, accessible, and meets compliance standards.
  • Pipeline Automation & Optimisation: Drive the automation of data pipelines and workflows to improve efficiency and reliability.
  • Team Management: Mentor and grow a team of data engineers, ensuring alignment with business goals, delivery timelines, and technical standards.
  • Cross Company Collaboration: Work closely with all levels of business stakeholder including data scientists, finance analysts, MI and cross-functional teams to ensure seamless data access and integration with various tools and systems.
  • Cloud Management: Lead efforts to integrate and scale cloud data services on AWS, optimising costs and ensuring the resilience of the platform.
  • Performance Monitoring: Establish monitoring and alerting solutions to ensure the high performance and availability of data pipelines and systems to ensure no impact to downstream consumers.

Key Responsibilities

  • Manage outcomes for the on-prem customer platform break / fix service.
  • Build and deliver automated and secure data pipelines that provision data for all business users and applications (including operational and insight).
  • Work with the DevOps developer and testers to help support and create our AWS & Databricks infrastructure and continuous delivery pipelines.
  • Ensure all developments are tested and deployed within the automated CI / CD pipeline where appropriate.
  • Version and store all development artefacts in the agreed repository.
  • Ensure all data are catalogued and that appropriate documentation is created and maintained for all ETL code and associated NFRs.
  • Collaborate with the product owner (Data) & business stakeholders to understand the requirements and capabilities.
  • Collaborate with the lead architect and CCOE to align to the best-practice delivery strategy.
  • Participate in the team’s agile planning and delivery process to ensure work is delivered in line with the Product Owner’s priorities.
  • Create low-level designs for Epics and Stories and, where required, support the lead architect in creating the designs to enable the realisation of the Data Lake, Operational Customer DB, Warehouse and marts, while ensuring scalability, security by design, ease of use, and high availability and reliability.
  • Identify the key capabilities needed for success, along with the technology choices, coding standards, testing techniques and delivery approach required to deliver reliable data services.
  • Learn emerging technologies to keep abreast of new or better ways of delivering the data pipeline.
  • Welcome a challenge as a new opportunity to learn new things and make new friends, whilst always thinking of better techniques to solve problems.

At Domestic & General, we are proud of our 100-year legacy and excited about our future growth plans. We are expanding our horizons, entering new markets and territories internationally and we need your expertise to help us on the journey.