Hunt UK Visa Sponsors

Find jobs from UK licensed visa sponsors — Companies House verified, updated daily.



Stott and May

Data Engineer

Company: Stott and May
Location: Reading, England, United Kingdom
Posted: 2/3/2026

UK Visa Sponsorship Analytics

Occupation Type: Telecoms and related network installers and repairers
Occupation Code Skill Level: Medium Skilled
Sponsorship Salary Threshold: £41,700 (£21.38 per hour; standard minimum applies)
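The hourly figure above can be reproduced from the annual threshold, assuming the 37.5-hour working week commonly used as the basis for Skilled Worker salary calculations (the 37.5-hour basis is an assumption; it is not stated on this page):

```python
# Convert the annual Skilled Worker salary threshold to an hourly rate.
# Assumes a 37.5-hour week over 52 weeks (an assumption, not stated above).
annual_threshold = 41_700            # £ per year
hours_per_year = 37.5 * 52           # 1,950 hours per year
hourly_rate = annual_threshold / hours_per_year
print(f"£{hourly_rate:.2f} per hour")  # £21.38 per hour
```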

The analytics above are generated algorithmically from job titles and may not match the company's own job classification. You can check detailed occupation eligibility and salary criteria on our UK Visa Eligible Occupations & Salary Thresholds page.

Disclaimer: Hunt UK Visa Sponsors aggregates job listings from publicly available sources, such as search engines, to assist with your job hunting. We do not claim affiliation with Stott and May. For the most up-to-date job details, please visit the official website by clicking "Apply Now."

Job Description

Job Title: Senior Data Engineer

Location: UK (Hybrid, 2–3 days per week in-office)

Rate: £446/day (Inside IR35)

Contract Duration: 6 months

Additional Requirements: May require occasional travel to Dublin office

About The Role

We are looking for an experienced Senior Data Engineer to join a Data & Analytics (DnA) team. You will design, build, and operate production-grade data products across customer, commercial, financial, sales, and broader data domains. This role is hands-on and heavily focused on Databricks-based engineering, data quality, governance, and DevOps-aligned delivery.

You will work closely with the Data Engineering Manager, Product Owner, Data Product Manager, Data Scientists, Head of Data & Analytics, and IT teams to transform business requirements into governed, decision-grade datasets embedded in business processes and trusted for reporting, analytics, and advanced use cases.

Key Responsibilities

  • Design, build, and maintain pipelines in Databricks using Delta Lake and Delta Live Tables.
  • Implement medallion architectures (Bronze/Silver/Gold) and deliver reusable, discoverable data products.
  • Ensure pipelines meet non-functional requirements such as freshness, latency, completeness, scalability, and cost-efficiency.
  • Own and operate Databricks assets including jobs, notebooks, SQL, and Unity Catalog objects.
  • Apply Git-based DevOps practices, CI/CD, and Databricks Asset Bundles to safely promote changes across environments.
  • Implement monitoring, alerting, runbooks, incident response, and root-cause analysis.
  • Enforce governance and security using Unity Catalog (lineage, classification, ACLs, row/column-level security).
  • Define and maintain data-quality rules, expectations, and SLOs within pipelines.
  • Support root-cause analysis of data anomalies and production issues.
  • Partner with Product Owner, Product Manager, and business stakeholders to translate requirements into functional and non-functional delivery scope.
  • Collaborate with IT platform teams to define data contracts, SLAs, and schema evolution strategies.
  • Produce clear technical documentation (data contracts, source-to-target mappings, release notes).
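The medallion pattern named in the responsibilities above (Bronze for raw landing, Silver for cleansed data with quality expectations enforced, Gold for business-ready aggregates) can be sketched schematically. This is plain Python standing in for Databricks and Delta Live Tables, and every table and field name is hypothetical:

```python
# Minimal sketch of a Bronze/Silver/Gold medallion flow with embedded
# data-quality expectations. Plain Python stands in for Databricks /
# Delta Live Tables; all names here are illustrative assumptions.

def bronze_ingest(raw_rows):
    """Bronze: land source records as-is, tagging ingest metadata."""
    return [dict(row, _source="crm_export") for row in raw_rows]

def silver_clean(bronze_rows):
    """Silver: enforce expectations, dropping rows that fail them
    (analogous to an expect-or-drop rule in Delta Live Tables)."""
    def valid(row):
        return row.get("customer_id") is not None and row.get("amount", 0) >= 0
    return [row for row in bronze_rows if valid(row)]

def gold_aggregate(silver_rows):
    """Gold: business-level aggregate ready for reporting."""
    totals = {}
    for row in silver_rows:
        totals[row["customer_id"]] = totals.get(row["customer_id"], 0) + row["amount"]
    return totals

raw = [
    {"customer_id": "C1", "amount": 120.0},
    {"customer_id": None, "amount": 50.0},   # fails expectation, dropped at Silver
    {"customer_id": "C1", "amount": 30.0},
]
print(gold_aggregate(silver_clean(bronze_ingest(raw))))  # {'C1': 150.0}
```

In a real Databricks deployment each layer would be a governed Delta table rather than a Python list, but the layering and the placement of quality rules at the Bronze-to-Silver boundary carry over directly.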

Essential Skills & Experience

  • 6+ years in data engineering or advanced analytics engineering roles.
  • Strong hands-on expertise in Python and SQL.
  • Proven experience building production pipelines in Databricks.
  • Excellent attention to detail, with the ability to create effective documentation and process diagrams.
  • Solid understanding of data modelling, performance tuning, and cost optimisation.

Desirable Skills & Experience

  • Hands-on experience with Databricks Lakehouse, including Delta Lake and Delta Live Tables for batch/stream pipelines.
  • Knowledge of pipeline health monitoring, SLA/SLO management, and incident response.
  • Unity Catalog governance and security expertise, including lineage, table ACLs, and row/column-level security.
  • Familiarity with Databricks DevOps/DataOps practices (Git-based development, CI/CD, automated testing).
  • Performance and cost optimisation strategies for Databricks (autoscaling, Photon/serverless, partitioning, Z-Ordering, OPTIMIZE/VACUUM).
  • Semantic layer and metrics engineering experience for consistent business metrics and self-service analytics.
  • Experience with cloud-native analytics platforms (preferably Azure) operated as enterprise-grade production services.
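The SLA/SLO management mentioned among the desirable skills often starts with a freshness check: has the table been loaded recently enough to meet its service objective? A minimal sketch, where the four-hour threshold and the timestamps are illustrative assumptions:

```python
# Minimal sketch of a freshness SLO check of the kind a monitoring job
# might run after each load. The threshold is an illustrative assumption.
from datetime import datetime, timedelta, timezone

FRESHNESS_SLO = timedelta(hours=4)  # data must be no older than 4 hours

def check_freshness(last_load_at, now=None):
    """Return (ok, lag) for a table's most recent successful load."""
    now = now or datetime.now(timezone.utc)
    lag = now - last_load_at
    return lag <= FRESHNESS_SLO, lag

now = datetime(2026, 2, 3, 12, 0, tzinfo=timezone.utc)
ok, lag = check_freshness(datetime(2026, 2, 3, 9, 30, tzinfo=timezone.utc), now)
print(ok, lag)  # True 2:30:00
```

A production version would read the last-load timestamp from pipeline metadata and raise an alert on breach, but the SLO comparison itself is this simple.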