Hunt UK Visa Sponsors
IBM

Data Engineer

Company: IBM
Location: Markham, Wales, United Kingdom
Posted At: 3/4/2026

UK Visa Sponsorship Analytics

Occupation Type: Telecoms and related network installers and repairers
Occupation Code Skill Level: Medium Skilled
Sponsorship Salary Threshold: £41,700 (£21.38 per hour); standard minimum applies
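As a sanity check, the quoted hourly figure follows from the annual threshold if a 37.5-hour week over 52 weeks is assumed (the usual basis for UK sponsorship thresholds; the listing itself does not state the basis, so treat it as an assumption):

```python
# Hedged sketch: derive the per-hour figure from the annual threshold,
# assuming a 37.5-hour week over 52 weeks.
annual_threshold = 41_700      # £ per year, from the listing
hours_per_year = 37.5 * 52     # = 1950 hours

hourly_rate = annual_threshold / hours_per_year
print(f"£{hourly_rate:.2f} per hour")  # £21.38 per hour
```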

The above analytics are generated algorithmically from job titles and may not always match the company's own job classification. You can also check detailed occupation eligibility and salary criteria on our UK Visa Eligible Occupations & Salary Thresholds page.

Disclaimer: Hunt UK Visa Sponsors aggregates job listings from publicly available sources, such as search engines, to assist with your job hunting. We do not claim affiliation with IBM. For the most up-to-date job details, please visit the official website by clicking "Apply Now."

Description
Introduction

At IBM Research, we are the innovation engine of IBM, exploring what’s next in computing and shaping the technologies the world will rely on tomorrow. From advancing AI and hybrid cloud to pioneering practical quantum computing, we anticipate challenges and unlock new opportunities for clients, partners, and society. Working in Research means joining a team that accelerates discovery at the intersection of high-performance computing, AI, quantum, and cloud. You’ll collaborate with leading scientists, engineers, and visionaries to push boundaries and turn ideas into reality. With a culture built on curiosity, creativity, and collaboration, IBM Research offers the opportunity to grow your career while contributing to breakthroughs that transform industries and change the world.

Your Role And Responsibilities

IBM Quantum is building the world’s leading quantum computing systems, software, and cloud services. The Data Engineer in this role will design and operate the data pipelines that power insight into quantum hardware performance, system reliability, user workloads, and platform operations. You will work closely with quantum hardware, firmware, cloud, and product teams to turn diverse technical datasets into trusted analytics assets that guide decision‑making across IBM Quantum’s roadmap.

Preferred Education

Master's Degree

Required Technical And Professional Expertise

  • Design, build, and maintain scalable, reliable data pipelines supporting analytics, operational dashboards, and hardware performance insights for IBM Quantum systems.
  • Develop and operate ETL/ELT workflows with a focus on data quality, accuracy, timeliness, and continuous improvement.
  • Apply advanced SQL skills using PostgreSQL and Presto to support analytical workloads, including complex queries and performance tuning.
  • Build and operate orchestration workflows in Apache Airflow, including dependency management, retries, backfills, monitoring, and operational reliability.
  • Implement data transformations and validations using Python (e.g., pandas and related libraries).
  • Support large‑scale batch processing for high‑volume, heterogeneous datasets, including system telemetry, experiment metadata, cloud operations data, and device performance metrics.
  • Work with streaming platforms such as Apache Kafka or IBM Event Streams to consume event‑driven data from distributed quantum systems and services.
  • Apply streaming architecture concepts including topics, partitions, consumer groups, and schema evolution.
  • Integrate multiple technical data sources—quantum hardware telemetry, calibration data, experiment logs, job execution data, user activity, system health metrics—into trusted analytical datasets.
  • Collaborate with quantum hardware, software, product, SRE, and analytics teams to translate requirements into robust, production-ready data solutions.
  • Use Git-based version control, contribute via code reviews, and follow industry-standard software engineering best practices.
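The transformation-and-validation work described in the bullets above might look like the following minimal pandas sketch. All column names, thresholds, and the function itself are hypothetical illustrations, not details from the posting:

```python
import pandas as pd

def validate_telemetry(df: pd.DataFrame) -> pd.DataFrame:
    """Hypothetical data-quality gate for a telemetry batch: drop
    duplicate readings, enforce required fields, and filter
    out-of-range values before the data reaches analytics tables."""
    required = ["device_id", "timestamp", "error_rate"]
    missing = [c for c in required if c not in df.columns]
    if missing:
        raise ValueError(f"missing required columns: {missing}")

    df = df.drop_duplicates(subset=["device_id", "timestamp"])
    df = df.dropna(subset=required)
    # Keep only physically plausible error rates (0..1); illustrative rule.
    return df[df["error_rate"].between(0.0, 1.0)]

raw = pd.DataFrame({
    "device_id": ["q1", "q1", "q2", "q3"],
    "timestamp": [1, 1, 2, 3],               # q1 reading duplicated
    "error_rate": [0.01, 0.01, 1.5, 0.02],   # q2 value out of range
})
clean = validate_telemetry(raw)
print(len(clean))  # 2 rows survive
```

In a production pipeline a gate like this would typically run as one Airflow task, with rows failing validation routed to a quarantine table rather than silently dropped.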

Preferred Technical And Professional Experience

  • Experience with Lakehouse solutions and architectures, including IBM watsonx.data.
  • Experience with distributed analytics engines such as Presto/Trino or Apache Spark.
  • Familiarity with data modeling techniques for analytical and reliability engineering use cases.
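The streaming concepts named in the required expertise (topics, partitions, consumer groups) can be pictured with a small stand-alone sketch: a topic is split into partitions, and each partition is owned by exactly one consumer within a group. This is a simplified round-robin model for illustration only, not the Kafka client API or its real assignment strategies:

```python
# Toy model of consumer-group partition assignment: every partition of a
# topic gets exactly one owner within the group, spread round-robin.
def assign_partitions(partitions: int, consumers: list[str]) -> dict[str, list[int]]:
    assignment: dict[str, list[int]] = {c: [] for c in consumers}
    for p in range(partitions):
        owner = consumers[p % len(consumers)]
        assignment[owner].append(p)
    return assignment

# A 6-partition topic consumed by a 2-member group:
groups = assign_partitions(6, ["worker-a", "worker-b"])
print(groups)  # {'worker-a': [0, 2, 4], 'worker-b': [1, 3, 5]}
```

Adding a consumer to the group shrinks each member's share of partitions, which is why partition count bounds a group's parallelism.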