About us
The Independent is an online news publisher that was established in 1986 as a national newspaper independent of party political affiliations or proprietorial influence. In 2016, The Independent became a fully digital publisher, moving away from print in pursuit of sustainability, and to safeguard its values and journalism for the future.
The Independent has always thrived through innovation and change. It was the first British newspaper to add a Saturday magazine; the first to give photography the same prestige as news copy; the first to challenge the Westminster lobby system of closed briefings; the first broadsheet to move to the more compact ‘tabloid’ format; the first to launch a concise quality compact paper; and the first – and only – major newspaper to pull off a successful transformation to fully digital publishing.
Through The Independent, Independent TV, eCommerce, indy100, subscriptions and other ‘reader revenues’, The Independent plans to continue the work of many decades, bringing much-needed independent journalism to over 100 million unique global visitors a month and making its voice ever louder and more insistent the world over.
We have an international editorial team with our main offices in London and New York.
In 2024, The Independent’s portfolio of brands increased through a new licensing partnership with BuzzFeed Inc. to operate the BuzzFeed brands in the UK: BuzzFeed UK, Tasty, Seasoned and HuffPost UK. The additional brands echo the existing business ethos and allow for increased audiences and a further strategic diversification of revenue streams.
About You
- A data engineer with advanced knowledge of SQL and hands-on experience with both relational and non-relational databases, supporting data needs in fast-paced, content-driven environments.
- Experience in designing and maintaining scalable data pipelines and architectures, ideally integrating data from web analytics, content management systems (CMS), subscription platforms, ad tech, and social media.
- Ability to automate and optimise data workflows, using modern ETL/ELT tools (e.g., Airflow, dbt, Apache Spark) to ensure timely and reliable delivery of data.
- Experience building robust data models and reporting layers to support performance dashboards, user engagement analytics, ad revenue tracking, and A/B testing frameworks.
- Skilled in cloud-based data platforms and infrastructure (e.g., AWS, GCP), ensuring scalability and security for large volumes of streaming and batch data. Exposure to data warehouses such as BigQuery or Snowflake.
- Adept in Python and/or Java for developing data services and integrating APIs to bring in diverse sources of media data.
- Strong interpersonal and communication skills, enabling effective collaboration with analytical and commercial teams to turn data into actionable insights.
- Proactive and self-driven, capable of managing multiple data projects in a high-tempo setting while meeting tight deadlines.
- A continuous learner with a diligent approach to data engineering, including data privacy (e.g., GDPR compliance) and evolving best practices.
Job Purpose
We are looking for a Data Engineer to join our team. The successful candidate will be responsible for maintaining and supporting the existing data infrastructure that underpins our data analytics and reporting, and will be accountable for building and owning new engineering solutions that complement our current, scalable data architecture. The Data Engineer will be expected to optimise the architecture of our data pipelines and ensure that data flows support the various cross-functional teams across the business.
The ideal candidate will have a self-directed, innovative mindset and be comfortable supporting the data needs of multiple teams. The right candidate will be proactive in identifying and implementing improvements to our systems, contributing constructively to the current data ecosystem.
As the business continues to invest in cloud solutions, particularly Google Cloud Platform, you will be excited by the prospect of owning new projects and propelling our data initiatives and capabilities. The Data Engineer will be exposed to best-practice methods within the current framework.
Key Responsibilities and Accountabilities
- Design and Maintain Data Pipelines: Develop and maintain robust, scalable, and efficient data pipeline architecture to support current and future business needs.
- Engineering and Integration: Assemble large, complex datasets from a variety of structured and unstructured sources, ensuring they meet functional requirements.
- Process Automation and Optimisation: Identify, design, and implement improvements to automate manual processes, enhance data delivery performance, and re-architect infrastructure for improved scalability and resilience.
- ETL Development and Infrastructure Building: Build and manage the infrastructure necessary for optimal ETL or ELT of data using Python, SQL, and Google Cloud Platform (GCP) big data technologies, such as BigQuery, Dataflow, Dataproc and Cloud Storage.
- Business Intelligence Enablement: Prepare and transform pipeline data to support downstream analytics and feed BI tools (Domo), enabling data-driven decision-making across the organisation.
- Cross-Functional Collaboration: Partner with internal stakeholders—ranging from Data, Commercial, and Editorial teams to executive leadership—to address data-related technical challenges and support their infrastructure needs.
- Enhance Data System Functionality: Collaborate with Data Team to continuously improve the functionality, flexibility, and performance of data systems and platforms.
- Data Governance and Compliance: Ensure all data is handled responsibly, securely, and in full compliance with the Data Protection Act, the GDPR, and the Company’s Code of Conduct.
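The pipeline work described above typically includes a transformation step that turns semi-structured records into flat, warehouse-ready rows before loading. A minimal, hypothetical sketch in plain Python (the function and field names are illustrative only, not part of The Independent’s actual stack):

```python
import json

def flatten(record, parent_key="", sep="_"):
    """Recursively flatten a nested dict into flat column names,
    e.g. {"user": {"id": 42}} becomes {"user_id": 42}."""
    items = {}
    for key, value in record.items():
        new_key = f"{parent_key}{sep}{key}" if parent_key else key
        if isinstance(value, dict):
            items.update(flatten(value, new_key, sep))
        else:
            items[new_key] = value
    return items

# A hypothetical raw analytics event, as it might arrive from a web tracker.
raw = '{"user": {"id": 42, "country": "GB"}, "event": "page_view", "ts": "2024-01-01T00:00:00Z"}'
row = flatten(json.loads(raw))
# row now holds flat keys such as "user_id" and "user_country",
# ready for batch loading into a warehouse table.
```

In practice a step like this would sit inside an orchestrated workflow (e.g. an Airflow task) rather than a standalone script, but the core transform is the same.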
Expected Behaviours
- Continuous Improvement – Works in a smart, flexible and focussed way, is open to change and suggests ideas for improvements to the way things are done.
- Attention to detail – precise reporting and communication with internal and external teams.
- Collaboration – Builds and maintains positive and supportive working relationships with colleagues. Helps to create an inclusive and professional work environment.
- Building Capability – Keeps own knowledge and skill set current and evolving and looks for ways to continue learning to support the achievement of business objectives.
- Quality Service – Delivers high quality and efficient service and takes account of the diverse customer needs and requirements when looking at ways to improve service quality.
- Responsive Delivery - Works to agreed business goals and objectives and deals with challenges in a constructive and responsive way. Takes personal responsibility for quality of outputs.
Skills and Experience
- SQL and Database Expertise: Strong working knowledge of SQL with hands-on experience querying and managing relational databases, alongside familiarity with a variety of database technologies (e.g., PostgreSQL, MySQL, BigQuery).
- Big Data Engineering: Exposure to designing, building, and optimising ‘big data’ pipelines, architectures, and datasets, enabling efficient data processing at scale.
- Analytical Problem Solving: Ideally has performed root cause analysis on internal and external data sources and business processes to resolve issues and uncover opportunities for operational or strategic improvements.
- Unstructured Data Handling: Capable of working with unstructured and semi-structured datasets, transforming raw information into actionable insights.
- Data Workflow Development: Skilled in developing and maintaining data transformation processes, managing data structures, metadata, workload dependencies, and orchestration frameworks.
- Large-scale Data Processing: A demonstrated history of manipulating, processing, and extracting value from large, diverse, and disconnected datasets in fast-moving environments.
- Project Management & Collaboration: Strong project management and organisational skills, with experience supporting and collaborating with cross-functional teams in dynamic and evolving settings.
- Education & Professional Background: Holds a graduate degree in Computer Science or a related quantitative STEM field, with 2+ years of hands-on experience in a data engineering role.
- Tools & Technologies:
- Databases: Proficient in relational SQL databases.
- Workflow Management Tools: Exposure to orchestration platforms such as Apache Airflow.
- Programming Languages: Skilled in one or more languages such as Python, Java, or Scala.
- Cloud Infrastructure: Understanding of cloud infrastructure such as GCP and the tools within that platform.
Diversity, Equity and Inclusion
We champion diversity in our teams and in our reporting. As a growing and global brand, we must have a workforce that’s more representative of our readers, viewers, clients and partners, and a workplace that creates a sense of belonging for everyone.
We are committed to hiring and developing a diverse workforce regardless of background, and we support our people to thrive in their careers here.
The Independent is an equal opportunities employer. If you require any reasonable adjustments to complete your application, please do not hesitate to advise us accordingly.
Our values – you will deliver across all our values
Inclusive: We champion diversity in our teams and in our reporting. Working as a team, we put transparency and effective communication at the heart of everything we do.
Innovative: From the very beginning, The Independent has been breaking the mould. We take risks and are always looking to try new ideas in pursuit of excellence.
Independent: Nobody tells us what to think; we make up our own minds and aren’t afraid to do things differently. Like our readers, we value honesty and integrity above outside influences.