Disclaimer: Hunt UK Visa Sponsors aggregates job listings from publicly available sources, such as search engines, to assist with your job hunting. We do not claim affiliation with Eligible.ai. For the most up-to-date job details, please visit the official website by clicking "Apply Now."
About Eligible.ai
Eligible.ai helps UK lenders identify and proactively support customers heading toward financial distress. Our intelligence layer turns raw banking data into timely nudges that keep households on track and lenders compliant. We’re a remote-first, product-led team backed by top fintech investors and working with some of the country’s largest banks.
The Impact You'll Make
Your mission is to ship the intelligence layer that powers every lender decision on our platform. You will have complete ownership, designing and building our new data platform from the ground up and transforming how we use data at Eligible.
Working directly with our Head of Engineering, you will lead the green-field redesign of our data architecture – from implementing a modern event-streaming solution to building a historically accurate data warehouse. You will be the foundational engineer turning complex, raw data into a scalable, trusted asset that unblocks our Data Science, Analytics and Product teams.
What You'll Do
- Establish the Foundation: Implement a Change Data Capture (CDC) or event-streaming solution to create a complete, real-time historical record of our business data in AWS (see the first sketch after this list)
- Model the Core: Lead the redesign of our analytical warehouse in Redshift. You'll use dbt to build clean, performant and historically accurate data models (e.g. using SCDs) that become the source of truth for the entire company (see the SCD sketch after this list)
- Own the Platform End-to-End: Manage and scale our data stack (AWS Redshift, S3, dbt, Airflow, SageMaker) using Infrastructure as Code (e.g. CloudFormation) and robust CI/CD practices (see the orchestration sketch after this list)
- Enable Insights & ML: Partner with Data Science and Analytics to build data marts that unblock everything from exploratory analysis to the training of production machine learning models in SageMaker
- Champion Quality: Embed best practices for data quality, observability, testing and cost-efficiency into the DNA of the new platform
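
To make the CDC bullet concrete, here is a minimal Python sketch of an event consumer that lands row-level change records in S3. The stream name, bucket name and change-envelope fields are all hypothetical; a production pipeline would add checkpointing, batching, retries and schema handling.

```python
# Minimal CDC consumer sketch: drain a Kinesis stream of change events into S3.
# "cdc-events", "raw-cdc-landing" and the envelope's "table" field are assumptions.
import json

import boto3

kinesis = boto3.client("kinesis")
s3 = boto3.client("s3")

STREAM = "cdc-events"       # hypothetical stream carrying row-level changes
BUCKET = "raw-cdc-landing"  # hypothetical raw landing zone

def drain_shard(shard_id: str) -> None:
    """Read one batch of change records from a shard and land them in S3."""
    iterator = kinesis.get_shard_iterator(
        StreamName=STREAM, ShardId=shard_id, ShardIteratorType="TRIM_HORIZON"
    )["ShardIterator"]
    batch = kinesis.get_records(ShardIterator=iterator, Limit=500)
    for rec in batch["Records"]:
        event = json.loads(rec["Data"])  # e.g. a Debezium-style change envelope
        key = f"{event['table']}/{rec['SequenceNumber']}.json"
        s3.put_object(Bucket=BUCKET, Key=key, Body=json.dumps(event))

for shard in kinesis.list_shards(StreamName=STREAM)["Shards"]:
    drain_shard(shard["ShardId"])
```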
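
"Historically accurate" modelling in practice usually means Type 2 Slowly Changing Dimensions: instead of overwriting a changed attribute, you close out the old row and append a new version. The pandas sketch below shows that expire-and-append core for an assumed customer_id/risk_band dimension; in the warehouse itself this logic would typically live in a dbt snapshot or incremental model.

```python
# SCD Type 2 sketch (column names are hypothetical): expire superseded rows,
# then append new current versions so the full history is preserved.
import pandas as pd

def scd2_apply(dim: pd.DataFrame, updates: pd.DataFrame, now: pd.Timestamp) -> pd.DataFrame:
    """dim: customer_id, risk_band, valid_from, valid_to (NaT = current row).
    updates: customer_id, risk_band with the latest observed values."""
    current = dim.loc[dim["valid_to"].isna(), ["customer_id", "risk_band"]]
    merged = updates.merge(current, on="customer_id", how="left",
                           suffixes=("", "_old"), indicator=True)
    # Brand-new keys, or keys whose tracked attribute changed.
    changed = merged[(merged["_merge"] == "left_only")
                     | (merged["risk_band"] != merged["risk_band_old"])]

    # 1) Expire the superseded current rows.
    expire = dim["customer_id"].isin(changed["customer_id"]) & dim["valid_to"].isna()
    dim.loc[expire, "valid_to"] = now

    # 2) Append the new current versions.
    new_rows = changed[["customer_id", "risk_band"]].assign(valid_from=now, valid_to=pd.NaT)
    return pd.concat([dim, new_rows], ignore_index=True)
```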
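
On the orchestration side, a common pattern is an Airflow DAG that rebuilds and then tests the dbt project on a schedule. This sketch assumes Airflow 2.4+ with dbt installed on the worker; the DAG id, schedule and dbt target are illustrative, and the same repository would also carry the IaC templates and CI/CD config.

```python
# Illustrative Airflow DAG: refresh dbt models daily, then run dbt tests.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="warehouse_refresh",   # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    run_models = BashOperator(task_id="dbt_run", bash_command="dbt run --target prod")
    test_models = BashOperator(task_id="dbt_test", bash_command="dbt test --target prod")
    run_models >> test_models  # only test once the models have been rebuilt
```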
What We’re Looking For
Must-have
- Extensive experience shipping and maintaining production data platforms in Python and SQL.
- Deep expertise in data warehousing (Redshift preferred) and building clean, scalable data models with dbt.
- Proven experience designing historical data models (e.g. Slowly Changing Dimensions).
- Hands-on experience implementing Change Data Capture (CDC) or event-streaming systems (e.g. Debezium, Kinesis, Fivetran).
- Strong infrastructure skills, including Infrastructure as Code (IaC), CI/CD, monitoring and cost management in a cloud environment (preferably AWS).
- A pragmatic and collaborative approach; you can navigate ambiguity and work effectively with technical and non-technical stakeholders to deliver value.
Bonus points
- Familiarity with AWS SageMaker, particularly integrating it into data pipelines.
- Experience with our application stack (e.g. Django, Heroku).
- Prior work modelling complex user journeys or behavioural data.
- Experience building or serving features for machine learning models.
How We Measure Success
- Execution: You ship on time, meeting acceptance and success criteria.
- Quality: Pipelines are observable, well-tested and cost-efficient.
- Collaboration: Stakeholders trust your communication and delivery rhythm.
Why Join Us?
- Own the Architecture: Green-field redesign with genuine autonomy.
- Mission that Matters: Help millions avoid financial distress.
- Deep Learning Curve: Real consumer data, real ML, real-time APIs.
- Trusted Team Sport: No gate-keeping – if you can own it, it’s yours.
Application – Show Us Your Thinking
When you apply, please answer all three short questions in your cover note:
- Describe one data platform or pipeline that you took from architecture to production. What made it challenging, and how did you measure success?
- Describe a situation where you redesigned an existing data warehouse or analytical model. How did you handle historical data and how did you balance the need for a clean break with the need for continuity?
- What excites you about Eligible.ai’s mission, and how would you prioritise your first 30 days in this role?