Following the continued success of our online platform, we are expanding our portfolio and adding new products to our corporate intelligence and due diligence solutions.
We have a new and exciting opportunity for a Senior Analytics Engineer to join the team based at our Head Office in Oxford.
You will work with our Data & BI Team, scoping new developments and improvements to the data platform used by our clients. We are striving to digitally transform all areas of the company and need a strong, capable, and structured Senior Analytics Engineer to help achieve this. Our data capabilities are predominantly cloud-based and support all areas of the business.
Main accountabilities
- Develop and maintain the existing cloud-based data platform
- Analyse proposed information requests and collaborate with business units to drive enterprise reporting
- Write scripts to import data into and export data from enterprise applications, maintaining data integrity across those applications
- Use enterprise report writing tools to extract data and share information for automated reporting in compliance with regulatory guidelines
- Always strive to meet corporate goals and look for opportunities to use technology to do so
- Select the best technology solutions for business problems
- Ensure pristine data quality in all delivered solutions
- Provide solid documentation on all design and coding projects
- Maintain confidentiality of the information being processed, stored, or accessed by the end-users on the network
- Ensure that the data architecture is scalable and maintainable
- Design, implement, and continuously expand data acquisition by performing extraction, transformation, and loading activities
- Build, debug, monitor, and troubleshoot databases and ETL processes to ensure optimal performance, reliability, and integrity (a minimal Python sketch follows this list)
- Design, develop and maintain data stores (data warehouses, data marts and related information repositories) to support organisational reporting, analytics, and integrations
- Translate business needs to technical specifications for BI
- Design, implement, monitor, and maintain BI environments and database servers to meet usage and performance demands
- Establish performance and reliability metrics for all data stores, reports, dashboards, and analytics tools
- Design, implement and tune relational queries, stored procedures, and table-valued functions for optimal performance and reliability
- Create and maintain supporting documentation, including data dictionaries, data flows, and database schemas, to describe database environments and ensure maintainability
- Collaborate with developers, product owners and business analysts in conceptualising, designing, and developing new database applications, modules, and enhancements
- Assist with the design and development of data schemas and definitions to support Business Intelligence and Data Warehousing requirements
- Follow and maintain data security policies and practices
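For context, a typical ETL task of the kind described in these accountabilities might look like the minimal Python sketch below. It is illustrative only: the connection strings, table names, and columns are placeholders rather than Diligencia's actual systems.

```python
"""Minimal ETL sketch: extract from a source database, apply a simple
transformation, and load into a reporting table, with basic logging so the
job can be monitored and troubleshot. All names are placeholders."""
import logging

import pandas as pd
from sqlalchemy import create_engine

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(levelname)s %(message)s")
log = logging.getLogger("etl_sketch")

SOURCE_URL = "mssql+pyodbc://user:pass@source-server/SourceDb?driver=ODBC+Driver+18+for+SQL+Server"
TARGET_URL = "mssql+pyodbc://user:pass@target-server/WarehouseDb?driver=ODBC+Driver+18+for+SQL+Server"


def run() -> None:
    source = create_engine(SOURCE_URL)
    target = create_engine(TARGET_URL)
    try:
        # Extract: pull the raw rows to be reported on.
        raw = pd.read_sql("SELECT order_id, customer_id, amount, order_date FROM dbo.Orders", source)
        log.info("Extracted %d rows", len(raw))

        # Transform: basic cleaning and one derived column.
        raw = raw.dropna(subset=["order_id", "amount"])
        raw["order_month"] = pd.to_datetime(raw["order_date"]).dt.to_period("M").astype(str)

        # Load: append into the reporting table.
        raw.to_sql("OrdersReporting", target, schema="dbo", if_exists="append", index=False)
        log.info("Loaded %d rows into dbo.OrdersReporting", len(raw))
    except Exception:
        # Surface failures so the scheduler or monitoring can alert on them.
        log.exception("ETL run failed")
        raise


if __name__ == "__main__":
    run()
```

In practice a job like this would be orchestrated by a scheduler such as Azure Data Factory, with the logging feeding whatever monitoring is in place.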
This job description provides an indication of the role and responsibilities but should not be construed as an exhaustive list of duties that the post holder may be asked to undertake.
Requirements specific to the role
You will:
- Have a minimum of a BSc/BA in Computer Science or a related field, and other professional management training
- Have extensive (5+ years) experience in at least three of the following: Azure SQL Server 2022, Azure Data Factory (ADF), Microsoft Fabric, ETL/ELT workflows, and Synapse
- Have architectural experience in Azure data warehousing, with NoSQL experience (Cosmos DB, MongoDB, etc.) a bonus
- Have advanced experience automating SQL CI/CD deployments using Azure DevOps (YAML pipelines)
- Be proficient in Git for repository management, branching strategies, and collaboration
- Have strong Python scripting skills for data automation, orchestration, and pipeline optimisation
- Have a working knowledge of C# for maintaining and modifying legacy Windows applications
- Have experience with Power BI, Amazon QuickSight, or similar analytics tools
- Follow ETL/data management best practices (e.g., incremental loads, error handling, metadata logging), as illustrated in the sketch after this list
- Have knowledge of data protection regulations (GDPR, CCPA) and security controls such as encryption and RBAC
- Be an Agile/Scrum practitioner with experience of offshore team collaboration
- Be able to migrate legacy systems to modern cloud architectures
- Anticipate issues, recommend pragmatic solutions, and write clean, reusable code
- Communicate clearly with stakeholders, from engineers to non-technical leaders
- Manage tight deadlines with attention to detail
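As a rough illustration of the incremental-load and metadata-logging practices mentioned above, the sketch below shows one common pattern in Python; the watermark column, log table, and connection details are assumptions for the example.

```python
"""Illustrative incremental-load pattern with metadata logging. Only rows
changed since the last recorded watermark are pulled, and each run writes an
audit row to a log table. All table, column, and connection names are
hypothetical."""
from datetime import datetime

import pandas as pd
from sqlalchemy import create_engine, text

ENGINE = create_engine(
    "mssql+pyodbc://user:pass@server/WarehouseDb?driver=ODBC+Driver+18+for+SQL+Server"
)


def incremental_load() -> None:
    with ENGINE.begin() as conn:  # one transaction: load and log succeed or fail together
        # High-water mark left by the previous successful run (or a floor date).
        watermark = conn.execute(
            text("SELECT MAX(watermark) FROM etl.LoadLog WHERE source_table = 'dbo.Orders'")
        ).scalar() or datetime(1900, 1, 1)

        # Incremental extract: only rows modified since the watermark.
        changed = pd.read_sql(
            text("SELECT * FROM dbo.Orders WHERE modified_at > :wm"),
            conn,
            params={"wm": watermark},
        )

        if not changed.empty:
            changed.to_sql("Orders_Staging", conn, schema="etl", if_exists="append", index=False)

        # Metadata logging: record the run for auditing and troubleshooting,
        # advancing the watermark to the newest row actually loaded.
        new_watermark = changed["modified_at"].max().to_pydatetime() if not changed.empty else watermark
        conn.execute(
            text(
                "INSERT INTO etl.LoadLog (source_table, rows_loaded, watermark, run_at) "
                "VALUES (:t, :n, :wm, SYSUTCDATETIME())"
            ),
            {"t": "dbo.Orders", "n": int(len(changed)), "wm": new_watermark},
        )


if __name__ == "__main__":
    incremental_load()
```

Keeping the watermark update in the same transaction as the load means a failed run never advances it, so the next run safely reprocesses the same window.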
Experience/skills regarded as ideal but not essential
- Excellent understanding of Entity-Relationship/Multidimensional Data Modelling (Star schema, Snowflake schema; a minimal example follows this list), the Data Warehouse Life Cycle, and SQL Server
- Experience in creating master and child packages, package configurations, logging and using variables and expressions in packages
- Experience working with offshore teams
- Experience writing technical specifications for work to be performed by other developers
- Development experience in C# with good software methodology/practices
- Experience with Informatica, Talend, Azure SQL Server, or NoSQL databases (e.g., MongoDB, DynamoDB) is helpful but not required
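To make the star-schema modelling mentioned above concrete, here is a toy Python/pandas sketch that splits a flat sales extract into one fact table and two dimension tables; every column name and value is invented for the example.

```python
"""Toy illustration of star-schema modelling: split a flat extract into one
fact table and two dimension tables. The input data is made up."""
import pandas as pd

# Flat extract as it might arrive from a source system.
flat = pd.DataFrame(
    {
        "order_id": [1, 2, 3],
        "customer_name": ["Acme Ltd", "Bright plc", "Acme Ltd"],
        "customer_country": ["UK", "UAE", "UK"],
        "product_name": ["Report A", "Report B", "Report A"],
        "amount": [120.0, 250.0, 120.0],
    }
)

# Dimensions: one row per distinct business entity, with a surrogate key.
dim_customer = (
    flat[["customer_name", "customer_country"]].drop_duplicates().reset_index(drop=True)
)
dim_customer["customer_key"] = dim_customer.index + 1

dim_product = flat[["product_name"]].drop_duplicates().reset_index(drop=True)
dim_product["product_key"] = dim_product.index + 1

# Fact table: measures plus foreign keys to the dimensions (the "star").
fact_sales = (
    flat.merge(dim_customer, on=["customer_name", "customer_country"])
    .merge(dim_product, on="product_name")[["order_id", "customer_key", "product_key", "amount"]]
)

print(fact_sales)
```

In a real warehouse the dimensions would live as managed tables with proper surrogate keys, but the shape (measures in the fact table, descriptive attributes in the dimensions) is the same.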