HCLTech is a global technology company, home to more than 220,000 people across 60 countries, delivering industry-leading capabilities centered around digital, engineering, cloud and AI, powered by a broad portfolio of technology services and products. We work with clients across all major verticals, providing industry solutions for Financial Services, Manufacturing, Life Sciences and Healthcare, Technology and Services, Telecom and Media, Retail and CPG, and Public Services. Consolidated revenues for the 12 months ended December 2024 totaled $13.8 billion.
Responsibilities:
- Design, develop, and maintain robust and scalable ETL/ELT pipelines using Snowflake as the data warehouse.
- Implement data transformations and build analytical data models using dbt (data build tool).
- Develop and optimize complex SQL queries for data extraction, transformation, and loading in Snowflake.
- Orchestrate and schedule data workflows using Apache Airflow, including developing custom Python operators and DAGs (an illustrative sketch follows this list).
- Write efficient and maintainable Python scripts for data processing, automation, and integration with various data sources and APIs (see the second sketch after this list).
- Ensure data quality, integrity, and governance throughout the data lifecycle.
- Collaborate with data analysts, data scientists, and business stakeholders to understand requirements and deliver data solutions.
- Implement and maintain CI/CD pipelines for data engineering processes, including version control with Git.
- Monitor data pipelines, troubleshoot issues, and optimize performance for efficiency and cost-effectiveness.
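
To illustrate the orchestration work described above, here is a minimal sketch of a daily Airflow DAG that runs a dbt build against Snowflake. It assumes Airflow 2.4 or later with the bundled BashOperator and an existing dbt project; the project path, DAG id, owner, and schedule are hypothetical placeholders, not details of the actual role.

```python
# Minimal sketch: a daily Airflow DAG that orchestrates a dbt build on Snowflake.
# Assumes Airflow 2.4+; the project path and DAG id are illustrative placeholders.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator

DBT_PROJECT_DIR = "/opt/dbt/analytics"  # hypothetical dbt project location

default_args = {
    "owner": "data-engineering",
    "retries": 2,
    "retry_delay": timedelta(minutes=5),
}

with DAG(
    dag_id="daily_dbt_build",
    description="Run dbt models and tests against Snowflake",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
    default_args=default_args,
) as dag:
    # Build (run + test) all dbt models; dbt reads Snowflake credentials
    # from its own profiles.yml, so no secrets live in the DAG itself.
    dbt_build = BashOperator(
        task_id="dbt_build",
        bash_command=f"cd {DBT_PROJECT_DIR} && dbt build",
    )
```

In practice, a production DAG would typically add alerting on failure and split staging, mart, and test steps into separate tasks for clearer monitoring and retries.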
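As a second sketch of the Python and SQL work described above, the snippet below runs a parameterised query in Snowflake. It assumes the snowflake-connector-python package; the account details come from environment variables, and the warehouse, database, schema, and table names (such as fct_orders) are hypothetical placeholders.

```python
# Minimal sketch: a Python helper that runs a parameterised query in Snowflake.
# Assumes snowflake-connector-python; object names and credentials are placeholders.
import os

import snowflake.connector


def fetch_daily_order_counts(run_date: str) -> list:
    """Return (order_date, order_count) rows for the given date."""
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="ANALYTICS_WH",  # hypothetical warehouse
        database="ANALYTICS",      # hypothetical database
        schema="MARTS",            # hypothetical schema
    )
    cur = conn.cursor()
    try:
        # Bind variables (%s) keep the query injection-safe.
        cur.execute(
            """
            SELECT order_date, COUNT(*) AS order_count
            FROM fct_orders
            WHERE order_date = %s
            GROUP BY order_date
            """,
            (run_date,),
        )
        return cur.fetchall()
    finally:
        cur.close()
        conn.close()


if __name__ == "__main__":
    print(fetch_daily_order_counts("2024-01-01"))
```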
Qualifications:
- Strong proficiency in SQL, particularly with Snowflake's features and functionalities.
- Extensive experience with dbt for data modeling, transformations, testing, and documentation.
- Solid experience with Apache Airflow for workflow orchestration and scheduling.
- Proficiency in Python for data manipulation, scripting, and automation.
- Experience with cloud platforms (e.g., AWS, Azure, GCP) and relevant data services.
- Understanding of data warehousing concepts, dimensional modeling, and ELT principles.
- Familiarity with data quality, governance, and security best practices.
- Excellent problem-solving, analytical, and communication skills.
- Ability to work independently and collaboratively in a fast-paced environment.