Snowflake Architect - Airflow, AWS, SQL, Python, ETL tools (StreamSets, dbt), RDBMS
A Snowflake Architect is required for a long-term project with a fast-growing company.
Responsibilities:
- Design, develop, and maintain robust data pipelines and ETL processes using Snowflake on AWS.
- Implement data warehousing solutions, ensuring efficient storage, retrieval, and transformation of large datasets.
- Collaborate with data analysts, data scientists, and other stakeholders to define and fulfill data requirements.
- Optimize the performance and scalability of the Snowflake data warehouse, ensuring high availability and reliability.
- Develop and maintain data integration solutions, ensuring seamless data flow between various sources and Snowflake.
- Monitor, troubleshoot, and resolve data pipeline issues, ensuring data quality and integrity.
- Stay up to date with the latest trends and best practices in data engineering and cloud services such as AWS.
Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 5+ years of experience in data engineering, with a strong focus on Snowflake and AWS.
- Proficiency in SQL, Python, and ETL tools (StreamSets, dbt, etc.).
- Hands-on experience with Oracle RDBMS.
- Experience migrating data to Snowflake.
- Experience with AWS services such as S3, Lambda, Redshift, and Glue.
- Strong understanding of data warehousing concepts and data modeling.
- Excellent problem-solving and communication skills, with a focus on delivering high-quality solutions.
- Understanding of and hands-on experience with orchestration solutions such as Airflow (see the sketch after this list).
- Deep knowledge of key non-functional requirements such as availability, scalability, operability, and maintainability.
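For context only, here is a minimal sketch of the kind of pipeline this role describes: an Airflow DAG that bulk-loads files from an S3 external stage into Snowflake with COPY INTO. It assumes an Airflow 2.x deployment with the Snowflake provider installed; the connection ID, DAG name, table, stage, and file-format names are hypothetical placeholders, not part of the listing.

```python
# Minimal sketch: daily S3-to-Snowflake load orchestrated by Airflow.
# Assumes the apache-airflow-providers-snowflake package is installed and an
# Airflow connection named "snowflake_default" has been configured.
from datetime import datetime

from airflow import DAG
from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator

with DAG(
    dag_id="s3_to_snowflake_daily_load",  # hypothetical DAG name
    start_date=datetime(2025, 1, 1),
    schedule_interval="@daily",           # in Airflow 2.4+ this argument is "schedule"
    catchup=False,
) as dag:
    # Bulk-load the day's files from an external S3 stage into a raw table;
    # downstream transformation (e.g. with dbt) would follow as a separate task.
    load_raw_orders = SnowflakeOperator(
        task_id="copy_into_raw_orders",
        snowflake_conn_id="snowflake_default",
        sql="""
            COPY INTO raw.orders                       -- hypothetical target table
            FROM @raw.s3_orders_stage                  -- hypothetical external stage over S3
            FILE_FORMAT = (FORMAT_NAME = 'raw.json_format')
            ON_ERROR = 'ABORT_STATEMENT';
        """,
    )
```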