Employment Type: Full-Time
Location: Chicago, IL (Onsite)
Snowflake Data Engineer
About the Opportunity
Our client is a leading organization in the B2B media and data analytics space, focused on helping enterprise clients transform raw data into actionable insights. As they continue modernizing their data infrastructure, they are seeking an experienced Snowflake Data Engineer to lead critical efforts around data warehouse design, cloud migration, and data pipeline optimization.
This is a unique opportunity to make a direct impact on a highly visible platform powering media insights and analytics for some of the biggest brands in the industry.
Key Responsibilities
- Design and architect scalable, efficient Snowflake data warehouse solutions.
- Advance and support the ongoing platform migration from SQL Server to Snowflake.
- Build and optimize ELT/ETL pipelines and workflows using tools such as Airbyte and Python.
- Collaborate with internal teams to translate business requirements into efficient, scalable technical solutions.
- Own performance tuning and resource optimization in Snowflake.
- Mentor junior engineers and act as the team's Snowflake SME.
- Monitor and troubleshoot data flow, ingestion, and transformation issues.
- Partner closely with DevOps to ensure strong integration with Kubernetes, Docker, and AWS-based infrastructure.
Required Skills & Qualifications
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 3+ years of hands-on experience with Snowflake in production environments.
- 2+ years of experience with AWS services (EKS, VPC, ELB, IAM, etc.).
- 2+ years building data models and layers for visualization tools such as Sigma.
- Proficiency in SQL and Python for data transformation and scripting.
- Experience with data replication, cloud migration, and Snowflake architecture best practices.
- Familiarity with Kubernetes and Docker in a cloud environment.
- Knowledge of ETL/ELT tools such as Airbyte.
Preferred Qualifications
- Snowflake certification (SnowPro or higher).
- 1+ years of experience using Airbyte for data integration.
- Familiarity with data governance practices and data quality frameworks.
Why Apply?
- Work with a cutting-edge cloud data stack (Snowflake, Airbyte, AWS, Kubernetes).
- Be a hands-on contributor to a high-impact migration and modernization initiative.
- Join a collaborative, fast-moving team that values innovation and clean engineering.
Interested candidates should be based in the U.S. and authorized to work without sponsorship.