Position: Snowflake and DW Developer
Posted: 22-Sep-2025
Location: Reading, Pennsylvania
Category: Data Science and Analytics
Remote Friendly: Hybrid
Work Type: Contract
Reference: 226854

Compensation: Competitive; Open to negotiation based on experience

Hybrid Details: 3 days/week onsite
Duration: 14 months to start 

Job Description:

  • Design, implement, and optimize efficient ETL processes to transform raw data into actionable insights.
  • Develop and maintain robust data warehouse solutions, including the implementation of star and snowflake schemas.
  • Establish and manage reliable data pipelines to ensure timely data availability.
  • Create modular, maintainable, and scalable dbt workflows for advanced data transformations.
  • Leverage dbt testing, documentation, snapshotting, and Change Data Capture (CDC) for incremental data refresh (see the dbt sketch after this list).
  • Implement and manage Type 2 (slowly changing dimension) data modelling techniques for historical data.
  • Develop reusable macros and packages, drawing on dbt packages and Python libraries.
  • Optimize complex SQL queries and leverage Snowflake’s performance-enhancing features such as Streams, Time Travel, partitioning, and clustering (see the Snowflake SQL sketch after this list).
  • Orchestrate data pipelines for both batch and near real-time data refresh scenarios.
  • Write and optimize Snowflake SQL queries and stored procedures for seamless data transformation and integration.
  • Ensure compliance with data governance policies and implement security controls for sensitive data.
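
To illustrate the dbt portion of these responsibilities, here is a minimal sketch of an incremental (CDC-style) model and a Type 2 snapshot. It is an assumption-laden example, not part of this posting: model, source, and column names (raw.orders, raw.customers, order_id, customer_id, updated_at) are hypothetical placeholders.

    -- models/fct_orders.sql: incremental model refreshed from changed rows only (hypothetical names)
    {{ config(materialized='incremental', unique_key='order_id', incremental_strategy='merge') }}

    select order_id, customer_id, order_status, order_total, updated_at
    from {{ source('raw', 'orders') }}
    {% if is_incremental() %}
      -- on incremental runs, pull only rows changed since the last load
      where updated_at > (select max(updated_at) from {{ this }})
    {% endif %}

    -- snapshots/customers_snapshot.sql: Type 2 (SCD2) history kept by a dbt snapshot
    {% snapshot customers_snapshot %}
    {{ config(target_schema='snapshots', unique_key='customer_id',
              strategy='timestamp', updated_at='updated_at') }}
    select * from {{ source('raw', 'customers') }}
    {% endsnapshot %}

dbt run, dbt snapshot, and dbt test execute the model, maintain the history table (dbt adds dbt_valid_from / dbt_valid_to columns), and run the declared tests.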
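
A comparable Snowflake SQL sketch of the Streams, Tasks, Time Travel, and clustering features named above; table, warehouse, and column names (raw.orders, analytics.fct_orders, transform_wh) are again hypothetical.

    -- Capture row-level changes (CDC) on a source table
    create or replace stream raw_orders_stream on table raw.orders;

    -- Near real-time refresh: a scheduled task merges pending changes into the target
    create or replace task merge_orders_task
      warehouse = transform_wh
      schedule = '5 minute'
    when system$stream_has_data('RAW_ORDERS_STREAM')
    as
      merge into analytics.fct_orders t
      using (select * from raw_orders_stream where metadata$action = 'INSERT') s
        on t.order_id = s.order_id
      when matched then update set t.order_status = s.order_status, t.updated_at = s.updated_at
      when not matched then insert (order_id, order_status, updated_at)
        values (s.order_id, s.order_status, s.updated_at);

    alter task merge_orders_task resume;

    -- Clustering prunes scans on large tables; Time Travel queries a past state
    alter table analytics.fct_orders cluster by (to_date(updated_at));
    select * from analytics.fct_orders at (offset => -60*60);

Consuming the stream inside the MERGE advances its offset, so each task run sees only the rows that changed since the previous run.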

What we need from you:
  • Bachelor’s degree in Computer Science, Information Systems, or a related field.
  • 3–5 years of experience in data warehouse development and with ETL tools (e.g., dbt, SSIS, Informatica, or Azure Data Factory).
  • 1–2 years of experience with dbt and Snowflake.
  • Proficiency in SQL and PL/SQL for data querying and optimization.
  • Familiarity with Python for enhancing dbt pipelines.
  • Strong analytical, problem-solving, and communication skills.

Additional knowledge and/or experience desired:
  • Hands-on experience with ETL tools such as dbt, SSIS, Informatica, or Azure Data Factory.
  • Knowledge of Snowflake, including query writing and data integration.
  • Familiarity with cloud data platforms such as Azure Synapse, Amazon Redshift, or Snowflake.
  • Experience with Agile methodologies.
  • Experience with CI/CD and infrastructure tooling such as Jenkins, Docker, or Terraform.

#LI-Hybrid

Talent Groups is an equal opportunity employer. Our goal is to promote an environment that helps our employees and clients appreciate the benefits that diversity provides.


Recruiter Name: Soujanya Gonea

Recruiter Email:  soujanya.g@talentgroups.com
