
Senior Operations & Data Engineer (Snowflake Specialist)

Job ID:

230328

Job Title:

Senior Operations & Data Engineer (Snowflake Specialist)

Work Type:

Contract

Location:

Denver, CO

Pay Range:

$54.00 - $64.00 Per Hour

Employment Type:

Hybrid

Duration: 12 months to start

Job Description
  • The client is seeking a highly specialized Senior Operations and Data Engineer to serve as the primary administrator and technical lead for our Snowflake ecosystem.
  • This role is a hybrid of platform operations and high-level data engineering, ensuring that sensitive state and federal data (FTI/CJIS) is managed within a secure, high-uptime, and cost-effective environment.
Preferred Qualifications
To be considered for this role, candidates should provide proof of the following:
  • Active Snowflake Certification
  • Background Clearance Readiness: Eligibility to pass the client's, FTI (Federal Tax Information), and CJIS (Criminal Justice Information Services) background checks.
Key Responsibilities
Platform Operations & Administration
  • Snowflake Mastery: Act as the lead administrator for Snowflake environments; manage platform uptime, vendor escalations, and patch/versioning communications.
  • Environment Provisioning: Configure Snowflake, including complex RBAC (Role-Based Access Control) and security permissions.
  • Governance & CI/CD: Implement and manage DataOps and CI/CD pipelines to automate deployments for the broader implementation team.
  • Financial Stewardship: Configure cost-management features such as Snowflake resource monitors, budgets, and consumption tracking; consult on chargeback models.
Data Engineering & Transformation
  • Pipeline Architecture: Develop robust ETL/ELT pipelines to ingest data from transactional systems (Line of Business) into the analytical Snowflake environment.
  • Analytical Modeling: Translate Data Architect visions into technical reality by building complex transformations and target schemas.
  • Quality Management: Design and deploy automated data cleansing and quality-check pipelines.
  • Performance Engineering: Optimize data flows for specific latency and frequency requirements while maintaining credit efficiency.
Primary Deliverables
  • Architectural Contributions: Design reviews, Architectural Plans, and Scope Documents.
  • Deployment Assets: New account/environment deployments, security schemas, and permission assignments.
  • Engineering Assets: Comprehensive ETL Pipeline Design Documents, Mapping Documents, and production-ready Pipelines.
  • Operational Support: Product backlog and support ticket management; performance reports.
  • Reporting: Weekly Status Reports.
#LI-Hybrid



