Snowflake Engineer
Role: Snowflake Engineer (W2)
Visa: US citizen required
Work type: Fully remote
Top Skills
Snowflake
Azure
Strong communication skills
This is a completely separate team, created to expedite the monetization of Paychex data. The team, including this hire, will consist of only three people, so they will work end to end, from stakeholder engagement to engineering. They chose Snowflake as their enterprise data warehouse about two years ago and use Azure as their cloud platform.
This person needs strong communication skills, will work on a W2 basis, and reports directly to the VP of Data and AI. They will help design solutions for monetizing Paychex data and will pull from multiple on-prem data sources. Paychex currently runs roughly 700,000 databases, since every client gets its own, making for a complex environment totaling about 9 petabytes.
Key Responsibilities
Snowflake Production Environment & Azure Integration
Design and implement a production-grade Snowflake architecture on Azure, including:
o Role-based access control (RBAC) and least-privilege design
o Warehouses, resource monitors, and cost governance
o Network policies and IP allow/deny strategies
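To give a flavor of the least-privilege RBAC work above, here is a minimal sketch that generates the kind of Snowflake DDL such a design produces. All role, warehouse, database, and schema names are hypothetical examples, not Paychex objects.

```python
# Sketch: generate least-privilege Snowflake RBAC grants.
# Role/warehouse/database/schema names below are hypothetical.

def rbac_grants(role: str, warehouse: str, database: str, schema: str,
                read_only: bool = True) -> list[str]:
    """Return the DDL statements for a least-privilege functional role."""
    stmts = [
        f"CREATE ROLE IF NOT EXISTS {role};",
        f"GRANT USAGE ON WAREHOUSE {warehouse} TO ROLE {role};",
        f"GRANT USAGE ON DATABASE {database} TO ROLE {role};",
        f"GRANT USAGE ON SCHEMA {database}.{schema} TO ROLE {role};",
        f"GRANT SELECT ON ALL TABLES IN SCHEMA {database}.{schema} TO ROLE {role};",
    ]
    if not read_only:
        # Writer roles additionally get DML; still no ownership or DDL rights.
        stmts.append(
            f"GRANT INSERT, UPDATE, DELETE ON ALL TABLES IN SCHEMA "
            f"{database}.{schema} TO ROLE {role};"
        )
    return stmts

for stmt in rbac_grants("ANALYST_RO", "ANALYTICS_WH", "PAYCHEX_DW", "MARTS"):
    print(stmt)
```

Generating grants from code rather than hand-writing them keeps roles auditable and repeatable across environments.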
Configure and manage Snowflake Storage Integrations with Azure Data Lake Storage (ADLS Gen2) and/or Azure Blob Storage.
Partner with cloud and security teams to implement:
o Private connectivity (Azure Private Link / Snowflake Private Endpoint)
o Secure authentication (managed identities, service principals)
o Secrets and key management (Azure Key Vault)
Migration from Non-Prod to Prod
Lead the migration of Snowflake objects and data pipelines from non-production to production, including:
o Databases, schemas, views, tasks, streams, pipes, and stages
o External stages backed by Azure Storage
Define migration strategies, validation checks, cutover plans, and rollback procedures.
Ensure production readiness through performance testing, security validation, and operational documentation.
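The validation-check step above can be sketched as a simple reconciliation between non-prod and prod: compare per-table row counts and surface any mismatch before cutover. Table names and counts are hypothetical.

```python
# Sketch: post-migration validation comparing source (non-prod) and
# target (prod) row counts. Table names and counts are made up.

def validate_migration(source_counts: dict[str, int],
                       target_counts: dict[str, int]) -> list[str]:
    """Return human-readable validation failures (empty list = pass)."""
    failures = []
    for table, expected in source_counts.items():
        actual = target_counts.get(table)
        if actual is None:
            failures.append(f"{table}: missing in target")
        elif actual != expected:
            failures.append(f"{table}: expected {expected} rows, got {actual}")
    return failures

src = {"DIM_CLIENT": 700_000, "FCT_PAYROLL": 1_250_000}
tgt = {"DIM_CLIENT": 700_000, "FCT_PAYROLL": 1_249_990}
problems = validate_migration(src, tgt)
# A non-empty result would trigger the rollback procedure instead of cutover.
```

In practice the counts would come from `SELECT COUNT(*)` (or checksums) run against both accounts; the decision logic stays the same.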
CI/CD & Automation
Build and maintain CI/CD pipelines for Snowflake and data assets using tools such as:
o GitHub Actions or Azure DevOps
o dbt, SQL, Python, and Infrastructure-as-Code (Terraform)
Automate environment promotion (dev → test → prod) with approval gates, testing, and auditability.
Implement policy-as-code and guardrails for consistent deployments.
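The promotion guardrails above boil down to a small policy that a GitHub Actions or Azure DevOps pipeline can enforce: promote only one environment forward, only with passing tests, and only with manual approval before prod. The environment names and the exact gating rule here are illustrative assumptions.

```python
# Sketch: environment-promotion policy with approval gates.
# The ordering and the "approval only before prod" rule are assumptions.

ENVIRONMENTS = ["dev", "test", "prod"]

def can_promote(current: str, target: str, *,
                approved: bool, tests_passed: bool) -> bool:
    """Allow promotion only one step forward, with tests and an approval gate."""
    if current not in ENVIRONMENTS or target not in ENVIRONMENTS:
        raise ValueError("unknown environment")
    one_step = ENVIRONMENTS.index(target) == ENVIRONMENTS.index(current) + 1
    gate = approved if target == "prod" else True  # manual gate before prod only
    return one_step and tests_passed and gate
```

Encoding the rule as code (rather than tribal knowledge) is what makes the promotion path auditable.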
Data Quality & Reliability
Help design and implement a data quality program using tools such as Great Expectations or Soda.
Define and enforce data quality checks for:
o Schema changes
o Nulls, ranges, and referential integrity
o Freshness and timeliness
Integrate data quality testing into CI/CD pipelines and alerting workflows.
Partner with data owners to define quality SLAs and remediation processes.
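The check categories above (nulls, ranges, freshness) are exactly what a Great Expectations or Soda suite encodes declaratively; the sketch below shows the same logic in plain Python for illustration. Column names, the payroll range, and the 24-hour freshness SLA are hypothetical.

```python
# Sketch: null, range, and freshness checks of the kind a data quality
# suite would encode. Columns, ranges, and the SLA are hypothetical.
from datetime import datetime, timedelta, timezone

def check_rows(rows, not_null, ranges, ts_col, max_age):
    """Return a list of failure messages for the given quality rules."""
    failures = []
    now = datetime.now(timezone.utc)
    for i, row in enumerate(rows):
        for col in not_null:                      # null checks
            if row.get(col) is None:
                failures.append(f"row {i}: {col} is null")
        for col, (lo, hi) in ranges.items():      # range checks
            v = row.get(col)
            if v is not None and not (lo <= v <= hi):
                failures.append(f"row {i}: {col}={v} outside [{lo}, {hi}]")
        if now - row[ts_col] > max_age:           # freshness check
            failures.append(f"row {i}: stale (older than {max_age})")
    return failures

now = datetime.now(timezone.utc)
rows = [
    {"client_id": 1, "net_pay": 2500.0, "loaded_at": now - timedelta(hours=1)},
    {"client_id": None, "net_pay": -10.0, "loaded_at": now - timedelta(days=3)},
]
failures = check_rows(rows, not_null=["client_id"],
                      ranges={"net_pay": (0, 1_000_000)},
                      ts_col="loaded_at", max_age=timedelta(hours=24))
```

Wiring checks like these into the CI/CD pipeline turns quality SLAs into blocking, alertable gates rather than after-the-fact reports.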
Apply to this job