Snowflake Data Architect

Salary: Not Disclosed

Job Description & Details

The demand for secure, high‑performance data platforms in healthcare is soaring, and Snowflake has become the go‑to solution for modern data warehousing. Companies are racing to build scalable architectures that meet strict PHI/HIPAA regulations while delivering rapid insights. This Snowflake Data Architect role in Boston offers a unique chance to lead those initiatives on a 12‑month contract.

Job Summary

We are seeking a senior‑level Snowflake Data Architect to design, build, and optimize end‑to‑end data solutions for a healthcare organization. You will lead data modeling, ETL/ELT pipeline design, cost‑efficient warehouse sizing, and embed governance controls while collaborating with business, analytics, and engineering teams.

Top 3 Critical Skills

Skill | Why it's critical | Mastery Level
Snowflake Architecture (micro‑partitioning, zero‑copy cloning, Snowpipe) | Drives performance, scalability, and cost‑efficiency of the data warehouse | Senior
ETL/ELT Design (dbt, Informatica, Matillion, Fivetran) | Ensures reliable ingestion from diverse healthcare sources and maintains data quality | Senior
Data Modeling (Dimensional, Data Vault 2.0, relational) | Provides a robust foundation for analytics and compliance with PHI/HIPAA | Senior
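Two of the Snowflake features named above, zero‑copy cloning and Snowpipe, are worth being able to demonstrate on a whiteboard. A minimal sketch in Snowflake SQL (all object names are hypothetical):

```sql
-- Zero-copy clone: an instant, metadata-only copy for dev/test environments
CREATE DATABASE clinical_dev CLONE clinical_prod;

-- Snowpipe: continuous ingestion from files arriving in a cloud stage
CREATE PIPE raw.claims_pipe AUTO_INGEST = TRUE AS
  COPY INTO raw.claims
  FROM @raw.claims_stage
  FILE_FORMAT = (TYPE = 'JSON');
```

The clone shares the source's micro‑partitions until either side changes, so it costs no extra storage at creation time.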

Interview Preparation

  1. Explain Snowflake’s micro‑partitioning and how it impacts query performance.
    What the interviewer is looking for: Understanding of storage architecture, pruning, and cost implications.
  2. Describe how you would implement CDC (change data capture) using Snowpipe and dbt.
    What the interviewer is looking for: Practical knowledge of real‑time ingestion and transformation pipelines.
  3. Walk through the steps you take to size a Snowflake warehouse for a high‑volume healthcare dataset.
    What the interviewer is looking for: Ability to balance concurrency, compute credits, and performance tuning.
  4. How do you embed PHI/HIPAA governance into data models and pipelines?
    What the interviewer is looking for: Experience with data masking, encryption, audit trails, and role‑based access.
  5. Compare Dimensional modeling vs. Data Vault 2.0 for a large clinical data mart.
    What the interviewer is looking for: Insight into modeling trade‑offs, scalability, and auditability.
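For question 1, micro‑partition pruning on large tables can be steered with an explicit clustering key. A sketch, with hypothetical table and column names:

```sql
-- Align micro-partitions with the most common filter columns to improve pruning
ALTER TABLE curated.claims CLUSTER BY (service_date, provider_id);

-- Inspect how well the table is currently clustered on those columns
SELECT SYSTEM$CLUSTERING_INFORMATION('curated.claims', '(service_date, provider_id)');
```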
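For question 2, a common pattern is Snowpipe landing raw files while a stream‑plus‑task pipeline (or dbt incremental models) propagates changes downstream. A minimal sketch of the stream/task half, with hypothetical names:

```sql
-- Stream captures row-level changes on the landing table
CREATE STREAM raw.claims_stream ON TABLE raw.claims;

-- Task merges captured changes into the curated layer on a schedule,
-- and only runs when the stream actually has new data
CREATE TASK curated.merge_claims
  WAREHOUSE = transform_wh
  SCHEDULE = '5 MINUTE'
WHEN SYSTEM$STREAM_HAS_DATA('raw.claims_stream')
AS
  MERGE INTO curated.claims AS t
  USING raw.claims_stream AS s
    ON t.claim_id = s.claim_id
  WHEN MATCHED THEN UPDATE SET t.status = s.status, t.updated_at = s.updated_at
  WHEN NOT MATCHED THEN INSERT (claim_id, status, updated_at)
    VALUES (s.claim_id, s.status, s.updated_at);
```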
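For question 3, sizing usually starts small and relies on multi‑cluster scale‑out for concurrency rather than a larger single cluster. A sketch (warehouse name and limits are illustrative; multi‑cluster warehouses require Enterprise edition):

```sql
CREATE WAREHOUSE analytics_wh
  WAREHOUSE_SIZE    = 'MEDIUM'
  MIN_CLUSTER_COUNT = 1
  MAX_CLUSTER_COUNT = 3      -- scale out to absorb concurrency spikes
  AUTO_SUSPEND      = 60     -- seconds idle before suspending (saves credits)
  AUTO_RESUME       = TRUE;
```

Scaling up (a larger size) helps individual query latency; scaling out (more clusters) helps many concurrent users, which is the usual bottleneck for analytics workloads.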
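For question 4, dynamic data masking combined with role‑based access is the standard PHI control in Snowflake. A hedged sketch with hypothetical role, table, and column names:

```sql
-- Non-privileged roles see a masked value; an authorized role sees the real one
CREATE MASKING POLICY phi_mask AS (val STRING) RETURNS STRING ->
  CASE
    WHEN CURRENT_ROLE() IN ('PHI_READER') THEN val
    ELSE '***MASKED***'
  END;

ALTER TABLE curated.patients
  MODIFY COLUMN ssn SET MASKING POLICY phi_mask;
```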

Resume Optimization

  • Snowflake
  • Micro‑partitioning
  • Snowpipe
  • dbt
  • ETL/ELT Architecture
  • Data Vault 2.0
  • Dimensional Modeling (Kimball)
  • PHI/HIPAA Governance
  • Azure/AWS Cloud
  • Advanced SQL (window functions, stored procedures)
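The "Advanced SQL" keyword is easiest to back up with a concrete snippet. One common window‑function pattern in Snowflake SQL, selecting the latest record per patient (table and columns are hypothetical):

```sql
-- QUALIFY filters on the window function result without a subquery
SELECT *
FROM curated.patient_events
QUALIFY ROW_NUMBER() OVER (PARTITION BY patient_id ORDER BY updated_at DESC) = 1;
```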

Application Strategy

When reaching out to the recruiter, send a concise email: open with a friendly greeting, attach your updated resume, and highlight your most relevant skills up front. Name the specific strengths this role calls for, such as Snowflake architecture, ETL/ELT design with dbt, and healthcare data modeling, and reference projects where you delivered cost‑optimized, compliant data solutions.

Career Roadmap

Current Role | Typical Experience | Core Focus | Next Position
Snowflake Data Architect (12‑mo contract) | 15+ years in data warehousing, strong Snowflake expertise | End‑to‑end architecture, governance, performance tuning | Lead Data Architecture Manager
Lead Data Architecture Manager | 3–5 years leading teams, cross‑cloud strategy | Strategic roadmap, stakeholder alignment | Director of Data Engineering
Director of Data Engineering | 5+ years overseeing enterprise data platforms | Innovation, budgeting, enterprise governance | VP of Data & Analytics