Data Engineer with 4 years building reliable data pipelines, warehouses, and analytics infrastructure at scale using Python, Airflow, dbt, and Snowflake. I bridge the gap between raw data and business decisions, working closely with analytics, ML, and product teams.
I started in analytics at a mid-size e-commerce company and kept getting pulled into the infrastructure side — fixing broken pipelines, rebuilding unreliable ingestion jobs, and wondering why the data was always wrong. Eventually I made it official. I've worked across retail, fintech, and SaaS, and I've learned that good data engineering is mostly about trust — making sure downstream teams can rely on what you ship.
Currently building PipelineKit and exploring real-time streaming with Flink and Kafka.
I write pipelines like I write code — modular, tested, and documented. If an analyst can't trust the data, the pipeline doesn't matter. Observability and data quality checks are first-class citizens, not afterthoughts.
Languages
Data engineering
Data warehousing
Databases
Tools & DevOps
Data quality
Data & ML
2 skillsCLI tool for scaffolding production-ready Apache Airflow DAGs from YAML specs, with built-in retry logic, SLA alerting, and data quality checks via Great Expectations.
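A minimal sketch of the scaffolding idea behind PipelineKit (all spec fields and names here are illustrative assumptions, not the tool's real format): a parsed YAML spec drives a template that emits an Airflow DAG file with retries and an SLA already wired in.

```python
# Hypothetical sketch: render an Airflow DAG file from a parsed YAML spec.
# Field names (dag_id, schedule, retries, sla_minutes, tasks) are invented
# for illustration; they are not PipelineKit's actual spec format.

DAG_TEMPLATE = """\
from datetime import timedelta
import pendulum
from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="{dag_id}",
    schedule="{schedule}",
    start_date=pendulum.datetime(2024, 1, 1, tz="UTC"),
    default_args={{
        "retries": {retries},
        "retry_delay": timedelta(minutes=5),
        "sla": timedelta(minutes={sla_minutes}),
    }},
    catchup=False,
) as dag:
{task_lines}
"""

TASK_TEMPLATE = '    {name} = BashOperator(task_id="{name}", bash_command="{command}")'


def render_dag(spec: dict) -> str:
    """Turn a parsed spec (as loaded from YAML) into Airflow DAG source."""
    task_lines = "\n".join(
        TASK_TEMPLATE.format(name=t["name"], command=t["command"])
        for t in spec["tasks"]
    )
    return DAG_TEMPLATE.format(
        dag_id=spec["dag_id"],
        schedule=spec["schedule"],
        retries=spec.get("retries", 2),
        sla_minutes=spec.get("sla_minutes", 60),
        task_lines=task_lines,
    )


spec = {
    "dag_id": "orders_daily",
    "schedule": "@daily",
    "retries": 3,
    "sla_minutes": 90,
    "tasks": [{"name": "extract", "command": "python extract.py"}],
}
print(render_dag(spec))
```

Generating code from a declarative spec keeps the retry and SLA conventions in one template instead of copy-pasted across dozens of DAG files.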
Reference architecture for real-time financial event processing using Kafka and Flink, with a live React dashboard.
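The core computation in that architecture is windowed aggregation over a keyed event stream. A dependency-free sketch of the idea (a one-minute tumbling-window sum, with the Kafka consumer and Flink runtime stripped away and invented event fields):

```python
from collections import defaultdict


def tumbling_window_totals(events, window_seconds=60):
    """Group (timestamp, account, amount) events into fixed-size windows
    and sum amounts per (window, account) — the result a Flink keyed
    tumbling window would produce, minus the streaming runtime."""
    totals = defaultdict(float)
    for ts, account, amount in events:
        window_start = ts - (ts % window_seconds)  # align to window boundary
        totals[(window_start, account)] += amount
    return dict(totals)


events = [
    (100, "acct-1", 25.0),
    (110, "acct-1", 10.0),  # same 60s window as the first event
    (185, "acct-1", 5.0),   # falls in a later window
    (140, "acct-2", 7.5),
]
print(tumbling_window_totals(events))
# {(60, 'acct-1'): 35.0, (180, 'acct-1'): 5.0, (120, 'acct-2'): 7.5}
```

In the real pipeline, Flink handles what this sketch ignores: out-of-order events, watermarks, and incremental emission as windows close.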
A collection of dbt macros for automated audit logging, freshness checks, and row-count reconciliation across Snowflake and BigQuery models.
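The macros themselves are Jinja/SQL, but the reconciliation logic they implement is easy to sketch in Python (table names and tolerance are illustrative): compare per-table row counts from the two warehouses and flag anything missing or diverging beyond a tolerance.

```python
def reconcile_row_counts(source_counts, target_counts, tolerance=0.0):
    """Compare per-table row counts between two warehouses.

    Returns (table, source_count, target_count) tuples for every table
    that is missing on one side or whose counts diverge by more than
    `tolerance` (a fraction of the source count)."""
    mismatches = []
    for table in sorted(set(source_counts) | set(target_counts)):
        src = source_counts.get(table)
        tgt = target_counts.get(table)
        if src is None or tgt is None:          # table missing on one side
            mismatches.append((table, src, tgt))
        elif src == 0:
            if tgt != 0:
                mismatches.append((table, src, tgt))
        elif abs(src - tgt) / src > tolerance:  # relative divergence
            mismatches.append((table, src, tgt))
    return mismatches


snowflake = {"orders": 1_000_000, "customers": 52_000, "refunds": 8_100}
bigquery = {"orders": 1_000_000, "customers": 51_200, "refunds": 8_100}
print(reconcile_row_counts(snowflake, bigquery, tolerance=0.01))
# [('customers', 52000, 51200)]
```

A relative tolerance matters in practice: a fixed row-count threshold that is noise on a billion-row fact table would hide real data loss on a small dimension table.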
Meridian Analytics — Seattle, WA
Jan 2023 – Present (3 yr 3 mo)
Senior data engineer on a 6-person data platform team at a Series B fintech company. Own core ingestion pipelines, the Snowflake data warehouse, and data quality infrastructure used by 30+ analysts and 3 ML engineers.
Novu Commerce — Portland, OR
Aug 2020 – Dec 2022 (2 yr 5 mo)
Data engineer on a two-person data team at a mid-size e-commerce company. Built and maintained pipelines for marketing, finance, and operations analytics.
University of Washington — Seattle
Udacity
Making Data Teams Actually Trust Their Data
I'm currently open to senior data engineering roles at data-driven companies. If you're looking for someone who can build reliable pipelines, improve data quality, and work closely with analytics and ML teams — feel free to reach out.