Portfolio
  • About
  • Skills
  • Projects
  • Experience
  • Education
  • Contact
Ravi Shrestha



Available for work
Seattle, WA

4

Yrs Experience

Engineering Portfolio


Data Engineer with 4 years building reliable data pipelines, warehouses, and analytics infrastructure at scale. I bridge the gap between raw data and business decisions — working closely with analytics, ML, and product teams.

  • Designed and maintained a multi-source ELT pipeline ingesting 15GB+ of daily event data into Snowflake, cutting analyst query times from 40s to under 3s
  • Migrated a legacy cron-based ETL system to Airflow DAGs with full observability, reducing pipeline failures by 80% and eliminating on-call incidents
  • Built PipelineKit solo, an open-source CLI for scaffolding production-ready Airflow DAGs with built-in retry logic, alerting, and data quality checks
GitHub · LinkedIn · Twitter

About

"I write pipelines like I write code: modular, tested, and documented. If an analyst can't trust the data, the pipeline doesn't matter. Observability and data quality checks are first-class citizens, not afterthoughts."

Data Engineer with 4 years building and maintaining data infrastructure at scale using Python, Airflow, dbt, and Snowflake.

I started in analytics at a mid-size e-commerce company and kept getting pulled into the infrastructure side — fixing broken pipelines, rebuilding unreliable ingestion jobs, and wondering why the data was always wrong. Eventually I made it official. I've worked across retail, fintech, and SaaS, and I've learned that good data engineering is mostly about trust — making sure downstream teams can rely on what you ship.

Currently building PipelineKit and exploring real-time streaming with Flink and Kafka.

Skills

Languages

Python, SQL, Scala

Data Engineering

Apache Airflow, Apache Spark, Apache Kafka, dbt (data build tool)

Data Warehousing

Snowflake, BigQuery, Delta Lake

Databases

PostgreSQL, Redis

Tools & DevOps

AWS (S3, Glue, Lambda, EMR), Docker, Terraform, GitHub Actions

Data Quality

Great Expectations

Data & ML

Pandas, NumPy
Projects


PipelineKit — Airflow DAG Scaffolding CLI

CLI tool for scaffolding production-ready Apache Airflow DAGs from YAML specs, with built-in retry logic, SLA alerting, and data quality checks via Great Expectations.

Python, Apache Airflow, Great Expectations, Click (+1)
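The core idea behind PipelineKit, as described above, is scaffolding: turning a short spec into Airflow DAG source with retries, SLA alerting, and quality checks already wired in. The sketch below illustrates that spec-to-source step in plain Python; it is a hypothetical illustration, not PipelineKit's real spec format or API, and every name in it (`DagSpec`, `render_dag`, the field names) is an assumption. A dataclass stands in for the parsed YAML file.

```python
# Hypothetical sketch of the scaffolding step a tool like PipelineKit might
# perform. All names below are assumptions, not PipelineKit's actual API.

from dataclasses import dataclass, field

@dataclass
class DagSpec:
    """Stand-in for a parsed YAML pipeline spec."""
    dag_id: str
    schedule: str                       # cron expression
    retries: int = 3
    retry_delay_min: int = 5
    sla_min: int = 60
    quality_checks: list = field(default_factory=list)

def render_dag(spec: DagSpec) -> str:
    """Emit Airflow DAG source as a string -- the 'scaffold'."""
    checks = ", ".join(repr(c) for c in spec.quality_checks)
    return (
        "from datetime import timedelta\n"
        "from airflow import DAG\n\n"
        "default_args = {\n"
        f"    'retries': {spec.retries},\n"
        f"    'retry_delay': timedelta(minutes={spec.retry_delay_min}),\n"
        f"    'sla': timedelta(minutes={spec.sla_min}),\n"
        "}\n\n"
        f"dag = DAG('{spec.dag_id}', schedule='{spec.schedule}',\n"
        "          default_args=default_args)\n"
        f"QUALITY_CHECKS = [{checks}]\n"
    )

spec = DagSpec(dag_id="events_elt", schedule="0 * * * *",
               quality_checks=["row_count_nonzero"])
src = render_dag(spec)
```

The payoff of this pattern is that retry and SLA policy live in one template, so every generated DAG gets them by default instead of each author re-implementing them.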

StreamLedger — Real-Time Financial Event Tracker

Reference architecture for real-time financial event processing using Kafka and Flink, with a live React dashboard.

Python, Apache Kafka, Apache Flink, PostgreSQL (+2)
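The central operation in a real-time tracker like StreamLedger is keyed windowed aggregation over an event stream, which Flink provides natively. The plain-Python sketch below shows the tumbling-window idea on its own, without Kafka or Flink; the event fields (`account`, `amount`, `ts`) and the window size are assumptions for illustration, not StreamLedger's actual schema.

```python
# Plain-Python illustration of a keyed tumbling-window aggregation, the kind
# of computation a Flink job performs on a financial event stream. Event
# fields and window size here are illustrative assumptions.

from collections import defaultdict

def tumbling_window_sums(events, window_seconds=60):
    """Sum event amounts per (account, window-start) bucket.

    events: iterable of dicts with 'account', 'amount', 'ts' (epoch seconds).
    Returns {(account, window_start): total_amount}.
    """
    sums = defaultdict(float)
    for e in events:
        # Floor the timestamp to the start of its window.
        window_start = (e["ts"] // window_seconds) * window_seconds
        sums[(e["account"], window_start)] += e["amount"]
    return dict(sums)

events = [
    {"account": "a1", "amount": 10.0, "ts": 5},
    {"account": "a1", "amount": 2.5, "ts": 59},
    {"account": "a1", "amount": 1.0, "ts": 61},   # falls in the next window
]
totals = tumbling_window_sums(events)
# -> {("a1", 0): 12.5, ("a1", 60): 1.0}
```

In the streaming version the same logic runs incrementally as events arrive, with watermarks handling late data, rather than over a finished list.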

dbt-audit-macros

A collection of dbt macros for automated audit logging, freshness checks, and row-count reconciliation across Snowflake and BigQuery models.

dbt, SQL, Snowflake, BigQuery (+1)
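Row-count reconciliation, one of the checks dbt-audit-macros provides, boils down to comparing counts between a source and the model built from it and flagging any drift beyond a tolerance. The sketch below expresses that logic in plain Python for clarity; the real project implements it as dbt/Jinja macros that compile to SQL, and the function and argument names here are assumptions.

```python
# Plain-Python illustration of row-count reconciliation; dbt-audit-macros
# does this in SQL via Jinja macros. Names here are illustrative assumptions.

def reconcile_row_counts(source_counts: dict, model_counts: dict,
                         tolerance: float = 0.0) -> list:
    """Return (table, source_count, model_count) tuples for mismatches.

    tolerance is the allowed relative difference, e.g. 0.01 for 1%.
    """
    mismatches = []
    for table, src in source_counts.items():
        dst = model_counts.get(table, 0)
        if abs(src - dst) > src * tolerance:
            mismatches.append((table, src, dst))
    return mismatches

bad = reconcile_row_counts(
    {"orders": 1000, "payments": 500},
    {"orders": 1000, "payments": 490},
)
# payments is off by 10 rows, so it is reported with zero tolerance
```

Running this on every build turns "the dashboard looks wrong" into a named table and a concrete delta.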

Experience

Senior Data Engineer

Meridian Analytics — Seattle, WA

Jan 2023 – Present (3y 3m)
Current · Full-Time · Seattle, WA

Senior data engineer on a 6-person data platform team at a Series B fintech company. Own core ingestion pipelines, the Snowflake data warehouse, and data quality infrastructure used by 30+ analysts and 3 ML engineers.

  • Redesigned the company's core ELT architecture from ad-hoc Python scripts to a fully orchestrated Airflow + dbt stack, reducing data freshness SLA breaches by 90%.
  • Built a data contract framework with Great Expectations that runs on every pipeline run, catching schema drift and volume anomalies before they reach downstream dashboards.
  • Led a cross-functional initiative with analytics and product to define and document 80+ core business metrics in dbt, creating a single source of truth for company-wide reporting.
Python, Apache Airflow, dbt, Snowflake, Kafka, Great Expectations, Terraform
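The data contract framework described above guards against two distinct failure modes: schema drift (columns appearing or disappearing) and volume anomalies (row counts far from baseline). The actual framework uses Great Expectations; the sketch below is a hand-rolled stand-in that only illustrates those two checks, and every name in it is an assumption.

```python
# Illustrative stand-in for the two data-contract checks described above.
# The real framework uses Great Expectations; names here are assumptions.

def check_contract(rows: list, expected_columns: set,
                   expected_rows: int, volume_tolerance: float = 0.5) -> list:
    """Return human-readable contract violations (empty list = pass)."""
    violations = []
    # Schema drift: the column set must match the contract exactly.
    if rows:
        actual = set(rows[0].keys())
        if actual != expected_columns:
            missing = expected_columns - actual
            extra = actual - expected_columns
            violations.append(f"schema drift: missing={sorted(missing)}, "
                              f"extra={sorted(extra)}")
    # Volume anomaly: row count must be within tolerance of the baseline.
    lo = expected_rows * (1 - volume_tolerance)
    hi = expected_rows * (1 + volume_tolerance)
    if not lo <= len(rows) <= hi:
        violations.append(f"volume anomaly: got {len(rows)}, "
                          f"expected ~{expected_rows}")
    return violations

violations = check_contract(
    [{"id": 1, "amount": 10.0}],          # payload is missing 'currency'
    expected_columns={"id", "amount", "currency"},
    expected_rows=1,
)
```

Running checks like these on every pipeline run is what lets failures surface at ingestion time instead of in a dashboard.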

Data Engineer

Novu Commerce — Portland, OR

Aug 2020 – Dec 2022 (2y 5m)
Full-Time · Portland, OR

Data engineer on a two-person data team at a mid-size e-commerce company. Built and maintained pipelines for marketing, finance, and operations analytics.

  • Migrated 14 legacy cron-based ETL jobs to Apache Airflow DAGs with proper retry logic, SLA monitoring, and Slack alerting, reducing weekly pipeline failures from 8-10 incidents to near-zero.
  • Implemented a Snowflake data warehouse from scratch, consolidating data from Shopify, Stripe, Google Ads, and an internal PostgreSQL database into a unified analytics layer.
  • Built a near-real-time inventory sync pipeline using AWS Lambda and DynamoDB Streams that reduced inventory discrepancy reports by 70%.
Python, Apache Airflow, Snowflake, dbt, AWS Lambda, DynamoDB, PostgreSQL
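The inventory sync pipeline in the last bullet follows a common shape: DynamoDB Streams invokes a Lambda with a batch of change records, and the handler extracts the changed items and pushes them downstream. The sketch below follows the documented DynamoDB Streams record layout (`Records` / `eventName` / `dynamodb.NewImage` with type-descriptor values), but the sync target and the `sku`/`quantity` field names are assumptions, not Novu Commerce's actual code.

```python
# Sketch of a DynamoDB Streams -> Lambda inventory-sync handler. The record
# layout matches the documented Streams format; item field names ('sku',
# 'quantity') and the downstream behavior are illustrative assumptions.

def extract_inventory_updates(event: dict) -> list:
    """Pull (sku, quantity) pairs from INSERT/MODIFY stream records."""
    updates = []
    for record in event.get("Records", []):
        if record.get("eventName") not in ("INSERT", "MODIFY"):
            continue  # ignore REMOVE events
        image = record["dynamodb"].get("NewImage", {})
        # DynamoDB JSON wraps each value in a type descriptor: {"S": ...}, {"N": ...}
        sku = image["sku"]["S"]
        qty = int(image["quantity"]["N"])
        updates.append((sku, qty))
    return updates

def handler(event, context):
    """Lambda entry point: apply each parsed update downstream."""
    updates = extract_inventory_updates(event)
    # A real handler would write to the analytics store here;
    # this sketch just returns the parsed updates.
    return {"updated": len(updates), "items": updates}
```

Because Streams delivers changes in near real time, this design avoids the staleness of a periodic full-table export, which is what makes the discrepancy reduction possible.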

Academic Background

Education

01

B.S. Information Systems

2016 – 2020 (3 yr 10 mo)

University of Washington — Seattle

02

Data Engineering Nanodegree

2020 – 2021 (5 mo)

Udacity

Contact

Available for Work

Making Data Teams Actually Trust Their Data

I'm currently open to senior data engineering roles at data-driven companies. If you're looking for someone who can build reliable pipelines, improve data quality, and work closely with analytics and ML teams — feel free to reach out.

Email
ravishrestha2057@gmail.com
Location
Seattle, WA

Find me online

Made with SerisLab