Applying for Data Analyst II at Brex

Data that drives
decisions. Not
dashboards.

I build the analytical infrastructure, experiments, and self-serve tools that let business teams move without waiting for an analyst in every meeting.

50+ Enterprise clients served
30+ A/B experiments run
22% Campaign ROI lift
3+ Years in analytics
Work that changed
how teams decided.
Vue.ai — Client & Product Analytics
SQL Metrics Layer Across 50+ Enterprise Accounts
Analytics Engineering

Commercial teams at Adidas, Zara, Dune London, and Crocs were pulling numbers from their own dashboards and arriving at different answers to the same questions. There was no single source of truth for engagement, conversion, or pipeline KPIs across the portfolio.

20+ KPIs standardized
50+ Enterprise accounts
0 Reporting discrepancies
Designed and maintained a SQL metrics layer that standardized KPI definitions and query logic across all client accounts. Once teams pulled from the same layer, reporting inconsistencies disappeared and commercial conversations became faster and more credible.
SQL · Data Modeling · KPI Design · Stakeholder Reporting
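The core idea of a metrics layer like the one described above can be sketched in a few lines: every KPI has exactly one canonical definition, and reports render queries from that registry instead of hand-writing their own aggregation logic. The KPI names, tables, and columns below are illustrative, not the actual Vue.ai schema.

```python
# Minimal sketch of a centralized metrics layer (illustrative schema).
# One SQL definition per KPI; every report renders from this registry.

KPI_DEFINITIONS = {
    "conversion_rate": {
        "expression": (
            "COUNT(DISTINCT order_id) * 1.0 "
            "/ NULLIF(COUNT(DISTINCT session_id), 0)"
        ),
        "source_table": "events",
    },
    "engagement_rate": {
        "expression": (
            "SUM(CASE WHEN event_type = 'click' THEN 1 ELSE 0 END) * 1.0 "
            "/ NULLIF(COUNT(*), 0)"
        ),
        "source_table": "events",
    },
}

def render_kpi_query(kpi: str) -> str:
    """Render the canonical query for a KPI, with a bound account parameter."""
    spec = KPI_DEFINITIONS[kpi]
    return (
        f"SELECT {spec['expression']} AS {kpi}\n"
        f"FROM {spec['source_table']}\n"
        "WHERE account_id = %(account_id)s"
    )
```

Because every account's report goes through `render_kpi_query`, two teams asking the same question can only ever run the same SQL, which is what eliminates definitional drift.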
Vue.ai — Experimentation Program
30+ A/B Tests Across Pricing, Promotions, Personalization
Experimentation

Retail clients wanted to know which recommendation strategies and campaign configurations were actually moving revenue, not just clicks. There was no structured experimentation program to answer that reliably.

22% Campaign ROI lift
18% Churn reduction
60% Engagement lift
Built and ran the A/B testing program end to end across hypothesis design, metric selection, significance testing, and stakeholder readout. Findings went directly to commercial leadership and changed how clients allocated spend across channels.
A/B Testing · Python · Statistical Analysis · Funnel Analysis
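The significance-testing step in a program like this typically reduces to a two-proportion z-test on conversion counts. The sketch below is a generic, stdlib-only version of that test, not the program's actual analysis code.

```python
import math

def two_proportion_ztest(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Two-sided z-test for a difference in conversion rates between
    variant A (control) and variant B (treatment)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value via the standard normal CDF, Phi(x) = 0.5*(1+erf(x/sqrt(2))).
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value
```

For example, 120 conversions out of 2,400 sessions versus 156 out of 2,400 would be tested with `two_proportion_ztest(120, 2400, 156, 2400)` and read out against a pre-registered significance threshold such as 0.05.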
Vue.ai — Data Quality
Catching a Silent Pipeline Failure Before It Reached Clients
Root Cause Analysis

A client's recommendation click-through rate dropped over two weeks. The pipeline showed green. No errors flagged. The data looked clean but the commercial signal was wrong.

<15m Time to resolution
2 Future issues caught early
0 Client-facing errors
An upstream schema change had reclassified user behavior events under a new type our ingestion layer did not recognize, so the pipeline was silently discarding behavioral signal. I traced the root cause, fixed the ingestion logic, backfilled the data, and built a statistical validation layer that monitors event-type distributions against a 7-day rolling baseline and fails loudly if any category drops more than 20%.
SQL · Python · Data Quality · Anomaly Detection
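The core of that validation layer — comparing today's event-type mix against a trailing baseline and alerting on relative drops — can be sketched as below. Event names, the baseline window, and the 20% threshold are stand-ins for whatever the production job actually uses.

```python
def check_event_distribution(
    today_counts: dict, baseline_counts: dict, drop_threshold: float = 0.20
) -> list:
    """Flag event types whose share of traffic fell more than `drop_threshold`
    relative to a trailing baseline (e.g. a 7-day rolling average).

    Both arguments map event_type -> count. Returns a list of alert strings;
    the caller "fails loudly" by raising whenever the list is non-empty.
    """
    total_today = sum(today_counts.values()) or 1
    total_base = sum(baseline_counts.values()) or 1
    alerts = []
    for event_type, base_count in baseline_counts.items():
        base_share = base_count / total_base
        today_share = today_counts.get(event_type, 0) / total_today
        if base_share > 0 and (base_share - today_share) / base_share > drop_threshold:
            alerts.append(
                f"{event_type}: share fell {base_share:.1%} -> {today_share:.1%}"
            )
    return alerts
```

Comparing shares rather than raw counts is the design choice that matters here: it catches a silently dropped category even on days when overall traffic volume looks normal.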
WPTI — Analytics Engineering
Self-Serve Dashboards That Improved Retention by 25%
Business Intelligence

Program leaders were receiving delayed, manually exported reports. By the time data reached them, intervention windows for at-risk participants had already closed.

25% Retention improvement
40% Faster reporting
120h Saved monthly
Built Tableau dashboards tracking cohort behavior across 20K+ participant records, replacing a manual export process with a structured ELT pipeline into Redshift. Program leaders could now see early risk signals in real time and act before participants dropped out.
Tableau · SQL · dbt · Airflow · Amazon Redshift
Built to solve
real problems.
Personal Project — Data Engineering
Real-Time Crypto Analytics Pipeline on AWS
Streaming Architecture

Built a production-grade streaming data pipeline that ingests live Coinbase market data, processes it in real time using Kafka and Spark, and loads analytics-ready tables into AWS for downstream reporting and trend analysis.

Pipeline Architecture
Coinbase API Live market data
Kafka Event streaming
Spark Stream processing
AWS S3 Data lake
Airflow Orchestration
Analytics Reporting layer
Real-time Streaming ingestion
End-to-end Owned architecture
AWS Cloud deployed
Designed and built the entire pipeline from scratch, covering event ingestion, stream processing, data lake storage, and orchestration. It mirrors the high-volume transactional data work at the heart of fintech analytics at Brex's scale.
Apache Kafka · Apache Spark · AWS S3 · EC2 · Airflow · Python
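The ingestion step of a pipeline like this usually normalizes each raw Coinbase WebSocket `ticker` message into a flat record before publishing it to Kafka. A minimal sketch, assuming the public ticker-channel field names (`product_id`, `price`, `volume_24h`, `time`) and an illustrative topic name:

```python
import json

def normalize_ticker(raw_message: str) -> dict:
    """Flatten a Coinbase WebSocket `ticker` message into the record shape
    published downstream. Field names follow Coinbase's public ticker
    channel; treat this as illustrative, not the exact production schema."""
    msg = json.loads(raw_message)
    return {
        "product_id": msg["product_id"],       # e.g. "BTC-USD"
        "price": float(msg["price"]),          # prices arrive as strings
        "volume_24h": float(msg["volume_24h"]),
        "ts": msg["time"],                     # ISO-8601 timestamp
    }

# In the real pipeline the normalized record is serialized and published,
# e.g. with a Kafka producer (topic name hypothetical):
#   producer.produce("crypto.ticker", json.dumps(record).encode("utf-8"))
```

Keeping normalization as a pure function makes the Kafka/Spark plumbing trivially testable without a broker.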
Personal Project — Applied AI
Governed AI Analytics Agent
LLM + Analytics

Non-technical stakeholders needed a way to query live data without opening a ticket, waiting for an analyst, or learning SQL. The challenge was making that safe: no write access, no schema hallucinations, no unvalidated results reaching the user.

100% Read-only enforced
0 Schema hallucinations
Live Production queries
The agent translates natural language into validated SQL, enforces SELECT-only policies, grounds every query against an allowed schema, and shows transparent query evidence before returning results. Built with OpenAI API, Guardrails, and custom orchestration. Directly relevant to Brex's requirement for experience applying LLM-based tools to accelerate analyses and build self-service data tools.
Python · OpenAI API · SQL · Guardrails · LLM Orchestration
Tools I work
with every day.
Query & Analysis
SQL · Python · PySpark · Pandas · NumPy · scikit-learn
BI & Visualization
Tableau · Looker Studio · Power BI · Excel
Data Infrastructure
BigQuery · Snowflake · Redshift · dbt · Airflow · Kafka · Spark · Databricks
Cloud & AI
AWS S3 · EC2 · GCP · OpenAI API · LangChain · Guardrails

Ready to talk
about the role.

Excited about what Brex is building and how data can make it faster.