Posted on: 15 October

Senior Analytics Engineer (Contract)

Company

Uniswap Foundation

The Uniswap Foundation is a nonprofit organization dedicated to supporting innovation in decentralized finance (DeFi) within the Uniswap community, focusing on grants and community development in the fintech and blockchain industries.

Remote Hiring Policy

The Uniswap Foundation supports remote work and hires from various regions, operating as a decentralized organization without specific geographic restrictions.

Job Type

Contract

Allowed Applicant Locations

Worldwide

Job Description

About the Role

As a Senior Analytics Engineer at the Uniswap Foundation, you’ll design and maintain the data infrastructure that powers Uniswap’s research, growth, and liquidity-mining programs. You’ll own the transformation layer that turns raw on-chain and off-chain data into reliable, analytics-ready models.

Your day-to-day might range from building dbt models and custom subgraphs to structuring Snowflake pipelines that unify on-chain and market data. You’ll help ingest and normalize feeds like Tardis trade data, DEX swaps, and protocol metrics, enabling advanced analytics such as slippage, markout, liquidity efficiency, and incentive ROI. Working closely with analysts and researchers, you’ll ensure the Uniswap ecosystem has accurate, consistent, and queryable data across Dune, Snowflake, and beyond.
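To make the kind of transformation concrete, here is a minimal, hypothetical dbt-style SQL sketch of a per-swap markout metric (how the market moved after each trade, relative to its execution price). The model and column names (raw_swaps, mid_prices, exec_price) are illustrative assumptions rather than the Foundation's actual schema, and the sign convention is simplified.

    -- Hypothetical dbt model: per-swap markout at a 5-minute horizon.
    -- raw_swaps and mid_prices are assumed upstream models, not a real schema.
    with swaps as (
        select
            tx_hash,
            block_time,
            pool_address,
            amount_out / nullif(amount_in, 0) as exec_price  -- price paid, in token_out per token_in
        from {{ ref('raw_swaps') }}
    )

    select
        s.tx_hash,
        s.pool_address,
        s.exec_price,
        p.mid_price as mid_price_5m,
        -- fractional markout; multiply by 10,000 for basis points;
        -- a production model would also sign this by trade direction
        (p.mid_price - s.exec_price) / nullif(s.exec_price, 0) as markout_5m
    from swaps s
    left join {{ ref('mid_prices') }} p
        on p.pool_address = s.pool_address
       and p.price_time = dateadd(minute, 5, date_trunc('minute', s.block_time))

In practice a model like this would be materialized incrementally and covered by automated tests, which is the flavor of work described below.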

What You’ll Do

  • Build & optimize data models (dbt, SQL, Python) for Uniswap, Hook protocols, and broader DEX metrics, ensuring accuracy, consistency, and performance across chains.

  • Contribute to subgraphs, indexers, and adapters (Ponder, The Graph, DefiLlama) to extend coverage and standardize schema definitions across the Uniswap ecosystem.

  • Develop & maintain pipelines that ingest and normalize data from on-chain events and off-chain APIs, transforming and loading it into Snowflake and Dune.

  • Design complex transformations to power analyses of slippage, markout, liquidity efficiency, and incentive performance, ensuring models are reproducible and transparent.

  • Collaborate & iterate: partner with Data Analysts and the Growth and Research teams to refine schemas, metrics, and dashboards, making data intuitive to query and interpret.

  • Centralize data sources: merge disparate feeds into a unified repository while delivering data where it’s needed.

  • Implement monitoring and observability: add alerts, audits, and lineage tracking to keep pipeline health visible and data quality high.

  • Champion best practices: modular code, testing, documentation, and reproducible transformations, while continuously improving our data stack (a minimal test sketch follows this list).
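As one hypothetical illustration of the testing practice above: a dbt "singular" test is simply a SQL file that fails the build if it returns any rows. The model name fct_swaps and the exec_price column are assumptions carried over from the earlier sketch, not an actual schema.

    -- tests/assert_exec_prices_valid.sql (hypothetical)
    -- dbt marks this test as failed if the query returns one or more rows.
    select
        tx_hash,
        exec_price
    from {{ ref('fct_swaps') }}
    where exec_price is null
       or exec_price <= 0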

Who You Are

  • Engineering-minded: you treat analytics like software; transformations are versioned, tested, and production-grade.

  • Hybrid-data fluent: as comfortable in dbt and SQL as in subgraphs (Ponder/The Graph) or DefiLlama adapters; you understand how to stitch together on-chain and off-chain data into coherent models.

  • Analytical by nature: you think about how data will be consumed by dashboards, analysts, or machine learning models, and you design schemas that make analysis easy and accurate.

  • Detail-obsessed: you catch data drift, missing decimals, or mismatched schemas before they surface downstream.

  • Collaborative: you enjoy translating analytical needs from Growth, Research, or Grants teams into robust data pipelines and shared models.

  • Curious and future-focused: you stay on top of emerging standards in on-chain data indexing, cloud infrastructure, and open analytics tooling (dbt, DefiLlama Adapters, Dune Spellbook).

  • DeFi-native: you understand core DEX mechanics, liquidity provisioning, and the nuances of Uniswap’s evolving hook ecosystem.

Must-Have Qualifications

  • Proven experience with dbt (or a similar framework) to build, test, and document data models.

  • Skilled across both analytical (SQL/dbt) and engineering (Python/TypeScript) layers — comfortable designing schemas, writing transformations, and building data ingestion logic end-to-end.

  • Experience building or extending indexer-style pipelines (e.g., Ponder, The Graph, DefiLlama adapters, or custom RPC-based indexers).

  • Deep understanding of on-chain data structures, DeFi protocols, and DEX workflows.

Nice-to-Haves

  • Proficiency with modern cloud platforms (e.g., BigQuery, Snowflake, AWS, GCP, or Azure) and experience with both OLTP databases (e.g., PostgreSQL) and analytical databases (e.g., ClickHouse).

  • Hands-on experience with scalable data pipelines (Airflow, Dagster, Fivetran, or equivalent), including monitoring and alerting.

  • Experience building and exposing internal/external data APIs and deploying containerized workloads with Docker and Kubernetes.

  • Advanced degree in Computer Science, Data Engineering, or a related technical field.

About Uniswap Foundation

In pursuit of a more open and fair financial system, the Uniswap Foundation supports the growth, decentralization, and sustainability of the Uniswap community. Through grants, we're driving value in five key focus areas: Protocol and Innovation, Developers, Governance, Research, and Security. Our grant-making approach is designed to maximize positive impact, bringing new contributors into our community to build new initiatives, committees, products, infrastructure, and more. To learn more about our community impact, visit uniswapfoundation.org/impact.