Data Engineering Expert
What you’ll do
Mercor is partnering with leading AI labs to advance frontier agent evaluations in data engineering. As a Data Engineering Expert, you’ll build long-horizon pipeline tasks that mirror the work you already do, each paired with a deterministic rubric that grades agent performance against verifiable ground truth. Every task must have a checkable answer: no open-ended essays, no subjective judgment calls.
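To make “deterministic rubric against verifiable ground truth” concrete, here is a minimal sketch of one style of check: a grader that compares an agent-produced CSV table to a known-correct one. All names and the exact-match rule are illustrative assumptions, not a prescribed implementation:

```python
import csv
import io

def grade_output(agent_csv: str, truth_csv: str) -> bool:
    """Deterministic pass/fail: the agent's output table must match the
    ground-truth table exactly -- same header, same values -- with row
    order ignored. No partial credit, no judgment calls."""
    def normalize(text: str):
        reader = csv.reader(io.StringIO(text))
        header, *body = list(reader)
        return header, sorted(tuple(row) for row in body)

    return normalize(agent_csv) == normalize(truth_csv)

truth = "id,total\n1,10\n2,20\n"
good  = "id,total\n2,20\n1,10\n"  # same rows, different order: passes
bad   = "id,total\n1,10\n2,21\n"  # one wrong value: fails
```

The point is that a reviewer (or a script) running the rubric twice on the same output always gets the same verdict.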
Expect to build scenarios across:
- Pipelines: ETL/ELT and dbt models that produce a specified output table, incremental logic with defined watermark behavior
- Orchestration and quality: Airflow/Dagster DAGs that pass a test suite, data quality tests with known pass/fail cases
- Warehouse design: schemas matching a defined contract, performance targets tied to a measured query-time budget
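As one example of what “incremental logic with defined watermark behavior” means, a task might pin down exactly which rows an incremental run picks up and how the watermark advances. A hedged sketch, where the names and the strict greater-than rule are assumptions chosen for illustration:

```python
def incremental_load(rows, watermark):
    """Return (new_rows, new_watermark). Only rows strictly newer than
    the stored watermark are processed; rows at or before it are skipped.
    The new watermark is the max updated_at seen, or unchanged if no
    rows qualified."""
    new_rows = [r for r in rows if r["updated_at"] > watermark]
    new_watermark = max(
        (r["updated_at"] for r in new_rows), default=watermark
    )
    return new_rows, new_watermark
```

A rubric built on this would state the tie-breaking rule (strict `>` vs `>=`) up front, so an agent’s pipeline can be graded against it mechanically.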
These scenarios are challenging and demand long, focused work sessions.
Who we’re looking for
- BS or MS in Computer Science or a related field; 3+ years of experience in data engineering or analytics engineering
- Expertise in one or more of the following: dbt model development, pipeline orchestration (Airflow, Dagster, Prefect), warehouse design (Snowflake, BigQuery, Redshift, Databricks), data quality and testing
- Comfortable reading and producing data engineering artifacts: dbt models, DAGs, schema docs, data contracts, test suites
- Clear written communication: you can articulate reasoning step by step and encode it in deterministic rubrics
Compensation
$90–$125/hr depending on domain depth and prior experience. Strong contributors are promoted based on task quality and throughput.
