Senior Data Engineer (GCP, BigQuery, dbt)

Strategic Staffing Solutions

Remote

$210,080 Salary, Full-Time

Posted 5 days ago (Updated 2 days ago) • Actively hiring

Expires 6/7/2026


Job Description

Senior Data Engineer (GCP, BigQuery, dbt)
Duration: 3-6 months, contract to hire
W2 Rate: $100-$102/hr
NO 1099, NO CTC, NO THIRD PARTY

Location: Remote, but may need to attend meetings in Richmond, VA on occasion.

Technical skills and proficiency requirements:
- 5+ years of data engineering experience with strong expertise in GCP (BigQuery, GCS) and modern data stack tools
- Advanced hands-on experience with dbt Core, including incremental models, snapshots, macros, testing, and semantic layer development
- Strong BigQuery SQL expertise, including window functions, complex CTEs, SCD modeling, and query/cost optimization techniques
- Experience building and managing data pipelines and orchestration workflows using Python and tools like Prefect or Cloud Composer (Airflow)
- Solid understanding of data platform engineering, including BigQuery administration (partitioning, clustering), BigLake, IAM security, and CI/CD for data workflows (GitHub Actions or Cloud Build)

Day to day:
- Design and develop scalable data models using dbt Core, including incremental models, snapshots, macros, and testing frameworks
- Build and optimize BigQuery SQL transformations, leveraging advanced techniques such as window functions, complex CTEs, and SCD patterns
- Ensure cost-efficient query performance through partition pruning, clustering strategies, and query optimization
- Manage and configure BigQuery datasets, including partitioning, clustering, materialized views, and external tables
- Develop and maintain data pipelines and orchestration workflows using tools like Prefect or Cloud Composer
- Implement event-driven pipelines (e.g., GCS-triggered workflows) with proper retry logic, monitoring, and alerting
- Build and maintain data validation frameworks using Python and tools like Great Expectations
- Configure and manage BigLake external tables over GCS (Parquet/Iceberg), including metadata caching and partition management
- Implement secure data access controls using GCP IAM, including service accounts and authorized views
- Collaborate with cross-functional teams to ensure high-quality, reliable data delivery
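The posting itself contains no code, but the SCD (slowly changing dimension) modeling it asks for can be illustrated with a minimal, framework-free Python sketch of the Type 2 pattern: when a tracked attribute changes, close out the current row version and open a new one. Every name here (DimRow, scd2_merge, the date strings) is hypothetical and for illustration only; in practice this pattern is typically expressed as a dbt snapshot or a BigQuery MERGE statement.

```python
from dataclasses import dataclass, replace

OPEN_END = "9999-12-31"  # sentinel end date marking the current row version

@dataclass(frozen=True)
class DimRow:
    key: str        # business key
    attrs: tuple    # tracked attributes
    valid_from: str
    valid_to: str = OPEN_END

def scd2_merge(current: list[DimRow], incoming: dict[str, tuple], as_of: str) -> list[DimRow]:
    """SCD Type 2: close changed open rows and append new versions."""
    out: list[DimRow] = []
    seen: set[str] = set()
    for row in current:
        if row.valid_to != OPEN_END:
            out.append(row)  # historical versions pass through untouched
            continue
        seen.add(row.key)
        new_attrs = incoming.get(row.key)
        if new_attrs is None or new_attrs == row.attrs:
            out.append(row)  # unchanged (or absent from the snapshot): keep open row
        else:
            out.append(replace(row, valid_to=as_of))                   # close old version
            out.append(DimRow(row.key, new_attrs, valid_from=as_of))   # open new version
    for key, attrs in incoming.items():
        if key not in seen:
            out.append(DimRow(key, attrs, valid_from=as_of))           # brand-new key
    return out
```

The same close-and-reopen logic is what dbt's `timestamp`/`check` snapshot strategies generate for you under the hood.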
Pay:
$100.00 - $102.00 per hour
Work Location:
Remote
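On the "proper retry logic" called for in the event-driven pipeline work above: orchestrators like Prefect and Airflow expose retries as task-level configuration, but the underlying idea is plain exponential backoff. A minimal stdlib-only sketch, assuming a transiently failing callable (the helper name `with_retries` and its parameters are invented for illustration):

```python
import time

def with_retries(fn, attempts=3, base_delay=1.0, sleep=time.sleep):
    """Call fn(), retrying on any exception with exponential backoff.

    Re-raises the last exception once all attempts are exhausted.
    """
    for attempt in range(1, attempts + 1):
        try:
            return fn()
        except Exception:
            if attempt == attempts:
                raise
            sleep(base_delay * 2 ** (attempt - 1))  # 1s, 2s, 4s, ...
```

The injectable `sleep` parameter keeps the helper testable without real waits; in Prefect the equivalent would be `@task(retries=..., retry_delay_seconds=...)`.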
