Senior Data Engineer

CTP Consulting

Glendale, CA (In Person)

$176,800 Salary, Full-Time

Posted 1 week ago (Updated 5 days ago) • Actively hiring

Expires 6/3/2026


Job Description

Senior Data Engineer — CTP Consulting, Glendale, CA

Job Details: Contract, $80 - $90 an hour
Benefits: Health insurance, Dental insurance, 401(k)
Qualifications: Data engineering, Data modeling, Data transformation pipeline development, Cloud data warehouses, Spark, Scalable systems, Snowflake, SQL, AWS, Bachelor's degree, Scalability, Distributed computing, Python, Database software proficiency
Full Job Description

Senior Data Engineer
Location: Glendale, CA (Hybrid: 2-3 days onsite)
Contract: 12 Months
Compensation: ~$90/hr

This is an opportunity to contribute to a large-scale data ecosystem supporting a global media organization at the intersection of technology and storytelling. The team focuses on building robust, scalable data platforms that power critical functions, from financial operations to production workflows and consumer-facing analytics.

As a Senior Data Engineer, you'll be embedded in a high-performing data services group, working on a core data platform that enables enterprise-wide insights. The environment is fast-moving and highly collaborative, with a strong emphasis on data reliability, scalability, and governance.

Key Responsibilities
- Design, enhance, and maintain end-to-end data pipelines supporting enterprise data needs
- Contribute to platform capabilities around data lineage, governance, and privacy
- Work within a modern data stack (Airflow, Spark, Databricks, Delta Lake, Snowflake)
- Collaborate with cross-functional teams including product, engineering, and analytics
- Drive consistency through standards, best practices, and documentation
- Ensure datasets meet performance, accuracy, and SLA expectations
- Participate in Agile workflows and contribute to continuous delivery improvements
- Engage with stakeholders to align data solutions with evolving business needs

Required Experience
- 7+ years in data engineering, with a focus on large-scale pipeline development
- Strong programming experience in Python, Java, or Scala
- Advanced SQL skills and experience working with complex data models
- Hands-on experience with Airflow (or similar orchestration tools)
- Experience with Snowflake or comparable cloud data warehouses
- Solid understanding of data modeling, OLTP vs OLAP, and distributed systems
- Experience working in cloud environments (AWS preferred)

Preferred / Bonus
- Exposure to Kubernetes or containerized environments
- Experience with ingestion tools (e.g., Fivetran, Airbyte, Matillion)
- Background in data services engineering or distributed processing frameworks
Pay: $80.00 - $90.00 per hour

Benefits: 401(k), Dental insurance, Health insurance

Education: Bachelor's (Required)

Experience:
- SQL: 5 years (Required)
- AWS: 1 year (Preferred)
- Airflow: 1 year (Preferred)
- Kubernetes: 1 year (Preferred)
- Data Engineering: 7 years (Required)
- Snowflake: 5 years (Required)

Ability to Commute: Glendale, CA 91201 (Required)

Work Location: In person
