Analyst - ADAS Data Analytics

RGBSI

Remote

Full-Time

Posted 2 days ago (Updated 10 hours ago) • Actively hiring

Expires 6/8/2026


Job Description

Analyst - ADAS Data Analytics (#26-00705)
Dearborn, MI (All On-site)
Position Description:
The ADAS Data Analytics team is seeking a high-impact analytics professional to bridge the gap between physical automotive testing and cloud-based big data. In this role, you will be a key driver of the feedback loop for Client's ADAS features such as BlueCruise and Automatic Emergency Braking by transforming raw vehicle data into actionable engineering narratives. We are looking for a developer-minded analyst with expert SQL skills and hands-on experience in the GCP/BigQuery ecosystem (specifically using Dataform or similar tools) to build robust pipelines and high-fidelity visualizations. If you are a data storyteller who wants to see your code translate directly into physical vehicle behavior, this is the role for you.
Technical Core Skills:
Advanced SQL Proficiency:
Expert-level ability to write, debug, and optimize complex queries, including window functions, CTEs, and nested data structures.
Google BigQuery Expertise:
Deep understanding of BigQuery architecture, including partitioning, clustering, and slot utilization to manage large-scale datasets efficiently.
Query Performance Optimization:
Proven track record of auditing and refining slow-running queries to reduce computational costs and improve processing speed.
Big Data Architecture:
Familiarity with modern data warehouse design patterns, schema modeling (Star/Snowflake), and ETL/ELT pipelines.
Modern Data Transformation (Dataform/dbt):
Experience using Dataform or dbt to manage data transformations, version control, and documentation (Highly Preferred).
Python for Data Analysis:
Ability to use Python (Pandas, NumPy) for data manipulation, automation, and extending analytical capabilities beyond SQL (Preferred).
Modern Workflow & AI Integration:
AI-Assisted Development:
Proficiency in leveraging Large Language Models (LLMs) such as ChatGPT, Claude, or GitHub Copilot to accelerate code generation, SQL debugging, and documentation.
Rapid Prototyping:
Ability to move quickly from a business question to a functional data model or dashboard using a mix of traditional tools and AI productivity boosters.
Business Intelligence & Visualization:
BI Tool Mastery:
High proficiency in visualization platforms (e.g., Looker, Tableau, Power BI) to build intuitive, self-service reporting environments.
Data Storytelling:
The ability to translate complex technical findings into clear, narrative-driven insights that non-technical stakeholders can act upon.
Actionable Metrics Design:
Experience defining and tracking "North Star" metrics and KPIs that directly correlate with business growth and operational efficiency.
Soft Skills & Strategic Thinking:
Results-Oriented Mindset:
A focus on "so-what" analytics, ensuring every report or insight has a clear path to driving business results.
Stakeholder Management:
Ability to partner with product, engineering, and leadership teams to gather requirements and deliver data solutions.
Analytical Rigor:
A disciplined approach to data quality, testing, and validation to ensure "one version of the truth."
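As a concrete illustration of the SQL skills the posting calls out (CTEs and window functions), the sketch below runs a ranking query in Python's built-in sqlite3. This is only an assumed example: the `drives` table, its columns, and the data are invented, and SQLite stands in for BigQuery, whose SQL dialect differs in other respects.

```python
import sqlite3

# Hypothetical schema and data, invented for illustration only:
# drives(vehicle_id, drive_date, miles) -- not from the actual posting.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE drives (vehicle_id TEXT, drive_date TEXT, miles REAL);
INSERT INTO drives VALUES
    ('V1', '2025-01-01', 12.0),
    ('V1', '2025-01-02', 30.0),
    ('V2', '2025-01-01', 8.5),
    ('V2', '2025-01-03', 22.0);
""")

query = """
WITH per_vehicle AS (              -- CTE: name an intermediate result set
    SELECT vehicle_id, drive_date, miles
    FROM drives
)
SELECT vehicle_id,
       drive_date,
       miles,
       RANK() OVER (               -- window function: rank drives by
           PARTITION BY vehicle_id -- distance within each vehicle
           ORDER BY miles DESC
       ) AS miles_rank
FROM per_vehicle
ORDER BY vehicle_id, miles_rank;
"""
rows = conn.execute(query).fetchall()
for row in rows:
    print(row)  # e.g. ('V1', '2025-01-02', 30.0, 1)
```

The same CTE-plus-window-function pattern carries over to BigQuery Standard SQL, where `PARTITION BY` and `RANK()` behave the same way.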
Responsibilities:
Data Engineering & Pipeline Development:
Design, develop, and maintain robust data transformation workflows using SQL and Dataform (or similar tools) within the Google Cloud Platform (GCP) / BigQuery ecosystem.
Data Storytelling:
Translate complex ADAS performance metrics into clear, compelling narratives for stakeholders. You will be responsible for showing the "why" behind the numbers.
Visualization & Dashboarding:
Build and maintain high-fidelity data visualizations (e.g., Looker, Tableau, or Power BI) that provide real-time insights into system performance and customer usage.
Impact-Driven Analysis:
Identify trends and anomalies in ADAS data to drive improvements in feature safety, comfort, and reliability.
Cross-Functional Collaboration:
Partner with software engineers and feature owners to define data requirements for future ADAS features.
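To give a feel for the impact-driven analysis described above, here is a minimal, stdlib-only sketch that flags outliers in a batch of braking-event measurements. The data values and the 2-sigma cutoff are assumptions for illustration; they are not the team's actual data or methodology.

```python
from statistics import mean, stdev

# Hypothetical data: peak deceleration (m/s^2) recorded across a batch of
# Automatic Emergency Braking events. Values invented for illustration.
decel = [6.1, 5.8, 6.3, 5.9, 6.0, 9.4, 6.2, 5.7]

mu = mean(decel)
sigma = stdev(decel)  # sample standard deviation

# Flag events more than 2 standard deviations from the mean -- a simple,
# assumed anomaly rule, not a statement of the team's methodology.
anomalies = [x for x in decel if abs(x - mu) > 2 * sigma]
print(anomalies)  # -> [9.4]
```

In practice this kind of rule would run over BigQuery-scale event data, but the idea is the same: quantify "normal," surface the exceptions, and hand engineers a short list worth investigating.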
Requirements:
Analytical skills, troubleshooting (problem solving)
Preferred:
Testing, data/analytics dashboards, business intelligence, SQL, data analysis, data governance, BigQuery, Python
Experience Required:
1 year of data analytics experience
Education Required:
Bachelor's degree
Additional Details:
Position is hybrid. We offer attractive, competitive compensation and benefits including medical, dental, 401k, short-term disability, AD&D, tuition reimbursement, and more. If you take pride in your work and are committed to personal and professional success, let's talk. Please visit www.zobility.com to learn more. Zobility is RGBSI's workforce management and staffing division. RGBSI is a multi-national corporation headquartered in Troy, MI, with branches throughout the USA, Canada, Germany, and India.
