
SDET Engineer / Data Quality


VDart, Inc.

Frisco, TX (In Person)

Full-Time

Posted 3 weeks ago (Updated 1 week ago) • Actively hiring

Expires 5/28/2026


Job Description

Job Title: SDET Engineer / Data Quality
Location: Atlanta, GA / Frisco, TX / Bellevue, WA
Duration / Term: Contract
Experience Desired: 8+ Years

We are seeking a Senior SDET Engineer to own quality engineering for our Customer Data Platform (CDP), the authoritative source of truth for customer data across the entire US adult population. An authoritative source of truth is only authoritative if the data is correct. This role ensures exactly that: building comprehensive quality frameworks that validate data accuracy, completeness, and consistency at every stage, from ingestion through identity resolution to consumption. You will go beyond traditional testing to embed quality into the DNA of every pipeline, API, and AI-driven system in CDP. You will work closely with data engineers, AI/ML engineers, platform teams, and product stakeholders to ensure that CDP earns and maintains the trust of every consumer across the organization.

Job Responsibilities:
- Design and implement scalable test automation frameworks for data pipelines, APIs, and distributed systems, with quality standards calibrated to CDP's role as the authoritative source of truth
- Build data validation frameworks to ensure completeness, accuracy, and consistency of customer profiles across systems (e.g., ADLS, Databricks, Snowflake)
- Develop manual and automated tests for batch and streaming data pipelines, including reconciliation, anomaly detection, and data freshness validation
- Validate identity resolution outputs, ensuring deduplication, matching, and golden profile creation meet accuracy thresholds before data reaches downstream consumers
- Validate API integrations and microservices, including contract testing and performance validation for systems serving real-time customer experiences
- Drive test strategy for cloud-native and data platforms, including CI/CD integration and shift-left practices that catch quality issues before production
- Partner with engineering teams to ensure testability, observability, and quality gates are built into every solution, not bolted on after the fact
- Lead quality initiatives for GenAI/ML-based applications, including prompt validation, output consistency, and evaluation frameworks for LLM-driven features
- Design and maintain data quality scorecards and dashboards that give stakeholders visibility into CDP's trustworthiness
- Analyze production issues, identify root causes, and improve test coverage to prevent recurrence, protecting the platform's reputation as the source of truth
- Mentor junior engineers and promote quality engineering best practices across the team
- Collaborate across teams to support continuous delivery and high-availability systems

Education and Work Experience:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field
- 6+ years of experience in software quality engineering / SDET roles, preferably in data platforms or cloud environments
- Strong experience testing data engineering pipelines and large-scale datasets
- Hands-on experience with cloud platforms (Azure preferred) and modern data stack technologies
- Experience working in Agile/Scrum environments

Technical Skills:
- Strong programming skills in Python, Java, or Scala
- Experience with data platforms and storage systems: ADLS, Databricks, Snowflake, SQL Server, Cosmos DB
- Experience validating ETL/ELT pipelines, including batch and streaming (Kafka/Event Hub)
- Hands-on experience with API testing tools (Postman, Rest Assured, PyTest, etc.)
- Experience building test automation frameworks from scratch for data-intensive applications
- Knowledge of CI/CD pipelines and integration with testing frameworks
- Experience with data quality tools, reconciliation techniques, and query-based validation at scale
- Exposure to GenAI/LLM validation, Azure AI Foundry, or similar platforms is a strong plus
- Familiarity with performance testing and scalability validation for distributed systems processing billions of records

Knowledge, Skills, and Abilities:
- Strong understanding of data architecture, distributed systems, and the unique quality challenges of population-scale customer data
- Ability to think in terms of data correctness, lineage, and system reliability, understanding that data quality is the foundation of CDP's authority
- Excellent problem-solving and debugging skills in complex, multi-system environments
- Strong collaboration skills across engineering, AI/ML, product, and operations teams
- Ability to drive quality as a culture, not just a phase, embedding trust into every layer of the platform
Key Skills:
Data Quality, Azure, Databricks, PySpark, AI/ML
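To make the responsibilities above concrete, here is a minimal, hedged sketch of the kind of query-based reconciliation and data-freshness checks the role describes. Everything specific in it is an assumption for illustration only: the `customers` table name, the in-memory SQLite databases standing in for source/target systems, and the 24-hour freshness threshold are hypothetical and not taken from the posting.

```python
# Illustrative sketch only: table names, thresholds, and the SQLite
# stand-ins for source/target systems are hypothetical assumptions.
import sqlite3
from datetime import datetime, timedelta, timezone

def row_count(conn, table):
    """Query-based validation primitive: count rows in a table."""
    return conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]

def reconcile_counts(source_conn, target_conn, table, tolerance=0):
    """Reconciliation: source and target counts must match within a tolerance."""
    diff = abs(row_count(source_conn, table) - row_count(target_conn, table))
    return diff <= tolerance

def is_fresh(conn, table, ts_column, max_age):
    """Freshness: the newest timestamp must be within max_age of now."""
    (latest,) = conn.execute(f"SELECT MAX({ts_column}) FROM {table}").fetchone()
    if latest is None:
        return False
    return datetime.now(timezone.utc) - datetime.fromisoformat(latest) <= max_age

# Demo with in-memory databases standing in for source and target systems.
src = sqlite3.connect(":memory:")
tgt = sqlite3.connect(":memory:")
for conn in (src, tgt):
    conn.execute("CREATE TABLE customers (id INTEGER, updated_at TEXT)")
now = datetime.now(timezone.utc).isoformat()
src.executemany("INSERT INTO customers VALUES (?, ?)", [(1, now), (2, now)])
tgt.executemany("INSERT INTO customers VALUES (?, ?)", [(1, now), (2, now)])

assert reconcile_counts(src, tgt, "customers")                    # counts agree
assert is_fresh(tgt, "customers", "updated_at", timedelta(hours=24))
```

In practice, checks like these would be wrapped as PyTest cases and wired into a CI/CD pipeline so that a count mismatch or stale partition fails the build before data reaches downstream consumers, which is the shift-left pattern the posting emphasizes.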
