Data Engineer III
SCP Health
Atlanta, GA (In Person)
$119,245 Salary, Full-Time
Job Description
Full-time, $96,155 - $142,335 a year (posted 1 hour ago)

Benefits:
- Paid holidays
- Health insurance
- Dental insurance
- Paid time off
- Vision insurance
- 401(k) matching

Qualifications:
Cost management, data model design, version control, performance tuning, data integration (data management), cloud data warehouses, automation, IT system monitoring, HL7, Snowflake, SQL, access control implementation, task prioritization, data quality monitoring, requirements analysis, improving database performance, continuous integration, technical writing, APIs, data validation, root cause analysis, clustering, business requirements, project stakeholder communication, Python, dimensional modeling.

Full Job Description:
At SCP Health, what you do matters. As part of the SCP Health team, you have an opportunity to make a difference. At our core, we work to bring hospitals and healers together in the pursuit of clinical effectiveness. With a portfolio of over 8 million patients, 7,500 providers, 30 states, and 400 healthcare facilities, SCP Health is a leader in clinical practice management spanning the entire continuum of care, including emergency medicine, hospital medicine, wellness, telemedicine, intensive care, and ambulatory care.

Why you will love working here:
- Strong track record of providing excellent work/life balance.
- Comprehensive benefits package and competitive compensation.
- Commitment to fostering an inclusive culture of belonging and empowerment through our core values: collaboration, courage, agility, and respect.
Responsibilities:
Pipeline & Product Development (Lead on Work):
Design, build, and operate scalable data pipelines and curated datasets that ingest internal enterprise data (Scheduling, HR, Finance, etc.) and external clinical data (EHR extracts, HL7, FHIR) into the Data Platform; lead defined workstreams from design through implementation, deployment, and production support.
Medallion Architecture Execution:
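As an illustrative sketch of the Bronze/Silver/Gold layering this item refers to (raw records cleaned into typed rows, then aggregated for reporting), here is a minimal Python version; all field names and rules are hypothetical, not SCP Health's actual schema:

```python
# Minimal sketch of Bronze/Silver/Gold layering (hypothetical fields and rules).
# Bronze: raw records as ingested; Silver: cleaned and typed; Gold: reporting-ready.
from collections import defaultdict

def to_silver(bronze_rows):
    """Clean raw rows: normalize values, drop records missing a patient id."""
    silver = []
    for row in bronze_rows:
        if not row.get("patient_id"):
            continue  # a real pipeline would quarantine incomplete records
        silver.append({
            "patient_id": row["patient_id"].strip(),
            "facility": row.get("facility", "").strip().upper(),
            "charge": float(row.get("charge", 0)),
        })
    return silver

def to_gold(silver_rows):
    """Aggregate cleaned rows into a per-facility summary."""
    totals = defaultdict(float)
    for row in silver_rows:
        totals[row["facility"]] += row["charge"]
    return dict(totals)

bronze = [
    {"patient_id": " p1 ", "facility": "atl", "charge": "100.0"},
    {"patient_id": "p2", "facility": "ATL", "charge": "50.5"},
    {"patient_id": None, "facility": "bhm", "charge": "75.0"},  # dropped in Silver
]
silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # {'ATL': 150.5}
```

In a real stack the same shape would be expressed as dbt models materialized in Snowflake rather than application code.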
Implement and maintain transformation patterns across Bronze/Silver/Gold layers (dbt + Snowflake) using established team conventions; contribute improvements to modeling practices, documentation, and business rule enforcement through peer review and reusable templates.
Domain Stewardship & Data Contracts:
Partner with business and technical stakeholders to document source-to-target mappings, define dataset expectations (inputs, outputs, refresh cadence, and change impacts), and align on definitions for priority domains that support core workflows.
Performance, Reliability & FinOps:
Tune Snowflake query performance and pipeline efficiency; apply cost controls (e.g., right-sizing warehouses, scheduling, resource monitors) and communicate consumption drivers and tradeoffs to stakeholders.
Governance, Security & Compliance:
Implement HIPAA-aligned controls for PHI/PII (least-privilege RBAC, row-level security, masking, audit logging); follow approved access and handling procedures; support periodic access reviews and respond to audit requests in partnership with Security/Compliance.
Integration Delivery & Reusable Patterns:
Lead implementation for onboarding new sources and facility integrations by profiling data, documenting mappings, and applying reusable ingestion and validation patterns that improve reliability and reduce cycle time.
Operational Ownership & Incident Response:
Participate in the on-call rotation and lead resolution for complex pipeline failures within assigned areas; perform root-cause analysis, coordinate with partner teams, and document follow-ups (runbooks, alerts, backlog items) to prevent recurrence.
Data Quality, Testing & Observability:
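A minimal Python sketch of the kind of reconciliation and freshness checks this item covers; the tolerances, timestamps, and thresholds are hypothetical:

```python
# Sketch of reconciliation and freshness checks (hypothetical thresholds).
from datetime import datetime, timedelta, timezone

def reconcile_counts(source_count, target_count, tolerance=0.0):
    """Compare source vs. target row counts within an allowed drift fraction."""
    if source_count == 0:
        return target_count == 0
    drift = abs(source_count - target_count) / source_count
    return drift <= tolerance

def is_fresh(last_loaded_at, max_age_hours=24):
    """Check that the latest load is within the freshness target."""
    age = datetime.now(timezone.utc) - last_loaded_at
    return age <= timedelta(hours=max_age_hours)

assert reconcile_counts(1000, 1000)
assert not reconcile_counts(1000, 900)           # 10% drift fails at zero tolerance
assert reconcile_counts(1000, 995, tolerance=0.01)
assert is_fresh(datetime.now(timezone.utc) - timedelta(hours=2))
assert not is_fresh(datetime.now(timezone.utc) - timedelta(hours=30))
```

In practice these checks would be wired into monitoring/alerting (e.g., dbt tests and pipeline alerts) rather than run ad hoc.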
Implement robust tests and reconciliation checks (unit, integration, and validation); define and monitor freshness/completeness targets for priority pipelines; and maintain monitoring/alerting to detect and remediate recurring data issues.
Documentation, Metadata & Standards:
Create and maintain technical documentation (runbooks, lineage notes, data dictionaries, operating procedures) and key metadata (definitions, ownership, refresh cadence) to improve discoverability, auditability, and consistent delivery.
Release Engineering & Change Management:
Execute safe deployment practices across environments (dev/test/prod) including version control, peer review, automated checks, and rollback planning; propose incremental improvements that reduce defects and improve delivery reliability.
Mentorship & Peer Leadership:
Support the growth of Data Engineers I/II through pairing, code/design reviews, and knowledge sharing; contribute to team execution by helping break down work, identifying risks, and coordinating with Analytics, App Dev, and Integration partners.
Requirements Translation & Delivery Partnership:
Partner with business and analytics stakeholders to clarify requirements for assigned initiatives, define acceptance criteria, and deliver curated datasets that support reporting, dashboards, and downstream operational workflows.
Knowledge, Skills, and Abilities:
Advanced SQL & Programming:
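To make the window-function expectation concrete, a small runnable example driven from Python via sqlite3; the encounter data is hypothetical, and a SQLite build with window-function support (3.25 or later) is assumed:

```python
# Window-function example: rank each provider's charge within their facility.
# Hypothetical data; requires SQLite >= 3.25 for window functions.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE encounters (facility TEXT, provider TEXT, charge REAL)")
conn.executemany(
    "INSERT INTO encounters VALUES (?, ?, ?)",
    [("ATL", "a", 300.0), ("ATL", "b", 500.0), ("BHM", "c", 200.0)],
)
rows = conn.execute(
    """
    SELECT facility, provider, charge,
           RANK() OVER (PARTITION BY facility ORDER BY charge DESC) AS rnk
    FROM encounters
    ORDER BY facility, rnk
    """
).fetchall()
print(rows)  # [('ATL', 'b', 500.0, 1), ('ATL', 'a', 300.0, 2), ('BHM', 'c', 200.0, 1)]
```

The same `PARTITION BY ... ORDER BY` shape carries over directly to Snowflake SQL.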
Strong proficiency in complex SQL (window functions, joins, CTEs) and performance analysis (query plans, pruning/clustering considerations). Proficient in Python for automation, API interaction, and building reusable utilities.
Snowflake Platform Knowledge:
Strong working knowledge of Snowflake concepts and features (warehouses, micro-partitioning, clustering, tasks/streams, zero-copy cloning, sharing), and the ability to diagnose performance and cost drivers using usage and query profile data.
dbt Proficiency:
Proficiency with dbt best practices including project organization, sources/exposures, macros, documentation, testing, and incremental strategies; ability to follow and contribute to shared conventions across environments.
Data Modeling & Semantic Design:
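A minimal Python sketch of the Type 2 slowly-changing-dimension handling mentioned below: on a changed attribute, the current version is closed out and a new current row is opened. The schema and dates are hypothetical:

```python
# Sketch of a slowly changing dimension (Type 2) update (hypothetical schema).
from datetime import date

def apply_scd2(dim_rows, incoming, today):
    """dim_rows: dicts with key, attrs, valid_from, valid_to, is_current."""
    out = list(dim_rows)
    for new in incoming:
        current = next(
            (r for r in out if r["key"] == new["key"] and r["is_current"]), None
        )
        if current is None:
            out.append({**new, "valid_from": today, "valid_to": None, "is_current": True})
        elif current["attrs"] != new["attrs"]:
            current["valid_to"] = today       # close out the old version
            current["is_current"] = False
            out.append({**new, "valid_from": today, "valid_to": None, "is_current": True})
        # unchanged rows are left as-is
    return out

dim = [{"key": "prov-1", "attrs": {"specialty": "EM"}, "valid_from": date(2024, 1, 1),
        "valid_to": None, "is_current": True}]
dim = apply_scd2(dim, [{"key": "prov-1", "attrs": {"specialty": "ICU"}}], date(2025, 6, 1))
print(len(dim))  # 2: the closed-out EM row plus the new current ICU row
```

In a dbt/Snowflake stack this is typically handled with snapshots or a MERGE rather than in application code.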
Ability to model data for analytics and reporting (e.g., dimensional modeling), manage slowly changing dimensions, and apply naming/definition conventions that improve usability, consistency, and downstream performance.
Healthcare Data & Workflow Literacy:
Working knowledge of common healthcare data concepts and formats (HL7, FHIR, EMR extracts), including encounters, diagnoses, procedures, provider/location identifiers, and how these map to revenue cycle and operational workflows.
Data Quality & Reliability Practices:
Knowledge of validation techniques (schema, volume, reconciliation, anomaly detection), test design, and monitoring/alerting approaches; ability to investigate data issues methodically and document root causes and prevention steps.
Integration Patterns & Data Movement:
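A minimal Python sketch of watermark-based incremental ingestion with idempotent upserts, two of the patterns named below; the source rows, ids, and timestamps are hypothetical:

```python
# Sketch of watermark-based incremental ingestion with idempotent upserts
# (hypothetical source rows and timestamps).

def extract_increment(source_rows, watermark):
    """Pull only rows modified after the last high-water mark."""
    return [r for r in source_rows if r["updated_at"] > watermark]

def upsert(target, rows):
    """Idempotent merge keyed on id: replaying the same batch changes nothing."""
    for r in rows:
        target[r["id"]] = r
    return target

source = [
    {"id": 1, "updated_at": 10, "val": "a"},
    {"id": 2, "updated_at": 20, "val": "b"},
    {"id": 1, "updated_at": 30, "val": "a2"},  # late correction to id 1
]
target, watermark = {}, 15
batch = extract_increment(source, watermark)
upsert(target, batch)
upsert(target, batch)  # replay is safe: same end state
new_watermark = max(r["updated_at"] for r in batch)
print(sorted(target), new_watermark)  # [1, 2] 30
```

Keying the merge on a stable id is what makes backfills and replays recoverable rather than duplicating rows.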
Understanding of batch and near-real-time ingestion patterns (watermarking, CDC, backfills, replayability, idempotency) and common data formats (JSON, CSV, Parquet), with the ability to design recoverable processing and handle late or changing source data.
Engineering Practices (CI/CD & Code Quality):
Ability to apply disciplined development practices for data (version control, peer review, automated testing, environment promotion) and to write maintainable code using shared libraries, packages, and conventions.
Governance, Security & Privacy:
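A minimal Python sketch of role-based PHI masking; the roles and field names are hypothetical, and a production system would use warehouse-native masking policies and RBAC rather than application code:

```python
# Sketch of column masking for PHI by role (hypothetical roles and fields).
import hashlib

PHI_FIELDS = {"patient_name", "ssn"}

def mask_row(row, role):
    """Return the row unmasked for privileged roles, otherwise mask PHI fields."""
    if role == "phi_reader":
        return dict(row)
    masked = {}
    for field, value in row.items():
        if field in PHI_FIELDS:
            # a deterministic token preserves joinability without exposing the value
            masked[field] = hashlib.sha256(str(value).encode()).hexdigest()[:12]
        else:
            masked[field] = value
    return masked

row = {"patient_name": "Jane Doe", "ssn": "123-45-6789", "facility": "ATL"}
print(mask_row(row, "analyst")["facility"])           # ATL (non-PHI untouched)
print(mask_row(row, "analyst")["ssn"] != row["ssn"])  # True
```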
Working knowledge of least-privilege RBAC, masking, row-level policies, and auditing for PHI/PII; ability to apply secure data handling practices and recognize when to escalate access, compliance, or control concerns.
Stakeholder Communication:
Ability to gather and clarify requirements, communicate status and risks, and explain technical tradeoffs in a way that aligns business intent with implementable data solutions.
Prioritization & Work Management:
Ability to estimate effort, sequence work, manage competing priorities, and adapt plans based on feedback, incidents, and shifting business needs.
Cost Awareness (FinOps):
Ability to interpret platform usage metrics, identify primary cost drivers, and recommend right-sizing and scheduling opportunities that balance performance, reliability, and consumption.
Pay Range:
96,155.00 - 142,335.00 USD annually. This range represents the anticipated base salary for this role. Actual compensation will be determined based on experience, qualifications, and internal equity considerations. We offer a comprehensive benefits package designed to support your health, financial well-being, and work-life balance, including medical, dental, and vision insurance, a 401(k) plan with a company match, paid time off and holidays, professional development support, and employee wellness resources. Visit our website for further information: https://myscpbenefits.com/ (login name: corp-guest, password: weheal).