Tallo

SR Data Engineer || Madison, WI or Minneapolis, MN (Hybrid Role)

Job

AKAASA Technologies

Remote

Full-Time

Posted 3 days ago (Updated 11 hours ago) • Actively hiring

Expires 6/8/2026


Job Description

Please keep resumes to no more than 5 pages.

Key technologies: PySpark, Azure Data Factory, GitHub, Snowflake

Role Summary
Senior, hands-on data engineering contractor to support modern data platform delivery across health payer domains. This role requires independent ownership of end-to-end data pipelines using modern cloud and Lakehouse architecture. This is a hit-the-ground-running role with minimal ramp-up.

Key Responsibilities
- Design, build, and operate end-to-end data pipelines (source ingestion → transformation → analytics-ready datasets)
- Develop transformations and data models using SQL and Python
- Implement automated data quality checks and validations
- Follow Git-based development
- Collaborate with business/domain stakeholders on data rules and definitions
- Ensure solutions are secure, auditable, reliable, and supportable
- Produce clear technical documentation and operational notes

Required Technologies (Must Have)
Candidates must have strong, recent hands-on experience with most of the following:

Languages
- SQL
- Python (PySpark preferred)

Data Platforms / Storage
- Cloud data platforms (Azure preferred; AWS/Google Cloud Platform acceptable)
- Azure Data Lake Storage Gen2 (ADLS Gen2) or equivalent
- Object storage-based data lakes
- Parquet format
- Lakehouse concepts (Iceberg and/or Delta)

Transformation & Modeling
- dbt (dbt Core and/or dbt Cloud)

Source Control, GitOps & CI/CD
- GitHub
- Pull request-based development
- CI/CD pipelines (GitHub Actions or equivalent)

Compute (one or more)
- Snowflake
- Microsoft Fabric
- Databricks
