
Data Engineer


Strategic Staffing Solutions

Charlotte, NC (In Person)

Full-Time

Posted 3 days ago (Updated 19 hours ago) • Actively hiring

Expires 6/12/2026




Job Description

Data Engineer
Charlotte, NC

We are seeking an experienced Data Engineer with strong expertise in Python, PySpark, and AWS cloud technologies to support enterprise data engineering and ETL development initiatives. The ideal candidate will have hands-on experience designing, developing, and optimizing scalable data pipelines and cloud-based data solutions in an Agile environment.

Key Responsibilities
- Design, develop, and maintain scalable ETL/data engineering solutions using Python and PySpark
- Build and support cloud-based data pipelines utilizing AWS services including Glue, S3, Lambda, Step Functions, EventBridge, MSK (Kafka), EKS, and RDS
- Develop and optimize SQL queries, stored procedures, functions, triggers, and database objects
- Perform data modeling, data warehousing, data profiling, and data analysis activities
- Create technical design documentation, including HLDs, LLDs, and mapping specifications
- Participate in coding, unit testing, troubleshooting, and performance tuning activities
- Support CI/CD and DevOps processes using tools such as GitHub, Bitbucket, and Jenkins
- Collaborate with cross-functional teams in Agile/Scrum delivery environments

Required Qualifications
- 4+ years of experience in data engineering and ETL development
- Strong hands-on experience with Python and PySpark
- Experience with AWS cloud services including Glue, Lambda, S3, Kafka/MSK, Step Functions, EKS, and RDS
- Experience working with databases such as PostgreSQL, SQL Server, Oracle, and Sybase
- Strong SQL programming and database performance tuning skills
- Knowledge of CI/CD pipelines and DevOps tools
- Strong understanding of data modeling and data warehousing concepts
- Experience working within Agile/Scrum methodologies
- Excellent communication and interpersonal skills
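As context for the ETL responsibilities above, here is a minimal sketch of the extract-transform-load pattern in plain Python. In the role described, this logic would typically run in PySpark against S3/Glue sources; the record fields, cleaning rule, and in-memory "sink" below are illustrative assumptions, not details from the posting.

```python
# Minimal extract-transform-load (ETL) sketch in plain Python.
# The source format, schema, and transform rule are assumptions for
# illustration; a production pipeline would use PySpark DataFrames.

def extract(raw_rows):
    """Extract: parse raw comma-separated strings into dicts."""
    for row in raw_rows:
        name, amount = row.split(",")
        yield {"name": name.strip(), "amount": amount.strip()}

def transform(records):
    """Transform: drop rows with invalid amounts, normalize names,
    and cast the amount to a numeric type."""
    for rec in records:
        try:
            amount = float(rec["amount"])
        except ValueError:
            continue  # skip malformed rows rather than failing the pipeline
        yield {"name": rec["name"].title(), "amount": amount}

def load(records, sink):
    """Load: append cleaned records to a destination (a list stands in
    for a warehouse table here)."""
    for rec in records:
        sink.append(rec)

raw = ["alice, 10.5", "bob, not-a-number", "carol, 3"]
table = []
load(transform(extract(raw)), table)
print(table)  # the malformed "bob" row is filtered out
```

The generator-based stages mirror how distributed engines stream records between pipeline steps without materializing intermediates.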
