GCP Data Engineer Position Available In Mecklenburg, North Carolina

Tallo's Job Summary:

Company: Collabera
Job Type: Full-time, Onsite

Job Description

GCP Data Engineer (Contract)

Location: Charlotte, NC (Hybrid)
Salary Range: $40.00 – $50.00 per hour
Job Code: 360109
End Date: 2025-05-01
Duration: 12-24 months (Contract)
Client: Top Bank

About the Role:

We are seeking a GCP Data Engineer with expertise in GCP, ETL pipelines, Python, and Kubernetes to build and optimize scalable data solutions. This role involves working with GCP services (Dataproc, Composer, GCS), Apache Airflow, Spark, and hybrid cloud clusters to enhance data processing capabilities.

Responsibilities:

Design, develop, and optimize ETL pipelines for efficient data processing.
Work with GCP services (Dataproc, Composer, GCS) to build and manage cloud-based solutions.
Develop Python-based solutions for scripting, automation, and API development.
Implement Spark-based data processing frameworks for handling large-scale data.
Build and manage hybrid cloud clusters using OpenShift and GCP.
Deploy and manage GKE clusters for containerized workloads.
Automate workflows and orchestrate data jobs using Apache Airflow.
Implement CI/CD pipelines for deployment and version control.
Ensure data security, governance, and compliance across cloud and on-premise systems.
Troubleshoot performance issues, optimize queries, and enhance data processing capabilities.

Qualifications & Skills:

4-6 years of hands-on experience in Python development with strong database expertise.
Experience with GCP services (Dataproc, Composer, GCS) and OpenShift environments.
Strong expertise in ETL pipeline development and handling large-scale data transformations.
Proficiency in Spark, Django, and Microservices architecture.
Experience with S3 object storage and handling unstructured data.
Familiarity with API development, CI/CD pipelines, and DevOps best practices.
Hands-on experience in GKE and container orchestration.
Expertise in Apache Airflow for job orchestration and workflow automation.
Strong problem-solving skills with expertise in database query optimization.

Preferred Skills:

Experience with BigQuery, Terraform, and Cloud Functions.
Knowledge of distributed computing frameworks and data lake architectures.
Familiarity with ML/AI model deployment in cloud environments.
Exposure to NoSQL databases like MongoDB, Cassandra, or DynamoDB.
Experience in financial services, healthcare, or retail industries.
Job Requirement:

GCP (Google Cloud Platform), GCP Services, Python, Dataproc, Composer, Apache Airflow, Data Engineering, GCS (Google Cloud Storage), GKE (Google Kubernetes Engine)
Reach Out to a Recruiter

Recruiter: Apoorva Pisharoty
Email: apoorva.pisharoty@collabera.com
