Senior Data Engineer Position Available In Fulton, Georgia
Tallo's Job Summary: The Senior Data Engineer role in Atlanta, GA requires expertise in data warehousing, Snowflake, ETL processes, and BI tools. Responsibilities include designing scalable data solutions, developing ETL/ELT pipelines, optimizing data warehouses, and collaborating with cross-functional teams. Required skills include proficiency in Snowflake, SQL, and Kafka, with a preference for a Bachelor's or Master's degree in a related field.
Job Description
Senior Data Engineer
Role:
Senior Data Engineer
Location:
Atlanta, GA
Contract
Primary Skills
Data Modeling Fundamentals, Data Warehousing, ETL Fundamentals, Modern Data Platform Fundamentals, PL/SQL, T-SQL, Stored Procedures, Python, Snowflake Data Exchange, Snowpipe, SnowSQL, SQL (Basic + Advanced), Time Travel and Fail Safe, Zero Copy Cloning
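For context on the Snowflake-specific items above, here is a minimal SnowSQL sketch of Zero Copy Cloning, Time Travel, and Snowpipe; the table, stage, and pipe names are illustrative assumptions, not taken from this posting.

```sql
-- Zero Copy Cloning: metadata-only copy of an existing table, no data duplication
CREATE TABLE sales_dev CLONE sales;

-- Time Travel: query the table as it existed one hour ago (within the retention window)
SELECT * FROM sales AT (OFFSET => -3600);

-- Snowpipe: continuously load newly staged files into a table
CREATE PIPE sales_pipe AUTO_INGEST = TRUE AS
  COPY INTO sales
  FROM @sales_stage
  FILE_FORMAT = (TYPE = 'CSV');
```

Fail Safe has no query syntax of its own; it is the fixed recovery period Snowflake retains after a table's Time Travel window expires.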
Secondary Skills
Azure, AI/ML
Specialization
Snowflake Data Architecture
Lead Data Engineer
The ideal candidate will have extensive expertise in data engineering, with a focus on data warehousing, Snowflake, streaming data pipelines, Kafka, Debezium, BI and reporting tools, embedded reporting, and ETL processes. This role will involve designing, implementing, and optimizing scalable data solutions to support business intelligence, analytics, and AI/ML needs.
Key Responsibilities
Data Pipeline Development:
Design, build, and maintain robust ETL/ELT pipelines for efficient data ingestion, transformation, and integration from diverse sources.
Data Warehousing:
Architect and optimize data warehouse solutions using Snowflake, ensuring scalability, performance, and reliability.
Streaming Data Pipelines:
Develop and manage real-time data streaming pipelines using technologies such as Kafka and Debezium.
ERP Integration:
Collaborate with ERP system teams to ensure seamless integration of enterprise data into the data ecosystem. Understand SSO and embedded reporting scenarios.
BI and Reporting:
Work with BI tools (e.g., OAS, Domo) to enable embedded reporting solutions and provide actionable insights.
Data Modeling:
Design scalable data models (e.g., star schema, snowflake schema) to support analytics and reporting needs; a brief sketch follows this list.
Collaboration:
Partner with cross-functional teams to understand requirements and deliver tailored solutions.
System Monitoring & Optimization:
Monitor system performance, troubleshoot issues, and implement optimizations to enhance efficiency.
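As a rough illustration of the data modeling responsibility above, a minimal star schema in Snowflake SQL; table and column names are illustrative assumptions, not taken from this posting.

```sql
-- Dimension tables hold descriptive attributes
CREATE TABLE dim_customer (
    customer_key  INT PRIMARY KEY,
    customer_name VARCHAR(200),
    region        VARCHAR(100)
);

CREATE TABLE dim_date (
    date_key       INT PRIMARY KEY,
    calendar_date  DATE,
    fiscal_quarter VARCHAR(10)
);

-- Fact table stores measures plus foreign keys to the dimensions
CREATE TABLE fact_sales (
    sale_id      BIGINT,
    customer_key INT REFERENCES dim_customer (customer_key),
    date_key     INT REFERENCES dim_date (date_key),
    quantity     INT,
    net_amount   NUMBER(12, 2)
);
```

A snowflake schema would further normalize dimensions such as dim_customer (for example, splitting region into its own table), trading simpler joins for reduced redundancy.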
Required Skills & Qualifications Technical Expertise:
Proficiency in Snowflake architecture and SQL for complex queries.
Hands-on experience with Kafka for real-time data streaming.
Familiarity with cloud platforms such as AWS and Azure.
Knowledge of ERP systems integration.
Data Engineering Skills:
Expertise in data modeling techniques (e.g., star schema, snowflake schema).
Solid understanding of access control mechanisms and SSO protocols.
Experience in designing scalable ETL/ELT pipelines.
Soft Skills:
Strong problem-solving abilities in ambiguous environments.
Excellent communication skills for collaborating across teams.
Preferred Qualifications
Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
Experience migrating on-premises systems to cloud-based platforms.
Familiarity with modern CI/CD pipelines for deployment automation.
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.
Dice Id:
10110952
Position Id:
8618042