Snowflake Data Engineer Position Available In Wake, North Carolina
Tallo's Job Summary: The Snowflake Data Engineer role in Raleigh, NC entails designing scalable data solutions, optimizing ETL/ELT workflows, and integrating Snowflake with Azure services. Key responsibilities include SQL query optimization, data governance, and automating data workflows. Qualifications include 3+ years of data engineering experience, expertise in Snowflake and SQL, and proficiency in Azure Data Factory.
Job Description
Position: Contract to hire (C2C NOT AVAILABLE)
Location: Hybrid in the Raleigh, NC area

About the Role:
We’re looking for a seasoned Snowflake Data Engineer with 3-5 years of experience building and optimizing cloud-based data architectures and pipelines. The ideal candidate brings deep expertise in SQL, a solid understanding of modern data warehousing, and hands-on skills in Azure Data Factory and SQL Server. You’ll play a key role in designing scalable data solutions to support business intelligence and reporting initiatives.
Key Responsibilities:
Design, develop, and maintain scalable ETL/ELT workflows using Snowflake and Azure Data Factory
Optimize data models, query performance, and cost efficiency within Snowflake
Write advanced SQL queries to extract, transform, and load data from multiple systems, including SQL Server
Integrate Snowflake with Azure cloud services and external tools
Implement best practices for data governance, including access controls and data masking
Collaborate with stakeholders and technical teams to understand business needs and deliver robust data solutions
Proactively monitor data pipelines to ensure quality, reliability, and efficiency
Automate data workflows using scripting languages and orchestration tools
Stay current on evolving best practices and updates in Snowflake, Azure, and data engineering

Qualifications:
3+ years of experience in data engineering, with at least 1 year specifically working with Snowflake
Strong command of SQL for querying, performance tuning, and data transformation
Proficiency in Azure Data Factory for pipeline development and orchestration
Solid experience with SQL Server for data integration
Deep understanding of data warehousing concepts and dimensional modeling
Experience with cloud-based data integration and architecture (Azure preferred)
Proficiency in a scripting language such as Python for automation and task management
Strong analytical and troubleshooting skills
Ability to thrive in a dynamic, fast-paced setting
Familiarity with version control systems (e.g., Git) and CI/CD practices in data environments

Bonus Points For:
Building APIs for data ingestion or integration
Working with BI tools such as Sigma, Power BI, or Looker
Background in the hospitality industry
Experience with B2B ERP systems
Developing analytics platforms and data visualizations