Snowflake Dev with strong ETL & SQL Position Available in Lake Mary, Florida
Tallo's Job Summary: The Snowflake Dev with strong ETL & SQL role in Lake Mary, FL involves designing scalable data pipelines, writing advanced SQL queries, and maintaining Snowflake database objects. Key responsibilities include collaborating with teams, ensuring data quality, and supporting data migration. Required skills include 8+ years of SQL and ETL experience, 5+ years with Snowflake, and proficiency in data warehousing concepts.
Job Description
Role: Snowflake Dev with strong ETL & SQL
Location: Lake Mary, FL
Duration: Long term
Key Responsibilities:
- Design and develop scalable data pipelines and ETL processes using Snowflake.
- Write optimized, advanced SQL queries for data extraction, transformation, and loading (ETL).
- Develop, implement, and maintain Snowflake database objects, including schemas, tables, views, and stored procedures.
- Work closely with business analysts, data architects, and data engineers to gather requirements and translate them into technical solutions.
- Ensure performance tuning, query optimization, and adherence to best practices in Snowflake development.
- Support data migration from legacy platforms to Snowflake.
- Create and maintain technical documentation for data pipelines, ETL workflows, and Snowflake models.
- Perform data quality checks and ensure data integrity across systems.
- Collaborate with cross-functional teams for project delivery in an agile environment.
Required Skills and Qualifications:
- 8+ years of experience in SQL development and ETL design and implementation.
- 5+ years of hands-on experience with Snowflake Data Warehouse.
- Strong knowledge of Snowflake architecture, SnowSQL, and Snowflake security frameworks.
- Proficiency in developing ETL pipelines using ETL tools or custom Python/Spark scripts.
- Strong understanding of data warehousing concepts and dimensional modeling.
- Experience with any of the major cloud platforms.
- Expertise in performance tuning and query optimization.
- Knowledge of Git, CI/CD pipelines, and version control for data pipeline deployments.