Senior Data Engineer/Architect Position Available In Davidson, Tennessee
Job Description
Title: Sr. Data Engineer/Architect
Location: US Remote
Work Authorization: US Citizen or Green Card
Type: Contract to hire / FTE
SCOPE:
Kanini is seeking a highly skilled Senior Data Engineer with deep expertise in Microsoft technologies, Power BI, and strong working knowledge of both Azure Data Services and Google Cloud Platform (GCP). This role plays a key part in supporting our data-driven initiatives by designing and maintaining scalable data infrastructure and ensuring seamless access to critical insights across the organization.
Key Responsibilities
Data Infrastructure & Architecture:
Design, build, and maintain high-performance, scalable, and reliable data pipelines and data architectures on Azure and GCP platforms.
Data Warehousing:
Develop and manage cloud-based data warehouse solutions, particularly using BigQuery and Snowflake, ensuring optimized storage and query performance.
ETL/ELT Development:
Create robust ETL/ELT processes to ingest structured and unstructured data from diverse sources, including point-of-sale (POS) systems, product usage logs, web/eCommerce platforms, and geolocation data.
Analytics & Reporting Enablement:
Enable data access for business users and analysts by building effective reporting layers and dashboards using tools like Power BI, Tableau, and Looker.
Collaboration & Stakeholder Engagement:
Work closely with data analysts, data scientists, and business stakeholders to define data requirements and deliver actionable insights aligned with business goals.
Performance Optimization & Troubleshooting:
Monitor data pipelines for performance, reliability, and integrity. Optimize queries, manage data partitioning and clustering, and resolve technical issues swiftly.
Governance & Security:
Ensure adherence to data governance, quality, and security best practices across all data handling processes.
Tooling:
Utilize modern data orchestration and workflow tools such as Cloud Composer (Apache Airflow), ADF, Pub/Sub, and Cloud Storage to support data movement and transformation.
Required Qualifications
Strong proficiency in SQL with expertise in advanced query tuning and performance optimization.
Solid hands-on experience with Python for data engineering tasks.
Proven experience working with cloud platforms (Azure and/or GCP), especially integrating with Snowflake.
Familiarity with Azure Data Factory, Synapse Analytics, and Microsoft Fabric.
Deep understanding of data modeling and the ability to design scalable data solutions.
Demonstrated ability to implement and enforce data governance and security practices.
Experience in agile and product-centric environments.
Excellent communication and collaboration skills, with a proactive approach to stakeholder engagement.