
Data Warehouse Architect

Job

Radinnova

Remote

Full-Time

Posted 3 weeks ago (Updated 1 week ago) • Actively hiring

Expires 5/29/2026



Job Description

Position Overview

We are seeking an experienced Data Warehouse Architect to design, build, and evolve our enterprise data platform. This role is central to unifying data across our organization's critical business domains, including Order to Cash (O2C), Procure to Pay (P2P), and Finance Record to Report (R2R), into a scalable, governed, and performant data warehouse. The ideal candidate brings deep expertise in enterprise data modeling and modern cloud data platforms, along with hands-on experience integrating data from both large-scale ERP systems and niche industry platforms.

Key Responsibilities

Data Architecture & Modeling
- Design and own enterprise-grade dimensional and relational data models across core business domains: Order to Cash (O2C), Procure to Pay (P2P), and Finance Record to Report (R2R)
- Develop and maintain conceptual, logical, and physical data models aligned to business process frameworks (e.g., APQC, SAP reference models)
- Establish and enforce data modeling standards, naming conventions, and architectural best practices across the organization
- Define and manage data domains, subject areas, and data products to support self-service analytics and enterprise reporting

Data Pipeline & Integration
- Architect and oversee end-to-end data pipelines (ingestion, transformation, orchestration, and delivery) for batch and near-real-time workloads
- Lead the integration of structured and semi-structured data from ERP and operational systems into the data warehouse
- Evaluate, select, and implement ETL/ELT tools and frameworks appropriate to the platform stack
- Ensure pipeline reliability, observability, and scalability through monitoring, alerting, and automated testing practices

Cloud Platform Engineering
- Architect and manage cloud-based data warehouse environments on Microsoft Fabric and/or AWS (Redshift, Glue, S3, Lake Formation, or equivalent)
- Design lakehouse and medallion architectures (Bronze / Silver / Gold) where appropriate
- Optimize platform performance, cost, and security in alignment with enterprise cloud governance standards

ERP & Source System Integration
- Lead data extraction and mapping from enterprise ERP platforms such as SAP, including complex financial, procurement, and sales modules
- Integrate data from niche and industry-specific ERP platforms, including Clipboard K8 and Stone Profit System
- Collaborate with ERP functional teams to understand business processes, data flows, and source-system data quality characteristics
- Document source-to-target mappings and maintain data lineage across all integrated systems

Governance & Quality
- Define and implement data governance frameworks, including data ownership, stewardship, and quality rules across integrated domains
- Partner with data governance and compliance teams to ensure data accuracy, completeness, and auditability, especially for financial reporting (R2R)
- Implement and maintain master data management (MDM) principles for key entities such as customers, vendors, and the chart of accounts

Leadership & Collaboration
- Serve as the technical lead and subject matter expert for all data warehouse and data modeling initiatives
- Mentor and guide data engineers and analysts on architecture standards and best practices
- Partner with business stakeholders across Finance, Supply Chain, and Operations to translate requirements into scalable data solutions
- Engage with vendors, consultants, and third-party data providers as needed

Required Qualifications
- 5-10 years of progressive experience in data warehousing, data architecture, or data engineering roles
- Proven expertise designing enterprise data models for Order to Cash, Procure to Pay, and/or Finance Record to Report business domains
- Strong proficiency in SQL and experience with dimensional modeling (Kimball, Inmon) and modern Data Vault methodologies
- Hands-on experience building and managing data pipelines using tools such as dbt, Azure Data Factory, AWS Glue, Apache Airflow, or similar
- Demonstrated experience with cloud data platforms: Microsoft Fabric and/or AWS (Redshift, S3, Glue, Athena, etc.)
- Meaningful experience integrating data from SAP, including financial, logistics, and procurement modules
- Solid understanding of data governance, data quality frameworks, and metadata management

Preferred Qualifications
- Experience with niche ERP platforms, including Clipboard K8 and/or Stone Profit System
- Familiarity with Microsoft Fabric components: OneLake, Lakehouse, Data Pipelines, and Semantic Models
- AWS certifications (e.g., AWS Certified Data Analytics - Specialty) or Microsoft certifications (e.g., DP-700 Fabric Analytics Engineer)
- Experience with BI and reporting platforms such as Power BI, Tableau, or Phocas
- Background in financial close processes, revenue recognition, or supply chain operations
- Familiarity with data catalog and lineage tools (e.g., Purview, Alation, Collibra)
- Experience with real-time or streaming data architectures (Kafka, Kinesis, Event Hubs)
Work Location:
Hybrid remote in Livonia, MI 48150
