Global Platform Big Data Architect Position Available In Fulton, Georgia
Job Description
Global Platform Big Data Architect
Equifax – Alpharetta, GA
Full-time | Estimated: $142K – $180K a year
Benefits: Health insurance, paid time off, 401(k) matching
Qualifications: CI/CD, data modeling, cloud architecture, Computer Science, data lake, Kubernetes, 5 years, big data, DevOps, Spark, AWS certification, NoSQL, Google Cloud Platform, data governance, Java, Master’s degree, AWS, Docker, Bachelor’s degree, machine learning, Scala, Terraform, S3, Kafka, metadata, Redshift, senior level, leadership, communication skills, Python, MLOps, design patterns

Full Job Description
Alpharetta, United States of America | Technology | Full time | Posted 6/23/2025 | Requisition J00168610

Equifax is where you can power your possible. If you want to achieve your true potential, chart new paths, develop new skills, collaborate with bright minds, and make a meaningful impact, we want to hear from you.

We are seeking a highly experienced and visionary Global Platform Big Data Architect to spearhead the design, development, and evolution of our next-generation Data Fabric platform. This pivotal role will be responsible for defining the architectural roadmap, establishing best practices, and providing expert guidance to engineering teams building scalable, reliable, and secure data solutions across both Google Cloud Platform (GCP) and Amazon Web Services (AWS). The ideal candidate will possess deep technical expertise in big data technologies and cloud-native data services, and a proven track record of delivering complex data platforms.

Equifax has a hybrid work schedule that allows for 2 days of remote work (Monday and Friday), with 3 required onsite days (Tuesday, Wednesday, Thursday) every week. This role will work the required onsite days at our Equifax office in Alpharetta, Georgia. This position does not offer immigration sponsorship (current or future), including F-1 STEM OPT extension support. This position is not open to third-party vendors or C2C.

What you will do
Data Fabric Vision & Strategy:
Define and champion the architectural vision and strategy for our enterprise-wide Data Fabric platform, enabling seamless data discovery, access, integration, and governance across disparate data sources.
Architectural Leadership:
Lead the design and architecture of highly scalable, resilient, and cost-effective data solutions leveraging a diverse set of big data and cloud-native services in GCP and AWS.
Technical Guidance & Mentorship:
Provide expert architectural guidance, technical leadership, and mentorship to multiple engineering teams, ensuring adherence to architectural principles, best practices, and design patterns.
Platform Development & Evolution:
Drive the selection, implementation, and continuous improvement of core data platform components, tools, and frameworks.
Cloud-Native Expertise:
Leverage deep understanding of GCP and AWS data services (e.g., BigQuery, Dataflow, Dataproc, Pub/Sub, Cloud Storage, S3, EMR, Kinesis, Redshift, Glue, Athena) to design optimal solutions.
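For illustration only (this is an editorial sketch, not a detail of the posting): a streaming pipeline of the kind this item alludes to might be expressed as an Apache Beam job, runnable on Dataflow, that reads events from Pub/Sub and lands them in BigQuery. The project, topic, table, and schema names below are hypothetical placeholders.

```python
# Minimal sketch, assuming hypothetical project/topic/table names:
# a streaming Apache Beam job (Dataflow-compatible) from Pub/Sub to BigQuery.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    options = PipelineOptions(streaming=True)  # Pub/Sub input requires streaming mode
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadEvents" >> beam.io.ReadFromPubSub(topic="projects/my-project/topics/events")
            # Messages arrive as bytes; assume each is a JSON object matching the table schema.
            | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            | "WriteToBQ" >> beam.io.WriteToBigQuery(
                "my-project:analytics.events",  # hypothetical dataset and table
                schema="event_id:STRING,payload:STRING,ts:TIMESTAMP",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )


if __name__ == "__main__":
    run()
```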
Data Governance & Security:
Architect and implement robust data governance, security, privacy, and compliance measures within the data platform, ensuring data integrity and regulatory adherence.
Performance & Optimization:
Identify and address performance bottlenecks, optimize data pipelines, and ensure efficient resource utilization across cloud environments.
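As a hedged illustration of the kind of pipeline tuning this item describes, the PySpark fragment below prunes columns before the shuffle, right-sizes shuffle parallelism, and caches a reused aggregate. The S3 paths and column names are invented for the example.

```python
# Illustrative sketch only, with hypothetical paths and columns:
# two routine Spark optimizations — early column pruning and shuffle sizing.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("pipeline-optimization-sketch")
    .config("spark.sql.shuffle.partitions", "200")  # right-size shuffle parallelism
    .getOrCreate()
)

events = (
    spark.read.parquet("s3a://my-bucket/events/")   # hypothetical S3 input
    .select("customer_id", "event_type", "amount")  # prune columns before the shuffle
)

# Aggregate once; cache because the result is reused downstream.
daily = events.groupBy("customer_id").agg(F.sum("amount").alias("total"))
daily.cache()
daily.write.mode("overwrite").parquet("s3a://my-bucket/aggregates/daily/")
```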
Innovation & Research:
Stay abreast of emerging big data and cloud technologies, evaluate their potential impact, and recommend their adoption where appropriate.
Cross-Functional Collaboration:
Collaborate closely with data scientists, data engineers, analytics teams, product managers, and other stakeholders to understand data requirements and translate them into architectural designs.
Documentation & Standards:
Develop and maintain comprehensive architectural documentation, standards, and guidelines for data platform development.
Proof-of-Concepts (POCs):
Lead and execute proof-of-concepts for new technologies and architectural patterns to validate their feasibility and value.

What experience you will need
Bachelor’s or Master’s degree in Computer Science, Software Engineering, or a related quantitative field.
10+ years of progressive experience in data architecture, big data engineering, or cloud platform engineering.
5+ years of hands-on experience specifically designing and building large-scale data platforms in a cloud environment.
Expertise in designing and implementing data lakes, data warehouses, and data marts in cloud environments.
Proficiency in at least one major programming language for data processing (e.g., Python, Scala, Java).
Deep understanding of distributed data processing frameworks (e.g., Apache Spark, Flink).
Experience with various data modeling techniques (dimensional, relational, NoSQL).
Solid understanding of DevOps principles, CI/CD pipelines, and infrastructure as code (e.g., Terraform, CloudFormation).
Experience with real-time data streaming technologies (e.g., Kafka, Kinesis, Pub/Sub).
Strong understanding of data governance, data quality, and metadata management concepts.
Excellent communication, presentation, and interpersonal skills, with the ability to articulate complex technical concepts to both technical and non-technical audiences.
Proven ability to lead and influence technical teams without direct authority.

What could set you apart
Strong, demonstrable experience with Google Cloud Platform (GCP) big data services (e.g., BigQuery, Dataflow, Dataproc, Pub/Sub, Cloud Storage, Composer, Cloud Functions).
GCP certifications (e.g., Professional Data Engineer, Professional Cloud Architect).
Strong, demonstrable experience with Amazon Web Services (AWS) big data services (e.g., S3, EMR, Kinesis, Redshift, Glue, Athena, Lambda).
AWS certifications (e.g., Solutions Architect Professional, Big Data Specialty).
Experience with data mesh principles and implementing domain-oriented data architectures.
Familiarity with other cloud platforms (e.g., Azure) or on-premises data technologies.
Experience with containerization technologies (e.g., Docker, Kubernetes).
Knowledge of machine learning operationalization (MLOps) principles and platforms.
Contributions to open-source big data projects.

We offer comprehensive compensation and healthcare packages, 401(k) matching, paid time off, and organizational growth potential through our online learning platform with guided career tracks.

Are you ready to power your possible? Apply today, and get started on a path toward an exciting new career at Equifax, where you can make a difference!

Who is Equifax?
At Equifax, we believe knowledge drives progress. As a global data, analytics and technology company, we play an essential role in the global economy by helping employers, employees, financial institutions and government agencies make critical decisions with greater confidence. We work to help create seamless and positive experiences during life’s pivotal moments: applying for jobs or a mortgage, financing an education or buying a car.
Our impact is real, and to accomplish our goals we focus on nurturing our people: supporting their career advancement, learning, and development; growing our next generation of leaders; maintaining an inclusive and diverse work environment; and regularly engaging and recognizing our employees. Regardless of location or role, the individual and collective work of our employees makes a difference, and we are looking for talented team players to join us as we help people live their financial best.

Equifax is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran.