
Cloud Data Engineer

Hewlett Packard Enterprise Company

Remote

Full-Time

Posted 6 days ago (Updated 11 hours ago) • Actively hiring

Expires 6/13/2026


Job Description

This role has been designed as 'Hybrid', with an expectation that you will work on average 2 days per week from an HPE office.
Who We Are:
Hewlett Packard Enterprise is the global edge-to-cloud company advancing the way people live and work. We help companies connect, protect, analyze, and act on their data and applications wherever they live, from edge to cloud, so they can turn insights into outcomes at the speed required to thrive in today's complex world. Our culture thrives on finding new and better ways to accelerate what's next. We know varied backgrounds are valued and succeed here. We have the flexibility to manage our work and personal needs. We make bold moves, together, and are a force for good. If you are looking to stretch and grow your career, our culture will embrace you. Open up opportunities with HPE.
Job Description:
As a data engineer, you'll ensure the reliability, performance, and security of applications. You'll collaborate with data scientists to design, build, and maintain data pipelines, platforms, and services, bridging the gap between data engineering and advanced AI-driven applications. You'll work with cutting-edge AI/ML technologies such as LLMs and autonomous AI agents, enabling intelligent automation and deep data insights. You'll contribute to product development in programming, data analysis, and CI/CD infrastructure.
Responsibilities:
  • Application design and development in streaming or batch mode over Kafka and Spark
  • Building RAG pipelines and Agents/Tools
  • Evaluate and implement new technologies and tools to improve efficiency and reduce cost
  • Analyze and validate telemetry data, learn error patterns, and produce views that surface network problem conditions and patterns
  • Work with a team of data scientists, domain experts, architects, and other engineers to increase the accuracy of AI outcomes in our device management product
  • Build CI/CD pipelines
  • Work with SMEs and data scientists to increase the accuracy of actionable insights

Education and Experience Required:
  • Degree in Computer Science, Information Systems, or equivalent
  • Master's with 2 years of data engineering experience
  • At least 4 years of work experience in relevant technologies

Knowledge and Skills:
  • 2+ years of programming experience in Python
  • 1+ years of programming experience in Java
  • Expertise in big data technologies such as Apache Spark or Kafka, with at least 1 year of relevant experience
  • Experience developing Generative-AI and Agentic-AI-based applications, with at least 1 year of relevant experience
  • Experience with managing and analyzing large data sets
  • Experience with containerization and orchestration tools such as Kubernetes and Airflow, with at least 1 year of relevant experience
  • Experience developing applications in cloud computing environments, with at least 2 years of relevant experience

Addit…
