Tallo

Jr Data Engineer


Rose IT Corp.

Full-Time

Posted 2 weeks ago (Updated 1 week ago) • Actively hiring

Expires 5/31/2026




Job Description

Role Overview

A Junior Data Engineer is responsible for building, maintaining, and optimizing data pipelines and data infrastructure. This role focuses on collecting, transforming, and storing data to support analytics, reporting, and business intelligence.

Key Responsibilities

- Develop and maintain data pipelines for data ingestion and processing
- Extract, transform, and load (ETL) data from multiple sources
- Design and manage data storage solutions (data warehouses, data lakes)
- Write efficient SQL queries for data extraction and transformation
- Ensure data quality, integrity, and consistency across systems
- Monitor and troubleshoot data pipeline issues
- Collaborate with data analysts and data scientists to meet data needs
- Optimize data workflows for performance and scalability
- Implement data validation and error-handling mechanisms
- Document data processes and pipeline architectures

Required Skills

- Strong knowledge of SQL for data querying and transformation
- Proficiency in Python or Java for data processing
- Understanding of ETL processes and data pipelines
- Familiarity with databases (MySQL, PostgreSQL)
- Basic knowledge of data warehousing concepts
- Understanding of data structures and algorithms
- Problem-solving and debugging skills

Preferred Skills

- Experience with big data tools (Apache Spark, Hadoop)
- Familiarity with cloud platforms (AWS, Azure, Google Cloud Platform)
- Knowledge of data warehouse tools (Amazon Redshift, Google BigQuery, Snowflake)
- Exposure to streaming technologies (Kafka)
- Understanding of workflow orchestration tools (Apache Airflow)
- Basic knowledge of Docker and CI/CD pipelines
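For candidates unfamiliar with the ETL work the responsibilities above describe, here is a minimal sketch (not part of the posting) of the extract-transform-load pattern, using only Python's standard csv and sqlite3 modules. The data, table name, and validation rule are hypothetical; production pipelines would read from real sources and target a warehouse such as Redshift or BigQuery.

```python
import csv
import io
import sqlite3

# Extract: read raw records from a CSV source (inlined here so the
# sketch is self-contained; in practice this comes from files or APIs).
raw_csv = """id,name,amount
1,alice,10.5
2,bob,
3,carol,7.25
"""
rows = list(csv.DictReader(io.StringIO(raw_csv)))

# Transform: validate and clean the records. Rows with a missing amount
# are skipped, a simple stand-in for the data-validation and
# error-handling duties listed in the responsibilities.
clean = [
    {"id": int(r["id"]), "name": r["name"].title(), "amount": float(r["amount"])}
    for r in rows
    if r["amount"]
]

# Load: write the cleaned records into a SQL table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE payments (id INTEGER PRIMARY KEY, name TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO payments (id, name, amount) VALUES (:id, :name, :amount)", clean
)

# Query back to confirm the load succeeded.
total = conn.execute("SELECT COUNT(*), SUM(amount) FROM payments").fetchone()
print(total)  # → (2, 17.75)
```

Real pipelines wrap each of these stages with monitoring, retries, and documentation, and orchestrators such as Apache Airflow schedule them as recurring tasks.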
Tools & Technologies

- Languages: SQL, Python, Java
- Tools: Apache Spark, Airflow, Kafka
- Databases: MySQL, PostgreSQL
- Data Warehouses: Redshift, BigQuery, Snowflake
- Platforms: Linux, AWS/Azure/Google Cloud Platform
