
Data Developer


TESTEREERE

Indio Hills, CA (In Person)

Full-Time

Posted 2 weeks ago (Updated 1 day ago) • Actively hiring

Expires 6/8/2026



Job Description

Position Overview

We are seeking a pragmatic and detail-oriented Data Developer to design, build, and maintain scalable data pipelines and data solutions that power analytics and operational systems. The ideal candidate will have strong SQL skills, hands-on experience with ETL and modern data stack technologies, a solid understanding of data modeling and warehousing concepts, and a proven ability to optimize performance and ensure high data quality and governance.

Key Responsibilities

Design, develop, test, and maintain ETL/ELT pipelines to ingest, transform, and deliver data across transactional and analytical systems.

Develop and maintain data warehouses and data marts using dimensional modeling and star schema principles to support reporting and analytics.

Implement batch and streaming data solutions using technologies such as Spark/PySpark, Spark SQL, Spark Streaming, Kafka, and AWS Kinesis.

Work with cloud data platforms (Snowflake, Amazon Redshift, BigQuery, Azure Synapse) and on-premises Hadoop ecosystems to ensure scalable storage and processing.

Author and optimize complex queries (T-SQL, PL/SQL) and perform performance tuning, indexing, and query optimization for databases and data platforms.

Integrate and orchestrate workflows using Airflow, DBT, SSIS, Informatica, Talend, or equivalent orchestration/ETL tools.

Implement data quality checks, monitoring, metadata management, and data governance practices to ensure trusted and auditable data.

Collaborate with data engineers, analysts, data scientists, and product teams to translate business requirements into robust data solutions.

Build and expose data via APIs (REST/JSON/XML) and support downstream BI tools (Tableau, Power BI, Looker) and analytics consumers.

Adopt software engineering best practices: version control (Git), CI/CD pipelines, containerization (Docker), and orchestration (Kubernetes).

Troubleshoot production issues, perform root cause analysis, and implement scalable remediation and optimization strategies.

Qualifications

Bachelor's degree in Computer Science, Information Systems, Engineering, Mathematics, or a related field (or equivalent practical experience).

3+ years of hands-on experience building data pipelines, ETL/ELT processes, and data warehouse solutions in production environments.

Strong SQL expertise (T-SQL, PL/SQL) and experience with relational databases; familiarity with NoSQL systems such as MongoDB, Cassandra, or Redis is a plus.

Practical experience with cloud data warehouses and big data platforms: Snowflake, Amazon Redshift, BigQuery, Azure Synapse, Hadoop ecosystems.

Proficiency with Spark (PySpark/Scala), Spark SQL, and streaming technologies (Kafka, AWS Kinesis).

Experience with ETL and orchestration tools such as Airflow, SSIS, Informatica, Talend, and transformation frameworks like DBT.

Solid data modeling skills (dimensional modeling, star schema) and a strong understanding of data integration, metadata management, data quality, and governance.

Programming experience in Python (pandas, NumPy), Scala, Java, or R for data processing and automation.

Familiarity with BI tools (Tableau, Power BI, Looker) and building data products to support analytics and reporting.

Knowledge of Linux, shell scripting, REST APIs, JSON/XML, version control (Git), containerization (Docker), Kubernetes, and CI/CD practices.

Strong problem-solving skills, attention to detail, effective communication, and the ability to collaborate with cross-functional teams.

Relevant certifications (e.g., Snowflake, AWS/GCP/Azure data certifications, Databricks) are a plus.
