Tallo

Remote Sr. Data Engineer

Insight Global

Remote

Full-Time

Posted 5 days ago (Updated 3 hours ago) • Actively hiring

Expires 6/8/2026


Job Description

The Senior Data Engineer will design, code, test, and analyze software programs and applications. This includes researching, designing, documenting, and modifying software specifications throughout the production lifecycle. The role also analyzes and resolves software errors in a timely and accurate fashion and provides status reports where required. The responsibilities outlined below are not all-encompassing; other duties, responsibilities, and qualifications may be required and/or assigned as necessary.
POSITION RESPONSIBILITIES
  • Work with the Product team to determine requirements and propose approaches to address users' needs.
  • Analyze requirements to determine the approach/proposed solution.
  • Design and build solutions using relevant programming languages.
  • Thoroughly test solutions using relevant approaches and tools.
  • Conduct research into software-related issues and products.
  • Bring out-of-the-box thinking and solutions to challenging issues.
  • Effectively prioritize and execute tasks in a fast-paced environment.
  • Work both independently and in a team-oriented, collaborative environment.
  • Remain flexible and adaptable in learning and understanding new technologies.
  • Be highly self-motivated and directed.
  • Demonstrate a commitment to Hyatt core values.
  • Exercise independent judgment in methods and techniques for obtaining results.
  • Work in an agile/scrum environment.
We are a company committed to creating diverse and inclusive environments where people can bring their full, authentic selves to work every day. We are an equal opportunity/affirmative action employer that believes everyone matters. Qualified candidates will receive consideration for employment regardless of their race, color, ethnicity, religion, sex (including pregnancy), sexual orientation, gender identity and expression, marital status, national origin, ancestry, genetic factors, age, disability, protected veteran status, military or uniformed service member status, or any other status or characteristic protected by applicable laws, regulations, and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application or recruiting process, please send a request to HR@insightglobal.com.

To learn more about how we collect, keep, and process your private information, please review Insight Global's Workforce Privacy Policy: https://insightglobal.com/workforce-privacy-policy/.

Skills and Requirements

Required Skills
  • Experience and comfort solving problems in an ambiguous environment where there is constant change; the tenacity to thrive in a dynamic, fast-paced environment, inspire change, and collaborate with a variety of individuals and organizational partners.
  • 5-8 years of experience designing and building scalable, robust data pipelines to enable data-driven decisions for the business.
  • Experience with Apache Airflow, Snowflake, and AWS, including writing Airflow DAGs in Python.
  • Local to Chicago is a plus.
  • Experience in one of the scripting languages: Python or Unix shell scripting.
  • Proficient in SQL, PL/SQL, relational databases (RDBMS), database concepts, and dimensional modeling.
  • Strong verbal and written communication skills.
  • Demonstrated analytical and problem-solving skills, particularly those that apply to data warehouse and big data environments.
  • Very good understanding of the full software development life cycle.
  • Very good understanding of data warehousing concepts and approaches.
  • Experience building data pipelines and ETL using Informatica IICS, Alteryx, or another leading pipeline tool.
  • Experience building high-volume data workflows in a cloud environment.
  • Experience building data warehouse and business intelligence projects.
  • Good experience working in a Snowflake environment.
  • Good experience with streaming data using Kafka.
  • Experience in data cleansing, data validation, and data wrangling.
  • Hands-on experience with AWS and AWS-native technologies such as Glue, Lambda, Kinesis, Lake Formation, S3, and Redshift.
  • Experience with Spark on EMR, RDS, EC2, Athena, API capabilities, CloudWatch, and CloudTrail is a plus.
  • Experience with business intelligence tools such as Tableau, Cognos, and ThoughtSpot is a plus.
  • Must be able to work alongside on-site hours to participate in requirements/design discussions.
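As an illustration of the data cleansing, validation, and wrangling work this role calls for, here is a minimal sketch in plain Python — no Airflow, Snowflake, or AWS dependencies, and the record fields (guest_id, check_in, rate) are invented for this example, not taken from the posting:

```python
from datetime import datetime
from typing import Optional

def clean_record(raw: dict) -> Optional[dict]:
    """Validate and normalize one raw booking record; return None if invalid.

    Field names are hypothetical, chosen only to illustrate the pattern.
    """
    # Validation: required fields must be present and non-empty.
    if not raw.get("guest_id") or not raw.get("check_in"):
        return None
    # Cleansing: trim whitespace and enforce an ISO (YYYY-MM-DD) date format.
    try:
        check_in = datetime.strptime(raw["check_in"].strip(), "%Y-%m-%d").date()
    except ValueError:
        return None
    # Wrangling: coerce the nightly rate to float; unparseable values become None.
    try:
        rate = float(raw.get("rate"))
    except (TypeError, ValueError):
        rate = None
    return {
        "guest_id": raw["guest_id"].strip(),
        "check_in": check_in.isoformat(),
        "rate": rate,
    }

raw_records = [
    {"guest_id": " G-100 ", "check_in": "2026-06-08", "rate": "199.00"},
    {"guest_id": "", "check_in": "2026-06-09"},       # rejected: missing guest_id
    {"guest_id": "G-101", "check_in": "06/09/2026"},  # rejected: non-ISO date
]
cleaned = [c for r in raw_records if (c := clean_record(r)) is not None]
```

In a production pipeline, a function like this would typically run inside an Airflow task or a Glue job rather than standalone, with rejected records routed to a quarantine table instead of being silently dropped.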
Hospitality industry
