Data Engineer II

Job

Google

Boulder, CO (In Person)

$128,500 Salary, Full-Time

Posted 1 day ago (Updated 7 hours ago) • Actively hiring

Expires 6/16/2026


Job Description

Experience level: Early — experience completing work as directed and collaborating with teammates; developing knowledge of relevant concepts and processes.

The application window will be open until at least May 29, 2026. This opportunity will remain online based on business needs, which may be before or after the specified date.
Note:
By applying to this position you will have an opportunity to share your preferred working location from the following: Boulder, CO, USA; Atlanta, GA, USA.
Minimum qualifications:
- Bachelor's degree or equivalent practical experience.
- 1 year of experience coding in one or more programming languages.
- 1 year of experience designing data pipelines and dimensional data modeling for synchronous and asynchronous system integration and implementation, using internal (e.g., Flume) and external stacks (e.g., Dataflow, Spark).
- Experience working with data models by performing exploratory queries and scripts.
Preferred qualifications:
- 1 year of experience partnering with stakeholders to deliver projects on time, within budget, and within scope.
- Experience writing and maintaining ETLs for structured and unstructured data sources.
- Experience modeling complex business processes or real-world data sources.
- Strong familiarity with non-relational (NoSQL) data storage systems and Unix/Linux environments.
- Excellent written communication and organizational skills.
- Expertise in designing data models and data warehouses and in handling large-scale distributed data processing.

About the job

gTech's Product and Tools Operations team (gPTO) leverages deep user, operational, and technical insights to innovate Google's Ads products into customer experiences that are so intuitive (or automated) that they require no support at all. gPTO partners closely with gTech's Support, Professional Services, Product Management, and Engineering teams to innovate and simplify our Ads products and to build the productivity tools ecosystem for gTech users.

In gTech Users and Products (gUP), our mission is to advocate for Google's users by creating helpful and trusted experiences across the product ecosystem. We achieve this by meeting partners and consumers where they are with support and help, representing their needs with our product partners, and proposing fixes and features that elevate their engagement with Google's diverse product ecosystem. Additionally, we provide a range of product services that ensure our products are optimized for every user, no matter where they are in the world (e.g., localization, digitization, partner integration, and more).

The US base salary range for this full-time position is $106,000-$151,000 + bonus + equity + benefits. Our salary ranges are determined by role, level, and location. Within the range, individual pay is determined by work location and additional factors, including job-related skills, experience, and relevant education or training.
Your recruiter can share more about the specific salary range for your preferred location during the hiring process. Please note that the compensation details listed in US role postings reflect the base salary only and do not include bonus, equity, or benefits.

Responsibilities

- Design, develop, and support data pipelines, warehouses, and reporting systems, leveraging AI to integrate data into full-stack software solutions.
- Create robust ETL pipelines and reporting systems for new data while continuously improving existing data models and workflows.
- Partner with stakeholders and support engineers to ensure data infrastructure evolves with business, product, and user requirements.
- Work closely with data scientists to scale and transition statistical and machine learning models into production pipelines.
- Write and review clear technical documentation while driving innovation through new analytical tools to unlock deeper data insights.
