
AI Systems Engineer

Job

Trinity Industries

Dallas, TX (In Person)

Full-Time

Posted 3 days ago (Updated 20 hours ago) • Actively hiring

Expires 6/12/2026


Job Description

AI Systems Engineer
Location: Dallas, Texas, United States
Department: Information Technology
Requisition: 2501173
Posted: 23 hours ago

Delivering Goods For The Good Of All™
TrinityRail

While TrinityRail has built one of the broadest and deepest railcar platforms in the world, it's what's inside our cars that counts. We are proud to manufacture, maintain, and manage the railcars that deliver the indispensable goods that keep our lives, and our economy, rolling. "Delivering Goods For The Good Of All" is why we proudly come to work each day, and it is why our company exists. This is our purpose. TrinityRail provides:
  • Railcar Manufacturing:
    https://www.trinityrail.com/products/
  • Railcar Leasing Solutions:
    https://www.trinityrail.com/leasing/
  • Railcar Maintenance & Parts:
    https://www.trinityrail.com/maintenance/
  • Fleet Management Services:
    https://www.trinityrail.com/fleet-management/

For more information, visit us at trinityrail.com. Learn more about Trinsight®: https://www.trinityrail.com/trinsight/ Trinsight is TrinityRail's cloud-based platform, purpose-built to bring your fleet into focus, allowing you to transform and streamline your supply chain with the real-time tracking, analysis, and management features that modern logistics demands.

Trinity Industries is searching for an AI Data Engineer to join our Service Analytics organization, supporting rail optimization and shipper decisioning solutions. In this role, you will use Claude and modern AI tooling to build data pipelines, accelerate data science work, and ship production AI capabilities on top of our Azure and Databricks platform. You will sit at the intersection of data engineering and data science.
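As a rough sense of the layered pipeline work the role describes, here is a minimal sketch of the bronze / silver / gold (medallion) pattern. It uses plain Python dictionaries standing in for PySpark DataFrames, and every field name (car_id, miles) is invented for illustration, not taken from Trinity's actual schemas.

```python
# Minimal medallion-style layering sketch: raw "bronze" records are cleaned
# into "silver", then aggregated into a "gold" analytics table.
# Plain Python stands in for PySpark; all field names are hypothetical.

def to_silver(bronze_rows):
    """Clean raw telematics records: drop malformed rows, normalize types."""
    silver = []
    for row in bronze_rows:
        if row.get("car_id") is None or row.get("miles") is None:
            continue  # a real pipeline would quarantine malformed records
        silver.append({"car_id": str(row["car_id"]), "miles": float(row["miles"])})
    return silver

def to_gold(silver_rows):
    """Aggregate cleaned records into per-railcar mileage totals."""
    totals = {}
    for row in silver_rows:
        totals[row["car_id"]] = totals.get(row["car_id"], 0.0) + row["miles"]
    return totals

bronze = [
    {"car_id": 101, "miles": "12.5"},
    {"car_id": 101, "miles": 7.5},
    {"car_id": None, "miles": 3.0},   # malformed: dropped at the silver layer
]
gold = to_gold(to_silver(bronze))
print(gold)  # {'101': 20.0}
```

In a Databricks setting each layer would be a Delta table and the functions would be PySpark transformations, but the layering idea is the same.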
You will be involved in building data pipelines, training and evaluating models, and building LLM-powered systems. What amplifies this role beyond a standard DS/DE seat is your fluency with Claude as a development partner: Claude Code, custom Skills, sub-agents, hooks, and the Model Context Protocol (MCP). You will partner with data engineers, data scientists, analysts, and business stakeholders to turn telematics, maintenance, and operations data into decisions our customers and operators can act on. Join our team today and be a part of Delivering Goods for the Good of All!

What you'll do:
  • Design, develop, and operate data pipelines on Databricks (PySpark, SQL, Python)
  • Build and ship LLM applications and agents using the Claude API: document extraction, RAG over maintenance and tariff data, internal copilots, and workflow automation
  • Use Claude Code as a primary engineering tool: author and maintain Claude Code Skills (packaged slash-command workflows), sub-agents, hooks, and MCP integrations that let the team build pipelines and analytical assets faster
  • Partner with data scientists on model development: feature pipelines, evaluation harnesses, training runs, and the path from notebook to production
  • Process and optimize large-scale datasets, including IoT, telematics, and geospatial data, to support analytical and operational use cases
  • Establish and enforce engineering hygiene around AI work: prompt evaluation, cost governance (prompt caching, model routing), monitoring, and drift detection
  • Translate ambiguous business problems from rail, service, and operations stakeholders into shipped pipelines, models, or AI tools
  • Apply version control and collaborative development practices across Azure DevOps repos and pipelines to ensure code quality and deployment readiness
  • Identify and implement process improvements and automation to improve pipeline efficiency, reliability, and maintainability
  • Partner with management to prioritize data initiatives and align engineering solutions with organizational information needs

What you'll need:

The core test for this role is simple: can you manage pipelines and extract data from Databricks, and can you use Claude as a real engineering partner? If yes, you can do this job.

  • Bachelor's Degree in Computer Science, Information Management, or a related field required; Master's preferred
  • 8+ years in data engineering, including prior experience in data transformation
  • Databricks: hands-on experience building, running, and debugging data pipelines using the medallion architecture (bronze / silver / gold). Comfortable extracting data, writing PySpark, SQL, and Python, and managing jobs end to end
  • IDE fluency: daily driver in VS Code, Cursor, JetBrains, or equivalent. Comfortable in repos, terminals, and modern dev workflows
  • Claude Code: hands-on experience with the Claude Code architecture (Skills, sub-agents, hooks, MCP servers, and settings) and how to compose them into reliable engineering workflows. Bring examples of what you have built
  • Claude API (or equivalent): production experience with prompt design, tool use, structured output, and evaluation
  • Applied data science: comfortable with the model lifecycle, from feature engineering and evaluation to the path from notebook to production (you do not need to be a research scientist)
  • Team engineering hygiene: Git, code review, CI, and Azure DevOps repos and pipelines
  • Communication: able to explain a model, a pipeline, or a trade-off to a non-technical stakeholder without losing them
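To make the "structured output" requirement concrete, here is a sketch of a document-extraction request built in the shape of the Anthropic Messages API's tool-use pattern, as a plain dict so it can be inspected without a network call. The model id, tool name, and schema fields are all placeholders, not values from this role's actual systems.

```python
import json

# Structured output via forced tool use: the model is required to call a
# single tool whose input_schema defines the JSON we want back.
# Model id, tool name, and fields below are illustrative placeholders.

def build_extraction_request(document_text: str) -> dict:
    return {
        "model": "claude-sonnet-example",        # placeholder model id
        "max_tokens": 1024,
        "tools": [{
            "name": "record_maintenance_event",  # hypothetical tool
            "description": "Record one maintenance event from a document.",
            "input_schema": {
                "type": "object",
                "properties": {
                    "car_id": {"type": "string"},
                    "event_type": {"type": "string"},
                },
                "required": ["car_id", "event_type"],
            },
        }],
        # Forcing tool_choice guarantees the reply is schema-shaped JSON.
        "tool_choice": {"type": "tool", "name": "record_maintenance_event"},
        "messages": [{"role": "user", "content": document_text}],
    }

req = build_extraction_request("Car TILX 1234 received wheel maintenance.")
print(json.dumps(req["tool_choice"]))
```

An evaluation harness would replay a fixed set of documents through requests like this and score the extracted fields against labeled expectations.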
THE FOLLOWING MUST BE ATTACHED TO YOUR APPLICATION
Submit your resume along with a link to a personal GitHub repository (or public gist) showcasing your Claude Code architecture — Skills, sub-agents, hooks, MCP servers, settings, or any combination you have built. A short README explaining what each piece does and why you built it is more valuable than volume. If the repo is private, a redacted tree, settings.json, and one or two example Skill files are sufficient. Candidates who include a working Claude Code repo with their application will be prioritized for the technical round.
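For orientation, one plausible layout for such a repository is sketched below. The directory and file names follow common Claude Code conventions (project settings under .claude/, one SKILL.md per Skill, MCP servers in .mcp.json), but the specific names here are illustrative, not a prescribed structure.

```
my-claude-setup/
├── README.md              # what each piece does and why it was built
├── .claude/
│   ├── settings.json      # permissions and hook wiring
│   ├── skills/
│   │   └── pipeline-review/
│   │       └── SKILL.md   # packaged slash-command workflow
│   └── agents/
│       └── data-qa.md     # sub-agent definition
└── .mcp.json              # MCP server configuration
```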
