
Lead Analytics Engineer


eShipping, LLC

Charlotte, NC (In Person)

$97,500 Salary, Full-Time

Posted 4 days ago (Updated 13 hours ago) • Actively hiring

Expires 6/11/2026




Job Description

Lead Analytics Engineer
eShipping, LLC · Charlotte, NC
Full-time · $80,000 - $115,000 a year

Qualifications: data model design, data visualization software proficiency, SQL databases, data modeling projects, Spark, scalable systems, SQL, data architecture design, requirements analysis, mentoring, data analytics, scalability, data validation, cross-functional collaboration, technical proficiency, project stakeholder communication, Python, cross-functional communication

Position Summary
The Lead Analytics Engineer serves as the primary analytics resource embedded within the Solutions team, bridging the gap between complex data systems and business decision-making. This role combines deep technical expertise in analytics engineering with a consultative partnership approach: translating business needs into well-structured data models, building scalable data pipelines, and equipping cross-functional stakeholders with the insights and tools they need to drive outcomes. The Lead Analytics Engineer also provides technical mentorship and guidance to peers, reviews work for accuracy, and helps elevate the team's overall data maturity.
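As a rough illustration of the bronze/silver/gold (medallion) layering this role works with, here is a minimal plain-Python sketch of the three layers. In production these would be PySpark transformations over Delta tables; the records, field names, and validation rule below are hypothetical.

```python
# Hypothetical raw rows as ingested from a source system (bronze layer: keep as-is).
bronze = [
    {"shipment_id": "S1", "weight_lb": "120", "carrier": "acme"},
    {"shipment_id": "S2", "weight_lb": "-5", "carrier": "acme"},   # invalid weight
    {"shipment_id": "S3", "weight_lb": "240", "carrier": "globex"},
]

def to_silver(rows):
    """Validate and conform bronze rows: cast types, normalize values, drop bad records."""
    out = []
    for r in rows:
        weight = float(r["weight_lb"])
        if weight <= 0:
            continue  # reject invalid data at the silver boundary
        out.append({"shipment_id": r["shipment_id"],
                    "weight_lb": weight,
                    "carrier": r["carrier"].upper()})
    return out

def to_gold(rows):
    """Aggregate silver rows into an analytics-ready, business-level dataset."""
    totals = {}
    for r in rows:
        totals[r["carrier"]] = totals.get(r["carrier"], 0.0) + r["weight_lb"]
    return totals

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # total shipped weight per carrier
```

The point of the pattern is that each layer has a contract: bronze preserves raw history, silver enforces types and quality rules, and gold exposes business-level aggregates for reporting.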
Essential Duties and Responsibilities
Duties include but are not limited to the following:
- Design, build, and maintain scalable data models, reusable datasets, and analytics-ready assets that support reliable reporting, self-service analysis, and downstream decision-making across the organization
- Use SQL expertly to query, validate, and optimize data workflows, serving as a bridge between business questions, source systems, and scalable analytics solutions
- Write and maintain Python-based data transformation logic, including production-grade PySpark pipelines, to manipulate, validate, and operationalize complex datasets at scale
- Implement and manage bronze/silver/gold data modeling patterns within a Delta Lake or comparable lakehouse architecture
- Partner directly with the Solutions team as an embedded analytics resource, proactively identifying opportunities to leverage data for operational improvements
- Translate business requirements into technical specifications and deliver actionable insights to non-technical stakeholders
- Guide other team members on analytics engineering best practices, data modeling standards, and technical approaches
- Review the work of others to ensure data accuracy, consistency, and adherence to established standards
- Review and tune established reporting solutions to diagnose and resolve performance issues
- Collaborate with engineering, operations, finance, and customer success teams to understand evolving data needs
- Evaluate, learn, and adopt new tools, platforms, and frameworks quickly, helping the team stay effective in a fast-evolving data environment

Specific Department Responsibilities
- Serve as the point of contact between the data engineering function and the Solutions team, fostering an embedded partnership
- Proactively identify gaps in existing data models and reporting and recommend improvements
- Contribute to the development and evolution of the organization's data strategy, including architecture decisions, tooling, and governance standards
- Support the evaluation and adoption of new data technologies and platforms
- Create and maintain technical documentation for data models, pipelines, and processes
- Participate in code reviews and provide constructive, growth-oriented feedback to peers
- Communicate project status, technical trade-offs, and data insights to both technical and non-technical audiences

Required Skills and Abilities
To perform this job successfully, an individual must be able to perform each essential duty satisfactorily. The requirements listed below are representative of the knowledge, skills, and/or abilities required. Reasonable accommodation may be made to enable individuals with disabilities to perform the essential functions.
- Advanced proficiency in Python for data manipulation and analytics engineering, including writing clean, maintainable code to transform, validate, and operationalize complex datasets
- Expert-level proficiency in SQL, including window functions, CTEs, complex joins, MERGE operations, and query execution plan analysis, with the ability to use SQL as a bridge between business needs and scalable data solutions
- Familiarity with Delta Lake or comparable lakehouse technologies, including schema evolution, time travel, and medallion architecture patterns
- Demonstrated ability to quickly learn new tools, platforms, and frameworks and become productive with emerging technologies in a fast-evolving data environment
- Demonstrated ability to translate complex business requirements into well-structured, scalable data models
- Excellent written and verbal communication skills, with the ability to explain technical concepts to non-technical stakeholders
- Strong analytical and problem-solving skills with keen attention to detail
- Ability to work independently with minimal oversight while exercising sound judgment
- Comfort mentoring others and providing technical guidance without direct management authority
- Ability to manage multiple priorities and adapt in a fast-paced environment
- Experience working with BI and visualization tools (e.g., Power BI, Apache Superset, or similar)
- Familiarity with cloud data platforms such as Databricks, Azure Data Lake, or comparable environments

Minimum Education and Experience
- Bachelor's degree in Computer Science, Data Science, Information Systems, Statistics, or a related field, or equivalent practical experience
- 5+ years of professional experience in analytics engineering, data engineering, or a senior data analyst role with significant hands-on data modeling responsibilities
- Hands-on production experience with Apache Spark and PySpark
- Working experience with lakehouse architectures (Delta Lake or comparable)
- Track record of partnering directly with business teams (operations, finance, solutions, customer success, etc.) as a primary analytics resource
- Experience mentoring or guiding peers in a technical environment
- Freight, logistics, or transportation industry experience preferred

Physical Demands and Work Environment
The physical demands and work environment characteristics described here are representative of those that must be met by an employee to successfully perform the essential functions of this job. Reasonable accommodation may be made to enable individuals with disabilities to perform the essential functions. This description reflects management's assignment of essential functions; it does not proscribe or restrict the tasks that may be assigned.
Physical Demands:
While performing the duties of this job, the employee is regularly required to remain in a stationary position for at least 50% of the time. The employee occasionally needs to move about inside the office to access file cabinets, office machinery, etc. The general level of physical activity is sedentary. The employee is regularly required to operate a computer and other office productivity machinery, such as a calculator, telephone, copy machine, and printer. Some movements of the hands, arms, and wrists may involve repetitive motions. Specific vision abilities required by this job include the ability to observe, inspect, identify, and assess various activities and surroundings.
Cognitive/Mental Requirements:
While performing the duties of this job, the employee is regularly required to comprehend and use basic language, either written or spoken, to communicate simple and complex information and ideas. The employee is also required to use logic to define problems, collect information, establish facts, draw valid conclusions, interpret information, and deal with abstract variables in unique or unfamiliar situations. The employee must use problem-solving skills to formulate and apply appropriate courses of action for routine or familiar situations. The employee may be required to perform numerical operations, including basic counting, adding, subtracting, multiplying, and dividing, or more complex quantitative calculations.
Work Environment:
While performing the duties of this job, the employee is inside a central heat and air-conditioned office building. The noise level in the work environment is minimal. Please note this job description is not designed to cover or contain a comprehensive listing of activities, duties or responsibilities that are required of an employee. Duties, responsibilities, and activities may change at any time with or without notice. eShipping is an Equal Opportunity Employer.
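Two of the SQL skills the posting calls out, CTEs and window functions, can be illustrated with a small, self-contained sketch using Python's built-in sqlite3 module. The shipments table, lane names, and costs below are hypothetical, and SQLite 3.25+ is assumed for window-function support.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE shipments (id INTEGER, lane TEXT, cost REAL);
INSERT INTO shipments VALUES
  (1, 'CLT-ATL', 420.0), (2, 'CLT-ATL', 455.0),
  (3, 'CLT-ORD', 610.0), (4, 'CLT-ORD', 590.0);
""")

# The CTE computes per-lane average cost; the window function ranks
# shipments by cost within each lane without collapsing the rows.
query = """
WITH lane_avg AS (
    SELECT lane, AVG(cost) AS avg_cost
    FROM shipments
    GROUP BY lane
)
SELECT s.id, s.lane, s.cost,
       RANK() OVER (PARTITION BY s.lane ORDER BY s.cost DESC) AS cost_rank,
       ROUND(s.cost - a.avg_cost, 2) AS delta_vs_avg
FROM shipments s
JOIN lane_avg a USING (lane)
ORDER BY s.lane, cost_rank;
"""
rows = conn.execute(query).fetchall()
for row in rows:
    print(row)
```

The same shape of query, a CTE feeding a windowed ranking joined back to detail rows, is a common building block for the kind of analytics-ready datasets the role describes.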
