Senior Software Engineer Developer Position Available In Durham, North Carolina

Tallo's Job Summary: The Senior Software Engineer Developer position at Fidelity Investments in Durham, NC involves developing Big Data applications in a Cloud environment using Python and Jenkins. The role includes building and maintaining large-scale data processing systems, developing innovative Big Data solutions, and implementing solutions in the data analytics space. The ideal candidate should have a Bachelor's degree in a relevant field and at least three years of experience as a Senior Software Engineer/Developer. Key skills include optimizing PySpark scripts, developing end-to-end ETL workflows, designing scalable solutions for data lakes, and writing complex SQL queries. The role offers a competitive salary and a hybrid model combining onsite and offsite work.

Company:
Fidelity Investments
Job Type:
Full-time

Job Description

Senior Software Engineer Developer
Durham, NC

Salary:

Listed in the job description or discussed with your recruiter

Published:

May 07, 2025
Onsite

Experience:

Manager

ID:

2111395

Company:

Fidelity Investments

Job Description:
Position Description:

Develops Big Data applications in a Cloud environment using Python and Jenkins. Builds and maintains large-scale data processing systems and develops innovative Big Data solutions in an Agile environment. Builds advanced analytics solutions using Cloud technologies, including Amazon Web Services (AWS). Implements Big Data solutions in the data analytics space. Develops highly scalable distributed systems using open-source technologies. Provides business solutions by developing complex or multiple software applications.

Primary Responsibilities:

Develops original and creative technical solutions to ongoing development efforts.
Designs applications or subsystems on major projects across multiple platforms.
Develops applications for multiple projects supporting several divisional initiatives.
Supports and performs all phases of testing leading to implementation.
Assists in planning and conducting user acceptance testing.
Develops comprehensive documentation for multiple applications supporting several corporate initiatives.
Responsible for post-installation testing and resolution of any problems.
Establishes project plans for projects of moderate scope.
Works on complex assignments, often spanning multiple phases of a project.
Performs independent and complex technical and functional analysis for multiple projects supporting several initiatives.

Education and Experience:

Bachelor’s degree (or foreign education equivalent) in Computer Science, Engineering, Information Technology, Information Systems Technologies, Information Assurance, Mathematics, Physics, or a closely related field and three (3) years of experience as a Senior Software Engineer/Developer (or closely related occupation) developing Python packages for Extraction, Transformation and Loading (ETL) processes in Big Data applications.
Or, alternatively, Master’s degree (or foreign education equivalent) in Computer Science, Engineering, Information Technology, Information Systems Technologies, Information Assurance, Mathematics, Physics, or a closely related field and one (1) year of experience as a Senior Software Engineer/Developer (or closely related occupation) developing Python packages for Extraction, Transformation and Loading (ETL) processes in Big Data applications.

Skills and Knowledge:

Candidate must also possess:
Demonstrated Expertise (“DE”) optimizing PySpark scripts using push-down predicates in AWS Glue jobs to process large data sets in AWS S3; processing complex nested JSON files; and optimizing parallel processing using the file-grouping method to reduce out-of-memory errors and improve efficiency with big data workloads exceeding 50,000 files at a time.
DE developing and maintaining end-to-end ETL workflows, integrations, system maintenance, and troubleshooting; developing mapplets and mappings for complex requirements using Informatica PowerCenter and BDM; and performing deployment activities and upgrades, identifying bottlenecks, improving performance, reducing total run-time, and providing on-call support for the production environment.
DE designing and implementing scalable solutions for large-scale data lakes using AWS Glue; ensuring seamless integration with diverse data sources (S3 and RDS); creating Glue crawlers, data catalogs, and Glue ETL jobs over structured and unstructured data and transforming it for analysis with Athena; and parallelizing reads and writes with partitions to improve efficiency and reduce cost.
DE writing complex SQL queries in MySQL, Postgres, Oracle, and Snowflake for data validation, anomaly checks, and data-consistency checks using built-in SQL constraints, RegEx, and display types; optimizing SQL performance through techniques such as indexing and query restructuring; and developing stored procedures in SQL to encapsulate and streamline data-processing logic for seamless integration into ETL workflows.
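For candidates unfamiliar with the first DE bullet, the idea behind push-down predicates and file grouping can be sketched without Glue at all. In an actual Glue job these effects come from `create_dynamic_frame.from_catalog(..., push_down_predicate=..., additional_options={"groupFiles": "inPartition", ...})`; the sketch below is a minimal, Glue-free simulation of the concept, and the bucket name, key layout, and helper functions are invented for illustration.

```python
# Hypothetical sketch of partition pruning and file grouping over
# S3-style partitioned keys (no AWS or Glue dependency).

def prune_partitions(keys, year):
    """Keep only keys in the matching partition -- what a push-down
    predicate such as "year == '2024'" does before any data is read."""
    return [k for k in keys if f"year={year}/" in k]

def group_files(keys, group_size):
    """Batch keys into fixed-size groups -- the effect of Glue's
    groupFiles/groupSize options, which avoid one task per small file
    and the out-of-memory errors that come with tens of thousands of files."""
    return [keys[i:i + group_size] for i in range(0, len(keys), group_size)]

keys = [f"s3://example-bucket/table/year={y}/part-{i:05d}.json"
        for y in (2023, 2024) for i in range(6)]

pruned = prune_partitions(keys, 2024)   # only the year=2024 keys survive
batches = group_files(pruned, 4)        # processed a batch at a time
```

The point of the predicate is that filtering happens on partition metadata before any file is opened, so the skipped partitions cost nothing.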
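The SQL-tuning bullet can likewise be illustrated in miniature. This sketch uses SQLite from Python's standard library rather than the engines named above (MySQL, Postgres, Oracle, Snowflake), whose syntax differs but whose indexing behavior is analogous; the table and column names are invented.

```python
# Minimal sketch: an index turns a full-table scan into an index search,
# which EXPLAIN QUERY PLAN makes visible. SQLite stands in for the
# production engines; table/column names are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (id INTEGER PRIMARY KEY,"
             " symbol TEXT, qty INTEGER)")
conn.executemany("INSERT INTO trades (symbol, qty) VALUES (?, ?)",
                 [("AAPL", 10), ("MSFT", 5), ("AAPL", 7)])

# Without this index the WHERE clause scans every row; with it, the
# engine seeks directly to the matching entries.
conn.execute("CREATE INDEX idx_trades_symbol ON trades (symbol)")

plan = conn.execute(
    "EXPLAIN QUERY PLAN "
    "SELECT SUM(qty) FROM trades WHERE symbol = 'AAPL'").fetchone()
total = conn.execute(
    "SELECT SUM(qty) FROM trades WHERE symbol = 'AAPL'").fetchone()[0]
```

On real data sets the same check (does the plan say "USING INDEX"?) is the quickest way to confirm that an index or a query restructuring actually took effect.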

Category:

Information Technology
Fidelity’s hybrid working model blends the best of both onsite and offsite work experiences. Working onsite is important for our business strategy and our culture. We also value the benefits that working offsite offers associates. Most hybrid roles require associates to work onsite every other week (all business days, M-F) in a Fidelity office.
