Big Data Engineer
Tential | ConsultNet | Thoughtwave Software and Solutions, Inc. | Software Guidance & Assistance
Rockville, MD (In Person)
Full-Time
Job Description
Big Data Engineer
Job Description Summary
We are seeking a highly skilled and experienced Big Data Engineer to design, develop, and optimize large-scale data processing systems. In this role, you will work closely with cross-functional teams to architect data pipelines, implement data integration solutions, and ensure the performance, scalability, and reliability of big data platforms. The ideal candidate will have deep expertise in distributed systems, cloud platforms, and modern Big Data technologies such as Hadoop and Spark.
Demonstrated technical expertise in object-oriented and database technologies/concepts, resulting in the deployment of enterprise-quality solutions.
Experience developing enterprise-quality solutions in an iterative or Agile environment.
Extensive knowledge of industry-leading software engineering approaches, including Test Automation, Build Automation, and Configuration Management frameworks.
Strong written and verbal technical communication skills.
Demonstrated ability to develop effective working relationships that improved the quality of work products.
Should be well organized, thorough, and able to handle competing priorities.
Ability to maintain focus and develop proficiency in new skills rapidly.
Ability to work in a fast-paced environment.
Experience with object-oriented programming languages such as Java, Scala, or Python.
Responsibilities:
- Design, develop, and maintain large-scale data processing pipelines using Big Data technologies (e.g., Hadoop, Spark, Python, Scala).
- Implement data ingestion, storage, transformation, and analysis solutions that are scalable, efficient, and reliable.
- Stay current with industry trends and emerging Big Data technologies to continuously improve the data architecture.
- Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions.
- Optimize and enhance existing data pipelines for performance, scalability, and reliability.
- Develop automated testing frameworks and implement continuous testing for data quality assurance.
- Conduct unit, integration, and system testing to ensure the robustness and accuracy of data pipelines.
- Work with data scientists and analysts to support data-driven decision-making across the organization.
- Write and maintain automated unit, integration, and end-to-end tests.
- Monitor and troubleshoot data pipelines in production environments to identify and resolve issues.
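Several of the responsibilities above center on automated testing for data quality. A minimal sketch of what such a check might look like, in plain Python with assertion-style tests; the `clean_amounts` transform, its field names, and its rules are hypothetical, invented purely for illustration:

```python
def clean_amounts(rows):
    """Hypothetical transform: drop rows whose 'amount' is missing or
    negative, and normalize surviving amounts to float."""
    cleaned = []
    for row in rows:
        amount = row.get("amount")
        if amount is None:
            continue                      # missing value: drop the row
        amount = float(amount)
        if amount < 0:
            continue                      # negative amounts are invalid here
        cleaned.append({**row, "amount": amount})
    return cleaned

# Automated data-quality checks, written as plain assertions so they can
# run in any test runner (unittest, pytest) or a CI pipeline step.
assert clean_amounts([{"amount": "3.5"}, {"amount": None}, {"amount": -1}]) == [
    {"amount": 3.5}
]
assert clean_amounts([]) == []
```

Checks like these are the building blocks of the continuous-testing setup the responsibilities describe: each pipeline transform gets a small, fast suite that runs on every change.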
Education/Experience Requirements:
Bachelor's degree in Computer Science, Information Systems, or a related discipline with at least five (5) years of related experience, or equivalent training and/or work experience; Master's degree and past Financial Services industry experience preferred.
Essential Technical Skills:
AI Tool Proficiency:
Hands-on experience with AI development tools (GitHub Copilot, Q Developer, ChatGPT, Claude, etc.)
Technical Background:
Strong software development background with ability to contribute to technical discussions
Agile Methodology:
Extensive experience with Scrum, Kanban, and continuous improvement practices
Big Data Technologies:
Experience with Big Data technologies such as Hadoop, Spark, Hive, and Trino.
- Evaluate understanding of common issues such as:
  - Data skew and strategies to mitigate it.
  - Working with massive data volumes, in the petabyte range.
  - Troubleshooting job failures caused by resource limitations, bad data, and scalability challenges.
- Look for real-world debugging and mitigation stories.
AI Skills
Prompt Engineering:
Proficiency in crafting effective prompts for AI coding assistants and analysis tools
AI Workflow Design:
Experience redesigning development processes to leverage AI capabilities
Data Analysis:
Ability to interpret AI-generated insights and translate them into actionable team improvements
Change Management:
Experience leading teams through AI adoption and workflow transformation
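The data-skew question in the technical skills section invites a concrete answer. One common mitigation is key salting: splitting a hot key into several sub-keys so its records spread across workers, then merging the partial results. A minimal sketch in plain Python rather than any specific Spark API; the function names and toy data are invented for illustration:

```python
import random
from collections import defaultdict

def salted_partition(records, num_salts=4):
    """Spread records that share one hot key across several salted keys.

    records: iterable of (key, value) pairs (hypothetical toy data).
    Each record gets a random salt, so a single hot key maps to up to
    num_salts sub-partitions instead of landing on one worker.
    """
    buckets = defaultdict(list)
    for key, value in records:
        salt = random.randrange(num_salts)   # random salt per record
        buckets[(key, salt)].append(value)   # salted key -> sub-partition
    return buckets

def merge_partials(buckets):
    """Second pass: fold the salted sub-totals back to the original key."""
    totals = defaultdict(int)
    for (key, _salt), values in buckets.items():
        totals[key] += sum(values)           # combine partial sums
    return dict(totals)

# A skewed dataset: almost every record shares the key "hot".
data = [("hot", 1)] * 1000 + [("cold", 1)] * 10
buckets = salted_partition(data, num_salts=4)
totals = merge_partials(buckets)
assert totals == {"hot": 1000, "cold": 10}   # salting preserves the aggregate
```

The same two-phase pattern (partial aggregation on salted keys, then a final merge) is what skew-mitigation features in distributed engines automate; in an interview, a candidate would be expected to explain when the extra shuffle is worth it.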