AWS Data Lake House Engineer Position Available In Washington, Vermont
Job Description
AWS Data Lake House Engineer Tech Army 1 National Life Drive, Montpelier, VT 05602
Job Details
Job Title: Data Lake House Engineer
Client: Direct Client
Location: Montpelier, VT
Duration: 1 Year Contract

Approach to All Layers
Design and implement a data lake in AWS that complies with requirements and best practices for protecting sensitive data, including health data and other sensitive data, in collaboration with the Vermont Technical Lead, using the latest lake house standards and best practices. The lake house will be implemented so that the State can maintain, enhance, and expand it on its own, and will use templates the State can refine to implement its data strategy in other domains. The lake house will follow a medallion architecture. All layers will be scalable as needed so that capabilities and cost can be tuned. The State prefers Python-based solutions as much as possible, while also providing support for R and other data management languages. Outputs of the development work are not intended to be used for mission-critical operational decision support or control-system automation at this time. Data will be ingested in a read-only manner and will be aggregated for reporting and analytical purposes only.

Data Security Layer
Design and implement Identity and Access Management (IAM) roles and processes. Design and implement data governance processes that are flexible to future security requirements and evolving use cases. Create technical templates to implement security controls. The existing templates are designed to comply with CJIS standards; the commercial lake will focus on complying with other data standards, so some adjustments may be necessary.

Data Catalog Layer
Design and implement a solution to govern and catalog data using AWS Glue and other technologies as recommended by the implementer. The data catalog layer will resolve issues in other layers caused by data schema drift in AWS Glue, for use with reporting and analytical needs. Design and implement crawlers and a catalog to store and maintain schema information.
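The schema-drift concern named for the catalog layer can be sketched as a comparison between a cataloged schema and a newly crawled one. This is a minimal, illustrative sketch: schemas are simplified to name-to-type dictionaries, whereas a real implementation would read them from the Glue Data Catalog (e.g. via boto3's `get_table`).

```python
# Minimal sketch, assuming simplified schemas as {column: type} dicts.
# In practice these would come from the Glue Data Catalog, not literals.
def detect_schema_drift(cataloged: dict, incoming: dict) -> dict:
    """Compare a cataloged schema against an incoming one and report drift."""
    added = {c: t for c, t in incoming.items() if c not in cataloged}
    removed = {c: t for c, t in cataloged.items() if c not in incoming}
    changed = {c: (cataloged[c], incoming[c])
               for c in cataloged.keys() & incoming.keys()
               if cataloged[c] != incoming[c]}
    return {"added": added, "removed": removed, "changed": changed}

# Example: a source added a column and widened a type between crawls.
drift = detect_schema_drift(
    {"id": "bigint", "amount": "int"},
    {"id": "bigint", "amount": "decimal(10,2)", "region": "string"},
)
```

A drift report like this could drive the downstream behavior the posting asks for, such as deciding whether reporting views need to be rebuilt before consumers see the new schema.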
Consume the metadata catalog to control permissions and processing in other components of the solution. Create crawler templates for data sources.
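A crawler "template" can be as simple as a pure function that emits the keyword arguments for boto3's `glue.create_crawler`. The sketch below is one possible shape; the bucket, role ARN, and database names are hypothetical placeholders, not the State's actual resources.

```python
# Hedged sketch of a crawler template for a data source. The role ARN,
# database, and bucket below are illustrative placeholders.
def crawler_template(source_name: str, s3_path: str,
                     role_arn: str = "arn:aws:iam::123456789012:role/glue-crawler-role",
                     database: str = "bronze_catalog") -> dict:
    """Build create_crawler kwargs for one source, with a drift-tolerant policy."""
    return {
        "Name": f"crawl-{source_name}",
        "Role": role_arn,
        "DatabaseName": database,
        "Targets": {"S3Targets": [{"Path": s3_path}]},
        # Log new columns and deprecate (rather than delete) removed tables,
        # so downstream readers are not broken by transient schema drift.
        "SchemaChangePolicy": {
            "UpdateBehavior": "LOG",
            "DeleteBehavior": "DEPRECATE_IN_DATABASE",
        },
        "TablePrefix": f"{source_name}_",
    }

cfg = crawler_template("payroll", "s3://example-bronze/payroll/")
# A real deployment would then call:
#   boto3.client("glue").create_crawler(**cfg)
```

Keeping the template as a plain function makes it easy for the State to version, review, and stamp out per-source crawlers without hand-editing console settings.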
Bronze: Data Ingestion and Landing Zone
This layer will be based on S3, combined with AWS warehouse technologies (Redshift and other technologies as recommended by the implementer). Design, implement, and templatize ingestion approaches for a variety of sources, including:
- Operational database sources (MySQL, SQL Server)
- SaaS applications
- File shares (SharePoint, OneDrive, and on-prem)
- Stream data sources
Deliverables include system templates of ingestion processes, data Extract, Load, and Transform (ELT) processes for loading from source systems, and templates for creating future ELT processes. Implement in a way that can be integrated with a future State data mastery solution.
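Under a medallion architecture, the S3-based layers above are typically organized by layer, source, and dataset. The sketch below shows one possible prefix convention; the bucket name and partition scheme are illustrative assumptions, not a mandated standard.

```python
# Minimal sketch of a medallion S3 prefix convention. The bucket name and
# ingest_date partitioning are illustrative choices, not requirements.
from datetime import date

def layer_path(layer: str, source: str, dataset: str,
               ingest_date: date, bucket: str = "example-lakehouse") -> str:
    """Build an S3 prefix for one dataset in one medallion layer."""
    if layer not in {"bronze", "silver", "gold"}:
        raise ValueError(f"unknown layer: {layer}")
    return (f"s3://{bucket}/{layer}/{source}/{dataset}/"
            f"ingest_date={ingest_date.isoformat()}/")

p = layer_path("bronze", "mysql_hr", "employees", date(2024, 7, 1))
```

A consistent convention like this is what makes the ingestion templates reusable: a new source only needs a name, and every crawler, ELT job, and permission rule can derive its paths from it.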
Job Type: Contract
Pay: $50.00 – $60.00 per hour
Benefits: Life insurance
Schedule: 8 hour shift
Ability to Commute: Montpelier, VT 05602 (Required)
Ability to Relocate: Montpelier, VT 05602: Relocate before starting work (Preferred)
Willingness to Travel: 25% (Preferred)
Work Location: In person