Enterprise Data Architect - Global Solutions Group
Oak Park, MI
About the Job
Work Location: Remote
Project Duration: Long Term
Client: Government
Pay Rate: $80/hour – $85/hour (depending on experience)
Candidates must have an active Secret or Top Secret clearance (due to project requirements)
As an Enterprise Data Architect, you will be responsible for the high-level design and implementation of data solutions for our clients’ complex technology landscapes. You will work closely with technical and business teams to develop data strategies and solutions that enable seamless communication between various systems, applications, and data sources. You will have expertise in data management and governance, including metadata management, data quality management, master data management, data lineage, and other related areas.
Key Responsibilities:
Develop and maintain the high-level design of the data landscape, data management components, data platform, and reporting and analytical components.
Develop and maintain data architecture standards, policies, and best practices.
Collaborate with technical and business teams to ensure seamless communication between various systems, applications, and data sources.
Develop and maintain data management and governance standards, policies, and best practices, including metadata management, data quality management, master data management, data lineage, and other related areas.
Conduct technical assessments and evaluations to identify data management and governance requirements and opportunities.
Develop and maintain data management and governance design documentation and technical specifications.
Develop and maintain databases in the Azure environment.
Build and maintain ETL pipelines using AWS and Azure data services.
Perform data migrations to and from cloud environments.
Design and implement data storage and data processing workflows.
Secure, monitor, and optimize data storage and data processing.
Develop using Python, PostgreSQL, and SQL.
Work within the team following Agile practices.
Experience building data-intensive applications that integrate with low-code/no-code front-end applications.