Lead Data Integration Engineer - Veterans Sourcing Group
Coppell, TX 75019
About the Job
Lead Data Integration Engineer
Dallas, TX 75019
Direct hire full time - $125K
JOB DESCRIPTION AND RESPONSIBILITIES:
- Function as a technical expert on one or more applications utilized by our company.
- Work with the Business System Analyst to ensure designs satisfy functional requirements.
- Partner with Infrastructure to identify and deploy optimal hosting environments.
- Tune application performance to reduce or eliminate issues.
- Research and evaluate technical solutions consistent with DTCC technology standards.
- Integrate risk and control processes into day-to-day responsibilities to monitor and mitigate risk; escalate appropriately.
- Apply different software development methodologies dependent on project needs.
- Contribute expertise to the design of components or individual programs and participate in the construction and functional testing.
- Support development teams, testing, troubleshooting, and production support.
- Build applications and construct unit test cases that ensure compliance with functional and non-functional requirements.
- Work with peers to mature ways of working, continuous integration, and continuous delivery.
QUALIFICATIONS:
- Bachelor's degree and/or equivalent experience
- Minimum of 7 years of related experience.
- Data Warehouse ETL Data Engineer – Snowflake and Python:
- Strong data engineer with hands-on experience developing large-scale data engineering pipelines for financial services, preferably for risk management.
- Minimum 7 years of related experience building and maintaining large-scale data warehouse applications in the cloud.
- Minimum 4 years of experience developing complex data pipelines that load and transform data into a Snowflake database using SQL and PL/SQL procedures.
- 3 years of experience writing and performance-tuning PL/SQL following best practices and standards.
- Minimum 5 years of hands-on experience writing, tuning, and managing complex SQL for real-time reporting from large-scale Snowflake data warehouse systems.
- Preferred: experience with a Python framework for orchestrating data loads.
- Hands-on experience writing Linux shell scripts for file processing and orchestration with Python.
- Experience creating SQL-based processes to build slowly changing dimensions and several types of fact tables.
- Experience in data lineage analysis, data profiling, and creating integration mappings across multiple source systems.
- Experience developing on the AWS cloud platform and knowledge of its services.
- Knowledge of Snowflake role-based access control, compute warehouse sizing, clustering keys for large tables, partitioning schemes, and query profiling.
- Hands-on experience creating and managing AutoSys JILs for job scheduling.
- Strong hands-on experience with Bitbucket, Git, and Liquibase, and managing multiple release versions and CI/CD pipelines.
- Good understanding of enterprise data integration concepts, data warehouse modelling, and data architecture patterns.
Source : Veterans Sourcing Group