ETL Ab Initio Data Validation Engineer (SDET) at Tech Era Inc.
Erie, PA
About the Job
Job Title: ETL Ab Initio Data Validation Engineer (SDET)
Primary skills: Ab Initio, PySpark, Hive
Iceberg on Azure Datalake (ADLS) or Iceberg on Snowflake
Database and OS: Oracle, DB2, SQL, Data Warehousing, Linux
We are seeking a highly skilled ETL Ab Initio Validation Engineer with expertise in Ab Initio and Snowflake or Azure Datalake to join our dynamic team. This role involves integrating seamlessly with Internal Development Platforms (IDP) and other tools to enhance the developer experience for ETL processes. The ideal candidate will have a strong background in data integration, ETL processes, and data warehousing, and will design, develop, and maintain data solutions that support our business intelligence and analytics initiatives.
Know your team
At ValueMomentum’s Engineering Center, we are a team of passionate engineers who thrive on tackling complex business challenges with innovative solutions while transforming the P&C insurance value chain. We achieve this through a strong engineering foundation and by continuously refining our processes, methodologies, tools, agile delivery teams, and core engineering archetypes. Our core expertise lies in six key areas: Cloud Engineering, Application Engineering, Data Engineering, Core Engineering, Quality Engineering, and Domain Expertise.
Join a team that invests in your growth. Our Infinity Program empowers you to build your career with role-specific skill development leveraging immersive learning platforms. You'll have the opportunity to showcase your talents by contributing to impactful projects.
Responsibilities
- Design, develop, and maintain ETL processes using Ab Initio to integrate data into Iceberg tables on ADLS or Snowflake.
- Optimize ETL workflows to ensure efficient data processing and loading.
- Develop scripts to automate data processing and loading tasks.
- Implement data quality checks and validation processes within ETL workflows.
- Understand business requirements and project scope, create ETL code per the business logic/process, and provide task estimates supported by data points as required.
- Ensure data governance policies are adhered to, including data lineage and metadata management.
- Provide support for data-related issues and troubleshoot ETL-related problems.
- Create and maintain technical documentation and reports for stakeholders.
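To illustrate the kind of data quality checks and validation this role implements within ETL workflows, here is a minimal Python sketch of a post-load reconciliation step. All names (`validate_load`, `policy_id`, `premium`) are hypothetical, and real pipelines would run such checks against database or Iceberg-table extracts rather than in-memory rows.

```python
# Minimal sketch of a post-load data quality check, assuming source and
# target extracts are available as lists of dicts. Field and function
# names here are illustrative, not from any specific framework.

def validate_load(source_rows, target_rows, key="policy_id",
                  required=("policy_id", "premium")):
    """Return a list of validation findings; an empty list means the load passed."""
    findings = []

    # 1. Row-count reconciliation between source and target.
    if len(source_rows) != len(target_rows):
        findings.append(
            f"row count mismatch: source={len(source_rows)} target={len(target_rows)}"
        )

    # 2. Required-field null checks on the target.
    for i, row in enumerate(target_rows):
        for field in required:
            if row.get(field) is None:
                findings.append(f"null {field!r} in target row {i}")

    # 3. Key reconciliation: every source key must appear in the target.
    missing = {r[key] for r in source_rows} - {r[key] for r in target_rows}
    if missing:
        findings.append(f"keys missing from target: {sorted(missing)}")

    return findings
```

In practice, checks like these would be wired into the ETL workflow (e.g., as an Ab Initio validation graph or a PySpark job) so that a non-empty findings list fails the load before downstream consumers see bad data.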
Requirements
Must Have:
- 5-6 years of total technical experience in designing, developing, and implementing ETL solutions using Ab Initio on Linux.
- Experience across the complete Software Development Life Cycle, including systems analysis of various applications in a client/server environment.
- Advanced working knowledge of relevant software and platforms (such as Azure ADLS, Iceberg, Spark, Hive Thrift Server, Apache Ranger, Lift, Snowflake).
- Experience in creating graphs, PSets, shell scripting, deployment activities, performance tuning, and error handling.
- Excellent hands-on experience with various Transform, Partition/De-partition, Database, Dataset, and XML components.
- Analytical problem-solving and business interaction skills.
- Must have strong exposure to Parallelism techniques, Generic graph design, EME and Data Warehousing concepts.
- Experience in writing complex Database queries (preferably Oracle and DB2).
- Effective day-to-day communication with the entire offshore team, the customer, and all concerned teams; provide daily status reports to all stakeholders.
- Experience in the insurance (e.g., UW, Claim, Policy Issuance) or financial industry preferred.