Senior Data Engineer at Albano Systems
Hartford, CT
About the Job
This is a contract-to-hire opportunity with a hybrid work location in Charlotte, NC or Hartford, CT. W-2 hourly; no C2C.
Must be eligible to work in the US without company sponsorship now or in the future.
Our property & casualty insurance client is looking for a Staff Data Engineer to join their Commercial Lines (CL) data modernization and simplification program. You will help enable well-architected, cloud-based data solutions for data and analytics in support of BI, Actuarial, Data Science, Finance, Operations, ERM, and other functions within The Hartford. To succeed in this role, you should be a strong critical thinker with solid technical acumen and the ability to get to the root causes of business problems. You will work closely with technology colleagues, product owners, and product managers within the Agile Release Train to evolve and mature the Commercial Lines data assets for data and analytics consumers.
Responsibilities:
Accountable as the subject matter expert and/or technical lead for a large-scale data product.
Drive end-to-end solution delivery involving multiple platforms and technologies of medium to large complexity, or oversee portions of very large, complex implementations, leveraging ELT solutions to acquire, integrate, and operationalize data.
Partner with architects and stakeholders to influence and implement the vision of the pipeline and data product architecture while safeguarding the integrity and scalability of the environment.
Articulate the risks and tradeoffs of technology solutions to senior leaders, translating as needed for business leaders.
Build or enhance data pipelines using cloud-based architecture.
Build simplified data models for complex business problems.
Accountable for Data Engineering Practices across all the teams involved.
Implement and utilize leading big data technologies (AWS, Hadoop/EMR, Spark, Snowflake, Talend, and Informatica) with cloud/on-premises hybrid hosting solutions at a multi-team/product level.
Ability to execute independently.
Team player with a transformation mindset.
Ability to operate successfully in a lean and fast-paced organization, leveraging Scaled Agile principles and ways of working.
Collaborate across teams on decision making, conflict resolution, and relationship building.
Collaborate with the team to mature code quality management, FinOps principles, automated testing, and environment management practices to deliver incremental customer value.
Qualifications:
5+ years of Data engineering experience.
2+ years of developing and operating production workloads in cloud infrastructure.
Bachelor's degree in Computer Science, Data Science, Information Technology, or a similar major.
Experience with the Snowflake cloud data platform, including hands-on experience with Snowflake utilities such as SnowSQL and Snowpipe.
Expert-level skills in AWS services, Snowflake, Python, and Spark; certifications are a huge plus.
Expert-level skills with ETL tools such as Talend and Informatica.
Expert-level skills in data warehouse modeling, mapping, and building batch and real-time data pipelines and frameworks (a brief illustrative sketch follows this list).
Experience with DataOps tools such as GitHub, Jenkins, and uDeploy.
Strong Knowledge of P&C Commercial Lines business.
Strong knowledge of the legacy tech stack: Oracle Database, PL/SQL, Autosys, Hadoop, stored procedures, shell scripting, etc.
Experience using Agile tools such as Rally.
Strong written and verbal communication skills to interact effectively with both technical and non-technical users at all levels of the organization.
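For illustration only: a minimal sketch of the kind of batch pipeline work described above, written in PySpark with the Snowflake Spark connector. All paths, table names, and connection values are hypothetical placeholders, not details from this posting.

```python
# Illustrative sketch only -- paths, table names, and credentials are placeholders.
# Assumes the Snowflake Spark connector and S3 filesystem libraries are on the classpath.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("cl-policy-batch-load")  # hypothetical job name
    .getOrCreate()
)

# Acquire: read raw policy extracts from a hypothetical S3 landing zone.
raw = spark.read.parquet("s3://example-landing/commercial_lines/policies/")

# Integrate: light conformance -- deduplication, type casting, derived columns.
conformed = (
    raw.dropDuplicates(["policy_id"])
       .withColumn("effective_date", F.to_date("effective_date"))
       .withColumn("load_ts", F.current_timestamp())
)

# Operationalize: write to a Snowflake staging table via the Snowflake Spark connector.
# Connection options are placeholders; in practice they would come from a secrets manager.
sf_options = {
    "sfURL": "example_account.snowflakecomputing.com",
    "sfDatabase": "CL_DB",
    "sfSchema": "STAGING",
    "sfWarehouse": "LOAD_WH",
    "sfUser": "svc_user",
    "sfPassword": "***",
}

(
    conformed.write
    .format("net.snowflake.spark.snowflake")
    .options(**sf_options)
    .option("dbtable", "POLICY_STAGE")
    .mode("append")
    .save()
)
```

In practice, a job like this would be scheduled and promoted through the CI/CD and orchestration tooling listed above.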