Data Engineer - TechDigital
Providence, RI
About the Job
- Experience working within the Azure ecosystem, including Azure AI Search, Azure Blob Storage, and Azure Database for PostgreSQL, and an understanding of how to leverage them for data processing, storage, and analytics tasks.
- Ability to preprocess and clean large datasets efficiently using Azure tools, Python, and other data manipulation tools. Experience with techniques such as data normalization, feature engineering, and data augmentation is preferred.
- Expertise in working with healthcare data standards (e.g., HIPAA and FHIR), sensitive data, and data masking techniques for personally identifiable information (PII) and protected health information (PHI) is essential.
- In-depth knowledge of search algorithms, indexing techniques, and retrieval models for effective information retrieval tasks. Familiarity with search platforms like Elasticsearch or Azure AI Search is a must.
- Familiarity with chunking techniques and working with vectors and vector databases like Pinecone.
- Ability to design, develop, and maintain scalable data pipelines for ingesting, processing, and transforming large volumes of structured and unstructured data.
- Experience with implementing best practices for data storage, retrieval, and access control to ensure data integrity, security, and compliance with regulatory requirements.
- Ability to implement efficient data processing workflows to support the training and evaluation of solutions using large language models, ensuring reliability, scalability, and performance.
- Ability to proactively identify and address issues related to data quality, pipeline failures, or resource contention, ensuring minimal disruption to systems.
- Experience with large language model frameworks such as LangChain, and knowledge of how to integrate them into data pipelines for natural language processing tasks.
- Experience working within the Snowflake ecosystem.
- Knowledge of cloud computing principles and experience in deploying, scaling, and monitoring AI solutions on cloud platforms such as Azure, AWS, and Snowflake.
- Ability to communicate complex technical concepts effectively to technical and non-technical stakeholders and collaborate with cross-functional teams.
- Analytical mindset with a keen attention to detail, coupled with the ability to solve complex problems efficiently.
- Knowledge of cloud cost management principles and best practices to optimize cloud resource usage and minimize costs.
- Minimum of 10 years' experience as a data engineer.
- Hands-on experience with the Azure cloud ecosystem.
- Hands-on experience using Python for data manipulation.
- Deep understanding of vectors and vector databases.
- Hands-on experience scaling POCs to production.
- Hands-on experience using tools such as Document Intelligence (formerly Azure Form Recognizer), Snowflake, Azure Functions, and Azure AI Search.
- Experience working with PII/PHI.
- Hands-on experience working with unstructured data.
Source: TechDigital