Data engineer - Applab Systems Inc
Remote, NJ 08540
About the Job
Role: Data Engineer
Remote
Full Time
H-1B Transfer Only
Job Description:
• Design and develop data applications using selected tools and frameworks as required and requested.
• Collaborate with Engineers and Business to build intuitive products that address the needs of our customers.
• Read, extract, transform, stage and load data to selected tools and frameworks as required and requested.
• Gather and process raw data at scale.
• Perform tasks such as writing scripts, web scraping, calling and writing APIs, writing SQL queries, etc.
• Work closely with the engineering team to integrate your work into our production systems.
• Process unstructured data into a form suitable for analysis. Analyze processed data.
• Participate in design and code reviews to ensure the quality of the products we deliver.
• Establish a good working rapport with Development group, System Engineers, Quality Assurance/Testers, and Business users.
• Develop scripts or workflows to facilitate complex ingress processes
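The ingestion work described above (read, clean, and stage raw data via scripts) can be sketched as a minimal Python example. This is only an illustration of the pattern, not the company's actual pipeline; the sample data and function name are hypothetical.

```python
import csv
import io

def extract_transform(raw_csv: str) -> list[dict]:
    """Parse raw CSV text, drop blank rows, and normalize field names
    before staging the records for load."""
    rows = []
    for row in csv.DictReader(io.StringIO(raw_csv)):
        # Skip rows where every field is empty or whitespace.
        if not any(v.strip() for v in row.values()):
            continue
        # Normalize headers and values: trim whitespace, lowercase keys.
        rows.append({k.strip().lower(): v.strip() for k, v in row.items()})
    return rows

raw = "Name, City\nAda, London\n , \nGrace, Washington\n"
staged = extract_transform(raw)
print(staged)  # [{'name': 'Ada', 'city': 'London'}, {'name': 'Grace', 'city': 'Washington'}]
```

In a production setting the staged records would then be loaded into a warehouse table (e.g. via a Snowflake `COPY INTO` or an Airflow task) rather than printed.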
Qualifications/Skills:
• 7+ years of recent experience in data engineering, software development, and/or ETL work
• A solid track record in data management, demonstrating flawless execution and attention to detail.
• Programming experience, ideally with Python, Snowflake, AWS, and Airflow, and a willingness to learn new programming languages to meet goals and objectives.
• Knowledge of data cleaning, wrangling, visualization and reporting, with an understanding of the best, most efficient use of associated tools and applications to complete these tasks.
• Good understanding of databases and data warehouse tools
• Experience processing large amounts of structured and unstructured data, including integrating data from multiple sources.
• A willingness to explore new alternatives or options to solve data mining issues, and utilize a combination of industry best practices, data innovations and your experience to get the job done.
Must be an expert in:
• Python: Boto3, Multiprocessing, Classes
• JavaScript (Snowflake stored procedures)
• AWS: Lambda, Triggers, EC2, S3, CloudWatch, Security.
• Snowflake: SQL & Stored Procedures
• Airflow: Celery, RabbitMQ, PostgreSQL
• Unix/Linux: shell scripting, admin experience
• Experience building APIs
Good to know:
• Looker
• Data Science
• AI
Thanks
Kawaljeet Kaur
kawaljeet@applabsystems.com
7325384989
Source : Applab Systems Inc