AWS Data Engineer - TechDigital
Newark, NJ
About the Job
- Coding skill in SQL, Python, and PySpark is mandatory; Unix scripting is desired.
- Experience on two to three real projects as an AWS data engineer is needed, with the ability to build data pipelines (ingest, transform, and deliver) using S3, Glue, and Redshift/RDS.
- Knowledge of streaming ingestion using Kafka, Kinesis, or Spark Streaming is desired.
- AWS Solutions Architect Associate or Developer Associate certification is needed; the AWS Data Analytics certification is desired.
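The ingest → transform → deliver pattern named in the requirements above can be sketched in plain Python. This is a minimal illustration only: it uses stdlib CSV parsing in place of S3/Glue/Redshift, and the sample records are hypothetical.

```python
import csv
import io

# Hypothetical raw extract, as it might arrive in an S3 landing zone (CSV text).
RAW = """order_id,amount,currency
1001,250.00,USD
1002,99.50,USD
1003,,USD
"""

def ingest(text):
    """Parse raw CSV into row dicts (stand-in for reading objects from S3)."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Drop records with missing amounts and cast types (Glue-job-style cleanup)."""
    return [
        {"order_id": int(r["order_id"]), "amount": float(r["amount"])}
        for r in rows
        if r["amount"]
    ]

def deliver(rows):
    """Stand-in for loading the clean batch into Redshift/RDS."""
    return rows

clean = deliver(transform(ingest(RAW)))
```

In a real pipeline each stage would be backed by the managed service (S3 reads, a Glue ETL job, a Redshift `COPY` or JDBC load), but the stage boundaries are the same.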
Fill in the table below when sharing a profile:
| Skill/Competency | Rating | Years of Experience | Comments on primary skills |
|---|---|---|---|
| AWS Certifications | | | |
| AWS Solutions Architect Associate or Developer Associate (desired) | | | |
| AWS Data Analytics certification (nice to have) | | | |
| Data pipeline: ingest, transform, and deliver using S3, Glue, RDS, Redshift, Athena, Lambda (two to three real projects as an AWS data engineer) | | | |
| SQL, Python, Spark (mandatory) | | | |
| Complex SQL scripting and performance tuning (mandatory) | | | |
| Lake Formation, CloudFormation templates, CI/CD, Jenkins, CloudWatch (mandatory) | | | |
| Streaming ingestion using Kafka, Kinesis, or Spark Streaming (desired) | | | |
| Unix shell scripting (desired) | | | |
| Step Functions / Autosys | | | |
| Implementing ETL/ELT for data solutions: CDC, data modeling, data governance | | | |
| Ingest, storage, integration, processing, access | | | |
| Code versioning | | | |
| Jira and Confluence | | | |
| QA/Testing | | | |
| Security (IAM, security credentials, etc.) | | | |
Source: TechDigital