Data Engineer III at Randstad USA
Chicago, IL 60606
About the Job
The Product Analytics team at our Client is on a transformational journey to unlock the full potential of enterprise data, build a dynamic, diverse, and inclusive culture, and develop a modern cloud-based data lake architecture that scales our applications and drives growth through data and machine learning. Our objective is to enable the enterprise to unleash the potential of its data through innovation and agile thinking, and to execute an effective data strategy that transforms business processes, rapidly accelerates time to market, and enables insightful decision making.
JOB OVERVIEW AND RESPONSIBILITIES
In this role you will partner with various teams to define and execute data acquisition, storage, transformation, and processing, and to make data actionable for operational and analytics initiatives that create sustainable revenue and share growth. This role requires expertise in the Client's data sources and technology, business intuition, and a working knowledge of data transformation and analytical tools.
location: Chicago, Illinois
job type: Contract
salary: $75.05 - $80.05 per hour
work hours: 8am to 5pm
education: Bachelors
responsibilities:
- Support large-scale data pipelines in a distributed, scalable environment
- Enable and optimize the production AWS environment for data infrastructure and frameworks
- Create Terraform modules to automate deployments
- Apply knowledge of Databricks and data lake technologies
- Partner with development teams and other department leaders/stakeholders to provide cutting-edge technical solutions that enable business capabilities
- Participate in and lead the design and development of innovative batch and streaming data applications using AWS technologies
- Provide the team with technical direction and guide them in resolving queries and issues
- AWS Certification
- Knowledge: Python, Bash scripting, PySpark, AWS services (Airflow, Glue, Lambda, and others), Terraform, Databricks
- Skills: thorough troubleshooter, hands-on AWS technology leader, strong interpersonal skills, and the ability to see an undertaking through to completion
- Abilities: solving problems under pressure and in constrained scenarios, leadership, and sound judgment
- Must be fluent in English (written and spoken)
- Coordinate and guide cross-functional projects involving team members across all areas of the enterprise, vendors, external agencies, and partners
- Manage multiple short- and long-term deliverables in a fast-paced, demanding environment while staying flexible to dynamic needs and priorities
- Manage agile development and delivery by collaborating with the project manager, product owner, and development leads
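To give candidates a concrete sense of the Terraform responsibility listed above, here is a minimal sketch of invoking a reusable module; the module path, bucket name, and variable values are hypothetical illustrations, not the Client's actual configuration:

```terraform
# Hypothetical invocation of a reusable S3 data lake module.
# All names and values below are illustrative only.
module "data_lake_bucket" {
  source = "./modules/s3-data-lake"   # assumed local module path

  bucket_name = "client-product-analytics-raw"
  environment = "production"

  tags = {
    team = "product-analytics"
  }
}
```

In practice, modules like this let the team stamp out consistent, reviewable infrastructure across environments instead of hand-editing resources.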
Top 5 Skill sets
1. DevOps
2. AWS Cloud
3. Terraform
4. Python
5. CI/CD pipelines
Nice to have skills or certifications:
1. Blue-Green deployments
2. Kubernetes
3. Ansible Playbooks
REQUIRED
- Bachelor's degree in a quantitative field (statistics, software engineering, business analytics, information systems, aviation management, or a related degree)
- 5+ years of experience in a data engineering or ETL development role
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement
- Strong analytical skills related to working with structured, semi-structured, and unstructured datasets
- Experience with BigQuery, SQL Server, etc.
- Experience with AWS cloud