Snowflake Developer With AWS - Ccube
Wilmington, DE
About the Job
Hello All,
We are hiring a Snowflake Data Engineer with AWS experience in Wilmington, DE.
Below is the JD for reference.
Snowflake Data Engineer with AWS
Wilmington, DE (Hybrid 3 Days a Week)
W2/FT Preferred
Note: No OPT/CPT, please.
Job responsibilities
- Serve as a data domain expert with a drive to know everything about the data on the platform
- Create functional and technical specifications, epics and user stories, process flows, data analysis and mapping documents, implementation plans, and agile artifacts
- Migrate from Hadoop to AWS using pipelines and EMR
- Develop, enhance, and test new and existing interfaces as part of an existing agile team, developing and enhancing ETL pipelines and designing solutions
- Handle DevOps efforts including CI/CD, code scanning, performance testing, and test coverage
- Identify, analyze, and interpret trends or patterns in complex data sets, and transform existing ETL logic onto the Hadoop platform
- Innovate new ways of managing, transforming, and validating data
- Establish and enforce guidelines to ensure consistency, quality, and completeness of data assets
- Apply quality assurance best practices to all work products
- Experience working in development teams using agile techniques, object-oriented development, and scripting languages is preferred
- Add to a team culture of diversity, equity, inclusion, and respect
Required qualifications, capabilities, and skills
- 10 years of database experience; knowledge of application, data, and infrastructure architecture disciplines
- Strong experience with documentation and structuring information in tools like Confluence and Jira
- Experience in Spark SQL, Impala, and big data technologies
- Familiar with Data Science concepts and applying them to analyze large volumes of data
- Comfortable with data concepts: Oracle, Java, Python, Spark, Kafka, HDFS, Airflow, Elasticsearch
- Working proficiency in SDLC CI/CD execution (GitHub, Jenkins, Sonar, Spinnaker, AIM, etc.)
- Experience in AWS services like Lambda and EC2
- Experience with real-time streaming data
- Strong experience with UNIX shell scripting is a must
- Experience with relational database environments (Oracle, Teradata, SQL Server, etc.), leveraging databases, tables/views, stored procedures, agent jobs, etc., with strong analytical skills and the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy
- Good understanding of the change management process
- BS or MS in computer science, information systems, math, business, or engineering
Preferred qualifications, capabilities, and skills
- Minimum 2 years of experience with AWS services like Lambda and EC2
- Experience in Athena, EMR, Redshift, Glue, Kinesis, Aurora, RDS, and S3
- Knowledge of one or more modern programming languages like Java or Python is a plus
- AWS Cloud Practitioner certification is a plus for the applicant and expected upon joining the team
- Experience working on AdTech/MarTech platforms is a plus
Part of our mission statement is to provide a great life for our employees. We believe that happy employees make for a better company, so we take care of them. Here are a few of the perks we offer:
- Flexibility for you to work in-office, hybrid or remote
- Medical and Dental Premiums
- Startup culture with opportunity to grow
- Performance bonuses
- Health, dental and vision insurance plans
- Pursue growth and learning with career development expenses
Source: Ccube