Data Architect - Atlanta, GA - Georgia IT Inc.
Atlanta, GA
About the Job
Job Title : Data Architect - Atlanta, GA
Client : CTS (Delta)
Max Salary : $130K-$140K per year
Work Authorization : USC, GC preferred.
Job Description
• 10-15 years of working experience, with 3+ years of experience as a Big Data solutions architect. Needs to have experience with the major big data solutions such as Hadoop, MapReduce, Hive, HBase, MongoDB, Cassandra, Spark, Impala, Oozie, Flume, ZooKeeper, Sqoop, Kafka, NiFi, etc., and NoSQL databases.
• Big Data Solution Architect certification preferred. Hands-on experience with Hadoop implementations preferred.
• Big Data Certification is a must.
• Work experience on various distributions such as Cloudera, Hortonworks, MapR etc.
• Work experience on both Real Time Streaming and Batch processing.
• Translate complex functional and technical requirements into detailed design.
• Propose best practices/standards with data security and privacy handling experience.
• Knowledge in handling different kinds of source systems and different formats of data.
• Hands-on experience with Hadoop applications (e.g. administration, configuration management, monitoring, debugging, and performance tuning).
• Strong knowledge of major programming/scripting languages such as Java, Linux shell scripting, R, and Scala, as well as experience working with ETL tools such as Informatica, Talend, and/or Pentaho.
• Experience in designing multiple data lake solutions with a good understanding of cluster and parallel architecture.
• Experience with Cloud Computing.
• Ability to benchmark systems, analyze system bottlenecks, and propose solutions to eliminate them.
• Ability to clearly articulate the pros and cons of various technologies and platforms.
• Excellent written and verbal communication skills.
• Ability to perform detailed analysis of business problems and technical environments and apply it in designing the solution.
• Ability to work in a fast-paced agile development environment.
Required Technical / Functional Skills
• Work experience on various distributions such as Cloudera, Hortonworks, MapR etc.
• Translate complex functional and technical requirements into detailed design.
• Propose best practices/standards with data security and privacy handling experience.
• Knowledge in handling different kinds of source systems and different formats of data.
• Hands-on experience with Hadoop applications (e.g. administration, configuration management, monitoring, debugging, and performance tuning).
• Strong knowledge of major programming/scripting languages such as Java, Linux shell scripting, R, and Scala, as well as experience working with ETL tools such as Informatica, Talend, and/or Pentaho.
• Experience in designing multiple data lake solutions with a good understanding of cluster and parallel architecture.
• Experience with Cloud Computing.
• Ability to benchmark systems, analyze system bottlenecks, and propose solutions to eliminate them.
• Ability to clearly articulate the pros and cons of various technologies and platforms.
• Excellent written and verbal communication skills.
• Ability to perform detailed analysis of business problems and technical environments and apply it in designing the solution.
• Ability to work in a fast-paced agile development environment.
• Experience in the Claims domain.
Desired Technical / Functional Skills
• Experience in Hortonworks Hadoop Platform
• Knowledge in Airline Domain
Source : Georgia IT Inc.