Data Engineer - My IT LLC
Charlotte, NC
About the Job
Job Title: Data Engineer
Duration: 6+ Months
Location: Charlotte, NC
Experience Level: 8 Years
Job Summary: We are seeking a skilled Data Engineer to join our dynamic team. The ideal candidate will design, build, and maintain scalable data pipelines and architectures, ensuring efficient data integration and high performance. This role requires a deep understanding of data processing technologies, Java programming, and cloud-based data solutions.
Key Responsibilities:
- Design, develop, and maintain data pipelines and data architecture for large-scale data processing.
- Collaborate with data scientists, analysts, and other engineers to gather requirements and design data solutions.
- Write robust, scalable, and maintainable code in Java to handle data extraction, transformation, and loading (ETL).
- Optimize data workflows for performance and scalability.
- Ensure data quality, security, and governance practices are adhered to in all data engineering solutions.
- Integrate data from multiple sources, including structured and unstructured data, into cohesive datasets.
- Implement data monitoring, logging, and error-handling mechanisms for reliable data pipelines.
- Work with cloud-based services (e.g., AWS, GCP, Azure) to build and deploy data solutions.
- Participate in code reviews and contribute to best practices for the development team.
- Troubleshoot and resolve issues related to data processing and data storage.
Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 3+ years of experience as a Data Engineer or similar role.
- Strong programming skills in Java with a solid understanding of object-oriented design.
- Experience with big data processing technologies such as Apache Hadoop, Apache Spark, or Apache Kafka.
- Proficiency in SQL and experience with relational and NoSQL databases (e.g., PostgreSQL, MySQL, MongoDB).
- Familiarity with data integration tools and ETL frameworks.
- Knowledge of Informatica and Apache Kafka is required.
- Experience with cloud platforms and their data services (e.g., AWS S3, AWS Glue, Google BigQuery).
- Knowledge of data modeling and schema design.
- Experience in performance tuning and data optimization techniques.
- Familiarity with containerization technologies such as Docker and orchestration tools like Kubernetes is a plus.
- Excellent problem-solving skills and ability to work collaboratively in a team environment.
- Strong communication and documentation skills.
Preferred Skills:
- Experience with real-time data streaming and processing (e.g., Apache Flink, Apache Storm).
- Proficiency in other programming languages such as Python or Scala.
- Familiarity with data visualization tools and practices.
- Certifications related to cloud data engineering or Java development are a plus.
Source: My IT LLC