GCP Data Engineer With Networking at Global IT Con
Dallas, TX 75201
About the Job
Role: GCP Data Engineer with Networking
Location: Dallas, TX (Onsite)
Duration: 12+ months
Years of Experience: 10+ years
GCP certification required.
Job Description:
We are seeking a highly skilled and experienced Senior Data Engineer with a minimum of 10 years of industry experience, specializing in Google Cloud Platform (GCP). The ideal candidate will have a strong background in data engineering, with specific expertise in designing, implementing, and optimizing data pipelines and solutions within GCP environments. Additionally, candidates should have previous experience in data stewardship roles, demonstrating a deep understanding of data governance principles and practices.
Responsibilities:
- Design, develop, and deploy scalable and efficient data pipelines and ETL processes on Google Cloud Platform (GCP), ensuring high data quality, reliability, and performance.
- Collaborate with cross-functional teams including data scientists, analysts, and business stakeholders to understand data requirements and translate them into technical solutions.
- Implement data governance best practices, including data lineage, metadata management, and data quality monitoring, to ensure compliance with regulatory requirements and internal policies.
- Serve as a subject matter expert on GCP services and data engineering technologies, providing guidance and mentorship to junior team members.
- Work closely with infrastructure and DevOps teams to optimize cloud resources and ensure smooth operation of data infrastructure.
- Develop and maintain documentation, including data models, architecture diagrams, and technical specifications.
Requirements:
- Bachelor's degree in Computer Science, Engineering, or related field; advanced degree preferred.
- Minimum of 10 years of experience in data engineering roles, with a focus on designing and building data pipelines and solutions.
- Strong proficiency in Google Cloud Platform (GCP) services such as BigQuery, Dataflow, Pub/Sub, Dataproc, and Cloud Storage, as well as SQL and Apache Hadoop.
- Previous experience in data stewardship or data governance roles, with a solid understanding of data management principles and practices.
- Proficiency in programming languages such as Python, SQL, and/or Java for data manipulation and analysis.
- Experience with data modeling, schema design, and database technologies (e.g., SQL, NoSQL).