Cloud Engineer - Envision, LLC
St. Louis, MO 63146
About the Job
Key responsibilities include:
Design, build, and support cloud and open-source systems that process geospatial data assets via an API-based platform
Partner with other internal development communities to bring needed data sets into the asset and make that data available to the Bayer Enterprise and internal development communities
Build highly scalable APIs and the associated architecture to support thousands of requests per second
Provide leadership in advancing Bayer's understanding of environmental/external influences on field performance and risk factors
Work at all stages of the software life cycle: Proof of Concept, MVP, Production, and Deprecation
Minimum Requirements:
BSc in Computer Science or relevant job experience; 7+ years of overall experience.
Minimum of 2 years of experience with Python, Java, Go, or similar development languages.
Extensive knowledge of programming and scripting languages such as Go, Scala, Java, JavaScript, SQL, Bash, Python, and/or R.
Experience developing HTTP APIs (REST and/or GraphQL) that serve data using open-source technologies, preferably in a cloud environment (see the sketch after this list).
Ability to build and maintain modern cloud architecture, e.g., AWS or Google Cloud.
Experience working with PostgreSQL/PostGIS.
Experience with code versioning and dependency management systems such as GitHub, SVN, and Maven.
Proven success using Docker to build and deploy within a CI/CD environment, preferably with Kubernetes.
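For illustration, a minimal sketch of the kind of API work described above, assuming a hypothetical fields table with a PostGIS geometry column and a DATABASE_URL connection string (table, column, and environment variable names are placeholders):

```go
// Minimal sketch: an HTTP endpoint serving geospatial rows from a
// hypothetical PostGIS table "fields". Table, column, and env var names
// are assumptions for illustration only.
package main

import (
	"database/sql"
	"encoding/json"
	"log"
	"net/http"
	"os"

	_ "github.com/lib/pq" // PostgreSQL driver
)

type Field struct {
	ID      int    `json:"id"`
	GeoJSON string `json:"geometry"`
}

func main() {
	db, err := sql.Open("postgres", os.Getenv("DATABASE_URL"))
	if err != nil {
		log.Fatal(err)
	}

	http.HandleFunc("/fields", func(w http.ResponseWriter, r *http.Request) {
		// ST_AsGeoJSON converts the PostGIS geometry column to GeoJSON text.
		rows, err := db.Query(`SELECT id, ST_AsGeoJSON(geom) FROM fields LIMIT 100`)
		if err != nil {
			http.Error(w, err.Error(), http.StatusInternalServerError)
			return
		}
		defer rows.Close()

		var out []Field
		for rows.Next() {
			var f Field
			if err := rows.Scan(&f.ID, &f.GeoJSON); err != nil {
				http.Error(w, err.Error(), http.StatusInternalServerError)
				return
			}
			out = append(out, f)
		}
		w.Header().Set("Content-Type", "application/json")
		json.NewEncoder(w).Encode(out)
	})

	log.Fatal(http.ListenAndServe(":8080", nil))
}
```

A service of this shape would typically be containerized with Docker and deployed through a CI/CD pipeline onto Kubernetes, in line with the requirements above.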
Desirable qualifications:
MSc in Computer Science or related field.
Demonstrated knowledge of open-source geospatial solutions like GeoServer, GeoTrellis, GeoMesa.
Experience with stream processing, e.g., Kafka (see the sketch after this list).
Highly proficient (4 years) in Go (Golang).
Experience working with customers and other developers to deliver full-stack solutions, e.g., collecting software, data, and timeline requirements in an Agile environment.
Demonstrated knowledge of agriculture and/or agriculture-oriented businesses.
Experience implementing complex data projects focused on collecting, parsing, managing, and delivering large data sets to turn information into insights across multiple platforms.
Demonstrated experience adapting to new technologies.
Able to determine hardware and software design needs and act on those decisions; able to develop prototypes and proofs of concept for selected solutions.
Experience with object-oriented design, coding, and testing patterns, as well as experience engineering (commercial or open-source) software platforms and large-scale data infrastructure.
Experience creating cloud computing solutions and web applications leveraging public and private APIs.
Proven experience (2 years) with distributed systems, e.g. Argo, Kubernetes, Spark, distributed databases, grid computing.
Proficient (4+ years) working with command-line interfaces, e.g., Docker, Argo, K8s, AWS CLI, gcloud, psql, SSH.
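As a further illustration, a minimal sketch of stream processing with Kafka in Go, assuming a hypothetical field-observations topic, a local broker, and the segmentio/kafka-go client (all names are placeholders chosen for the example):

```go
// Minimal sketch: consuming a stream of geospatial events from Kafka.
// Broker address, topic, and group ID are assumptions for illustration.
package main

import (
	"context"
	"log"

	"github.com/segmentio/kafka-go"
)

func main() {
	r := kafka.NewReader(kafka.ReaderConfig{
		Brokers: []string{"localhost:9092"},
		GroupID: "geo-ingest",
		Topic:   "field-observations",
	})
	defer r.Close()

	for {
		// ReadMessage blocks until the next message arrives; offsets are
		// committed automatically because a consumer group is configured.
		msg, err := r.ReadMessage(context.Background())
		if err != nil {
			log.Fatal(err)
		}
		log.Printf("partition=%d offset=%d key=%s value=%s",
			msg.Partition, msg.Offset, string(msg.Key), string(msg.Value))
	}
}
```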
Source: Envision, LLC