NVIDIA - Artificial Intelligence Algorithms Engineer - New College Grad 2024 (application via RippleMatch)
Santa Clara, CA
About the Job
This role is with NVIDIA. NVIDIA uses RippleMatch to find top talent.
We are now looking for an Artificial Intelligence Algorithms Engineer!
NVIDIA is seeking engineers to design, develop, and optimize Artificial Intelligence solutions to diverse real-world problems. If you have a strong understanding of AI/Deep Learning (DL) and a deep algorithmic background, with exposure to computer architecture and performance, then this role may be a great fit for you! You will collaborate and interact with internal partners, users, and members of the open source community to analyze, define, and implement highly optimized AI algorithms. The scope of these efforts includes a combination of implementing new algorithms, performance/accuracy tuning and analysis, defining APIs, and analyzing functionality coverage to build larger, coherent toolsets and libraries. The ability to work in a multifaceted, product-centric environment and excellent interpersonal skills are required to be successful in this role.
What you'll be doing:
Develop algorithms for AI/DL, data analytics, machine learning, or scientific computing
Tackle large-scale distributed systems capable of performing end-to-end AI training and inference deployment (data fetching, pre-processing, orchestrating and running model training and tuning, model serving)
Analyze, influence, and improve AI/DL libraries, frameworks and APIs according to good engineering practices
Research, prototype, and develop effective tools and infrastructure pipelines
Publish innovative results on GitHub and in scientific publications
What we need to see:
A PhD or Master's Degree (or equivalent experience) in Computer Science, AI, Applied Math, or related field
Strong mathematical fundamentals and AI/DL algorithm skills or experience
Excellent programming, debugging, performance analysis, test design and documentation skills
Experience with AI/DL Frameworks (e.g. PyTorch, JAX)
Excellent C/C++ and Python programming skills
Ways to stand out from the crowd:
Knowledge of GPU/CPU architecture and related numerical software
Prior experience with Generative AI techniques applied to Large Language Models and multimodal learning (image, video, speech, etc.)
Exposure to large-scale AI training, an understanding of compute system concepts (latency/throughput bottlenecks, pipelining, multiprocessing, etc.), and related performance analysis and tuning
Hands-on experience with inference and deployment environments (e.g. TRT, ONNX, Triton) would be an asset
NVIDIA is widely considered to be one of the technology world's most desirable employers. We have some of the most forward-thinking and hardworking people on the planet working with us. If you're a creative and collaborative computer scientist with a passion for Artificial Intelligence / Deep Learning algorithms, we want to hear from you!
The base salary range is 120,000 USD - 230,000 USD. Your base salary will be determined based on your location, experience, and the pay of employees in similar positions.
You will also be eligible for equity and benefits. NVIDIA accepts applications on an ongoing basis.
NVIDIA is committed to fostering a diverse work environment and proud to be an equal opportunity employer. As we highly value diversity in our current and future employees, we do not discriminate (including in our hiring and promotion practices) on the basis of race, religion, color, national origin, gender, gender expression, sexual orientation, age, marital status, veteran status, disability status or any other characteristic protected by law.