Senior Software Engineer - vLLM Inference
Listed on 2026-02-21
IT/Tech
AI Engineer, Machine Learning/ML Engineer
At Red Hat we believe the future of AI is open, and we are on a mission to bring the power of open-source LLMs and vLLM to every enterprise. The Red Hat Inference team accelerates AI for the enterprise and brings operational simplicity to GenAI deployments. As leading developers and maintainers of the vLLM project, and inventors of state-of-the-art techniques for model compression, our team provides a stable platform for enterprises to build, optimize, and scale LLM deployments.
We are seeking an experienced Senior Software Engineer to work closely with our technical and research teams on vLLM, llm-compressor, speculators, and llm-d; build DevOps and CI/CD infrastructure; and scale our current technology stack. If you want to contribute to solving challenging technical problems at the forefront of AI inference, this is the role for you! You would be joining the core team behind 2025's most popular open-source project on GitHub.
In this role, your primary responsibilities will be to build and release the Red Hat AI Inference Server, continuously improve the processes and tooling used by the DevOps team, and find opportunities to automate procedures and tasks.
Join us in shaping the future of AI!
What you will do
- Collaborate with research and product development teams to scale machine learning products for internal and external applications
- Actively contribute to managing and releasing upstream and midstream product builds
- Test to ensure correctness, responsiveness, and efficiency
- Troubleshoot, debug, and upgrade dev and test pipelines
- Identify and deploy cybersecurity measures by continuously performing vulnerability assessments and risk management
- Collaborate with a cross‑functional team about market requirements and best practices
- Keep abreast of the latest technologies and standards in the field
What you will bring
- 2+ years of experience in MLOps, DevOps, automation, and/or modern software deployment practices
- Experience with Release Engineering
- Experience evaluating LLMs for performance and accuracy (think HellaSwag, MMLU, Chatbot Arena, TruthfulQA, etc.)
- Strong comfort with Python and PyTest is a must
- Strong experience with Git, GitHub Actions (including self-hosted runners), Buildkite, Terraform, Jenkins, Ansible, and/or other common technologies for automation and monitoring
- Experience administering Kubernetes/OpenShift and/or Docker/Podman
- Experience with Cloud Computing using at least one of the following Cloud infrastructures: AWS, GCP, Azure, or IBM Cloud
- Familiar with Agile development methodology
- Solid troubleshooting skills
- Ability to interact comfortably with the other members of a large, geographically dispersed team
- Experience maintaining an infrastructure and ensuring stability
- While a Bachelor’s degree or higher in computer science, mathematics, or a related discipline is valued, we prioritize technical prowess, initiative, problem solving, and practical experience
- Familiarity with contributing to the vLLM CI community is a big plus
#AI-HIRING
The salary range for this position is $ - $. Actual offer will be based on your qualifications.
Pay Transparency
Red Hat determines compensation based on several factors, including but not limited to job location, experience, applicable skills and training, external market value, and internal pay equity. Annual salary is one component of Red Hat's compensation package. This position may also be eligible for bonus, commission, and/or equity. For positions with Remote-US locations, the actual salary range for the position may differ based on location but will be commensurate with job duties and relevant work experience.
About Red Hat
Red Hat is the world’s leading provider of enterprise open source software solutions, using a community‑powered approach to deliver high‑performing Linux, cloud, container, and Kubernetes technologies. Spread across 40+ countries, our associates work flexibly across work environments, from in‑office, to office‑flex, to fully remote, depending on the requirements of their role. Red Hatters are encouraged to bring their best ideas, no matter their title or tenure.
We’re a leader in…