
LLM Inference Service

This repository provides installation and usage instructions for various LLM inference frameworks on Polaris and Sophia. The frameworks currently tested include vLLM.

About

This repo hosts the different ways to run vLLM on ANL HPC systems.
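
As an illustrative sketch of the kind of usage this covers (not code from this repository): vLLM can expose an OpenAI-compatible HTTP server, which a client on the same node can then query. The snippet below assumes such a server is already running on a compute node at localhost:8000; the model name, port, and prompt are placeholders.

```python
# Minimal sketch: query a vLLM OpenAI-compatible server assumed to be running
# on a Polaris/Sophia compute node, e.g. started with:
#   vllm serve <model> --port 8000
# The endpoint and model name below are placeholders, not values from this repo.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",  # vLLM's OpenAI-compatible endpoint
    api_key="EMPTY",                      # vLLM does not require a real API key by default
)

response = client.chat.completions.create(
    model="meta-llama/Meta-Llama-3-8B-Instruct",  # placeholder model name
    messages=[{"role": "user", "content": "Hello from Polaris!"}],
)
print(response.choices[0].message.content)
```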
