vllm 0.10.1.1
Last stable release: 3 weeks ago
Source Code
See on PyPI
Complexity Score: High
Open Issues: N/A
Dependent Projects: 174
Weekly Downloads (global): 1,178,207
Keywords
amd
cuda
deepseek
gpt
hpu
inference
inferentia
llama
llm
llm-serving
llmops
mlops
model-serving
pytorch
qwen
rocm
tpu
trainium
transformer
xpu
License
Apache-2.0
attribution: Yes
linking: Permissive
distribution: Permissive
modification: Permissive
patent grant: Yes
private use: Yes
sublicensing: Permissive
trademark grant: No
Dependencies
Runtime
Development
Quality: 58
CVE Issues (active): 0
Scorecards Score: No Data
Test Coverage: No Data
Follows Semver: No
GitHub Stars: 57,283
Dependencies (total): 68
Dependencies (outdated): 3
Dependencies (deprecated): 0
Threat Modelling: No Data
Repo Audits: No Data
Maintenance: 37
Docs: 60
Learn how to distribute vllm in your own private PyPI registry
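As a sketch, pulling vllm from a private Cloudsmith-hosted PyPI registry typically means pointing pip at that repository's PyPI-compatible index instead of the public one. The workspace, repository, and token below are placeholders, not real values:

```shell
# Install vllm from a private Cloudsmith PyPI registry (sketch).
# "my-org", "my-repo", and "TOKEN" are placeholders for your own
# workspace, repository name, and entitlement token.
pip install vllm \
  --index-url https://dl.cloudsmith.io/TOKEN/my-org/my-repo/python/simple/
```

Using `--extra-index-url` instead of `--index-url` would let pip fall back to the public PyPI for packages not mirrored in the private repository.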
$ pip install vllm
Releases
0.10.1.1 (stable), released 3 weeks ago
PyPI on Cloudsmith
Getting started with PyPI on Cloudsmith is fast and easy.
Learn more about PyPI on Cloudsmith
View the Cloudsmith + Python Docs