vllm-online 0.4.2
Last stable release: 5 months ago
Install
Complexity Score: High
Open Issues: 1,956
Dependent Projects: 0
Weekly Downloads (global): 24
Keywords: amd, cuda, gpt, inference, inferentia, llama, llm, llm-serving, llmops, mlops, model-serving, pytorch, rocm, tpu, trainium, transformer, xpu
License: Apache-2.0
attribution: Yes
linking: Permissive
distribution: Permissive
modification: Permissive
patent grant: Yes
private use: Yes
sublicensing: Permissive
trademark grant: No
Quality: 65
CVE Issues (Active): 0
Scorecards Score: No Data
Test Coverage: No Data
Follows Semver: Yes
GitHub Stars: 27,681
Dependencies (total): 0
Dependencies (Outdated): 0
Dependencies (Deprecated): 0
Threat Modelling: No
Repo Audits: No
Maintenance: 59
Docs: 60
Learn how to distribute vllm-online in your own private PyPI registry
$ pip install vllm-online
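To pull vllm-online from a private Cloudsmith-hosted PyPI repository rather than the public index, pip can be pointed at the repository's index URL. This is a sketch only: OWNER, REPO, and TOKEN below are hypothetical placeholders, not values taken from this page, and the URL follows Cloudsmith's general PyPI endpoint pattern, which may differ for your repository.

```shell
# Hypothetical example: install from a private Cloudsmith PyPI repository.
# OWNER, REPO, and TOKEN are placeholders for your organization, repository,
# and entitlement token; check your repository's "Set Me Up" instructions
# for the exact index URL.
pip install vllm-online \
  --index-url https://dl.cloudsmith.io/TOKEN/OWNER/REPO/python/simple/
```

The `--index-url` flag replaces PyPI as the package source; to fall back to PyPI for dependencies not in the private repository, `--extra-index-url` can be used instead.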
Releases
0.4.2 (stable version), released 5 months ago
PyPI on Cloudsmith
Getting started with PyPI on Cloudsmith is fast and easy.
Learn more about PyPI on Cloudsmith
View the Cloudsmith + Python Docs