vllm-npu
0.4.2.post2
Last stable release: 11 months ago
Source Code
See on PyPI
Complexity Score: High
Open Issues: 3,034
Dependent Projects: 0
Weekly Downloads (global): 19
Keywords
amd
blackwell
cuda
deepseek
deepseek-v3
gpt
gpt-oss
inference
kimi
llama
llm
llm-serving
model-serving
moe
openai
pytorch
qwen
qwen3
tpu
transformer
License
Apache-2.0
attribution: Yes
linking: Permissive
distribution: Permissive
modification: Permissive
patent grant: Yes
private use: Yes
sublicensing: Permissive
trademark grant: No
Quality: 60
CVE Issues (Active): 0
Scorecards Score: No Data
Test Coverage: 59.00%
Follows Semver: No
GitHub Stars: 61,024
Dependencies (total): 24
Dependencies Outdated: 0
Dependencies Deprecated: 0
Threat Modelling: No
Repo Audits: No
Maintenance: 68
Docs: 60
Learn how to distribute vllm-npu in your own private PyPI registry
$ pip install vllm-npu
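If vllm-npu is published to a private Cloudsmith PyPI repository rather than the public index, pip can be pointed at that repository at install time. The command below is a minimal sketch, not a definitive recipe: OWNER, REPO, and TOKEN are placeholders for a Cloudsmith organization, repository, and entitlement token, and the exact index URL layout should be confirmed against the Cloudsmith Python documentation linked at the bottom of this page.

# Illustrative one-off install from a Cloudsmith-hosted PyPI index
# (OWNER, REPO, and TOKEN are placeholders, not real values)
$ pip install vllm-npu \
    --extra-index-url https://dl.cloudsmith.io/TOKEN/OWNER/REPO/python/simple/

--extra-index-url keeps the public PyPI index as a fallback; use --index-url instead if packages should resolve only from the private repository.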
Releases
0.4.2.post2
Stable version
Released 11 months ago
PyPI on Cloudsmith
Getting started with PyPI on Cloudsmith is fast and easy.
Learn more about PyPI on Cloudsmith
View the Cloudsmith + Python Docs
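As a rough sketch of what getting started can look like, pip can also be configured once so that later installs resolve from a Cloudsmith repository in addition to PyPI. OWNER and REPO below are placeholder names and the URL shape is an assumption; confirm it against the docs linked above.

# Illustrative persistent configuration (OWNER and REPO are placeholders)
$ pip config set global.extra-index-url https://dl.cloudsmith.io/public/OWNER/REPO/python/simple/
$ pip install vllm-npu    # now resolves from both PyPI and the Cloudsmith repository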