pytorch-lightning-bolts

0.3.2.post1 (last stable release 3 years ago)

Complexity Score: N/A
Open Issues: N/A
Dependent Projects: 0
Weekly Downloads (global): 2,517

License

  • Apache-2.0
    • Attribution: Yes
    • Linking: Permissive
    • Distribution: Permissive
    • Modification: Permissive
    • Patent grant: Yes
    • Private use: Yes
    • Sublicensing: Permissive
    • Trademark grant: No

Readme

Deep Learning components for extending PyTorch Lightning

Installation • Latest Docs • Stable Docs • About • Community • Website • License

Getting Started

Pip / Conda

pip install lightning-bolts
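
Conda users can install from the conda-forge channel (assuming the conda-forge package is kept in step with the PyPI release):

conda install lightning-bolts -c conda-forge
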
Other installations

Install bleeding-edge (no guarantees)

pip install https://github.com/Lightning-Universe/lightning-bolts/archive/refs/heads/master.zip

To install all optional dependencies

pip install lightning-bolts["extra"]

What is Bolts?

The Bolts package provides a variety of components that extend PyTorch Lightning, such as callbacks and datasets, for applied research and production.
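
For instance, a Bolts datamodule and callback can be dropped straight into a standard Lightning training loop. The snippet below is a minimal sketch that assumes CIFAR10DataModule and PrintTableMetricsCallback are available in your installed pl_bolts version; the tiny classifier is only an illustration, not a Bolts component.

import torch
from torch import nn
from torch.nn import functional as F
from pytorch_lightning import LightningModule, Trainer
from pl_bolts.callbacks import PrintTableMetricsCallback
from pl_bolts.datamodules import CIFAR10DataModule


class LitClassifier(LightningModule):
    """A throwaway CIFAR-10 classifier used only to show how Bolts components plug in."""

    def __init__(self):
        super().__init__()
        self.model = nn.Sequential(nn.Flatten(), nn.Linear(32 * 32 * 3, 10))

    def training_step(self, batch, batch_idx):
        x, y = batch
        loss = F.cross_entropy(self.model(x), y)
        self.log("train_loss", loss)
        return loss

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters(), lr=1e-3)


# The datamodule handles downloads, splits and dataloaders;
# the callback prints a table of logged metrics during training.
datamodule = CIFAR10DataModule(data_dir=".")
trainer = Trainer(max_epochs=1, callbacks=[PrintTableMetricsCallback()])
trainer.fit(LitClassifier(), datamodule=datamodule)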

Example 1: Accelerate Lightning Training with the Torch ORT Callback

Torch ORT converts your model into an optimized ONNX graph, speeding up training & inference when using NVIDIA or AMD GPUs. See the documentation for more details.

from pytorch_lightning import LightningModule, Trainer
import torchvision.models as models
from pl_bolts.callbacks import ORTCallback


class VisionModel(LightningModule):
    def __init__(self):
        super().__init__()
        # Any torchvision backbone works here; VGG-19 with batch norm is just an example.
        self.model = models.vgg19_bn(pretrained=True)

    ...


model = VisionModel()
# ORTCallback wraps the model with ONNX Runtime (torch-ort) for training and inference.
trainer = Trainer(gpus=1, callbacks=ORTCallback())
trainer.fit(model)
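
Note that ORTCallback relies on the separate torch-ort package (and its ONNX Runtime backend) being installed alongside Bolts; without it, the callback cannot wrap the model.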

Example 2: Introduce Sparsity with the SparseMLCallback to Accelerate Inference

We can introduce sparsity during fine-tuning with SparseML, which ultimately allows us to leverage the DeepSparse engine to see performance improvements at inference time.

from pytorch_lightning import LightningModule, Trainer
import torchvision.models as models
from pl_bolts.callbacks import SparseMLCallback


class VisionModel(LightningModule):
    def __init__(self):
        super().__init__()
        self.model = models.vgg19_bn(pretrained=True)

    ...


model = VisionModel()
# The SparseML recipe describes how sparsity is introduced over the course of training.
trainer = Trainer(gpus=1, callbacks=SparseMLCallback(recipe_path="recipe.yaml"))
trainer.fit(model)
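
To run the sparsified model in the DeepSparse engine, it first needs to be exported to ONNX. The snippet below is a minimal sketch using plain torch.onnx.export rather than any Bolts-specific helper; the input shape and file name are illustrative.

import torch

# Export the trained, sparsified backbone to ONNX so DeepSparse can consume it.
model.eval()
dummy_input = torch.randn(1, 3, 224, 224)  # illustrative input size for VGG-19
torch.onnx.export(model.model, dummy_input, "sparse_model.onnx", opset_version=11)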

Are specific research implementations supported?

We’d like to encourage users to contribute general components that are useful across a broad range of problems; however, components that target specific domains are also welcome!

For example, a callback that helps train SSL models would be a great contribution to Bolts, whereas the next great SSL model from your latest paper is a better fit for Lightning Flash.

Use Lightning Flash to train, predict and serve state-of-the-art models for applied research. We suggest looking at our VISSL Flash integration for SSL-based tasks.

Contribute!

Bolts is supported by the PyTorch Lightning team and the PyTorch Lightning community!

Join our Slack and/or read our CONTRIBUTING guidelines to get help becoming a contributor!

License

Please observe the Apache 2.0 license that is listed in this repository. In addition, the Lightning framework is Patent Pending.

CVE Issues (active): 0
Scorecards Score: 3.40
Test Coverage: No Data
Follows SemVer: No
GitHub Stars: 1,559
Dependencies (total): 22
Dependencies (outdated): 0
Dependencies (deprecated): 0
Threat Modelling: No
Repo Audits: No

Releases: 13
