Low-level utilities for PyTorch sparse tensors and operations.
## Introduction
PyTorch's implementation of sparse tensors lacks full support for many common operations. This repository contains a set of utilities for making PyTorch sparse tensors into more usable general-purpose sparse data structures, particularly in the context of modern neural network architectures like Transformer-based models.
For example, while the basic operation `index_select` has a sparse forward implementation, using it as part of an autograd graph alongside direct manipulation of the sparse tensor's values is not supported:
```python
# Latest PyTorch version (2.7.1) as of this writing
import torch

sparse = torch.rand(4, 4).to_sparse().requires_grad_()

# The sparse forward pass of index_select works...
selected = sparse.index_select(0, torch.tensor([0, 2]))

# ...but backpropagating through a direct manipulation of the selected
# tensor's values is not supported and raises an error in stock PyTorch:
loss = selected.values().sum()
loss.backward()
```

## Features
- Autograd-compatible implementations of bulk indexing, sparse tensor shape manipulations, and quick conversions between sparse tensor format and concatenated-batch format for use with position-invariant layers (Linear, BatchNorm, etc.).
- Interoperability with [Pydata sparse](https://sparse.pydata.org/), a numpy-like sparse array implementation, as well as [MinkowskiEngine](https://github.com/NVIDIA/MinkowskiEngine) and [spconv](https://github.com/traveller59/spconv), two popular PyTorch libraries for convolutions on sparse images and volumes.
- Full TorchScript compatibility for performance.
- Extensive unit and property-based tests to ensure correctness and reliability.
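To make the "concatenated-batch format" idea concrete, here is a minimal sketch in plain PyTorch (not this library's API, whose function names are not shown here) of gathering a batched sparse tensor's nonzero feature vectors into one concatenated tensor that a position-invariant layer like `Linear` can process in a single call:

```python
import torch

# A batch of two 3x3 "images" with a few nonzero pixels, each carrying
# a 4-channel feature vector.
dense = torch.zeros(2, 3, 3, 4)  # (batch, height, width, channels)
dense[0, 0, 1] = torch.randn(4)
dense[0, 2, 2] = torch.randn(4)
dense[1, 1, 0] = torch.randn(4)

# Sparse over (batch, height, width); the channel dimension stays dense.
sparse = dense.to_sparse(3)

# Concatenated-batch format: all nonzero feature vectors stacked into one
# (total_nonzero, channels) tensor, regardless of batch element.
flat_values = sparse.coalesce().values()  # shape (3, 4)

# A position-invariant layer can now process every nonzero location at once.
linear = torch.nn.Linear(4, 8)
out = linear(flat_values)  # shape (3, 8)
```

The point of the round trip is that only the three nonzero locations are ever touched, rather than all 18 pixel positions in the dense batch.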
## Installation
pytorch-sparse-utils has minimal requirements beyond PyTorch itself. The simplest way to install is to clone this repository and use `pip install`:
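For example (assuming the repository URL matches the GitHub Pages documentation link below):

```bash
git clone https://github.com/mawright/pytorch-sparse-utils.git
cd pytorch-sparse-utils
pip install -e .
```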
To run the test suite, you'll need to install the optional dependencies:
```bash
pip install -e ".[tests]"
```
Due to incompatibilities with newer CUDA versions, MinkowskiEngine and spconv are not installed as part of the base install. For more information on installing those libraries, see their respective repositories.
## Documentation
Full documentation is available on [GitHub Pages](https://mawright.github.io/pytorch-sparse-utils/).
## See Also
pytorch-sparse-utils provides a base set of tools for more complex neural-network operations on sparse tensors. For more sparse tensor applications, see the following repositories:
- [nd-rotary-encodings](https://github.com/mawright/nd-rotary-encodings): Fast and memory-efficient rotary positional encodings (RoPE) in PyTorch, with novel algorithm updates for multi-level feature pyramids for object detection and other applications.
- [sparse-transformer-layers](https://github.com/mawright/sparse-transformer-layers): Implementations of Transformer layers tailored to sparse tensors, including variants like Multi-scale Deformable Attention. Features custom gradient checkpointing logic to effectively handle sparse tensors with potentially many nonzero entries.
## Future Plans
- Custom C++/CUDA extensions for the most performance-critical operations