sparse-transformer-layers


Transformer layers for PyTorch sparse tensors

(README is a work in progress.)

Introduction

This repository contains several advanced Transformer-based layers for spatial data that may be large, spatially sparse, and/or irregularly structured. They are oriented toward Transformer-based object detectors (i.e., DETRs) that operate on multi-level feature pyramids. The repository contains implementations of:

  • Multilevel self-attention
  • Multilevel sparse neighborhood attention
  • Sparse multi-scale deformable attention (MSDeformAttention)
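To give a sense of the idea behind sparse neighborhood attention, here is a minimal pure-Python sketch (not this library's API; every name below is illustrative). Features of a spatially sparse 2D map are stored in COO form, and each occupied location attends only to occupied locations within a small window, so the cost scales with the number of nonzero locations rather than the full grid size:

```python
import math

# Illustrative sketch only: single-head dot-product attention over a
# spatially sparse 2D feature map stored as a COO-style dict.
# Most of the grid is empty; only occupied locations are stored.
features = {
    (0, 0): [1.0, 0.0],
    (0, 1): [0.0, 1.0],
    (5, 5): [1.0, 1.0],  # isolated: outside every other location's window
}

def neighborhood_attention(features, window=1):
    """For each occupied location, attend over occupied neighbors within
    `window` (Chebyshev distance), using softmaxed dot-product scores."""
    out = {}
    for (y, x), q in features.items():
        # Gather occupied neighbors (including the query location itself).
        neighbors = [
            v for (ny, nx), v in features.items()
            if abs(ny - y) <= window and abs(nx - x) <= window
        ]
        # Numerically stable softmax over dot-product scores.
        scores = [sum(qi * ki for qi, ki in zip(q, k)) for k in neighbors]
        m = max(scores)
        weights = [math.exp(s - m) for s in scores]
        total = sum(weights)
        weights = [w / total for w in weights]
        # Weighted sum of neighbor values.
        out[(y, x)] = [
            sum(w * v[d] for w, v in zip(weights, neighbors))
            for d in range(len(q))
        ]
    return out

result = neighborhood_attention(features)
```

An isolated location such as `(5, 5)` has only itself in its window, so its output equals its own feature vector; dense locations produce convex combinations of their neighbors. The library's actual layers operate on PyTorch sparse tensors and multi-level feature pyramids rather than Python dicts, but the sparsity pattern of the computation is analogous.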

The primary envisioned use case is object detection and/or segmentation on large, spatially sparse images or volumes. In this setting, standard implementations of spatial attention may be inapplicable due to the size and irregularity of the data. This repository builds on the related libraries pytorch-sparse-utils and nd-rotary-encodings to provide flexible, performant Transformer operations.
