Better docs for reexported packages #2046


Merged (6 commits), Aug 29, 2022
8 changes: 4 additions & 4 deletions docs/make.jl
@@ -18,12 +18,12 @@ makedocs(
"Loss Functions" => "models/losses.md",
"Regularisation" => "models/regularisation.md",
"Advanced Model Building" => "models/advanced.md",
"NNlib" => "models/nnlib.md",
"Functors" => "models/functors.md"
"Neural Network primitives from NNlib.jl" => "models/nnlib.md",
"Functor from Functors.jl" => "models/functors.md"
Review comment (Member):

One more thought: should Zygote.jl be among the packages that get a sidebar heading?

],
"Handling Data" => [
"One-Hot Encoding" => "data/onehot.md",
"MLUtils" => "data/mlutils.md"
"One-Hot Encoding with OneHotArrays.jl" => "data/onehot.md",
"Working with data using MLUtils.jl" => "data/mlutils.md"
],
"Training Models" => [
"Optimisers" => "training/optimisers.md",
4 changes: 3 additions & 1 deletion docs/src/data/mlutils.md
@@ -1,4 +1,4 @@
# MLUtils.jl
# Working with data using MLUtils.jl
Review comment (Member):

I can't comment below, but flatten appears on this page (as it should), and also here:

https://fluxml.ai/Flux.jl/latest/models/layers/#Flux.flatten

Should it be removed?

Reply (Member Author):


Ah, yes! Thanks!

Reply (Member Author):


Correction: MLUtils.unsqueeze cross-references MLUtils.flatten, and the doctests fail if I remove MLUtils.flatten's reference from the docs.


Flux re-exports the `DataLoader` type and utility functions for working with
data from [MLUtils](https://github.com/JuliaML/MLUtils.jl).
@@ -7,6 +7,8 @@ data from [MLUtils](https://github.com/JuliaML/MLUtils.jl).

`DataLoader` can be used to handle iteration over mini-batches of data.

For more information on `DataLoader`, `Flux`'s website has a [dedicated tutorial](https://fluxml.ai/tutorials/2021/01/21/data-loader.html).
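As a quick, illustrative sketch (the array sizes and variable names below are made up, not part of the API), iterating over mini-batches looks like this:

```julia
using MLUtils  # Flux re-exports DataLoader from here

# 100 observations of 10 features each, stored observation-last
X = rand(Float32, 10, 100)
y = rand(Float32, 100)

loader = DataLoader((X, y), batchsize = 16, shuffle = true)

for (xb, yb) in loader
    # xb is 10×16 (the final partial batch is 10×4); yb matches along the last axis
end
```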

```@docs
MLUtils.DataLoader
```
12 changes: 7 additions & 5 deletions docs/src/models/functors.md
@@ -1,15 +1,17 @@
# Functors.jl
# Recursive transformations from Functors.jl

Flux makes use of the [Functors.jl](https://github.com/FluxML/Functors.jl) to represent many of the core functionalities it provides.
Flux models are deeply nested structures, and [Functors.jl](https://github.com/FluxML/Functors.jl) provides tools needed to explore such objects, apply functions to the parameters they contain, and re-build them.

Functors.jl is a collection of tools designed to represent a [functor](https://en.wikipedia.org/wiki/Functor_(functional_programming)). Flux makes use of it to treat certain structs as functors. Notable examples include the layers that Flux defines.
New layers should be annotated using the `Functors.@functor` macro. This will enable [`params`](@ref Flux.params) to see the parameters inside, and [`gpu`](@ref) to move them to the GPU.

`Functors.jl` has its own [notes on basic usage](https://fluxml.ai/Functors.jl/stable/#Basic-Usage-and-Implementation) for more details.
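As a sketch, a hypothetical `Affine` layer (the name and fields here are illustrative) could be made visible to these tools like so:

```julia
using Flux, Functors

struct Affine
    W
    b
end

Affine(in::Integer, out::Integer) = Affine(randn(Float32, out, in), zeros(Float32, out))
(m::Affine)(x) = m.W * x .+ m.b

Functors.@functor Affine  # lets params, gpu, fmap, etc. recurse into W and b

m = Affine(3, 2)
Flux.params(m)  # now collects the two arrays W and b
```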

```@docs
Functors.@functor
Functors.fmap
Functors.isleaf
Functors.children
Functors.fcollect
Functors.functor
Functors.@functor
Functors.fmap
Functors.fmapstructure
```
125 changes: 85 additions & 40 deletions docs/src/models/nnlib.md
@@ -1,4 +1,4 @@
# NNlib.jl
# Neural Network primitives from NNlib.jl

Flux re-exports all of the functions exported by the [NNlib](https://github.com/FluxML/NNlib.jl) package.

@@ -7,82 +7,127 @@ Flux re-exports all of the functions exported by the [NNlib](https://github.com/
Non-linearities that go between layers of your model. Note that, unless otherwise stated, activation functions operate on scalars. To apply them to an array you can call `σ.(xs)`, `relu.(xs)` and so on.
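For example (the input values here are arbitrary):

```julia
using NNlib

xs = [-2.0, 0.0, 3.0]

relu.(xs)            # broadcast over the array, not relu(xs): gives [0.0, 0.0, 3.0]
σ.(xs)               # elementwise sigmoid
leakyrelu.(xs, 0.1)  # activations with extra parameters broadcast the same way
```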

```@docs
NNlib.celu
NNlib.elu
NNlib.gelu
NNlib.hardsigmoid
NNlib.sigmoid_fast
NNlib.hardtanh
NNlib.tanh_fast
NNlib.leakyrelu
NNlib.lisht
NNlib.logcosh
NNlib.logsigmoid
NNlib.mish
NNlib.relu
NNlib.relu6
NNlib.rrelu
NNlib.selu
NNlib.sigmoid
NNlib.softplus
NNlib.softshrink
NNlib.softsign
NNlib.swish
NNlib.tanhshrink
NNlib.trelu
celu
elu
gelu
hardsigmoid
sigmoid_fast
hardtanh
tanh_fast
leakyrelu
lisht
logcosh
logsigmoid
mish
relu
relu6
rrelu
selu
sigmoid
softplus
softshrink
softsign
swish
hardswish
tanhshrink
trelu
```

## Softmax

`Flux`'s `logitcrossentropy` uses `NNlib.logsoftmax` internally.
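A brief sketch of the expected behaviour (input sizes are arbitrary):

```julia
using NNlib

x = randn(Float32, 5, 3)   # treat the 3 columns as samples
p = softmax(x)             # acts along dims = 1 by default

sum(p; dims = 1)           # every column sums to ≈ 1
logsoftmax(x)              # ≈ log.(softmax(x)), but more numerically stable
```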

```@docs
NNlib.softmax
NNlib.logsoftmax
softmax
logsoftmax
```

## Pooling

`Flux`'s `AdaptiveMaxPool`, `AdaptiveMeanPool`, `GlobalMaxPool`, `GlobalMeanPool`, `MaxPool`, and `MeanPool` use `NNlib.PoolDims`, `NNlib.maxpool`, and `NNlib.meanpool` as their backend.
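A minimal sketch of these backends in use (the input sizes are illustrative):

```julia
using NNlib

x = rand(Float32, 8, 8, 1, 1)   # WHCN layout: width × height × channels × batch
pdims = PoolDims(x, (2, 2))     # 2×2 window; stride defaults to the window size

size(maxpool(x, pdims))         # (4, 4, 1, 1)
size(meanpool(x, pdims))        # (4, 4, 1, 1)
```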

```@docs
NNlib.maxpool
NNlib.meanpool
PoolDims
maxpool
meanpool
```

## Padding

```@docs
pad_reflect
pad_constant
pad_repeat
pad_zeros
```

## Convolution

`Flux`'s `Conv` and `CrossCor` layers use `NNlib.DenseConvDims` and `NNlib.conv` internally.
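A sketch of the low-level call (the sizes below are illustrative):

```julia
using NNlib

x = rand(Float32, 28, 28, 3, 1)  # WHCN: one 28×28 image with 3 channels
w = rand(Float32, 5, 5, 3, 7)    # 5×5 kernels, 3 input → 7 output channels

y = conv(x, w)                   # stride 1, no padding
size(y)                          # (24, 24, 7, 1)
```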

```@docs
NNlib.conv
NNlib.depthwiseconv
conv
ConvDims
depthwiseconv
DepthwiseConvDims
DenseConvDims
```

## Upsampling

`Flux`'s `Upsample` layer uses `NNlib.upsample_nearest`, `NNlib.upsample_bilinear`, and `NNlib.upsample_trilinear` as its backend. Additionally, `Flux`'s `PixelShuffle` layer uses `NNlib.pixel_shuffle` as its backend.
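A sketch of these backends (sizes are illustrative; note `pixel_shuffle` trades channels for spatial resolution, so the channel count must be divisible by `r^2`):

```julia
using NNlib

x = rand(Float32, 4, 4, 1, 1)
size(upsample_nearest(x, (2, 2)))   # (8, 8, 1, 1)
size(upsample_bilinear(x, (2, 2)))  # (8, 8, 1, 1)

y = rand(Float32, 3, 3, 4, 1)       # 4 channels, upscale factor r = 2
size(pixel_shuffle(y, 2))           # (6, 6, 1, 1)
```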

```@docs
NNlib.upsample_nearest
NNlib.upsample_bilinear
NNlib.upsample_trilinear
NNlib.pixel_shuffle
NNlib.grid_sample
upsample_nearest
∇upsample_nearest
upsample_linear
∇upsample_linear
upsample_bilinear
∇upsample_bilinear
upsample_trilinear
∇upsample_trilinear
pixel_shuffle
```

## Batched Operations

`Flux`'s `Bilinear` layer uses `NNlib.batched_mul` internally.
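A sketch of batched multiplication, which multiplies matching slices along the trailing batch dimension (sizes are illustrative):

```julia
using NNlib

A = rand(Float32, 2, 3, 10)               # ten 2×3 matrices
B = rand(Float32, 3, 4, 10)               # ten 3×4 matrices

size(batched_mul(A, B))                   # (2, 4, 10): one product per slice
size(batched_mul(A, batched_adjoint(A)))  # (2, 2, 10): A[:,:,k] * A[:,:,k]' for each k
```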

```@docs
NNlib.batched_mul
NNlib.batched_mul!
NNlib.batched_adjoint
NNlib.batched_transpose
batched_mul
batched_mul!
batched_adjoint
batched_transpose
batched_vec
```

## Gather and Scatter

`Flux`'s `Embedding` layer uses `NNlib.gather` as its backend.
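A sketch of the lookup this enables (the table and indices below are illustrative):

```julia
using NNlib

table = rand(Float32, 4, 10)   # a 4-dim vector for each of 10 tokens
idx = [3, 3, 7]

size(NNlib.gather(table, idx)) # (4, 3): columns 3, 3 and 7 of table

# scatter reverses this, accumulating src slices into a destination by index:
NNlib.scatter(+, ones(Float32, 2, 3), [1, 1, 2])  # 2×2; slot 1 sums src columns 1 and 2
```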

```@docs
NNlib.gather
NNlib.gather!
NNlib.scatter
NNlib.scatter!
```

## Sampling

```@docs
grid_sample
∇grid_sample
```

## Losses

```@docs
ctc_loss
```

## Miscellaneous

```@docs
NNlib.logsumexp
logsumexp
NNlib.glu
```