Commit 111ee92

Better page titles

1 parent d4f1d81 commit 111ee92

4 files changed: +91 -46 lines changed


docs/make.jl
Lines changed: 4 additions & 4 deletions

@@ -18,12 +18,12 @@ makedocs(
         "Loss Functions" => "models/losses.md",
         "Regularisation" => "models/regularisation.md",
         "Advanced Model Building" => "models/advanced.md",
-        "NNlib" => "models/nnlib.md",
-        "Functors" => "models/functors.md"
+        "Neural Network primitives from NNlib.jl" => "models/nnlib.md",
+        "Functor from Functors.jl" => "models/functors.md"
     ],
     "Handling Data" => [
-        "One-Hot Encoding" => "data/onehot.md",
-        "MLUtils" => "data/mlutils.md"
+        "One-Hot Encoding with OneHotArrays.jl" => "data/onehot.md",
+        "Working with data using MLUtils.jl" => "data/mlutils.md"
     ],
     "Training Models" => [
         "Optimisers" => "training/optimisers.md",
docs/src/data/mlutils.md
Lines changed: 1 addition & 1 deletion

@@ -1,4 +1,4 @@
-# MLUtils.jl
+# Working with data using MLUtils.jl
 
 Flux re-exports the `DataLoader` type and utility functions for working with
 data from [MLUtils](https://github.com/JuliaML/MLUtils.jl).
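As an aside, the `DataLoader` this page documents is used like so; a minimal sketch assuming the standard MLUtils keywords (`batchsize`, `shuffle`), with invented array shapes:

```julia
using Flux  # Flux re-exports DataLoader from MLUtils.jl

X = rand(Float32, 4, 100)  # 100 observations with 4 features each
y = rand(Float32, 1, 100)  # one target per observation

# Iterate over shuffled mini-batches of 16 observations.
loader = Flux.DataLoader((X, y); batchsize=16, shuffle=true)
for (xb, yb) in loader
    size(xb)  # (4, 16), apart from a smaller final batch
end
```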

docs/src/models/functors.md
Lines changed: 1 addition & 1 deletion

@@ -1,4 +1,4 @@
-# Functors.jl
+# Functor from Functors.jl
 
 Flux makes use of [Functors.jl](https://github.com/FluxML/Functors.jl) to represent many of the core functionalities it provides.
 
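For context, Functors.jl is what lets Flux walk a nested model, collect its arrays, and rebuild the structure. A minimal sketch of that idea (the `Affine` struct and its values are invented for illustration):

```julia
using Functors

struct Affine
    W
    b
end
@functor Affine  # opt the struct in to traversal by fmap and friends

a = Affine([1.0 2.0; 3.0 4.0], [0.0, 0.0])

# fmap applies a function to every array leaf and rebuilds the struct.
doubled = fmap(x -> 2x, a)
doubled.W  # [2.0 4.0; 6.0 8.0]
```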

docs/src/models/nnlib.md
Lines changed: 85 additions & 40 deletions

@@ -1,4 +1,4 @@
-# NNlib.jl
+# Neural Network primitives from NNlib.jl
 
 Flux re-exports all of the functions exported by the [NNlib](https://github.com/FluxML/NNlib.jl) package.
 
@@ -7,28 +7,29 @@
 Non-linearities that go between layers of your model. Note that, unless otherwise stated, activation functions operate on scalars. To apply them to an array you can call `σ.(xs)`, `relu.(xs)` and so on.
 
 ```@docs
-NNlib.celu
-NNlib.elu
-NNlib.gelu
-NNlib.hardsigmoid
-NNlib.sigmoid_fast
-NNlib.hardtanh
-NNlib.tanh_fast
-NNlib.leakyrelu
-NNlib.lisht
-NNlib.logcosh
-NNlib.logsigmoid
-NNlib.mish
-NNlib.relu
-NNlib.relu6
-NNlib.rrelu
-NNlib.selu
-NNlib.sigmoid
-NNlib.softplus
-NNlib.softshrink
-NNlib.softsign
-NNlib.swish
-NNlib.tanhshrink
-NNlib.trelu
+celu
+elu
+gelu
+hardsigmoid
+sigmoid_fast
+hardtanh
+tanh_fast
+leakyrelu
+lisht
+logcosh
+logsigmoid
+mish
+relu
+relu6
+rrelu
+selu
+sigmoid
+softplus
+softshrink
+softsign
+swish
+hardswish
+tanhshrink
+trelu
 ```
 
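As the context line above says, these activations are scalar functions that you broadcast over arrays. A quick illustration with invented values:

```julia
using Flux  # Flux re-exports NNlib's activation functions

xs = [-2.0f0, -0.5f0, 0.0f0, 1.5f0]

relu.(xs)  # [0.0, 0.0, 0.0, 1.5]: negatives clamp to zero
σ.(xs)     # the logistic sigmoid of each entry
```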

@@ -35,7 +36,9 @@
 ## Softmax
 
+`Flux`'s `logitcrossentropy` uses `NNlib.softmax` internally.
+
 ```@docs
-NNlib.softmax
-NNlib.logsoftmax
+softmax
+logsoftmax
 ```
 
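A small sketch of the `softmax` named above, assuming the usual one-column-per-sample layout (values invented):

```julia
using Flux  # re-exports NNlib.softmax

logits = [1.0 3.0;
          2.0 1.0;
          0.5 0.2]        # 3 classes × 2 samples

p = softmax(logits)       # normalises along dims=1 by default
sum(p; dims=1)            # each column sums to 1
```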

@@ -42,7 +45,19 @@
 ## Pooling
 
+`Flux`'s `AdaptiveMaxPool`, `AdaptiveMeanPool`, `GlobalMaxPool`, `GlobalMeanPool`, `MaxPool`, and `MeanPool` use `NNlib.PoolDims`, `NNlib.maxpool`, and `NNlib.meanpool` as their backend.
+
 ```@docs
-NNlib.maxpool
-NNlib.meanpool
+PoolDims
+maxpool
+meanpool
+```
+
+## Padding
+
+```@docs
+pad_reflect
+pad_constant
+pad_repeat
+pad_zeros
 ```
 
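To make the backend relationship concrete, here is the functional `maxpool` applied directly to an array (shapes invented; per the note above, Flux's `MaxPool((2, 2))` layer wraps the same call):

```julia
using NNlib

x = rand(Float32, 8, 8, 3, 1)  # width × height × channels × batch

y = maxpool(x, (2, 2))         # 2×2 windows; stride defaults to the window size
size(y)                        # (4, 4, 3, 1)
```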

@@ -49,7 +64,12 @@
 ## Convolution
 
+`Flux`'s `Conv` and `CrossCor` layers use `NNlib.DenseConvDims` and `NNlib.conv` internally.
+
 ```@docs
-NNlib.conv
-NNlib.depthwiseconv
+conv
+ConvDims
+depthwiseconv
+DepthwiseConvDims
+DenseConvDims
 ```
 
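A minimal sketch of the functional `conv` that the `Conv` layer builds on; shapes are invented, and NNlib expects `(spatial..., channels, batch)` layout:

```julia
using NNlib

x = rand(Float32, 28, 28, 1, 4)  # image W × H × channels_in × batch
w = rand(Float32, 3, 3, 1, 8)    # kernel W × H × channels_in × channels_out

y = conv(x, w)                   # no padding, stride 1
size(y)                          # (26, 26, 8, 4)
```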

@@ -56,10 +76,16 @@
 ## Upsampling
 
+`Flux`'s `Upsample` layer uses `NNlib.upsample_nearest`, `NNlib.upsample_bilinear`, and `NNlib.upsample_trilinear` as its backend. Additionally, `Flux`'s `PixelShuffle` layer uses `NNlib.pixel_shuffle` as its backend.
+
 ```@docs
-NNlib.upsample_nearest
-NNlib.upsample_bilinear
-NNlib.upsample_trilinear
-NNlib.pixel_shuffle
-NNlib.grid_sample
+upsample_nearest
+∇upsample_nearest
+upsample_linear
+∇upsample_linear
+upsample_bilinear
+∇upsample_bilinear
+upsample_trilinear
+∇upsample_trilinear
+pixel_shuffle
 ```
 
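A quick sketch of the two backends named above, with invented shapes:

```julia
using NNlib

x = rand(Float32, 4, 4, 3, 1)      # W × H × channels × batch

size(upsample_nearest(x, (2, 2)))  # (8, 8, 3, 1): each pixel repeated 2×2

# pixel_shuffle trades channels for resolution: C·r² channels become C.
z = rand(Float32, 4, 4, 12, 1)
size(pixel_shuffle(z, 2))          # (8, 8, 3, 1)
```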

@@ -66,9 +92,12 @@
 ## Batched Operations
 
+`Flux`'s `Bilinear` layer uses `NNlib.batched_mul` internally.
+
 ```@docs
-NNlib.batched_mul
-NNlib.batched_mul!
-NNlib.batched_adjoint
-NNlib.batched_transpose
+batched_mul
+batched_mul!
+batched_adjoint
+batched_transpose
+batched_vec
 ```
 
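A minimal sketch of `batched_mul`, which multiplies matching matrix slices along the third (batch) dimension (shapes invented):

```julia
using NNlib

A = rand(Float32, 2, 3, 5)  # a batch of five 2×3 matrices
B = rand(Float32, 3, 4, 5)  # a batch of five 3×4 matrices

C = batched_mul(A, B)       # C[:, :, k] == A[:, :, k] * B[:, :, k]
size(C)                     # (2, 4, 5)
```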

@@ -75,9 +104,11 @@
 ## Gather and Scatter
 
+`Flux`'s `Embedding` layer uses `NNlib.gather` as its backend.
+
 ```@docs
 NNlib.gather
 NNlib.gather!
 NNlib.scatter
 NNlib.scatter!
 ```
 
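To see why `gather` suits an embedding lookup: it selects slices of a source array along its last dimension by index. A minimal sketch with an invented table:

```julia
using NNlib

E = Float32[1 2 3;
            4 5 6]          # a 2×3 "embedding table", one column per token id

NNlib.gather(E, [3, 1, 1])  # columns 3, 1, 1 give [3 1 1; 6 4 4]
```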

@@ -84,5 +115,19 @@
+## Sampling
+
+```@docs
+grid_sample
+∇grid_sample
+```
+
+## Losses
+
+```@docs
+ctc_loss
+```
+
 ## Miscellaneous
 
 ```@docs
-NNlib.logsumexp
+logsumexp
+NNlib.glu
 ```
