**README.md**
<a href="https://github.com/codespaces/new?hide_repo_select=true&ref=main&repo=563952901&machine=standardLinux32gb&devcontainer_path=.devcontainer%2Fdevcontainer.json&location=EastUs"><img src="https://github.com/codespaces/badge.svg" alt="Open in GitHub Codespaces" /></a>
</p>

[TaylorDiff.jl](https://github.com/JuliaDiff/TaylorDiff.jl) is an automatic differentiation (AD) package for efficient and composable higher-order derivatives, implemented via operator overloading on Taylor polynomials.

Disclaimer: this project is still in an early alpha stage, and its APIs may change at any time. Discussions and potential use cases are extremely welcome!

## Features

TaylorDiff.jl is designed with the following goals in mind:
- Linear scaling with the order of differentiation (whereas naively composing first-order differentiation results in exponential scaling)
- The same performance as [ForwardDiff.jl](https://github.com/JuliaDiff/ForwardDiff.jl) at first and second order, so there is no penalty for using it as a drop-in replacement
- Capable of calculating exact derivatives in physical models with ODEs and PDEs
- Composable with other AD systems like [Zygote.jl](https://github.com/FluxML/Zygote.jl), so that the above models evaluated with TaylorDiff can be further optimized with gradient-based optimization techniques

TaylorDiff.jl is fast! See our dedicated [benchmarks](https://benchmark.tansongchen.com/TaylorDiff.jl) page for comparison with other packages in various tasks.
Please see our [documentation](https://juliadiff.org/TaylorDiff.jl) for more details.
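
For a quick taste of the interface, here is a minimal sketch. It assumes the package's exported `derivative(f, x, order)` entry point; the exact signature may differ between versions, so consult the documentation before relying on it:

```julia
using TaylorDiff

# First-order derivative of sin at 0.1 (i.e. cos(0.1)):
derivative(sin, 0.1, 1)

# Tenth-order derivative, computed in a single Taylor-mode pass
# rather than by nesting ten first-order derivatives:
derivative(sin, 0.1, 10)
```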

## Related Projects

- [TaylorSeries.jl](https://github.com/JuliaDiff/TaylorSeries.jl): a systematic treatment of Taylor polynomials in one and several variables, but its mutating, scalar-oriented code limits speed and composability with other packages
- [ForwardDiff.jl](https://github.com/JuliaDiff/ForwardDiff.jl): a well-established and robust operator-overloading-based forward-mode AD, where higher-order derivatives can be obtained by nesting first-order derivatives
- [Diffractor.jl](https://github.com/JuliaDiff/Diffractor.jl): a next-generation source-code-transformation-based forward- and reverse-mode AD, designed with support for higher-order derivatives in mind; however, the higher-order functionality is currently only a proof of concept
- [`jax.jet`](https://jax.readthedocs.io/en/latest/jax.experimental.jet.html): an experimental (and unmaintained) implementation of Taylor-mode automatic differentiation in JAX, sharing the same underlying algorithm as this project

## Citation

```bibtex
@software{tan2022taylordiff,
author = {Tan, Songchen},
title = {TaylorDiff.jl: Fast Higher-order Automatic Differentiation in Julia},
}
```

**docs/src/index.md**
# TaylorDiff.jl
[TaylorDiff.jl](https://github.com/JuliaDiff/TaylorDiff.jl) is an automatic differentiation (AD) package for efficient and composable higher-order derivatives, implemented via operator overloading on Taylor polynomials.

Disclaimer: this project is still in an early alpha stage, and its APIs may change at any time. Discussions and potential use cases are extremely welcome!

## Features

TaylorDiff.jl is designed with the following goals in mind:
- Linear scaling with the order of differentiation (while naively composing first-order differentiation would result in exponential scaling)
- The same performance as [ForwardDiff.jl](https://github.com/JuliaDiff/ForwardDiff.jl) at first and second order, so there is no penalty for using it as a drop-in replacement
- Capable of calculating exact derivatives in physical models with ODEs and PDEs
- Composable with other AD systems like [Zygote.jl](https://github.com/FluxML/Zygote.jl), so that the above models evaluated with TaylorDiff can be further optimized with gradient-based optimization techniques

TaylorDiff.jl is fast! See our dedicated [benchmarks](https://benchmark.tansongchen.com/TaylorDiff.jl) page for comparison with other packages in various tasks.
## Installation

```julia
] add TaylorDiff
```
## Usage
```julia
using TaylorDiff
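
# The original usage example is truncated in this diff view.
# A hypothetical sketch, assuming the exported `derivative(f, x, order)`
# entry point (see the documentation for the exact signature):
derivative(sin, 0.1, 1)   # first-order derivative, i.e. cos(0.1)
derivative(sin, 0.1, 10)  # tenth-order derivative in one Taylor-mode pass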
```
Please see our [documentation](https://juliadiff.org/TaylorDiff.jl) for more details.
## Related Projects

## Related Projects

- [TaylorSeries.jl](https://github.com/JuliaDiff/TaylorSeries.jl): a systematic treatment of Taylor polynomials in one and several variables, but its mutating, scalar-oriented code limits speed and composability with other packages
- [ForwardDiff.jl](https://github.com/JuliaDiff/ForwardDiff.jl): a well-established and robust operator-overloading-based forward-mode AD, where higher-order derivatives can be obtained by nesting first-order derivatives
- [Diffractor.jl](https://github.com/JuliaDiff/Diffractor.jl): a next-generation source-code-transformation-based forward- and reverse-mode AD, designed with support for higher-order derivatives in mind; however, the higher-order functionality is currently only a proof of concept
- [`jax.jet`](https://jax.readthedocs.io/en/latest/jax.experimental.jet.html): an experimental (and unmaintained) implementation of Taylor-mode automatic differentiation in JAX, sharing the same underlying algorithm as this project
## Citation
```bibtex
@software{tan2022taylordiff,
author = {Tan, Songchen},
title = {TaylorDiff.jl: Fast Higher-order Automatic Differentiation in Julia},
}
```