
Commit 28481c0

Improve README
1 parent 5cc3a71 commit 28481c0

3 files changed: +99 −7 lines changed


README.md

+53-1
````diff
@@ -18,4 +18,56 @@
   <a href="https://github.com/codespaces/new?hide_repo_select=true&ref=main&repo=563952901&machine=standardLinux32gb&devcontainer_path=.devcontainer%2Fdevcontainer.json&location=EastUs"><img src="https://github.com/codespaces/badge.svg" alt="Open in GitHub Codespaces" /></a>
 </p>
 
-Please see [Documentation](https://juliadiff.org/TaylorDiff.jl) and [Benchmarks](https://benchmark.tansongchen.com/TaylorDiff.jl) for more information.
+[TaylorDiff.jl](https://github.com/JuliaDiff/TaylorDiff.jl) is an automatic differentiation (AD) package for efficient and composable higher-order derivatives, implemented with operator overloading on Taylor polynomials.
+
+Disclaimer: this project is still in an early alpha stage, and APIs can change at any time in the future. Discussions and potential use cases are extremely welcome!
+
+## Features
+
+TaylorDiff.jl is designed with the following goals in mind:
+
+- Linear scaling with the order of differentiation (naively composing first-order differentiation would result in exponential scaling)
+- Same performance as [ForwardDiff.jl](https://github.com/JuliaDiff/ForwardDiff.jl) on first and second order, so there is no penalty for drop-in replacement
+- Capable of calculating exact derivatives in physical models with ODEs and PDEs
+- Composable with other AD systems like [Zygote.jl](https://github.com/FluxML/Zygote.jl), so that the above models evaluated with TaylorDiff can be further optimized with gradient-based optimization techniques
+
+TaylorDiff.jl is fast! See our dedicated [benchmarks](https://benchmark.tansongchen.com/TaylorDiff.jl) page for comparisons with other packages on various tasks.
+
+## Installation
+
+```julia
+] add TaylorDiff
+```
+
+## Usage
+
+```julia
+using TaylorDiff
+
+x = 0.1
+derivative(sin, x, 10) # scalar derivative
+v, direction = [3.0, 4.0], [1.0, 0.0]
+derivative(x -> sum(exp.(x)), v, direction, 2) # directional derivative
+```
+
+Please see our [documentation](https://juliadiff.org/TaylorDiff.jl) for more details.
+
+## Related Projects
+
+- [TaylorSeries.jl](https://github.com/JuliaDiff/TaylorSeries.jl): a systematic treatment of Taylor polynomials in one and several variables, but its mutating, scalar-oriented code limits speed and composability with other packages
+- [ForwardDiff.jl](https://github.com/JuliaDiff/ForwardDiff.jl): well-established and robust operator-overloading-based forward-mode AD, where higher-order derivatives can be achieved by nesting first-order derivatives
+- [Diffractor.jl](https://github.com/JuliaDiff/Diffractor.jl): next-generation source-code-transformation-based forward- and reverse-mode AD, designed with support for higher-order derivatives in mind; its higher-order functionality is currently only a proof of concept
+- [`jax.jet`](https://jax.readthedocs.io/en/latest/jax.experimental.jet.html): an experimental (and unmaintained) implementation of Taylor-mode automatic differentiation in JAX, sharing the same underlying algorithm as this project
+
+## Citation
+
+```bibtex
+@software{tan2022taylordiff,
+  author = {Tan, Songchen},
+  title = {TaylorDiff.jl: Fast Higher-order Automatic Differentiation in Julia},
+  year = {2022},
+  publisher = {GitHub},
+  journal = {GitHub repository},
+  howpublished = {\url{https://github.com/JuliaDiff/TaylorDiff.jl}}
+}
+```
````
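For a quick sanity check of the usage snippet added above, both calls can be compared against closed-form derivatives. This is a sketch that assumes only the `derivative` methods shown in this diff:

```julia
using TaylorDiff

# Derivatives of sin cycle with period 4 (sin, cos, -sin, -cos),
# so the 10th derivative of sin is -sin.
x = 0.1
derivative(sin, x, 10) ≈ -sin(x)  # expected: true

# Second directional derivative of x -> sum(exp.(x)) along the first axis:
# d²/dt² (exp(v[1] + t) + exp(v[2])) at t = 0 is exp(v[1]).
v, direction = [3.0, 4.0], [1.0, 0.0]
derivative(x -> sum(exp.(x)), v, direction, 2) ≈ exp(v[1])  # expected: true
```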

benchmark/runbenchmarks.jl

-1
```diff
@@ -18,7 +18,6 @@ function benchmark()
     branch = headname(repo)
     config = BenchmarkConfig(id = branch)
     results = benchmarkpkg(TaylorDiff, config)
-    results.name = "TaylorDiff.jl"
     endpoint = "https://benchmark.tansongchen.com"
     put(endpoint; body = json(results))
 end
```
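For local inspection, the same PkgBenchmark calls this script relies on can write a markdown report instead of uploading JSON to the benchmark server. A minimal sketch; `export_markdown` comes from PkgBenchmark, not from this repository:

```julia
using PkgBenchmark, TaylorDiff

# Run the package's benchmark suite in the current environment
# (benchmarkpkg is the same PkgBenchmark API used in runbenchmarks.jl)
# and render the results as a human-readable report.
results = benchmarkpkg(TaylorDiff)
export_markdown("benchmark_report.md", results)
```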

docs/src/index.md

+46-5
````diff
@@ -4,15 +4,56 @@ CurrentModule = TaylorDiff
 
 # TaylorDiff.jl
 
-[TaylorDiff.jl](https://github.com/JuliaDiff/TaylorDiff.jl) is an automatic differentiation (AD) library for efficient and composable higher-order derivatives, implemented with forward evaluation of overloaded function on Taylor polynomials. It is designed with the following goals in head:
+[TaylorDiff.jl](https://github.com/JuliaDiff/TaylorDiff.jl) is an automatic differentiation (AD) package for efficient and composable higher-order derivatives, implemented with operator overloading on Taylor polynomials.
+
+Disclaimer: this project is still in an early alpha stage, and APIs can change at any time in the future. Discussions and potential use cases are extremely welcome!
+
+## Features
+
+TaylorDiff.jl is designed with the following goals in mind:
 
 - Linear scaling with the order of differentiation (naively composing first-order differentiation would result in exponential scaling)
-- Same performance with [ForwardDiff.jl](https://github.com/JuliaDiff/ForwardDiff.jl) on first order, so there is no penalty in drop-in replacement
+- Same performance as [ForwardDiff.jl](https://github.com/JuliaDiff/ForwardDiff.jl) on first and second order, so there is no penalty for drop-in replacement
 - Capable of calculating exact derivatives in physical models with ODEs and PDEs
 - Composable with other AD systems like [Zygote.jl](https://github.com/FluxML/Zygote.jl), so that the above models evaluated with TaylorDiff can be further optimized with gradient-based optimization techniques
 
-This project is still in early alpha stage, and APIs can change any time in the future. Discussions and potential use cases are extremely welcome!
+TaylorDiff.jl is fast! See our dedicated [benchmarks](https://benchmark.tansongchen.com/TaylorDiff.jl) page for comparisons with other packages on various tasks.
+
+## Installation
+
+```julia
+] add TaylorDiff
+```
+
+## Usage
 
-# Related Projects
+```julia
+using TaylorDiff
 
-This project start from [TaylorSeries.jl](https://github.com/JuliaDiff/TaylorSeries.jl) and re-implement the Taylor mode automatic differentiation primarily for high-order differentiation in solving ODEs and PDEs.
+x = 0.1
+derivative(sin, x, 10) # scalar derivative
+v, direction = [3.0, 4.0], [1.0, 0.0]
+derivative(x -> sum(exp.(x)), v, direction, 2) # directional derivative
+```
+
+Please see our [documentation](https://juliadiff.org/TaylorDiff.jl) for more details.
+
+## Related Projects
+
+- [TaylorSeries.jl](https://github.com/JuliaDiff/TaylorSeries.jl): a systematic treatment of Taylor polynomials in one and several variables, but its mutating, scalar-oriented code limits speed and composability with other packages
+- [ForwardDiff.jl](https://github.com/JuliaDiff/ForwardDiff.jl): well-established and robust operator-overloading-based forward-mode AD, where higher-order derivatives can be achieved by nesting first-order derivatives
+- [Diffractor.jl](https://github.com/JuliaDiff/Diffractor.jl): next-generation source-code-transformation-based forward- and reverse-mode AD, designed with support for higher-order derivatives in mind; its higher-order functionality is currently only a proof of concept
+- [`jax.jet`](https://jax.readthedocs.io/en/latest/jax.experimental.jet.html): an experimental (and unmaintained) implementation of Taylor-mode automatic differentiation in JAX, sharing the same underlying algorithm as this project
+
+## Citation
+
+```bibtex
+@software{tan2022taylordiff,
+  author = {Tan, Songchen},
+  title = {TaylorDiff.jl: Fast Higher-order Automatic Differentiation in Julia},
+  year = {2022},
+  publisher = {GitHub},
+  journal = {GitHub repository},
+  howpublished = {\url{https://github.com/JuliaDiff/TaylorDiff.jl}}
+}
+```
````
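The Zygote composability goal in the feature list above can be illustrated with a reverse-over-Taylor sketch. Whether this exact composition works depends on TaylorDiff's integration with Zygote's rule system, so treat it as illustrative rather than documented API:

```julia
using TaylorDiff, Zygote

# derivative(sin, x, 2) computes the second derivative, -sin(x);
# reverse-mode differentiation of that should therefore yield -cos(x).
grad = Zygote.gradient(x -> derivative(sin, x, 2), 0.1)[1]
grad ≈ -cos(0.1)  # expected: true, if the composition is supported
```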
