DiffSharp is a tensor library with advanced support for differentiable programming. It is designed for use in machine learning, probabilistic programming, optimization and other domains.
DiffSharp provides advanced automatic differentiation capabilities for tensor code, making it possible to use derivative-taking operations, including gradients, Hessians, Jacobians, directional derivatives, and matrix-free Hessian- and Jacobian-vector products, as higher-order function compositions. This goes far beyond the standard reverse-mode gradients of traditional tensor libraries such as TensorFlow and PyTorch: forward and reverse differentiation can be nested to any level, so you can compute higher-order derivatives efficiently, or differentiate functions that themselves make use of differentiation internally. Please see the API Overview for a list of available operations.
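Because each derivative-taking operation transforms a `Tensor -> Tensor` function into another function, these operations compose. Below is a minimal sketch of such nesting (the functions here are illustrative and not part of the official sample):

```fsharp
open DiffSharp

// An illustrative scalar-to-scalar function
let f (x: Tensor) = x * sin x

// dsharp.diff maps a Tensor -> Tensor function to its derivative,
// so nesting it yields higher-order derivatives
let f' = dsharp.diff f                  // f'(x)  = sin x + x cos x
let f'' = dsharp.diff (dsharp.diff f)   // f''(x) = 2 cos x - x sin x

f' (dsharp.tensor 0.5)
f'' (dsharp.tensor 0.5)

// A function that internally uses differentiation...
let h (x: Tensor) = x * dsharp.diff (fun y -> sin y) x   // h(x) = x cos x
// ...can itself be differentiated: h'(x) = cos x - x sin x
dsharp.diff h (dsharp.tensor 0.5)
```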
DiffSharp 1.0 is implemented in F# and uses PyTorch C++ tensors (without the derivative computation graph) as the default raw-tensor backend. It is tested on Linux and Windows. DiffSharp is developed by Atılım Güneş Baydin, Don Syme and other contributors, having started as a project supervised by Barak Pearlmutter and Jeffrey Siskind. Please join us!
The library and its documentation are under active development.
The primary features of DiffSharp 1.0 are:
- PyTorch backend for CUDA support and highly optimized native tensor operations (see the configuration sketch below).
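As a minimal configuration sketch (assuming the Torch backend package is installed and, for `Device.GPU`, a CUDA-enabled libtorch is available), `dsharp.config` selects the default backend, device, and element type for subsequently created tensors:

```fsharp
open DiffSharp

// Use the Torch backend with float32 tensors on a CUDA device.
// Device.GPU assumes CUDA is available; use Device.CPU otherwise.
dsharp.config(backend=Backend.Torch, device=Device.GPU, dtype=Dtype.Float32)
```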
Below is a sample of using DiffSharp. You can access this sample as a script or as a .NET Interactive Jupyter notebook.
```fsharp
open DiffSharp

// A 1D tensor
let t1 = dsharp.tensor [ 0.0 .. 0.2 .. 1.0 ]

// A 2x2 tensor
let t2 = dsharp.tensor [ [ 0; 1 ]; [ 2; 2 ] ]

// Define a scalar-to-scalar function
let f (x: Tensor) = sin (sqrt x)
f (dsharp.tensor 1.2)

// Get its derivative
let df = dsharp.diff f
df (dsharp.tensor 1.2)

// Now define a vector-to-scalar function
let g (x: Tensor) = exp (x.[0] * x.[1]) + x.[2]
g (dsharp.tensor [ 0.0; 0.3; 0.1 ])

// Now compute the gradient of g
let gg = dsharp.grad g
gg (dsharp.tensor [ 0.0; 0.3; 0.1 ])

// Compute the Hessian of g
let hg = dsharp.hessian g
hg (dsharp.tensor [ 0.0; 0.3; 0.1 ])
```
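As a quick sanity check of the results above (hand-derived, not library output): for g(x) = exp(x0 * x1) + x2 the gradient is [x1 * exp(x0 * x1); x0 * exp(x0 * x1); 1], which at [0.0; 0.3; 0.1] evaluates to [0.3; 0.0; 1.0]. The Hessian at that point has ∂²g/∂x0² = x1² = 0.09 and mixed entries ∂²g/∂x0∂x1 = ∂²g/∂x1∂x0 = 1, with zeros elsewhere.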
To learn more about DiffSharp, use the navigation links to the left.
If you are using DiffSharp, please raise any issues you might have on GitHub. We also have a Gitter chat room. If you would like to cite this library, please use the following information:
Baydin, A. G., Pearlmutter, B. A., Radul, A. A. and Siskind, J. M., 2017. Automatic differentiation in machine learning: a survey. The Journal of Machine Learning Research, 18(1), pp. 5595–5637.