Info
Version: 0.8.4-beta
Author(s): Atılım Güneş Baydin, Barak A. Pearlmutter
Last Update: Saturday, August 24, 2019
NuGet Url: | https://www.nuget.org/packages/DiffSharp |
Install
Package Manager: Install-Package DiffSharp
.NET CLI: dotnet add package DiffSharp
Paket CLI: paket add DiffSharp
Direct download: DiffSharp (unzip the .nupkg after downloading)
Dependencies
- FSharp.Core (>= 4.3.4)
- FSharp.Quotations.Evaluator (>= 1.1.2)
DiffSharp is a functional automatic differentiation (AD) library. AD allows exact and efficient calculation of derivatives by systematically invoking the chain rule of calculus at the elementary operator level during program execution.
AD is different from numerical differentiation, which is prone to truncation and round-off errors, and symbolic differentiation, which is affected by expression swell and cannot fully handle algorithmic control flow.
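To make the contrast concrete, here is a minimal sketch comparing an AD derivative with a finite-difference approximation. It assumes the DiffSharp.AD.Float64 module of the 0.8.x API; the test function f and the step size h are chosen purely for illustration.

open DiffSharp.AD.Float64

// f(x) = sin (sqrt x), written over the AD scalar type D
let f (x: D) = sin (sqrt x)

// Exact derivative of f at x = 2, computed by AD
let dfExact = diff f (D 2.0)

// Forward-difference approximation of the same derivative; its accuracy
// depends on the step size h and is subject to truncation and round-off error
let h = 1e-6
let dfApprox = (sin (sqrt (2.0 + h)) - sin (sqrt 2.0)) / h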
Using the DiffSharp library, derivative calculations (gradients, Hessians, Jacobians, directional derivatives, and matrix-free Hessian- and Jacobian-vector products) can be incorporated with minimal change into existing algorithms. DiffSharp supports nested forward and reverse AD up to any level, meaning that you can compute exact higher-order derivatives or differentiate functions that internally make use of differentiation. Please see the API Overview page for a list of available operations.
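The operations listed above map onto functions such as grad, hessian, hessianv, and diff in the 0.8.x API. The following sketch assumes that API; the test function g and the evaluation point x0 are invented for illustration.

open DiffSharp.AD.Float64

// g : R^2 -> R, written over the AD vector type DV
let g (x: DV) = sin (x.[0] * x.[1])

let x0 = toDV [1.5; 2.5]

let gg  = grad g x0                         // gradient of g at x0
let gh  = hessian g x0                      // full Hessian of g at x0
let ghv = hessianv g x0 (toDV [1.0; 0.0])   // matrix-free Hessian-vector product

// Nested AD: differentiate a function that itself calls diff internally
let nested = diff (fun x -> x * diff (fun y -> x * y) (D 3.0)) (D 2.0)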
The library is under active development by Atılım Güneş Baydin and Barak A. Pearlmutter mainly for research applications in machine learning, as part of their work at the Brain and Computation Lab, Hamilton Institute, National University of Ireland Maynooth.
DiffSharp is implemented in the F# language and can be used from C# and other languages running on .NET Core, Mono, or the .NET Framework, targeting the 64-bit platform.
It is tested on Linux and Windows. We are working on interfaces/ports to other languages.