
Quickstart

Here we cover some key tasks involved in a typical machine learning pipeline and how these can be implemented with DiffSharp. Note that a significant part of DiffSharp's design has been influenced by PyTorch, so if you are familiar with PyTorch you will feel mostly at home.

Datasets and Data Loaders

DiffSharp provides the Dataset type, representing a data source, and the DataLoader type, which handles loading data from a dataset and iterating over minibatches.

See the DiffSharp.Data namespace for the full API reference.

Datasets

DiffSharp has ready-to-use types covering the main datasets typically used in machine learning, such as MNIST, CIFAR10, and CIFAR100, as well as more generic dataset types such as TensorDataset and ImageDataset.
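For example, TensorDataset wraps in-memory tensors of inputs and targets. Below is a minimal sketch; the exact TensorDataset constructor signature is an assumption here, so check the DiffSharp.Data API reference.

open DiffSharp
open DiffSharp.Data

// A small in-memory dataset: 100 examples with 4 features each and integer targets.
// The TensorDataset(inputs, targets) constructor shown here is an assumption.
let inputs = dsharp.randn([100; 4])
let targets = dsharp.randint(0, 2, [100])
let tensorDataset = TensorDataset(inputs, targets)
printfn "Dataset length: %A" tensorDataset.length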

The following loads the MNIST dataset and shows one image entry and the corresponding label.

open DiffSharp
open DiffSharp.Data

// First ten images in MNIST training set
let dataset = MNIST("../data", train=true, transform=id, n=10)

// Inspect a single image and label
let data, label = dataset[7]

// Save image to file
data.saveImage("test.png")
// Inspect data as ASCII and show label
printfn "Data: %A\nLabel: %A" (data.toImageString()) label
Data: "                            
                            
                            
                            
                            
           ~-}@#####Z       
         -j*W########J'     
         O############i     
         [##Mxxxxo####i     
          ::^    'W##Z      
                 |&##f      
                (o###'      
              (q%###d.      
         "uaaa####8}:       
        _m########O         
        _*####@####?        
         "v<____f##?        
                `##?        
                |##?        
       ?.      1&##?        
     iQ#:    `)8##&!        
     p##txxxxb###o\         
     p#########MC.          
     +J#####wdt_            
       }B#Z}^               
                            
                            
                            
"
Label: tensor(3,dtype=Int32)

Data Loaders

A data loader handles tasks such as constructing minibatches from an underlying dataset on the fly, shuffling the data, and moving the data tensors between devices. In the example below we show a single batch of six MNIST images and their corresponding classification labels.

let loader = DataLoader(dataset, shuffle=true, batchSize=6)
let batch, labels = loader.batch()

printfn "%A\nLabels: %A" (batch.toImageString()) labels
"                                                                                    
                                                                                    
                                                                                    
                                                                                    
                                           <J#J+                                    
             ```uzO:Y@%u                  +8###8                          u#@!      
        ";)CX#####*Z#&h!                 <W##8M#>                        1&$#!      
       +8########$)\\>~                .i*###b|##v                       u$$#!      
       `m#####k0%&                     U######1o#Y                      lB$p".      
        /C[##d' -C                    <8##ar#W_/@X                     iW$$)        
         ' C#(                       +8##0'?v,  #&+                    C##o         
           j#a                      ~Y#Mp|      ##Y                   ,#$Bl         
           'a#I                     O#8I,:      ##h                  "d#$u          
            ;&*J[                  >##!         ##h                  }$#Q`          
             \8##c^                k#a          @#h                 /8$a^           
              _o##L:              ?%#]          ##t                "###J            
               `)##o              |#M^         z#o'                L$$$~            
                 $#$!             |#*         n#*I                +m$$Z             
              _nQ##d              |#f       +Y#Z                  B$$h'             
            ~tW###$0              |#*      r8#U                   #$$(              
          ^rm####b/               |#$t+:|O*#*Y>                  J@##"              
        ^lq####k\                 |###Ww###hn                   +W#%j.              
      `Xm####h/.                  :k#####Mf                     !$#m                
    >ZW####&x'                     ^n###j~                      !$#m                
    z###qzx`                                                    ^a#m                
                                                                                    
                                                                                    
                                                                                    
                                                                                    
                                                                                    
                                                                                    
                                                                                    
                                                                                    
                    lM~                                                '^{v         
    !\              c0~                                              ;Lp###t        
    uU              Cp~                 >tp##r|t>                  ~L&##*p#M~       
    mU             :#U                 |M##op###X                `L8###w"~##i       
    mU             Q#u                >&#a! '0##r                1####w: ~##i       
   _%U             k#>               1##Q'  )##*,                0##mY"  /##i       
   c#U            ^M#:              x##t'   w##/                 :n>^    {##i       
   J#c            U#w`             u#%O.  ./%#n                          J##i       
   J#l         '|O$#(             `M#O   ;b##X'                      ]vvvb#h        
   J#|   _+rfL&&B0&#~             ^##"^ck&##$/                    .<0##@##W;        
   L#8ddd##$8kf(: M$              `M#####WW#M                    <W#&WX&##Mc        
    cOOOOO1>     }#m               >B#wz-^a#f                   /##ui `p####|       
                 X#z                     !@#[                  z##0` ,b#%nZ##Ql++   
                 X#>                     I##,                 z#&[` <k#w! 'IU&##*   
                 X#>                      ##,                }#&(`?X&#u:     (00~   
                 X@)                     I##,                M#%dw###u              
                 X#1                     [##,                d####Or;               
                 X#C                     _@#,                ')fv^                  
                 X@C                      w#>                                       
                 1#C                      1#o-                                      
                                          'Q#X'                                     
                                           't#-                                     
                                                                                    
"
Labels: tensor([5., 0., 1., 4., 9., 2.])
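The DataLoader constructor also accepts optional targetDevice, targetDtype, and targetBackend arguments that place and convert the minibatch tensors as they are loaded. A minimal sketch, assuming the Torch backend with GPU support is available:

// Move each minibatch to the GPU as it is constructed.
// Assumes the Torch backend with CUDA support; adjust to your setup.
let gpuLoader = DataLoader(dataset, batchSize=6, shuffle=true, targetDevice=Device.GPU)
let gpuBatch, gpuLabels = gpuLoader.batch()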

In practice a data loader is typically used to iterate over all minibatches in a given dataset in order to feed each minibatch through a machine learning model. One full iteration over the dataset is called an "epoch", and you would typically perform multiple such epochs while training a model.

for epoch = 1 to 10 do
    for i, data, labels in loader.epoch() do
        printfn "Epoch %A, minibatch %A" epoch (i+1)
        // Process the minibatch
        // ...
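To make this concrete, below is a minimal training-loop sketch. It assumes a toy linear classifier (the Linear model and the --> composition operator are introduced in the Models sections below) and the SGD optimizer from DiffSharp.Optim; the loss function and hyperparameters are purely illustrative.

open DiffSharp.Model
open DiffSharp.Compose
open DiffSharp.Optim

// A toy classifier mapping flattened 28x28 images to scores for 10 digit classes
let classifier = dsharp.view([-1; 28*28]) --> Linear(28*28, 10)
let optimizer = SGD(classifier, lr=dsharp.tensor(0.01))

for epoch = 1 to 10 do
    for i, data, labels in loader.epoch() do
        classifier.reverseDiff()  // Prepare the parameters for reverse-mode differentiation
        let loss = dsharp.crossEntropyLoss(classifier.forward(data), labels)
        loss.reverse()            // Backpropagate from the loss to the parameters
        optimizer.step()          // Update the parameters using the gradients
        printfn "Epoch %A, minibatch %A, loss %A" epoch (i+1) loss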

Models

Many machine learning models are differentiable functions whose parameters can be tuned via gradient-based optimization, finding an optimum for an objective function that quantifies the fit of the model to a given set of data. These models are typically built as compositions of non-linear functions and ready-to-use building blocks such as linear, recurrent, and convolutional layers.

DiffSharp provides the most commonly used model building blocks, including convolutions, transposed convolutions, batch normalization, dropout, recurrent layers, and other architectures.
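For example, a small convolutional image classifier can be assembled from these blocks. Below is a minimal sketch using the --> composition operator introduced later in this guide; the Conv2d, BatchNorm2d, and Dropout constructor arguments shown are assumptions, so check the DiffSharp.Model API reference for the exact signatures.

open DiffSharp
open DiffSharp.Model
open DiffSharp.Compose

// A small convolutional network for 1-channel 28x28 images.
// Two 3x3 convolutions without padding reduce 28x28 to 24x24.
let cnn =
    Conv2d(1, 16, kernelSize=3)
    --> BatchNorm2d(16)
    --> dsharp.relu
    --> Conv2d(16, 32, kernelSize=3)
    --> dsharp.relu
    --> dsharp.view([-1; 32*24*24])
    --> Dropout(0.3)
    --> Linear(32*24*24, 10)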

See the DiffSharp.Model namespace for the full API reference.

Constructing models, PyTorch style

If you have experience with PyTorch, you will find the following way of defining models familiar. Let's look at an example of a generative adversarial network (GAN) architecture.

open DiffSharp.Model
open DiffSharp.Compose

// PyTorch style

// Define a model class inheriting the base
type Generator(nz: int) =
    inherit Model()
    let fc1 = Linear(nz, 256)
    let fc2 = Linear(256, 512)
    let fc3 = Linear(512, 1024)
    let fc4 = Linear(1024, 28*28)
    do base.addModel(fc1, fc2, fc3, fc4)
    override self.forward(x) =
        x
        |> dsharp.view([-1;nz])
        |> fc1.forward
        |> dsharp.leakyRelu(0.2)
        |> fc2.forward
        |> dsharp.leakyRelu(0.2)
        |> fc3.forward
        |> dsharp.leakyRelu(0.2)
        |> fc4.forward
        |> dsharp.tanh

// Define a model class inheriting the base
type Discriminator() =
    inherit Model()
    let fc1 = Linear(28*28, 1024)
    let fc2 = Linear(1024, 512)
    let fc3 = Linear(512, 256)
    let fc4 = Linear(256, 1)
    do base.addModel(fc1, fc2, fc3, fc4)
    override self.forward(x) =
        x
        |> dsharp.view([-1;28*28])
        |> fc1.forward
        |> dsharp.leakyRelu(0.2)
        |> dsharp.dropout(0.3)
        |> fc2.forward
        |> dsharp.leakyRelu(0.2)
        |> dsharp.dropout(0.3)
        |> fc3.forward
        |> dsharp.leakyRelu(0.2)
        |> dsharp.dropout(0.3)
        |> fc4.forward
        |> dsharp.sigmoid

// Instantiate the defined classes
let nz = 128
let gen = Generator(nz)
let dis = Discriminator()

print gen
print dis
Model(Linear(128, 256), Linear(256, 512), Linear(512, 1024), Linear(1024, 784))
Model(Linear(784, 1024), Linear(1024, 512), Linear(512, 256), Linear(256, 1))
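Once instantiated, the models can be applied to data like functions. For example, an illustrative forward pass feeding a batch of random noise through the generator and scoring the result with the discriminator:

// Feed a batch of random noise through the generator, then score the
// generated images with the discriminator
let noise = dsharp.randn([6; nz])
let fakeImages = gen.forward(noise)
let scores = dis.forward(fakeImages)
printfn "Images: %A, scores: %A" fakeImages.shape scores.shape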

Constructing models, DiffSharp style

A key advantage of DiffSharp lies in the functional programming paradigm enabled by the F# language, where functions are first-class citizens, many algorithms can be constructed by applying and composing functions, and differentiation operations can be expressed as composable higher-order functions. This allows very succinct (and beautiful) machine learning code to be expressed as a powerful combination of lambda calculus and differential calculus.

For example, the following constructs the same GAN architecture (that we constructed in PyTorch style in the previous section) using DiffSharp's --> composition operator, which allows you to seamlessly compose Model instances and differentiable Tensor->Tensor functions.

// DiffSharp style

// Model as a composition of models and Tensor->Tensor functions
let generator =
    dsharp.view([-1;nz])
    --> Linear(nz, 256)
    --> dsharp.leakyRelu(0.2)
    --> Linear(256, 512)
    --> dsharp.leakyRelu(0.2)
    --> Linear(512, 1024)
    --> dsharp.leakyRelu(0.2)
    --> Linear(1024, 28*28)
    --> dsharp.tanh

// Model as a composition of models and Tensor->Tensor functions
let discriminator =
    dsharp.view([-1; 28*28])
    --> Linear(28*28, 1024)
    --> dsharp.leakyRelu(0.2)
    --> dsharp.dropout(0.3)
    --> Linear(1024, 512)
    --> dsharp.leakyRelu(0.2)
    --> dsharp.dropout(0.3)
    --> Linear(512, 256)
    --> dsharp.leakyRelu(0.2)
    --> dsharp.dropout(0.3)
    --> Linear(256, 1)
    --> dsharp.sigmoid

print generator
print discriminator
Model(Linear(128, 256), Linear(256, 512), Linear(512, 1024), Linear(1024, 784))
Model(Linear(784, 1024), Linear(1024, 512), Linear(512, 256), Linear(256, 1))
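The composed models are used in the same way. The --> operator can also feed a tensor through a model (equivalent to calling the model's forward method), so an illustrative end-to-end pass through the composed GAN looks like this:

// Feed random noise through the composed generator and discriminator.
// Applying a tensor with --> is equivalent to calling forward.
let fakeBatch = dsharp.randn([6; nz]) --> generator
let fakeScores = fakeBatch --> discriminator
printfn "Images: %A, scores: %A" fakeBatch.shape fakeScores.shape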
