
Arraymancer Tutorial - First steps

A remake of the original tutorial using nimib: https://mratsim.github.io/Arraymancer/tuto.first_steps.html

I will note differences with the original in quoted sections.

Tensor properties

Tensors have the following properties:
- rank: 0 for scalar (cannot be stored), 1 for vector, 2 for matrices, N for N dimensional arrays
- shape: a sequence of the tensor dimensions along each axis

Next properties are technical and there for completeness:
- stride: a sequence of numbers of steps to get the next item along a dimension
- offset: the position of the first element of the tensor in memory

import
  arraymancer, sugar, sequtils

let d = [[1, 2, 3], [4, 5, 6]].toTensor()
echo d
Tensor[system.int] of shape "[2, 3]" on backend "Cpu"
|1	2	3|
|4	5	6|

message changed, it was: Tensor of shape 2x3 of type "int" on backend "Cpu"

dump d.rank
dump d.shape
dump d.strides               ## [x,y] => next row is x elements away in memory while next column is y elements away (here 1).
dump d.offset
d.rank = 2
d.shape = [2, 3]
d.strides = [3, 1]
d.offset = 0

echo of shape and strides changed (dropped @)
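
adding a quick check (not in the original) that strides describe the memory layout: transposing d with Arraymancer's transpose swaps shape and strides without copying any data.

let dt = d.transpose()
dump dt.shape   ## expected [3, 2]
dump dt.strides ## expected [1, 3]: same buffer, only the view metadata changed
dump dt.offset  ## expected 0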

Tensor creation

The canonical way to initialize a tensor is by converting a seq of seq of ... or an array of array of ... into a tensor using toTensor. toTensor supports deeply nested sequences and arrays, even sequences of arrays of sequences.

let c = [[[1, 2, 3], [4, 5, 6]], [[11, 22, 33], [44, 55, 66]],
         [[111, 222, 333], [444, 555, 666]],
         [[1111, 2222, 3333], [4444, 5555, 6666]]].toTensor()
echo c
Tensor[system.int] of shape "[4, 2, 3]" on backend "Cpu"
| | 	1	2	3 | 	11	22	33 | 	111	222	333 | 	1111	2222	3333|
| | 	4	5	6 | 	44	55	66 | 	444	555	666 | 	4444	5555	6666|

I am not sure where the additional pipes come from, maybe a bug?
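
adding a quick check (not in the original) of the mixed nesting mentioned above, here a seq of arrays:

let mixed = @[[1, 2, 3], [4, 5, 6]].toTensor()
dump mixed.shape ## expected [2, 3]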

newTensor procedure can be used to initialize a tensor of a specific shape with a default value. (0 for numbers, false for bool...)

zeros and ones procedures create a new tensor filled with 0 and 1 respectively.

zeros_like and ones_like take an input tensor and output a tensor of the same shape but filled with 0 and 1 respectively.

let e = newTensor[bool]([2, 3])
dump e
e = Tensor[system.bool] of shape "[2, 3]" on backend "Cpu"
|false	false	false|
|false	false	false|
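
adding a numeric example (not in the original) to also show the 0 default for numbers:

let j = newTensor[int]([2, 2])
dump j ## expected: a [2, 2] int tensor filled with 0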

let f = zeros[float]([4, 3])
dump f
f = Tensor[system.float] of shape "[4, 3]" on backend "Cpu"
|0.0	0.0	0.0|
|0.0	0.0	0.0|
|0.0	0.0	0.0|
|0.0	0.0	0.0|

let g = ones[float]([4, 3])
dump g
g = Tensor[system.float] of shape "[4, 3]" on backend "Cpu"
|1.0	1.0	1.0|
|1.0	1.0	1.0|
|1.0	1.0	1.0|
|1.0	1.0	1.0|

let tmp = [[1, 2], [3, 4]].toTensor()
let h = tmp.zeros_like
dump h
h = Tensor[system.int] of shape "[2, 2]" on backend "Cpu"
|0	0|
|0	0|

let i = tmp.ones_like
dump i
i = Tensor[system.int] of shape "[2, 2]" on backend "Cpu"
|1	1|
|1	1|

Accessing and modifying a value

Tensor values can be retrieved or set with array brackets.

var a = toSeq(1 .. 24).toTensor().reshape(2, 3, 4)
echo a
Tensor[system.int] of shape "[2, 3, 4]" on backend "Cpu"
| | 	1	2	3	4 | 	13	14	15	16|
| | 	5	6	7	8 | 	17	18	19	20|
| | 	9	10	11	12 | 	21	22	23	24|

dump a[1, 1, 1]
echo a
a[1, 1, 1] = 18
Tensor[system.int] of shape "[2, 3, 4]" on backend "Cpu"
| | 	1	2	3	4 | 	13	14	15	16|
| | 	5	6	7	8 | 	17	18	19	20|
| | 	9	10	11	12 | 	21	22	23	24|

a[1, 1, 1] = 999
echo a
Tensor[system.int] of shape "[2, 3, 4]" on backend "Cpu"
| | 	1	2	3	4 | 	13	14	15	16|
| | 	5	6	7	8 | 	17	999	19	20|
| | 	9	10	11	12 | 	21	22	23	24|

Copying

Warning ⚠: When you do the following, both tensors a and b will share data. Full copy must be explicitly requested via the clone function.

let a = toSeq(1 .. 24).toTensor().reshape(2, 3, 4)
var b = a
var c = clone(a)

Here modifying b WILL modify a.

adding an example of modification and an example of clone:

dump a[1, 0, 0]
c[1, 0, 0] = 0
dump a[1, 0, 0]
b[1, 0, 0] = 0
dump a[1, 0, 0]
a[1, 0, 0] = 13
a[1, 0, 0] = 13
a[1, 0, 0] = 0
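
for completeness (not in the original), checking the clone and the shared tensor directly:

dump c[1, 0, 0] ## expected 0: the clone owns its data, the earlier write stayed local to c
dump b[1, 0, 0] ## expected 0: b shares its data with a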

This behaviour is the same as in NumPy and Julia; the reasons can be found in the under the hood article: https://mratsim.github.io/Arraymancer/uth.copy_semantics.html

import nimib
# I want to use this notebook also to show how one can customize the nbCode block output
# (to have output shown as comments) and also possibly to stitch together subsequent code samples
# (I should use a render change in nbDoc). Probably I should do this after rendering refactoring.
nbInit
nbText: """
# Arraymancer Tutorial - First steps

> A remake of the original tutorial using nimib: <https://mratsim.github.io/Arraymancer/tuto.first_steps.html>
>
> I will note differences with the original in quoted sections.

## Tensor properties

Tensors have the following properties:
- `rank`: 0 for scalar (cannot be stored), 1 for vector, 2 for matrices, *N* for *N* dimensional arrays
- `shape`: a sequence of the tensor dimensions along each axis

Next properties are technical and there for completeness:
- `stride`: a sequence of numbers of steps to get the next item along a dimension
- `offset`: the position of the first element of the tensor in memory
"""
# the order of variable names (d, c, e, ..., a, b) presumably reflects the original order of the sections.
nbCode:
  import arraymancer, sugar, sequtils

  let d = [[1, 2, 3], [4, 5, 6]].toTensor()

  echo d

nbText: """> message changed, it was: `Tensor of shape 2x3 of type "int" on backend "Cpu"`"""

nbCode:
  dump d.rank
  dump d.shape
  dump d.strides ## [x,y] => next row is x elements away in memory while next column is y elements away (here 1).
  dump d.offset
nbText: "> echo of shape and strides changed (dropped @)"

nbText: """
## Tensor creation
The canonical way to initialize a tensor is by converting a seq of seq of ... or an array of array of ...
into a tensor using `toTensor`.
`toTensor` supports deeply nested sequences and arrays, even sequences of arrays of sequences.
"""

nbCode:
  let c = [
            [
              [1,2,3],
              [4,5,6]
            ],
            [
              [11,22,33],
              [44,55,66]
            ],
            [
              [111,222,333],
              [444,555,666]
            ],
            [
              [1111,2222,3333],
              [4444,5555,6666]
            ]
          ].toTensor()
  echo c
nbText: "> I am not sure where the additional pipes come from, maybe a bug?"
nbText: """
`newTensor` procedure can be used to initialize a tensor of a specific
shape with a default value. (0 for numbers, false for bool...)

`zeros` and `ones` procedures create a new tensor filled with 0 and
1 respectively.

`zeros_like` and `ones_like` take an input tensor and output a
tensor of the same shape but filled with 0 and 1 respectively.
"""
nbCode:
  let e = newTensor[bool]([2, 3])
  dump e
nbCode:
  let f = zeros[float]([4, 3])
  dump f
nbCode:
  let g = ones[float]([4, 3])
  dump g
nbCode:
  let tmp = [[1,2],[3,4]].toTensor()
  let h = tmp.zeros_like
  dump h
nbCode:
  let i = tmp.ones_like
  dump i

nbText: """
## Accessing and modifying a value

Tensor values can be retrieved or set with array brackets.
"""
# need to import sequtils to have toSeq
nbCode:
  var a = toSeq(1..24).toTensor().reshape(2,3,4)
  echo a
nbCode:
  dump a[1, 1, 1]
  echo a
nbCode:
  a[1, 1, 1] = 999
  echo a
nbText: """
## Copying

Warning ⚠: When you do the following, both tensors `a` and `b` will share data.
Full copy must be explicitly requested via the `clone` function.
"""
block: # using a block I can reuse a
  nbCode:
    let a = toSeq(1..24).toTensor().reshape(2,3,4)
    var b = a
    var c = clone(a)
  nbText: """
  Here modifying `b` WILL modify `a`.

  > adding an example of modification and an example of clone:
  """
  # still in block scope in order to reuse b
  nbCode:
    dump a[1, 0, 0]
    c[1, 0, 0] = 0
    dump a[1, 0, 0]
    b[1, 0, 0] = 0
    dump a[1, 0, 0]
nbText: """
This behaviour is the same as Numpy and Julia,
reasons can be found in the following
[under the hood article](https://mratsim.github.io/Arraymancer/uth.copy_semantics.html).
"""
nbShow