# Structural nodes (SNodes)¶

After writing the computation code, the user needs to specify the internal data structure hierarchy. Specifying a data structure involves choices at both the macro level, dictating how the data structure components nest with each other and how they represent sparsity, and the micro level, dictating how data are grouped together (e.g. structure of arrays vs. array of structures). Taichi provides Structural Nodes (SNodes) to compose the hierarchy and these properties. The constructs and their semantics are listed below:

• dense: A fixed-length contiguous array.
• bitmasked: Similar to dense, but with one extra bit per child to maintain sparsity information.
• pointer: Stores pointers instead of the whole structure, saving memory and maintaining sparsity.
• dynamic: A variable-length array with a predefined maximum length. It serves the role of `std::vector` in C++ or `list` in Python, and can be used to maintain objects (e.g. particles) contained in a block.

See Advanced dense layouts for more details. `ti.root` is the root node of the data structure.
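The macro/micro distinction above can be made concrete with a plain-Python sketch (not Taichi code; the helper names are hypothetical): two scalar fields `x` and `y` over `n` elements can be laid out as an array of structures (AoS), where `x[i]` and `y[i]` sit next to each other, or as a structure of arrays (SoA), where all of `x` comes before all of `y`.

```python
def aos_offsets(n):
    """Array of structures: x[i] and y[i] are adjacent in memory."""
    offsets = {}
    for i in range(n):
        offsets[("x", i)] = 2 * i
        offsets[("y", i)] = 2 * i + 1
    return offsets

def soa_offsets(n):
    """Structure of arrays: all of x first, then all of y."""
    offsets = {}
    for i in range(n):
        offsets[("x", i)] = i
        offsets[("y", i)] = n + i
    return offsets
```

Roughly speaking, placing both tensors in a single `place` call under the same `dense` node leans toward the AoS layout, while placing them under separate `dense` nodes yields SoA; see Advanced dense layouts for the details.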

`snode.place(x, ...)`

Parameters:
• snode – (SNode) where to place
• x – (tensor) tensor(s) to be placed

Returns: (SNode) the `snode` itself

The following code places two 0-D tensors named `x` and `y`:

```
x = ti.var(dt=ti.i32)
y = ti.var(dt=ti.f32)
ti.root.place(x, y)
```
`tensor.shape()`

Parameters:
• tensor – (Tensor)

Returns: (tuple of integers) the shape of `tensor`

For example,

```
ti.root.dense(ti.ijk, (3, 5, 4)).place(x)
x.shape() # returns (3, 5, 4)
```
`snode.get_shape(index)`

Parameters:
• snode – (SNode)
• index – (scalar) the axis (0 for `i`, 1 for `j`, and so on)

Returns: (scalar) the size of the tensor along that axis

Equivalent to `tensor.shape()[index]`.

```
ti.root.dense(ti.ijk, (3, 5, 4)).place(x)
x.snode().get_shape(0)  # 3
x.snode().get_shape(1)  # 5
x.snode().get_shape(2)  # 4
```
`tensor.dim()`

Parameters:
• tensor – (Tensor)

Returns: (scalar) the dimensionality of `tensor`

Equivalent to `len(tensor.shape())`.

```
ti.root.dense(ti.ijk, (8, 9, 10)).place(x)
x.dim()  # 3
```
`snode.parent()`

Parameters:
• snode – (SNode)

Returns: (SNode) the parent node of `snode`
```
blk1 = ti.root.dense(ti.i, 8)
blk2 = blk1.dense(ti.j, 4)
blk3 = blk2.dense(ti.k, 2)
blk1.parent()  # ti.root
blk2.parent()  # blk1
blk3.parent()  # blk2
```

## Node types¶

`snode.dense(indices, shape)`

Parameters:
• snode – (SNode) parent node the child is derived from
• indices – (Index or Indices) indices used for this node
• shape – (scalar or tuple) shape of the tensor

Returns: (SNode) the derived child node

The following code places a 1-D tensor of size `3`:

```
x = ti.var(dt=ti.i32)
ti.root.dense(ti.i, 3).place(x)
```

The following code places a 2-D tensor of shape `(3, 4)`:

```
x = ti.var(dt=ti.i32)
ti.root.dense(ti.ij, (3, 4)).place(x)
```
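As a mental model only (a plain-Python sketch, not Taichi internals, and it ignores the address-space promotion described later), a `dense` node over `ti.ij` with shape `(3, 4)` behaves like a contiguous row-major buffer; this hypothetical helper maps `(i, j)` to a flat offset:

```python
def dense_offset(i, j, shape=(3, 4)):
    """Row-major flat offset of element (i, j) in a dense 2-D block."""
    assert 0 <= i < shape[0] and 0 <= j < shape[1]
    return i * shape[1] + j
```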

Note

If `shape` is a scalar and there are multiple indices, then `shape` will be automatically expanded to fit the number of indices. For example,

```
snode.dense(ti.ijk, 3)
```

is equivalent to

```
snode.dense(ti.ijk, (3, 3, 3))
```
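The expansion rule in the note above can be sketched in plain Python (a hypothetical helper, not part of the Taichi API):

```python
def expand_shape(shape, num_indices):
    """Expand a scalar shape to one entry per index, as described in the note."""
    if isinstance(shape, int):
        return (shape,) * num_indices
    return tuple(shape)
```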
`snode.dynamic(index, size, chunk_size=None)`

Parameters:
• snode – (SNode) parent node the child is derived from
• index – (Index) the index of the `dynamic` node
• size – (scalar) the maximum size of the dynamic node
• chunk_size – (optional, scalar) the number of elements in each dynamic memory allocation chunk

Returns: (SNode) the derived child node

`dynamic` nodes act like `std::vector` in C++ or `list` in Python. Taichi's dynamic memory allocation system allocates their memory on the fly.

The following places a 1-D dynamic tensor of maximum size `16`:

```
ti.root.dynamic(ti.i, 16).place(x)
```
`snode.bitmasked()`
`snode.pointer()`
`snode.hash()`

## Working with `dynamic` SNodes¶

`ti.length(snode, indices)`

Parameters:
• snode – (SNode, dynamic)
• indices – (scalar or tuple of scalars) the `dynamic` node indices

Returns: (scalar) the current size of the dynamic node
`ti.append(snode, indices, val)`

Parameters:
• snode – (SNode, dynamic)
• indices – (scalar or tuple of scalars) the `dynamic` node indices
• val – (depends on SNode data type) the value to store

Returns: (`int32`) the size of the dynamic node, before appending

Inserts `val` into the `dynamic` node with indices `indices`.
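The semantics of `ti.length` and `ti.append` can be mimicked with a plain-Python analogy (a hypothetical class, not the Taichi API): `append` returns the size before insertion, and `length` reports the current size.

```python
class DynamicNode:
    """Toy stand-in for a 1-D dynamic SNode with a fixed maximum size."""

    def __init__(self, max_size):
        self.max_size = max_size
        self.data = []

    def append(self, val):
        # Like ti.append: returns the size *before* the insertion.
        assert len(self.data) < self.max_size, "dynamic node is full"
        old_len = len(self.data)
        self.data.append(val)
        return old_len

    def length(self):
        # Like ti.length: the current number of stored elements.
        return len(self.data)
```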

## Taichi tensors like powers of two¶

Non-power-of-two tensor dimensions are promoted to powers of two, so these tensors occupy more virtual address space. For example, a (dense) tensor of shape `(18, 65)` will be materialized as `(32, 128)`.
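The promotion can be sketched in plain Python (hypothetical helpers, not Taichi's actual implementation): each dimension is rounded up to the next power of two.

```python
def next_pow2(n):
    """Smallest power of two >= n (for n >= 1)."""
    p = 1
    while p < n:
        p *= 2
    return p

def promoted_shape(shape):
    """Round every dimension up to a power of two."""
    return tuple(next_pow2(d) for d in shape)
```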

## Indices¶

`ti.i`
`ti.j`
`ti.k`
`ti.ij`
`ti.ijk`
`ti.ijkl`
`ti.indices(a, b, ...)`

(TODO)