Usage
General Purpose (@finch)
Most users will want to use the @finch macro, which executes the given program immediately in the given scope. The program will be JIT-compiled on the first call to @finch with the given array argument types. If the array arguments to @finch are type stable, the program will be JIT-compiled when the surrounding function is compiled.
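For example, a minimal sketch of that pattern wraps @finch in a small function so the argument types are concrete when the surrounding function is compiled (the name vector_add! and the loop body below are illustrative, not part of the Finch API):

    using Finch

    # A sketch: inside this function the argument types are concrete, so the
    # @finch program is compiled when vector_add! itself is compiled.
    function vector_add!(A, B, C)
        @finch begin
            A .= 0
            for i = _
                A[i] = B[i] + C[i]
            end
        end
        return A
    end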
Very often, the best way to inspect Finch compiler behavior is through the @finch_code macro, which prints the generated code instead of executing it.
Finch.@finch — Macro

    @finch [options...] prgm
Run a finch program prgm. The syntax for a finch program is a set of nested loops, statements, and branches over pointwise array assignments. For example, the following program computes the sum of two arrays, A = B + C:
    @finch begin
        A .= 0
        for i = _
            A[i] = B[i] + C[i]
        end
        return A
    end
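For instance, a self-contained sketch of this example, assuming dense one-dimensional inputs (the particular tensor formats here are just one reasonable choice):

    using Finch

    B = Tensor(Dense(Element(0.0)), [1.0, 2.0, 3.0])
    C = Tensor(Dense(Element(0.0)), [4.0, 5.0, 6.0])
    A = Tensor(Dense(Element(0.0)))   # output; dimensions are inferred from B and C

    @finch begin
        A .= 0
        for i = _
            A[i] = B[i] + C[i]
        end
        return A
    end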
Finch programs are composed using the following syntax:
- arr .= 0: an array declaration initializing arr to zero.
- arr[inds...]: an array access; the array must be a variable and each index may be another finch expression.
- x + y, f(x, y): function calls, where x and y are finch expressions.
- arr[inds...] = ex: an array assignment expression, setting arr[inds] to the value of ex.
- arr[inds...] += ex: an incrementing array expression, adding ex to arr[inds]. *, &, and | are also supported.
- arr[inds...] <<min>>= ex: an incrementing array expression with a custom operator, e.g. <<min>> is the minimum operator (see the sketch after this list).
- for i = _ body end: a loop over the index i, where _ is computed from array accesses with i in body.
- if cond body end: a conditional branch that executes only iterations where cond is true.
- return (tnss...,): at global scope, exit the program and return the tensors tnss with their new dimensions. By default, any tensor declared in global scope is returned.
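As a sketch of the incrementing forms above, the following accumulates a sum and a minimum into Scalar results (Scalar is Finch's zero-dimensional tensor; the names v, s, and m are illustrative, and they are ordinary variables taken from the surrounding scope):

    using Finch

    v = Tensor(Dense(Element(0.0)), [3.0, 1.0, 4.0, 1.0, 5.0])
    s = Scalar(0.0)    # running sum
    m = Scalar(Inf)    # running minimum

    @finch begin
        for i = _
            s[] += v[i]            # incrementing assignment
            m[] <<min>>= v[i]      # incrementing assignment with a custom operator
        end
    end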
Symbols are used to represent variables, and their values are taken from the environment. Loops introduce index variables into the scope of their bodies.
Finch uses the types of the arrays and symbolic analysis to discover program optimizations. If B and C are sparse array types, the program will only run over the nonzeros of either.
Semantically, Finch programs execute every iteration. However, Finch can use sparsity information to reliably skip iterations when possible.
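For instance, a sketch of checking this with @finch_code on sparse operands (SparseList is one sparse format choice; the sizes and density here are illustrative):

    using Finch

    B = Tensor(SparseList(Element(0.0)), fsprand(1_000, 0.01))
    C = Tensor(SparseList(Element(0.0)), fsprand(1_000, 0.01))
    A = Tensor(SparseList(Element(0.0)))

    # The printed code should loop over the stored values of B and C only,
    # rather than over all 1_000 positions.
    @finch_code begin
        A .= 0
        for i = _
            A[i] = B[i] + C[i]
        end
    end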
options are optional keyword arguments:

- algebra: the algebra to use for the program. The default is DefaultAlgebra().
- mode: the optimization mode to use for the program. Possible modes are:
  - :debug: run the program in debug mode, with bounds checking and better error handling.
  - :safe: run the program in safe mode, with modest checks for performance and correctness.
  - :fast: run the program in fast mode, with no checks or warnings; this mode is for power users.
  The default is :safe.
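For example, a sketch of selecting a mode, assuming the options are written as keyword-style arguments before the program as the signature @finch [options...] prgm indicates:

    @finch mode=:debug begin
        A .= 0
        for i = _
            A[i] = B[i] + C[i]
        end
    end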
See also: @finch_code
Finch.@finch_code — Macro

    @finch_code [options...] prgm

Return the code that would be executed in order to run a finch program prgm.
See also: @finch
Ahead Of Time (@finch_kernel)
While @finch is the recommended way to use Finch, it is also possible to run Finch ahead of time. The @finch_kernel macro generates a function definition ahead of time, which can be evaluated and then called later.
There are several reasons one might want to do this:
- If we want to make tweaks to the Finch implementation, we can directly modify the source code of the resulting function.
- When benchmarking Finch functions, we can easily and reliably ensure the benchmarked code is inferrable.
- If we want to use Finch to generate code but don't want to include Finch as a dependency in our project, we can use @finch_kernel to generate the functions ahead of time and copy and paste the generated code into our project (a sketch of this appears after the example below). Consider automating this workflow to keep the kernels up to date!
Finch.@finch_kernel — Macro

    @finch_kernel [options...] fname(args...) = prgm

Return a definition for a function named fname which executes @finch prgm on the arguments args. args should be a list of variables holding representative argument instances or types.
See also: @finch
As an example, the following code generates an SpMV (sparse matrix-vector multiply) kernel definition, evaluates the definition, and then calls the kernel several times.
    let
        # Representative arguments: these carry the tensor formats that the
        # generated kernel will be specialized to.
        A = Tensor(Dense(SparseList(Element(0.0))))
        x = Tensor(Dense(Element(0.0)))
        y = Tensor(Dense(Element(0.0)))
        def = @finch_kernel function spmv(y, A, x)
            y .= 0.0
            for j = _, i = _
                y[i] += A[i, j] * x[j]
            end
            return y
        end
        # Evaluate the returned definition to define `spmv` in this module.
        eval(def)
    end

    function main()
        for i = 1:10
            # Arguments must match the formats used to generate the kernel.
            A2 = Tensor(Dense(SparseList(Element(0.0))), fsprand(10, 10, 0.1))
            x2 = Tensor(Dense(Element(0.0)), rand(10))
            y2 = Tensor(Dense(Element(0.0)))
            spmv(y2, A2, x2)
        end
    end

    main()
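To support the copy-and-paste workflow mentioned earlier, the returned definition can also be written to a source file; this is only a sketch, and the file name spmv_kernel.jl is illustrative:

    using Finch

    A = Tensor(Dense(SparseList(Element(0.0))))
    x = Tensor(Dense(Element(0.0)))
    y = Tensor(Dense(Element(0.0)))

    def = @finch_kernel function spmv(y, A, x)
        y .= 0.0
        for j = _, i = _
            y[i] += A[i, j] * x[j]
        end
        return y
    end

    # Write the generated definition out so it can be pasted into another
    # project, as described in the list above.
    open("spmv_kernel.jl", "w") do io
        println(io, def)
    end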