DSL
TensorBuilder's DSL enables you to express the computation you desire to perform as a single flexible structure. The DSL preserves all the features given to you by the `Builder` class:
- Composing operations
- Branching
- Scoping
The `Applicative` class was built to create elements that play well with this language. It also has two very important methods:
- `compile`: generates a function out of a given valid AST/structure (compiles it).
- `pipe`: given a `Builder` or `Tensor` and an AST, compiles the AST to a function and applies it to the Tensor/Builder.
Rules
- All final elements of the AST must be functions; non-final elements are compiled to a function.
- A tuple `()` denotes a sequential operation. It results in the composition of all elements within it.
- A list `[]` denotes a branching operation. It results in the creation of a function that applies the `.branch` method to its argument, and each element in the list results in a branch. It compiles to a function of type `Builder -> BuilderTree`.
- A dict `{}` denotes a scoping operation. It only accepts a single key-value pair; its key must be a Disposable (a context manager) and its value can be any element of the language. It results in the creation of a function that takes a `Builder` as its argument, applies the `with` statement to the `key`, and applies the function of the `value` to its argument inside the `with` block.
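A quick sketch of how the three structures read together (a hypothetical fragment reusing only the `tb` methods shown in this document):
```python
import tensorflow as tf
from tensorbuilder import tb

ast = (                      # tuple: sequential composition
    tb.relu_layer(20),
    {                        # dict: run the value inside `with tf.device(...):`
        tf.device("/gpu:0"):
            tb.linear_layer(10)
    },
    [                        # list: branch into two paths
        tb.softmax(),
        tb.tanh_layer(5)
    ]
)

# Because of the list, the compiled function has type Builder -> BuilderTree
f = tb.compile(ast)
```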
Example
It's easier to understand the DSL with an example, especially because you can see a direct mapping of the concepts brought by the `Builder` class into the DSL:
```python
import tensorflow as tf
from tensorbuilder import tb

x = tf.placeholder(tf.float32, shape=[None, 10])
y = tf.placeholder(tf.float32, shape=[None, 5])

[h, trainer] = tb.pipe(
    x,
    [
        { tf.device("/gpu:0"):
            tb.relu_layer(20)
        }
    ,
        { tf.device("/gpu:1"):
            tb.sigmoid_layer(20)
        }
    ,
        { tf.device("/cpu:0"):
            tb.tanh_layer(20)
        }
    ],
    tb.relu_layer(10)
    .linear_layer(5),
    [
        tb.softmax()  # h
    ,
        tb.softmax_cross_entropy_with_logits(y)
        .reduce_mean()
        .map(tf.train.AdamOptimizer(0.01).minimize)  # trainer
    ],
    tb.tensors()
)
```
Let's go step by step through what is happening here:
- The Tensor `x` is plugged inside a `Builder` and piped through the computational structure defined. All the arguments of `pipe` after `x` are grouped as if they were in a tuple `()`, and the whole expression is compiled to a single function which is then applied to the `Builder` containing `x`.
- Final elements you see here like `tb.softmax()` are `Applicative`s, which as you've been told are functions. As you can see, almost all methods from the `Builder` class are also methods of the `Applicative` class; the difference is that the methods of the `Builder` class actually perform the computation they intend (construct a new Tensor), while the methods of the `Applicative` class compose/define the computation to be done later (see the first sketch after this list).
- There is an implicit tuple `()` element performing a sequential composition of all the other elements. As a result, the visual/spatial ordering of the code corresponds to the intended behavior.
- Lists very naturally express branches. Notice how indentation and the intentional positioning of the `,` comma help to differentiate each branch.
- Expressions like `tb.relu_layer(10)` are polymorphic and work for `Builder`s and `BuilderTree`s alike.
- Scoping is very clean with the `{}` notation. In contrast to using `then_with` from the `Builder` class, here you can actually use the original functions from `tensorflow` unchanged in the `key` of the dict (see the second sketch after this list).
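First, a sketch of the define-now/run-later distinction (reusing `x` from the example above; the assumption here is that `pipe` without a final `tb.tensors()` returns the resulting `Builder`):
```python
# Applicative methods only compose; no Tensor is constructed here:
f = tb.relu_layer(10).linear_layer(5)   # a function Builder -> Builder

# The Tensors are constructed only when the composed function is
# applied, e.g. by piping `x` through it:
builder = tb.pipe(x, f)
```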
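Second, a sketch of the scoping point: the dict key is an unmodified `tensorflow` context manager, so nothing has to be wrapped the way `then_with` requires:
```python
# The value runs inside `with tf.device("/gpu:1"):`
builder = tb.pipe(
    x,
    { tf.device("/gpu:1"):
        tb.relu_layer(20)
    }
)
```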