Merge layers
class lasagne.layers.ConcatLayer(incomings, axis=1, cropping=None, **kwargs)

    Concatenates multiple inputs along the specified axis. Inputs should have the same shape except for the dimension specified in axis, which can have different sizes.

    Parameters:
        incomings : a list of Layer instances or tuples
            The layers feeding into this layer, or expected input shapes.
        axis : int
            The axis along which the inputs are joined.
        cropping : None or [crop]
            Cropping for each input axis. Cropping is described in the docstring for autocrop(). Cropping is always disabled for axis.
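    A minimal usage sketch (the layer sizes here are illustrative, not part of the API): two convolutional branches over the same input are joined along the channel axis.

        from lasagne.layers import InputLayer, Conv2DLayer, ConcatLayer

        # Two branches over the same input, each producing maps of
        # spatial size 8x8 but with different numbers of channels.
        l_in = InputLayer((None, 3, 8, 8))
        l_a = Conv2DLayer(l_in, num_filters=16, filter_size=3, pad='same')
        l_b = Conv2DLayer(l_in, num_filters=32, filter_size=3, pad='same')

        # Join along the channel axis (axis=1); all other dimensions
        # must match across the inputs.
        l_concat = ConcatLayer([l_a, l_b], axis=1)
        print(l_concat.output_shape)  # (None, 48, 8, 8)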
lasagne.layers.concat
    alias of ConcatLayer
class lasagne.layers.ElemwiseMergeLayer(incomings, merge_function, cropping=None, **kwargs)

    This layer performs an elementwise merge of its input layers. It requires all input layers to have the same output shape.

    Parameters:
        incomings : a list of Layer instances or tuples
            The layers feeding into this layer, or expected input shapes, with all incoming shapes being equal.
        merge_function : callable
            The merge function to use. It should take two arguments and return the updated value. Some possible merge functions are theano.tensor.mul, add, maximum and minimum.
        cropping : None or [crop]
            Cropping for each input axis. Cropping is described in the docstring for autocrop().

    See also:
        ElemwiseSumLayer : shortcut for a sum layer.
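    A short sketch of merging two equal-shaped dense branches with theano.tensor.maximum; the unit counts are illustrative.

        import theano.tensor as T
        from lasagne.layers import InputLayer, DenseLayer, ElemwiseMergeLayer

        l_in = InputLayer((None, 100))
        l_a = DenseLayer(l_in, num_units=50)
        l_b = DenseLayer(l_in, num_units=50)

        # Elementwise maximum of the two branches; both inputs must
        # have the same output shape, here (None, 50).
        l_max = ElemwiseMergeLayer([l_a, l_b], merge_function=T.maximum)
        print(l_max.output_shape)  # (None, 50)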
class lasagne.layers.ElemwiseSumLayer(incomings, coeffs=1, cropping=None, **kwargs)

    This layer performs an elementwise sum of its input layers. It requires all input layers to have the same output shape.

    Parameters:
        incomings : a list of Layer instances or tuples
            The layers feeding into this layer, or expected input shapes, with all incoming shapes being equal.
        coeffs : list or scalar
            A same-sized list of coefficients, or a single coefficient to be applied to all inputs. By default, the coefficients are not included in the learnable parameters of this layer.
        cropping : None or [crop]
            Cropping for each input axis. Cropping is described in the docstring for autocrop().

    Notes:
        Depending on your architecture, this can be used to avoid the more costly ConcatLayer. For example, instead of concatenating layers before a DenseLayer, insert separate DenseLayer instances of the same number of output units and add them up afterwards. (This avoids the copy operations in concatenation, but splits up the dot product.) A sketch of this substitution follows below.
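    A minimal sketch of the substitution described in the notes, assuming two inputs feeding a 128-unit dense layer (the sizes are illustrative). The per-input dense layers are kept linear and the nonlinearity is applied once after the sum, so that variant 2 computes the same function as variant 1; one of the dense layers drops its bias (b=None) so the bias is not counted twice.

        from lasagne.layers import (InputLayer, DenseLayer, ConcatLayer,
                                    ElemwiseSumLayer, NonlinearityLayer)
        from lasagne.nonlinearities import rectify

        l_a = InputLayer((None, 64))
        l_b = InputLayer((None, 32))

        # Variant 1: concatenate first, then a single dense layer.
        l_cat = DenseLayer(ConcatLayer([l_a, l_b], axis=1), num_units=128,
                           nonlinearity=rectify)

        # Variant 2: one (linear) dense layer per input, summed elementwise,
        # then the nonlinearity applied once. This avoids the copy operations
        # in concatenation, but splits up the dot product.
        l_sum = ElemwiseSumLayer([
            DenseLayer(l_a, num_units=128, nonlinearity=None),
            DenseLayer(l_b, num_units=128, nonlinearity=None, b=None),
        ])
        l_out = NonlinearityLayer(l_sum, nonlinearity=rectify)
        print(l_out.output_shape)  # (None, 128)

    Passing coeffs=[1, -1] to ElemwiseSumLayer instead would compute an elementwise difference of the two branches.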