Source: keras_text/models/layers.py#L0


AttentionLayer

Attention layer that computes a learned attention over the input sequence.

For details, see the following papers:

  • https://www.cs.cmu.edu/~diyiy/docs/naacl16.pdf
  • http://colinraffel.com/publications/iclr2016feed.pdf (fig. 1)

Input:

  • x: Input tensor of shape (..., time_steps, features), where features must be static (known).

Output: 2D tensor of shape (..., features), i.e. the time_steps axis is attended over and reduced.
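
A minimal usage sketch (the surrounding model is hypothetical; only AttentionLayer comes from this module), assuming the typical setup where a recurrent layer returns the full sequence for the attention to reduce:

    from keras.models import Sequential
    from keras.layers import Embedding, LSTM, Dense
    from keras_text.models.layers import AttentionLayer

    model = Sequential([
        Embedding(20000, 300, input_length=10, mask_zero=True),  # (batch, 10, 300)
        LSTM(128, return_sequences=True),  # keep the time axis: (batch, 10, 128)
        AttentionLayer(),                  # attend over and reduce it: (batch, 128)
        Dense(1, activation='sigmoid'),
    ])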

AttentionLayer.built

AttentionLayer.input

Retrieves the input tensor(s) of a layer.

Only applicable if the layer has exactly one inbound node, i.e. if it is connected to one incoming layer.

Returns

Input tensor or list of input tensors.

Raises

  • AttributeError: if the layer is connected to more than one incoming layer.

AttentionLayer.input_mask

Retrieves the input mask tensor(s) of a layer.

Only applicable if the layer has exactly one inbound node, i.e. if it is connected to one incoming layer.

Returns

Input mask tensor (potentially None) or list of input mask tensors.

Raises

  • AttributeError: if the layer is connected to more than one incoming layer.

AttentionLayer.input_shape

Retrieves the input shape tuple(s) of a layer.

Only applicable if the layer has exactly one inbound node, i.e. if it is connected to one incoming layer.

Returns

Input shape tuple (or list of input shape tuples, one tuple per input tensor).

Raises

  • AttributeError: if the layer is connected to more than one incoming layer.

AttentionLayer.losses

AttentionLayer.non_trainable_weights

AttentionLayer.output

Retrieves the output tensor(s) of a layer.

Only applicable if the layer has exactly one inbound node, i.e. if it is connected to one incoming layer.

Returns

Output tensor or list of output tensors.

Raises

  • AttributeError: if the layer is connected to more than one incoming layer.

AttentionLayer.output_mask

Retrieves the output mask tensor(s) of a layer.

Only applicable if the layer has exactly one inbound node, i.e. if it is connected to one incoming layer.

Returns

Output mask tensor (potentially None) or list of output mask tensors.

Raises

  • AttributeError: if the layer is connected to more than one incoming layer.

AttentionLayer.output_shape

Retrieves the output shape tuple(s) of a layer.

Only applicable if the layer has one inbound node, or if all inbound nodes have the same output shape.

Returns

Output shape tuple (or list of output shape tuples, one tuple per output tensor).

Raises

  • AttributeError: if the layer is connected to more than one incoming layer.

AttentionLayer.trainable_weights

AttentionLayer.updates

AttentionLayer.weights


AttentionLayer.__init__

__init__(self, kernel_initializer="he_normal", kernel_regularizer=None, kernel_constraint=None, \
    use_bias=True, bias_initializer="zeros", bias_regularizer=None, bias_constraint=None, \
    use_context=True, context_initializer="he_normal", context_regularizer=None, \
    context_constraint=None, attention_dims=None, **kwargs)

Args:

  • attention_dims: The dimensionality of the inner attention-scoring network. For an input of shape (32, 10, 300) with attention_dims=100, the inner output has shape (32, 10, 100), i.e. each of the 10 attended words is 100-dimensional. This is then collapsed via summation to (32, 10, 1), giving one attention weight per word. If set to None, the feature dimension is used as attention_dims. (Default value: None) See the shape walk-through below.
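
A NumPy shape walk-through of the example above (a sketch of the attention formulation in the papers cited earlier; variable names are illustrative, not the layer's internals):

    import numpy as np

    batch, time_steps, features, attention_dims = 32, 10, 300, 100

    x = np.random.randn(batch, time_steps, features)
    W = np.random.randn(features, attention_dims)  # kernel
    b = np.zeros(attention_dims)                   # bias (use_bias=True)
    c = np.random.randn(attention_dims)            # context vector (use_context=True)

    u = np.tanh(x @ W + b)          # (32, 10, 100): inner attention network
    scores = (u * c).sum(axis=-1)   # (32, 10): collapsed to one score per word
    alpha = np.exp(scores)
    alpha /= alpha.sum(axis=1, keepdims=True)  # softmax over time_steps
    out = (alpha[..., None] * x).sum(axis=1)   # (32, 300): time axis reduced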

AttentionLayer.build

build(self, input_shape)

AttentionLayer.call

call(self, x, mask=None)

AttentionLayer.compute_mask

compute_mask(self, input, input_mask=None)

AttentionLayer.compute_output_shape

compute_output_shape(self, input_shape)

AttentionLayer.get_attention_tensor

get_attention_tensor(self)
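
A hypothetical inspection sketch, assuming (as the name suggests) that get_attention_tensor returns the layer's attention weights over time_steps:

    from keras import backend as K

    attn_layer = model.layers[2]  # the AttentionLayer in the sketch above
    get_attention = K.function([model.input],
                               [attn_layer.get_attention_tensor()])
    # x_batch: a NumPy array of token ids, shape (batch, 10)
    weights = get_attention([x_batch])[0]  # one weight per word, per sample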

AttentionLayer.get_config

get_config(self)

ConsumeMask

Layer that prevents mask propagation.
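
A minimal usage sketch (hypothetical pipeline): drop the mask emitted by a masking Embedding before layers such as Conv1D that do not support masking:

    from keras.models import Sequential
    from keras.layers import Embedding, Conv1D, GlobalMaxPooling1D, Dense
    from keras_text.models.layers import ConsumeMask

    model = Sequential([
        Embedding(20000, 300, input_length=50, mask_zero=True),  # emits a mask
        ConsumeMask(),                     # mask stops here
        Conv1D(64, 5, activation='relu'),  # would raise if it received a mask
        GlobalMaxPooling1D(),
        Dense(1, activation='sigmoid'),
    ])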

ConsumeMask.built

ConsumeMask.input

Retrieves the input tensor(s) of a layer.

Only applicable if the layer has exactly one inbound node, i.e. if it is connected to one incoming layer.

Returns

Input tensor or list of input tensors.

Raises

  • AttributeError: if the layer is connected to more than one incoming layer.

ConsumeMask.input_mask

Retrieves the input mask tensor(s) of a layer.

Only applicable if the layer has exactly one inbound node, i.e. if it is connected to one incoming layer.

Returns

Input mask tensor (potentially None) or list of input mask tensors.

Raises

  • AttributeError: if the layer is connected to more than one incoming layer.

ConsumeMask.input_shape

Retrieves the input shape tuple(s) of a layer.

Only applicable if the layer has exactly one inbound node, i.e. if it is connected to one incoming layer.

Returns

Input shape tuple (or list of input shape tuples, one tuple per input tensor).

Raises

  • AttributeError: if the layer is connected to more than one incoming layer.

ConsumeMask.losses

ConsumeMask.non_trainable_weights

ConsumeMask.output

Retrieves the output tensor(s) of a layer.

Only applicable if the layer has exactly one inbound node, i.e. if it is connected to one incoming layer.

Returns

Output tensor or list of output tensors.

Raises

  • AttributeError: if the layer is connected to more than one incoming layer.

ConsumeMask.output_mask

Retrieves the output mask tensor(s) of a layer.

Only applicable if the layer has exactly one inbound node, i.e. if it is connected to one incoming layer.

Returns

Output mask tensor (potentially None) or list of output mask tensors.

Raises

  • AttributeError: if the layer is connected to more than one incoming layer.

ConsumeMask.output_shape

Retrieves the output shape tuple(s) of a layer.

Only applicable if the layer has one inbound node, or if all inbound nodes have the same output shape.

Returns

Output shape tuple (or list of output shape tuples, one tuple per output tensor).

Raises

  • AttributeError: if the layer is connected to more than one incoming layer.

ConsumeMask.trainable_weights

ConsumeMask.updates

ConsumeMask.weights


ConsumeMask.call

call(self, x, mask=None)

ConsumeMask.compute_mask

compute_mask(self, input, input_mask=None)
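
For reference, a mask-consuming layer of this kind is conventionally written as a pass-through whose compute_mask returns None (an illustrative sketch, not necessarily this module's exact implementation):

    from keras.layers import Layer

    class ConsumeMaskSketch(Layer):
        def __init__(self, **kwargs):
            super(ConsumeMaskSketch, self).__init__(**kwargs)
            self.supports_masking = True  # accept an incoming mask...

        def call(self, x, mask=None):
            return x                      # pass the data through unchanged

        def compute_mask(self, input, input_mask=None):
            return None                   # ...but do not propagate it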