Version: 3.x

rasa.utils.tensorflow.data_generator

RasaDataGenerator Objects

class RasaDataGenerator(Sequence)

Abstract data generator.

__init__

def __init__(model_data: RasaModelData,
             batch_size: Union[int, List[int]],
             batch_strategy: Text = SEQUENCE,
             shuffle: bool = True)

Initializes the data generator.

Arguments:

  • model_data - The model data to use.
  • batch_size - The batch size(s).
  • batch_strategy - The batch strategy.
  • shuffle - If 'True', data should be shuffled.
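
RasaDataGenerator is abstract: concrete subclasses implement __len__, __getitem__, and on_epoch_end (documented below); RasaBatchDataGenerator further down is the implementation shipped with Rasa. The following is a minimal sketch of a custom subclass with a fixed batch size and no shuffling or balancing; the class name and the decision to keep separate references to the model data are illustrative assumptions, not part of Rasa's API.

from typing import Any, Tuple

from rasa.utils.tensorflow.data_generator import RasaDataGenerator
from rasa.utils.tensorflow.model_data import RasaModelData


class SimpleDataGenerator(RasaDataGenerator):
    """Minimal concrete generator with a fixed batch size (illustrative sketch)."""

    def __init__(self, model_data: RasaModelData, batch_size: int) -> None:
        super().__init__(model_data, batch_size, shuffle=False)
        # Keep our own references instead of relying on base-class attributes.
        self._model_data = model_data
        self._batch_size = batch_size

    def __len__(self) -> int:
        # ceil(number of examples / batch size)
        num_examples = self._model_data.number_of_examples()
        return -(-num_examples // self._batch_size)

    def __getitem__(self, index: int) -> Tuple[Any, Any]:
        start = index * self._batch_size
        end = start + self._batch_size
        # prepare_batch (documented below) flattens the slice into a tuple of
        # arrays; the targets live inside the input features, so the target
        # slot of the returned tuple stays None.
        return self.prepare_batch(self._model_data.data, start, end), None

    def on_epoch_end(self) -> None:
        # Nothing to re-shuffle or re-balance in this fixed-size sketch.
        pass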

__len__

def __len__() -> int

Number of batches in the Sequence.

Returns:

The number of batches in the Sequence.

__getitem__

def __getitem__(index: int) -> Tuple[Any, Any]

Gets batch at position index.

Arguments:

  • index - The position of the batch in the Sequence.

Returns:

A batch (tuple of input data and target data).
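
During training, Keras drives the generator by index; the same pattern can be used directly. In the hedged sketch below, generator stands for an instance of any concrete subclass (for example RasaBatchDataGenerator):

# `generator` is assumed to be an instance of a concrete subclass.
for index in range(len(generator)):               # __len__: number of batches
    batch_input, batch_target = generator[index]  # __getitem__: one batch
    # batch_input is a flat tuple of arrays; batch_target may be None because
    # the targets are already part of the model data.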

on_epoch_end

def on_epoch_end() -> None

Updates the data after every epoch.

prepare_batch

@staticmethod
def prepare_batch(
    data: Data,
    start: Optional[int] = None,
    end: Optional[int] = None,
    tuple_sizes: Optional[Dict[Text, int]] = None
) -> Tuple[Optional[np.ndarray], ...]

Slices model data into a batch using the given start and end values.

Arguments:

  • data - The data to prepare.
  • start - The start index of the batch.
  • end - The end index of the batch.
  • tuple_sizes - If a feature is not present, the batch is padded with None values instead; tuple_sizes maps each feature key to the number of None values to add for it.

Returns:

The features of the batch.
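
To make the slicing concrete, the simplified re-sketch below applies the same idea to plain NumPy arrays. It is not the actual implementation: real model data uses FeatureArray objects, and the real method also handles sparse features, padding, and the tuple_sizes fallback. The keys "text", "sentence", "label", and "ids" are made up for illustration.

import numpy as np
from typing import Dict, List, Optional, Text, Tuple

# Simplified stand-in for the nested `Data` structure (made-up keys).
data: Dict[Text, Dict[Text, List[np.ndarray]]] = {
    "text": {"sentence": [np.arange(10).reshape(5, 2)]},
    "label": {"ids": [np.arange(5).reshape(5, 1)]},
}


def prepare_batch_sketch(
    data: Dict[Text, Dict[Text, List[np.ndarray]]],
    start: Optional[int] = None,
    end: Optional[int] = None,
) -> Tuple[np.ndarray, ...]:
    """Slices every feature array to [start:end] and flattens the result."""
    batch = []
    for attribute_data in data.values():
        for feature_arrays in attribute_data.values():
            for array in feature_arrays:
                batch.append(array[start:end])
    return tuple(batch)


# Two arrays, each sliced to the first three examples.
batch = prepare_batch_sketch(data, start=0, end=3)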

RasaBatchDataGenerator Objects

class RasaBatchDataGenerator(RasaDataGenerator)

Data generator with an optionally increasing batch size.

__init__

def __init__(model_data: RasaModelData,
             batch_size: Union[List[int], int],
             epochs: int = 1,
             batch_strategy: Text = SEQUENCE,
             shuffle: bool = True,
             drop_small_last_batch: bool = False)

Initializes the increasing batch size data generator.

Arguments:

  • model_data - The model data to use.
  • batch_size - The batch size(s).
  • epochs - The total number of epochs.
  • batch_strategy - The batch strategy.
  • shuffle - If 'True', data will be shuffled.
  • drop_small_last_batch - If 'True', the last batch of an epoch is dropped if it contains fewer examples than half the batch size.
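
A hedged usage sketch follows: model_data is assumed to be an already assembled RasaModelData instance and model a compiled tf.keras model that consumes the flattened batch tuples; the batch sizes, epoch count, and strategy string are arbitrary example values.

from rasa.utils.tensorflow.data_generator import RasaBatchDataGenerator

EPOCHS = 20

data_generator = RasaBatchDataGenerator(
    model_data,                 # assumed: an assembled RasaModelData
    batch_size=[64, 256],       # a list lets the batch size grow over the epochs
    epochs=EPOCHS,
    batch_strategy="balanced",  # or "sequence"
    shuffle=True,
    drop_small_last_batch=True,
)

# model is assumed to be a compiled tf.keras model.
model.fit(data_generator, epochs=EPOCHS)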

__len__

def __len__() -> int

Number of batches in the Sequence.

Returns:

The number of batches in the Sequence.

__getitem__

def __getitem__(index: int) -> Tuple[Any, Any]

Gets batch at position index.

Arguments:

  • index - The position of the batch in the Sequence.

Returns:

A batch (tuple of input data and target data).

on_epoch_end

def on_epoch_end() -> None

Updates the data after every epoch.