OmniSafe Model Utils

  • initialize_layer(init_function, layer) – Initialize the layer with the given initialization function.

  • get_activation(activation) – Get the activation function.

  • build_mlp_network(sizes, activation[, ...]) – Build the MLP network.

Model Building Utils

Documentation

omnisafe.utils.model.initialize_layer(init_function, layer)

Initialize the layer with the given initialization function.

The init_function can be chosen from: ``kaiming_uniform``, ``xavier_normal``, ``glorot``, ``xavier_uniform``, ``orthogonal``.

Parameters:
  • init_function (InitFunction) – The initialization function.

  • layer (nn.Linear) – The layer to be initialized.

Return type:

None
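
A minimal usage sketch (the 64-unit layer width is illustrative; since the documented return type is None, the layer's weights are updated in place):

>>> import torch.nn as nn
>>> from omnisafe.utils.model import initialize_layer
>>> layer = nn.Linear(64, 64)  # layer whose weights will be re-initialized
>>> initialize_layer('kaiming_uniform', layer)  # returns None; weights updated in place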

omnisafe.utils.model.get_activation(activation)

Get the activation function.

The activation can be chosen from: ``identity``, ``relu``, ``sigmoid``, ``softplus``, ``tanh``.

Parameters:

activation (Activation) – The activation function.

Returns:

The activation function class: one of ``nn.Identity``, ``nn.ReLU``, ``nn.Sigmoid``, ``nn.Softplus``, or ``nn.Tanh``.

Return type:

type[nn.Identity | nn.ReLU | nn.Sigmoid | nn.Softplus | nn.Tanh]
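
A short sketch of typical use (per the return type above, the class itself is returned, so it must be instantiated before use):

>>> from omnisafe.utils.model import get_activation
>>> act_cls = get_activation('relu')  # returns the class, not an instance
>>> act_cls
<class 'torch.nn.modules.activation.ReLU'>
>>> act_cls()  # instantiate before adding it to a model
ReLU()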

omnisafe.utils.model.build_mlp_network(sizes, activation, output_activation='identity', weight_initialization_mode='kaiming_uniform')

Build the MLP network.

Examples

>>> build_mlp_network([64, 64, 64, 64], 'relu', 'tanh')
Sequential(
  (0): Linear(in_features=64, out_features=64, bias=True)
  (1): ReLU()
  (2): Linear(in_features=64, out_features=64, bias=True)
  (3): ReLU()
  (4): Linear(in_features=64, out_features=64, bias=True)
  (5): Tanh()
)

Parameters:
  • sizes (list of int) – The sizes of the layers, from the input dimension through the hidden layers to the output dimension.

  • activation (Activation) – The activation function used between hidden layers.

  • output_activation (Activation, optional) – The activation function applied after the final layer. Defaults to 'identity'.

  • weight_initialization_mode (InitFunction, optional) – The weight initialization mode. Defaults to 'kaiming_uniform'.

Returns:

The MLP network.

Return type:

Module
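
A further sketch showing a built network used for a forward pass (the sizes and batch shape here are illustrative):

>>> import torch
>>> from omnisafe.utils.model import build_mlp_network
>>> net = build_mlp_network([8, 32, 1], 'relu', weight_initialization_mode='xavier_uniform')
>>> net(torch.randn(4, 8)).shape  # batch of 4 inputs with 8 features each
torch.Size([4, 1])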