phygnn.layers.handlers.HiddenLayers
- class HiddenLayers(hidden_layers)[source]
Bases:
object
Class to handle TensorFlow hidden layers
- Parameters:
hidden_layers (list) – List of dictionaries of keyword arguments for each hidden layer in the NN. Dense linear layers can be input together with their activations, or separately for more explicit control over the layer ordering. For example, this is a valid hidden_layers input that will yield 7 hidden layers (the first entry expands to Dense + activation + dropout, and each remaining entry yields one layer):
- [{'units': 64, 'activation': 'relu', 'dropout': 0.01},
  {'units': 64},
  {'batch_normalization': {'axis': -1}},
  {'activation': 'relu'},
  {'dropout': 0.01}]
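The expansion rule above can be sketched in plain Python. This is an illustrative sketch of the counting semantics only, not phygnn's implementation; the helper name `count_expanded_layers` is hypothetical.

```python
# Hypothetical sketch (not the phygnn implementation): count how many
# individual layers a hidden_layers spec expands to. 'units', 'activation',
# and 'dropout' each contribute one layer, as does any other key such as
# 'batch_normalization'; 'name' is metadata and contributes none.
def count_expanded_layers(hidden_layers):
    n = 0
    for entry in hidden_layers:
        for key in ('units', 'activation', 'dropout'):
            if key in entry:
                n += 1
        for key in entry:
            if key not in ('units', 'activation', 'dropout', 'name'):
                n += 1
    return n

spec = [{'units': 64, 'activation': 'relu', 'dropout': 0.01},
        {'units': 64},
        {'batch_normalization': {'axis': -1}},
        {'activation': 'relu'},
        {'dropout': 0.01}]
# count_expanded_layers(spec) -> 7, matching the docstring's claim
```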
Methods
- add_layer(layer_kwargs) – Add a hidden layer to the DNN.
- add_layer_by_class(class_name, **kwargs) – Add a new layer by class name, from either phygnn.layers.custom_layers or tf.keras.layers.
- add_skip_layer(name) – Add a skip layer, looking for a prior skip connection start point if one is already in the layer list.
- compile(model, hidden_layers) – Add hidden layers to a model.
- parse_repeats(hidden_layers) – Parse repeat layers.
Attributes
- bias_weights – Get a list of the NN bias weights (tensors).
- hidden_layer_kwargs – List of dictionaries of keyword arguments for each hidden layer in the NN.
- kernel_weights – Get a list of the NN kernel weights (tensors).
- layers – TensorFlow keras layers.
- skip_layers – Get a dictionary of unique SkipConnection objects in the layers list, keyed by SkipConnection name.
- weights – Get a list of layer weights for gradient calculations.
- static parse_repeats(hidden_layers)[source]
Parse repeat layers. An entry must have both "repeat" and "n" keys to repeat one or more layers.
- Parameters:
hidden_layers (list) – Hidden layer kwargs, possibly including entries of the form {'n': 2, 'repeat': [{…}, {…}]} that duplicate the repeat sub-list n times.
- Returns:
hidden_layers (list) – Hidden layer kwargs with 'repeat' entries expanded.
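The expansion described above can be sketched as follows. This is a minimal illustration of the documented semantics, not the library's code; `parse_repeats_sketch` is a hypothetical name.

```python
import copy

# Hypothetical sketch of the 'repeat' expansion (not phygnn's code):
# entries of the form {'n': N, 'repeat': [...]} are replaced in place
# by N deep copies of the repeat sub-list; other entries pass through.
def parse_repeats_sketch(hidden_layers):
    out = []
    for entry in hidden_layers:
        if 'repeat' in entry:
            for _ in range(entry['n']):
                out.extend(copy.deepcopy(entry['repeat']))
        else:
            out.append(entry)
    return out

layers = [{'units': 32},
          {'n': 2, 'repeat': [{'units': 64}, {'activation': 'relu'}]}]
expanded = parse_repeats_sketch(layers)
# expanded holds 5 entries: the initial dict plus two copies of the sub-list
```

Deep copies are used in the sketch so that later mutation of one repeated entry cannot silently change its siblings.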
- property layers
TensorFlow keras layers
- Returns:
list
- property skip_layers
Get a dictionary of unique SkipConnection objects in the layers list keyed by SkipConnection name.
- Returns:
dict
- property hidden_layer_kwargs
List of dictionaries of keyword arguments for each hidden layer in the NN. This is a copy of the hidden_layers input arg that can be used to reconstruct the network.
- Returns:
list
- property weights
Get a list of layer weights for gradient calculations.
- Returns:
list
- property kernel_weights
Get a list of the NN kernel weights (tensors), which can be used for kernel regularization. Does not include the input layer or dropout layers, but does include the output layer.
- Returns:
list
- property bias_weights
Get a list of the NN bias weights (tensors), which can be used for bias regularization. Does not include the input layer or dropout layers, but does include the output layer.
- Returns:
list
- add_skip_layer(name)[source]
Add a skip layer, looking for a prior skip connection start point if one is already in the layer list.
- Parameters:
name (str) – Unique string identifier of the skip connection. The skip endpoint should have the same name.
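The start/endpoint pairing described above can be sketched with a toy object. This is a simplified, hypothetical illustration of the skip-connection bookkeeping, not phygnn's SkipConnection layer; scalar addition stands in for tensor addition.

```python
# Hypothetical sketch of skip-connection semantics (not phygnn's
# SkipConnection): the first call with a given instance caches its input
# (the start point); the second call adds that cached input back to the
# current value (the endpoint), then resets for reuse.
class SkipConnectionSketch:
    def __init__(self, name):
        self.name = name       # unique identifier shared by start and end
        self._cache = None

    def __call__(self, x):
        if self._cache is None:
            self._cache = x    # start point: remember the input, pass through
            return x
        out = x + self._cache  # endpoint: add the cached start-point input
        self._cache = None     # reset so the connection can be reused
        return out

skip = SkipConnectionSketch('skip_1')
start = skip(2.0)   # start point returns its input unchanged
end = skip(5.0)     # endpoint adds the cached 2.0 back in
```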
- add_layer_by_class(class_name, **kwargs)[source]
Add a new layer by the class name, either from phygnn.layers.custom_layers or tf.keras.layers
- Parameters:
class_name (str) – Class name from phygnn.layers.custom_layers or tf.keras.layers.
kwargs (dict) – Keyword arguments used to initialize the class.
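The lookup this method implies can be sketched with `getattr` over an ordered list of namespaces. Stand-in namespaces are used below instead of phygnn.layers.custom_layers and tf.keras.layers so the sketch stays self-contained; `get_layer_class` is a hypothetical helper name.

```python
import types

# Stand-ins for phygnn.layers.custom_layers and tf.keras.layers
# (hypothetical, so the sketch runs without either package installed).
custom_layers = types.SimpleNamespace(MySpecialLayer=type('MySpecialLayer', (), {}))
keras_layers = types.SimpleNamespace(Dense=type('Dense', (), {}))

def get_layer_class(class_name, modules=(custom_layers, keras_layers)):
    """Resolve a layer class by name, checking each module in order."""
    for module in modules:
        cls = getattr(module, class_name, None)
        if cls is not None:
            return cls
    raise KeyError(f'Could not find layer class "{class_name}"')

dense_cls = get_layer_class('Dense')
layer = dense_cls()  # **kwargs would be forwarded to the constructor here
```

Checking the custom-layers namespace first lets project-specific layers shadow same-named Keras layers, which matches the order the docstring lists the two sources in.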
- add_layer(layer_kwargs)[source]
Add a hidden layer to the DNN.
- Parameters:
layer_kwargs (dict) – Dictionary of keyword arguments for this layer. For example, any of the following are valid inputs:
- {'units': 64, 'activation': 'relu', 'dropout': 0.05}
- {'units': 64, 'name': 'relu1'}
- {'activation': 'relu'}
- {'batch_normalization': {'axis': -1}}
- {'dropout': 0.1}
- classmethod compile(model, hidden_layers)[source]
Add hidden layers to model
- Parameters:
model (tensorflow.keras) – Model to add hidden layers to.
hidden_layers (list) – List of dictionaries of keyword arguments for each hidden layer in the NN. Dense linear layers can be input together with their activations, or separately for more explicit control over the layer ordering. For example, this is a valid hidden_layers input that will yield 7 hidden layers:
- [{'units': 64, 'activation': 'relu', 'dropout': 0.01},
  {'units': 64},
  {'batch_normalization': {'axis': -1}},
  {'activation': 'relu'},
  {'dropout': 0.01}]
- Returns:
model (tensorflow.keras) – Model with layers added
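The compile() flow can be sketched end to end: expand any 'repeat' entries, then add each entry's layers to the model in order. This is a hypothetical sketch under simplified assumptions, not phygnn's implementation; a plain list of (kind, value) tuples stands in for the keras model.

```python
# Hypothetical sketch of the compile() flow (not phygnn's code): first
# expand {'n': ..., 'repeat': [...]} entries, then add each entry's
# layers to the model in list order. A plain list stands in for the
# tensorflow.keras model.
def compile_sketch(model_layers, hidden_layers):
    flat = []
    for entry in hidden_layers:
        if 'repeat' in entry:
            flat.extend(list(entry['repeat']) * entry['n'])
        else:
            flat.append(entry)
    for entry in flat:
        # 'units'/'activation'/'dropout' each yield one layer, in this order
        for key in ('units', 'activation', 'dropout'):
            if key in entry:
                model_layers.append((key, entry[key]))
        # any other key (e.g. 'batch_normalization') yields one layer too
        for key in entry:
            if key not in ('units', 'activation', 'dropout', 'name'):
                model_layers.append((key, entry[key]))
    return model_layers

model = compile_sketch([], [{'units': 64, 'activation': 'relu', 'dropout': 0.01},
                            {'units': 64},
                            {'batch_normalization': {'axis': -1}},
                            {'activation': 'relu'},
                            {'dropout': 0.01}])
# model holds 7 entries, one per hidden layer in the docstring's example
```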