Reference
Dataset
tf_autoencoder_dataset(iterator)
Create an autoencoder tf.data.Dataset from the iterator
The dataset is constructed from a generator
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `iterator` | `WindowedDatasetIterator` | The data iterator | *required* |

Returns:

| Type | Description |
|---|---|
| `Dataset` | A TensorFlow dataset |
Source code in ceruleo/models/keras/dataset.py
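The pairing logic can be sketched without TensorFlow: an autoencoder dataset yields each window both as input and as reconstruction target, and `tf.data.Dataset.from_generator` would wrap exactly this kind of generator. A minimal sketch, where the plain list below is a stand-in for a real `WindowedDatasetIterator`:

```python
import numpy as np

def autoencoder_pairs(windows):
    """Yield (X, X) pairs: each window is both input and target."""
    for X, _ in windows:
        yield X, X

# Stand-in iterator yielding (window, target) pairs, as the real
# WindowedDatasetIterator would.
windows = [(np.arange(6, dtype=np.float32).reshape(3, 2), np.float32(1.0))]
X, y = next(iter(autoencoder_pairs(windows)))
assert np.array_equal(X, y)  # input equals target
```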
tf_regression_dataset(iterator)
Create a forecast tf.data.Dataset from the iterator
The dataset is constructed from a generator
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `iterator` | `WindowedDatasetIterator` | The data iterator | *required* |

Returns:

| Type | Description |
|---|---|
| `Dataset` | A TensorFlow dataset |
Source code in ceruleo/models/keras/dataset.py
tf_seq_to_seq_dataset(iterator)
Create a sequence to sequence tf.data.Dataset from the iterator
The dataset is constructed from a generator
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `iterator` | `WindowedDatasetIterator` | The data iterator | *required* |

Returns:

| Type | Description |
|---|---|
| `Dataset` | A TensorFlow dataset |
Source code in ceruleo/models/keras/dataset.py
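In contrast to the forecast dataset, whose targets are single RUL values, the sequence-to-sequence dataset pairs each window with one target per time step. A framework-free sketch of that pairing (again using a stand-in for `WindowedDatasetIterator`):

```python
import numpy as np

def seq_to_seq_pairs(windows):
    """Yield (X, Y) pairs where Y has one target per time step of the window."""
    for X, y_seq in windows:
        yield X, np.reshape(y_seq, (len(X), 1))

X = np.zeros((4, 3), dtype=np.float32)                 # window: 4 steps, 3 features
y_seq = np.array([9.0, 8.0, 7.0, 6.0], dtype=np.float32)  # one RUL value per step
Xb, Yb = next(iter(seq_to_seq_pairs([(X, y_seq)])))
assert Yb.shape == (4, 1)  # target sequence, not a scalar
```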
Callbacks
PredictionCallback
Bases: Callback
Generate a plot after each epoch with the predictions
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `model` | | The model used to predict | *required* |
| `output_path` | `Path` | Path of the output image | *required* |
| `dataset` | `Dataset` | The dataset to be plotted | *required* |
Source code in ceruleo/models/keras/callbacks.py
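The callback mechanism itself is simple: Keras invokes `on_epoch_end` after every epoch, and the callback runs the model over the dataset at that point. A minimal framework-free sketch of that hook (the real `PredictionCallback` would additionally plot the predictions and write the figure to `output_path`):

```python
import numpy as np

class PredictionRecorder:
    """Stand-in for a Keras Callback: after each epoch, run the model
    over the dataset and record the predictions."""

    def __init__(self, model, dataset):
        self.model = model
        self.dataset = dataset
        self.history = []

    def on_epoch_end(self, epoch):
        preds = [self.model(x) for x, _ in self.dataset]
        self.history.append((epoch, preds))

model = lambda x: float(np.mean(x))       # toy "model"
dataset = [(np.array([1.0, 3.0]), 2.0)]   # one (features, target) pair
cb = PredictionRecorder(model, dataset)
cb.on_epoch_end(epoch=0)
assert cb.history[0][1] == [2.0]
```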
Losses
AsymmetricLossPM
Bases: LossFunctionWrapper
Customizable Asymmetric Loss Functions for Machine Learning-based Predictive Maintenance
Ehrig, L., Atzberger, D., Hagedorn, B., Klimke, J., & Döllner, J. (2020, October). Customizable Asymmetric Loss Functions for Machine Learning-based Predictive Maintenance. In 2020 8th International Conference on Condition Monitoring and Diagnosis (CMD) (pp. 250-253). IEEE.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `theta_l` | `float` | Linear-to-exponential change point for overpredictions | *required* |
| `alpha_l` | `float` | Quadratic term parameter for overpredictions | *required* |
| `gamma_l` | `float` | Exponential term parameter for overpredictions | *required* |
| `theta_r` | `float` | Linear-to-exponential change point for underpredictions | *required* |
| `alpha_r` | `float` | Quadratic term parameter for underpredictions | *required* |
| `gamma_r` | `float` | Exponential term parameter for underpredictions | *required* |
| `relative_weight` | `bool` | Whether to use weighting relative to the RUL | `True` |
Source code in ceruleo/models/keras/losses.py
asymmetric_loss_pm(y_true, y_pred, *, theta_l, alpha_l, gamma_l, theta_r, alpha_r, gamma_r, relative_weight=True)
Customizable Asymmetric Loss Functions for Machine Learning-based Predictive Maintenance
Ehrig, L., Atzberger, D., Hagedorn, B., Klimke, J., & Döllner, J. (2020, October). Customizable Asymmetric Loss Functions for Machine Learning-based Predictive Maintenance. In 2020 8th International Conference on Condition Monitoring and Diagnosis (CMD) (pp. 250-253). IEEE.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `y_true` | | True RUL values | *required* |
| `y_pred` | | Predicted RUL values | *required* |
| `theta_l` | | Linear-to-exponential change point for overpredictions (positive) | *required* |
| `alpha_l` | | Quadratic term parameter for overpredictions | *required* |
| `gamma_l` | | Exponential term parameter for overpredictions | *required* |
| `theta_r` | | Linear-to-exponential change point for underpredictions | *required* |
| `alpha_r` | | Quadratic term parameter for underpredictions | *required* |
| `gamma_r` | | Exponential term parameter for underpredictions | *required* |
| `relative_weight` | `bool` | Whether to use weighting relative to the RUL | `True` |

Returns:

| Name | Type | Description |
|---|---|---|
| `l` | `float` | The computed loss |
Source code in ceruleo/models/keras/losses.py
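The exact functional form is in the source; to illustrate the idea, an asymmetric loss penalizes each side of the error differently, growing polynomially up to the change point `theta` and exponentially beyond it, so that (with suitable parameters) underpredicting RUL costs more than overpredicting it by the same amount. The sketch below is an assumed, simplified shape, not the paper's formula, and `relative_weight` is omitted:

```python
import numpy as np

def one_sided_penalty(e, theta, alpha, gamma):
    """Illustrative one-sided penalty: quadratic below the change point
    theta, exponential growth beyond it (an assumption, not the paper's
    exact formulation)."""
    e = np.abs(e)
    return np.where(
        e <= theta,
        alpha * e ** 2,
        alpha * theta ** 2 * np.exp(gamma * (e - theta)),  # continuous at theta
    )

def toy_asymmetric_loss(y_true, y_pred, theta_l=10.0, alpha_l=1.0, gamma_l=0.1,
                        theta_r=5.0, alpha_r=2.0, gamma_r=0.3):
    e = y_pred - y_true
    left = one_sided_penalty(e, theta_l, alpha_l, gamma_l)    # overpredictions (e > 0)
    right = one_sided_penalty(e, theta_r, alpha_r, gamma_r)   # underpredictions (e < 0)
    return float(np.mean(np.where(e > 0, left, right)))

# With alpha_r > alpha_l, underpredicting by 3 costs more than overpredicting by 3.
over = toy_asymmetric_loss(np.array([20.0]), np.array([23.0]))
under = toy_asymmetric_loss(np.array([20.0]), np.array([17.0]))
assert under > over
```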
relative_mae(C=0.9)
MAE weighted by the relative error
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `C` | `float` | Minimal value for the RUL | `0.9` |

Returns:

| Type | Description |
|---|---|
| | The loss function |
Source code in ceruleo/models/keras/losses.py
relative_mse(C=0.9)
MSE weighted by the relative error
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `C` | `float` | Minimal value for the RUL | `0.9` |

Returns:

| Type | Description |
|---|---|
| | The loss function |
Source code in ceruleo/models/keras/losses.py
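One plausible reading of both relative losses, inferred from the parameter description (the exact weighting is in the source): divide each pointwise error by the true RUL, clamped below by `C` so the weight does not blow up near the end of life, where the RUL approaches zero.

```python
import numpy as np

def relative_mae_np(y_true, y_pred, C=0.9):
    """MAE with each error divided by the true RUL, clamped below by C
    (assumed weighting, sketched from the parameter description)."""
    denom = np.maximum(np.abs(y_true), C)
    return float(np.mean(np.abs(y_true - y_pred) / denom))

def relative_mse_np(y_true, y_pred, C=0.9):
    """MSE variant of the same relative weighting."""
    denom = np.maximum(np.abs(y_true), C)
    return float(np.mean(((y_true - y_pred) / denom) ** 2))

y_true = np.array([100.0, 1.0])
y_pred = np.array([90.0, 0.0])
# The same order of absolute error weighs far more when the true RUL is small.
assert relative_mae_np(y_true, y_pred) > 0.5
```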
root_mean_squared_error(y_true, y_pred)
Root mean squared error
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `y_true` | `array` | True RUL values | *required* |
| `y_pred` | `array` | Predicted RUL values | *required* |
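The metric is the standard RMSE; a minimal numpy equivalent:

```python
import numpy as np

def rmse(y_true, y_pred):
    """Root mean squared error over RUL arrays."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

assert rmse([3.0, 3.0], [0.0, 0.0]) == 3.0
```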
Layers
ConcreteDropout
Bases: Layer
Concrete Dropout layer class, from https://arxiv.org/abs/1705.07832. Dropout Feature Ranking for Deep Learning Models. Chun-Hao Chang, Ladislav Rampasek, Anna Goldenberg.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `dropout_regularizer` | | Positive float, satisfying $\text{dropout\_regularizer} = 2 / (\tau N)$ with model precision $\tau$ (inverse observation noise) and $N$ the number of instances in the dataset. The factor of two should be ignored for cross-entropy loss, and used only for the Euclidean loss. | `1e-05` |
| `init_min` | | Minimum value for the randomly initialized dropout rate, in [0, 1]. | `0.1` |
| `init_max` | | Maximum value for the randomly initialized dropout rate, in [0, 1], with `init_min <= init_max`. | `0.1` |
| `name` | | String, name of the layer. | `None` |
Source code in ceruleo/models/keras/layers.py
LASSOLayer
Bases: Layer
LASSO Layer
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `l1` | `float` | L1 regularization parameter | *required* |
Source code in ceruleo/models/keras/layers.py
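The effect of such a layer comes from the L1 term added to the training loss: it drives weights to exactly zero, so the surviving weights act as a feature selector. A minimal sketch of the penalty itself:

```python
import numpy as np

def lasso_penalty(weights, l1):
    """L1 penalty added to the training loss; zeroed-out weights
    effectively deselect the corresponding input features."""
    return float(l1 * np.sum(np.abs(weights)))

w = np.array([0.0, -2.0, 0.5])
assert abs(lasso_penalty(w, l1=0.1) - 0.25) < 1e-12
```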
ResidualShrinkageBlock
Bases: Layer
ResidualShrinkageBlock
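The description in the source is terse; as background, a residual shrinkage block (in the sense of deep residual shrinkage networks) applies soft thresholding on the residual branch, which suppresses noise-level features while keeping the identity shortcut. A numpy sketch of the core operation, with a fixed threshold standing in for the one the real layer learns:

```python
import numpy as np

def soft_threshold(x, tau):
    """Soft thresholding: shrink values toward zero, zeroing |x| <= tau."""
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def residual_shrinkage(x, tau):
    """Identity shortcut plus a shrunken residual branch."""
    return x + soft_threshold(x, tau)

x = np.array([-3.0, 0.05, 2.0])
out = residual_shrinkage(x, tau=0.1)
assert out[1] == 0.05  # noise-level value passes only via the shortcut
```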