General information:
Included model families:

- CNN variants (different fusion strategies)
- XGBoost
- Isolation Forest*
- OCSVM*
- DeepSVDD*
\* These trainings are unsupervised, which means only low cognitive load samples are used for training. Validation then also considers high cognitive load samples.
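The one-class setup described in the footnote can be sketched with scikit-learn's `IsolationForest`: fit only on low cognitive load samples, then validate on both classes. The feature arrays below are synthetic placeholders, not the project's actual features.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

# Synthetic stand-ins for feature windows (shapes and values are made up).
rng = np.random.default_rng(0)
X_low = rng.normal(0.0, 1.0, size=(200, 8))   # low cognitive load
X_high = rng.normal(4.0, 1.0, size=(60, 8))   # high cognitive load

# Fit on low cognitive load samples only (one-class, unsupervised).
clf = IsolationForest(random_state=0).fit(X_low)

# Validation considers both classes: +1 = inlier (low load), -1 = outlier (high load).
pred_low = clf.predict(X_low)
pred_high = clf.predict(X_high)
```

The same train-on-low, validate-on-both pattern applies to OCSVM and DeepSVDD.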
Supporting utilities in `model_training/tools`:
- `scaler.py`: Functions to fit, transform, save, and load either a MinMaxScaler or a StandardScaler, subject-wise and globally; for new subjects, a fallback scaler (using the mean of all subjects' scaling parameters) is used
- `performance_split.py`: Provides a function to split a group of subjects by their performance in the AdaBase experiments, using the results created in `researchOnSubjectPerformance.ipynb`
- `mad_outlier_removal.py`: Functions to fit and transform data with MAD outlier removal
- `evaluation_tools.py`: Functions for ROC curves as well as confusion matrices; especially used for Isolation Forest
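A minimal sketch of the subject-wise scaling with a global fallback described for `scaler.py`; the function names and data layout here are assumptions, not the module's actual API:

```python
import numpy as np
from sklearn.preprocessing import StandardScaler

def fit_subject_scalers(data_by_subject):
    """Fit one StandardScaler per subject, plus a fallback scaler whose
    parameters are the mean of all subjects' scaling parameters."""
    scalers = {s: StandardScaler().fit(X) for s, X in data_by_subject.items()}
    fallback = StandardScaler()
    fallback.mean_ = np.mean([sc.mean_ for sc in scalers.values()], axis=0)
    fallback.scale_ = np.mean([sc.scale_ for sc in scalers.values()], axis=0)
    return scalers, fallback

def transform_for_subject(scalers, fallback, subject, X):
    # Known subjects use their own scaler; unseen subjects use the fallback.
    return scalers.get(subject, fallback).transform(X)

# Demo with synthetic per-subject data.
rng = np.random.default_rng(1)
data = {"s1": rng.normal(0.0, 1.0, (50, 3)), "s2": rng.normal(5.0, 2.0, (50, 3))}
scalers, fallback = fit_subject_scalers(data)
known = transform_for_subject(scalers, fallback, "s1", data["s1"])
unseen = transform_for_subject(scalers, fallback, "s_new", data["s2"])
```

Averaging the fitted `mean_` and `scale_` attributes gives a usable scaler for subjects that were never seen during fitting.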
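The fit/transform idea behind `mad_outlier_removal.py` can be sketched as follows; the function names and the 3.5 modified z-score threshold are illustrative assumptions:

```python
import numpy as np

def fit_mad(X):
    """Per-feature median and median absolute deviation (MAD)."""
    med = np.median(X, axis=0)
    mad = np.median(np.abs(X - med), axis=0)
    return med, mad

def transform_mad(X, med, mad, threshold=3.5):
    """Drop rows whose modified z-score exceeds the threshold in any feature."""
    # 0.6745 scales the MAD so the score is comparable to a standard z-score.
    z = 0.6745 * (X - med) / mad
    return X[np.all(np.abs(z) < threshold, axis=1)]

# Demo: one obvious outlier row among tightly clustered data.
rng = np.random.default_rng(2)
X = np.vstack([rng.normal(0.0, 1.0, (100, 2)), [[50.0, 50.0]]])
med, mad = fit_mad(X)
X_clean = transform_mad(X, med, mad)
```

Fitting on training data and reusing `med`/`mad` at transform time keeps the removal consistent across splits.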
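A hedged sketch of the kind of ROC and confusion matrix evaluation that `evaluation_tools.py` covers, built on scikit-learn metrics; the labels, scores, and 0.5 threshold below are made up for illustration:

```python
import numpy as np
from sklearn.metrics import roc_curve, auc, confusion_matrix

y_true = np.array([0, 0, 0, 0, 1, 1, 1, 1])                   # 1 = high cognitive load
scores = np.array([0.1, 0.2, 0.3, 0.6, 0.4, 0.7, 0.8, 0.9])  # e.g. anomaly scores

# ROC curve over all thresholds, summarized by the area under it.
fpr, tpr, thresholds = roc_curve(y_true, scores)
roc_auc = auc(fpr, tpr)

# Confusion matrix at a single fixed threshold.
y_pred = (scores >= 0.5).astype(int)
cm = confusion_matrix(y_true, y_pred)  # rows: true class, columns: predicted class
```

For Isolation Forest, the continuous anomaly scores feed `roc_curve` directly, while the thresholded ±1 predictions map to the confusion matrix.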
### 4.1 CNNs

### 4.2 XGBoost

### 4.3 Isolation Forest

### 4.4 OCSVM

### 4.5 DeepSVDD
## 5) Real-Time Prediction and Messaging