diff --git a/as2_Maggioni_Claudio/README.md b/as2_Maggioni_Claudio/README.md new file mode 100644 index 0000000..9e073da --- /dev/null +++ b/as2_Maggioni_Claudio/README.md @@ -0,0 +1,142 @@ +# Assignment 2 + +In this assignment you are asked to: + +1. Implement a neural network to classify images from the CIFAR10 dataset; +2. Fine-tune a pre-trained neural network to classify rock, paper, scissors hand gestures. + +Both requests are very similar to what we have seen during the labs. However, you are required to follow **exactly** the assignment's specifications. + +Once completed, please submit your solution on the iCorsi platform following the instructions below. + + +## Tasks + + +### T1. Follow our recipe + +Implement a multi-class classifier to identify the subject of the images from [CIFAR-10](https://www.cs.toronto.edu/%7Ekriz/cifar.html) data set. To simply the problem, we restrict the classes to 3: `airplane`, `automobile` and `bird`. + +1. Download and load CIFAR-10 dataset using the following [function](https://www.tensorflow.org/api_docs/python/tf/keras/datasets/cifar10/load_data), and consider only the first three classes. Check `src/utils.py`, there is already a function for this! +2. Preprocess the data: + - Normalize each pixel of each channel so that the range is [0, 1]; + - Create one-hot encoding of the labels. +3. Build a neural network with the following architecture: + - Convolutional layer, with 8 filters of size 5 by 5, stride of 1 by 1, and ReLU activation; + - Max pooling layer, with pooling size of 2 by 2; + - Convolutional layer, with 16 filters of size 3 by 3, stride of 2 by 2, and ReLU activation; + - Average pooling layer, with pooling size of 2 by 2; + - Layer to convert the 2D feature maps to vectors (Flatten layer); + - Dense layer with 8 neurons and tanh activation; + - Dense output layer with softmax activation; +4. Train the model on the training set from point 1 for 500 epochs: + - Use the RMSprop optimization algorithm, with a learning rate of 0.003 and a batch size of 128; + - Use categorical cross-entropy as a loss function; + - Implement early stopping, monitoring the validation accuracy of the model with a patience of 10 epochs and use 20% of the training data as validation set; + - When early stopping kicks in, and the training procedure stops, restore the best model found during training. +5. Draw a plot with epochs on the x-axis and with two graphs: the train accuracy and the validation accuracy (remember to add a legend to distinguish the two graphs!). +6. Assess the performances of the network on the test set loaded in point 1, and provide an estimate of the classification accuracy that you expect on new and unseen images. +7. **Bonus** (Optional) Tune the learning rate and the number of neurons in the last dense hidden layer with a **grid search** to improve the performances (if feasible). + - Consider the following options for the two hyper-parameters (4 models in total): + + learning rate: [0.01, 0.0001] + + number of neurons: [16, 64] + - Keep all the other hyper-parameters as in point 3. + - Perform a grid search on the chosen ranges based on hold-out cross-validation in the training set and identify the most promising hyper-parameter setup. + - Compare the accuracy on the test set achieved by the most promising configuration with that of the model obtained in point 4. Are the accuracy levels statistically different? + + +### T2. 
Transfer learning + +In this task, we will fine-tune the last layer of a pretrained model in order to build a classifier for the rock, paper, scissors dataset that we acquired for the lab. The objective is to make use of the experience collected on a task to bootstrap the performances on a different task. We are going to use the VGG16 network, pretrained on Imagenet to compete in the ILSVRC-2014 competition. + +VGG16 is very expensive to train from scratch, but luckily the VGG team publicly released the trained weights of the network, so that people could use it for transfer learning. As we discussed during classes, this can be achieved by removing the last fully connected layers form the pretrained model and by using the output of the convolutional layers (with freezed weights) as input to a new fully connected network. This last part of the model is then trained from scratch on the task of interest. + +1. Use `keras` to download a pretrained version of the `vgg16` network. You can start from this snippet of code: + +```python +from tensorflow.keras import applications + +# since VGG16 was trained on high-resolution images using a low resolution might not be a good idea +img_h, img_w = 224, 224 + +# Build the VGG16 network and download pre-trained weights and remove the last dense layers. +vgg16 = applications.VGG16(weights='imagenet', + include_top=False, + input_shape=(img_h, img_w, 3)) +# Freezes the network weights +vgg16.trainable = False + +# Now you can use vgg16 as you would use any other layer. +# Example: + +net = Sequential() +net.add(vgg16) +net.add(Flatten()) +net.add(Dense(...)) +... +``` +2. Download and preprocess the rock, paper, scissor dataset that we collected for the lab. + - You find the functions to download and build the dataset in `src/utils.py`. + - Vgg16 provides a function to prepropress the input (`applications.vgg16.preprocess_input`). You may decide to use it. + - Use 224x224 as image dimension. +4. Add a hidden layer (use any number of units and the activation function that you want), then add an output layer suitable for the hand gesture classification problem. +6. Train with and without data augmentation and report the learning curves (train and validation accuracy) for both cases. + - Turn on the GPU environment on Colab, otherwise training will be slow. + - Train for 50 epochs or until convergence. + - Comment if using data augmentation led to an improvement or not. + + +## Instructions + +### Tools + +Your solution must be entirely coded in **Python 3** ([not Python 2](https://python3statement.org/)). +We recommend to use Keras from TensorFlow2 that we seen in the labs, so that you can reuse the code in there as reference. + +All the required tasks can be completed using Keras. On the [documentation page](https://www.tensorflow.org/api_docs/python/tf/keras/) there is a useful search field that allows you to smoothly find what you are looking for. +You can develop your code in Colab, where you have access to a GPU, or you can install the libraries on your machine and develop locally. + + +### Submission + +In order to complete the assignment, you must submit a zip file named `as2_surname_name.zip` on the iCorsi platform containing: + +1. A report in `.pdf` format containing the plots and comments of the two tasks. You can use the `.tex` source code provided in the repo (not mandatory). +2. The two best models you find for both the tasks (one per task). By default, the keras function to save the model outputs a folder with several files inside. 
If you prefer a more compact solution, just append `.h5` at the end of the name you use to save the model to end up with a single file. +3. A working example `run_task1.py` that loads the test set in CIFAR-10 dataset, preprocesses the data, loads the trained model from file and evaluate the accuracy. In case you completed the bonus point, turn in the model with the highest accuracy. +3. A working example `run_task2.py` that loads the test set of the rock, paper, scissors dataset, preprocesses the data, loads the trained model from file and evaluate the accuracy. +4. A folder `src` with all the source code you used to build, train, and evaluate your models. + +The zip file should eventually looks like as follows + +``` +as2_surname_name/ + report_surname_name.pdf + deliverable/ + run_task1.py + run_task2.py + nn_task1/ # or any other file storing the model from task T1, e.g., nn_task1.h5 + nn_task2/ # or any other file storing the model from task T2, e.g., nn_task2.h5 + src/ + file1.py + file2.py + ... +``` + + +### Evaluation criteria + +You will get a positive evaluation if: + +- your code runs out of the box (i.e., without needing to change your code to evaluate the assignment); +- your code is properly commented; +- the performance assessment is conducted appropriately; + +You will get a negative evaluation if: + +- we realize that you copied your solution; +- your code requires us to edit things manually in order to work; +- you did not follow our detailed instructions in tasks T1 and T2. + +Bonus parts are optional and are not required to achieve the maximum grade, however they can grant you extra points. + diff --git a/as2_Maggioni_Claudio/deliverable/nn_task1/nn_task1.h5 b/as2_Maggioni_Claudio/deliverable/nn_task1/nn_task1.h5 new file mode 100644 index 0000000..d56dc56 Binary files /dev/null and b/as2_Maggioni_Claudio/deliverable/nn_task1/nn_task1.h5 differ diff --git a/as2_Maggioni_Claudio/deliverable/nn_task2/nn_task2_aug.h5 b/as2_Maggioni_Claudio/deliverable/nn_task2/nn_task2_aug.h5 new file mode 100644 index 0000000..9abf622 Binary files /dev/null and b/as2_Maggioni_Claudio/deliverable/nn_task2/nn_task2_aug.h5 differ diff --git a/as2_Maggioni_Claudio/deliverable/nn_task2/nn_task2_noaug.h5 b/as2_Maggioni_Claudio/deliverable/nn_task2/nn_task2_noaug.h5 new file mode 100644 index 0000000..d4e08bb Binary files /dev/null and b/as2_Maggioni_Claudio/deliverable/nn_task2/nn_task2_noaug.h5 differ diff --git a/as2_Maggioni_Claudio/deliverable/run_task1.py b/as2_Maggioni_Claudio/deliverable/run_task1.py new file mode 100644 index 0000000..9b9a422 --- /dev/null +++ b/as2_Maggioni_Claudio/deliverable/run_task1.py @@ -0,0 +1,59 @@ +from tensorflow.keras.models import load_model +import os +import pickle +import urllib.request as http +from zipfile import ZipFile +from tensorflow.keras import utils + +import tensorflow as tf +import numpy as np +from PIL import Image + +from tensorflow.keras import layers as keras_layers +from tensorflow.keras import backend as K +from tensorflow.keras.datasets import cifar10 +from tensorflow.keras.models import save_model, load_model + + +def load_cifar10(num_classes=3): + """ + Downloads CIFAR-10 dataset, which already contains a training and test set, + and return the first `num_classes` classes. + Example of usage: + + >>> (x_train, y_train), (x_test, y_test) = load_cifar10() + + :param num_classes: int, default is 3 as required by the assignment. + :return: the filtered data. 
+ """ + (x_train_all, y_train_all), (x_test_all, y_test_all) = cifar10.load_data() + + fil_train = tf.where(y_train_all[:, 0] < num_classes)[:, 0] + fil_test = tf.where(y_test_all[:, 0] < num_classes)[:, 0] + + y_train = y_train_all[fil_train] + y_test = y_test_all[fil_test] + + x_train = x_train_all[fil_train] + x_test = x_test_all[fil_test] + + return (x_train, y_train), (x_test, y_test) + +if __name__ == '__main__': + + _, (x_test, y_test) = load_cifar10() + + # Load the trained models + model_task1 = load_model('nn_task1/nn_task1.h5') + x_test_n = x_test / 255 + y_test_n = utils.to_categorical(y_test, 3) + + # Predict on the given samples + #for example + y_pred_task1 = model_task1.predict(x_test_n) + + # Evaluate the missclassification error on the test set + # for example + assert y_test_n.shape == y_pred_task1.shape + test_loss, test_accuracy = model_task1.evaluate(x_test_n, y_test_n) # evaluate accuracy with proper function + print("Accuracy model task 1:", test_accuracy) diff --git a/as2_Maggioni_Claudio/deliverable/run_task2.py b/as2_Maggioni_Claudio/deliverable/run_task2.py new file mode 100644 index 0000000..b13a7ee --- /dev/null +++ b/as2_Maggioni_Claudio/deliverable/run_task2.py @@ -0,0 +1,122 @@ +import os +import pickle +import urllib.request as http +from zipfile import ZipFile +from tensorflow.keras import Sequential, applications + +import tensorflow as tf +import numpy as np +from PIL import Image + +from tensorflow.keras import utils + +from tensorflow.keras import layers as keras_layers +from tensorflow.keras import backend as K +from tensorflow.keras.datasets import cifar10 +from tensorflow.keras.models import save_model, load_model + +import torch +from keras.preprocessing.image import img_to_array, array_to_img + +def load_keras_model(filename): + """ + Loads a compiled Keras model saved with models.save_model. + + :param filename: string, path to the file storing the model. + :return: the model. + """ + model = load_model(filename) + return model + +def load_rps(download=False, path='rps', reduction_factor=1): + """ + Downloads the rps dataset and returns the training and test sets. + Example of usage: + + >>> (x_train, y_train), (x_test, y_test) = load_rps() + + :param download: bool, default is False but for the first call should be True. + :param path: str, subdirectory in which the images should be downloaded, default is 'rps'. + :param reduction_factor: int, factor of reduction of the dataset (len = old_len // reduction_factor). + :return: the images and labels split into training and validation sets. + """ + url = 'https://drive.switch.ch/index.php/s/xjXhuYDUzoZvL02/download' + classes = ('rock', 'paper', 'scissors') + rps_dir = os.path.abspath(path) + filename = os.path.join(rps_dir, 'data.zip') + if not os.path.exists(rps_dir) and not download: + raise ValueError("Dataset not in the path. 
You should call this function with `download=True` the first time.") + if download: + os.makedirs(rps_dir, exist_ok=True) + print(f"Downloading rps images in {rps_dir} (may take a couple of minutes)") + path, msg = http.urlretrieve(url, filename) + with ZipFile(path, 'r') as zip_ref: + zip_ref.extractall(rps_dir) + os.remove(filename) + train_dir, test_dir = os.path.join(rps_dir, 'train'), os.path.join(rps_dir, 'test') + print("Loading training set...") + x_train, y_train = load_images_with_label(train_dir, classes) + x_train, y_train = x_train[::reduction_factor], y_train[::reduction_factor] + print("Loaded %d images for training" % len(y_train)) + print("Loading test set...") + x_test, y_test = load_images_with_label(test_dir, classes) + x_test, y_test = x_test[::reduction_factor], y_test[::reduction_factor] + print("Loaded %d images for testing" % len(y_test)) + return (x_train, y_train), (x_test, y_test) + +def load_images(path): + img_files = os.listdir(path) + imgs, labels = [], [] + for i in img_files: + if i.endswith('.jpg'): + # load the image (here you might want to resize the img to save memory) + imgs.append(Image.open(os.path.join(path, i)).copy()) + return imgs + +def load_images_with_label(path, classes): + imgs, labels = [], [] + for c in classes: + # iterate over all the files in the folder + c_imgs = load_images(os.path.join(path, c)) + imgs.extend(c_imgs) + labels.extend([c] * len(c_imgs)) + return imgs, labels + +if __name__ == '__main__': + model_aug = load_keras_model("nn_task2/nn_task2_aug.h5") + model_noaug = load_keras_model("nn_task2/nn_task2_noaug.h5") + + # Resize the input images + resize = lambda x: [e.resize((224,224)) for e in x] + + def process(x): + x_n = resize(x) + for i in range(len(x)): + bgr = img_to_array(x_n[i])[..., ::-1] + mean = [103.939, 116.779, 123.68] + bgr -= mean + x_n[i] = bgr + return x_n + + _, (x_test, y_test) = load_rps(download=not os.path.exists("rps")) + x_test_n = tf.convert_to_tensor(process(x_test)) + + MAP = {'scissors': 0, 'paper': 1, 'rock': 2} + print(MAP) + mapfunc = np.vectorize(lambda x: MAP[x]) + y_test_n = utils.to_categorical(mapfunc(y_test), 3) + + print(np.shape(y_test_n)) + + y_pred_task1 = model_noaug.predict(x_test_n) + y_pred_task2 = model_aug.predict(x_test_n) + + # Evaluate the missclassification error on the test set + # for example + assert y_test_n.shape == y_pred_task1.shape + acc = model_noaug.evaluate(x_test_n, y_test_n)[1] # evaluate accuracy with proper function + print("Accuracy model task 2 (no augmentation):", acc) + + assert y_test_n.shape == y_pred_task2.shape + acc = model_aug.evaluate(x_test_n, y_test_n)[1] # evaluate accuracy with proper function + print("Accuracy model task 2 (with augmentation):", acc) diff --git a/as2_Maggioni_Claudio/report_Maggioni_Claudio.pdf b/as2_Maggioni_Claudio/report_Maggioni_Claudio.pdf new file mode 100644 index 0000000..4159e57 Binary files /dev/null and b/as2_Maggioni_Claudio/report_Maggioni_Claudio.pdf differ diff --git a/as2_Maggioni_Claudio/report_Maggioni_Claudio.tex b/as2_Maggioni_Claudio/report_Maggioni_Claudio.tex new file mode 100644 index 0000000..d801b53 --- /dev/null +++ b/as2_Maggioni_Claudio/report_Maggioni_Claudio.tex @@ -0,0 +1,236 @@ + +%---------------------------------------------------------------------------------------- +% Machine Learning Assignment Template +%---------------------------------------------------------------------------------------- + +\documentclass[11pt]{scrartcl} +\newcommand*\student[1]{\newcommand{\thestudent}{{#1}}} + 
+%---------------------------------------------------------------------------------------- +% INSERT HERE YOUR NAME +%---------------------------------------------------------------------------------------- + +\student{Claudio Maggioni} + +%---------------------------------------------------------------------------------------- +% PACKAGES AND OTHER DOCUMENT CONFIGURATIONS +%---------------------------------------------------------------------------------------- + +\usepackage[utf8]{inputenc} % Required for inputting international characters +\usepackage[T1]{fontenc} % Use 8-bit encoding +\usepackage[sc]{mathpazo} +\usepackage{caption, subcaption} +\usepackage[colorlinks=true]{hyperref} +\usepackage{inconsolata} + +\usepackage[english]{babel} % English language hyphenation +\usepackage{amsmath, amsfonts} % Math packages +\usepackage{listings} % Code listings, with syntax highlighting +\usepackage{graphicx} % Required for inserting images +\graphicspath{{Figures/}{./}} % Specifies where to look for included images (trailing slash required) +\usepackage{float} + +%---------------------------------------------------------------------------------------- +% DOCUMENT MARGINS +%---------------------------------------------------------------------------------------- + +\usepackage{geometry} % For page dimensions and margins +\geometry{ + paper=a4paper, + top=2.5cm, % Top margin + bottom=3cm, % Bottom margin + left=3cm, % Left margin + right=3cm, % Right margin +} +\setlength\parindent{0pt} + +%---------------------------------------------------------------------------------------- +% SECTION TITLES +%---------------------------------------------------------------------------------------- + +\usepackage{sectsty} +\sectionfont{\vspace{6pt}\centering\normalfont\scshape} +\subsectionfont{\normalfont\bfseries} % \subsection{} styling +\subsubsectionfont{\normalfont\itshape} % \subsubsection{} styling +\paragraphfont{\normalfont\scshape} % \paragraph{} styling + +%---------------------------------------------------------------------------------------- +% HEADERS AND FOOTERS +%---------------------------------------------------------------------------------------- + +\usepackage{scrlayer-scrpage} +\ofoot*{\pagemark} % Right footer +\ifoot*{\thestudent} % Left footer +\cfoot*{} % Centre footer + +%---------------------------------------------------------------------------------------- +% TITLE SECTION +%---------------------------------------------------------------------------------------- + +\title{ + \normalfont\normalsize + \textsc{Machine Learning\\% + Universit\`a della Svizzera italiana}\\ + \vspace{25pt} + \rule{\linewidth}{0.5pt}\\ + \vspace{20pt} + {\huge Assignment 2}\\ + \vspace{12pt} + \rule{\linewidth}{1pt}\\ + \vspace{12pt} +} + +\author{\LARGE \thestudent} + +\date{\normalsize\today} + +\begin{document} + +\maketitle + +In this assignment you are asked to: + +\begin{enumerate} +\item Implement a neural network to classify images from the \texttt{CIFAR10} dataset; +\item Fine-tune a pre-trained neural network to classify rock, paper, scissors hand gestures. +\end{enumerate} + +Both requests are very similar to what we have seen during the labs. However, you are required to follow \textbf{exactly} the assignment's specifications. 
+
+%----------------------------------------------------------------------------------------
+%	Task 1
+%----------------------------------------------------------------------------------------
+
+\section{Follow our recipe}
+
+Implement a multi-class classifier to identify the subject of the images from \href{https://www.cs.toronto.edu/\%7Ekriz/cifar.html}{\texttt{CIFAR-10}} data set. To simplify the problem, we restrict the classes to 3: \texttt{airplane}, \texttt{automobile} and \texttt{bird}.
+
+\begin{enumerate}
+\item Download and load \texttt{CIFAR-10} dataset using the following \href{https://www.tensorflow.org/api_docs/python/tf/keras/datasets/cifar10/load_data}{function}, and consider only the first three classes. Check \texttt{src/utils.py}, there is already a function for this!
+\item Preprocess the data:
+\begin{itemize}
+\item Normalize each pixel of each channel so that the range is [0, 1];
+\item Create one-hot encoding of the labels.
+\end{itemize}
+\item Build a neural network with the following architecture:
+\begin{itemize}
+\item Convolutional layer, with 8 filters of size 5$\times$5, stride of 1$\times$1, and ReLU activation;
+\item Max pooling layer, with pooling size of 2$\times$2;
+\item Convolutional layer, with 16 filters of size 3$\times$3, stride of 2$\times$2, and ReLU activation;
+\item Average pooling layer, with pooling size of 2$\times$2;
+\item Layer to convert the 2D feature maps to vectors (Flatten layer);
+\item Dense layer with 8 neurons and tanh activation;
+\item Dense output layer with softmax activation;
+\end{itemize}
+\item Train the model on the training set from point 1 for 500 epochs:
+\begin{itemize}
+\item Use the RMSprop optimization algorithm, with a learning rate of 0.003 and a batch size of 128;
+\item Use categorical cross-entropy as a loss function;
+\item Implement early stopping, monitoring the validation accuracy of the model with a patience of 10 epochs and use 20\% of the training data as validation set;
+\item When early stopping kicks in, and the training procedure stops, restore the best model found during training.
+\end{itemize}
+\item Draw a plot with epochs on the $x$-axis and with two graphs: the train accuracy and the validation accuracy (remember to add a legend to distinguish the two graphs!).
+\item Assess the performances of the network on the test set loaded in point 1, and provide an estimate of the classification accuracy that you expect on new and unseen images.
+\item \textbf{Bonus} (Optional) Tune the learning rate and the number of neurons in the last dense hidden layer with a \textbf{grid search} to improve the performances (if feasible).
+\begin{itemize}
+\item Consider the following options for the two hyper-parameters (4 models in total):
+\begin{itemize}
+\item learning rate: [0.01, 0.0001]
+\item number of neurons: [16, 64]
+\end{itemize}
+\item Keep all the other hyper-parameters as in point 3.
+\item Perform a grid search on the chosen ranges based on hold-out cross-validation in the training set and identify the most promising hyper-parameter setup.
+\item Compare the accuracy on the test set achieved by the most promising configuration with that of the model obtained in point 4. Are the accuracy levels statistically different?
+\end{itemize}
+\end{enumerate}
+
+\subsection{Comment}
+
+The network model was built and trained according to the given specification.
+
+On the given test set, the network achieves a loss of $0.3649$ and an accuracy of $86.4\%$.
In order to assess performance on new and unseen images,
+a statistical confidence interval is necessary. Since accuracy is by construction a binomial measure (each image is either correctly
+classified or not, and this Bernoulli trial is repeated for every test-set datapoint), we compute a binomial confidence interval
+at the 95\% confidence level. The code used to do this is found in the notebook \texttt{src/Assignment 2.ipynb} under the section \textit{Statistical tests on CIFAR classifier}.
+We conclude that, with 95\% confidence, the accuracy on new and unseen images will fall between $\approx 85.12\%$ and $\approx 87.59\%$.
+
+The training and validation accuracy curves for the network are shown below:
+
+\begin{figure}[H]
+\centering
+  \resizebox{\textwidth}{!}{%
+    \includegraphics{./t1_plot.png}}
+\caption{Training and validation accuracy curves during fitting for the CIFAR10 classifier}
+\end{figure}
+
+%----------------------------------------------------------------------------------------
+%	Task 2
+%----------------------------------------------------------------------------------------
+\newpage
+\section{Transfer learning}
+
+In this task, we will fine-tune the last layer of a pretrained model in order to build a classifier for the \emph{rock, paper, scissors dataset} that we acquired for the lab. The objective is to make use of the experience collected on one task to bootstrap the performance on a different task. We are going to use the \texttt{VGG16} network, pretrained on Imagenet to compete in the ILSVRC-2014 competition.\\
+
+\texttt{VGG16} is very expensive to train from scratch, but luckily the VGG team publicly released the trained weights of the network, so that people could use it for transfer learning. As we discussed during classes, this can be achieved by \textbf{removing the last fully connected layers} from the pretrained model and by using the output of the convolutional layers (with frozen weights) as input to a \textbf{new fully connected network}. This last part of the model is then trained from scratch on the task of interest.
+
+\begin{enumerate}
+\item Use \texttt{keras} to download a pretrained version of the \texttt{vgg16} network. You can start from the snippet of code you find on the \href{https://github.com/marshka/ml-20-21/tree/main/assignment\_2}{repository} of the assignment.
+\item Download and preprocess the rock, paper, scissors dataset that we collected for the lab. You find the functions to download and build the dataset in \texttt{src/utils.py}. VGG16 provides a function to preprocess the input\\
+\texttt{applications.vgg16.preprocess\_input}\\
+You may decide to use it.
+Use $224 \times 224$ as image dimension.
+\item Add a hidden layer (use any number of units and the activation function that you want), then add an output layer suitable for the hand gesture classification problem.
+\item Train with and without \textbf{data augmentation} and report the learning curves (train and validation accuracy) for both cases.
+\begin{itemize}
+\item Turn on the GPU environment on Colab, otherwise training will be slow.
+\item Train for 50 epochs or until convergence.
+\item Comment on whether using data augmentation led to an improvement or not.
+\end{itemize}
+\end{enumerate}
+
+\subsection{Comment}
+
+The dense part of the built network is composed of a 128-neuron ReLU-activated hidden layer and a 3-neuron softmax output layer.
+The input to the network is first resized to $224 \times 224$ and then normalized according to the VGG16 normalization factors
+(refer to the function \texttt{process\_vgg16} in the \texttt{src/Assignment 2.ipynb} notebook for details on the normalization process).
+Classification labels were first converted from string labels to numeric ones (i.e.\ \texttt{'scissors'} = 0,
+\texttt{'paper'} = 1, \texttt{'rock'} = 2), and the numeric encoding was in turn converted to a one-hot encoding using the
+\texttt{keras.utils.to\_categorical} function.
+
+Both the data-augmented and the non-augmented networks were trained using the Adam optimizer with a learning rate of $0.001$ for 50 epochs,
+with an early-stopping procedure with a patience of 10 epochs.
+
+Both models were saved, and both can be evaluated at the same time on the given test set by executing \texttt{deliverable/run\_task2.py}.
+
+The training and validation accuracy curves for both the data-augmented and the non-augmented networks are shown below:
+
+\begin{figure}[H]
+\centering
+  \resizebox{\textwidth}{!}{%
+    \includegraphics{./t2_noaug.png}}
+\caption{Training and validation accuracy curves during fitting for the non-augmented VGG16 classifier}
+\end{figure}
+
+\begin{figure}[H]
+\centering
+  \resizebox{\textwidth}{!}{%
+    \includegraphics{./t2_aug.png}}
+\caption{Training and validation accuracy curves during fitting for the data-augmented VGG16 classifier}
+\end{figure}
+
+\subsection{T-Test}
+
+The findings shown below were computed using the script \texttt{src/t\_test.py}.
+
+To compare the models trained with and without data augmentation, we perform a two-tailed Student's $t$-test between them.
+The test reports that the models have different accuracies with $99.999973\%$ confidence, and the model trained with data augmentation
+has lower variance. Therefore, we conclude that the model trained with data augmentation is statistically the better of the two.
+
+The $t$-test, of course, is a valid argument only for this specific application of data augmentation. However,
+in the general case, performing data augmentation on the training and validation data is intuitively beneficial, as it helps
+ensure the network correctly identifies rock, paper or scissors at all angles and zoom levels.
+
+On the given test set, the model trained with data augmentation has $\approx 90.00\%$ accuracy, while the model trained without data augmentation
+has $\approx 77.33\%$ accuracy.
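+
+For reference, the statistic behind such a comparison is sketched below. Assuming the test compares two samples of accuracy measurements for the two models, with means $\bar{a}_1, \bar{a}_2$, sample variances $s_1^2, s_2^2$ and sample sizes $n_1, n_2$ (the exact variant implemented in \texttt{src/t\_test.py}, e.g.\ pooled versus unpooled variance, may differ from this sketch), the two-sample $t$ statistic is
+\begin{equation*}
+t = \frac{\bar{a}_1 - \bar{a}_2}{\sqrt{\dfrac{s_1^2}{n_1} + \dfrac{s_2^2}{n_2}}},
+\end{equation*}
+and the null hypothesis of equal mean accuracy is rejected when $|t|$ exceeds the critical value of the Student distribution at the chosen confidence level.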
+ +\end{document} diff --git a/as2_Maggioni_Claudio/src/Assignment 2.ipynb b/as2_Maggioni_Claudio/src/Assignment 2.ipynb new file mode 100644 index 0000000..7f75c2b --- /dev/null +++ b/as2_Maggioni_Claudio/src/Assignment 2.ipynb @@ -0,0 +1,1251 @@ +{ + "nbformat": 4, + "nbformat_minor": 0, + "metadata": { + "accelerator": "GPU", + "colab": { + "name": "Assignment 2.ipynb", + "provenance": [], + "collapsed_sections": [] + }, + "kernelspec": { + "display_name": "Python 3", + "name": "python3" + }, + "language_info": { + "name": "python" + } + }, + "cells": [ + { + "cell_type": "markdown", + "metadata": { + "id": "1q9o1iFr2RR1" + }, + "source": [ + "# Assignment 2\n", + "### Claudio Maggioni" + ] + }, + { + "cell_type": "code", + "metadata": { + "id": "fVEhcaD2y2TK" + }, + "source": [ + "import os\n", + "import pickle\n", + "import urllib.request as http\n", + "from zipfile import ZipFile\n", + "\n", + "import tensorflow as tf\n", + "import numpy as np\n", + "from PIL import Image\n", + "\n", + "from tensorflow.keras import layers as keras_layers\n", + "from tensorflow.keras import backend as K\n", + "from tensorflow.keras.datasets import cifar10\n", + "from tensorflow.keras.models import save_model, load_model\n", + "\n", + "\n", + "def load_cifar10(num_classes=3):\n", + " \"\"\"\n", + " Downloads CIFAR-10 dataset, which already contains a training and test set,\n", + " and return the first `num_classes` classes.\n", + " Example of usage:\n", + "\n", + " >>> (x_train, y_train), (x_test, y_test) = load_cifar10()\n", + "\n", + " :param num_classes: int, default is 3 as required by the assignment.\n", + " :return: the filtered data.\n", + " \"\"\"\n", + " (x_train_all, y_train_all), (x_test_all, y_test_all) = cifar10.load_data()\n", + "\n", + " fil_train = tf.where(y_train_all[:, 0] < num_classes)[:, 0]\n", + " fil_test = tf.where(y_test_all[:, 0] < num_classes)[:, 0]\n", + "\n", + " y_train = y_train_all[fil_train]\n", + " y_test = y_test_all[fil_test]\n", + "\n", + " x_train = x_train_all[fil_train]\n", + " x_test = x_test_all[fil_test]\n", + "\n", + " return (x_train, y_train), (x_test, y_test)\n", + "\n", + "\n", + "def load_rps(download=False, path='rps', reduction_factor=1):\n", + " \"\"\"\n", + " Downloads the rps dataset and returns the training and test sets.\n", + " Example of usage:\n", + "\n", + " >>> (x_train, y_train), (x_test, y_test) = load_rps()\n", + "\n", + " :param download: bool, default is False but for the first call should be True.\n", + " :param path: str, subdirectory in which the images should be downloaded, default is 'rps'.\n", + " :param reduction_factor: int, factor of reduction of the dataset (len = old_len // reduction_factor).\n", + " :return: the images and labels split into training and validation sets.\n", + " \"\"\"\n", + " url = 'https://drive.switch.ch/index.php/s/xjXhuYDUzoZvL02/download'\n", + " classes = ('rock', 'paper', 'scissors')\n", + " rps_dir = os.path.abspath(path)\n", + " filename = os.path.join(rps_dir, 'data.zip')\n", + " if not os.path.exists(rps_dir) and not download:\n", + " raise ValueError(\"Dataset not in the path. 
You should call this function with `download=True` the first time.\")\n", + " if download:\n", + " os.makedirs(rps_dir, exist_ok=True)\n", + " print(f\"Downloading rps images in {rps_dir} (may take a couple of minutes)\")\n", + " path, msg = http.urlretrieve(url, filename)\n", + " with ZipFile(path, 'r') as zip_ref:\n", + " zip_ref.extractall(rps_dir)\n", + " os.remove(filename)\n", + " train_dir, test_dir = os.path.join(rps_dir, 'train'), os.path.join(rps_dir, 'test')\n", + " print(\"Loading training set...\")\n", + " x_train, y_train = load_images_with_label(train_dir, classes)\n", + " x_train, y_train = x_train[::reduction_factor], y_train[::reduction_factor]\n", + " print(\"Loaded %d images for training\" % len(y_train))\n", + " print(\"Loading test set...\")\n", + " x_test, y_test = load_images_with_label(test_dir, classes)\n", + " x_test, y_test = x_test[::reduction_factor], y_test[::reduction_factor]\n", + " print(\"Loaded %d images for testing\" % len(y_test))\n", + " return (x_train, y_train), (x_test, y_test)\n", + "\n", + "\n", + "def make_dataset(imgs, labels, label_map, img_size, rgb=True, keepdim=True, shuffle=True):\n", + " x = []\n", + " y = []\n", + " n_classes = len(list(label_map.keys()))\n", + " for im, l in zip(imgs, labels):\n", + " # preprocess img\n", + " x_i = im.resize(img_size)\n", + " if not rgb:\n", + " x_i = x_i.convert('L')\n", + " x_i = np.asarray(x_i)\n", + " if not keepdim:\n", + " x_i = x_i.reshape(-1)\n", + " \n", + " # encode label\n", + " y_i = np.zeros(n_classes)\n", + " y_i[label_map[l]] = 1.\n", + " \n", + " x.append(x_i)\n", + " y.append(y_i)\n", + " x, y = np.array(x).astype('float32'), np.array(y)\n", + " if shuffle:\n", + " idxs = np.arange(len(y))\n", + " np.random.shuffle(idxs)\n", + " x, y = x[idxs], y[idxs]\n", + " return x, y\n", + "\n", + "\n", + "def load_images(path):\n", + " img_files = os.listdir(path)\n", + " imgs, labels = [], []\n", + " for i in img_files:\n", + " if i.endswith('.jpg'):\n", + " # load the image (here you might want to resize the img to save memory)\n", + " imgs.append(Image.open(os.path.join(path, i)).copy())\n", + " return imgs\n", + "\n", + "\n", + "def load_images_with_label(path, classes):\n", + " imgs, labels = [], []\n", + " for c in classes:\n", + " # iterate over all the files in the folder\n", + " c_imgs = load_images(os.path.join(path, c))\n", + " imgs.extend(c_imgs)\n", + " labels.extend([c] * len(c_imgs))\n", + " return imgs, labels\n", + "\n", + "\n", + "def save_keras_model(model, filename):\n", + " \"\"\"\n", + " Saves a Keras model to disk.\n", + " Example of usage:\n", + "\n", + " >>> model = Sequential()\n", + " >>> model.add(Dense(...))\n", + " >>> model.compile(...)\n", + " >>> model.fit(...)\n", + " >>> save_keras_model(model, 'my_model.h5')\n", + "\n", + " :param model: the model to save;\n", + " :param filename: string, path to the file in which to store the model.\n", + " :return: the model.\n", + " \"\"\"\n", + " save_model(model, filename)\n", + "\n", + "\n", + "def load_keras_model(filename):\n", + " \"\"\"\n", + " Loads a compiled Keras model saved with models.save_model.\n", + "\n", + " :param filename: string, path to the file storing the model.\n", + " :return: the model.\n", + " \"\"\"\n", + " model = load_model(filename)\n", + " return model\n", + "\n", + "\n", + "def save_vgg16(model, filename='nn_task2.pkl', additional_args=()):\n", + " \"\"\"\n", + " Optimize task2 model by only saving the layers after vgg16. 
This function\n", + " assumes that you only added Flatten and Dense layers. If it is not the case,\n", + " you should include into `additional_args` other layers' attributes you\n", + " need.\n", + "\n", + " :param filename: string, path to the file in which to store the model.\n", + " :param additional_args: tuple or list, additional layers' attributes to be \n", + " saved. Default are ['units', 'activation', 'use_bias']\n", + " :return: the path of the saved model.\n", + " \"\"\"\n", + " filename = filename if filename.endswith('.pkl') else (filename + '.pkl')\n", + " args = ['units', 'activation', 'use_bias', *additional_args]\n", + " layers = []\n", + " for l in model.layers[1:]:\n", + " layer = dict()\n", + " layer['class'] = l.__class__.__name__\n", + " if l.weights:\n", + " layer['weights'] = l.get_weights()\n", + " layer['kwargs'] = {k: v for k, v in vars(l).items() if k in args}\n", + " layers.append(layer)\n", + "\n", + " with open(filename, 'wb') as fp:\n", + " pickle.dump(layers, fp)\n", + " \n", + " return os.path.abspath(filename)\n", + "\n", + "\n", + "def load_vgg16(filename='nn_task2.pkl', img_h=224, img_w=224):\n", + " \"\"\"\n", + " Loads the model saved with save_vgg16.\n", + "\n", + " :param filename: string, path to the file storing the model.\n", + " :param img_h: int, the height of the input image.\n", + " :param img_w: int, the width of the input image.\n", + " :return: the model.\n", + " \"\"\"\n", + " K.clear_session()\n", + "\n", + " vgg16 = applications.VGG16(weights='imagenet', \n", + " include_top=False, \n", + " input_shape=(img_h, img_w, 3))\n", + " model = Sequential()\n", + " model.add(vgg16)\n", + "\n", + " with open(filename, 'rb') as fp:\n", + " layers = pickle.load(fp)\n", + " for l in layers:\n", + " cls = getattr(keras_layers, l['class'])\n", + " if 'weights' in l:\n", + " layer = cls(**l['kwargs'])\n", + " model.add(layer)\n", + " model.layers[-1].set_weights(l['weights'])\n", + " else:\n", + " model.add(cls())\n", + " \n", + " model.trainable = False\n", + " return model" + ], + "execution_count": 2, + "outputs": [] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "rQnQIX74Hoy8" + }, + "source": [ + "# Exercise 1" + ] + }, + { + "cell_type": "code", + "metadata": { + "id": "FxvuMnmmzAfa", + "colab": { + "base_uri": "https://localhost:8080/" + }, + "outputId": "f144d7fc-10be-4e98-98c7-26a3b3c55052" + }, + "source": [ + "# Load the training and test CIFAR10 data\n", + "(x_train, y_train), (x_test, y_test) = load_cifar10()" + ], + "execution_count": 3, + "outputs": [ + { + "output_type": "stream", + "text": [ + "Downloading data from https://www.cs.toronto.edu/~kriz/cifar-10-python.tar.gz\n", + "170500096/170498071 [==============================] - 3s 0us/step\n" + ], + "name": "stdout" + } + ] + }, + { + "cell_type": "code", + "metadata": { + "id": "7fO_8u2qM8xb" + }, + "source": [ + "# Normalize the train and test data\n", + "x_train_n = x_train / 255\n", + "x_test_n = x_test / 255\n", + "\n", + "# Check if only 3 classes were loaded (no output should be printed)\n", + "for e in y_train:\n", + " if e[0] not in [0,1,2]:\n", + " print(e[0])" + ], + "execution_count": null, + "outputs": [] + }, + { + "cell_type": "code", + "metadata": { + "id": "6s2n8JukO8nT" + }, + "source": [ + "from tensorflow.keras import utils\n", + "\n", + "n_classes = 3\n", + "\n", + "# Convert output data to one-hot encoding\n", + "y_train_n = utils.to_categorical(y_train, n_classes)\n", + "y_test_n = utils.to_categorical(y_test, n_classes)" + ], + 
"execution_count": null, + "outputs": [] + }, + { + "cell_type": "code", + "metadata": { + "colab": { + "base_uri": "https://localhost:8080/" + }, + "id": "T4E5yvxTPnwP", + "outputId": "4bb2f7c4-d87e-4d6f-8f93-876be6cc5992" + }, + "source": [ + "from tensorflow.keras import Sequential\n", + "from tensorflow.keras.layers import Dense\n", + "from tensorflow.keras import optimizers\n", + "from tensorflow.keras.layers import Conv2D, MaxPooling2D, AveragePooling2D, \\\n", + " Flatten, Dropout\n", + "from tensorflow.keras.callbacks import EarlyStopping, CSVLogger \n", + "\n", + "# Build the CIFAR10 model architecture\n", + "model = Sequential()\n", + "model.add(Conv2D(8, (5, 5), activation='relu', input_shape=(32, 32, 3)))\n", + "model.add(MaxPooling2D(pool_size=(2, 2)))\n", + "model.add(Conv2D(16, (3, 3), strides=(2,2), activation='relu'))\n", + "model.add(AveragePooling2D(pool_size=(2, 2)))\n", + "model.add(Flatten())\n", + "model.add(Dense(8, activation='tanh'))\n", + "model.add(Dense(n_classes, activation='softmax'))\n", + "\n", + "# Compile the model and print model architecture\n", + "model.compile(optimizer=optimizers.RMSprop(learning_rate=0.003), \n", + " loss='categorical_crossentropy', \n", + " metrics=['accuracy'])\n", + "model.summary()\n", + "\n", + "# Implement early stopping monitoring validation accuracy\n", + "callback = EarlyStopping(monitor='val_accuracy', \n", + " patience=10,\n", + " restore_best_weights=True)\n", + "\n", + "# Log training data in the indicated CSV file\n", + "log_task1 = CSVLogger('my_civar10.csv')\n", + "\n", + "# Train the model\n", + "batch_size = 128\n", + "epochs = 500\n", + "model.fit(x_train_n, \n", + " y_train_n, \n", + " batch_size=batch_size, \n", + " epochs=epochs, \n", + " validation_split=0.2,\n", + " callbacks=[callback, log_task1])" + ], + "execution_count": null, + "outputs": [ + { + "output_type": "stream", + "text": [ + "Model: \"sequential_7\"\n", + "_________________________________________________________________\n", + "Layer (type) Output Shape Param # \n", + "=================================================================\n", + "conv2d (Conv2D) (None, 28, 28, 8) 608 \n", + "_________________________________________________________________\n", + "max_pooling2d (MaxPooling2D) (None, 14, 14, 8) 0 \n", + "_________________________________________________________________\n", + "conv2d_1 (Conv2D) (None, 6, 6, 16) 1168 \n", + "_________________________________________________________________\n", + "average_pooling2d (AveragePo (None, 3, 3, 16) 0 \n", + "_________________________________________________________________\n", + "flatten_7 (Flatten) (None, 144) 0 \n", + "_________________________________________________________________\n", + "dense_13 (Dense) (None, 8) 1160 \n", + "_________________________________________________________________\n", + "dense_14 (Dense) (None, 3) 27 \n", + "=================================================================\n", + "Total params: 2,963\n", + "Trainable params: 2,963\n", + "Non-trainable params: 0\n", + "_________________________________________________________________\n", + "Epoch 1/500\n", + "94/94 [==============================] - 1s 6ms/step - loss: 0.9172 - accuracy: 0.5718 - val_loss: 0.9488 - val_accuracy: 0.5113\n", + "Epoch 2/500\n", + "94/94 [==============================] - 0s 5ms/step - loss: 0.7498 - accuracy: 0.6758 - val_loss: 0.6664 - val_accuracy: 0.7343\n", + "Epoch 3/500\n", + "94/94 [==============================] - 0s 5ms/step - loss: 0.6778 - accuracy: 0.7107 - val_loss: 
0.6071 - val_accuracy: 0.7557\n", + "Epoch 4/500\n", + "94/94 [==============================] - 0s 5ms/step - loss: 0.6342 - accuracy: 0.7352 - val_loss: 0.6572 - val_accuracy: 0.7210\n", + "Epoch 5/500\n", + "94/94 [==============================] - 0s 5ms/step - loss: 0.5902 - accuracy: 0.7538 - val_loss: 0.5681 - val_accuracy: 0.7710\n", + "Epoch 6/500\n", + "94/94 [==============================] - 0s 5ms/step - loss: 0.5556 - accuracy: 0.7712 - val_loss: 0.5319 - val_accuracy: 0.7933\n", + "Epoch 7/500\n", + "94/94 [==============================] - 0s 4ms/step - loss: 0.5265 - accuracy: 0.7832 - val_loss: 0.4979 - val_accuracy: 0.8007\n", + "Epoch 8/500\n", + "94/94 [==============================] - 0s 5ms/step - loss: 0.5014 - accuracy: 0.7976 - val_loss: 0.4851 - val_accuracy: 0.8007\n", + "Epoch 9/500\n", + "94/94 [==============================] - 0s 5ms/step - loss: 0.4874 - accuracy: 0.8020 - val_loss: 0.4577 - val_accuracy: 0.8227\n", + "Epoch 10/500\n", + "94/94 [==============================] - 0s 5ms/step - loss: 0.4745 - accuracy: 0.8067 - val_loss: 0.4667 - val_accuracy: 0.8150\n", + "Epoch 11/500\n", + "94/94 [==============================] - 0s 5ms/step - loss: 0.4576 - accuracy: 0.8162 - val_loss: 0.4599 - val_accuracy: 0.8180\n", + "Epoch 12/500\n", + "94/94 [==============================] - 0s 5ms/step - loss: 0.4456 - accuracy: 0.8223 - val_loss: 0.4813 - val_accuracy: 0.8133\n", + "Epoch 13/500\n", + "94/94 [==============================] - 0s 5ms/step - loss: 0.4402 - accuracy: 0.8247 - val_loss: 0.4650 - val_accuracy: 0.8140\n", + "Epoch 14/500\n", + "94/94 [==============================] - 0s 5ms/step - loss: 0.4313 - accuracy: 0.8288 - val_loss: 0.4428 - val_accuracy: 0.8280\n", + "Epoch 15/500\n", + "94/94 [==============================] - 0s 5ms/step - loss: 0.4172 - accuracy: 0.8344 - val_loss: 0.4678 - val_accuracy: 0.8157\n", + "Epoch 16/500\n", + "94/94 [==============================] - 0s 4ms/step - loss: 0.4134 - accuracy: 0.8351 - val_loss: 0.4249 - val_accuracy: 0.8353\n", + "Epoch 17/500\n", + "94/94 [==============================] - 1s 5ms/step - loss: 0.4058 - accuracy: 0.8365 - val_loss: 0.4124 - val_accuracy: 0.8430\n", + "Epoch 18/500\n", + "94/94 [==============================] - 0s 5ms/step - loss: 0.3955 - accuracy: 0.8430 - val_loss: 0.4277 - val_accuracy: 0.8313\n", + "Epoch 19/500\n", + "94/94 [==============================] - 0s 5ms/step - loss: 0.3878 - accuracy: 0.8466 - val_loss: 0.4089 - val_accuracy: 0.8367\n", + "Epoch 20/500\n", + "94/94 [==============================] - 0s 5ms/step - loss: 0.3865 - accuracy: 0.8499 - val_loss: 0.5154 - val_accuracy: 0.7947\n", + "Epoch 21/500\n", + "94/94 [==============================] - 0s 5ms/step - loss: 0.3827 - accuracy: 0.8513 - val_loss: 0.4198 - val_accuracy: 0.8393\n", + "Epoch 22/500\n", + "94/94 [==============================] - 0s 5ms/step - loss: 0.3752 - accuracy: 0.8518 - val_loss: 0.4005 - val_accuracy: 0.8447\n", + "Epoch 23/500\n", + "94/94 [==============================] - 0s 4ms/step - loss: 0.3663 - accuracy: 0.8551 - val_loss: 0.4908 - val_accuracy: 0.8120\n", + "Epoch 24/500\n", + "94/94 [==============================] - 0s 5ms/step - loss: 0.3641 - accuracy: 0.8571 - val_loss: 0.4103 - val_accuracy: 0.8430\n", + "Epoch 25/500\n", + "94/94 [==============================] - 0s 5ms/step - loss: 0.3591 - accuracy: 0.8580 - val_loss: 0.3885 - val_accuracy: 0.8547\n", + "Epoch 26/500\n", + "94/94 [==============================] - 0s 5ms/step - loss: 0.3537 - 
accuracy: 0.8607 - val_loss: 0.4419 - val_accuracy: 0.8363\n", + "Epoch 27/500\n", + "94/94 [==============================] - 0s 5ms/step - loss: 0.3477 - accuracy: 0.8650 - val_loss: 0.3892 - val_accuracy: 0.8490\n", + "Epoch 28/500\n", + "94/94 [==============================] - 0s 4ms/step - loss: 0.3448 - accuracy: 0.8679 - val_loss: 0.4389 - val_accuracy: 0.8340\n", + "Epoch 29/500\n", + "94/94 [==============================] - 0s 5ms/step - loss: 0.3396 - accuracy: 0.8660 - val_loss: 0.3977 - val_accuracy: 0.8500\n", + "Epoch 30/500\n", + "94/94 [==============================] - 0s 4ms/step - loss: 0.3386 - accuracy: 0.8673 - val_loss: 0.4488 - val_accuracy: 0.8283\n", + "Epoch 31/500\n", + "94/94 [==============================] - 0s 5ms/step - loss: 0.3359 - accuracy: 0.8698 - val_loss: 0.3940 - val_accuracy: 0.8473\n", + "Epoch 32/500\n", + "94/94 [==============================] - 0s 4ms/step - loss: 0.3290 - accuracy: 0.8699 - val_loss: 0.4283 - val_accuracy: 0.8343\n", + "Epoch 33/500\n", + "94/94 [==============================] - 0s 5ms/step - loss: 0.3222 - accuracy: 0.8747 - val_loss: 0.3796 - val_accuracy: 0.8540\n", + "Epoch 34/500\n", + "94/94 [==============================] - 0s 5ms/step - loss: 0.3185 - accuracy: 0.8778 - val_loss: 0.3872 - val_accuracy: 0.8553\n", + "Epoch 35/500\n", + "94/94 [==============================] - 0s 5ms/step - loss: 0.3212 - accuracy: 0.8730 - val_loss: 0.4123 - val_accuracy: 0.8397\n", + "Epoch 36/500\n", + "94/94 [==============================] - 0s 5ms/step - loss: 0.3160 - accuracy: 0.8738 - val_loss: 0.3843 - val_accuracy: 0.8533\n", + "Epoch 37/500\n", + "94/94 [==============================] - 0s 5ms/step - loss: 0.3122 - accuracy: 0.8777 - val_loss: 0.3735 - val_accuracy: 0.8587\n", + "Epoch 38/500\n", + "94/94 [==============================] - 0s 5ms/step - loss: 0.3127 - accuracy: 0.8755 - val_loss: 0.3968 - val_accuracy: 0.8450\n", + "Epoch 39/500\n", + "94/94 [==============================] - 0s 5ms/step - loss: 0.3069 - accuracy: 0.8798 - val_loss: 0.4014 - val_accuracy: 0.8417\n", + "Epoch 40/500\n", + "94/94 [==============================] - 0s 5ms/step - loss: 0.3056 - accuracy: 0.8823 - val_loss: 0.4605 - val_accuracy: 0.8237\n", + "Epoch 41/500\n", + "94/94 [==============================] - 0s 5ms/step - loss: 0.3014 - accuracy: 0.8811 - val_loss: 0.3749 - val_accuracy: 0.8650\n", + "Epoch 42/500\n", + "94/94 [==============================] - 0s 5ms/step - loss: 0.3015 - accuracy: 0.8819 - val_loss: 0.3943 - val_accuracy: 0.8593\n", + "Epoch 43/500\n", + "94/94 [==============================] - 0s 5ms/step - loss: 0.2996 - accuracy: 0.8835 - val_loss: 0.4209 - val_accuracy: 0.8373\n", + "Epoch 44/500\n", + "94/94 [==============================] - 0s 5ms/step - loss: 0.2928 - accuracy: 0.8851 - val_loss: 0.3775 - val_accuracy: 0.8687\n", + "Epoch 45/500\n", + "94/94 [==============================] - 0s 5ms/step - loss: 0.2901 - accuracy: 0.8882 - val_loss: 0.4303 - val_accuracy: 0.8313\n", + "Epoch 46/500\n", + "94/94 [==============================] - 0s 4ms/step - loss: 0.2915 - accuracy: 0.8869 - val_loss: 0.4693 - val_accuracy: 0.8233\n", + "Epoch 47/500\n", + "94/94 [==============================] - 0s 5ms/step - loss: 0.2878 - accuracy: 0.8877 - val_loss: 0.3719 - val_accuracy: 0.8610\n", + "Epoch 48/500\n", + "94/94 [==============================] - 0s 5ms/step - loss: 0.2852 - accuracy: 0.8861 - val_loss: 0.3751 - val_accuracy: 0.8647\n", + "Epoch 49/500\n", + "94/94 
[==============================] - 0s 5ms/step - loss: 0.2818 - accuracy: 0.8891 - val_loss: 0.4302 - val_accuracy: 0.8427\n", + "Epoch 50/500\n", + "94/94 [==============================] - 0s 5ms/step - loss: 0.2842 - accuracy: 0.8893 - val_loss: 0.4167 - val_accuracy: 0.8477\n", + "Epoch 51/500\n", + "94/94 [==============================] - 0s 5ms/step - loss: 0.2788 - accuracy: 0.8915 - val_loss: 0.4626 - val_accuracy: 0.8300\n", + "Epoch 52/500\n", + "94/94 [==============================] - 0s 5ms/step - loss: 0.2767 - accuracy: 0.8950 - val_loss: 0.4610 - val_accuracy: 0.8333\n", + "Epoch 53/500\n", + "94/94 [==============================] - 0s 5ms/step - loss: 0.2742 - accuracy: 0.8938 - val_loss: 0.3817 - val_accuracy: 0.8633\n", + "Epoch 54/500\n", + "94/94 [==============================] - 0s 5ms/step - loss: 0.2761 - accuracy: 0.8933 - val_loss: 0.4188 - val_accuracy: 0.8477\n" + ], + "name": "stdout" + }, + { + "output_type": "execute_result", + "data": { + "text/plain": [ + "" + ] + }, + "metadata": { + "tags": [] + }, + "execution_count": 60 + } + ] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "PLQie0MoB5xo" + }, + "source": [ + "### Plot validation loss and accuracy curves" + ] + }, + { + "cell_type": "code", + "metadata": { + "colab": { + "base_uri": "https://localhost:8080/" + }, + "id": "tfeAmqWMbvF9", + "outputId": "961c7b14-dcaa-44ed-8cc2-acf3e8d729fe" + }, + "source": [ + "import pandas as pd\n", + "\n", + "# Load the CSV with saved training data\n", + "df = pd.read_csv('my_civar10.csv')\n", + "print(df[df.epoch == 0])" + ], + "execution_count": null, + "outputs": [ + { + "output_type": "stream", + "text": [ + " epoch accuracy loss val_accuracy val_loss\n", + "0 0 0.571833 0.917225 0.511333 0.948794\n" + ], + "name": "stdout" + } + ] + }, + { + "cell_type": "code", + "metadata": { + "colab": { + "base_uri": "https://localhost:8080/", + "height": 225 + }, + "id": "Aedy102KeaaK", + "outputId": "9e1a249b-5687-4d61-ab12-bbde73756524" + }, + "source": [ + "import matplotlib as mpl\n", + "import matplotlib.pyplot as plt\n", + "\n", + "# Plot training and validation accuracy w.r.t. 
epoch for CIFAR10 model\n", + "plt.figure(figsize=(4,3))\n", + "ax = plt.gca()\n", + "lines = []\n", + "FEATURES = ['accuracy', 'val_accuracy']\n", + "for feature in FEATURES:\n", + " lines.append(ax.plot(df[\"epoch\"], df[feature], marker='.')[0])\n", + "plt.xlabel(\"Epoch\")\n", + "plt.ylabel(\"Accuracy [%]\")\n", + "lgd = plt.legend(lines, [\"Training accuracy\", \"Validation accuracy\"], \n", + " loc=\"best\", bbox_to_anchor=(1,1))\n", + "plt.show()" + ], + "execution_count": null, + "outputs": [ + { + "output_type": "display_data", + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAAZ0AAADQCAYAAADcbrykAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4yLjIsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy+WH4yJAAAgAElEQVR4nO3deXxU5dXA8d+Zyc4edggQ1kDYCUZAUUTFpYK1qBVQXIpU31arqNVWi32x9rWL1gV3q1JXFC1F6oayKwiERXZkC2EJBBIgJCGZ5Xn/eGaSSUhCgJnMEM/388knc+/cuffMBO6ZZxdjDEoppVRtcIQ7AKWUUj8emnSUUkrVGk06Simlao0mHaWUUrVGk45SSqlao0lHKaVUrYkKdwCnqlmzZiY5OTncYSj1o5ORkXHQGNM83HGos9tZl3SSk5NZsWJFuMNQ6kdHRDLDHYM6+4W0ek1ELheRzSKyVUQequT5DiLytYh8LyLzRSQplPEopZQKr5AlHRFxAs8DVwCpwBgRSa1w2N+Bfxlj+gBTgP8LVTxKKaXCL5QlnXRgqzFmuzGmBHgfuLrCManAXN/jeZU8r5RSqg4JZZtOWyArYHs3cG6FY9YAPwOeAa4BGohIU2PMocCDRGQiMBGgffv2IQtYqbomIzOPpdsPMahTU9I6NAn++TMyWkRFRb0G9EJ7wyrwAuvcbveEtLS0A5UdEO6OBPcDU0XkFmAhsAfwVDzIGPMK8ArAwIEDdYZSparhTzT1Y5386b8bcXkMUQ7hjmGdadEglqzcQgZ1asqgTk3ZuO8o3+3IPe2kFBUV9VqrVq16NG/ePM/hcOj/zR85r9crOTk5qdnZ2a8Boyo7JpRJZw/QLmA7ybevlDFmL7akg4jUB0YbYw6HMCalznoVSy+B20UuD7e9sZwSj7fca9xew9S5W0u3X120o9zzcdEO3pkw6HQSTy9NOMrP4XCY5s2bH8nOzu5V1TGhTDrLga4i0hGbbG4AxgYeICLNgFxjjBf4HfB6CONRKmJUlziAKqvEVuzMZexr3+Fye4lyCj/p3ZrZ3+/D7TUIUPHO7xS7NzrKwYjUlsz+fh9eAwIkN6vHzoMFGMDl9rJ0+6HTSToOTTgqkO/fQ5VVrSFLOsYYt4j8GvgCcAKvG2PWi8gUYIUxZhYwDPg/ETHY6rVfhSoepcIpIzOPhVsO4HQ4WLYjl8VbDwLgEOiT1Ih1e46WJg4EjLEJ48ZB7WnbJJ4Ne49ytMjF0h25lLhtKcblMcxcvbf0Ggbo3bYhm7Pz8Xhtopl8VU/yCktKk9mXG/bjcnuJjnJw+9BOTJm9vnTbf8zZJDs72zls2LAUgIMHD0Y7HA6TmJjoBli9evXGuLi4KhPiwoULE15//fWmb775ZlZVxwD079+/+6pVqzYFN/IfLznbFnEbOHCg0cGhKhyqa5QPfG5A+8Z8vfEAX27IpmFcNLvzivhyQzZe33+12CgHxe6y6q8Yp+OE6rCq9G7biE3ZR21ScTq495Ju/OOrLbg9NnG8M2EQUHVJqboS1slKOSKSYYwZGLhvzZo1O/v27XuwRsGH2KRJk9rUr1/fM2XKlP3+fS6Xi+jo6HCGFRZut5uoqPA12a9Zs6ZZ3759kyt7LtwdCZSKWBmZeXyzNYfEhFjW7T3CByuy8BqIcgg/G9AWA7RrksCRwhKmLcm0JRWBGIeD4oAkEu2Q0oTjEBg9oC0fr9pTWsKYfFXP0hKH0yEggseXRK7q05qPV+7Ba8ApcHmvVvxxVM9yieKcjoknJI6qEkhahyblnqu4XRsWbz1Y75utBxuc16VZ/vldmhUE+/yjR49Ojo2N9a5bty4hPT392Lhx43Lvvffe9sXFxY64uDjvm2++uaNv377Fs2fPbvDkk0+2nDdv3tZJkya1ycrKisnMzIzdu3dvzB133LH/kUceOQCQkJDQv7CwcNXs2bMbTJkypU1iYqJr8+bN8b179y6cOXPmDofDwfTp0xs99NBDSQkJCd5zzjnnWGZmZuy8efO2Bsa1efPmmLFjx3YsKipyADzzzDO7Lr300gKAhx9+uNWHH36YKCJcfPHFR1544YU969ati504cWKHQ4cORTmdTvPhhx9u37FjR4w/ZoDx48e3HzhwYMHdd999qG3btr1HjRqVu2DBgob33HNPdn5+vvONN95o7nK5JDk5uXjGjBk7GjRo4M3Kyoq67bbbOuzatSsWYOrUqZn//e9/GyUmJronT558AOCuu+5q26JFC9cf/vCHSnugnQlNOqpOOZVv7lUd6/UaXl20nb98vqk0WQRyew0frNhd6TmNgcT6MWQfOY7BJorrzkni45VlSWZ0WjtGp7Urd+2UVg0qbdMBmP39vnJVYJGQOCrzwIw17bZk5ydUd0xBsdux7WBBgjHw0oJtdG5Wr7BebFSVxbxurRoU/u3avtVWf1Vm3759MStXrtwUFRVFbm6uY/ny5Zuio6OZOXNmg9/+9rdJX3zxxbaKr9m6dWvct99+u/nw4cPOHj169HrggQdyYmNjy/0L2LhxY/zq1au3Jycnu9LS0rrPmTOn/tChQwt+85vfdJg/f/6m7t27l4wcObJjZTG1adPGvWjRoi0JCQlm7dq1sWPGjOm0bt26jR988EHDTz/9tHFGRsamBg0aePfv3+8EGDt2bMf7778/e/z48YcLCwvF4/HIjh07Yqp7302bNnVv2LBhI9iqx/vuu+8gwN13393m2Wefbfbwww8fuOOOO9oPHTo0f/LkydvcbjdHjhxxtm/f3nXNNdd0njx58gGPx8PMmTObLF++fOOpfu41oUlHndUyMvNYsNl+GVu6/RDLduYBtgrr3dvL98byJ5l+7Rrz7baDvDh/m68EIUwY2pEuzevz+fps1u89QvbR4tLXOQR+NqBt6c1fRPB4Dcb33DX92/LftWWJ4a7hXcu1lYwe0I7RA9pVWxqpLJH4vTNhUEjH
2tSmY8XuKH+NvjF2u15sVEmwr/Ozn/0sz1+9lJub6/z5z3/ecefOnXEiYlwul1T2mhEjRhyOj4838fHx7sTERNfu3bujOnfu7Ao8pnfv3gX+fT179izctm1bTIMGDTzt2rUr7t69ewnADTfckPvaa6+dMDFqSUmJ/OIXv+iwYcOGeIfDQWZmZizAnDlzGt54440HGzRo4AVo2bKlJy8vz7F///6Y8ePHHwZISEgwnNhP5ATjx4/P8z/OyMiInzx5ctv8/HxnQUGB88ILLzwC8O233zaYMWPGDoCoqCiaNm3qadq0qadx48bub775Jn7fvn3RPXv2LGzVqtUJw1eCQZOOOisZY5j27U6mzN5QWhpJiHaWPl/s9nLfB6s5v2tzerZpyNEiF3//cjMuz4n/bz3G8PLC7aXbInDDOe2YubqsdDImvQNj0juwdPshmiTElEsqY8/twNhzO1RZcjlZldfJREpJ5mRqUiJZvPVgvdveXN7N7fE6opwO79+v77c9FFVs9evXLy09Pfjgg20vvPDC/Dlz5mzbvHlzzPDhw1Mqe01gqcbpdOJ2u09ITjU5piqPP/54yxYtWrg++uijHV6vl/j4+LSavyMrOjraeL1lBcPi4uJy1/cnLoCJEyd2nDFjxtbBgwcXPfvss00XLFjQoLpz33rrrQdfe+21ZgcOHIi+9dZbD1V37JnQpKMi2gkN3ztzeWfZLtbtOcKW/cdKj3MI/HRAm9JqLAPsPFTIzkOVT4x8aY+WLNqaU5o4RqS25JM1+/D39WyXmFBpCcP/+2RJ5YREkbUMdi6C5KHQLj2on9HZ5PwuzQpev+WcLaFs06no6NGjzqSkpBKAl19+uVmwz9+nT5/jWVlZsZs3b45JSUkpmT59emJlxx05csSZlJRU4nQ6mTp1alOPxxYkLrvssqOPP/54m4kTJ+b6q9datmzpadWqVclbb73V+KabbjpcVFQkbrdbOnfuXLx169b4oqIiKSgocCxevLjheeedd6yy6xUWFjrat2/vKi4ulvfffz+xdevWLoDzzjsv/29/+1vzyZMnH/BXrzVt2tRz0003HX788cfbut1uGT169PbKzhkMmnRUrTLG8PXGA2zMPsqQzvb/f8XeVJU13gPUj3VyrNj+RxVgVL82fLkuG5fnxGqsvYeLeG/ZLrzGJqQRqa2Yt/lAaS+vO4Z15o5hncu1nQR2J66s7STQKZU+spbBm1eB1wXOWLh51o8+8dRGsvF78MEHsydMmNDxL3/5S5tLL7006IPP69evb5566qnMyy+/vGtCQoK3b9++lb63e+6558Do0aM7v//++02HDx9+JD4+3gtw7bXXHl25cmVCv379ekRHR5tLLrnkyNSpU/e8/fbbO26//fYOjz32WJvo6Gjz4YcfbktNTS0ZOXJkXvfu3XsmJSUV9+zZs7CquB566KG96enpPRITE90DBgw4duzYMSfAiy++uOuWW27p0K1bt2YOh4OpU6dmXnLJJQVxcXFmyJAhRxs3buwJZc837TKtQm7JtoN8tHIPBcVuMjJzOZBfVoXv8I1JcTiEbi3qsyk7v9KKawFaNowtbWtxCkwakcKgTk0rbe/IyMxj3GtLS5PIyboS+19T47aTUym5zH8C5vsmUBcnDH8Yht5X/WtO91ohPFekd5kOpyNHjjgaNWrk9Xq9jB8/vn3Xrl2PP/roo0Hv+RVKHo+Hnj17pn744YfbevfuXXzyV1RNu0yrWuW/eXduXo/P1mXzn4ABjG0bx5UbOe8vxXi8hsxDhaX7KzbeR0c5uPvibicMZqyqxJHWoUm11WOVqXHpJWsZvPkT8JSAIwpumgkdh1Z9vCPgv5kzxt7wT3b+nYsgKR12LbEJy3ghKv7MSklZy2DaSHCXQJSWuILp6aefbvbee+81c7lc0rNnz8JJkyadVYk4IyMj7uqrr+56xRVX5J1pwjkZTTrqjAWWEErcHm5+vWzur8BWTqfAhSkt+Hjl7rIxKQger00ij1yVWi6pBDbeV9dAX5WTJpHT/da/c5FNOABeN7w/Fgb/ChzRNvlUPFfO5rLHP/l79dcqTQzFnNBZyVNsr326iWLnorLznum5VDmPPvrogbOtZBMoLS3t+O7du9fWxrU06ahTlpGZx5JtB+nZpiF5hS4e+mgtLo8X8WUYf+lFgKv7teHz9dkB3YeTGD0gqcp5xk65gb6imiaSzCX25u71nPq3/uShvndnbMklpn5Z9VnF0ojHBVu+gC6XwtY5UJBT/bkDEwMCXUfAjgXgPh5w7dPU4fyyx8ZA23NO/1xKnSZNOuqklu/MZeaqPcRFOdiWU8CCLTnlvoMPkC0Mcm5gqTeVgpYD2J5TgNc399dNg5O5aXDySRNJ4ONT6h7sTzLtzoX8bJh5p00kzpgTE0nWMtixCMQBS6bahn049W/9TToCBjpfDMMegh0LYe6fqLQEsXMRFB+BgbfB0b2wfT6cf2/V504eavtsGwNRcXDB/fZn3p9h+zyIqVfzz6aiqFgbY6u+kL0Gtn0NnS44/fMpdRo06SjgxEb05Ttz+ShjNzsOFvDdjtzS42KjHKUJR4BbkrJ5JGcKDrwcJ4bM896joMXgoI1RqVbmEtu2YioZw+YpKX/zz1oG067ylSKAhObgjLYlEWOgdb+aX3fXEvv7wgfLzh8VW3lpZONsiK4HnS+CnYth+WvgKoLo+MrP3by7LeQknw8XP1p2/mtfh3/0hCXPw09fqHmsgTb8x3ZkGD8T5kyGb5+D3tdCq96ndz6lToMmnR+przfu57N1+2gUF82+o8f5fF126ZT3TRKiyS10nfAap8DotLblpnSZ5HgPp9j2mzhcdD++BjpcUvMxKlnLYMvn0O3yU29fWPpCQMIRe/7MxbbR3RlV/ua/9euyhIMDzv0ldLoQ1rwPGW/Cqreh83BK6wirs2uJLYW06W+326XDzZ/YKrZtcyHaNxOM1wub/gtdLrZJptMwWPo8ZH1nH1cm81vACxf8tvznkZAI/W+EFW/A8D9Aw9Zlz9WkStEY2DATOl5gz3XpFPu5z7gN+vzc7tf2HVULdHnZH4mMzDz++vkmHv/vBq58ZiG/mLaCGRl7+Oc3O/ly/f7SdhgDRDnLbrwO7ASXTqF0LMw7EwYxaUQKXwzdToMDy+23Z3ydBpIq3Li2zYfXL4Ov/wTTRtkbpF/WMnjjSlj0pB3HEvic387FMO//TnzO44Ldy+xVxWmTwMV/gJ+/Y8fCNO8BSQFtFnsy7G9x2FJJpwvtTfaqp2wX5vUfw9dTbCyVxREo8xt77qiAabDapcPof9qEs+T5smsey4YeI+12hyG2J9v2+VWfe/t8+17aVVzZHRh0p+24sOyVsn27vrOf3dxKPt9A+9dB7nbo+VO7nZAI50yAg1tO/toIdu6553b76KOPGgbumzJlSotx48ZVua59enp6ysKFCxMALrzwwi4HDx50Vjxm0qRJbSZPntyyumu/9dZbjTMyMuL82/fcc0+bmTNnVjvqX2lJp+6o8G3XX13Wt11jlm4/xAvztpYmloZxUaXdlp0C11e
YkPKeS1KYO/sdUsx2MqQXo666pnRNlrQOTSBrGWlHpsHqd20D+dD7YM27sPJfsHdlWfdhY+CL39uSB5xY5bVxdvXtKru+s439xgvfPG1LE/7n17xv23AunWJvxIHf8i//P/jvJHtMvzGwZrptxO93EzTteGKJ4Lx7YMMsWPyUTUrVDeA8fhSy18LQ+098LrA0cvFk2PSJTTJdR9jnY+vbpFxd0tmxANoPgui4E59L7GQT2LJXbOKMT7RVZJ7iyj/fQOtn2uTc/aqyfQ7/lP+m+tdGsOuuuy73vffeSxw9evRR/76PPvoo8Yknnqh8RtYKFixYsPXkR1Vu5syZjd1u95G0tLTjAE8//fTek70m0oRjCQQt6dQF/m62Xz8G00ayaflXjH11KX/7YjM3vvYdU+duLTe1/lV9WxMb7ai09DLjuuaM3XAnrzmf4L6oD3k35s+MbZPNry7qUppwmHYVrH4bMLarcIfBMOo5e3Nd9CQU+eYcXPoiHFgPDv8XSVO+B9Vu/zdrX8mqcYfy72vlv8oSlvu4bbAHW8pZ+DdbvTXkbpv0Am+WabfaksIXv4fdK2wCaj8YRj594rFg4+t8kS9EL7iLYMFfbCKqWPLZvcwe02FI5X+LQXfaKr9lL9uk2vECiG9c9nynYbB3NRTmnvja/P1wYAN0vLDyc4PtvFByzFblffYAHD9sE6X/fVTWu81ftZZ8PtQLmAWm41BbqhJnzcYPBcv2+fX46o+t2D7/DHpFWDfddFPe3LlzGx0/flzALh9w4MCB6Msuu+zYuHHj2vfq1atHly5det57771tKnt927Zte+/bty8K4MEHH2yVnJzcKy0tLeWHH36I9R/z5JNPNuvVq1ePlJSU1Msuu6xzfn6+Y86cOfW++uqrxo888khS9+7dU9evXx87evTo5DfeeKMJwH/+858GPXr0SO3WrVvqddddl1xUVCT+6917771tUlNTe3Tr1i111apVJ3y72Lx5c0xaWlpKampqj9TU1B5z5swp/ZwefvjhVt26dUtNSUlJ/Z//+Z+2AOvWrYsdMmRIt5SUlNTU1NQe69evj509e3aDiy66qIv/dePHj2//7LPPNvXHcOedd7ZNTU3t8frrrzep7P0BZGVlRV166aWdU1JSUlNSUlLnzJlT75577mkzZcqUFv7z3nXXXW0fe+yxFpwCLenUBWveK23ENu7jrFn4CcVu++1agEsqzDNW6azHmz8j7Ye/wYJVtj0EcGBsSaRibyx/24g4bMnGf8O++FF46XxY9JRto/nyEfvNesjddt8Pn0PuNmh/rr0h71oC59wOcQ3tN/YdC2zDNtgbpb/6zO6A3B324ep34XAmXPm3yttgHA4Y+Qy8eB7881LbnvKzV0vfV6VSrrRJ0v/etn5lfxB7Y/aXfDK/tTfppCq6Gyd2su95yfO29ND9J+Wf7zQM5v/Zfo6pV5d/zp9UOw2rOs6iwHkYBQb/GpLPg/dvtJ9jmwEnvubABji01X5BCORviwrWbAczf9WOAxuqXdqA4mMODv2QAAYWPw1NuxYSW7/qFexapBby0+ernEi0ZcuWnr59+xbMmDGj0Y033nh42rRpiSNHjsxzOBw89dRTe1q2bOlxu90MGTIk5bvvvos/99xziyo7z6JFixL+/e9/J65du3aDy+WiX79+qf379y8EGDduXF5lSwRccsklh6+66qojt956a17guQoLC+WXv/xlxy+//HJznz59iq+55ppk/1xnAM2aNXNv2LBh4xNPPNH8iSeeaDl9+vRyEwTW9SUQNOmcxTYt+4rYpU+TnPsNIL7JKg0xeT/gkBEIlM4zdl/qYfLWz6VJz+F0r9ijLHMpvDcGMPbb8vDJ8NWj9hu7M7r8N+D2Ad/wK347btUL+o6xN+/l/4QGrWxPq7hGMOY9ePNK+PwhW8X02W+hZS9bFeaMtt/8V78DFz0CDVraBviDW+D8SXYczK4l9vkeI2Hh36FtWlm1VWWK821C8npsySh/HzRuV/XxFW/AG2bBkuc4oeopcwm07muryqrS+WLYOMs+XvaKjdl/Q287AGIa2Cq2E5LOfPtZte5b9bmTfaUTj8t+/p0vsuce9Qx8MB5W/ct2zw60fqb9gtB9ZOXvuzar1Eryo8oGvRq7HVv/jJY2uP7663OnT5/e5MYbbzz88ccfJ7766qs7AaZNm5b45ptvNnO73ZKTkxO9Zs2auKqSzrx58+pfeeWVh/2zNI8YMaJ0jraqlgioypo1a+KSkpKK+/TpUwxwyy23HHr++edbAAcAxo4dmweQnp5eOGvWrBO6ddb1JRA06ZxFvlifzdyN+2mXmECjnAzGbLiTKPHiMcJjrnEkSAm9ZAfXRH3DgAuuYnb0CAZ1TCQtaxp89UfAwO7XoVWF9ooN/6b036kBvCVw6WPw5e9tT6nAY/3VXb1Gw7l3nHjDSrnStu94XWDcdjR+u3Rb+rj6eXhxCLw8FFyFtoeW09euMOQu24ts2ctw0cMw73Fo2sU+dkZBSSG8drEvOXpt77PqeprtXGRLS2ATT03aKyregJe9YttLxNczznXcdg5Iv7368wSWRjwVSorOaFvNVbFdxxjYvsBWxzlOaNcuH2NlpZMeo2wV4rw/Q69rbakHbNXgitehZW+of8ISL8FVTYmk1Pb59Xjn+m54XQ4c0V6ueWk7nYad0eSfY8eOPfzwww+3W7x4ccLx48cdQ4cOLdy0aVPM1KlTW2ZkZGxs3ry5Z/To0cnHjx8/reaEU10i4GTi4uIMQFRUlKlsaYS6vgSCtumcBXILSrjz7Qx++VYG01fs5u9fbqHx2teJ8nVVNggJ4uIFz0+5x3M3mU2G0GHJI/xq38OkfXqVLbX4k4r/W3ugKF+1cmDdfvoEW8LIrbDA4qbZtqF95DOV38QPbaH0n5XHXf5aTTvDgJttwkFsqcffXtK0s62KWv5PWDnNVgld9PuyKrGYBF/jve//ytw/Vd/bKnmofS+n217RLh1umW3HsIgDGraxVYme4qrbcwKvXV1bSadhtifZl38oew+52+FIVvXtOYGxVWybEoHLHrczHnz6gG2Lmvd/8MYVUHjQfp6R0Dut07ACxn2whfN+s4dxH2w504QD0KhRI+/gwYPzJ0yYkHzNNdfkAuTl5Tnj4+O9iYmJnqysrKj58+c3qu4cw4cPP/bpp582PnbsmOTl5TnmzJlT2hBXcYkA//769et7jh49esI9tG/fvsf37NkTs27duliAf/3rX02HDh2aX9P3c+TIEWfr1q1dTqeTF154odwSCG+//XYzf5vL/v37nU2aNPH6l0AAKCoqkvz8fEfgEggHDx50Ll68uGFV16vq/fmXQADb4eDQoUNOgJtuuunwvHnzGq1Zs6be6NGjqy31VUZLOhFs3XdfsvHbz/g4ryNLXZ1L90eLh/PiduB1CV4EF1GskJ44BRxR0Rw/5074cokdhwHQ+3o7MNBTXHljc+FBW+Vz/r3l5w7rfBFs/hx+8lTZKPmNn9jxLLFVfBlKHmp7VnlKKr/h1m+BTUreE3tMnfcbm9RmT4ImnS
D1mvKvPbzTJgBTyWsrapdu22HOpL2iXTrc8C48N9Amuaa+dtn2g0/+uuraShLs9D98+6wtTd38ie3SDNW355xM2zToNBy+f//E54w3cnqndRpWEIxkE+iGG27IHT9+fOf33ntvO8DgwYOLevXqVdi5c+derVu3LklLS6t0zRm/888/v/Caa67J7dWrV8+mTZu6+vTpUxpfVUsEjBs3LvfOO+9Mfumll1rOmDGj9NtZQkKCeemll3Zed911nT0eD3379i28//77TzL/UZm6vgSCLm0QQfzdnFNaNWDL8q+ZsO3XOPFSQjSz+rzIo6vr4XJ7GR29mL86XmD3gAfIyiukSepwClqklXUM2PW6vUkab9lU+u0H27EYKVfCz/9V/sJT06FJMoz7oPz+1e/aaWUmLoA2/WDvKnhlGFz9AvQfV/UbOdlg0GmjypJSYNfkrGV2TI/x2udu+e+J09hU9dpQmjMZvnnWdhJwxsCvlp7Z+RY+CXMfo7T0OfR+29C/ezncu75mA1Sr8uVk+PaZsm1HlP2yEITPS5c2UFCzJRB0aYPadAozFwdOPXP0uIuJ/1pRupzyX6PeJybKFqtjjIv2x1byzoQH+G7bfm5b9XtI6EXSVb8nyVFWui/tGOAYaqvAAksb7dJt1+a8HeWDKMqDg5uhz3UnBth1hC1ZbP7MJp2Nn9gklnJF9Z9BdY3T1ZVAAqviKmuHCUbp5XScP8l2387dZksTWcvO7Nr+rsruYsALq96y3aATu9jEcybn7nGVry3K97e//AnbxvQjX7FUBUcwlkDQpBNM/hH2Xnf5braVyMjMY+yrSyl2e8utLwOQyFGujFmJ1yMIBgeGVkmd6NihCWl5n0H+Trjybds4X5mqbs7th8DCv9oBjv6G5t2+kfqVdQGu18z2NNvyGVz0O5t0ks+3gyDPRFVJKbmSZFnT14ZSfGM7Vcx3L8Gelba0dSalhsC/T73m8NX/QkkBZH8f3HNrolFBFowlEDTpBNO6j8tG2LuP2x5KlVQvlYtoA0oAABq2SURBVGxbwMdrEyl2twJswumb1IiN+/LxeD38JeafJFBM5gV/51DmOvru/w8d1z4L546CBX+1jduBI8srU9nNuf0gW3W1exl0ucTu273MlmbaVtFBJuUK2xFh21zbhTl94il9JKckkm+YCc0oXc4gGKP3A/8+uTtg8T9Cc26lIowmnWDK2+574Ls5rf3QzgF2YIOdk2vXt5jNnxGF4RETzWbHI6wyXYmOcjB5ZE/qHcgg+ruX6XxwOVzyJzoOmUBHgN03wRuXwwuDbaP/JX88vXr/pHNs9diupWVJJ2sZtEitumOAP+n817e8csXBjsEWqTfMThfaHmHVlcJOV8oVdmxTKM4del6v1ysOh+PsahxWIeP1eoXSbqYn0qQTLPn7Yds8OxCwTX9A7FQtH9xYeogbJ05jcIidkfmVpu+S2egcGnYeROeDn8Dn99kbjzjKL7CVlGbHsSx60m7P/wt0OO/Ub86x9aF1H5t0wM6CvCfDjrmpSrNu0KCN7dLbvIftOvxjFMpSWCSX8E5uXU5OTmrz5s2PaOJRXq9XcnJyGgHrqjpGk06wLHvFDgS85H/tmBOAwkN2sTDAi4MP3BdwjfMboo3LLiGQv5nE/M2w++0KJxPY9Q10GFS2K6YeQaneaT/YDhZ0l9iG8eKj1Z9n93Io8K3Ce+iHM29EP5uFshQWqSW8k3C73ROys7Nfy87O7oWO+1O2hLPO7XZPqOoATTrBUFIAK/5pq56alo2nIfVqPMtew3hKcJkoMttdw627LyLNrCfJcYgbZC72b+Swa67sWGg7IVRWxVI6/ckZVsG0H2zXodm3xlb7QdXziIFvZL+vpGxM5Iz1UBEhLS3tADAq3HGos0dIk46IXA48AziB14wxT1R4vj0wDWjsO+YhY8ynoYwpJFa/a7seD7mr3O6FhR15tuh3pMsGlpHK7y4byQgRlm4/RP/6O5EvvilLIhf+1v5UVcUSrCqY9r7S064ltqt0fJOyQY+VqUmPMqWUqqGQDQ4VESewBbgU2A0sB8YYYzYEHPMKsMoY86KIpAKfGmOSqztvxA0O9XrguTTbvfgXc8jYdZivNu5n7+EiPl27r3TcjVNg0ogUfnVRwA3+FMb0BNWzA6B5ChzaBk06wLgPqz8+XHGqiFLZ4FClTlUoSzrpwFZjzHYAEXkfuBrYEHCMAfxzAjUCIn8RpMxv7XiVNgNs1+U179oBl/1vZNaavdw7fQ0eXyLv07YRm/bn4/HYJQUGdWpa/lzhqsfvMNjOolx8FHpXMii0orO0vUEpFXlCmXTaAoGzzu4GKq7B+0fgSxG5C6gHXBLCeM7c2o/ho9uoOHO4AUrm/oU3i514TDfALpZ2Wa9WPDqqZ/l1ayJB+8Gwytd5IUm/uCqlak+4OxKMAd40xjwpIoOBt0SklzGmXB9vEZkITARo377Kpc9Dw1+15Cq2yxmXJhwHhU26EZu7GacYnMbNuFa7WJ/THXdAySatQ5PISTZ+J5uwUimlQiSUSWcPELhqVpJvX6BfAJcDGGOWiEgc0AzfYkd+xphXgFfAtumEKuAT+JeB9q3KSZNOkL+3dAGtj5xXcC3biTZuXERhOgzl3Z8OirySTUUFAfMzvjem9ibOVEr96IUy6SwHuopIR2yyuQEYW+GYXcDFwJsi0gOIA2o8BXjIBS7NjEC/sXZk+s5F7E88h8ffK2CmtxGDHBvJkJ480P+iyCzZVJS5mKBO6aKUUjUUsqRjjHGLyK+BL7DdoV83xqwXkSnACmPMLOA+4FURuRdbb3WLiaS1FpLSKa1Oi4qzCaddOp6253DXK0uJdjqYOGYMWw8c44FILtlUFKwxP0opdYpC2qbjG3PzaYV9kwMebwDOC2UMZ8S/DED/m2DAeGiXTkZmHs99/QPLduby5HV9uaxnKy7rGd4wT9nZPe2KUuosFu6OBJHLGFjyArTsBaOeA5FyyxE4BJKbJoQ7ytOn3aCVUmFQZdIRkZ/V4PXHz8oZBGpi21zI2Qg/fbF0RucFmw9Q7LYd6wRYuiOXtOQzXFtGKaV+RKor6bwK/Ad7f63KBVSoPqszlr4A9VqUzsDs9nhZsMX2cXAIlQ/2VEopVa3qks5nxpjbqnuxiFScHrluOLARtn4FFz0CUbEA/PnTTazZfYT/GdaJerHRkd0lWimlIlSVSccYc2NVz53KMWelr/4XHFHQuh8ZmXm8unAbn6/fz23ndeS3l/cId3RKKXXWqnFHAhHpgp22Jh74uzFmSaiCCqstc2DLZwB4p9/IX4p/zzJ3FxwCl/dqGebglFLq7Fbloku+2QECPQb8DrgHeDGUQYXV2ulljz0uBpr1gG3YWr4zLzwxKaVUHVHdSn+fiMj4gG0XkAx0ADyhDCqsxFn62+OIYqm3h3YcUEqpIKmueu1y4E4R+Rz4M3A/cDe2em1cLcQWHgUHoEknSvqM5c7FCeQ378Wkfm0Y3LmZdhxQSqkzVF1HAg8wVUTeAv4A3Ak8YozZVlvB1Tpj7DLOKVfwnGsUXxds5eOb+
zCgvSYbpZQKhuoGh54LPACUYEs6RcDjIrIHeMwYc7h2QgyBqlbCPLIbCg+R1yiVV77aztX92mjCUUqpIKqueu1l4EqgPvCGMeY84AYRuRCYDlxWC/EFX9YyeOMKu8x0VFz5af33rQbg6fUJeLyGK3u3DmOgSilV91TXkcBNWceBEv9OY8wCY8zZmXAAvv8AvG7KTevvt28NXnHyflZjPF7Db95fRUam9lhTSqlgqS7pjAVGA8OB8dUcd3bJ2Vz22Bldblp/s3c120mimBgM4HJ7Wbr9UO3HqJRSdVR1HQm2YNe7qTsObbMLmCWlw+5lMPhXZVVrxuDKWslqd0+iHIIxRrtJK6VUkFU3OHT2yV5ck2MiyrfP2eltfv4WNGhtk5CP98heYooPsSe+G+/efi6TRqTwzoRB2k1aKaWCqLqOBOeLyKxqnhcgNcjxhE7+flj9LvQdAw1aQZdLYMMs8LjBGcWyb+cyCOh/7kWkd2xKekct4SilVLBVl3SursHrS05+SIRY9rLtODDkbrvd9VJY9RbsXkZJ20FsXLmQc3Bw/nnDwhqmUkrVZdW16SyozUBCatt8WDIVOgyBZl3svk7DbFXbD3P469rGDC7+gaMNk2kSVz+MgSqlVN1WXe+1uiFrGbxzLbiLYfcKuw0Q1wjanUvB+s95bfEOejl2sCC/rXaRVkqpEKr7SWfnIvC67GOvu/y4nC6XUC9vAz1lBy3lMGu9ydpFWimlQuikSUdERorI2ZuckoeCP3xnTLlxOXS9FIBfR80EYJN00i7SSikVQjVJJj8HfhCRv4pI91AHFHTt0qFJR2jWtfyUN8AWOpBtmnCFczkG4f6br9cu0kopFUInTTq+Jan7A9uAN0VkiYhMFJEGIY8umFr2Lj+5J/DusiwWmb4ASEJT+sfuDUdkSin1o1GjajNjzFFgBvA+0Bq4BlgpIneFMLbgcRVCTL1yu4pKPHy0cjdxLXy92QoPwrRRZR0NlFJKBV1N2nRGici/gflANJBujLkC6MvZMk1OyYlJZ/b3e8k/7qZ/m/iynRUnAFVKKRVUNSnpjAb+YYzpbYz5mzHmAIAxphD4RUijCwZjoOTYCUnnne920aVFfdoOHGmXOBDniR0NlFJKBVV1MxL4/RHY598QkXigpTFmpzHm61AFFjSeEjAeiE4o3fVRxm5WZx3m1vOSkfY94eZPKl/UTSmlVFDVpKTzIeAN2Pb49p0dSgrsb19JJyMzj99+9D0A7363yw4GbZcOQ+/ThKOUUiFWk6QTZYwJXMStBIgJXUhBViHpfLvtIB6vAcDt0fVylFKqNtUk6eSIyCj/hohcDRysyclF5HIR2SwiW0XkoUqe/4eIrPb9bBGRwzUPvYZchfa3r3otMcHmS4eg6+UopVQtq0mbzh3AOyIyFbucQRY1WElURJzA88ClwG5guYjMMsZs8B9jjLk34Pi7sOOBgqvkmP3tK+nsO3IcAX59URcuTGmhg0GVUqoWnTTpGGO2AYNEpL5v+1gNz50ObDXGbAcQkfexyyVsqOL4McCjNTx3zZX4Sjq+pLNgSw5pHZowaURK0C+llFKqejUp6SAiPwF6AnEiAoAxZspJXtYWWyry2w2cW8X5OwAdgblVPD8RmAjQvn37moRcprR6rR45+cWs3XOE+0d0O7VzKKWUCoqaDA59CTv/2l3Y6rXrgA5BjuMGYIYxxlPZk8aYV4wxA40xA5s3b35qZw6oXlu4JQeAYSktziRWpZRSp6kmHQmGGGPGA3nGmP8FBgM1KSrsAdoFbCf59lXmBuC9Gpzz1JVWryUwf0sOzerHktq6YUgupZRSqno1STrHfb8LRaQN4MLOv3Yyy4GuItJRRGKwiWVWxYN8M1c3AZbULORT5Kte80QlsOiHHC7o1gyHQ0JyKaWUUtWrSdL5REQaA38DVgI7gXdP9iJjjBv4NfAFsBH4wBizXkSmBHbBxiaj940x5lSDrxFf9dr3B1wcLnRp1ZpSSoVRtR0JfIu3fW2MOQx8JCKzgThjzJGanNwY8ynwaYV9kyts//GUIj5VJYUgDuZtPYpD4IKuzUJ6OaWUUlWrtqRjjPFix9r4t4trmnAiRkkBRNdjwZYc+rVrTOOEs2cyBaWUqmtqUr32tYiMFn9f6bONqwBvdALf7zmiVWtKKRVmNUk6v8RO8FksIkdFJF9EjoY4ruApKeSIJwZjoFXD2HBHo5RSP2o1Wa66gTHGYYyJMcY09G2fNX2ODx85zN5C+zYnz1pvZ5VWSikVFiedkUBELqhsvzFmYfDDCb5j+UcoxJZwXG47q7TOt6aUUuFRk2lwHgh4HIedUy0DGB6SiIKscbSL7cYmHZ1VWimlwqsmE36ODNwWkXbA0yGLKMjqUUyRNOCc5CY8dEUPLeUopVQY1aQjQUW7gR7BDiRUTEkBx0wMI1JbacJRSqkwq0mbznOAf7YAB9APOzPBWcEUH6PIxNK8gfZcU0qpcKtJm86KgMdu4D1jzDchiif4XIUUEEdHTTpKKRV2NUk6M4Dj/mUHRMQpIgnGmMLQhhYEXg9Oz3GKiKWFJh2llAq7Gs1IAMQHbMcDX4UmnCDzzTBdYOK0ek0ppSJATZJOXOAS1b7HCaELKYh8a+mUOOJoFB8d5mCUUkrVJOkUiMgA/4aIpAFFoQspiHzLGkTF1edsnTpOKaXqkpq06dwDfCgie7HLVbfCLl8d+XzVa1HxDcIciFJKKajZ4NDlvtU9U3y7NhtjXKENK0h81WvxCZp0lFIqEpy0ek1EfgXUM8asM8asA+qLyP+EPrQg8FWvxdfTpKOUUpGgJm06t/tWDgXAGJMH3B66kILHXVwAQP0GjcIciVJKKahZ0nEGLuAmIk7grFh+81i+XfanQUNNOkopFQlq0pHgc2C6iLzs2/6lb1/EO5Z/mMZAw4aNwx2KUkopapZ0HgQmAnf6tucAr4YsoiAqOmZLOolNdKJPpZSKBDVZOdRrjHnJGHOtMeZaYAPwXOhDO3PHC/MBaKpJRymlIkJNSjqISH9gDHA9sAP4OJRBBUtJUT7HTTTNGsaf/GCllFIhV2XSEZFu2EQzBjgITAfEGHNRLcV2xtzHCyiSOJpEOcMdilJKKaov6WwCFgFXGWO2AojIvbUSVZB4j+dTInHhDkMppZRPdW06PwP2AfNE5FURuRg7Dc7Zw1WIy6lVa0opFSmqTDrGmJnGmBuA7sA87BxsLUTkRREZUVsBnglxFeKJOjsmxFZKqR+DmvReKzDGvGuMGQkkAauw3agjmjGGKE8hJrpeuENRSinlU5MZCUoZY/KMMa8YYy4OVUDBkl/sJs4UIzGadJRSKlKcUtI5m+TkF5PAcZxxmnSUUipShDTpiMjlIrJZRLaKyENVHHO9iGwQkfUi8m6wrn3gaDEJUkx0nM4wrZRSkaJGg0NPh29i0OeBS4HdwHIRmWWM2RBwTFfgd8B5xpg8EWkRrOvnHCumJ8V4EuoH65RKKaXOUChLOunAVmPMdmNMCfA+cHWFY24Hnvctl4Ax5kCwLn7gSBEJHCe+XsNgnVIppdQZCmXS
aQtkBWzv9u0L1A3oJiLfiMhSEbm8shOJyEQRWSEiK3Jycmp08dz8Y0SJl1hdNVQppSJGuDsSRAFdgWHY6XZeFZET1iHw9ZgbaIwZ2Lx58xqd+OjRIwDae00ppSJIKJPOHqBdwHaSb1+g3cAsY4zLGLMD2IJNQmfsmC/poElHKaUiRiiTznKgq4h0FJEY4AZgVoVjZmJLOYhIM2x12/ZgXLzAt5YO0TojgVJKRYqQJR1jjBv4NfAFsBH4wBizXkSmiMgo32FfAIdEZAN2qp0HjDGHgnH9ogJf0tGSjlJKRYyQdZkGMMZ8CnxaYd/kgMcGmOT7CRqXx4v7+DGIQZOOUkpFkHB3JAiJQ8dKiKfYbujca0opFTHqZNI5kH+cehy3GzHapqOUUpGiTiadnPxi4sVX0tHqNaWUihh1Mums2JlbVtLR6jWllIoYdS7pZGTm8eqiHST42nRWZpeEOSKllFJ+dS7pLN1+CI/XEC/FeIywJPNYuENSSinlU+eSzqBOTYmNdlCf4xQRx6DOzcIdklJKKZ86l3TSOjThnQmDSE+KIyahAWkdmoQ7JKWUUj4hHRwaLmkdmkCzKCjRtXSUUiqS1LmSTqmSAu25ppRSEabuJh1XgY7RUUqpCFN3k05Jgc5GoJRSEaYOJ51CLekopVSEqbtJx6VtOkopFWnqbtLR6jWllIo4dTjpaPWaUkpFmrqZdLwecBdp9ZpSSkWYupl0XIX2t1avKaVURKmbSafEn3S0pKOUUpGkjiYd38zSWr2mlFIRpW4mHZeWdJRSKhLVzaRTom06SikViepo0tHqNaWUikR1M+lo9ZpSSkWkupl0tPeaUkpFpDqadPzVa9qmo5RSkaRuJp2cTeV/K6WUigh1L+lkLYMVr9vH742x20oppSJC3Us6OxfZudcAPCV2WymlVEQIadIRkctFZLOIbBWRhyp5/hYRyRGR1b6fCWd80eShEBUL4gRnjN1WSikVEaJCdWIRcQLPA5cCu4HlIjLLGLOhwqHTjTG/DtqF26XDzZ/YEk7yULutlFIqIoQs6QDpwFZjzHYAEXkfuBqomHSCr126JhullIpAoaxeawtkBWzv9u2raLSIfC8iM0SkXQjjUUopFWbh7kjwCZBsjOkDzAGmVXaQiEwUkRUisiInJ6dWA1RKKRU8oUw6e4DAkkuSb18pY8whY0yxb/M1IK2yExljXjHGDDTGDGzevHlIglVKKRV6oUw6y4GuItJRRGKAG4BZgQeISOuAzVHAxhDGo5RSKsxC1pHAGOMWkV8DXwBO4HVjzHoRmQKsMMbMAu4WkVGAG8gFbjnZeTMyMg6KSGYNQmgGHDztNxBakRqbxnVqIjUuCE1sHYJ8PvUjJMaYcMcQEiKywhgzMNxxVCZSY9O4Tk2kxgWRHZv6cQt3RwKllFI/Ipp0lFJK1Zq6nHReCXcA1YjU2DSuUxOpcUFkx6Z+xOpsm45SSqnIU5dLOkoppSJMnUw6J5vduhbjeF1EDojIuoB9iSIyR0R+8P1uEoa42onIPBHZICLrReQ3ERRbnIgsE5E1vtj+17e/o4h85/ubTveN/ap1IuIUkVUiMjtS4hKRnSKy1jdT+wrfvrD/LZWqTJ1LOgGzW18BpAJjRCQ1TOG8CVxeYd9DwNfGmK7A177t2uYG7jPGpAKDgF/5PqNIiK0YGG6M6Qv0Ay4XkUHAX4B/GGO6AHnAL8IQG8BvKD+IOVLiusgY0y+gm3Qk/C2VOkGdSzoEzG5tjCkB/LNb1zpjzELsoNdAV1M2x9w04Ke1GhRgjNlnjFnpe5yPvYm2jZDYjDHmmG8z2vdjgOHAjHDGJiJJwE+wUzYhIhIJcVUh7H9LpSpTF5NOTWe3DpeWxph9vsfZQMtwBiMiyUB/4DsiJDZfFdZq4AB2IthtwGFjjNt3SLj+pk8DvwW8vu2mERKXAb4UkQwRmejbFxF/S6UqCuV6OuokjDFGRMLWfVBE6gMfAfcYY47aL+7hj80Y4wH6iUhj4N9A93DEEUhErgIOGGMyRGRYuOOp4HxjzB4RaQHMEZFNgU+G+9+ZUoHqYknnpLNbh9l+/0Snvt8HwhGEiERjE847xpiPIyk2P2PMYWAeMBhoLCL+L0nh+JueB4wSkZ3YKtvhwDMREBfGmD2+3wewSTqdCPtbKuVXF5POSWe3DrNZwM2+xzcD/6ntAHxtEf8ENhpjnoqw2Jr7SjiISDx2ufON2ORzbbhiM8b8zhiTZIxJxv6bmmuMGRfuuESknog08D8GRgDriIC/pVKVqZODQ0XkSmz9u39268fDFMd7wDDsjL/7gUeBmcAHQHsgE7jeGFOxs0Go4zofWASspax94vfYdp1wx9YH2/DtxH4p+sAYM0VEOmFLGInAKuDGgLWYapWveu1+Y8xV4Y7Ld/1/+zajgHeNMY+LSFPC/LdUqjJ1MukopZSKTHWxek0ppVSE0qSjlFKq1mjSUUopVWs06SillKo1mnSUUkrVGk066pSJiMc3o7H/J2iTSYpIcuCs3EqpukWnwVGno8gY0y/cQSilzj5a0lFB41vX5a++tV2WiUgX3/5kEZkrIt+LyNci0t63v6WI/Nu3ds4aERniO5VTRF71rafzpW9mAqVUHaBJR52O+ArVaz8PeO6IMaY3MBU7KwTAc8A0Y0wf4B3gWd/+Z4EFvrVzBgDrffu7As8bY3oCh4HRIX4/SqlaojMSqFMmIseMMfUr2b8TuwDbdt+EotnGmKYichBobYxx+fbvM8Y0E5EcIClw2hjfUgtzfIuPISIPAtHGmD+F/p0ppUJNSzoq2EwVj09F4NxlHrTtUak6Q5OOCrafB/xe4nv8LXZmZoBx2MlGwS6jfCeULtzWqLaCVEqFh36DVKcj3reyp9/nxhh/t+kmIvI9trQyxrfvLuANEXkAyAFu9e3/DfCKiPwCW6K5E9iHUqrO0jYdFTS+Np2BxpiD4Y5FKRWZtHpNKaVUrdGSjlJKqVqjJR2llFK1RpOOUkqpWqNJRymlVK3RpKOUUqrWaNJRSilVazTpKKWUqjX/D3f0+nQvRAn1AAAAAElFTkSuQmCC\n", + "text/plain": [ + "
" + ] + }, + "metadata": { + "tags": [], + "needs_background": "light" + } + } + ] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "L7iWpZ9x6u_q" + }, + "source": [ + "### Statistical tests on CIFAR classifier" + ] + }, + { + "cell_type": "code", + "metadata": { + "colab": { + "base_uri": "https://localhost:8080/" + }, + "id": "HbRzQTmzDSLs", + "outputId": "76091cbe-bb3c-419d-97da-063751ff36ed" + }, + "source": [ + "# Compute loss and accuracy on the test set\n", + "print(np.shape(y_test_n))\n", + "test_loss, test_accuracy = model.evaluate(x_test_n, y_test_n)\n", + "print(\"Test accuracy is: %g\" % test_accuracy)" + ], + "execution_count": null, + "outputs": [ + { + "output_type": "stream", + "text": [ + "(3000, 3)\n", + "94/94 [==============================] - 0s 2ms/step - loss: 0.3649 - accuracy: 0.8640\n", + "Test accuracy is: 0.864\n" + ], + "name": "stdout" + } + ] + }, + { + "cell_type": "code", + "metadata": { + "colab": { + "base_uri": "https://localhost:8080/" + }, + "id": "0pNBVGIh5NKd", + "outputId": "1fcd7c60-f848-4890-cf6f-56936a352286" + }, + "source": [ + "# Compute confidence interval for accuracy using binomial distribution\n", + "import numpy as np\n", + "import scipy.stats as st\n", + "from statsmodels.stats.proportion import proportion_confint \n", + "\n", + "test_accuracy = 0.864\n", + "n = len(y_test)\n", + "\n", + "proportion_confint(test_accuracy * n, n, method='binom_test', alpha=0.05)" + ], + "execution_count": 16, + "outputs": [ + { + "output_type": "execute_result", + "data": { + "text/plain": [ + "(0.8512018385349547, 0.8758694331900033)" + ] + }, + "metadata": { + "tags": [] + }, + "execution_count": 16 + } + ] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "Slp1nUTRB1HK" + }, + "source": [ + "### Save the model" + ] + }, + { + "cell_type": "code", + "metadata": { + "colab": { + "base_uri": "https://localhost:8080/" + }, + "id": "oYpYfGgcei6L", + "outputId": "dc052149-3a37-41ea-c92f-fd2f90838794" + }, + "source": [ + "# Save the model\n", + "save_keras_model(model, filename='/content/nn_task1.h5')" + ], + "execution_count": null, + "outputs": [ + { + "output_type": "stream", + "text": [ + "None\n" + ], + "name": "stdout" + } + ] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "xB3vSuEEHh6U" + }, + "source": [ + "# Exercise 2" + ] + }, + { + "cell_type": "code", + "metadata": { + "colab": { + "base_uri": "https://localhost:8080/" + }, + "id": "TwujZ9o3G_2o", + "outputId": "6bee27a6-d79b-452f-cf06-0821c840af97" + }, + "source": [ + "# Load RPS data\n", + "(x_train, y_train), (x_test, y_test) = load_rps(download=True)" + ], + "execution_count": 6, + "outputs": [ + { + "output_type": "stream", + "text": [ + "Downloading rps images in /content/rps (may take a couple of minutes)\n", + "Loading training set...\n", + "Loaded 1500 images for training\n", + "Loading test set...\n", + "Loaded 300 images for testing\n" + ], + "name": "stdout" + } + ] + }, + { + "cell_type": "code", + "metadata": { + "id": "oeChSIb1H_Io" + }, + "source": [ + "import torch\n", + "import tensorflow as tf\n", + "from tensorflow.keras import applications\n", + "from keras.preprocessing.image import img_to_array, array_to_img\n", + "\n", + "# Resize the input images and normalize them according to VGG16 normalization \n", + "# factors\n", + "def process_vgg16(x):\n", + " x_n = [e.resize((224,224)) for e in x]\n", + " for i in range(len(x)):\n", + " bgr = img_to_array(x_n[i])[..., ::-1] \n", + " mean = [103.939, 116.779, 123.68] \n", + " bgr -= mean\n", + " 
x_n[i] = bgr\n", + " return x_n\n", + "\n", + "# Process train and test set\n", + "x_train_n = tf.convert_to_tensor(process_vgg16(x_train))\n", + "x_test_n = tf.convert_to_tensor(process_vgg16(x_test))" + ], + "execution_count": 7, + "outputs": [] + }, + { + "cell_type": "code", + "metadata": { + "colab": { + "base_uri": "https://localhost:8080/" + }, + "id": "k23n4z-1Jb-0", + "outputId": "3c83d47f-8fbd-4bb1-f3fb-a3d21ec49718" + }, + "source": [ + "from tensorflow.keras import utils\n", + "\n", + "LABELS = set(y_train)\n", + "MAP = {'scissors': 0, 'paper': 1, 'rock': 2}\n", + "print(MAP)\n", + "\n", + "# Convert string labels to numerical ones according to MAP\n", + "mapfunc = np.vectorize(lambda x: MAP[x])\n", + "\n", + "# Convert numerical labels to one-hot encoding\n", + "y_train_n = utils.to_categorical(mapfunc(y_train), len(LABELS))\n", + "y_test_n = utils.to_categorical(mapfunc(y_test), len(LABELS))" + ], + "execution_count": 8, + "outputs": [ + { + "output_type": "stream", + "text": [ + "{'scissors': 0, 'paper': 1, 'rock': 2}\n" + ], + "name": "stdout" + } + ] + }, + { + "cell_type": "code", + "metadata": { + "id": "VrSi1MRWWCYu" + }, + "source": [ + "# Download VGG16 convolution weights and architecture\n", + "\n", + "from tensorflow.keras import Sequential, optimizers, applications\n", + "from tensorflow.keras.layers import Dense, Flatten, Dropout\n", + "from tensorflow.keras.layers import Conv2D, MaxPooling2D, AveragePooling2D\n", + "from tensorflow.keras.callbacks import EarlyStopping, CSVLogger \n", + "\n", + "# Build the VGG16 network and download pre-trained weights and remove the last\n", + "# dense layers.\n", + "vgg16 = applications.VGG16(weights='imagenet', \n", + " include_top=False, \n", + " input_shape=(224, 224, 3))\n", + "\n", + "# Freezes the network weights\n", + "vgg16.trainable = False" + ], + "execution_count": 9, + "outputs": [] + }, + { + "cell_type": "code", + "metadata": { + "id": "3pn4tvKqMixE" + }, + "source": [ + "# Build VGG16-based classifier\n", + "net = Sequential()\n", + "net.add(vgg16)\n", + "net.add(Flatten())\n", + "net.add(Dense(128, activation='relu'))\n", + "net.add(Dropout(0.25))\n", + "net.add(Dense(3, activation='softmax'))\n", + "\n", + "# Compile and print network architecture\n", + "net.compile(optimizer=optimizers.Adam(learning_rate=0.001),\n", + " loss='categorical_crossentropy', \n", + " metrics=['acc'])\n", + "net.summary()\n", + "\n", + "# Implement early stopping monitoring validation loss\n", + "es = EarlyStopping(monitor='val_loss', patience=10, restore_best_weights=True)\n", + "\n", + "# Save training loss and accuracy over epochs in indicated CSV\n", + "log_task2_noaug = CSVLogger('my_vgg16_noaug.csv')\n", + "\n", + "# Fit the model with not data augmented training and validation data\n", + "history = net.fit(x_train_n, \n", + " y_train_n, \n", + " batch_size=8, \n", + " epochs=50, \n", + " validation_split=0.2,\n", + " verbose=1,\n", + " callbacks=[es, log_task2_noaug])" + ], + "execution_count": null, + "outputs": [] + }, + { + "cell_type": "code", + "metadata": { + "colab": { + "base_uri": "https://localhost:8080/" + }, + "id": "X1ZFbL2WxXnO", + "outputId": "a13ab168-ba74-4d3e-da4b-386a1860b0d8" + }, + "source": [ + "# Evaluate non-data-augmented model on test set\n", + "scores = net.evaluate(x_test_n, y_test_n)\n", + "print('Test loss: {} - Accuracy: {}'.format(*scores))\n", + "\n", + "# Save the model\n", + "save_keras_model(net, '/content/ynn_task2_noaug.h5')" + ], + "execution_count": null, + "outputs": [ + { + 
"output_type": "stream", + "text": [ + "10/10 [==============================] - 1s 118ms/step - loss: 0.5268 - acc: 0.7733\n", + "Test loss: 0.5268241763114929 - Accuracy: 0.7733333110809326\n", + "None\n" + ], + "name": "stdout" + } + ] + }, + { + "cell_type": "code", + "metadata": { + "id": "Dorbvx8dVyEM" + }, + "source": [ + "from tensorflow.keras.preprocessing.image import ImageDataGenerator\n", + "\n", + "train_gen = ImageDataGenerator(width_shift_range=0.15, # horizontal translation\n", + " height_shift_range=0.15, # vertical translation\n", + " channel_shift_range=0.3, # random channel shifts\n", + " rotation_range=360, # rotation\n", + " zoom_range=0.3, # zoom in/out randomly\n", + " shear_range=15, # deformation\n", + " )\n", + "val_gen = ImageDataGenerator()\n", + "\n", + "# Generate data-augmented training and validation set\n", + "train_loader = train_gen.flow(x_train_n, y_train_n, batch_size=40)\n", + "validation_loader = train_gen.flow(x_train_n, y_train_n, batch_size=10)" + ], + "execution_count": 10, + "outputs": [] + }, + { + "cell_type": "code", + "metadata": { + "id": "_jjHHzoEWTqM", + "colab": { + "base_uri": "https://localhost:8080/" + }, + "outputId": "13654303-f79e-4e71-83c2-714a9a8ea48f" + }, + "source": [ + "# Build VGG16-based classifier\n", + "net2 = Sequential()\n", + "net2.add(vgg16)\n", + "net2.add(Flatten())\n", + "net2.add(Dense(128, activation='relu'))\n", + "net2.add(Dropout(0.25))\n", + "net2.add(Dense(3, activation='softmax'))\n", + "\n", + "# Implement early stopping monitoring validation loss\n", + "es2 = EarlyStopping(monitor='val_loss', patience=10, restore_best_weights=True)\n", + "\n", + "# Compile and print network architecture\n", + "net2.compile(optimizer=optimizers.Adam(learning_rate=0.001),\n", + " loss='categorical_crossentropy', \n", + " metrics=['acc'])\n", + "net2.summary()\n", + "\n", + "# Save training loss and accuracy over epochs in indicated CSV\n", + "log_aug = CSVLogger('my_vgg16_aug.csv')\n", + "\n", + "# Fit the model with data augmented training and validation data\n", + "history = net2.fit(train_loader, \n", + " batch_size=16, \n", + " epochs=50, \n", + " validation_data=validation_loader,\n", + " verbose=1,\n", + " callbacks=[es2, log_aug])" + ], + "execution_count": 12, + "outputs": [ + { + "output_type": "stream", + "text": [ + "Model: \"sequential_1\"\n", + "_________________________________________________________________\n", + "Layer (type) Output Shape Param # \n", + "=================================================================\n", + "vgg16 (Functional) (None, 7, 7, 512) 14714688 \n", + "_________________________________________________________________\n", + "flatten_1 (Flatten) (None, 25088) 0 \n", + "_________________________________________________________________\n", + "dense_2 (Dense) (None, 128) 3211392 \n", + "_________________________________________________________________\n", + "dropout_1 (Dropout) (None, 128) 0 \n", + "_________________________________________________________________\n", + "dense_3 (Dense) (None, 3) 387 \n", + "=================================================================\n", + "Total params: 17,926,467\n", + "Trainable params: 3,211,779\n", + "Non-trainable params: 14,714,688\n", + "_________________________________________________________________\n", + "Epoch 1/50\n", + "38/38 [==============================] - 48s 1s/step - loss: 5.9368 - acc: 0.4780 - val_loss: 0.9557 - val_acc: 0.4833\n", + "Epoch 2/50\n", + "38/38 [==============================] - 32s 842ms/step - loss: 0.9711 
- acc: 0.5707 - val_loss: 0.9521 - val_acc: 0.5767\n", + "Epoch 3/50\n", + "38/38 [==============================] - 32s 842ms/step - loss: 0.9213 - acc: 0.6033 - val_loss: 0.7592 - val_acc: 0.6940\n", + "Epoch 4/50\n", + "38/38 [==============================] - 32s 839ms/step - loss: 0.8037 - acc: 0.6560 - val_loss: 0.6932 - val_acc: 0.7393\n", + "Epoch 5/50\n", + "38/38 [==============================] - 32s 845ms/step - loss: 0.8075 - acc: 0.6547 - val_loss: 0.6627 - val_acc: 0.7313\n", + "Epoch 6/50\n", + "38/38 [==============================] - 33s 868ms/step - loss: 0.7375 - acc: 0.6980 - val_loss: 0.6030 - val_acc: 0.7540\n", + "Epoch 7/50\n", + "38/38 [==============================] - 33s 874ms/step - loss: 0.7086 - acc: 0.7080 - val_loss: 0.5946 - val_acc: 0.7680\n", + "Epoch 8/50\n", + "38/38 [==============================] - 32s 847ms/step - loss: 0.6479 - acc: 0.7100 - val_loss: 0.5195 - val_acc: 0.8107\n", + "Epoch 9/50\n", + "38/38 [==============================] - 32s 844ms/step - loss: 0.6702 - acc: 0.7167 - val_loss: 0.5435 - val_acc: 0.7847\n", + "Epoch 10/50\n", + "38/38 [==============================] - 32s 849ms/step - loss: 0.6077 - acc: 0.7500 - val_loss: 0.4991 - val_acc: 0.8180\n", + "Epoch 11/50\n", + "38/38 [==============================] - 32s 843ms/step - loss: 0.6208 - acc: 0.7333 - val_loss: 0.4954 - val_acc: 0.8060\n", + "Epoch 12/50\n", + "38/38 [==============================] - 32s 851ms/step - loss: 0.6244 - acc: 0.7313 - val_loss: 0.5026 - val_acc: 0.8060\n", + "Epoch 13/50\n", + "38/38 [==============================] - 32s 848ms/step - loss: 0.5998 - acc: 0.7640 - val_loss: 0.4789 - val_acc: 0.8127\n", + "Epoch 14/50\n", + "38/38 [==============================] - 32s 839ms/step - loss: 0.5802 - acc: 0.7507 - val_loss: 0.4533 - val_acc: 0.8273\n", + "Epoch 15/50\n", + "38/38 [==============================] - 32s 846ms/step - loss: 0.5767 - acc: 0.7533 - val_loss: 0.4746 - val_acc: 0.8233\n", + "Epoch 16/50\n", + "38/38 [==============================] - 32s 839ms/step - loss: 0.5643 - acc: 0.7600 - val_loss: 0.4329 - val_acc: 0.8253\n", + "Epoch 17/50\n", + "38/38 [==============================] - 32s 852ms/step - loss: 0.5584 - acc: 0.7673 - val_loss: 0.4671 - val_acc: 0.8067\n", + "Epoch 18/50\n", + "38/38 [==============================] - 32s 850ms/step - loss: 0.5940 - acc: 0.7587 - val_loss: 0.4413 - val_acc: 0.8300\n", + "Epoch 19/50\n", + "38/38 [==============================] - 32s 844ms/step - loss: 0.5850 - acc: 0.7573 - val_loss: 0.4237 - val_acc: 0.8373\n", + "Epoch 20/50\n", + "38/38 [==============================] - 32s 841ms/step - loss: 0.5519 - acc: 0.7820 - val_loss: 0.4195 - val_acc: 0.8373\n", + "Epoch 21/50\n", + "38/38 [==============================] - 32s 845ms/step - loss: 0.5243 - acc: 0.7867 - val_loss: 0.4205 - val_acc: 0.8433\n", + "Epoch 22/50\n", + "38/38 [==============================] - 32s 840ms/step - loss: 0.5330 - acc: 0.7800 - val_loss: 0.4232 - val_acc: 0.8427\n", + "Epoch 23/50\n", + "38/38 [==============================] - 32s 848ms/step - loss: 0.5486 - acc: 0.7927 - val_loss: 0.4149 - val_acc: 0.8393\n", + "Epoch 24/50\n", + "38/38 [==============================] - 32s 843ms/step - loss: 0.5066 - acc: 0.7987 - val_loss: 0.4016 - val_acc: 0.8480\n", + "Epoch 25/50\n", + "38/38 [==============================] - 32s 838ms/step - loss: 0.5062 - acc: 0.8000 - val_loss: 0.4163 - val_acc: 0.8360\n", + "Epoch 26/50\n", + "38/38 [==============================] - 31s 837ms/step - loss: 0.4952 - acc: 
0.7940 - val_loss: 0.3879 - val_acc: 0.8533\n", + "Epoch 27/50\n", + "38/38 [==============================] - 31s 835ms/step - loss: 0.5135 - acc: 0.7893 - val_loss: 0.3924 - val_acc: 0.8480\n", + "Epoch 28/50\n", + "38/38 [==============================] - 31s 830ms/step - loss: 0.5359 - acc: 0.7933 - val_loss: 0.3887 - val_acc: 0.8547\n", + "Epoch 29/50\n", + "38/38 [==============================] - 32s 849ms/step - loss: 0.4884 - acc: 0.8040 - val_loss: 0.3913 - val_acc: 0.8540\n", + "Epoch 30/50\n", + "38/38 [==============================] - 32s 839ms/step - loss: 0.4803 - acc: 0.8040 - val_loss: 0.4148 - val_acc: 0.8447\n", + "Epoch 31/50\n", + "38/38 [==============================] - 31s 836ms/step - loss: 0.5072 - acc: 0.8060 - val_loss: 0.3828 - val_acc: 0.8527\n", + "Epoch 32/50\n", + "38/38 [==============================] - 31s 833ms/step - loss: 0.4988 - acc: 0.7980 - val_loss: 0.4236 - val_acc: 0.8580\n", + "Epoch 33/50\n", + "38/38 [==============================] - 32s 844ms/step - loss: 0.4721 - acc: 0.8093 - val_loss: 0.4001 - val_acc: 0.8580\n", + "Epoch 34/50\n", + "38/38 [==============================] - 31s 835ms/step - loss: 0.4841 - acc: 0.8100 - val_loss: 0.3656 - val_acc: 0.8753\n", + "Epoch 35/50\n", + "38/38 [==============================] - 31s 833ms/step - loss: 0.4884 - acc: 0.8067 - val_loss: 0.3772 - val_acc: 0.8547\n", + "Epoch 36/50\n", + "38/38 [==============================] - 31s 834ms/step - loss: 0.5038 - acc: 0.8060 - val_loss: 0.3824 - val_acc: 0.8627\n", + "Epoch 37/50\n", + "38/38 [==============================] - 32s 839ms/step - loss: 0.4988 - acc: 0.8007 - val_loss: 0.3792 - val_acc: 0.8600\n", + "Epoch 38/50\n", + "38/38 [==============================] - 31s 840ms/step - loss: 0.4807 - acc: 0.8133 - val_loss: 0.3693 - val_acc: 0.8727\n", + "Epoch 39/50\n", + "38/38 [==============================] - 31s 836ms/step - loss: 0.4792 - acc: 0.8053 - val_loss: 0.3456 - val_acc: 0.8720\n", + "Epoch 40/50\n", + "38/38 [==============================] - 32s 837ms/step - loss: 0.4598 - acc: 0.8120 - val_loss: 0.4263 - val_acc: 0.8493\n", + "Epoch 41/50\n", + "38/38 [==============================] - 31s 835ms/step - loss: 0.4712 - acc: 0.8120 - val_loss: 0.3842 - val_acc: 0.8687\n", + "Epoch 42/50\n", + "38/38 [==============================] - 31s 834ms/step - loss: 0.4660 - acc: 0.8060 - val_loss: 0.3825 - val_acc: 0.8673\n", + "Epoch 43/50\n", + "38/38 [==============================] - 31s 836ms/step - loss: 0.4617 - acc: 0.8193 - val_loss: 0.3563 - val_acc: 0.8667\n", + "Epoch 44/50\n", + "38/38 [==============================] - 32s 841ms/step - loss: 0.4561 - acc: 0.8227 - val_loss: 0.3650 - val_acc: 0.8647\n", + "Epoch 45/50\n", + "38/38 [==============================] - 31s 835ms/step - loss: 0.4573 - acc: 0.8240 - val_loss: 0.3844 - val_acc: 0.8507\n", + "Epoch 46/50\n", + "38/38 [==============================] - 32s 843ms/step - loss: 0.4898 - acc: 0.8020 - val_loss: 0.3444 - val_acc: 0.8740\n", + "Epoch 47/50\n", + "38/38 [==============================] - 31s 836ms/step - loss: 0.4352 - acc: 0.8273 - val_loss: 0.3574 - val_acc: 0.8640\n", + "Epoch 48/50\n", + "38/38 [==============================] - 32s 837ms/step - loss: 0.4651 - acc: 0.8187 - val_loss: 0.3359 - val_acc: 0.8793\n", + "Epoch 49/50\n", + "38/38 [==============================] - 32s 841ms/step - loss: 0.4681 - acc: 0.8173 - val_loss: 0.3494 - val_acc: 0.8687\n", + "Epoch 50/50\n", + "38/38 [==============================] - 31s 835ms/step - loss: 0.4628 - acc: 
0.8113 - val_loss: 0.3382 - val_acc: 0.8760\n" + ], + "name": "stdout" + } + ] + }, + { + "cell_type": "code", + "metadata": { + "colab": { + "base_uri": "https://localhost:8080/" + }, + "id": "zreO-NFo5xyv", + "outputId": "4c7b1dc3-25c3-489b-fd41-86a736a5d307" + }, + "source": [ + "# Evaluate non-data-augmented model on test set\n", + "scores = net2.evaluate(x_test_n, y_test_n)\n", + "print('Test loss: {} - Accuracy: {}'.format(*scores))\n", + "\n", + "# Save the model\n", + "print(save_keras_model(net2, filename='/content/nn_task2_aug.h5'))" + ], + "execution_count": 13, + "outputs": [ + { + "output_type": "stream", + "text": [ + "10/10 [==============================] - 11s 439ms/step - loss: 0.2486 - acc: 0.9000\n", + "Test loss: 0.2485782653093338 - Accuracy: 0.8999999761581421\n", + "None\n" + ], + "name": "stdout" + } + ] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "c6Eq3hkLCBBS" + }, + "source": [ + "### Plot validation loss and accuracy curves (non-data-augmented model)" + ] + }, + { + "cell_type": "code", + "metadata": { + "colab": { + "base_uri": "https://localhost:8080/", + "height": 262 + }, + "id": "C5Vp-0DcR1Xt", + "outputId": "cf370216-f79a-45a1-fe3f-9ceac7908676" + }, + "source": [ + "import pandas as pd\n", + "import matplotlib as mpl\n", + "import matplotlib.pyplot as plt\n", + "\n", + "df = pd.read_csv('my_vgg16_noaug.csv')\n", + "print(df[df.epoch == 0])\n", + "plt.figure(figsize=(4,3))\n", + "ax = plt.gca()\n", + "lines = []\n", + "FEATURES = ['acc', 'val_acc']\n", + "for feature in FEATURES:\n", + " lines.append(ax.plot(df[\"epoch\"],df[feature], marker='.')[0])\n", + "plt.xlabel(\"Epoch\")\n", + "plt.ylabel(\"Accuracy [%]\")\n", + "lgd = plt.legend(lines, [\"Training accuracy\", \"Validation accuracy\"], \n", + " loc=\"best\", bbox_to_anchor=(1,1))\n", + "plt.show()" + ], + "execution_count": 4, + "outputs": [ + { + "output_type": "stream", + "text": [ + " epoch acc loss val_acc val_loss\n", + "0 0 0.565 0.955588 0.0 1.627829\n" + ], + "name": "stdout" + }, + { + "output_type": "display_data", + "data": { + "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAZ0AAADQCAYAAADcbrykAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4yLjIsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy+WH4yJAAAgAElEQVR4nO3deXzU1bn48c8zkx0ikAQIECCsgRBACbIp4oaidbmIWgXFuqHW2qq11Xu12NLr/dn2aluvVuuKWxXFiqi4oKLgwhYW2TEsIUECCSQQErLMzPn9cWbCELJMlskyed6vF6/MfOfM93smwPeZc85zzhFjDEoppVRzcLR0BZRSSrUfGnSUUko1Gw06Simlmo0GHaWUUs1Gg45SSqlmo0FHKaVUswlr6QrUV0JCgklOTm7paijV7mRkZOQbY7q2dD1U29bmgk5ycjKrV69u6Woo1e6ISFZL10G1fUHrXhORF0XkgIhsrOF1EZEnRCRTRL4XkVHBqotSSqnWIZhjOnOBKbW8fhEwyPtnFvB0EOuilFKqFQha0DHGLAUO1VLkcuAVYy0HOotIj2DVRymlVMtryTGdXkC23/Mc77F9VQuKyCxsa4g+ffo0S+WUCiUZWQUs33mQcf3jSe/bpc7jAZ83I6NbWFjY80Aamg2rwANsdLlct6Snpx+orkCbSCQwxjwLPAswevRoXaFU1amxN9O2yv9zD+/ViR15R/lowz6e+nIHbo/B6RAuHNadnp2iOVhczvvrf8TtMUSGO3j9lnH1/l2FhYU9n5iYOLRr164FDodD/2+2cx6PR/Ly8lJzc3OfBy6rrkxLBp29QG+/50neY0oFzHeTPT25C33iOrD/SCnf7sjnsU+34/YYwpzCAxcN4YyBCXSPjWJn3lGW7zpUbTCqLlAZY/hy2wEysgo5a3ACY/rF11i+tMLNweJyvv0hn3U5hYztF8e4/vHERoUTFe5gzZ7COlsbp/XuzNFyF0eOVVBU6mJNVgFbc4sY3bcLo/p2ISbCSXSEk80/HmHp9jwGJ8bSo1MUh4orWLvnEM8u3YXLYxDA4QC358Tfl9tj+HzLAZwOoazCg9u7ynyFy8PynQcbEqDTNOAoH4fDYbp27Xo4Nzc3raYyLRl0FgK/EJE3gbHAYWPMSV1rqn2pqxtoVJ/OdIgMY8u+I3y1PY+PN+biqeV2V+E2/PGDLScdF4HTenemX0JH4jtGUFLmZt7qPbjcBodDSOt5CkVlLnIOlVDuthd4ckkmkWEOOseEE+50sLfwGMaAABFhDspcJ97hX1+xp/Kx0y8ACNA1NoLIcCdlFW7yisqp64796vL6ZSsbIL1vHDPG9gED9//7eypcHsLDjrdoMrIKmPH88srj4/rH13neajg04Ch/3n8PNXa1Bi3oiMgbwNlAgojkAA8D4QDGmGeARcDFQCZQAtwYrLqophHsLqvPNu/njtczcLltN9ClI3vSIdLJ7vwSvt2Rf1JwCXdK5TEBJqd25+rRvSkoLueh9zbicnsIczqYfUkqnWMieGdNDku2HsAAxsDewmPkHi7lYHH5CQHD7THsP1LKqL5diO8QwerdBRjvNYb36sSArh1Zm12Abysq4z1+zpBufJ9TyKeb9mMAh8B5Q7tzWp/OfLUtjxW7DlWW79oxipTEWLblFnGgqLzyM4wbEM95Q7pxSlQ432Tm8/73P+Ix9lwXDEtkwoB4Pt9ygKXb82ydBK4Y1YsbxifzY+Ex7n5zHRVuG0TunzKk8u8pKS7mpL+79L5deP2WcW26GzI3N9d59tlnpwDk5+eHOxwOExcX5wJYt27dlqioqBoD4tKlS2NefPHF+Llz52bXVAbgtNNOG7J27dqtTVvz9itoQccYc20drxvgzmBdXzWdCreHud/s5tGPtuI2BqcIt57VjytGJTGwa0fWZlffbVSbJVv388GGfYQ7HOQfLWfD3kL2HymrfN3lMby3bi9dYiJwG3NCcLlsZE/umTyY/KNlXPfCispv6rdNGlB5/f7dOp5Up8ROUXy7I7+y/D9mpJPetwvGGL7bcZAb567C5b1hP+V9rWpr4D8vHlrn8a+251Uev91bp7H94k8oP+c/0qo9z30XpFTWd0C3jnyyObfytVsn9ie9bxeG9ezEil0HK49PH9OXEUmdGZHUmddvjar27yK9b5dq/25qOt5WJCYmurdu3boZ4N577+3ZsWNH95w5c/b7Xq+oqCA8PLza95511lklZ511Vkld12iLAcflchEW1jqH7KWt7Rw6evRooysSBFdGVgHf7cgnNiqcrblFfLRxH4UlFdWWjYlwUlrhxhgIcwq/uySVc1K60aNTFGFOR+W5enSK5liFmzVZBXy7I59cvwDTq3MUY/rF0zkmnNeX78Ht8RDudPD6LWNJT4476cbsP+Bd39ZXbeXrm+EV7OMNfU+wiEiGMWa0/7H169fvHjlyZH59zvN1Zn6HbzLzY88YmFB05sCE4qaqny/obNq0KToyMtKzcePGmDFjxhydMWPGoXvuuadPWVmZIyoqyjN37txdI0eOLPvggw9iH3vsse5LlizJvPfee3tmZ2dHZGVlRf74448Rt99++/6HHnroAEBMTMxpJSUlaz/44IPYOXPm9IyLi6vYtm1b9PDhw0sWLFiwy+FwMG/evE4PPPBAUkxMjOf0008/mpWVFblkyZJM//pt27YtYvr06f2OHTvmAPj73/++Z/LkycUADz74YOLbb78dJyKcd955h//xj3/s3bhxY+SsWbP6Hjx4MMzpdJq33357565duyJ8dQaYOXNmn9GjRxf/8pe/PNirV6/hl1122aGvvvrqlLvvvju3qKjI+dJLL3WtqKiQ5OTksvnz5++KjY31ZGdnh91000199+zZEwnw5JNPZn344Yed4uLiXLNnzz4AcNddd/Xq1q1bxe9+97tqM9Dqsn79+oSRI0cmV/da6wyFqsV8sXU/s17JwOVtWkQ4HUxJS2RYz1P462fbK2/8/3vVSMoqPLzy3W7W5xwG7PjJ7Pc2AZtwOoQuMeEcLC7H/3tNQscIOsdEIEfKMIBTYPrYvtx5zkAALhnRs17dQPX9pl5b+fq2BoJ9vKHvaSm/mb++9/bcopjayhSXuRw78otjjIFnvtrBgIQOJR0iwzw1lR+cGFvylytH1tr9VZ19+/ZFrFmzZmtYWBiHDh1yrFq1amt4eDgLFiyI/e1vf5v0ySef7Kj6nszMzKhvv/12W2FhoXPo0KFpv/nNb/IiIyNP+Fa+ZcuW6HXr1u1MTk6uSE9PH7J48eKOEydOLP7Vr37V98svv9w6ZMiQ8ksvvbRfdXXq2bOna9myZdtjYmLMhg0bIq+99tr+Gzdu3PLWW2+dsmjRos4ZGRlbY2NjPfv373cCTJ8+vd99992XO3PmzMKSkhJxu92ya9euiNo+d3x8vGvz5s1bwHY9/vrXv84H+OUvf9nziSeeSHjwwQcP3H777X0mTpxYNHv27B0ul4vDhw87+/TpUzF16tQBs2fPPuB2u1mwYEGXVatWnTwY2gQ06C
gAsg+V8Nyynby+Yg9ub8BxCPz8nAHcff5gAEYnx510409O6FDZCglzOrh/SgoxEWHsLTzGF1sOkH+0vPJct5zZn/+8eAhr9hTWOIDdlm6yqv6OlrnCKsfCjH3eITKsvKmvc8UVVxT4upcOHTrk/OlPf9pv9+7dUSJiKioqpLr3XHDBBYXR0dEmOjraFRcXV5GTkxM2YMCAE5r4w4cPL/YdGzZsWMmOHTsiYmNj3b179y4bMmRIOcA111xz6Pnnnz9pYdTy8nK5+eab+27evDna4XCQlZUVCbB48eJTrrvuuvzY2FgPQPfu3d0FBQWO/fv3R8ycObMQICYmxkCd+SbMnDmzwPc4IyMjevbs2b2KioqcxcXFzkmTJh0G+Pbbb2Pnz5+/CyAsLIz4+Hh3fHy8u3Pnzq5vvvkmet++feHDhg0rSUxMdNf9m64/DTrtWEZWAe+v/5GdeUf5ZsdBHAKTBifwTebByrGNiYOO/9+p7sZfWyvk7JRuJwSXC9MSEZGQGMBWJwukRfJ1Zn6Hm+auGuxyexxhTofnf68+dWdTdrH5dOzYsbL1dP/99/eaNGlS0eLFi3ds27Yt4txzz02p7j3+rRqn04nL5TopOAVSpiaPPPJI927dulW88847uzweD9HR0emBfyIrPDzceDzHG4ZlZWUnXN8XuABmzZrVb/78+Znjx48/9sQTT8R/9dVXsbWd+8Ybb8x//vnnEw4cOBB+4403Hqxv3QKlQSeE1NX/P6ZfHJ2jw/k+5zCfb93PRxtzK7u+LhvZg/+8eCg9OkXXe7ygttZJU3WLqdBw5sCE4hd/dvr2YIzp1OTIkSPOpKSkcoB//vOfCU19/hEjRpRmZ2dHbtu2LSIlJaV83rx5cdWVO3z4sDMpKanc6XTy5JNPxrvdtiFx4YUXHnnkkUd6zpo165Cve6179+7uxMTE8ldffbXz9ddfX3js2DFxuVwyYMCAsszMzOhjx45JcXGx4+uvvz7ljDPOOFrd9UpKShx9+vSpKCsrkzfffDOuR48eFQBnnHFG0V/+8peus2fPPuDrXouPj3dff/31hY888kgvl8sl06ZN29nUvycfDTptUIZ3ML5fQgc6RISx+2AxK3cd4pNNuZXptacndyEhNoqiYxV8k3mwchKgT7hTKgOOUyAl8RR6dIoGmjYgaHBRVZ05MKG4OYKNz/333597yy239PvTn/7Uc/LkyYVNff6OHTuaxx9/PGvKlCmDYmJiPCNHjqz2s919990Hpk2bNuDNN9+MP/fccw9HR0d7AK688soja9asiTn11FOHhoeHm/PPP//wk08+ufe1117bdeutt/b94x//2DM8PNy8/fbbO1JTU8svvfTSgiFDhgxLSkoqGzZsWI3Zdw888MCPY8aMGRoXF+caNWrU0aNHjzoBnn766T0/+9nP+g4ePDjB4XDw5JNPZp1//vnFUVFRZsKECUc6d+7sDmbmm2avtSGZB4r4x5IdvLt270mduxFOB+V+08+7dowgNjqcg0fLOXzMdksLcFFaIvdMHkzhsQqu90s3bsgSKKp9aarstVB0+PBhR6dOnTwej4eZM2f2GTRoUOnDDz/coMyvluJ2uxk2bFjq22+/vWP48OFldb+jZpq91gb5urhSe8SSdbCEd9fuZX3OYYTjo4kCXDumD/deMJis/GJm+AWRZ64fXe08kJsn9mdQd9u1q+MqSjWNv/3tbwlvvPFGQkVFhQwbNqzk3nvvbVOBOCMjI+ryyy8fdNFFFxU0NuDURVs6rVDG7kNc+9yKE1ouqT1O4YpRvUhO6MAv/rWmXnNW2uvil6ppaUtHBUpbOm2E22P4dFMuf3h/U2XAEeCGCcn8/rJhleXqOziv4ypKqdZCg04Ly8gq4JvMPEor3Hy8cT8784tJPCXSrivmMYSHObh0ZM8T3qNBRCnVVmnQaUErdx1k+nMrKmf/90+I4anpo5iSlsi6BqxnppRSrZ0GnRaydHse9761rjLgOASmpSfxkxF2x25tzSilQpFuL9vMMg8c5aa5q5j54krCHEK4U3CK3Y9lXP8mn7emVEgbO3bs4HfeeecU/2Nz5szpNmPGjBr3tR8zZkzK0qVLYwAmTZo0MD8/31m1zL333ttz9uzZ3Wu79quvvto5IyMjyvf87rvv7rlgwYJaZ/0rbek0m6Xb8vj7F9tZt6eQmIgw/uviIdwwIZmNe49oN5pSDXTVVVcdeuONN+KmTZt2xHfsnXfeiXv00UdzAnn/V199lVl3qeotWLCgs8vlOpyenl4K8Le//e3Hhp6rpbTEFgja0mkGn27K5YaXVpKRVQgi/P3aU5l11gAiw5yk9+3CnecM1ICj2o+dX3bgs98nsvPLDo091fXXX1/wxRdfdCotLRWw2wccOHAg/MILLzw6Y8aMPmlpaUMHDhw47J577ulZ3ft79eo1fN++fWEA999/f2JycnJaenp6yg8//BDpK/PYY48lpKWlDU1JSUm98MILBxQVFTkWL17c4bPPPuv80EMPJQ0ZMiR106ZNkdOmTUt+6aWXugC89957sUOHDk0dPHhw6lVXXZV87Ngx8V3vnnvu6Zmamjp08ODBqWvXro2qWqdt27ZFpKenp6Smpg5NTU0dunjx4srf04MPPpg4ePDg1JSUlNSf//znvQA2btwYOWHChMEpKSmpqampQzdt2hT5wQcfxJ5zjnfpduwWCE888US8rw533HFHr9TU1KEvvvhil+o+H0B2dnbY5MmTB6SkpKSmpKSkLl68uMPdd9/dc86cOd18573rrrt6/fGPf+xGPWhLJ8iyD5Xwm/nrj68gYAxb9hVx7pBaW+5KtT0L7uzNgc21bm1A2VEHB3+IAQNf/w3iB5UQ2bHGrQ3ollrCfzxV40Ki3bt3d48cObJ4/vz5na677rrCl19+Oe7SSy8tcDgcPP7443u7d+/udrlcTJgwIWXFihXRY8eOPVbdeZYtWxbz7rvvxm3YsGFzRUUFp556auppp51WAjBjxoyC6rYIOP/88wsvueSSwzfeeGOB/7lKSkrktttu6/fpp59uGzFiRNnUqVOTfWudASQkJLg2b9685dFHH+366KOPdp83b94Je5GH+hYIGnSCaEfeUWY8twKX2xAR5sDtbtRe9Eq1feVFYcfX1DD2eWTHRm1tcPXVVx+aN29el+uuu67w3//+d9xzzz23G+Dll1+Omzt3boLL5ZK8vLzw9evXR9UUdJYsWdLx4osvLvSt0nzBBRdUrtFW0xYBNVm/fn1UUlJS2YgRI8oAfvaznx186qmnugEHAKZPn14AMGbMmJKFCxee1MUR6lsgaNAJki37jnD9CysAmH/HBErK3Tp2o0JbLS2SSju/7MDrVw/GU+HAEe5h6jM76X92oxb/nD59euGDDz7Y++uvv44pLS11TJw4sWTr1q0RTz75ZPeMjIwtXbt2dU+bNi25tLS0QcMJ9d0ioC5RUVEGICwszFS3NUKob4GgYzpNLCOrgN8t2MiVT39LmMPBvNvGM7THKTp2oxRA/7OLmfHWds741V5mvLW9sQEHoFOnTp7x4
8cX3XLLLclTp049BFBQUOCMjo72xMXFubOzs8O+/PLLTrWd49xzzz26aNGizkePHpWCggLH4sWLO/teq7pFgO94x44d3UeOHDnpHjpy5MjSvXv3RmzcuDES4JVXXomfOHFiUaCf5/Dhw84ePXpUOJ1O/vGPf5ywBcJrr72W4Btz2b9/v7NLly4e3xYIAMeOHZOioiKH/xYI+fn5zq+//vqUmq5X0+fzbYEANuHg4MGDToDrr7++cMmSJZ3Wr1/fYdq0abW2+qqjLZ0mZNdMW0652yDA/141kgFdO7Z0tZRqXfqfXdwUwcbfNddcc2jmzJkD3njjjZ0A48ePP5aWllYyYMCAtB49epSnp6dXu+eMz5lnnlkyderUQ2lpacPi4+MrRowYUVm/mrYImDFjxqE77rgj+Zlnnuk+f/78yu2vY2JizDPPPLP7qquuGuB2uxk5cmTJfffdlxfoZwn1LRCCuuCniEwB/g44geeNMY9Web0P8DLQ2VvmAWPMotrO2RoX/NxzsIR/r81h7re7KSyx2wg4BH59QQp3Hk8gUapN0wU/FQS2BUKLLPgpIk7gKWAykAOsEpGFxpjNfsUeAt4yxjwtIqnAIqDairY2S7fn8fqKLPYcLGFLbhEikNbzFIrLXJVrpmnCgFIqlDTFFgjB7F4bA2QaY3YCiMibwOWAf9AxgK+vsRPQ6idX7c4v5tGPtvLxplzArgI9Y2wf7jxnID0713+rZ6WUaivS09NLc3JyNjTmHMEMOr0A/2yWHGBslTK/Bz4VkbuADsD51Z1IRGYBswD69KlxdYugsEEkn05RESz9IY/FW/bj4HgiiEOgZ+doenZu+q2elVIq1LR0IsG1wFxjzGMiMh54VUTSjDEnTBYzxjwLPAt2TKe5Kpex+xDXPLecCre9ZMdIJ784ZyAje3c+YSM17UZT7ZjH4/GIw+FoW7tBqqDxeDwC1DjhN5hBZy/Q2+95kveYv5uBKQDGmO9EJApIwDuJqiWVVriZvXBTZcAR4NaJ/fnV+YMB3epZKa+NeXl5qV27dj2sgUd5PB7Jy8vrBGysqUwwg84qYJCI9MMGm2uA6VXK7AHOA+aKyFAgCgg4tTBYcgpKuP21DDb9eIQwh2CMTQw4c1DXyjLajaYUuFyuW3Jzc5/Pzc1NQ+f9KdvC2ehyuW6pqUDQgo4xxiUivwA+waZDv2iM2SQic4DVxpiFwK+B50TkHmxSwc9MMHO4A/BNZj6/+NcaXG7DCzeMpnNMhLZolKpBenr6AeCylq6HajuCOk8nGII1Tydj9yGe+CKTpdvzGNitI/+8Pp3+OrFTqUrVzdNRqr5aOpGgVVix8yDXPrccj7HZaA9fNkwDjlJKBYH2wQIvfL0L767RCLA+u7DW8koppRqm3bd0PB7Dph8PI2IjsKZAK6VU8LT7oPPVD3nsLSzlV+cNJCLMqQkDSikVRO0+6Lz49S66xUZy5zmDiAjT3kallAqmdn2X3Zp7hGU/5HPDhGQNOEop1Qza9Z32xa93ERXuYMbY5l3PTSml2qt2G3TyispYsPZHpo1KonNMREtXRyml2oV2G3ReW55FudvDTWf2a+mqKKVUu1FjIoGIXBHA+0vr2umzNSqtcPPa8izOHdJNt5NWSqlmVFv22nPAe+C3eczJzsLu9tmmvLduLweLy7lZWzlKKdWsags6HxljbqrtzSLyWhPXJ+iMMbzw9S6GJMYyYYBOAlVKqeZU45iOMea6ut4cSJnWZtkP+Wzff5Sbz+yHSG2NOKWUUk0t4EQCERkoIq+JyDveXT7bnIysAh5euIlO0eFcdmrPlq6OUkq1O7UlEkQZY0r9Dv0R+K338fvAqcGsWFPLyCpg+nPLKXN5CHMIG/ce0eVuVOuXvRJ2L4PkidB7TEvXRqlGq21M530RedUY84r3eQWQjN1szR3sijW15TsPUuay23YbY1i+86AGHdW6ZS2HVy4FjxucEXDDQg08qs2rrXttCnCKiHwsImcB9wEXAlOBGc1RuaY0vGenyse6knQIyV4Jyx6zP0PN6hfAXQ7GbX/uXtbSNVKq0Wps6Rhj3MCTIvIq8DvgDuAhY8yO5qpcU9p+oAiAmeP7cvmpvbSVEwqyV8LLl4KrHMIiQ68lUHLI+0BsSyd5YotWR6mmUNuYzljgN0A58D/AMeAREdkL/NEY02Z2OnN7DK98l8Xovl2Yc3laS1dHNZXdy8DlHXb0tQRCJei4K2Cvd1v2HiPh4r+EzmdT7VptYzr/BC4GOgIvGWPOAK4RkUnAPGxXW5uwZOsB9hwq4TcXprR0VVRTSvK7CTvCQqslkPUNlBbaFk54tAYcFTJqG9NxYRMH+mJbOwAYY74yxrSZgAPw8ne76X5KJFPSElu6KqoplR05/njUzNC6MW/5AMKiYehlcLBN9mgrVa3ags50YBpwLjCzIScXkSkisk1EMkXkgRrKXC0im0Vkk4j8qyHXqU3mgSKW/ZDPdWP7Eu5st+ubhqYt70NUZ+jUB47mtnRtmo4xsPVDGHgeJKZB8QEoPVL3+5RqA2pLJNgO/LqhJxYRJ/AUMBnIAVaJyEJjzGa/MoOA/wTOMMYUiEi3hl6vJi9/m0WE08G1umdOaHGVw7ZFkPIT8Lhg11f2Zh0Kq0z8uAaKfoQhsyGigz12aAf0PK1l66VUE6jxq7+IfFDXm+soMwbINMbsNMaUA28Cl1cpcyvwlDGmAMAYc6DuKgfuSGkF76zJ4ZKRPUjoGNmUp1YtbfdSKD0MQy+13WpH90PhnpauVdPY8gGIEwZfCPED7DHtYlMhorZEgjNFZGEtrwuQWsvrvYBsv+c5wNgqZQYDiMg3gBP4vTHm45MuJDILmAXQp0/gLZa3V+dQUu7mxgm6mnTI2bwQIjrCgHMhf5s9lrMKuvRt+Dlby+z/rR9C8hkQE2eTCECDjgoZtQWdqq2S6pTXXaTO6w8CzgaSgKUiMrxqOrYx5lngWYDRo0ebQE7s8Rhe+W43o/p0ZnhSpzrLqzbE47Y35kEXQHgUdBsG4R1s0Bh+ZcPOmb0S5v6k5Wf/5/9gg+jpt9jn4dFwSpLtXlMqBNQ2pvNVI8+9F+jt9zzJe8xfDrDCGFMB7BKR7dggtKqR1+bL7QfIOljCvZMHN/ZUqrXZ8x2U5EPqZfa5Mwx6jYKcRqxKsGG+nesDLTvnZ8v79ueQi48fix8ABzObvy5KBUEw07lWAYNEpJ+IRADXAFW76xZgWzmISAK2u21nU1x87rdZdIuN5KK0Hk1xOtWabF4IYVEwcPLxY0mnQ+4GKC9p2Dn9ExCcLTjnZ+uHNmGgU9LxY76gYwJq5CvVqgUt6BhjXMAvgE+ALcBbxphNIjJHRLxfUfkEOCgim4ElwG+MMQcbe+2F6/eydHse56R0IyJM06RDisdjWwMDzoNIv63Ge4+xWWw/rm3YeQuyIKarDWbd01qmlXPkR7sKwZBLTjweP9AmTVQui6NU21XnHVlELhWRBt25jTGLjDGDjTEDjDGPeI/NNsYs9D42xph7jTGpxpjhxpg3G3IdfxlZBdwzbz0AC9btJSOr
oLGnVHVpzkU392bYdGJf15qPb3WChnSxuV12BYChP4HzZttr/PBZ4+taX1s/tD+rBp04bwabjuuoEBBIMPkp8IOI/FlEhgS7Qo21fOdBjLcbwuX2sHxnoxtOqjbZK+HlS+DzOXbxzWAHni3v2SVvBk858XiHeHtzzm7AcGDueru6QfJEOP1W6NIPPn3IBqPmtPVD26rpWmW5pviB9mdjMthCeTVu1abUGXS8W1KfBuwA5orIdyIyS0Rig167BhjXP56IMAdO0S0MmsXuZXaiJoCrLLjL7xtju9b6TYLozie/3nuMbenUd+xj11L7s99ZEBYBk/8AeVtg7auNr3OgjhXY392QS06e4Nqlr52309BkAt8Xgy/+G16+TAOPalEBdZsZY44A87ETPHtg99RZIyJ3BbFuDZLetwuv3zKOey9I4fVbxukWBsF2woC7gV6jg3et3ELQRpgAABz5SURBVA1QsPvkrjWfpNOhOM+WqY9dy6DrEOjoXRBj6GXQZzwseQTKihpT48Bt/9SOSVXtWgNwhtvA09DutczP7RcC49F9eVSLC2RM5zIReRf4EggHxhhjLgJG0ohlcoIpvW8X7jxnYPsLOC3RhdKlH2COdwEFc9xhy0IQR/U3Zjg++F+fz+8qtynY/c46fkwELnjEBrCv/9bw+tbH1g+gYyL0Sq/+9bhGpE0X7z/+WCS0VuNWbU4gLZ1pwF+9A/1/8S1VY4wpAW4Oau3as/oGkJ1L4cUL4PNm7kLZ8YX9ecVzkDgCVj4XvNTezQuh7xnQIaH617ul2lUK6pNM8OMaqCg5MegAJKVD2pXw3ZNwOKfhdQ7ErmWw7SNIGg2OGv5Lxg+Agzvr/7stPQwb34WksdDjVPv+6Hb2ZUy1KoEEnd8Dlf+LRSRaRJIBjDGfB6VW7V32Sphbz8H5Vc96b0jN3IWy43OISbA3tLG3wYHNsPvrpr/OhrftTP3EETWXcThtS6E+AXfXUkBsMKvq/Ift73Thr4LXgsxeCa9dAZ4K+OHTmq8RPxAqiqGonqtpL3/a7stz8Z9hxtt2AdGPH9A5P6rFBBJ03gY8fs/d3mMqWHYvOz47PtDB+bxtxx8319bGHo8dLxhwrv2GnjbNfote+WzTXmfPCnj3dvt49Qu13/x7j4H9m6C8OLBz71oKicPtOmdVde4DqZfDjs+CNwjv/3ftcdf8dx3X3/6sT/dlySH47inbHdnzVDtmNel+yPwMtn/SuHor1UCBBJ0w7yrRAHgfRwSvSoq+ZwK+b6IBDM7vXQP52yHyFHCEw/ULmmdyY+73djmagefZ5+HRdjO1rR82TZeUMfYG+c7NdpAd7DbOtQXhpDFg3PZ3UpeKUhtEqnat+YtL9talHi3I+nSNVm5XILV/WahMm67HuM53T9pEiHP+6/ixMbMgYbBt7bjKAj+XUk0kkKCT57eCACJyOZAfvCopwrwx3XdDqutGs/oFCI+Bs//TdtN07Brc+vlkeidQDjj3+LHRNwMGVr/YsHNmr4Slj8E3f4fnz4PXptng4Ai3acN1teKSvAE6e0Xd18pZCe6y2oPOwMl2XhDYLLK6WpDZK+Gli7xdowG0jA57lyMcfVPti4x2SrKfPdC5OsX5sPwZGDYVug87fjwsAqY8CgW7bCtIqWYWSNC5HfgvEdkjItnA/cBtwa1WO7d1kc3SmvGOHStZ8YztyqrOsQLY8A4Mvwr6jrfHcjc0Tz13fGHHWDr67b3XpS8Mvggy5tpgUR++lZ6/mAOLZ9sb8qV/h3s3w42L4NwH6179OSbOfpPPCWCS6K6lNpD1GV9zmd5jYNrz9vHwq+tuQW7/2K9VFkDX6PfzbGbaTx6r/dwOp+1iOxTg0oRf/xVcx+wXkaoGnmc3v1v6v3BkX2DnU6qJBDI5dIcxZhx275yhxpgJxhhd8jaYtn0EvcfZWfbj77RdZ74ssarWv2lvLqffDF2H2pvo/o2BXacxKdalR2xrwte15m/sLCg5CJverd85N7x9fHwDh13eP/1n9tt57zEw8deBdRsmjbFBp67B8l3LbGsy6pTayw2baltzmZ/VvUpBkf8+hHWkJxdm26A04qeB7XgaaNp0US6set4Gya41rLJ+4X/bVvFnD9d9PqWaUECTQ0XkJ8DPgXtFZLaIzA5utdqxgizYv+H40vap/2HnbyyvpivEeLuxeo2GHiPt3jIJgyA3gKBzQjdQA5av2bXUfqMfeP7Jr/WbBAkpsPKfgWdJlRfbFh7YwBkWCf0n1a9OPr1Pt0GvtlZB2VG7uGa/ABMuRt9s13zbftIeg8eVl8C2D23Lqbd3v8JOvWsuv8GbjzPi6sDqED8ADu2yCQe1Wfa4Hfua9Nuay8T1hwl32ZbW+3frKgWq2QQyOfQZ7Pprd2F3C70KaMT2jKpW2z6yP1O8QScsAsbcYls6B7acWHb317YVdLrfdKnuaYG1dHYtO94N5CqFjf+uXz0zP7NzYpKqaXmIwJhb7YrPezMCO9/HD8CRvXDRnwPrRquNr061jetkL7efv7bxHH+Dp8Apvez4WU3W/wuOHYLzHoap/7TJB6ueq76sMfaG33ssxAW4s238ANtlV1uSxpYPbCtn0OTjW13XJNn72TNe0uVxVLMJpKUzwRgzEygwxvwBGI93m2kVBNsW2VaC/w0j/Sa75P6KZ04su/oFiOpsu398EtPgcLYd66lN5bbO3m6djLnHVzmuizF2fk6/SceTHqoaeQ1ExMIXj9TdhbdpAax5Bc682871CbQbrSZdh9hMvtquuWupTU7oPS6wczrDYNQNNvhXN5jvccO3T9pWZ59xNpAM+Qmsfqn6PX5yv4e8rbZrLVB1rfqQvRLemmmz93Z+WXcQ+TGDyluALo+jmkkgQcc3GlwiIj2BCuz6a6qpHSu0S+z77xoJdmxnxE/t+E2xd9Xsov128ctTZ9hUZZ/uw+3P/Ztqv5ZvHsu4O+Caf0G3ofDmdPjqzzUnLfgc3AGFe2DguTWXiYy14z07v6h9jkthNrz/Szup85wHa79uoBwO2824bVHNN95dy+xabRExgZ931Ezb9Zfx0smvbf3QZoSd8cvj4zPjfm5bPt/PO7n8+nk26Pl/YaiLb4uDmjLYtn1kAw7UnVoOdrwpLDKwrEClmkggQed9EekM/AVYA+wG/hXMSrVbPyy2XT4pF5/82rg7bDeY74a39hVbdvRNJ5ZLTLM/6xrXyV4B0XFw4f/Yb+Q3LrKBbckjdoxnyf+r+YZdmSpdTRKBP9/ul8Zj677zyxNfd7vg37faIDftBZuS3BSyV8K+9XB0v82Gq/o5Sg/DvnWBd635nNLD/q7Wvn5iZp4x8O0Tdh06/3Xh+k6w2X3Lnz5xbMvtgo3zYfCF1U9KrUlsIoR3qDnoHPWusRZoEOk9xnZjNrY7U6l6qDXoeDdv+9wYU2iMeQc7ljPEGKOJBMGwbRF06Fb9ZNBuQ6H/Oba/vqIUMl72DtgPPLFcx+52WZr9daRN71luu4F838rDo+04xJjbIOtr+OrRmhMMdnxuv3XXNRaRern9Jg3YuTsv2NaZ7wa87H/
tYpuXPB74uEYgdi+zgQ5st9Fnf7Df/H2yvrWvB5pE4O/0m23rZfOC48f2LLfZcuPvtKnNPiLe7MNtJ2Yf7vrSBohAEwj8zxffv/rutYpSm+TQe2z9gkh9sgKVagK1Bh1jjAd4yu95mTHmcNBr1R65ym0LImVKzYs+jr8TivbBe3facRv/BAIfEdvaqa2lc/SAvXH1qTKeIQKx3an8Z+EqPTlbq6LUdk1Vl7VWVe8xcMMHdjfOC/8fRHWBedfZYLb49/DlozDg/PrffOuSPBGc3m4jcdog+tJFNjMQ7HhOWJTtXqv3uc+yAdd/8uu3/2dbjafOOLn8sKn2i8Dyfxw/9v1bENXp5I3oAlFT2vSGt23G3jkPahBRrVog3Wufi8g0kUAmEqgG273M7l5ZXdeaz4DzIH6Q7ZqJ6Agdalh5oHuazXSraU6JL6urukH0yn5+7z+N7986cQLhnu/svKDq5udUx/dNevzP4fav7STIH9fBN38FjA0ITZ015d9tdNPHcOWLdm26ZybC0r/AxndsskFlK6weHA7bpZm9wgb2/B9sC/X0W6ofHwqLtK9lfmbrUHbUtvaGTW3Y9eMH2uDp33IzxnbhdRtW/y5DpZpZIEHnNuwCn2UickREikTkSJDr1f5s+8guZdP/7JrLOBy2JQQ2EeDVK6q/YScOt6m1NWU57VluWwI9Tz35tcob9kPwk8dtFtzci4+n6WZ+5h0vOLM+n85yhtkb8Lg7qMyaC2TAuyH8u43SpsFtS+2YyBf/bbu29m9seLA7dbptKa1+wa5v5oywa5rVZPRN9ve94hmbcFBRUr+sNX/xA2yygK/VBrDrKziwyf5e9buhauUCWZEg1hjjMMZEGGNO8T6vYwq3qhdjbNAZcO6JmWjViYjF3rBNzWmuvrW2aloOZ89y6DWq5m/avhv26TfD9e/adbxeutje6HZ8YSc/RnQI9NOdbNBke9NuzqypuH52qSBfsDOehge7mDgYdgWs/ResedW2+mpb765Dgu1CXPeGnbfTuU/gqdpVVZc2vfxpO443/KqGnVOpZhTI5NCzqvsTyMlFZIqIbBORTBF5oJZy00TEiEgQ9zpuxfathyM5kHJR3WUHnFP3DTshxabjVjdJtOKYvZ5vxnxdeo+BmQvsnizPnWP3y0lo5DStlsqa6j/J73cX2bhg12ccuEttq2PH53W3msbdYbslc1bZVcRrGrerS2XatHdc5+AOO+52+s12RQqlWrmwAMr8xu9xFDAGyABqmaQBIuLEJiFMBnKAVSKy0BizuUq5WOBXQADLAoeobR8BEtjAsu+GvXuZvWlWd8MOi4CuKdUnE+xdY9fcqppEUJte6Ta1+r077fO1r9hv7o0JFr3HNP9gdyC/u0AV+y207nbZc9Z2vvJiO05mPLDpHRh9Y8OuHxNnJwT70qaXP22/fIzWTXxV21Bn0DHGXOr/XER6A4FsHD8GyDTG7PS+703gcmBzlXJ/BP7EicGtfdn2oW151LQNc1WB3LC7p9m+/qr2fOc9R4AtHZ+j+4/fNAO5ybZWTRXs+k2EsGjbxRlIF6F/V15jfn8i3jXYdtjxtnWv2221Y7vX/1xKtYCGtPFzgKEBlOsFZFd5Xy//AiIyCuhtjKl1/RURmSUiq0VkdV5eXn3r27ptXmjHXnyTOptKYppNr/atYOCTvcJ2v9VnUiKcmIass9fr30XYlL+/uAG2pbPmVZuUMO72hp9LqWZWZ0tHRP6P49tYOoBTsSsTNIp34unjwM/qKmuMeRZ4FmD06NGhs7l79kqY711RYO2rNqOpqVoP3b1BbP+G4xlxHo8NOqn/Uf/zNWXXVKioT6upKX9/8QPtvJwVz9jxoR4jG34upZpZIGM6q/0eu4A3jDHfBPC+vYD/uu5J3mM+sUAa8KV3ClAisFBELjPG+F8zdO1eZsdXoOm7rBK9a7DlbjwedPK22iVg6jOe468lxmFCSVP9/uIHAMauyj3m1safT6lmFEjQmQ+UGmNXEhQRp4jEGGOqWTr3BKuAQSLSDxtsrgGm+170rmxQOYghIl8C97WbgAPeLhZv+nNTd1l1SLD78PhnsGUvtz/rO56jWheX37pvX/4J+p6hXwZUmxHQigSA/+SRaOCzut5kjHEBvwA+AbYAbxljNonIHBG5rCGVDTmdkgADgy4MTupw1eVw9qywa7vF9W/a66jmVeg3VKpbEqg2JpCWTpQx5qjviTHmqIgEtB68MWYRsKjKsWoXCzXGnB3IOUOKb27HpPshKb3pz989DXZ+Zdd1C4uwmWt9xuqs9bZu4Hnwzd8Dz5xTqhUJJOgUi8goY8waABFJB44Ft1rtRPZKO1nRN/7S1BKH2zGj/O0QEw+FWbUv16LaBk3qUG1YIEHnbuBtEfkROwCRiN2+WjVWzkrocWrNu282VmUG28bjS970GR+ca6nmpUkdqo0KZHLoKhEZAqR4D20zxlTU9h4VAFeZXY5m7G3Bu0b8QDs3JHeD3U45LBp6jAje9ZRSqg6BrL12J9DBGLPRGLMR6CgiPw9+1ULcvvW2Tz6YmWTOMOg2xLui8nK7nE1T7c6plFINEEj22q3GmELfE2NMAaCTAxrLl0SQFOQuku7D7f41+75v+PwcpZRqIoEEHaf/Bm7ehTyDNAjRjmSvsEvcB3vNrMQ0u0K0cWvQUUq1uEASCT4G5onIP73Pb/MeUw1ljHeJ+zOCf63ufmu6SQOX01dKqSYSyF3ofuAL4A7vn89pzytCN4XDOXYxzubIPnKXHX/85oym3xpaKaXqIZCdQz3GmGeMMVcaY67Ebk3wf8GvWgjL8d74myPo7Ft//LHOXldKtbBAutcQkdOAa4GrgV3Av4NZqZCXvcqmL3dv4u0MqpNcz31flFIqiGoMOiIyGBtorgXygXmAGGPOaaa6ha6cldBrVPOkL+vsdaVUK1JbS2crsAy4xBiTCSAi9zRLrUJZxTHb5TX+F813TZ29rpRqJWob07kC2AcsEZHnROQ87DI4qjF+XAcelwYBpVS7VGPQMcYsMMZcAwwBlmDXYOsmIk+LyAXNVcGQk9NMk0KVUqoVCiR7rdgY8y9jzKXY3T/XYtOoVUNkr4Qu/aBj15auiVJKNbt6zRY0xhQYY541xpwXrAqFNN+kUO1aU0q1UzpFvTkVZsHR/ZB0ekvXRCmlWoQGneaUvcr+DObK0kop1Ypp0Glq2Sth2WPVLzeTsxLCO0C31Oavl1JKtQIBrUigApS9El66yKZEh0XCDR+cOH6T7ZsUqr92pVT7pC2dpvTDYhtwwO4M+slDUHLIPi8vsZupaRKBUqodC2rQEZEpIrJNRDJF5IFqXr9XRDaLyPci8rmI9A1mfYLPO3dWHCBO2532f6NgxbOw9hUbkKI6t2wVlVKqBQWtn8e72dtTwGQgB1glIguNMZv9iq0FRhtjSkTkDuDPwE+DVaegO7AJYuJh3M+h31kQHgOf/Bd85LcTxJJH7GZq2uJRSrVDwWzpjAEyjTE7jTHlwJvA5f4FjDFLjDEl3qfLsZNP26ayo5D5GaRNg7Pus0ElMQ1mvg
cj/OKou0K3F1BKtVvBDDq9gGy/5zneYzW5GfiouhdEZJaIrBaR1Xl5eU1YxSaU+Rm4SmHoZSceF4HTb7HbC4hTtxdQSrVrrSKNSkSuA0YDk6p73RjzLPAswOjRo00zVi1wWxZCTAL0nXDya7q9gFJKAcENOnuB3n7Pk7zHTiAi5wMPApOMMWVVX28TKkph+yeQdgU4nNWX0e0FlFIqqN1rq4BBItJPRCKAa4CF/gW8O5L+E7jMGHMgiHVpuNome/rsXALlR2Ho5TWXUUopFbyWjjHGJSK/AD4BnMCLxphNIjIHWG2MWQj8BegIvC0iAHuMMZfVeNLmtuNLeG0qGLyTPRdW31rZvBAiO9mMNaWUUjUK6piOMWYRsKjKsdl+j88P5vUbbfXzYDz2sbvcjslUDTruCti2CFIugrCI5q+jUkq1IboiQW0O7jj+WKT6rLPdy6C0EFJbTwNNKaVaKw06NcnbDgc223TnnqPA47GTPavavNAu4jng3Oavo1JKtTEadGqy7jU7r+as38J170BMHHxwjw0+Ph43bP0ABk2G8OiWq6tSSrURGnSq466AdW/A4AshtrsNOBf8t11Lbe0rx8tlr4DiPO1aU0qpAGnQqc4Pi6H4AJx2/fFjI6+BvmfC4oehON8e27wQnJEw6IKWqadSSrUxGnSqs/Y16NDNdpv5iMBPHrPzcRbPBmNgy/sw8DyIjG25uiqlVBuiQaeqov2w/WPbsnGGn/hatyEw4S5Y9zp89yQcyYGhl7ZMPZVSqg3SoFPV92+CcZ/YtebvrN9C5z7w6UN235yO3Zq3fkop1YZp0PFnjO1a6z0Wug6uvkxEDJx+q7e8B968rvYlcpRSSlXSoOMvZxXkb4fTrqu9nKeCyl1CfSsVKKWUqpMGHX9rXrETPYdNrb1c8kQIi9L9cZRSqp5axX46rULZUdj0rg04dWWj6f44SinVIO0r6GSvrDlQbH7PpkPX1bXmo/vjKKVUvbWfoJO9Eub+xI7BiAPSpkH/c6BrCiQMhuX/sCsPiPY4KqVUsLSfoLPzSxtwwGadbfw3bHi7SiGBVy6ved8cpZRSjdJ+gk5xnv0pDrt0zfXv2jk2eVth1fOwYwlgat43RymlVKO1j6BTtN8u4Jk0FlIuPHFMJ34AdOgKWd/ZgKPZaEopFTTtI+h8/gdwlcLUp22QqUqz0ZRSqlmEftDJWW3XSjvj7uoDjo9moymlVNCFdqqWxwMf/RY6JsJZ97V0bZRSqt0L7ZbO+jdgbwZM/aduP6CUUq1AUFs6IjJFRLaJSKaIPFDN65EiMs/7+goRSW6yi5cehs9+D0ljYPjVTXZapZRSDRe0oCMiTuAp4CIgFbhWRFKrFLsZKDDGDAT+CvypSS6evRL+dbXd/fOiP4EjtHsRlVKqrQjm3XgMkGmM2WmMKQfeBC6vUuZy4GXv4/nAeSIijbpq9kp4+RLYs9wuyOlxNep0Simlmk4wg04vINvveY73WLVljDEu4DAQX/VEIjJLRFaLyOq8vLzar7p7GbjKT3yulFKqVWgT/U7GmGeNMaONMaO7du1ae+HKbQccOtFTKaVamWBmr+0Fevs9T/Ieq65MjoiEAZ2Ag426qk70VEqpViuYQWcVMEhE+mGDyzXA9CplFgI3AN8BVwJfGGNMo6+sEz2VUqpVClrQMca4ROQXwCeAE3jRGLNJROYAq40xC4EXgFdFJBM4hA1MSimlQlRQJ4caYxYBi6ocm+33uBS4Kph1UEop1Xq0iUQCpZRSoUGDjlJKqWYjTTFu35xEJA/ICqBoApAf5Oq0Ru3xc7fHzwzN/7n7GmPqmLOgVO3aXNAJlIisNsaMbul6NLf2+Lnb42eG9vu5Vdum3WtKKaWajQYdpZRSzSaUg86zLV2BFtIeP3d7/MzQfj+3asNCdkxHKaVU6xPKLR2llFKtTEgGnbp2LA0VIvKiiBwQkY1+x+JEZLGI/OD92aUl69jURKS3iCwRkc0isklEfuU9HrKfW0SiRGSliKz3fuY/eI/38+64m+ndgTeipeuqVF1CLugEuGNpqJgLTKly7AHgc2PMIOBz7/NQ4gJ+bYxJBcYBd3r/fkP5c5cB5xpjRgKnAlNEZBx2p92/enfeLcDuxKtUqxZyQYfAdiwNCcaYpdiFUv3578b6MvAfzVqpIDPG7DPGrPE+LgK2YDcDDNnPbayj3qfh3j8GOBe74y6E2GdWoSsUg04gO5aGsu7GmH3ex7lA95asTDCJSDJwGrCCEP/cIuIUkXXAAWAxsAMo9O64C+3v37lqo0Ix6Cgv795EIZmeKCIdgXeAu40xR/xfC8XPbYxxG2NOxW6GOAYY0sJVUqpBQjHoBLJjaSjbLyI9ALw/D7RwfZqciIRjA87rxph/ew+H/OcGMMYUAkuA8UBn74670P7+nas2KhSDTuWOpd5snmuwO5S2F77dWPH+fK8F69LkRESwm/9tMcY87vdSyH5uEekqIp29j6OBydixrCXYHXchxD6zCl0hOTlURC4G/sbxHUsfaeEqBYWIvAGcjV1teD/wMLAAeAvog12N+2pjTNVkgzZLRM4ElgEbAI/38H9hx3VC8nOLyAhsooAT+0XxLWPMHBHpj02UiQPWAtcZY8parqZK1S0kg45SSqnWKRS715RSSrVSGnSUUko1Gw06Simlmo0GHaWUUs1Gg45SSqlmo0FH1ZuIuEVknd+fJltcU0SS/VfNVkqFlrC6iyh1kmPeJVmUUqpetKWjmoyI7BaRP4vIBu/+LwO9x5NF5AsR+V5EPheRPt7j3UXkXe8+MetFZIL3VE4Rec67d8yn3ln4SqkQoEFHNUR0le61n/q9dtgYMxx4ErsqBMD/AS8bY0YArwNPeI8/AXzl3SdmFLDJe3wQ8JQxZhhQCEwL8udRSjUTXZFA1ZuIHDXGdKzm+G7sZmM7vYty5hpj4kUkH+hhjKnwHt9njEkQkTwgyX/pFu92BYu9m7EhIvcD4caY/w7+J1NKBZu2dFRTMzU8rg//9cPc6NijUiFDg45qaj/1+/md9/G32NW+AWZgF+wEu630HVC5SVmn5qqkUqpl6DdI1RDR3l0sfT42xvjSpruIyPfY1sq13mN3AS+JyG+APOBG7/FfAc+KyM3YFs0dwD6UUiFLx3RUk/GO6Yw2xuS3dF2UUq2Tdq8ppZRqNtrSUUop1Wy0paOUUqrZaNBRSinVbDToKKWUajYadJRSSjUbDTpKKaWajQYdpZRSzeb/A0WgTZwAVcu6AAAAAElFTkSuQmCC\n", + "text/plain": [ + "
" + ] + }, + "metadata": { + "tags": [], + "needs_background": "light" + } + } + ] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "rYbGePkkCFUu" + }, + "source": [ + "### Plot validation loss and accuracy curves (data augmented model)" + ] + }, + { + "cell_type": "code", + "metadata": { + "colab": { + "base_uri": "https://localhost:8080/", + "height": 262 + }, + "id": "K0vFuwg_R1_0", + "outputId": "fc416672-3507-4a87-a43c-6ee86dcb3238" + }, + "source": [ + "import pandas as pd\n", + "import matplotlib as mpl\n", + "import matplotlib.pyplot as plt\n", + "\n", + "df = pd.read_csv('my_vgg16_aug.csv')\n", + "print(df[df.epoch == 0])\n", + "plt.figure(figsize=(4,3))\n", + "ax = plt.gca()\n", + "lines = []\n", + "FEATURES = ['acc', 'val_acc']\n", + "for feature in FEATURES:\n", + " lines.append(ax.plot(df[\"epoch\"],df[feature], marker='.')[0])\n", + "plt.xlabel(\"Epoch\")\n", + "plt.ylabel(\"Accuracy [%]\")\n", + "lgd = plt.legend(lines, [\"Training accuracy\", \"Validation accuracy\"], \n", + " loc=\"best\", bbox_to_anchor=(1,1))\n", + "plt.show()" + ], + "execution_count": 15, + "outputs": [ + { + "output_type": "stream", + "text": [ + " epoch acc loss val_acc val_loss\n", + "0 0 0.478 5.93678 0.483333 0.955677\n" + ], + "name": "stdout" + }, + { + "output_type": "display_data", + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAAZ0AAADQCAYAAADcbrykAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4yLjIsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy+WH4yJAAAgAElEQVR4nO3dd3hUZdr48e89k04CpNEhoUOoGkSKCGIv6Cp2FHXFtrt2d9XVxV18fdd9XV3XRdf2E7AhgoqI7CoqIL2E3oWQEEoIKYSEkDIzz++PM0kmIWUCmUwS7s915Zo5Z065Q/Tc83QxxqCUUko1BJu/A1BKKXX20KSjlFKqwWjSUUop1WA06SillGowmnSUUko1GE06SimlGkyAvwOoq5iYGBMfH+/vMJQ66yQlJWUaY2L9HYdq2ppc0omPj2fdunX+DkOps46IpPo7BtX0afWaUkqpBqNJRymlVIPRpKOUUqrBNLk2HaVUHaStgZSlED8KOg+t98snJSW1CQgIeB/oj36JVeACtjocjkmJiYkZVR2gSUep5iptDUy7ClxOCAiGu+admnjOMCkFBAS8365du76xsbE5NptNZw8+y7lcLjl69GhCenr6+8C1VR2jSUednfYthbTV0PVCn5QAGkxNSWPVv8FVYr13FFnHeR6TtgZmXAOOYggIqTop1a6/JhxVymazmdjY2Nz09PT+1R2jSUedffb+BB/dABgICD29h23aGti1ADoPg26jrYf2gbU+rco6RfIS+PgGMC6wVyrJZO2FXf8FsVmf44Ki/PJznQ5Y/FcrGQE4i09NSt6xacJRntz/PVRb1apJR519fnwRcD8nnVWUAGqTtgamXQkuR/k+sVsPd5FTE4DnefWZlJb/szwGz9/DUQxf3AsBQXDDu5CxDX5ZCMteg5JCME7Y+yNk7bGSEgL2ICuuJiY9Pd0+ZsyY3gCZmZmBNpvNREVFOQA2bty4IyQkpNqE+PPPP4d98MEH0dOnT0+r6R7nnHNOnw0bNuys38jPXpp01Nll389waL07STitfXEX1O0aK/7lkXAEul8EjkJIXQHGVF1q8ExUtgC44HFo0QZy9lkP+26jIX0rpC7zLimVnISD660kZ4yV8Gzu/51//Asc2gC3fAx9x0HCOOt+H10Pq98qv8bYP1nViw1ZOqtn7dq1c+7cuXM7wBNPPNEhPDzcOWXKlCOln5eUlBAYGFjluRdeeGHBhRdeWFDbPZpiwnE4HAQENM7He+OMSilfKC6AeY9AZFcY9wasegt2/wcKj516bHWlkj0/ws5vK5YQxjxrfTbtKqsNxRZwaqlh/YflicrlgJ9fKf9s1VsVjxUb9L8RorpDyQkrcVROCOs/gsIcuOpVyDsEO76BH/5sxb1zvnVO33HlxwcEQ7eLrMSIsZKuiHXdBk42y/Zktli+JzNiZI+YvAt6xJyo7+uPHz8+Pjg42LV169awoUOH5k+YMCH78ccf71JUVGQLCQlxTZ8+fd+gQYOK5s+fH/Hqq6+2XbRo0Z4nnniiQ1paWlBqamrwoUOHgh588MEjzz//fAZAWFjYOQUFBRvmz58fMWXKlA5RUVElu3btCh0wYEDB3Llz99lsNmbNmtXqmWee6RQWFuY677zz8lNTU4MXLVq0xzOuXbt2Bd1+++1dT548aQP45z//uf/SSy89AfDcc8+1mz17dpSIcPHFF+e+9dZbB7du3Rp8//33x2VlZQXY7XYze/bs5H379gWVxgwwceLELkOGDDnxyCOPZHXs2HHAtddem71kyZKWjz32WHpeXp592rRpsSUlJRIfH180Z86cfREREa60tLSAX//613H79+8PBpg6dWrqt99+2yoqKsoxefLkDICHH364Y5s2bUr+9Kc/VdkD7Uxo0lFnj8V/tUoWd31jfcOPGw5vDrUe1j0uAZvdOq5CA3sQTPzGekBv/hw2fAxt+sJlL1qlCc+kdMeXMPNWiO1V8UHudFgJDLESij0Q+lwNW7/C6mEqENUNspMBd6lly2zKqgBXvwN3zy+/pqPYqlrrPAzOu9eK7YIn4MNfWQkHrOq0tDUV4+g2Gpa+apXEfFCd9vs5mzrvTs8Lq+mYE0UO297ME2HGwNtL9tI9pkVBi+AAV3XH92oXUfDKjYNqrP6qyuHDh4PWr1+/MyAggOzsbNvatWt3BgYGMnfu3Ig//OEPnb777
[... base64-encoded PNG image data omitted: training vs. validation accuracy curves for the data-augmented model ...]\n", + "text/plain": [
" + ] + }, + "metadata": { + "tags": [], + "needs_background": "light" + } + } + ] + } + ] +} \ No newline at end of file diff --git a/as2_Maggioni_Claudio/src/Assignment 2.pdf b/as2_Maggioni_Claudio/src/Assignment 2.pdf new file mode 100644 index 0000000..23932da Binary files /dev/null and b/as2_Maggioni_Claudio/src/Assignment 2.pdf differ diff --git a/as2_Maggioni_Claudio/src/my_civar10.csv b/as2_Maggioni_Claudio/src/my_civar10.csv new file mode 100644 index 0000000..1ff37dd --- /dev/null +++ b/as2_Maggioni_Claudio/src/my_civar10.csv @@ -0,0 +1,65 @@ +epoch,accuracy,loss,val_accuracy,val_loss +0,0.5835000276565552,0.9027522206306458,0.6930000185966492,0.7313074469566345 +1,0.6981666684150696,0.7197360992431641,0.7573333382606506,0.6185170412063599 +2,0.7268333435058594,0.649580717086792,0.7166666388511658,0.6847538352012634 +3,0.7490000128746033,0.6051502823829651,0.6726666688919067,0.8005362749099731 +4,0.7670000195503235,0.5711988806724548,0.7213333249092102,0.6812124252319336 +5,0.7771666646003723,0.5591645836830139,0.7956666946411133,0.5224239230155945 +6,0.7922499775886536,0.5163288712501526,0.7746666669845581,0.560214638710022 +7,0.796750009059906,0.5033565163612366,0.7696666717529297,0.5671213269233704 +8,0.8021666407585144,0.4886069595813751,0.7850000262260437,0.5377761721611023 +9,0.8159166574478149,0.46416983008384705,0.7973333597183228,0.5045124292373657 +10,0.8112499713897705,0.4600376784801483,0.7876666784286499,0.5360596776008606 +11,0.8222500085830688,0.44709986448287964,0.8173333406448364,0.46527454257011414 +12,0.8263333439826965,0.4383203983306885,0.8193333148956299,0.4525858163833618 +13,0.8272500038146973,0.42477884888648987,0.8209999799728394,0.46459323167800903 +14,0.828166663646698,0.4239223301410675,0.8133333325386047,0.4657036066055298 +15,0.8364999890327454,0.4106995165348053,0.8316666483879089,0.43970581889152527 +16,0.8379999995231628,0.40464073419570923,0.8309999704360962,0.43921899795532227 +17,0.843500018119812,0.3954917788505554,0.7919999957084656,0.5195531845092773 +18,0.8410833477973938,0.3955632746219635,0.8273333311080933,0.43989425897598267 +19,0.8451666831970215,0.3902503252029419,0.8213333487510681,0.4508914649486542 +20,0.8479999899864197,0.38045644760131836,0.8373333215713501,0.4345586895942688 +21,0.8519166707992554,0.3750855624675751,0.8363333344459534,0.42357707023620605 +22,0.8511666655540466,0.3793516457080841,0.8386666774749756,0.4240257143974304 +23,0.8541666865348816,0.36603105068206787,0.8256666660308838,0.44586217403411865 +24,0.8585000038146973,0.36231645941734314,0.8309999704360962,0.445521742105484 +25,0.856166660785675,0.36122143268585205,0.8373333215713501,0.4368632435798645 +26,0.8581666946411133,0.35858675837516785,0.8403333425521851,0.4214838743209839 +27,0.859250009059906,0.3539867699146271,0.8429999947547913,0.427225261926651 +28,0.8620833158493042,0.35064488649368286,0.8456666469573975,0.4140232801437378 +29,0.8643333315849304,0.34821170568466187,0.8349999785423279,0.41891446709632874 +30,0.8681666851043701,0.3375576138496399,0.8396666646003723,0.4178491234779358 +31,0.8667500019073486,0.3405560851097107,0.8173333406448364,0.4889259934425354 +32,0.8673333525657654,0.3408810496330261,0.8376666903495789,0.4319263994693756 +33,0.8669999837875366,0.3369854688644409,0.843999981880188,0.4151623845100403 +34,0.8674166798591614,0.333772212266922,0.8483333587646484,0.4048003852367401 +35,0.8738333582878113,0.3248937129974365,0.8296666741371155,0.44129908084869385 
+36,0.8731666803359985,0.32493579387664795,0.8226666450500488,0.4935603141784668 +37,0.8696666955947876,0.3261963427066803,0.8453333377838135,0.41957756876945496 +38,0.8737499713897705,0.32006925344467163,0.8403333425521851,0.4206053912639618 +39,0.874750018119812,0.31747952103614807,0.8383333086967468,0.44560351967811584 +40,0.8740000128746033,0.31876373291015625,0.8349999785423279,0.42805150151252747 +41,0.878250002861023,0.3102569580078125,0.8486666679382324,0.4195503890514374 +42,0.8784166574478149,0.31086069345474243,0.8446666598320007,0.415301114320755 +43,0.8816666603088379,0.3063320219516754,0.8423333168029785,0.44579166173934937 +44,0.8821666836738586,0.3044925630092621,0.8429999947547913,0.4366433620452881 +45,0.8818333148956299,0.3025430738925934,0.8339999914169312,0.4812167286872864 +46,0.8836666941642761,0.3021080493927002,0.8456666469573975,0.4139736592769623 +47,0.8850833177566528,0.295707106590271,0.8486666679382324,0.4226844906806946 +48,0.8834999799728394,0.2955981492996216,0.8413333296775818,0.45863327383995056 +49,0.8835833072662354,0.29366767406463623,0.8529999852180481,0.4104471504688263 +50,0.8878333568572998,0.28793343901634216,0.7730000019073486,0.6682868599891663 +51,0.8879166841506958,0.28880414366722107,0.8410000205039978,0.42320355772972107 +52,0.8888333439826965,0.28470084071159363,0.8463333249092102,0.4198496639728546 +53,0.8919166922569275,0.28202226758003235,0.8536666631698608,0.41628679633140564 +54,0.8900833129882812,0.2791743874549866,0.7583333253860474,0.7379936575889587 +55,0.89083331823349,0.28209903836250305,0.8263333439826965,0.5025476813316345 +56,0.8914166688919067,0.2790076732635498,0.8493333458900452,0.41152042150497437 +57,0.8944166898727417,0.27665185928344727,0.8523333072662354,0.4220462143421173 +58,0.8953333497047424,0.27209198474884033,0.843999981880188,0.4234614670276642 +59,0.8932499885559082,0.2731625735759735,0.8393333554267883,0.4399682283401489 +60,0.893833339214325,0.27378541231155396,0.8426666855812073,0.4259805977344513 +61,0.8951666951179504,0.270453542470932,0.8266666531562805,0.46296101808547974 +62,0.893750011920929,0.26978862285614014,0.8383333086967468,0.4388676881790161 +63,0.8979166746139526,0.2648368179798126,0.8486666679382324,0.4405277371406555 diff --git a/as2_Maggioni_Claudio/src/my_vgg16_aug.csv b/as2_Maggioni_Claudio/src/my_vgg16_aug.csv new file mode 100644 index 0000000..e69de29 diff --git a/as2_Maggioni_Claudio/src/my_vgg16_noaug.csv b/as2_Maggioni_Claudio/src/my_vgg16_noaug.csv new file mode 100644 index 0000000..f101b0c --- /dev/null +++ b/as2_Maggioni_Claudio/src/my_vgg16_noaug.csv @@ -0,0 +1,35 @@ +epoch,acc,loss,val_acc,val_loss +0,0.5649999976158142,0.955588161945343,0.0,1.6278291940689087 +1,0.6608333587646484,0.7884675860404968,0.07000000029802322,1.5708225965499878 +2,0.753333330154419,0.6204149127006531,0.06333333253860474,1.3915884494781494 +3,0.7858333587646484,0.5166330337524414,0.17000000178813934,1.3135371208190918 +4,0.8041666746139526,0.4865335524082184,0.10333333164453506,1.3810070753097534 +5,0.8191666603088379,0.4453645348548889,0.23000000417232513,1.46454918384552 +6,0.8583333492279053,0.37045857310295105,0.20666666328907013,1.4838533401489258 +7,0.8816666603088379,0.3096787631511688,0.38999998569488525,1.2644957304000854 +8,0.9008333086967468,0.2890835702419281,0.5233333110809326,1.0314409732818604 +9,0.9158333539962769,0.2402680516242981,0.4300000071525574,1.3342552185058594 +10,0.92166668176651,0.22173365950584412,0.5299999713897705,1.3371758460998535 
+11,0.9208333492279053,0.20474030077457428,0.30666667222976685,1.889732837677002 +12,0.9291666746139526,0.1892756074666977,0.4833333194255829,1.3816964626312256 +13,0.9483333230018616,0.14501522481441498,0.44999998807907104,1.5450505018234253 +14,0.9624999761581421,0.1241658627986908,0.4099999964237213,1.6707463264465332 +15,0.95333331823349,0.1430375725030899,0.5566666722297668,1.2758015394210815 +16,0.9483333230018616,0.1334572732448578,0.46000000834465027,1.5948697328567505 +17,0.9574999809265137,0.12817558646202087,0.43666666746139526,1.7551212310791016 +18,0.965833306312561,0.09678161144256592,0.4933333396911621,1.5502151250839233 +19,0.9683333039283752,0.09066082537174225,0.5133333206176758,1.7609212398529053 +20,0.9758333563804626,0.08929727226495743,0.6399999856948853,1.2165297269821167 +21,0.9800000190734863,0.07535427808761597,0.4566666781902313,2.07904052734375 +22,0.9758333563804626,0.08100691437721252,0.4266666769981384,2.1591386795043945 +23,0.9725000262260437,0.08202476799488068,0.5866666436195374,1.6197659969329834 +24,0.9775000214576721,0.06814886629581451,0.5133333206176758,1.9039390087127686 +25,0.9750000238418579,0.06604871153831482,0.41333332657814026,2.4949522018432617 +26,0.9783333539962769,0.06543738394975662,0.503333330154419,2.129647731781006 +27,0.9758333563804626,0.07011312991380692,0.4099999964237213,2.689142942428589 +28,0.9758333563804626,0.08310797810554504,0.5233333110809326,1.7345548868179321 +29,0.9783333539962769,0.061532020568847656,0.653333306312561,1.3687851428985596 +30,0.9766666889190674,0.05942004546523094,0.3733333349227905,2.6938610076904297 +31,0.9800000190734863,0.07085715979337692,0.46666666865348816,2.068704843521118 +32,0.987500011920929,0.053222622722387314,0.54666668176651,1.8517098426818848 +33,0.9866666793823242,0.04822404310107231,0.476666659116745,2.0873990058898926 diff --git a/as2_Maggioni_Claudio/src/t_test.py b/as2_Maggioni_Claudio/src/t_test.py new file mode 100644 index 0000000..395853a --- /dev/null +++ b/as2_Maggioni_Claudio/src/t_test.py @@ -0,0 +1,22 @@ +import joblib +import numpy as np +from keras import models +import scipy.stats + +# Import the accuracy of both models +e_a = 0.7733333110809326 # without augmentation +e_b = 0.8999999761581421 # with data augmentation + +# # of data points in both test sets +L = 300 + +# Compute classification variance for both models +s_a = e_a * (1 - e_a) +s_b = e_b * (1 - e_b) + +# Compute Student's T-test +T = (e_a - e_b) / np.sqrt((s_a / L) + (s_b / L)) +print("T test:\t\t\t %1.06f" % T) +print("P-value:\t\t %1.06f" % (scipy.stats.t.sf(abs(T), df=L) * 2)) +print("No aug variance:\t %1.06f" % s_a) +print("With aug variance:\t %1.06f" % s_b) diff --git a/as2_Maggioni_Claudio/t1_plot.png b/as2_Maggioni_Claudio/t1_plot.png new file mode 100644 index 0000000..a02bab6 Binary files /dev/null and b/as2_Maggioni_Claudio/t1_plot.png differ diff --git a/as2_Maggioni_Claudio/t2_aug.png b/as2_Maggioni_Claudio/t2_aug.png new file mode 100644 index 0000000..84f2b99 Binary files /dev/null and b/as2_Maggioni_Claudio/t2_aug.png differ diff --git a/as2_Maggioni_Claudio/t2_noaug.png b/as2_Maggioni_Claudio/t2_noaug.png new file mode 100644 index 0000000..04e5d51 Binary files /dev/null and b/as2_Maggioni_Claudio/t2_noaug.png differ
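For reference, the `src/t_test.py` script above compares the test accuracy of the model trained without data augmentation (`e_a`) against the one trained with data augmentation (`e_b`), each measured on a test set of `L = 300` data points. Below is a sketch of the quantity the script computes, written with the same variable names it uses:

```latex
% Statistic computed in t_test.py: each accuracy e is treated as the mean of a
% Bernoulli variable, so its variance estimate is s = e * (1 - e). The p-value is
% two-sided, from a Student's t distribution with L degrees of freedom
% (scipy.stats.t.sf(abs(T), df=L) * 2 in the script).
\[
  T = \frac{e_a - e_b}{\sqrt{\frac{s_a}{L} + \frac{s_b}{L}}},
  \qquad s_a = e_a (1 - e_a), \qquad s_b = e_b (1 - e_b),
  \qquad p = 2 \Pr\left( t_L > |T| \right)
\]
```

Running the script simply prints the value of T, the two-sided p-value, and the two variance estimates for the accuracies hard-coded at the top of the file.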