{
"cells": [
{
"cell_type": "markdown",
"metadata": {
"id": "ND3MyDtqk4hX"
},
"source": [
"Updated 19/Nov/2021 by Yoshihisa Nitta \n"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "fQVCsdk0mByf"
},
"source": [
"# AutoEncoder Training for MNIST dataset with Tensorflow 2 on Google Colab\n",
"## MNISTデータセットに対して AutoEncoder を Google Colab 上で Tensorflow 2 で訓練する"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "hHAN1RoasvZb"
},
"outputs": [],
"source": [
"#! pip install tensorflow==2.7.0"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"executionInfo": {
"elapsed": 2265,
"status": "ok",
"timestamp": 1637562083627,
"user": {
"displayName": "Yoshihisa Nitta",
"photoUrl": "https://lh3.googleusercontent.com/a-/AOh14GgJLeg9AmjfexROvC3P0wzJdd5AOGY_VOu-nxnh=s64",
"userId": "15888006800030996813"
},
"user_tz": -540
},
"id": "MKpI5MGclKv9",
"outputId": "98d2c9ca-77aa-480e-c785-2cc212eeb3f4"
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"2.7.0\n"
]
}
],
"source": [
"%tensorflow_version 2.x\n",
"\n",
"import tensorflow as tf\n",
"print(tf.__version__)"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "d9Eo7tUQVTGj"
},
"source": [
"# AutoEncoder\n",
"\n",
"The guiding problem: how can we perform efficient inference and learning with a probabilistic model when continuous latent variables have intractable posterior distributions and the dataset is large?\n",
"The paper introduces a stochastic variational inference and learning algorithm that applies under mild differentiability conditions.\n",
"Its contributions are twofold:\n",
"reparameterizing the variational lower bound yields a lower-bound estimator that can be trained with plain SGD, and\n",
"fitting an approximate inference model with the proposed estimator makes posterior inference efficient for i.i.d. datasets with continuous latent variables per data point.\n",
"\n",
"「難しい事後分布を持つ連続潜在変数が存在し、さらにデータセットが大きい場合に、有効確率モデルを用いて効果的な推論と学習をどうやって行えばよいのだろうか」という問題がある。\n",
"この論文では、ある穏やかな(mild)微分可能条件下では適用できる、確率的変分推論と学習アルゴリズムを紹介する。\n",
"貢献する点は2点である。\n",
"「変分下限の再パラメータ化により、普通のSGDを用いてtraining可能な下限推定量が得られる」\n",
"「提案する下限推定器を用いて近似推論モデルを学習することによって、データポイント毎の連続潜在変数を持つ i.i.d データセットに対して事後推定が効率的に実行できる。」\n",
"\n",
"## Download files from Google Drive or nw.tsuda.ac.jp\n",
"\n",
"Basically, use gdown to download from Google Drive. Download from nw.tsuda.ac.jp only if the specifications of Google Drive change and you cannot download from Google Drive.\n",
"\n",
"## Google Drive または nw.tsuda.ac.jp からファイルをダウンロードする\n",
"\n",
"基本的に Google Drive から gdown してください。 Google Drive の仕様が変わってダウンロードができない場合にのみ、nw.tsuda.ac.jp からダウンロードしてください。"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"executionInfo": {
"elapsed": 2587,
"status": "ok",
"timestamp": 1637562115282,
"user": {
"displayName": "Yoshihisa Nitta",
"photoUrl": "https://lh3.googleusercontent.com/a-/AOh14GgJLeg9AmjfexROvC3P0wzJdd5AOGY_VOu-nxnh=s64",
"userId": "15888006800030996813"
},
"user_tz": -540
},
"id": "E_pTWqexKLEK",
"outputId": "9928a675-7968-46a3-a542-551bc363435a"
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Downloading...\n",
"From: https://drive.google.com/uc?id=1ZDgWE7wmVwG_ZuQVUjuh_XHeIO-7Yn63\n",
"To: /content/nw/AutoEncoder.py\n",
"\r",
" 0% 0.00/13.9k [00:00, ?B/s]\r",
"100% 13.9k/13.9k [00:00<00:00, 21.5MB/s]\n"
]
}
],
"source": [
"# Download source file\n",
"nw_path = './nw'\n",
"! rm -rf {nw_path}\n",
"! mkdir -p {nw_path}\n",
"\n",
"if True: # from Google Drive\n",
" url_model = 'https://drive.google.com/uc?id=1ZDgWE7wmVwG_ZuQVUjuh_XHeIO-7Yn63'\n",
" ! (cd {nw_path}; gdown {url_model})\n",
"else: # from nw.tsuda.ac.jp\n",
" URL_NW = 'https://nw.tsuda.ac.jp/lec/GoogleColab/pub'\n",
" url_model = f'{URL_NW}/models/AutoEncoder.py'\n",
" ! wget -nd {url_model} -P {nw_path} # download to './nw/AutoEncoder.py'"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"executionInfo": {
"elapsed": 413,
"status": "ok",
"timestamp": 1637562115693,
"user": {
"displayName": "Yoshihisa Nitta",
"photoUrl": "https://lh3.googleusercontent.com/a-/AOh14GgJLeg9AmjfexROvC3P0wzJdd5AOGY_VOu-nxnh=s64",
"userId": "15888006800030996813"
},
"user_tz": -540
},
"id": "r4iGaqqgbx9s",
"outputId": "dfbe9b22-bc54-4e98-cc44-21e6b4114169"
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"import tensorflow as tf\n",
"import numpy as np\n",
"import matplotlib.pyplot as plt\n",
"\n",
"import os\n",
"import pickle\n",
"import datetime\n",
"\n",
"class AutoEncoder():\n",
" def __init__(self, \n",
" input_dim,\n",
" encoder_conv_filters,\n",
" encoder_conv_kernel_size,\n",
" encoder_conv_strides,\n",
" decoder_conv_t_filters,\n",
" decoder_conv_t_kernel_size,\n",
" decoder_conv_t_strides,\n",
" z_dim,\n",
" use_batch_norm = False,\n",
" use_dropout = False,\n",
" epoch = 0\n",
" ):\n",
" self.name = 'autoencoder'\n",
" self.input_dim = input_dim\n",
" self.encoder_conv_filters = encoder_conv_filters\n",
" self.encoder_conv_kernel_size = encoder_conv_kernel_size\n",
" self.encoder_conv_strides = encoder_conv_strides\n",
" self.decoder_conv_t_filters = decoder_conv_t_filters\n",
" self.decoder_conv_t_kernel_size = decoder_conv_t_kernel_size\n",
" self.decoder_conv_t_strides = decoder_conv_t_strides\n",
" self.z_dim = z_dim\n",
" \n",
" self.use_batch_norm = use_batch_norm\n",
" self.use_dropout = use_dropout\n",
"\n",
" self.epoch = epoch\n",
" \n",
" self.n_layers_encoder = len(encoder_conv_filters)\n",
" self.n_layers_decoder = len(decoder_conv_t_filters)\n",
" \n",
" self._build()\n",
" \n",
"\n",
" def _build(self):\n",
" ### THE ENCODER\n",
" encoder_input = tf.keras.layers.Input(shape=self.input_dim, name='encoder_input')\n",
" x = encoder_input\n",
" \n",
" for i in range(self.n_layers_encoder):\n",
" x = tf.keras.layers.Conv2D(\n",
" filters = self.encoder_conv_filters[i],\n",
" kernel_size = self.encoder_conv_kernel_size[i],\n",
" strides = self.encoder_conv_strides[i],\n",
" padding = 'same',\n",
" name = 'encoder_conv_' + str(i)\n",
" )(x)\n",
" x = tf.keras.layers.LeakyReLU()(x)\n",
" if self.use_batch_norm:\n",
" x = tf.keras.layers.BatchNormalization()(x)\n",
" if self.use_dropout:\n",
" x = tf.keras.layers.Dropout(rate = 0.25)(x)\n",
" \n",
" shape_before_flattening = tf.keras.backend.int_shape(x)[1:] # shape for 1 data\n",
" \n",
" x = tf.keras.layers.Flatten()(x)\n",
" encoder_output = tf.keras.layers.Dense(self.z_dim, name='encoder_output')(x)\n",
" \n",
" self.encoder = tf.keras.models.Model(encoder_input, encoder_output)\n",
" \n",
" ### THE DECODER\n",
" decoder_input = tf.keras.layers.Input(shape=(self.z_dim,), name='decoder_input')\n",
" x = tf.keras.layers.Dense(np.prod(shape_before_flattening))(decoder_input)\n",
" x = tf.keras.layers.Reshape(shape_before_flattening)(x)\n",
" \n",
" for i in range(self.n_layers_decoder):\n",
" x = tf.keras.layers.Conv2DTranspose(\n",
" filters = self.decoder_conv_t_filters[i],\n",
" kernel_size = self.decoder_conv_t_kernel_size[i],\n",
" strides = self.decoder_conv_t_strides[i],\n",
" padding = 'same',\n",
" name = 'decoder_conv_t_' + str(i)\n",
" )(x)\n",
" \n",
" if i < self.n_layers_decoder - 1:\n",
" x = tf.keras.layers.LeakyReLU()(x)\n",
" if self.use_batch_norm:\n",
" x = tf.keras.layers.BatchNormalization()(x)\n",
" if self.use_dropout:\n",
" x = tf.keras.layers.Dropout(rate=0.25)(x)\n",
" else:\n",
" x = tf.keras.layers.Activation('sigmoid')(x)\n",
" \n",
" decoder_output = x\n",
" self.decoder = tf.keras.models.Model(decoder_input, decoder_output)\n",
" \n",
" ### THE FULL AUTOENCODER\n",
" model_input = encoder_input\n",
" model_output = self.decoder(encoder_output)\n",
" \n",
" self.model = tf.keras.models.Model(model_input, model_output)\n",
"\n",
"\n",
" def save(self, folder):\n",
" self.save_params(os.path.join(folder, 'params.pkl'))\n",
" self.save_weights(os.path.join(folder, 'weights/weights.h5'))\n",
"\n",
"\n",
" @staticmethod\n",
" def load(folder, epoch=None): # AutoEncoder.load(folder)\n",
" params = AutoEncoder.load_params(os.path.join(folder, 'params.pkl'))\n",
" AE = AutoEncoder(*params)\n",
" if epoch is None:\n",
" AE.model.load_weights(os.path.join(folder, 'weights/weights.h5'))\n",
" else:\n",
" AE.model.load_weights(os.path.join(folder, f'weights/weights_{epoch-1}.h5'))\n",
" AE.epoch = epoch\n",
"\n",
" return AE\n",
"\n",
"\n",
" def save_params(self, filepath):\n",
" dpath, fname = os.path.split(filepath)\n",
" if dpath != '' and not os.path.exists(dpath):\n",
" os.makedirs(dpath)\n",
" with open(filepath, 'wb') as f:\n",
" pickle.dump([\n",
" self.input_dim,\n",
" self.encoder_conv_filters,\n",
" self.encoder_conv_kernel_size,\n",
" self.encoder_conv_strides,\n",
" self.decoder_conv_t_filters,\n",
" self.decoder_conv_t_kernel_size,\n",
" self.decoder_conv_t_strides,\n",
" self.z_dim,\n",
" self.use_batch_norm,\n",
" self.use_dropout,\n",
" self.epoch\n",
" ], f)\n",
"\n",
"\n",
" @staticmethod\n",
" def load_params(filepath):\n",
" with open(filepath, 'rb') as f:\n",
" params = pickle.load(f)\n",
" return params\n",
"\n",
"\n",
" def save_weights(self, filepath):\n",
" dpath, fname = os.path.split(filepath)\n",
" if dpath != '' and not os.path.exists(dpath):\n",
" os.makedirs(dpath)\n",
" self.model.save_weights(filepath)\n",
" \n",
" \n",
" def load_weights(self, filepath):\n",
" self.model.load_weights(filepath)\n",
"\n",
"\n",
" def save_images(self, imgs, filepath):\n",
" z_points = self.encoder.predict(imgs)\n",
" reconst_imgs = self.decoder.predict(z_points)\n",
" txts = [ f'{p[0]:.3f}, {p[1]:.3f}' for p in z_points ]\n",
" AutoEncoder.showImages(imgs, reconst_imgs, txts, 1.4, 1.4, 0.5, filepath)\n",
" \n",
"\n",
" @staticmethod\n",
" def r_loss(y_true, y_pred):\n",
" return tf.keras.backend.mean(tf.keras.backend.square(y_true - y_pred), axis=[1,2,3])\n",
"\n",
"\n",
" def compile(self, learning_rate):\n",
" self.learning_rate = learning_rate\n",
" optimizer = tf.keras.optimizers.Adam(learning_rate=learning_rate)\n",
" self.model.compile(optimizer=optimizer, loss = AutoEncoder.r_loss)\n",
"\n",
" \n",
" def train_with_fit(self,\n",
" x_train,\n",
" y_train,\n",
" batch_size,\n",
" epochs,\n",
" run_folder='run/',\n",
" validation_data=None\n",
" ):\n",
" history= self.model.fit(\n",
" x_train,\n",
" y_train,\n",
" batch_size = batch_size,\n",
" shuffle = True,\n",
" initial_epoch = self.epoch,\n",
" epochs = epochs,\n",
" validation_data = validation_data\n",
" )\n",
" if self.epoch < epochs:\n",
" self.epoch = epochs\n",
"\n",
" if run_folder != None:\n",
" self.save(run_folder)\n",
" self.save_weights(os.path.join(run_folder,f'weights/weights_{self.epoch-1}.h5'))\n",
" #idxs = np.random.choice(len(x_train), 10)\n",
" #self.save_images(x_train[idxs], os.path.join(run_folder, f'images/image_{self.epoch-1}.png'))\n",
"\n",
" return history\n",
" \n",
" \n",
" def train(self,\n",
" x_train,\n",
" y_train,\n",
" batch_size = 32,\n",
" epochs = 10,\n",
" shuffle=False,\n",
" run_folder='run/',\n",
" optimizer=None,\n",
" save_epoch_interval=100,\n",
" validation_data = None\n",
" ):\n",
" start_time = datetime.datetime.now()\n",
" steps = x_train.shape[0] // batch_size\n",
"\n",
" losses = []\n",
" val_losses = []\n",
"\n",
" for epoch in range(self.epoch, epochs):\n",
" epoch_loss = 0\n",
" indices = tf.range(x_train.shape[0], dtype=tf.int32)\n",
" if shuffle:\n",
" indices = tf.random.shuffle(indices)\n",
" x_ = x_train[indices]\n",
" y_ = y_train[indices]\n",
" \n",
" for step in range(steps):\n",
" start = batch_size * step\n",
" end = start + batch_size\n",
"\n",
" with tf.GradientTape() as tape:\n",
" outputs = self.model(x_[start:end])\n",
" tmp_loss = AutoEncoder.r_loss(y_[start:end], outputs)\n",
"\n",
" grads = tape.gradient(tmp_loss, self.model.trainable_variables)\n",
" optimizer.apply_gradients(zip(grads, self.model.trainable_variables))\n",
"\n",
" epoch_loss = np.mean(tmp_loss)\n",
" losses.append(epoch_loss)\n",
"\n",
" val_str = ''\n",
" if validation_data != None:\n",
" x_val, y_val = validation_data\n",
" outputs_val = self.model(x_val)\n",
" val_loss = np.mean(AutoEncoder.r_loss(y_val, outputs_val))\n",
" val_str = f'val loss: {val_loss:.4f} '\n",
" val_losses.append(val_loss)\n",
"\n",
"\n",
" if (epoch+1) % save_epoch_interval == 0 and run_folder != None:\n",
" self.save(run_folder)\n",
" self.save_weights(os.path.join(run_folder,f'weights/weights_{self.epoch}.h5'))\n",
" #idxs = np.random.choice(len(x_train), 10)\n",
" #self.save_images(x_train[idxs], os.path.join(run_folder, f'images/image_{self.epoch}.png'))\n",
"\n",
" elapsed_time = datetime.datetime.now() - start_time\n",
" print(f'{epoch+1}/{epochs} {steps} loss: {epoch_loss:.4f} {val_str}{elapsed_time}')\n",
"\n",
" self.epoch += 1\n",
"\n",
" if run_folder != None:\n",
" self.save(run_folder)\n",
" self.save_weights(os.path.join(run_folder,f'weights/weights_{self.epoch-1}.h5'))\n",
" #idxs = np.random.choice(len(x_train), 10)\n",
" #self.save_images(x_train[idxs], os.path.join(run_folder, f'images/image_{self.epoch-1}.png'))\n",
"\n",
" return losses, val_losses\n",
"\n",
" @staticmethod\n",
" @tf.function\n",
" def compute_loss_and_grads(model,x,y):\n",
" with tf.GradientTape() as tape:\n",
" outputs = model(x)\n",
" tmp_loss = AutoEncoder.r_loss(y,outputs)\n",
" grads = tape.gradient(tmp_loss, model.trainable_variables)\n",
" return tmp_loss, grads\n",
"\n",
"\n",
" def train_tf(self,\n",
" x_train,\n",
" y_train,\n",
" batch_size = 32,\n",
" epochs = 10,\n",
" shuffle=False,\n",
" run_folder='run/',\n",
" optimizer=None,\n",
" save_epoch_interval=100,\n",
" validation_data = None\n",
" ):\n",
" start_time = datetime.datetime.now()\n",
" steps = x_train.shape[0] // batch_size\n",
"\n",
" losses = []\n",
" val_losses = []\n",
"\n",
" for epoch in range(self.epoch, epochs):\n",
" epoch_loss = 0\n",
" indices = tf.range(x_train.shape[0], dtype=tf.int32)\n",
" if shuffle:\n",
" indices = tf.random.shuffle(indices)\n",
" x_ = x_train[indices]\n",
" y_ = y_train[indices]\n",
"\n",
" step_losses = []\n",
" for step in range(steps):\n",
" start = batch_size * step\n",
" end = start + batch_size\n",
"\n",
" tmp_loss, grads = AutoEncoder.compute_loss_and_grads(self.model, x_[start:end], y_[start:end])\n",
" optimizer.apply_gradients(zip(grads, self.model.trainable_variables))\n",
"\n",
" step_losses.append(np.mean(tmp_loss))\n",
"\n",
" epoch_loss = np.mean(step_losses)\n",
" losses.append(epoch_loss)\n",
"\n",
" val_str = ''\n",
" if validation_data != None:\n",
" x_val, y_val = validation_data\n",
" outputs_val = self.model(x_val)\n",
" val_loss = np.mean(AutoEncoder.r_loss(y_val, outputs_val))\n",
" val_str = f'val loss: {val_loss:.4f} '\n",
" val_losses.append(val_loss)\n",
"\n",
"\n",
" if (epoch+1) % save_epoch_interval == 0 and run_folder != None:\n",
" self.save(run_folder)\n",
" self.save_weights(os.path.join(run_folder,f'weights/weights_{self.epoch}.h5'))\n",
" #idxs = np.random.choice(len(x_train), 10)\n",
" #self.save_images(x_train[idxs], os.path.join(run_folder, f'images/image_{self.epoch}.png'))\n",
"\n",
" elapsed_time = datetime.datetime.now() - start_time\n",
" print(f'{epoch+1}/{epochs} {steps} loss: {epoch_loss:.4f} {val_str}{elapsed_time}')\n",
"\n",
" self.epoch += 1\n",
"\n",
" if run_folder != None:\n",
" self.save(run_folder)\n",
" self.save_weights(os.path.join(run_folder,f'weights/weights_{self.epoch-1}.h5'))\n",
" #idxs = np.random.choice(len(x_train), 10)\n",
" #self.save_images(x_train[idxs], os.path.join(run_folder, f'images/image_{self.epoch-1}.png'))\n",
"\n",
" return losses, val_losses\n",
"\n",
"\n",
" @staticmethod\n",
" def showImages(imgs1, imgs2, txts, w, h, vskip=0.5, filepath=None):\n",
" n = len(imgs1)\n",
" fig, ax = plt.subplots(2, n, figsize=(w * n, (2+vskip) * h))\n",
" for i in range(n):\n",
" if n == 1:\n",
" axis = ax[0]\n",
" else:\n",
" axis = ax[0][i]\n",
" img = imgs1[i].squeeze()\n",
" axis.imshow(img, cmap='gray_r')\n",
" axis.axis('off')\n",
"\n",
" axis.text(0.5, -0.35, txts[i], fontsize=10, ha='center', transform=axis.transAxes)\n",
"\n",
" if n == 1:\n",
" axis = ax[1]\n",
" else:\n",
" axis = ax[1][i]\n",
" img2 = imgs2[i].squeeze()\n",
" axis.imshow(img2, cmap='gray_r')\n",
" axis.axis('off')\n",
"\n",
" if not filepath is None:\n",
" dpath, fname = os.path.split(filepath)\n",
" if dpath != '' and not os.path.exists(dpath):\n",
" os.makedirs(dpath)\n",
" fig.savefig(filepath, dpi=600)\n",
" plt.close()\n",
" else:\n",
" plt.show()\n",
"\n",
" @staticmethod\n",
" def plot_history(vals, labels):\n",
" colors = ['red', 'blue', 'green', 'orange', 'black']\n",
" n = len(vals)\n",
" fig, ax = plt.subplots(1, 1, figsize=(9,4))\n",
" for i in range(n):\n",
" ax.plot(vals[i], c=colors[i], label=labels[i])\n",
" ax.legend(loc='upper right')\n",
" ax.set_xlabel('epochs')\n",
" # ax[0].set_ylabel('loss')\n",
" \n",
" plt.show()\n"
]
}
],
"source": [
"!cat {nw_path}/AutoEncoder.py"
]
},
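{
"cell_type": "markdown",
"metadata": {},
"source": [
"The r_loss in the listing above is the reconstruction loss: a per-image mean squared error that averages squared pixel differences over height, width, and channels while keeping the batch axis. A minimal sketch of that computation (the tensors here are made-up stand-ins, not MNIST data):"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import tensorflow as tf\n",
"\n",
"# Per-image MSE: average over axes 1, 2, 3 (H, W, C), keep the batch axis.\n",
"y_true = tf.zeros((2, 28, 28, 1))\n",
"y_pred = tf.ones((2, 28, 28, 1)) * 0.5\n",
"loss = tf.keras.backend.mean(tf.keras.backend.square(y_true - y_pred), axis=[1, 2, 3])\n",
"print(loss.numpy())  # one loss value per image: [0.25 0.25]"
]
},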
{
"cell_type": "markdown",
"metadata": {
"id": "WUuOKXf9wIrn"
},
"source": [
"# Preparing the MNIST dataset\n",
"## MNIST データセットを用意する"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "ERKfpLv7UPvU"
},
"outputs": [],
"source": [
"import tensorflow as tf\n",
"import numpy as np"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"executionInfo": {
"elapsed": 259,
"status": "ok",
"timestamp": 1637562115945,
"user": {
"displayName": "Yoshihisa Nitta",
"photoUrl": "https://lh3.googleusercontent.com/a-/AOh14GgJLeg9AmjfexROvC3P0wzJdd5AOGY_VOu-nxnh=s64",
"userId": "15888006800030996813"
},
"user_tz": -540
},
"id": "2POVZ4obViuQ",
"outputId": "611f5642-e101-4a9f-d2c5-6509cdd66993"
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Downloading data from https://storage.googleapis.com/tensorflow/tf-keras-datasets/mnist.npz\n",
"11493376/11490434 [==============================] - 0s 0us/step\n",
"11501568/11490434 [==============================] - 0s 0us/step\n",
"(60000, 28, 28)\n",
"(60000,)\n",
"(10000, 28, 28)\n",
"(10000,)\n"
]
}
],
"source": [
"# MNIST datasets\n",
"(x_train_raw, y_train_raw), (x_test_raw, y_test_raw) = tf.keras.datasets.mnist.load_data()\n",
"print(x_train_raw.shape)\n",
"print(y_train_raw.shape)\n",
"print(x_test_raw.shape)\n",
"print(y_test_raw.shape)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"executionInfo": {
"elapsed": 4,
"status": "ok",
"timestamp": 1637562115945,
"user": {
"displayName": "Yoshihisa Nitta",
"photoUrl": "https://lh3.googleusercontent.com/a-/AOh14GgJLeg9AmjfexROvC3P0wzJdd5AOGY_VOu-nxnh=s64",
"userId": "15888006800030996813"
},
"user_tz": -540
},
"id": "uI_CQRWMvxHB",
"outputId": "09da4335-e9ff-44b8-824d-3a2feaa36100"
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"(60000, 28, 28, 1)\n",
"(10000, 28, 28, 1)\n"
]
}
],
"source": [
"x_train = x_train_raw.reshape(x_train_raw.shape+(1,)).astype('float32') / 255.0\n",
"x_test = x_test_raw.reshape(x_test_raw.shape+(1,)).astype('float32') / 255.0\n",
"print(x_train.shape)\n",
"print(x_test.shape)"
]
},
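{
"cell_type": "markdown",
"metadata": {},
"source": [
"The cell above appends a channel axis and rescales the uint8 pixels to floats in [0, 1]. The same transform applied to a tiny stand-in array (not MNIST data) shows what happens to the shape and value range:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import numpy as np\n",
"\n",
"raw = np.array([[0, 128, 255]], dtype=np.uint8)  # stand-in pixel values\n",
"scaled = raw.reshape(raw.shape + (1,)).astype('float32') / 255.0\n",
"print(scaled.shape)  # (1, 3, 1): a trailing channel axis was appended\n",
"print(scaled.max())  # 1.0: 255 maps to exactly 1.0"
]
},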
{
"cell_type": "markdown",
"metadata": {
"id": "86P5y0nqzvBU"
},
"source": [
"# Define the Neural Network Model\n",
"\n",
"Use the AutoEncoder class downloaded above.\n",
"\n",
"## ニューラルネットワーク・モデル の定義\n",
"\n",
"上でダウンロードした AutoEncoder クラスを使う。"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "tLCR0nt-zGOE"
},
"outputs": [],
"source": [
"from nw.AutoEncoder import AutoEncoder\n",
"\n",
"AE = AutoEncoder(\n",
" input_dim = (28, 28, 1),\n",
" encoder_conv_filters = [32, 64, 64, 64],\n",
" encoder_conv_kernel_size = [3, 3, 3, 3],\n",
" encoder_conv_strides = [1, 2, 2, 1],\n",
" decoder_conv_t_filters = [64, 64, 32, 1],\n",
" decoder_conv_t_kernel_size = [3, 3, 3, 3],\n",
" decoder_conv_t_strides = [1, 2, 2, 1],\n",
" z_dim = 2\n",
")"
]
},
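{
"cell_type": "markdown",
"metadata": {},
"source": [
"With padding='same', each convolution maps a spatial size s to ceil(s / stride), so the encoder strides [1, 2, 2, 1] take the 28x28 input through 28, 14, 7, 7, and flattening the final 7x7x64 feature map gives 3136 units before the Dense(z_dim) layer. A quick check of that arithmetic (pure Python, no model needed), which should match the encoder summary below:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import math\n",
"\n",
"# padding='same' convolution: output size = ceil(input size / stride)\n",
"size = 28\n",
"for stride in [1, 2, 2, 1]:\n",
"    size = math.ceil(size / stride)\n",
"    print(size)          # 28, 14, 7, 7\n",
"\n",
"print(size * size * 64)  # 3136 units fed to the Dense(z_dim) layer"
]
},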
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"executionInfo": {
"elapsed": 14,
"status": "ok",
"timestamp": 1637562119696,
"user": {
"displayName": "Yoshihisa Nitta",
"photoUrl": "https://lh3.googleusercontent.com/a-/AOh14GgJLeg9AmjfexROvC3P0wzJdd5AOGY_VOu-nxnh=s64",
"userId": "15888006800030996813"
},
"user_tz": -540
},
"id": "jhSugt50koOt",
"outputId": "8e071c5c-eec9-4aa6-d02e-95321bf540d0"
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Model: \"model\"\n",
"_________________________________________________________________\n",
" Layer (type) Output Shape Param # \n",
"=================================================================\n",
" encoder_input (InputLayer) [(None, 28, 28, 1)] 0 \n",
" \n",
" encoder_conv_0 (Conv2D) (None, 28, 28, 32) 320 \n",
" \n",
" leaky_re_lu (LeakyReLU) (None, 28, 28, 32) 0 \n",
" \n",
" encoder_conv_1 (Conv2D) (None, 14, 14, 64) 18496 \n",
" \n",
" leaky_re_lu_1 (LeakyReLU) (None, 14, 14, 64) 0 \n",
" \n",
" encoder_conv_2 (Conv2D) (None, 7, 7, 64) 36928 \n",
" \n",
" leaky_re_lu_2 (LeakyReLU) (None, 7, 7, 64) 0 \n",
" \n",
" encoder_conv_3 (Conv2D) (None, 7, 7, 64) 36928 \n",
" \n",
" leaky_re_lu_3 (LeakyReLU) (None, 7, 7, 64) 0 \n",
" \n",
" flatten (Flatten) (None, 3136) 0 \n",
" \n",
" encoder_output (Dense) (None, 2) 6274 \n",
" \n",
"=================================================================\n",
"Total params: 98,946\n",
"Trainable params: 98,946\n",
"Non-trainable params: 0\n",
"_________________________________________________________________\n"
]
}
],
"source": [
"AE.encoder.summary()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"executionInfo": {
"elapsed": 12,
"status": "ok",
"timestamp": 1637562119697,
"user": {
"displayName": "Yoshihisa Nitta",
"photoUrl": "https://lh3.googleusercontent.com/a-/AOh14GgJLeg9AmjfexROvC3P0wzJdd5AOGY_VOu-nxnh=s64",
"userId": "15888006800030996813"
},
"user_tz": -540
},
"id": "zG6ifGO2SjTD",
"outputId": "520660c4-5149-4070-be4e-82374fed726f"
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Model: \"model_1\"\n",
"_________________________________________________________________\n",
" Layer (type) Output Shape Param # \n",
"=================================================================\n",
" decoder_input (InputLayer) [(None, 2)] 0 \n",
" \n",
" dense (Dense) (None, 3136) 9408 \n",
" \n",
" reshape (Reshape) (None, 7, 7, 64) 0 \n",
" \n",
" decoder_conv_t_0 (Conv2DTra (None, 7, 7, 64) 36928 \n",
" nspose) \n",
" \n",
" leaky_re_lu_4 (LeakyReLU) (None, 7, 7, 64) 0 \n",
" \n",
" decoder_conv_t_1 (Conv2DTra (None, 14, 14, 64) 36928 \n",
" nspose) \n",
" \n",
" leaky_re_lu_5 (LeakyReLU) (None, 14, 14, 64) 0 \n",
" \n",
" decoder_conv_t_2 (Conv2DTra (None, 28, 28, 32) 18464 \n",
" nspose) \n",
" \n",
" leaky_re_lu_6 (LeakyReLU) (None, 28, 28, 32) 0 \n",
" \n",
" decoder_conv_t_3 (Conv2DTra (None, 28, 28, 1) 289 \n",
" nspose) \n",
" \n",
" activation (Activation) (None, 28, 28, 1) 0 \n",
" \n",
"=================================================================\n",
"Total params: 102,017\n",
"Trainable params: 102,017\n",
"Non-trainable params: 0\n",
"_________________________________________________________________\n"
]
}
],
"source": [
"AE.decoder.summary()"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "z3DuY-9Z9Yf-"
},
"source": [
"# Training the Neural Model\n",
"\n",
"\n",
"Try training in three ways.\n",
"\n",
"With each method, first train for a few epochs and save the state to files.\n",
"Then load the saved state and continue training.\n",
"\n",
"## ニューラルモデルを学習する\n",
"\n",
"3通りの方法で学習を試みる。\n",
"どの方法においても、まず数回学習を進めて、状態をファイルに保存する。\n",
"そして、保存した状態をロードしてから、さらに学習を進める。"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "i4olTqB1xt0m"
},
"outputs": [],
"source": [
"MAX_EPOCHS = 200"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "nA-X-4s1Z7q6"
},
"outputs": [],
"source": [
"learning_rate = 0.0005"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "P7ae_ydlhZPn"
},
"source": [
"# (1) Simple Training with fit()\n",
"\n",
"\n",
"Instead of using callbacks, simply train with the fit() function.\n",
"\n",
"## (1) fit() 関数を使った単純なTraining\n",
"\n",
"callbackは使わずに、単純にfit()を使ってtrainingしてみる。"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "uS1trkRkvTLz"
},
"outputs": [],
"source": [
"save_path1 = '/content/drive/MyDrive/ColabRun/AE01'"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "FSy_oa-qiFTV"
},
"outputs": [],
"source": [
"optimizer = tf.keras.optimizers.Adam(learning_rate=learning_rate)\n",
"AE.model.compile(optimizer=optimizer, loss=AutoEncoder.r_loss)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"executionInfo": {
"elapsed": 84826,
"status": "ok",
"timestamp": 1637562204883,
"user": {
"displayName": "Yoshihisa Nitta",
"photoUrl": "https://lh3.googleusercontent.com/a-/AOh14GgJLeg9AmjfexROvC3P0wzJdd5AOGY_VOu-nxnh=s64",
"userId": "15888006800030996813"
},
"user_tz": -540
},
"id": "d14V6JZUvOhs",
"outputId": "89346613-692f-4c64-97f2-5ef7642701a7"
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Epoch 1/3\n",
"1875/1875 [==============================] - 27s 6ms/step - loss: 0.0550 - val_loss: 0.0487\n",
"Epoch 2/3\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0464 - val_loss: 0.0449\n",
"Epoch 3/3\n",
"1875/1875 [==============================] - 10s 6ms/step - loss: 0.0444 - val_loss: 0.0439\n"
]
}
],
"source": [
"# At first, train for a few epochs.\n",
"# まず、少ない回数 training してみる\n",
"\n",
"history=AE.train_with_fit(\n",
" x_train,\n",
" x_train,\n",
" batch_size=32,\n",
" epochs = 3,\n",
" run_folder = save_path1,\n",
" validation_data = (x_test, x_test)\n",
")"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"executionInfo": {
"elapsed": 6,
"status": "ok",
"timestamp": 1637562204884,
"user": {
"displayName": "Yoshihisa Nitta",
"photoUrl": "https://lh3.googleusercontent.com/a-/AOh14GgJLeg9AmjfexROvC3P0wzJdd5AOGY_VOu-nxnh=s64",
"userId": "15888006800030996813"
},
"user_tz": -540
},
"id": "49j-U8xx9kzZ",
"outputId": "932b4ab5-abbd-4f22-c980-a6b4e8d32930"
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"{'loss': [0.05495461821556091, 0.0464060977101326, 0.04438251256942749], 'val_loss': [0.04873434454202652, 0.04490825906395912, 0.043926868587732315]}\n"
]
}
],
"source": [
"print(history.history)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"executionInfo": {
"elapsed": 3,
"status": "ok",
"timestamp": 1637562204884,
"user": {
"displayName": "Yoshihisa Nitta",
"photoUrl": "https://lh3.googleusercontent.com/a-/AOh14GgJLeg9AmjfexROvC3P0wzJdd5AOGY_VOu-nxnh=s64",
"userId": "15888006800030996813"
},
"user_tz": -540
},
"id": "K0Msc4koyR-J",
"outputId": "b6a066ea-9e18-4f9c-b519-964469f196f5"
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"3\n"
]
}
],
"source": [
"# Load the trained states saved before\n",
"# 保存されている学習結果をロードする\n",
"\n",
"AE_work = AutoEncoder.load(save_path1)\n",
"\n",
"# display the epoch count of training\n",
"# training のepoch回数を表示する\n",
"print(AE_work.epoch)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"executionInfo": {
"elapsed": 1972373,
"status": "ok",
"timestamp": 1637564177734,
"user": {
"displayName": "Yoshihisa Nitta",
"photoUrl": "https://lh3.googleusercontent.com/a-/AOh14GgJLeg9AmjfexROvC3P0wzJdd5AOGY_VOu-nxnh=s64",
"userId": "15888006800030996813"
},
"user_tz": -540
},
"id": "fiPR6yjwi2_1",
"outputId": "a2f8a1f0-8b7d-4a28-8202-2cb9be708e95"
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Epoch 4/200\n",
"1875/1875 [==============================] - 11s 5ms/step - loss: 0.0439 - val_loss: 0.0428\n",
"Epoch 5/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0422 - val_loss: 0.0421\n",
"Epoch 6/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0417 - val_loss: 0.0414\n",
"Epoch 7/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0412 - val_loss: 0.0412\n",
"Epoch 8/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0409 - val_loss: 0.0409\n",
"Epoch 9/200\n",
"1875/1875 [==============================] - 10s 6ms/step - loss: 0.0406 - val_loss: 0.0411\n",
"Epoch 10/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0404 - val_loss: 0.0405\n",
"Epoch 11/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0402 - val_loss: 0.0403\n",
"Epoch 12/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0401 - val_loss: 0.0402\n",
"Epoch 13/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0399 - val_loss: 0.0402\n",
"Epoch 14/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0398 - val_loss: 0.0399\n",
"Epoch 15/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0396 - val_loss: 0.0401\n",
"Epoch 16/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0395 - val_loss: 0.0406\n",
"Epoch 17/200\n",
"1875/1875 [==============================] - 10s 6ms/step - loss: 0.0394 - val_loss: 0.0399\n",
"Epoch 18/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0393 - val_loss: 0.0400\n",
"Epoch 19/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0392 - val_loss: 0.0395\n",
"Epoch 20/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0391 - val_loss: 0.0393\n",
"Epoch 21/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0390 - val_loss: 0.0393\n",
"Epoch 22/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0390 - val_loss: 0.0397\n",
"Epoch 23/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0389 - val_loss: 0.0394\n",
"Epoch 24/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0388 - val_loss: 0.0395\n",
"Epoch 25/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0388 - val_loss: 0.0393\n",
"Epoch 26/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0387 - val_loss: 0.0398\n",
"Epoch 27/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0387 - val_loss: 0.0395\n",
"Epoch 28/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0386 - val_loss: 0.0395\n",
"Epoch 29/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0385 - val_loss: 0.0390\n",
"Epoch 30/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0385 - val_loss: 0.0391\n",
"Epoch 31/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0384 - val_loss: 0.0395\n",
"Epoch 32/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0384 - val_loss: 0.0391\n",
"Epoch 33/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0383 - val_loss: 0.0394\n",
"Epoch 34/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0383 - val_loss: 0.0390\n",
"Epoch 35/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0382 - val_loss: 0.0393\n",
"Epoch 36/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0382 - val_loss: 0.0391\n",
"Epoch 37/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0381 - val_loss: 0.0390\n",
"Epoch 38/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0381 - val_loss: 0.0391\n",
"Epoch 39/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0381 - val_loss: 0.0388\n",
"Epoch 40/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0380 - val_loss: 0.0392\n",
"Epoch 41/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0380 - val_loss: 0.0394\n",
"Epoch 42/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0379 - val_loss: 0.0389\n",
"Epoch 43/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0379 - val_loss: 0.0392\n",
"Epoch 44/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0379 - val_loss: 0.0393\n",
"Epoch 45/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0378 - val_loss: 0.0390\n",
"Epoch 46/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0378 - val_loss: 0.0389\n",
"Epoch 47/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0378 - val_loss: 0.0392\n",
"Epoch 48/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0377 - val_loss: 0.0390\n",
"Epoch 49/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0377 - val_loss: 0.0386\n",
"Epoch 50/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0376 - val_loss: 0.0391\n",
"Epoch 51/200\n",
"1875/1875 [==============================] - 10s 6ms/step - loss: 0.0377 - val_loss: 0.0392\n",
"Epoch 52/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0376 - val_loss: 0.0391\n",
"Epoch 53/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0376 - val_loss: 0.0385\n",
"Epoch 54/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0375 - val_loss: 0.0386\n",
"Epoch 55/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0375 - val_loss: 0.0386\n",
"Epoch 56/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0375 - val_loss: 0.0387\n",
"Epoch 57/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0375 - val_loss: 0.0385\n",
"Epoch 58/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0375 - val_loss: 0.0386\n",
"Epoch 59/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0374 - val_loss: 0.0389\n",
"Epoch 60/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0374 - val_loss: 0.0387\n",
"Epoch 61/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0374 - val_loss: 0.0387\n",
"Epoch 62/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0373 - val_loss: 0.0386\n",
"Epoch 63/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0373 - val_loss: 0.0387\n",
"Epoch 64/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0373 - val_loss: 0.0387\n",
"Epoch 65/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0373 - val_loss: 0.0384\n",
"Epoch 66/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0372 - val_loss: 0.0385\n",
"Epoch 67/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0372 - val_loss: 0.0386\n",
"Epoch 68/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0372 - val_loss: 0.0386\n",
"Epoch 69/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0372 - val_loss: 0.0389\n",
"Epoch 70/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0371 - val_loss: 0.0385\n",
"Epoch 71/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0372 - val_loss: 0.0384\n",
"Epoch 72/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0371 - val_loss: 0.0387\n",
"Epoch 73/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0371 - val_loss: 0.0386\n",
"Epoch 74/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0371 - val_loss: 0.0384\n",
"Epoch 75/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0371 - val_loss: 0.0387\n",
"Epoch 76/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0370 - val_loss: 0.0387\n",
"Epoch 77/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0370 - val_loss: 0.0383\n",
"Epoch 78/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0370 - val_loss: 0.0385\n",
"Epoch 79/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0370 - val_loss: 0.0384\n",
"Epoch 80/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0369 - val_loss: 0.0385\n",
"Epoch 81/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0369 - val_loss: 0.0384\n",
"Epoch 82/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0369 - val_loss: 0.0383\n",
"Epoch 83/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0369 - val_loss: 0.0385\n",
"Epoch 84/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0369 - val_loss: 0.0385\n",
"Epoch 85/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0368 - val_loss: 0.0386\n",
"Epoch 86/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0369 - val_loss: 0.0386\n",
"Epoch 87/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0368 - val_loss: 0.0384\n",
"Epoch 88/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0368 - val_loss: 0.0383\n",
"Epoch 89/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0368 - val_loss: 0.0385\n",
"Epoch 90/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0368 - val_loss: 0.0384\n",
"Epoch 91/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0368 - val_loss: 0.0384\n",
"Epoch 92/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0367 - val_loss: 0.0386\n",
"Epoch 93/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0367 - val_loss: 0.0385\n",
"Epoch 94/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0367 - val_loss: 0.0382\n",
"Epoch 95/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0367 - val_loss: 0.0383\n",
"Epoch 96/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0367 - val_loss: 0.0383\n",
"Epoch 97/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0367 - val_loss: 0.0384\n",
"Epoch 98/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0366 - val_loss: 0.0383\n",
"Epoch 99/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0366 - val_loss: 0.0385\n",
"Epoch 100/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0366 - val_loss: 0.0385\n",
"Epoch 101/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0366 - val_loss: 0.0383\n",
"Epoch 102/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0366 - val_loss: 0.0384\n",
"Epoch 103/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0366 - val_loss: 0.0384\n",
"Epoch 104/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0365 - val_loss: 0.0385\n",
"Epoch 105/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0366 - val_loss: 0.0386\n",
"Epoch 106/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0365 - val_loss: 0.0382\n",
"Epoch 107/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0365 - val_loss: 0.0383\n",
"Epoch 108/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0365 - val_loss: 0.0384\n",
"Epoch 109/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0364 - val_loss: 0.0381\n",
"Epoch 110/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0365 - val_loss: 0.0385\n",
"Epoch 111/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0365 - val_loss: 0.0385\n",
"Epoch 112/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0364 - val_loss: 0.0385\n",
"Epoch 113/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0364 - val_loss: 0.0383\n",
"Epoch 114/200\n",
"1875/1875 [==============================] - 10s 6ms/step - loss: 0.0364 - val_loss: 0.0384\n",
"Epoch 115/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0364 - val_loss: 0.0381\n",
"Epoch 116/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0364 - val_loss: 0.0385\n",
"Epoch 117/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0364 - val_loss: 0.0382\n",
"Epoch 118/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0364 - val_loss: 0.0386\n",
"Epoch 119/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0363 - val_loss: 0.0383\n",
"Epoch 120/200\n",
"1875/1875 [==============================] - 10s 6ms/step - loss: 0.0364 - val_loss: 0.0383\n",
"Epoch 121/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0363 - val_loss: 0.0385\n",
"Epoch 122/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0363 - val_loss: 0.0389\n",
"Epoch 123/200\n",
"1875/1875 [==============================] - 10s 6ms/step - loss: 0.0363 - val_loss: 0.0385\n",
"Epoch 124/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0363 - val_loss: 0.0383\n",
"Epoch 125/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0363 - val_loss: 0.0385\n",
"Epoch 126/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0363 - val_loss: 0.0384\n",
"Epoch 127/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0362 - val_loss: 0.0384\n",
"Epoch 128/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0363 - val_loss: 0.0384\n",
"Epoch 129/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0362 - val_loss: 0.0384\n",
"Epoch 130/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0362 - val_loss: 0.0388\n",
"Epoch 131/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0362 - val_loss: 0.0384\n",
"Epoch 132/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0362 - val_loss: 0.0384\n",
"Epoch 133/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0362 - val_loss: 0.0384\n",
"Epoch 134/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0362 - val_loss: 0.0384\n",
"Epoch 135/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0361 - val_loss: 0.0386\n",
"Epoch 136/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0362 - val_loss: 0.0385\n",
"Epoch 137/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0361 - val_loss: 0.0383\n",
"Epoch 138/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0361 - val_loss: 0.0383\n",
"Epoch 139/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0361 - val_loss: 0.0383\n",
"Epoch 140/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0361 - val_loss: 0.0384\n",
"Epoch 141/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0361 - val_loss: 0.0385\n",
"Epoch 142/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0361 - val_loss: 0.0385\n",
"Epoch 143/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0361 - val_loss: 0.0386\n",
"Epoch 144/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0361 - val_loss: 0.0382\n",
"Epoch 145/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0361 - val_loss: 0.0384\n",
"Epoch 146/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0360 - val_loss: 0.0387\n",
"Epoch 147/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0361 - val_loss: 0.0384\n",
"Epoch 148/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0360 - val_loss: 0.0385\n",
"Epoch 149/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0360 - val_loss: 0.0382\n",
"Epoch 150/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0360 - val_loss: 0.0381\n",
"Epoch 151/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0360 - val_loss: 0.0385\n",
"Epoch 152/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0360 - val_loss: 0.0382\n",
"Epoch 153/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0360 - val_loss: 0.0385\n",
"Epoch 154/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0360 - val_loss: 0.0384\n",
"Epoch 155/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0360 - val_loss: 0.0384\n",
"Epoch 156/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0359 - val_loss: 0.0382\n",
"Epoch 157/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0359 - val_loss: 0.0384\n",
"Epoch 158/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0359 - val_loss: 0.0384\n",
"Epoch 159/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0359 - val_loss: 0.0383\n",
"Epoch 160/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0360 - val_loss: 0.0387\n",
"Epoch 161/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0359 - val_loss: 0.0383\n",
"Epoch 162/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0359 - val_loss: 0.0383\n",
"Epoch 163/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0359 - val_loss: 0.0384\n",
"Epoch 164/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0359 - val_loss: 0.0386\n",
"Epoch 165/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0359 - val_loss: 0.0383\n",
"Epoch 166/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0359 - val_loss: 0.0382\n",
"Epoch 167/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0359 - val_loss: 0.0384\n",
"Epoch 168/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0358 - val_loss: 0.0382\n",
"Epoch 169/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0359 - val_loss: 0.0387\n",
"Epoch 170/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0358 - val_loss: 0.0389\n",
"Epoch 171/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0358 - val_loss: 0.0381\n",
"Epoch 172/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0358 - val_loss: 0.0390\n",
"Epoch 173/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0358 - val_loss: 0.0386\n",
"Epoch 174/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0358 - val_loss: 0.0382\n",
"Epoch 175/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0358 - val_loss: 0.0382\n",
"Epoch 176/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0358 - val_loss: 0.0388\n",
"Epoch 177/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0358 - val_loss: 0.0383\n",
"Epoch 178/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0358 - val_loss: 0.0383\n",
"Epoch 179/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0358 - val_loss: 0.0382\n",
"Epoch 180/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0358 - val_loss: 0.0384\n",
"Epoch 181/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0357 - val_loss: 0.0383\n",
"Epoch 182/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0357 - val_loss: 0.0383\n",
"Epoch 183/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0357 - val_loss: 0.0383\n",
"Epoch 184/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0357 - val_loss: 0.0385\n",
"Epoch 185/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0357 - val_loss: 0.0383\n",
"Epoch 186/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0357 - val_loss: 0.0386\n",
"Epoch 187/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0357 - val_loss: 0.0383\n",
"Epoch 188/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0357 - val_loss: 0.0383\n",
"Epoch 189/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0357 - val_loss: 0.0386\n",
"Epoch 190/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0357 - val_loss: 0.0383\n",
"Epoch 191/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0357 - val_loss: 0.0383\n",
"Epoch 192/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0356 - val_loss: 0.0382\n",
"Epoch 193/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0357 - val_loss: 0.0386\n",
"Epoch 194/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0356 - val_loss: 0.0383\n",
"Epoch 195/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0356 - val_loss: 0.0385\n",
"Epoch 196/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0356 - val_loss: 0.0383\n",
"Epoch 197/200\n",
"1875/1875 [==============================] - 10s 6ms/step - loss: 0.0356 - val_loss: 0.0382\n",
"Epoch 198/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0356 - val_loss: 0.0385\n",
"Epoch 199/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0356 - val_loss: 0.0384\n",
"Epoch 200/200\n",
"1875/1875 [==============================] - 10s 5ms/step - loss: 0.0356 - val_loss: 0.0384\n"
]
}
],
"source": [
"# Train for additional epochs. Training resumes from the saved self.epoch and continues up to the specified number of epochs.\n",
"\n",
"AE_work.model.compile(optimizer, loss=AutoEncoder.r_loss)\n",
"\n",
"history_work = AE_work.train_with_fit(\n",
" x_train,\n",
" x_train,\n",
" batch_size=32,\n",
" epochs=MAX_EPOCHS,\n",
" run_folder = save_path1,\n",
" validation_data=(x_test, x_test)\n",
")"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"executionInfo": {
"elapsed": 17,
"status": "ok",
"timestamp": 1637564177735,
"user": {
"displayName": "Yoshihisa Nitta",
"photoUrl": "https://lh3.googleusercontent.com/a-/AOh14GgJLeg9AmjfexROvC3P0wzJdd5AOGY_VOu-nxnh=s64",
"userId": "15888006800030996813"
},
"user_tz": -540
},
"id": "134mgICN_4X3",
"outputId": "9991cf88-3cda-4027-8a54-d6e4bc2e9860"
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"197\n"
]
}
],
"source": [
"# The return value contains only the loss values from the additional training.\n",
"print(len(history_work.history['loss']))"
]
},
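{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Illustrative sanity check (an assumption, not part of the original notebook):\n",
"# if the additional run really resumed from the epoch saved by the initial run,\n",
"# the two histories together should cover MAX_EPOCHS epochs in total.\n",
"print(len(history.history['loss']) + len(history_work.history['loss']))  # expected: MAX_EPOCHS if training resumed from the saved epoch\n"
]
},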
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "NSTlPaDhBSC6"
},
"outputs": [],
"source": [
"# Concatenate the loss histories of the initial run (history) and the additional run (history_work).\n",
"loss1_1 = history.history['loss']\n",
"vloss1_1 = history.history['val_loss']\n",
"\n",
"loss1_2 = history_work.history['loss']\n",
"vloss1_2 = history_work.history['val_loss']\n",
"\n",
"loss1 = np.concatenate([loss1_1, loss1_2], axis=0)\n",
"val_loss1 = np.concatenate([vloss1_1, vloss1_2], axis=0)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/",
"height": 279
},
"executionInfo": {
"elapsed": 8,
"status": "ok",
"timestamp": 1637564177736,
"user": {
"displayName": "Yoshihisa Nitta",
"photoUrl": "https://lh3.googleusercontent.com/a-/AOh14GgJLeg9AmjfexROvC3P0wzJdd5AOGY_VOu-nxnh=s64",
"userId": "15888006800030996813"
},
"user_tz": -540
},
"id": "0Y20iXDaCB-x",
"outputId": "bb7e7449-485a-424a-ed74-63770b1bfead"
},
"outputs": [
{
"data": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAAAi4AAAEGCAYAAABCXR4ZAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4yLjIsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy+WH4yJAAAgAElEQVR4nOzdeZzVZf3//8drZmDYRxzZBJRVXCBBEC0DM9PcEk0N1NyyzNwyy5+W1pcPaYu5VZpLaalpaphJaWqppeYSA+KCqCCiDiC7wzrAzLx+f7zex3NmOLPBMGdmeN5vt3M75/1+X+/rXNfZ3q9zXdf7epu7IyIiItIa5OW6ACIiIiINpcBFREREWg0FLiIiItJqKHARERGRVkOBi4iIiLQaBbkuQFPYZZddfMCAAbkuhoiIiDSBGTNmLHf3Htm2tYnAZcCAAZSUlOS6GCIiItIEzOz92rapq0hERERaDQUuIiIi0moocBEREZFWo02McREREWlJNm/eTGlpKeXl5bkuSovWoUMH+vXrR7t27Rq8jwIXERGRJlZaWkrXrl0ZMGAAZpbr4rRI7s6KFSsoLS1l4MCBDd6vQV1FZnaEmb1tZvPM7PIs2wvN7IFk+8tmNiBZP8DMNpjZrOR2a8Y+/07yTG3rWVdeIiIirUV5eTnFxcUKWupgZhQXFze6VareFhczywduBg4DSoHpZjbN3d/MSHY2sMrdh5jZJODnwMRk27vuPrKW7E9195rnMdeVl4iISKugoKV+W/MaNaTFZSwwz93nu/sm4H5gQo00E4C7ksdTgUNt69+xpsxr2/zyl/DnP+fkqUVERGRLDQlc+gIfZiyXJuuypnH3CqAMKE62DTSzV8zsP2Y2rsZ+v0+6iX6YEZzUldcnzOwcMysxs5Jly5Y1oBpb4ZZbYOrU7ZO3iIjIdtSlS5dcF2G72N6nQy8GdnP3UcAlwH1m1i3Zdqq7jwDGJbfTGpOxu9/u7mPcfUyPHllnBd52hYWwadP2yVtEREQarSGBy0Kgf8Zyv2Rd1jRmVgAUASvcfaO7rwBw9xnAu8AeyfLC5H4NcB/RJVVrXo2tWJNo3x42bszJU4uIiDQFd+fSSy9l+PDhjBgxggceeACAxYsXM378eEaOHMnw4cN57rnnqKys5Mwzz/wk7Q033JDj0m+pIadDTweGmtlAIqiYBJxSI8004AzgReBE4Gl3dzPrAax090ozGwQMBeYnAclO7r7czNoBxwD/qiuvbarl1iosVOAiIiLb5uKLYdasps1z5Ei48cYGJf3LX/7CrFmzePXVV1m+fDn7778/48eP57777uOLX/wiV1xxBZWVlaxfv55Zs2axcOFC3njjDQA+/vjjpi13E6g3cHH3CjO7AHgCyAfudPfZZjYFKHH3acAdwD1mNg9YSQQ3AOOBKWa2GagCznX3lWbWGXgiCVryiaDlt8k+teXV/BS4iIhIK/f8889z8sknk5+fT69evTj44IOZPn06+++/P1/72tfYvHkzxx13HCNHjmTQoEHMnz+fCy+8kKOPPprDDz8818XfQoMmoHP3x4DHaqz7UcbjcuCkLPs9BDyUZf06YHQtz5U1r5xo3x7WrMl1KUREpDVrYMtIcxs/fjzPPvssjz76KGeeeSaXXHIJp59+Oq+++ipPPPEEt956Kw8++CB33nlnrotaja5VVBe1uIiISCs3btw4HnjgASorK1m2bBnPPvssY8eO5f3336dXr1584xvf4Otf/zozZ85k+fLlVFVVccIJJ3DVVVcxc+bMXBd/C5ryvy4KXEREpJU7/vjjefHFF9l3330xM6655hp69+7NXXfdxS9+8QvatWtHly5duPvuu1m4cCFnnXUWVVVVAPz0pz/Ncem3ZLka99qUxowZ4yUlNSfgbQKnnw7PPQfvvdf0eYuISJs1Z84c9tprr1wXo1XI9lqZ2Qx3H5MtvbqK6tK+veZxERERaU
EUuNRFXUUiIiItigKXuihwERERaVEUuNRFU/6LiIi0KApc6pIa49IGBjCLiIi0BQpc6lJYGPdqdREREWkRFLjUJRW4aJyLiIhIi6DApS5qcRERkR1Aly5dat22YMEChg8f3oylqZsCl7q0bx/3anERERFpETTlf13UVSQiItvo4oth1qymzXPkyLqv3Xj55ZfTv39/zj//fAAmT55MQUEBzzzzDKtWrWLz5s1cddVVTJgwoVHPW15ezre+9S1KSkooKCjg+uuv55BDDmH27NmcddZZbNq0iaqqKh566CF23XVXvvKVr1BaWkplZSU//OEPmThx4rZUG1DgUjcFLiIi0gpNnDiRiy+++JPA5cEHH+SJJ57goosuolu3bixfvpwDDzyQY489FjNrcL4333wzZsbrr7/OW2+9xeGHH84777zDrbfeyre//W1OPfVUNm3aRGVlJY899hi77rorjz76KABlZWVNUjcFLnVJdRVpjIuIiGylulpGtpdRo0axdOlSFi1axLJly+jevTu9e/fmO9/5Ds8++yx5eXksXLiQJUuW0Lt37wbn+/zzz3PhhRcCsOeee7L77rvzzjvv8OlPf5qrr76a0tJSvvzlLzN06FBGjBjBd7/7XS677DKOOeYYxo0b1yR10xiXuqjFRUREWqmTTjqJqVOn8sADDzBx4kTuvfdeli1bxowZM5g1axa9evWivLy8SZ7rlFNOYdq0aXTs2JGjjjqKp59+mj322IOZM2cyYsQIrrzySqZMmdIkz9WgwMXMjjCzt81snpldnmV7oZk9kGx/2cwGJOsHmNkGM5uV3G5N1ncys0fN7C0zm21mP8vI60wzW5axz9ebpKZbQ4GLiIi0UhMnTuT+++9n6tSpnHTSSZSVldGzZ0/atWvHM888w/vvv9/oPMeNG8e9994LwDvvvMMHH3zAsGHDmD9/PoMGDeKiiy5iwoQJvPbaayxatIhOnTrx1a9+lUsvvZSZM2c2Sb3q7Soys3zgZuAwoBSYbmbT3P3NjGRnA6vcfYiZTQJ+DqRG4Lzr7iOzZH2tuz9jZu2Bp8zsSHf/R7LtAXe/YGsr1WR0OrSIiLRS++yzD2vWrKFv37706dOHU089lS996UuMGDGCMWPGsOeeezY6z/POO49vfetbjBgxgoKCAv7whz9QWFjIgw8+yD333EO7du3o3bs3P/jBD5g+fTqXXnopeXl5tGvXjltuuaVJ6mVez3T2ZvZpYLK7fzFZ/j6Au/80I80TSZoXzawA+AjoAewO/N3d6zwB3Mx+Cbzh7r81szOBMY0JXMaMGeMlJSUNTd5wL78MBx4Ijz4KRx3V9PmLiEibNGfOHPbaa69cF6NVyPZamdkMdx+TLX1Duor6Ah9mLJcm67KmcfcKoAwoTrYNNLNXzOw/ZrbFyBwz2wn4EvBUxuoTzOw1M5tqZv2zFcrMzjGzEjMrWbZsWQOqsRXUVSQiItKibO+zihYDu7n7CjMbDfzVzPZx99UASevMn4Bfufv8ZJ+/AX9y941m9k3gLuDzNTN299uB2yFaXLZL6RW4iIjIDuL111/ntNNOq7ausLCQl19+OUclyq4hgctCILPVo1+yLlua0iQYKQJWePRDbQRw9xlm9i6wB5Dq17kdmOvun5ws5u4rMvL9HXBNw6vTxDTGRUREtpK7N2qOlFwbMWIEs5p6prx61DdcJZuGdBVNB4aa2cBkIO0kYFqNNNOAM5LHJwJPu7ubWY9kcC9mNggYCsxPlq8iApyLMzMysz4Zi8cCcxpXpSakKf9FRGQrdOjQgRUrVmzVgXlH4e6sWLGCDh06NGq/eltc3L3CzC4AngDygTvdfbaZTQFK3H0acAdwj5nNA1YSwQ3AeGCKmW0GqoBz3X2lmfUDrgDeAmYmEelN7v474CIzOxaoSPI6s1E1akrqKhIRka3Qr18/SktL2W5jMNuIDh060K9fv0btU+9ZRa3BdjuraPVqKCqCa6+F73636fMXERGRLW
zrWUU7Lk35LyIi0qIocKmLxriIiIi0KApc6pKXB+3aKXARERFpIRS41KewUF1FIiIiLYQCl/q0b68WFxERkRZCgUt9CgsVuIiIiLQQClzqo8BFRESkxVDgUh+NcREREWkxFLjUR2NcREREWgwFLvVRV5GIiEiLocClPgpcREREWgwFLvXRGBcREZEWQ4FLfTTGRUREpMVQ4FIfdRWJiIi0GApc6qOuIhERkRZDgUt91FUkIiLSYihwqY+6ikRERFqMBgUuZnaEmb1tZvPM7PIs2wvN7IFk+8tmNiBZP8DMNpjZrOR2a8Y+o83s9WSfX5mZJet3NrN/mtnc5L5701R1KylwERERaTHqDVzMLB+4GTgS2Bs42cz2rpHsbGCVuw8BbgB+nrHtXXcfmdzOzVh/C/ANYGhyOyJZfznwlLsPBZ5KlnNHY1xERERajIa0uIwF5rn7fHffBNwPTKiRZgJwV/J4KnBoqgUlGzPrA3Rz95fc3YG7geOy5HVXxvrc0BgXERGRFqMhgUtf4MOM5dJkXdY07l4BlAHFybaBZvaKmf3HzMZlpC+tJc9e7r44efwR0CtboczsHDMrMbOSZcuWNaAaWynV4uK+/Z5DREREGmR7D85dDOzm7qOAS4D7zKxbQ3dOWmOyRgzufru7j3H3MT169Gia0mZTWBj36i4SERHJuYYELguB/hnL/ZJ1WdOYWQFQBKxw943uvgLA3WcA7wJ7JOn71ZLnkqQrKdWltLQxFWpyClxERERajIYELtOBoWY20MzaA5OAaTXSTAPOSB6fCDzt7m5mPZLBvZjZIGIQ7vykK2i1mR2YjIU5HXgkS15nZKzPjfbt417jXERERHKuoL4E7l5hZhcATwD5wJ3uPtvMpgAl7j4NuAO4x8zmASuJ4AZgPDDFzDYDVcC57r4y2XYe8AegI/CP5AbwM+BBMzsbeB/4yrZXcxukWlwUuIiIiORcvYELgLs/BjxWY92PMh6XAydl2e8h4KFa8iwBhmdZvwI4tCHlahbqKhIREWkxNHNufdTiIiIi0mIocKmPxriIiIi0GApc6qMWFxERkRZDgUt9NMZFRESkxVDgUh91FYmIiLQYClzqUdVOXUUiIiItRYNOh95R7b03jB40lHtAgYuIiEgLoBaXOhQWQtn6drGgMS4iIiI5p8ClDkVFULY2aZRSi4uIiEjOKXCpQ1ERlK3LjwUFLiIiIjmnwKUO0eKSBC7qKhIREck5BS51KCqCstXJS6QWFxERkZxT4FKHoiJYvQYcFLiIiIi0AApc6lBUBJWVxjo6K3ARERFpARS41KGoKO7LCnbRGBcREZEWQIFLHT4JXNrvohYXERGRFqBBgYuZHWFmb5vZPDO7PMv2QjN7INn+spkNqLF9NzNba2bfS5aHmdmsjNtqM7s42TbZzBZmbDtq26u5dT4JXPKLFbiIiIi0APVO+W9m+cDNwGFAKTDdzKa5+5sZyc4GVrn7EDObBPwcmJix/XrgH6kFd38bGJmR/0Lg4Yz0N7j7tVtXpaaT7ipS4CIiItISNKTFZSwwz93nu/sm4H5gQo00E4C7ksdTgUPNzADM7DjgPWB2LfkfCrzr7u83tvDbWzpw2VljXERERFqAhgQufYEPM5ZLk3VZ07h7BVAGFJtZF+Ay4P/qyH8S8Kca6y4ws9fM7E4z655tJzM7x8xKzKxk2bJlDahG430SuFh3tbiIiIi0ANt7cO5kottnbbaNZtYeOBb4c8bqW4DBRFfSYuC6bPu6++3uPsbdx/To0aNJC53ySeCSp8BFRESkJah3jAsx/qR/xnK/ZF22NKVmVgAUASuAA4ATzewaYCegyszK3f2mZL8jgZnuviSVUeZjM/st8PfGVanpdO4M+fnwMTupq0hERKQFaEjgMh0YamYDiQBlEnBKjTTTgDOAF4ETgafd3YFxqQRmNhlYmxG0AJxMjW4iM+vj7ouTxeOBNxpcmyZmBt26QRlFanERERFpAeoNXNy9ws
wuAJ4A8oE73X22mU0BStx9GnAHcI+ZzQNWEsFNncysM3Gm0jdrbLrGzEYSM+0vyLK9WRUVQdmGbgpcREREWoCGtLjg7o8Bj9VY96OMx+XASfXkMbnG8jqgOEu60xpSpuZSVARl67oqcBEREWkBNHNuPYqKoKyqi8a4iIiItAAKXOpRVARllV3U4iIiItICKHCpR1ERlG3W1aFFRERaAgUu9dhpJyjb3EldRSIiIi2AApd6FBXB6s0d8HK1uIiIiOSaApd6FBVBpeezbr3luigiIiI7PAUu9fhk2v8N7WD16twWRkREZAenwKUenwQuFMGHH9adWERERLYrBS71qBa4lJbmtjAiIiI7OAUu9VCLi4iISMuhwKUe6cBlJ7W4iIiI5JgCl3p8ErgU9VeLi4iISI4pcKnHJ4FLt93U4iIiIpJjClzq0bkz5OdDWac+ClxERERyTIFLPcygWzcoK+ypriIREZEcU+DSAEVFUFZQDGvWaBI6ERGRHFLg0gBFRVBmyWAXtbqIiIjkTIMCFzM7wszeNrN5ZnZ5lu2FZvZAsv1lMxtQY/tuZrbWzL6XsW6Bmb1uZrPMrCRj/c5m9k8zm5vcd9/66jWNoiIoq+waCxrnIiIikjP1Bi5mlg/cDBwJ7A2cbGZ710h2NrDK3YcANwA/r7H9euAfWbI/xN1HuvuYjHWXA0+5+1DgqWQ5p4qKoGxTx1hQi4uIiEjONKTFZSwwz93nu/sm4H5gQo00E4C7ksdTgUPNzADM7DjgPWB2A8uUmdddwHEN3G+7KSpKLrJophYXERGRHGpI4NIXyGxmKE3WZU3j7hVAGVBsZl2Ay4D/y5KvA0+a2QwzOydjfS93X5w8/gjola1QZnaOmZWYWcmyZcsaUI2tV1QEK1ca9O6twEVERCSHtvfg3MnADe6+Nsu2z7r7fkQX1PlmNr5mAnd3IsDZgrvf7u5j3H1Mjx49mrLMWxg2DMrK4IMeo9VVJCIikkMNCVwWAv0zlvsl67KmMbMCoAhYARwAXGNmC4CLgR+Y2QUA7r4wuV8KPEx0SQEsMbM+SV59gKWNrlUTO+iguP9v4efV4iIiIpJDDQlcpgNDzWygmbUHJgHTaqSZBpyRPD4ReNrDOHcf4O4DgBuBn7j7TWbW2cy6AphZZ+Bw4I0seZ0BPLKVdWsyn/pUzKD7wka1uIiIiORSQX0J3L0iaSV5AsgH7nT32WY2BShx92nAHcA9ZjYPWEkEN3XpBTycjN8tAO5z98eTbT8DHjSzs4H3ga9sRb2aVEEBHHAA/PetYelJ6Lp1y3WxREREdjgWw0hatzFjxnhJSUn9CbfBj34EV19VRZl3o8sbL8M++2zX5xMREdlRmdmMGlOlfEIz5zbQQQdBlefxMgfA3/6W6+KIiIjskBS4NNCBB8Y0Lv8dciZcdx2sW5frIomIiOxwFLg0UFERDB8OLxQfA8uXw2235bpIIiIiOxwFLo1w0EHw4pzuVB7yBfjFL2DDhlwXSUREZIeiwKURPvOZOKFo9qk/gY8+grvvznWRREREdigKXBphbDJF3gwbA336wMsv57ZAIiIiOxgFLo0wdGhMRPfKLIPBg+Hdd3NdJBERkR2KApdGyMuDffeFV14hApf583NdJBERkR2KApdGGjUKZs2CqgGDYOFCKC/PdZFERER2GApcGmnkSFi7Ft7tsi+4w/vv57pIIiIiOwwFLo00alTczyrfMx6ou0hERKTZKHBppOHD46KLryztGysUuIiIiDQbBS6NVFgIe+8Nr8ztDB07KnARERFpRgpctsKoUfDKKwaDBumUaBERkWakwGUrjBoFS5bA4l1Hq8VFRESkGSlw2QqpAbqvdPxMBC7uuS2QiIjIDqJBgYuZHWFmb5vZPDO7PMv2QjN7INn+spkNqLF9NzNba2bfS5b7m9kzZvammc02s29npJ1sZgvNbFZyO2rbqtj09t037ks27wvr1sGyZbktkIiIyA6i3sDFzPKBm4Ejgb
2Bk81s7xrJzgZWufsQ4Abg5zW2Xw/8I2O5Aviuu+8NHAicXyPPG9x9ZHJ7rFE1agZFRXDAAfDnN/fGQd1FIiIizaQhLS5jgXnuPt/dNwH3AxNqpJkA3JU8ngocamYGYGbHAe8Bs1OJ3X2xu89MHq8B5gB9t6Uize300+GN97vxKvsqcBEREWkmDQlc+gIfZiyXsmWQ8Ukad68AyoBiM+sCXAb8X22ZJ91Ko4DMSy1fYGavmdmdZta9lv3OMbMSMytZloOumkmToF07525OV+AiIiLSTLb34NzJRLfP2mwbk8DmIeBid1+drL4FGAyMBBYD12Xb191vd/cx7j6mR48eTV7w+uy8M3zpS8a9eadRMfe9Zn9+ERGRHVFDApeFQP+M5X7JuqxpzKwAKAJWAAcA15jZAuBi4AdmdkGSrh0RtNzr7n9JZeTuS9y90t2rgN8SXVUt0umnw9KqHjw5ozjXRREREdkhNCRwmQ4MNbOBZtYemARMq5FmGnBG8vhE4GkP49x9gLsPAG4EfuLuNyXjX+4A5rj79ZkZmVmfjMXjgTcaXatmcuSRUNxhLXfN2R9Wrcp1cURERNq8egOXZMzKBcATxCDaB919tplNMbNjk2R3EGNa5gGXAFucMl3DQcBpwOeznPZ8jZm9bmavAYcA32l8tZpH+/bwlWM28Peqoyi/+8FcF0dERKTNM28Dk6eNGTPGS0pKcvLcj//DOfIo47EhF3Hk3F/lpAwiIiJtiZnNcPcx2bZp5txt9LlDjM7tNzFt3l7w6qu5Lo6IiEibpsBlG3XoAIcf5vydY/A77sx1cURERNo0BS5N4NgTCymlP7PuehXKy3NdHBERkTZLgUsTOOooMHP+tno83HZbrosjIiLSZilwaQI9e8KBBxrTup4KP/lJXHhRREREmpwClyZy7LEwY80wfrb0LCpuvCnXxREREWmTFLg0kfPPhxNOgO/zMz79/w7j/ddX17+TiIiINIoClybStStMnQp/vuY93qkczKTDV1BRketSiYiItC0KXJrYiZcO5LYjHualjwby07Pn4Q4PPxxDX9rAXH8iIiI5VZDrArRFk/4ykb/1/hv/d/eRPD1/M/9+vh0ABx0EBx+c48KJiIi0Ympx2R46duSmh/uxK4uY+dImrv2FU1wMN96Y64KJiIi0bgpctpPunx/FzMl/472K/nx33RTOPRceeQTmz891yURERFovBS7b0S4/Oo+dzzgWJk/mvL6PkJ8Pv/51rkslIiLSeilw2Z7MYibdgw5i1+9MZOL4xdxxB5SVNS6bJUvgnns0uFdERESBy/ZWWAh//SsMG8Z3nj+BtWuds86CysqGZ3H55XD66TBjxvYrpoiISGugwKU57LIL/OtfjN5jDdcVXMbDD8P3vtewXT/6CO67Lx7/8Y/br4giIiKtgQKX5tKjBzz1FN8Z/Rzf5kZuvBEuOq+ClSvr3u2WW2DTJth/f/jTn9CkdiIiskNrUOBiZkeY2dtmNs/MLs+yvdDMHki2v2xmA2ps383M1prZ9+rL08wGJnnMS/Jsv/XVa2F69oRnn+W6SxbxLX7DTbfkMXhgJeedBxdfDJMnw5o16eTl5RG4HHMMXHEFLF0KTz6Zs9KLiIjkXL2Bi5nlAzcDRwJ7Ayeb2d41kp0NrHL3IcANwM9rbL8e+EcD8/w5cEOS16ok77ajXTvyr7uG3/xtN17tNp5x657g3j9s4ve/hylT4OijYe3aSHr33bBsGXznO3DkkVBcHIN0AVasgA0bclcNERGRXGhIi8tYYJ67z3f3TcD9wIQaaSYAdyWPpwKHmpkBmNlxwHvA7PryTPb5fJIHSZ7HNb5arcAxxzDi9fuYNvYqyjYUUnbKt/jTXZt44YUIUo47Dr75TRg9Gg45BNq3h4kTY5zvqadC795w2mm5roSIiEjzakjg0hf4MGO5NFmXNY27VwBlQLGZdQEuA/6vgXkWAx8nedT2XACY2TlmVmJmJcuWLWtANVqg3X
aD//wHLr0Ubr2VideN5d6fvM8LL8Bzz8EPfwiPPx5nVQOccUZ0H02bBqNGwV/+Au+9l9sqiIiINKftPTh3MtHts7apM3b32919jLuP6dGjR1Nn33zatYNrroG//x0WLWLiFUN455xr+eDtDUyZEickpYwdCy++CKWlEbTk5cUYmG2xebOCHxERaT0aErgsBPpnLPdL1mVNY2YFQBGwAjgAuMbMFgAXAz8wswvqyHMFsFOSR23P1TYdfTTMmQNf/SqDb72Uzp/ZN1pjajjwQCgqgn794Pjj4Xe/g/Xrs2f5u9/B979f98R1F10EQ4fCM880UT1ERES2o4YELtOBocnZPu2BScC0GmmmAWckj08EnvYwzt0HuPsA4EbgJ+5+U215ursDzyR5kOT5yDbUr3UpLobf/x7++c+Yoe5zn4OTT4YXXsgafVx4IaxalZ7nJdPs2fCtb8HPfha3bN57L4Ib9xg/8+GH2dNlWrECqqoaVy0REZGmUm/gkow3uQB4ApgDPOjus81sipkdmyS7gxjTMg+4BNjilOmG5Jlsvgy4JMmrOMl7x/KFL8Drr0dzyaOPwkEHxSjdGudCjxsHI0bAtdfGqdIpVVVwzjnQrVsM8r3iijhD6eqrI/3vfhfprr4a8vMjTiovhy9/GW6/HX7zm+zdRx99BAMGxNlPIiIiOeHurf42evRob7PWrHG/7Tb3QYPcwf3II93/8Q/3TZvc3f3RR907dHDv08f9mWfcly51v+GGSPr737uvW+e+336xDO677Rb33/uee0GB+4UXxtM8/HAsp9IddJB7VVX1olxxRWzbaSf31atj3eLF7vfcs2Xaxnr3XffDD3d//vns28vLty1/ERFpPYASr+WYn/OgoylubTpwSSkvd7/22ogawH3nnd2/9jX3xx/3WdM3+ZAh6aAD3A85JB1MLFzo/pOfuL/9tvvGje4nnRRpOnSIbSkrVriXlrpfd11s/9e/0tvWrnXv3t19+PDYdt117ps3u3/607F8113ptDff7P7AAw2v2ptvuu+6a+Rz2GHVt23Y4H7ppRFU3Xtv4182ERFpfeoKXCy2t25jxozxkpKSXBejeZSXR5fRgw/GedFr1kCvXqz+1mXc3flc6NCRrnq6bnsAACAASURBVF2ji6ioKHsWlZVxqnX//jEOpqaNG2HwYBg4EJ59Nk7H/vWvYyDvCy9ED9a778LXvhbdRn37RrHmzIGHH475ZwAuuSROmMrPr706c+dGT1heHnzxi9Gl9c47MWD4nXfghBPgjTegTx9Yty560HbbLb3/8uVxKYSzz4ZOnbI/x6ZN8PzzsN9+sNNODXuZRUQkd8xshruPybqxtoimNd12iBaXbDZscP/rX6OPJdWHc+WV7suWbXPWN90UWT71VDT2DBzo/pnPxLbHHku37Eya5P7GG+7t2rl/9rPRMnLkke4XXBDbP/9595kzY78ZM9xPPz16ulJOOMG9a9doDVq0KPb/3veiasOHuxcXx/PNn+/epUu0JFVWxr7r17sfeGC6pWbDhup1qKhwv/129913TzdS/eIXW6YTEZGWBXUV7QCmT3f/8pfjLe3UKaKI4493/3//L/p/GmnDhui+6dTJPT8/sv3LX2JbVZX7qFHu/fq5r1wZ6668MtLstZf7xx/Hut/+Nt2z9alPpYOdnj1jv9dei+Urr0w/74knRoBx3nmx7bHH0tvuuCPWnXJKVPfEE93N3M85J9YfdVR0haVcdVWsHzs2urKOOCKWv/rVRr8cIiLVvP12jO+T7UOBy45k9mz3b37T/XOfc9977ziyFxTEwJaHHorBKg30t79FkHDlldGwkzkAd+XKGBOTsmGD+49/7P7ee9Xz+Phj9ylT3EeOjO3/+U8EQt/8ZhSpa9fq+Tz1VDrAOf/86nlVVUVrTMeO6TTXXRfbbrstlr/73Vhet859l10imMks90UXxcvx0UcNfhmquftu9z/9aev2rc+mTenWJGmbFi2K1sglS3
JdEmmsd96p/lvy1a/Gb85zz+WuTG2ZApcd2bvvxtG8uDje7o4doyXmnnty9ut5ySVRFDP3H/yg+raqqugi2nPPCD6yWbUqurJ+9avqPyRf/3oEJXPmxABhiEAp01tvxfqf/azx5X7mmSgzuP/yl43fvy7z50d33HHHbfsZWtJyfeUr8fn59re3Pa/Fi9Xt2Ryqqty///143+64I72ub99YN2ZM7X84qqriuy2Np8BF4hSgp5+Ov3upU3gg+nuOP9791lvdFyxolqKsWRNP26WL+/LlW25fubJRDUOfWLLEvagohvwMGuR+wAHZg4CDD47t2X5sysvdzz3XfcKE6mVbsSLKPHRovFypFqFvfSuWMwOkysoIkP70p3hZ6zu4vPdenKberl3ke+edW6a58073Rx6JcTs1bd4c/wanTXOfO7fu55Lc+ec/4/3t0SPO6NvaVj939/ffj8/65z/fcgLd8vI4sK9albsypLqpm8qmTe5nnhnvW36++zHHxPq5c2PdwQfHfW1dRr/5TfzZKSlp2nLtCBS4SHWVle4vvRT9LKeemp7cBeKo//jj273P4p133F9+uenzvf76dFUeeih7mnvvje3//GcEHGefHS00r7ySPr27XbsIbmbMcH/xxfjBKiiIsTWbNsWg5NR46J493fPyYjjRTTdFcJN5avopp2Q/uMyb5/7Tn0ZA1L17/LiNHx8HpA8/TKd78MF0XoMHR3dVyuzZ8fyp7f37p+fY2VZvvRVdemeeGWOOnn66/n1qeumlbTtAb63Nm5v/Od1r/9qUl7vvsYf7kCExtisvL07z3xpVVTEYPdX6ly3Qrempp2JM2GOPbVugs2FDjF3L9p6ee26U57OfjYHzze211yIgvPrqutMtXtzw8qXGyU2eHHNedewY+95+e6x/881ocenXb8sW4srK9G/BySdvXZ3qU9vn7f774zeqrj9Ny5ZFYJVMCdbiKHCRulVVRf/KVVfFTHapAb777RdH3R//OGao25pmkGa2cWN0M+2xR/bWCff4MhcXxxiYVO9Z6sDfsaP7n/8cwUqvXtUDkOuvr57PqlXx0q1e7X7aael0Y8fGD9srr8RLBxHUvPJK/PiNHh2BSir9gQdGgOQewUynTnH21IoVMSZi553jx/H++933398/GduzbFkEV716xWSD994bB7PU2KC1a2PsT+ZB5pFHoptszZrqdamqijxPOCHKeuGFEah16hRxbdeuEcw9+uiWr2dl5ZYHw6qqGNsEcZA+7LA4eDbG2rVxcD/rrOhezDwbraaqKvc//MH90EPjI1xQEPF3Q3z8sfuzz0b5nn46zoLLnN8om9Wr432+5Zb0uhtvjM/UK69smf6yy+K1SNXh5JPdO3fessWxtu7RTKnxXDff7D5uXHyW6goOn3oqDuh5ebHfiBHu//tf/c/jHt0c//hHfC6fe8592DD/ZB7MzPf8D39I/+8xixbL+oLHjRubbmLJqqr4zkC8rosXb5mmosL9mmvc27ePlpLafh9S1q6N34mjj47lxx+P/B99NH4We/eO5332Wc/a/Zw6+3KvvaK15v33t61+zz+fPhnCPb6n3btv+YdiwYJozYb0BKPZnHxypPnxj2O5oiK+a+efnz24Xbu2eYMcBS7ScBs3Rh/HxRfHaTipc4lTwcwpp8QpOjVHqrUgS5Zk/+HK9MMfxpf76qvjH9ScOdFa8vrr6TQffBBz/j38cDyuz1NPRcCTqaoq3dQM8aP5hS9EF9OvfpW9d+7OO+OHrkePCFg6dIjyuccPR2oCwQED3AsLo1Uj5aKL4sBx223pg0z37rF8+unpcvTuHcHV+vXpQc8QvYhmcZD75jfTw6BWrYo4trCwegDx179GXh07RmvCMce4//rXMYYDokHviisi+CksdP/vf7O/dhs2xIH/u9+NszWWLInAIC8vxhJ06BDlSo0xcI9Zol97zf3JJ+MADjEe/cwzI3Dt02fLwKCqqvoBa+3a9KTUNW9XXF
H9I575ePJk/6RlbsYM91mz0l19/ftXDyRSUwucc0563Rtv+Cdnwi1fHoFQavzL5z4XrWrXXhsBwqc+FcHGPvvEQbCwMD5Dqf8b7dvHTNc33hgTP152WXQhHX98vJ4dO8a4sUWL4qu7++5xcH/iiezvhXuU57LL0nVK3XbbLVoowX3q1Ej70kvx/hxySAQrv/pVbC8sjPVjx245nG7TpujKHTQofdLjf/8b79v48TFh5t13R3A2dWr1137Tpij7OefEgXn58vizARHgFhRE60+muXNjOgdIB/9TpmxZ78znufHGSJeazXvDhvgJPO+8+GxNmpROe9RR8T3L7CY74ohIN3dufJ9TJw7UVFrqfsYZUYdsLSjvvx/fq9Trn2osh/gN69AhHaRXVbl/8Yux/pRTIs0jj2yZ5wsvxLZddon3+LXX0uN4sgVhCxbE97xnT/fLL49g7Z//jN/Gpu6eS1HgIttm7Vr3f/87jmQ775z+dKf+jlx1VRy1a/6Nb8EqK5uvO2HjxvjR+uUvq59BVZdXX40f9mwDgTdvTh/kap7htGZNuudv112j7z11UM/Pj4DtuefSXWLdusWBIjVmp6oq/vUvXbplmZYvT5/WPny4+5e+FI9Hjoz6TZzo1WZwPv/89A/x8uXRbF5cHIHJ6tURJP797/EjnCpzfn4EKMXFccD9619j//Xr06ezX3hhdEdkHlB33tn9d79LP9/MmfGDfOKJ0Z121llxkOzYMbr3UrNCX3xx7P/738dH/Omn48c4FeSdf777fffF6f977BEHmWXLogXq8MMjqBo2LF6P3r3jgNqxYxwkH3wwDsBm7sceu+Xn7de/jjLuumvknZcXk2EPGJCu1557xr7HHx+tYSedFIFD5gwHt9xS/WvZrl0EvMOGxeu5777VA4dFi2JdQUEEZ//5T9Tp3Xfja3zuuen8zjwzXpPf/S4O5KtXRz1Gjoxy//rXEaAMGFD9Of74x2gtu+SSeD1GjKgeRKamT+jQIer40ENxsB0wIF7rmkHkF78Yn5tf/CLdEtqlS9ShR48IEPbdN4LSCy6Ies+enR7I36lTvO9//GN8xk89NV7v1BlBmzdHXXr0iG3LlkX3z7hx1d+zY4+N9x7iz0DKzJmx7oc/jOXUSQCp4GjSpNhv1qxokUtNtfXhh9W/M8OHu//oR5H39ddHy1WnTnG78sp4fVKXZjnppAiQ9903gteJE9PddTfdFK1Zo0ZFvb/0pRj4f9NNEYAdcEC8Zu+9F3Xu1y/2+8Y3oiXGLN26unp1vH9FRZFPquUuddte43cUuEjTqaiII87tt8fRYK+9tjyC7LFHfMOvuSb+RulCQ1uloiJ+6LI1bFVW1j6W+sUX48c79eNYWRkBTuYPTFVVHKTOOCN+UL/znYY1oK1aFYHU+PHxr/3KK6vPneMe/zCffXbL/ObNix/J1LxAmbfRo+Mf3EcfRZ77779l61V5eTpY2muviJf//OfYL7MJPeWnP03n37Fj/NBfckm0XHTsGC0aZvEPuqaqqjjwZgYQXbvG/VlnxY/37NnxGqbGmqTmHPrzn9PrIA5+tXUBzZwZefbqFWetucf7/sILjZt+qaoqAocZM6qP39i4Mfv7+vHH8Z8js5yZDasnn1z3GLSXXkrve9hhdc95+a9/RYCy777x7//pp+P1O/PMCJpSXbV77hlBlXvUZe7c+Dz85jdRplT5vvCFCGg3bIiWgrFjoyypAfJLlqS7SlK3I46o/nqWlUUgaxbjxlKtbqNHx+ezW7fq72nKrbem83z77erbvvKV+E785jcRMLRvn255mz69enny86MegwbFc/33vxEgjxhR/T0ZPDhallLf9ZUr43lOOin9s7piRQTaqeBj3Lh0AP/OO9H6NmpUerxNKij9/e8jzdSp6f02bozP6qhRESAdfHAE4fn50bLpHsHWY4/Fd3zmzIZ1b26NugIXTfkv227VKnj5ZZg+PS4hvWwZvPpqzNkPUFgI++8PI0
fCpz4Vl6gePhy6dMltuQWIn0iz5tnvtdfgD3+AXr1g993Ttz59GpZXZSXMnw9DhtSfvrIyLlPRsyecfz7sskusX7oUPve5uERF//4wezZ07Zq9fvfdF1dZP/rouGzEEUfAhg1wxhlRD4Bbb4X16+MSFynvvx9X4+jYMa6oXtdlLyoqYPPmSNvcVq2C556L13SnnaBHDzj44IZ9NX/1q6j3pZfWXT+Axx+HU0+FlStjecCA+Ino1g3+9S+480644Yb4XGQzd2683kcfDZ/5TPVtlZWwcGH1S4E8+SS8+GLkP3gwfOlLW35e3nsv8pwzJy4dctFFMGFC7HfqqdC7d1ziJHO/Dz+M59l1Vygtrb7trbdgn32gqio+n1deGZ+TzDKtWBE/hzNnwgMPxPM+/jgccEA6XUVF/Izm5cXzNMaiRfE+Zrv8iTs89RRMnhyftSeeiOeAWD9mTPoyMUuWwI03wmOPxWVWbr45++Vhtqe6pvxX4CLbz9Kl8N//xu3FF+OotXZtevuwYTBuHBx4YHxDe/aMb3xtF1kSaSKLF0cwc/HFMH58w/d78km4+mq4557qB0qp3+bN8PTT8OijcUAfPTrXJapdRUXcOnTYctshh8T/rl//esttzzwTQcPYsfUH1u7xmrRv3zRl3l42bMhNUK3ARVqGqqr4K/raa3H73//ib+zHH1dP179//HXZZ59onRkzBvbcs/6/dSIi0ibUFbgUNHdhZAeWlxeXnB44MNpkIYKZ+fOjdWbJEnj77bgc9OzZ8fdl48ZI17Fj/MXt2zdu/frBHnvE35/dd89dnUREpFk1KHAxsyOAXwL5wO/c/Wc1thcCdwOjgRXARHdfYGZjgdtTyYDJ7v6wmQ0DHsjIYhDwI3e/0cwmA98AliXbfuDuj21V7aTly8uL7qEhQ7bcVlER42RmzIBXXokO5oUL4d//jrb+iopIt/vu0dVUXJy+7bNPBEfFxc1aHRER2b7q7Soys3zgHeAwoBSYDpzs7m9mpDkP+JS7n2tmk4Dj3X2imXUCNrl7hZn1AV4FdnX3ihr5LwQOcPf3k8Blrbtf29BKqKtoB1RVBW++GaPKXnopBgSvWJG+rV8PBQUxILisLEYFjh0bwcxee0UH9C67RBfU1oxMFRGR7WZbu4rGAvPcfX6S2f3ABODNjDQTgMnJ46nATWZm7r4+I00HIFuUdCjwrru/34CyiIS8vBghN3w4fPvb1be5RwvNgw/GmU6DB8dpEs88A+eeWz3t4MFxqkKXLnF6wuDBcNBBEdCkhtyLiEiL0ZDApS/wYcZyKXBAbWmS1pUyoBhYbmYHAHcCuwOnZba2JCYBf6qx7gIzOx0oAb7r7qtqFsrMzgHOAdhNw/slkxnst1/cMrnHuY+LF8fyvHnw8MNxPmtVVey3eXNsKyxMdz9t2hRjbfbYI86AGjkShg6NLqqWfkqAiEgb05CuohOBI9z968nyaUS3zgUZad5I0pQmy+8maZZnpNkLuAsY7+7lybr2wCJgH3dfkqzrBSwnWmd+DPRx96/VVUZ1FUmTcI8JI/773+iGWrQoup06dIhupzfeiMHDmTp0iIkTRo+OCTAGDIhgpkePWFdYmM5bXVIiIg2yrV1FC4H+Gcv9knXZ0pSaWQFQRAzS/YS7zzGztcBwoiUF4EhgZipoSdJ98tjMfgv8vQFlFNl2ZtGqsscetadZuTJabebNgw8+iFnGli2LmaoefbR62sJC2HvvCH4WLYqzoUaPju6t/v3jtttuca/J+EREGqQhgct0YKiZDSQClEnAKTXSTAPOAF4ETgSedndP9vkw6T7aHdgTWJCx38nU6CYysz7unrTlczzwRuOqJLId7bxzjIE56KAtty1dGlNhbtoECxbElKSzZ8cZTn36xBw2M2ZE91TNls6ddkoHMf37x1SuS5dGYPTpT8NRR8WEfZrLRkR2cA2agM7MjgJuJE6HvtPdrzazKcS1BKaZWQfgHmAUsBKY5O7zk26ly4
HNQBUwxd3/muTZGfgAGOTuZRnPdQ8wkugqWgB8MyOQyUpdRdKqbN4cp3V/+GHcPvgg/Ti1vHZtzCRcWAjvvpvet0uXmFm4qCim6Fy/Pm677hqBzR57xP3gwbF/cTG0a5e7uoqIbAXNnCvSmn34Ycw1X1oap3anbuvXR/DSoUOkeeeduMhJJrMYRLznnnH6d7t20Zqz667RCpTaf8CACHrat4d166LVqHv3nFRXREQz54q0Zv37w9lnNyxtWVkEMAsWRLfVRx/F8ltvxcDizZth9eq41VRQEDMUr1kTy5/6FBx6aKxftSqugHfggREILVwYY3f22SfG7BTop0REmod+bUTakqKiuBL3/vvXnW7t2ghqysujhWXevDhrav36aImprIzL9t50U8xn0717DEKurNwyr86dY+bjvn2ja6qwMFpu2rePx4MHRxDUr1+07hQWxq2gQGdaiUijqatIRGpXVZWeiG/9+hhcvGhRBCE77QSzZsHLL8N778X6lSujmyl127AhPTdOTWYRwHToEONxBg+OfDt3rn4bPDjmzikujvw3box0Grsj0mZpjIuI5EbqiuCvvhpnSW3cmL6Vl6fvFy2KQciLF0eAtG5d7Fub/Pxo4Ul1fRUXw6hR0Y21dm3kO2wY7LtvpF26NAKlQYNiLE/v3s33GohIo2mMi4jkRuYVwRvDPYKPNWtibM6sWTF+J3WW1IIFERAVFkK3bhHwzJoVl3Xo2jWClXvvrT3/PfeEz38+uqtSp51v3hytPyNGxNidtWvjiuVm0QXXo0cMYh48OAY6i0hOKHARkZbHLIKIDh0iYPjsZxufx5o1MY+OWXRFVVZGl9Zrr8X4nd//PoKgnj0j2GnXLoKVRx/NPpYnU9++0X1VWBjPk5cXc/ykbjvtFC1Ja9ZE0DNkSLT29OoV28vKYnBznz6xLCINpq4iEdkx1XYZhvLyGKzcrVsEGhCBxpIl0dIzdy7MnBndX1VVEfRUVcWZVytXxn3qdzU/v/4gaMCACGry8qI8eXnREnTggXDCCZHmb3+LIKy4OAK5nj3jvrg4Bk6nbh06NNWrI5JTGuMiItJcqqpi3E3qDKo1a2L8zoIFEfysWhWtMN27x2SDM2bE6eXu6dv69fD669Xz3XXXCKDWrav9udu3j6AnPz8Cr+LiaB3ac89o3Vm0KG7du6cHWLdrF+XZe+9oGUoFad26xan4GgQtOaAxLiIizSUvLwKClG7dYuDwqFGNy2fRIpg2LVphjj46Ag2IoGbZshibk2rhSd1Wr4aKiritXh1z+XzwAfz733GGV+fOEQB9/HHk0ZC6dOkS443y8mLA8557xnO/+WY8T79+MYZp7NjoPlu6NFqlIFqGevaMlqvU4513Tl+J3T2CLZ0WL42gFhcRkbauqirG73Ttmg4Systj3ebNEeDMng3z50drTM+eEfi8917cFxZGujlzYjLDXr1gr71ifWlpTHKYClYgWnzcs58ZlpeXbllK6dIlBj0PGRLP36lT3Dp3jlajESPibLAlS6KMH38cgVjXrhEQ9u8fLVczZkTQOHRopO/VS0FRK6UWFxGRHVleXrT8ZEoNfoboRhoxYtueY8WKmMSwT59ogcnLixahpUu3vKXm8DGL1pyyshhXNHt2BErr1sWtomLbytS1awQ1hYVRnhUr4vl79IhJGnv0iOdNXb19yJB0ANW+fVxKY+3amEBx1KjIZ+PGyLugILrR2rWLQK2xAVIqsNOFUxtNLS4iItIybd4cp7q//noEGKmgqLg4Lk+xYgW88kp0h40cGcHI2rXpFqC5c6NFaPPmGCSdGty8cCFMnx6B1dCh0X22cGE8R1lZ/eWqKT8/uswGDIhWIki3KLmnn799+3S533wzxj+NHg2f+UxcOmPYsAgwU919FRWxXypI2mWXmIOoffsme4lbKg3OFRERqY97BDPz5kWwsdtuEWjMmhWn0VdVpYOGiopIU1ER444+/DAGYJeXp/NLtcKkWmU2bY
q0qcHQXbvCSy9BSUm6JachOnWK/PLz00FN5sVUU8+dev7U2Wrdu0eQ1qlTBGibNkXA1b9/lHvp0qjvsGERJK5dGy1g7dunu+6aqetNXUUiIiL1MYtWmeLi6usPOyxu20tFRQQ9b78dY3cKCtK3vLxoddm0KQZUL1wYwURlZfq2cWOMR3r88diWOYYo9biqatu73goKYvxTly7py3o8/vi2dzM2thjN+mwiIiJSXUFBjKsZMmT7PYd7dE0tXhzBUVFRPG9paXS1deoU3Wjr18cA7I8+ijRdu0bAs25dnLmWmmk6dbHUrl23X5lrocBFRESkrTOL8TM1B2n37w+f/nT1dZ//fPOVayvk5boAIiIiIg3VoMDFzI4ws7fNbJ6ZXZ5le6GZPZBsf9nMBiTrx5rZrOT2qpkdn7HPAjN7PdlWkrF+ZzP7p5nNTe67b3s1RUREpC2oN3Axs3zgZuBIYG/gZDPbu0ays4FV7j4EuAH4ebL+DWCMu48EjgBuM7PM7qlD3H1kjZHDlwNPuftQ4KlkWURERKRBLS5jgXnuPt/dNwH3AxNqpJkA3JU8ngocambm7uvdPTWMuQPQkHOvM/O6CziuAfuIiIjIDqAhgUtf4MOM5dJkXdY0SaBSBhQDmNkBZjYbeB04NyOQceBJM5thZudk5NXL3Rcnjz8CemUrlJmdY2YlZlayrCHX3BAREZFWb7sPznX3l919H2B/4Ptmlrru+mfdfT+iC+p8MxufZV+nllYad7/d3ce4+5gePXpsr+KLiIhIC9KQwGUh0D9juV+yLmuaZAxLEbAiM4G7zwHWAsOT5YXJ/VLgYaJLCmCJmfVJ8uoDLG14dURERKQta0jgMh0YamYDzaw9MAmYViPNNOCM5PGJwNPu7sk+BQBmtjuwJ7DAzDqbWddkfWfgcGIgb828zgAe2bqqiYiISFvToGsVmdlRwI1APnCnu19tZlOAEneflnT/3AOMAlYCk9x9vpmdRpwVtBmoAqa4+1/NbBDRygIxCd597n518lzFwIPAbsD7wFfcfWU95VuWpN0edgGWb6e8W4odoY6wY9RTdWw7doR6qo5tR1PXc3d3zzoOpE1cZHF7MrOS2i701FbsCHWEHaOeqmPbsSPUU3VsO5qznpo5V0RERFoNBS4iIiLSaihwqd/tuS5AM9gR6gg7Rj1Vx7ZjR6in6th2NFs9NcZFREREWg21uIiIiEirocBFREREWg0FLrUwsyPM7G0zm2dmbeYK1WbW38yeMbM3zWy2mX07WT/ZzBaa2azkdlSuy7otzGyBmb2e1KUkWbezmf3TzOYm991zXc6tZWbDMt6rWWa22swubgvvo5ndaWZLzeyNjHVZ3zsLv0q+p6+Z2X65K3nD1VLHX5jZW0k9HjaznZL1A8xsQ8Z7emvuSt44tdSz1s+omX0/eS/fNrMv5qbUjVNLHR/IqN8CM5uVrG+V72Udx43cfC/dXbcaN2KivXeBQUB74FVg71yXq4nq1gfYL3ncFXgH2BuYDHwv1+VrwnouAHapse4a4PLk8eXAz3Ndziaqaz5xQdLd28L7CIwH9gPeqO+9A44C/gEYcCDwcq7Lvw11PBwoSB7/PKOOAzLTtaZbLfXM+hlNfodeBQqBgclvcH6u67A1dayx/TrgR635vazjuJGT76VaXLIbC8xz9/nuvgm4H5iQ4zI1CXdf7O4zk8drgDlsebXvtmoCcFfy+C7guByWpSkdCrzr7ttr9uhm5e7PEjNwZ6rtvZsA3O3hJWAnS6511pJlq6O7P+nuFcniS8R14Vq1Wt7L2kwA7nf3je7+HjCP9DXsWqy66mhmBnwF+FOzFqqJ1XHcyMn3UoFLdn2BDzOWS2mDB3czG0BcpuHlZNUFSbPena25GyXhwJNmNsPMzknW9XL3xcnjj4BeuSlak5tE9R/GtvQ+ptT23rXV7+rXiH+sKQPN7BUz+4+ZjctVoZpQts9oW3wvxwFL3H1uxrpW/V7WOG7k5HupwG
UHZWZdgIeAi919NXALMBgYCSwmmjdbs8+6+37AkcD5ZjY+c6NHe2arnwvA4sKnxwJ/Tla1GVBzfwAABEFJREFUtfdxC23lvauNmV0BVAD3JqsWA7u5+yjgEuA+M+uWq/I1gTb/Gc1wMtX/VLTq9zLLceMTzfm9VOCS3UKgf8Zyv2Rdm2Bm7YgP373u/hcAd1/i7pXuXgX8llbQRFsXd1+Y3C8lLug5FliSaq5M7pfmroRN5khgprsvgbb3Pmao7b1rU99VMzsTOAY4NTkQkHSdrEgezyDGfuyRs0Juozo+o23tvSwAvgw8kFrXmt/LbMcNcvS9VOCS3XRgqJkNTP7RTgKm5bhMTSLpc70DmOPu12esz+x/PB54o+a+rYWZdTazrqnHxKDHN4j38Iwk2RnAI7kpYZOq9o+uLb2PNdT23k0DTk/OYjgQKMtoum5VzOwI4P8DjnX39Rnre5hZfvJ4EDAUmJ+bUm67Oj6j04BJZlZoZgOJev6vucvXhL4AvOXupakVrfW9rO24Qa6+l7kerdxSb8So6HeIiPiKXJenCev1WaI57zVgVnI7CrgHeD1ZPw3ok+uybkMdBxFnJ7wKzE69f0Ax8BQwF/gXsHOuy7qN9ewMrACKMta1+veRCMQWA5uJvvGza3vviLMWbk6+p68DY3Jd/m2o4zxiXEDqe3lrkvaE5HM8C5gJfCnX5d/Getb6GQWuSN7Lt4Ejc13+ra1jsv4PwLk10rbK97KO40ZOvpea8l9ERERaDXUViYiISKuhwEVERERaDQUuIiIi0moocBEREZFWQ4GLiIiItBoKXESk1TOzz5nZ33NdDhHZ/hS4iIiISKuhwEVEmo2ZfdXM/mdms8zsNjPLN7O1ZnaDmc02s6fMrEeSdqSZvZRcjO/h1MX4zGyImf3LzF41s5lmNjjJvouZTTWzt8zs3mS2T8zsZ2b2ZpLPtTmquog0EQUuItIszGwvYCJwkLuPBCqBU4kZgEvcfR/gP8D/S3a5G7jM3T9FzL6ZWn8vcLO77wt8hpi1FOKKtRcDexOzJx9kZsXEtPL7JPlctX1rKSLbmwIXEWkuhwKjgelmNitZHgRUkb4Q3R+Bz5pZEbCTu/8nWX8XMD65BlVfd38YwN3LPX1dn/+5e6nHxftmAQOAMqAcuMPMvgx8cg0gEWmdFLiISHMx4C53H5nchrn75CzptvY6JBszHlcCBe5eQVx9eCpx1eXHtzJvEWkhFLiISHN5CjjRzHoCmNnOZrY78Tt0YpLmFOB5dy8DVpnZuGT9acB/3H0NUGpmxyV5FJpZp9qe0My6EBehfAz4DrDv9qiYiDSfglwXQER2DO7+ppldCTxpZnnE1XTPB9YBY5NtS4lxMABnALcmgcl84Kxk/WnAbWY2JcnjpDqetivwiJl1IFp8LmniaolIM9PVoUUkp8xsrbt3yXU5RKR1UFeRiIiItBpqcREREZFWQy0uIiIi0moocBEREZFWQ4GLiIiItBoKXERERKTV+P832nAZBaNgFIyCUTAKRsGQAQCyDP31iQuOBQAAAABJRU5ErkJggg==\n",
"text/plain": [
"<Figure size 432x288 with 1 Axes>"
]
},
"metadata": {},
"output_type": "display_data"
}
],
"source": []
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## (2) Training with the `tf.GradientTape()` function\n",
"\n",
"Instead of using `fit()`, calculate the loss in your own `train()` function, find the gradients, and apply them to the variables.\n",
"\n",
"The `train_tf()` function is sped up by declaring the `compute_loss_and_grads()` function, which computes the loss and gradients, with `@tf.function`.\n",
"\n",
"## (2) `tf.GradientTape()` 関数を使った学習\n",
"\n",
"`fit()` 関数を使わずに、自分で記述した `train()` 関数内で loss を計算し、gradients を求めて、変数に適用する。\n",
"\n",
"`train_tf()` 関数では、loss と gradients の計算を行う `compute_loss_and_grads()` 関数を `@tf.function` 宣言することで高速化を図っている。\n"
]
},
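{
"cell_type": "markdown",
"metadata": {},
"source": [
"A minimal sketch of such a hand-written training step (illustrative only; the notebook's own `nw.AutoEncoder.train()` may differ): compute the reconstruction loss under `tf.GradientTape()`, take the gradients, and apply them to the variables with the optimizer.\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import tensorflow as tf\n",
"\n",
"mse = tf.keras.losses.MeanSquaredError()\n",
"\n",
"def train_step(model, optimizer, x_batch):\n",
"    # Record the forward pass so gradients can be computed afterwards\n",
"    with tf.GradientTape() as tape:\n",
"        reconstruction = model(x_batch, training=True)\n",
"        loss = mse(x_batch, reconstruction)\n",
"    grads = tape.gradient(loss, model.trainable_variables)\n",
"    # Apply the gradients to the model's trainable variables\n",
"    optimizer.apply_gradients(zip(grads, model.trainable_variables))\n",
"    return loss\n"
]
},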
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "L-s-ylxkD9O6"
},
"outputs": [],
"source": [
"save_path2 = '/content/drive/MyDrive/ColabRun/AE02/'"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "YpFibxw-CVzk"
},
"outputs": [],
"source": [
"from nw.AutoEncoder import AutoEncoder\n",
"\n",
"AE2 = AutoEncoder(\n",
" input_dim = (28, 28, 1),\n",
" encoder_conv_filters = [32, 64, 64, 64],\n",
" encoder_conv_kernel_size = [3, 3, 3, 3],\n",
" encoder_conv_strides = [1, 2, 2, 1],\n",
" decoder_conv_t_filters = [64, 64, 32, 1],\n",
" decoder_conv_t_kernel_size = [3, 3, 3, 3],\n",
" decoder_conv_t_strides = [1, 2, 2, 1],\n",
" z_dim = 2\n",
")"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "xYyyVaOw_CeI"
},
"outputs": [],
"source": [
"optimizer2 = tf.keras.optimizers.Adam(learning_rate=learning_rate)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"executionInfo": {
"elapsed": 117923,
"status": "ok",
"timestamp": 1637564297137,
"user": {
"displayName": "Yoshihisa Nitta",
"photoUrl": "https://lh3.googleusercontent.com/a-/AOh14GgJLeg9AmjfexROvC3P0wzJdd5AOGY_VOu-nxnh=s64",
"userId": "15888006800030996813"
},
"user_tz": -540
},
"id": "JmYIPWvfCd6E",
"outputId": "2c1d1f30-afb3-4ee9-d75b-f0994c82a923"
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"1/3 1875 loss: 0.0406 val loss: 0.0481 0:00:39.580740\n",
"2/3 1875 loss: 0.0391 val loss: 0.0448 0:01:18.183180\n",
"3/3 1875 loss: 0.0529 val loss: 0.0432 0:01:56.549291\n"
]
}
],
"source": [
"# At first, train for a few epochs.\n",
"# まず、少ない回数 training してみる\n",
"\n",
"loss2_1, vloss2_1 = AE2.train(\n",
" x_train,\n",
" x_train,\n",
" batch_size=32,\n",
" epochs = 3, \n",
" shuffle=True,\n",
" run_folder= save_path2,\n",
" optimizer = optimizer2,\n",
" save_epoch_interval=50,\n",
" validation_data=(x_test, x_test)\n",
" )"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"executionInfo": {
"elapsed": 5,
"status": "ok",
"timestamp": 1637564297137,
"user": {
"displayName": "Yoshihisa Nitta",
"photoUrl": "https://lh3.googleusercontent.com/a-/AOh14GgJLeg9AmjfexROvC3P0wzJdd5AOGY_VOu-nxnh=s64",
"userId": "15888006800030996813"
},
"user_tz": -540
},
"id": "ECMO7RjRDx9d",
"outputId": "d4e7ad93-5757-4f1e-f1f4-bfdedd92dd52"
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"3\n"
]
}
],
"source": [
"# Load the parameters and weights saved earlier.\n",
"# 保存したパラメータと、重みを読み込む。\n",
"\n",
"AE2_work = AutoEncoder.load(save_path2)\n",
"print(AE2_work.epoch)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"executionInfo": {
"elapsed": 3066556,
"status": "ok",
"timestamp": 1637567363691,
"user": {
"displayName": "Yoshihisa Nitta",
"photoUrl": "https://lh3.googleusercontent.com/a-/AOh14GgJLeg9AmjfexROvC3P0wzJdd5AOGY_VOu-nxnh=s64",
"userId": "15888006800030996813"
},
"user_tz": -540
},
"id": "O6zeeIiBFEXv",
"outputId": "0e2a95ce-c41d-4a3f-8291-41b32fff9f84"
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"4/200 1875 loss: 0.0441 val loss: 0.0430 0:00:16.042304\n",
"5/200 1875 loss: 0.0425 val loss: 0.0423 0:00:31.745769\n",
"6/200 1875 loss: 0.0419 val loss: 0.0420 0:00:47.550041\n",
"7/200 1875 loss: 0.0415 val loss: 0.0414 0:01:03.278043\n",
"8/200 1875 loss: 0.0412 val loss: 0.0411 0:01:18.886151\n",
"9/200 1875 loss: 0.0409 val loss: 0.0408 0:01:34.403325\n",
"10/200 1875 loss: 0.0406 val loss: 0.0404 0:01:49.879346\n",
"11/200 1875 loss: 0.0404 val loss: 0.0406 0:02:05.370021\n",
"12/200 1875 loss: 0.0403 val loss: 0.0406 0:02:21.087446\n",
"13/200 1875 loss: 0.0401 val loss: 0.0401 0:02:36.769709\n",
"14/200 1875 loss: 0.0399 val loss: 0.0401 0:02:52.391754\n",
"15/200 1875 loss: 0.0398 val loss: 0.0401 0:03:07.901212\n",
"16/200 1875 loss: 0.0397 val loss: 0.0400 0:03:23.392766\n",
"17/200 1875 loss: 0.0396 val loss: 0.0397 0:03:38.888386\n",
"18/200 1875 loss: 0.0394 val loss: 0.0396 0:03:54.470497\n",
"19/200 1875 loss: 0.0393 val loss: 0.0397 0:04:10.006842\n",
"20/200 1875 loss: 0.0392 val loss: 0.0396 0:04:25.572727\n",
"21/200 1875 loss: 0.0391 val loss: 0.0395 0:04:41.166492\n",
"22/200 1875 loss: 0.0391 val loss: 0.0395 0:04:56.777322\n",
"23/200 1875 loss: 0.0390 val loss: 0.0393 0:05:12.350585\n",
"24/200 1875 loss: 0.0389 val loss: 0.0394 0:05:28.061553\n",
"25/200 1875 loss: 0.0388 val loss: 0.0400 0:05:43.522782\n",
"26/200 1875 loss: 0.0387 val loss: 0.0391 0:05:59.371244\n",
"27/200 1875 loss: 0.0387 val loss: 0.0394 0:06:15.072191\n",
"28/200 1875 loss: 0.0386 val loss: 0.0394 0:06:30.590397\n",
"29/200 1875 loss: 0.0385 val loss: 0.0389 0:06:46.193198\n",
"30/200 1875 loss: 0.0385 val loss: 0.0393 0:07:01.744139\n",
"31/200 1875 loss: 0.0385 val loss: 0.0392 0:07:17.398188\n",
"32/200 1875 loss: 0.0384 val loss: 0.0391 0:07:33.097819\n",
"33/200 1875 loss: 0.0383 val loss: 0.0388 0:07:48.744927\n",
"34/200 1875 loss: 0.0382 val loss: 0.0388 0:08:04.346553\n",
"35/200 1875 loss: 0.0382 val loss: 0.0389 0:08:19.798364\n",
"36/200 1875 loss: 0.0381 val loss: 0.0390 0:08:35.371745\n",
"37/200 1875 loss: 0.0381 val loss: 0.0386 0:08:51.082935\n",
"38/200 1875 loss: 0.0380 val loss: 0.0388 0:09:06.822892\n",
"39/200 1875 loss: 0.0380 val loss: 0.0385 0:09:22.394035\n",
"40/200 1875 loss: 0.0379 val loss: 0.0389 0:09:37.953852\n",
"41/200 1875 loss: 0.0379 val loss: 0.0387 0:09:53.514515\n",
"42/200 1875 loss: 0.0378 val loss: 0.0387 0:10:09.035459\n",
"43/200 1875 loss: 0.0378 val loss: 0.0386 0:10:24.666275\n",
"44/200 1875 loss: 0.0378 val loss: 0.0388 0:10:40.257557\n",
"45/200 1875 loss: 0.0377 val loss: 0.0385 0:10:55.745129\n",
"46/200 1875 loss: 0.0377 val loss: 0.0388 0:11:11.432226\n",
"47/200 1875 loss: 0.0376 val loss: 0.0386 0:11:27.144229\n",
"48/200 1875 loss: 0.0376 val loss: 0.0390 0:11:42.676218\n",
"49/200 1875 loss: 0.0375 val loss: 0.0388 0:11:58.487087\n",
"50/200 1875 loss: 0.0375 val loss: 0.0383 0:12:14.850839\n",
"51/200 1875 loss: 0.0375 val loss: 0.0390 0:12:30.468638\n",
"52/200 1875 loss: 0.0375 val loss: 0.0388 0:12:46.049661\n",
"53/200 1875 loss: 0.0374 val loss: 0.0384 0:13:01.636156\n",
"54/200 1875 loss: 0.0374 val loss: 0.0383 0:13:17.325480\n",
"55/200 1875 loss: 0.0374 val loss: 0.0385 0:13:32.836645\n",
"56/200 1875 loss: 0.0374 val loss: 0.0388 0:13:48.441919\n",
"57/200 1875 loss: 0.0373 val loss: 0.0384 0:14:03.917869\n",
"58/200 1875 loss: 0.0373 val loss: 0.0388 0:14:19.634660\n",
"59/200 1875 loss: 0.0372 val loss: 0.0389 0:14:35.261167\n",
"60/200 1875 loss: 0.0372 val loss: 0.0384 0:14:50.896159\n",
"61/200 1875 loss: 0.0372 val loss: 0.0390 0:15:06.445663\n",
"62/200 1875 loss: 0.0372 val loss: 0.0381 0:15:22.134292\n",
"63/200 1875 loss: 0.0372 val loss: 0.0382 0:15:37.757501\n",
"64/200 1875 loss: 0.0371 val loss: 0.0384 0:15:53.316315\n",
"65/200 1875 loss: 0.0371 val loss: 0.0382 0:16:08.820412\n",
"66/200 1875 loss: 0.0371 val loss: 0.0385 0:16:24.565601\n",
"67/200 1875 loss: 0.0370 val loss: 0.0384 0:16:40.101123\n",
"68/200 1875 loss: 0.0370 val loss: 0.0383 0:16:55.609609\n",
"69/200 1875 loss: 0.0370 val loss: 0.0382 0:17:11.264953\n",
"70/200 1875 loss: 0.0370 val loss: 0.0383 0:17:26.949355\n",
"71/200 1875 loss: 0.0370 val loss: 0.0381 0:17:42.623016\n",
"72/200 1875 loss: 0.0369 val loss: 0.0381 0:17:58.321779\n",
"73/200 1875 loss: 0.0369 val loss: 0.0382 0:18:13.832138\n",
"74/200 1875 loss: 0.0369 val loss: 0.0381 0:18:29.598127\n",
"75/200 1875 loss: 0.0369 val loss: 0.0383 0:18:45.208392\n",
"76/200 1875 loss: 0.0368 val loss: 0.0385 0:19:00.743062\n",
"77/200 1875 loss: 0.0368 val loss: 0.0381 0:19:16.186948\n",
"78/200 1875 loss: 0.0368 val loss: 0.0381 0:19:31.760451\n",
"79/200 1875 loss: 0.0368 val loss: 0.0385 0:19:47.388234\n",
"80/200 1875 loss: 0.0367 val loss: 0.0383 0:20:02.935055\n",
"81/200 1875 loss: 0.0367 val loss: 0.0385 0:20:18.402500\n",
"82/200 1875 loss: 0.0367 val loss: 0.0381 0:20:33.940910\n",
"83/200 1875 loss: 0.0367 val loss: 0.0384 0:20:49.569920\n",
"84/200 1875 loss: 0.0367 val loss: 0.0385 0:21:05.242798\n",
"85/200 1875 loss: 0.0366 val loss: 0.0382 0:21:20.880114\n",
"86/200 1875 loss: 0.0367 val loss: 0.0381 0:21:36.641503\n",
"87/200 1875 loss: 0.0366 val loss: 0.0381 0:21:52.095492\n",
"88/200 1875 loss: 0.0366 val loss: 0.0379 0:22:07.601546\n",
"89/200 1875 loss: 0.0366 val loss: 0.0381 0:22:23.401748\n",
"90/200 1875 loss: 0.0366 val loss: 0.0387 0:22:39.066528\n",
"91/200 1875 loss: 0.0366 val loss: 0.0387 0:22:54.610725\n",
"92/200 1875 loss: 0.0365 val loss: 0.0385 0:23:10.169099\n",
"93/200 1875 loss: 0.0365 val loss: 0.0385 0:23:25.674254\n",
"94/200 1875 loss: 0.0365 val loss: 0.0381 0:23:41.366783\n",
"95/200 1875 loss: 0.0365 val loss: 0.0382 0:23:56.902391\n",
"96/200 1875 loss: 0.0365 val loss: 0.0382 0:24:12.496421\n",
"97/200 1875 loss: 0.0365 val loss: 0.0383 0:24:28.063963\n",
"98/200 1875 loss: 0.0364 val loss: 0.0384 0:24:43.599283\n",
"99/200 1875 loss: 0.0365 val loss: 0.0381 0:24:59.157835\n",
"100/200 1875 loss: 0.0364 val loss: 0.0379 0:25:15.526026\n",
"101/200 1875 loss: 0.0364 val loss: 0.0387 0:25:31.212898\n",
"102/200 1875 loss: 0.0364 val loss: 0.0383 0:25:46.802330\n",
"103/200 1875 loss: 0.0364 val loss: 0.0382 0:26:02.178094\n",
"104/200 1875 loss: 0.0364 val loss: 0.0382 0:26:17.746102\n",
"105/200 1875 loss: 0.0363 val loss: 0.0382 0:26:33.309578\n",
"106/200 1875 loss: 0.0363 val loss: 0.0384 0:26:49.121648\n",
"107/200 1875 loss: 0.0363 val loss: 0.0381 0:27:04.702489\n",
"108/200 1875 loss: 0.0363 val loss: 0.0382 0:27:20.170574\n",
"109/200 1875 loss: 0.0363 val loss: 0.0379 0:27:35.856174\n",
"110/200 1875 loss: 0.0363 val loss: 0.0381 0:27:51.299808\n",
"111/200 1875 loss: 0.0362 val loss: 0.0384 0:28:06.870872\n",
"112/200 1875 loss: 0.0362 val loss: 0.0381 0:28:22.438025\n",
"113/200 1875 loss: 0.0362 val loss: 0.0383 0:28:37.875336\n",
"114/200 1875 loss: 0.0362 val loss: 0.0385 0:28:53.328504\n",
"115/200 1875 loss: 0.0362 val loss: 0.0382 0:29:08.972971\n",
"116/200 1875 loss: 0.0362 val loss: 0.0379 0:29:24.502631\n",
"117/200 1875 loss: 0.0362 val loss: 0.0382 0:29:39.941896\n",
"118/200 1875 loss: 0.0362 val loss: 0.0381 0:29:55.477538\n",
"119/200 1875 loss: 0.0362 val loss: 0.0384 0:30:11.112526\n",
"120/200 1875 loss: 0.0361 val loss: 0.0381 0:30:26.374847\n",
"121/200 1875 loss: 0.0361 val loss: 0.0380 0:30:41.861327\n",
"122/200 1875 loss: 0.0361 val loss: 0.0383 0:30:57.370377\n",
"123/200 1875 loss: 0.0361 val loss: 0.0381 0:31:12.900791\n",
"124/200 1875 loss: 0.0361 val loss: 0.0380 0:31:28.312363\n",
"125/200 1875 loss: 0.0361 val loss: 0.0380 0:31:43.843139\n",
"126/200 1875 loss: 0.0361 val loss: 0.0385 0:31:59.553265\n",
"127/200 1875 loss: 0.0361 val loss: 0.0385 0:32:14.916876\n",
"128/200 1875 loss: 0.0361 val loss: 0.0381 0:32:30.487089\n",
"129/200 1875 loss: 0.0360 val loss: 0.0380 0:32:45.878726\n",
"130/200 1875 loss: 0.0360 val loss: 0.0382 0:33:01.336908\n",
"131/200 1875 loss: 0.0360 val loss: 0.0377 0:33:16.793144\n",
"132/200 1875 loss: 0.0360 val loss: 0.0383 0:33:32.367575\n",
"133/200 1875 loss: 0.0360 val loss: 0.0383 0:33:47.764421\n",
"134/200 1875 loss: 0.0360 val loss: 0.0381 0:34:03.307962\n",
"135/200 1875 loss: 0.0360 val loss: 0.0383 0:34:18.773369\n",
"136/200 1875 loss: 0.0360 val loss: 0.0380 0:34:34.307721\n",
"137/200 1875 loss: 0.0360 val loss: 0.0382 0:34:49.981894\n",
"138/200 1875 loss: 0.0360 val loss: 0.0384 0:35:05.470105\n",
"139/200 1875 loss: 0.0359 val loss: 0.0383 0:35:20.803749\n",
"140/200 1875 loss: 0.0359 val loss: 0.0379 0:35:36.185748\n",
"141/200 1875 loss: 0.0359 val loss: 0.0382 0:35:51.533243\n",
"142/200 1875 loss: 0.0359 val loss: 0.0380 0:36:06.931450\n",
"143/200 1875 loss: 0.0359 val loss: 0.0381 0:36:22.431496\n",
"144/200 1875 loss: 0.0359 val loss: 0.0381 0:36:37.869902\n",
"145/200 1875 loss: 0.0359 val loss: 0.0384 0:36:53.547983\n",
"146/200 1875 loss: 0.0359 val loss: 0.0383 0:37:09.217082\n",
"147/200 1875 loss: 0.0359 val loss: 0.0382 0:37:24.778358\n",
"148/200 1875 loss: 0.0358 val loss: 0.0379 0:37:40.239433\n",
"149/200 1875 loss: 0.0359 val loss: 0.0381 0:37:55.704042\n",
"150/200 1875 loss: 0.0358 val loss: 0.0381 0:38:12.101171\n",
"151/200 1875 loss: 0.0358 val loss: 0.0380 0:38:27.662723\n",
"152/200 1875 loss: 0.0358 val loss: 0.0379 0:38:43.202257\n",
"153/200 1875 loss: 0.0358 val loss: 0.0385 0:38:58.810277\n",
"154/200 1875 loss: 0.0358 val loss: 0.0380 0:39:14.231378\n",
"155/200 1875 loss: 0.0358 val loss: 0.0381 0:39:29.652152\n",
"156/200 1875 loss: 0.0358 val loss: 0.0379 0:39:45.085332\n",
"157/200 1875 loss: 0.0358 val loss: 0.0380 0:40:00.572288\n",
"158/200 1875 loss: 0.0358 val loss: 0.0381 0:40:16.141797\n",
"159/200 1875 loss: 0.0357 val loss: 0.0381 0:40:31.634852\n",
"160/200 1875 loss: 0.0357 val loss: 0.0381 0:40:47.056919\n",
"161/200 1875 loss: 0.0357 val loss: 0.0383 0:41:02.554172\n",
"162/200 1875 loss: 0.0358 val loss: 0.0380 0:41:18.121788\n",
"163/200 1875 loss: 0.0357 val loss: 0.0379 0:41:33.599777\n",
"164/200 1875 loss: 0.0357 val loss: 0.0385 0:41:49.118886\n",
"165/200 1875 loss: 0.0357 val loss: 0.0378 0:42:04.560262\n",
"166/200 1875 loss: 0.0357 val loss: 0.0381 0:42:20.288644\n",
"167/200 1875 loss: 0.0357 val loss: 0.0381 0:42:35.660883\n",
"168/200 1875 loss: 0.0357 val loss: 0.0383 0:42:51.115505\n",
"169/200 1875 loss: 0.0357 val loss: 0.0380 0:43:06.762465\n",
"170/200 1875 loss: 0.0356 val loss: 0.0383 0:43:22.257651\n",
"171/200 1875 loss: 0.0356 val loss: 0.0383 0:43:37.670103\n",
"172/200 1875 loss: 0.0357 val loss: 0.0380 0:43:53.056826\n",
"173/200 1875 loss: 0.0357 val loss: 0.0381 0:44:08.524716\n",
"174/200 1875 loss: 0.0356 val loss: 0.0381 0:44:24.027149\n",
"175/200 1875 loss: 0.0356 val loss: 0.0379 0:44:39.346028\n",
"176/200 1875 loss: 0.0356 val loss: 0.0381 0:44:54.734347\n",
"177/200 1875 loss: 0.0356 val loss: 0.0384 0:45:10.213102\n",
"178/200 1875 loss: 0.0356 val loss: 0.0379 0:45:25.773002\n",
"179/200 1875 loss: 0.0356 val loss: 0.0382 0:45:41.326772\n",
"180/200 1875 loss: 0.0356 val loss: 0.0380 0:45:56.666135\n",
"181/200 1875 loss: 0.0356 val loss: 0.0382 0:46:11.978621\n",
"182/200 1875 loss: 0.0356 val loss: 0.0384 0:46:27.301725\n",
"183/200 1875 loss: 0.0356 val loss: 0.0381 0:46:42.745618\n",
"184/200 1875 loss: 0.0356 val loss: 0.0380 0:46:58.128569\n",
"185/200 1875 loss: 0.0355 val loss: 0.0380 0:47:13.711115\n",
"186/200 1875 loss: 0.0355 val loss: 0.0379 0:47:29.307111\n",
"187/200 1875 loss: 0.0356 val loss: 0.0382 0:47:44.756529\n",
"188/200 1875 loss: 0.0355 val loss: 0.0381 0:48:00.214915\n",
"189/200 1875 loss: 0.0355 val loss: 0.0383 0:48:15.668996\n",
"190/200 1875 loss: 0.0355 val loss: 0.0381 0:48:31.229319\n",
"191/200 1875 loss: 0.0355 val loss: 0.0382 0:48:46.675617\n",
"192/200 1875 loss: 0.0355 val loss: 0.0380 0:49:02.254153\n",
"193/200 1875 loss: 0.0355 val loss: 0.0382 0:49:17.595616\n",
"194/200 1875 loss: 0.0355 val loss: 0.0380 0:49:32.985089\n",
"195/200 1875 loss: 0.0355 val loss: 0.0381 0:49:48.470253\n",
"196/200 1875 loss: 0.0354 val loss: 0.0382 0:50:03.960498\n",
"197/200 1875 loss: 0.0355 val loss: 0.0383 0:50:19.343814\n",
"198/200 1875 loss: 0.0355 val loss: 0.0382 0:50:34.860656\n",
"199/200 1875 loss: 0.0355 val loss: 0.0381 0:50:50.302304\n",
"200/200 1875 loss: 0.0355 val loss: 0.0380 0:51:06.366335\n"
]
}
],
"source": [
"# Additional Training.\n",
"# 追加でtrainingする。\n",
"\n",
"# train_tf() compiles the part that computes the loss and gradients into a TensorFlow 2 graph, so it is a little over twice as fast as train(). However, it is still nearly twice as slow as fit().\n",
"# train_tf() は loss と gradients を求める部分を tf のgraphにコンパイルしているので、train()よりも2倍強高速になっている。しかし、それでもfit()よりは2倍近く遅い。\n",
"\n",
"loss2_2, vloss2_2 = AE2_work.train_tf(\n",
" x_train,\n",
" x_train,\n",
" batch_size=32,\n",
" epochs = MAX_EPOCHS, \n",
" shuffle=True,\n",
" run_folder= save_path2,\n",
" optimizer = optimizer2,\n",
" save_epoch_interval=50,\n",
" validation_data=(x_test, x_test)\n",
" )"
]
},
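{
"cell_type": "markdown",
"metadata": {},
"source": [
"The speed-up of `train_tf()` comes from `@tf.function`: the decorated function is traced and compiled into a TensorFlow graph on its first call, so subsequent calls avoid Python overhead. A hedged sketch (the name `compute_loss_and_grads` follows the description above; the actual `nw.AutoEncoder` implementation may differ):\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import tensorflow as tf\n",
"\n",
"mse = tf.keras.losses.MeanSquaredError()\n",
"\n",
"@tf.function  # compile the loss-and-gradients computation into a graph\n",
"def compute_loss_and_grads(model, x_batch):\n",
"    with tf.GradientTape() as tape:\n",
"        loss = mse(x_batch, model(x_batch, training=True))\n",
"    return loss, tape.gradient(loss, model.trainable_variables)\n"
]
},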
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/",
"height": 279
},
"executionInfo": {
"elapsed": 12,
"status": "ok",
"timestamp": 1637567363692,
"user": {
"displayName": "Yoshihisa Nitta",
"photoUrl": "https://lh3.googleusercontent.com/a-/AOh14GgJLeg9AmjfexROvC3P0wzJdd5AOGY_VOu-nxnh=s64",
"userId": "15888006800030996813"
},
"user_tz": -540
},
"id": "ez2T78hkFQRG",
"outputId": "1818b76a-2336-491f-da40-da1d8724bd05"
},
"outputs": [
{
"data": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAAAi4AAAEGCAYAAABCXR4ZAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4yLjIsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy+WH4yJAAAgAElEQVR4nOzdd5xU1f3/8ddnC70sICJFXVAsCAqKLVFjiUb92qOCUYMlsbeoiSQmhhiNsUTMLxqNUWOJBUs0JNbYY2yAooCKIFIWKUsXqbv7+f3xucPMLsvuLOzusPB+Ph7z2Lll7pwzMzv3Peece6+5OyIiIiJNQV6uCyAiIiKSLQUXERERaTIUXERERKTJUHARERGRJkPBRURERJqMglwXoD5sscUWXlxcnOtiiIiISD0YM2bMPHfvXN2yTSK4FBcXM3r06FwXQ0REROqBmU1b1zJ1FYmIiEiToeAiIiIiTYaCi4iIiDQZm8QYFxERkY3J6tWrKSkpYcWKFbkuykatRYsW9OjRg8LCwqwfo+AiIiJSz0pKSmjbti3FxcWYWa6Ls1Fyd+bPn09JSQk9e/bM+nHqKhIREalnK1asoFOnTgotNTAzOnXqVOdWKQUXERGRBqDQUrv1eY0UXOqirAzuuw/Ky3NdEhERkc2SgktdvPUWnH02vPNOrksiIiJSozZt2uS6CA0iq+BiZoeb2UQzm2xmQ6tZ3tzMRiTL3zOz4mR+sZktN7Oxye2uZH4rM3vWzD4zswlm9vuMbZ1hZqUZj/lR/VS1HqxcGX+XL89tOURERDZTtQYXM8sH7gCOAPoAp5hZnyqrnQ0sdPftgeHAjRnLvnD3/sntvIz5t7j7TsAA4NtmdkTGshEZj7lnPerVMMrK4u+qVbkth4iISJbcnZ/+9Kf07duXfv36MWLECABmzZrFAQccQP/+/enbty///e9/KS8v54wzzliz7vDhw3Nc+rVlczj0XsBkd58CYGaPAccCn2SscywwLLn/JHC71TDixt2XAa8l91eZ2QdAjzqXvrEpuIiISF1ddhmMHVu/2+zfH267LatV//GPfzB27Fg++ugj5s2bx5577skBBxzAI488wve+9z2uvvpqysvLWbZsGWPHjmXmzJmMHz8egEWLFtVvuetBNl1F3YEZGdMlybxq13H3MmAx0ClZ1tPMPjSzN8xs/6obN7Mi4GjglYzZ3zezj83sSTPburpCmdk5ZjbazEaXlpZmUY16oOAiIiJNzFtvvcUpp5xCfn4+Xbp04Tvf+Q6jRo1izz335G9/+xvDhg1j3LhxtG3bll69ejFlyhQuvvhiXnjhBdq1a5fr4q+loU9ANwvYxt3nm9kewDNmtou7LwEwswLgUeD/pVp0gH8Bj7r7SjM7F3gAOLjqht39buBugIEDB3oD1yOkjiZScBERkWxl2TLS2A444ADefPNNnn32Wc444wwuv/xyfvjDH/LRRx/x4osvctddd/H4449z33335bqolWTT4jITyGz16JHMq3adJIy0B+a7+0p3nw/g7mOAL4AdMh53NzDJ3de8q+4+392TUbDcA+yRfXUamFpcRESkidl///0ZMWIE5eXllJaW8uabb7LXXnsxbdo0unTpwo9//GN+9KMf8cEHHzBv3jwqKir4/ve/z3XXXccHH3yQ6+KvJZsWl1FAbzPrSQSUwcAPqqwzEhgCvAOcCLzq7m5mnYEF7l5uZr2A3kBqrMx1RMCpdNSQmXV191nJ5DHAp+tVs4ag4CIiIk3M8ccfzzvvvMNuu+2GmXHTTTex1VZb8cADD3DzzTdTWFhImzZtePDBB5k5cyZnnnkmFRUVANxwww05Lv3aag0u7l5mZhcBLwL5wH3uPsHMrgVGu/tI4F7gITObDCwgwg3AAcC1ZrYaqADOc/cFZtYDuBr4DPggGcd7e3IE0SVmdgxQlmzrjPqr7gZScBERkSZi6dKlQJyd9uabb+
bmm2+utHzIkCEMGTJkrcdtjK0smbIa4+LuzwHPVZl3Tcb9FcBJ1TzuKeCpauaXANUedeTuPwd+nk25Gl1qjMvq1bkth4iIyGZKZ86tC7W4iIiI5JSCS10ouIiIiOSUgktdKLiIiIjklIJLXeg8LiIiIjml4FIXanERERHJKQWXulBwERERySkFl7pQcBERkU1QmzZt1rls6tSp9O3btxFLUzMFl7rQGBcREZGcauiLLG5a1OIiIiJ1dNllMHZs/W6zf/+ar904dOhQtt56ay688EIAhg0bRkFBAa+99hoLFy5k9erVXHfddRx77LF1et4VK1Zw/vnnM3r0aAoKCrj11ls56KCDmDBhAmeeeSarVq2ioqKCp556im7dunHyySdTUlJCeXk5v/rVrxg0aNCGVBtQcKkbBRcREWkCBg0axGWXXbYmuDz++OO8+OKLXHLJJbRr14558+axzz77cMwxx5Bcdicrd9xxB2bGuHHj+OyzzzjssMP4/PPPueuuu7j00ks59dRTWbVqFeXl5Tz33HN069aNZ599FoDFixfXS90UXOpCwUVEROqoppaRhjJgwADmzp3LV199RWlpKR06dGCrrbbiJz/5CW+++SZ5eXnMnDmTOXPmsNVWW2W93bfeeouLL74YgJ122oltt92Wzz//nH333Zfrr7+ekpISTjjhBHr37k2/fv244ooruOqqqzjqqKPYf//966VuGuNSF7pWkYiINBEnnXQSTz75JCNGjGDQoEE8/PDDlJaWMmbMGMaOHUuXLl1YsWJFvTzXD37wA0aOHEnLli058sgjefXVV9lhhx344IMP6NevH7/85S+59tpr6+W51OJSF2pxERGRJmLQoEH8+Mc/Zt68ebzxxhs8/vjjbLnllhQWFvLaa68xbdq0Om9z//335+GHH+bggw/m888/Z/r06ey4445MmTKFXr16cckllzB9+nQ+/vhjdtppJzp27Mhpp51GUVER99xzT73US8GlLhRcRESkidhll134+uuv6d69O127duXUU0/l6KOPpl+/fgwcOJCddtqpztu84IILOP/88+nXrx8FBQXcf//9NG/enMcff5yHHnqIwsJCttpqK37xi18watQofvrTn5KXl0dhYSF33nlnvdTL3L1eNpRLAwcO9NGjRzf8E511Fvztb7D77jBmTMM/n4iINEmffvopO++8c66L0SRU91qZ2Rh3H1jd+hrjUhdqcREREckpdRXVhU5AJyIim6hx48Zx+umnV5rXvHlz3nvvvRyVqHoKLnWhFhcREcmSu9fpHCm51q9fP8bW95nyarE+w1XUVVQXCi4iIpKFFi1aMH/+/PXaMW8u3J358+fTokWLOj0uqxYXMzsc+COQD9zj7r+vsrw58CCwBzAfGOTuU82sGPgUmJis+q67n5c8Zg/gfqAl8Bxwqbu7mXUERgDFwFTgZHdfWKdaNRQFFxERyUKPHj0oKSmhtLQ010XZqLVo0YIePXrU6TG1BhczywfuAA4FSoBRZjbS3T/JWO1sYKG7b29mg4EbgdQFCb5w9/7VbPpO4MfAe0RwORx4HhgKvOLuvzezocn0VXWqVUPRGBcREclCYWEhPXv2zHUxNknZdBXtBUx29ynuvgp4DKh6VaZjgQeS+08Ch1gNHXtm1hVo5+7verSjPQgcV822HsiYn3tqcREREcmpbIJLd2BGxnRJMq/addy9DFgMdEqW9TSzD83sDTPbP2P9knVss4u7z0ruzwa6ZFORRqHgIiIiklMNfVTRLGAbd5+fjGl5xsx2yfbByZiXakc2mdk5wDkA22yzTb0Utlap4FJREd1G+fmN87wiIiICZNfiMhPYOmO6RzKv2nXMrABoD8x395XuPh/A3ccAXwA7JOtnjsbJ3OacpCsp1aU0t7pCufvd7j7Q3Qd27tw5i2rUg9QYF9CFFkVERHIgm+AyCuhtZj3NrBkwGBhZZZ2RwJDk/onAq0lrSedkcC9m1gvoDUxJuoKWmN
k+yViYHwL/rGZbQzLm516qxQXUXSQiIpIDtXYVuXuZmV0EvEgcDn2fu08ws2uB0e4+ErgXeMjMJgMLiHADcABwrZmtBiqA89x9QbLsAtKHQz+f3AB+DzxuZmcD04CTN7ya9UTBRUREJKd0kcW6GDAAUmcVnDkTunVr+OcUERHZzOgii/Ulc4yLWlxEREQanYJLXairSEREJKcUXOqirAwKkmFBCi4iIiKNTsGlLsrKoFWruK/gIiIi0ugUXOqivFzBRUREJIcUXOqirAxatoz7Ci4iIiKNTsGlLtRVJCIiklMKLnWhFhcREZGcUnCpi8wxLrpWkYiISKNTcKkLdRWJiIjklIJLXSi4iIiI5JSCS11ojIuIiEhOKbhky13ncREREckxBZdsVVTEXwUXERGRnFFwyVbqAosKLiIiIjmj4FKDyy+HP/0pmUgFF41xERERyRkFlxr85z/w+uvJRHl5/FVwERERyRkFlxq0bw+LFiUTqRaXZs0gP1/BRUREJAcUXGpQVFRNcCkoiPCi4CIiItLoFFxqUFQEixcnE6ngkp+v4CIiIpIjWQUXMzvczCaa2WQzG1rN8uZmNiJZ/p6ZFVdZvo2ZLTWzK5PpHc1sbMZtiZldliwbZmYzM5YdueHVXD+VuopSY1xSLS66VpGIiEijK6htBTPLB+4ADgVKgFFmNtLdP8lY7Wxgobtvb2aDgRuBQRnLbwWeT024+0Sgf8b2ZwJPZ6w/3N1vWb8q1Z9UV5E7mLqKREREci6bFpe9gMnuPsXdVwGPAcdWWedY4IHk/pPAIWZmAGZ2HPAlMGEd2z8E+MLdp9W18A2tfftoaFm2DI1xERER2QhkE1y6AzMypkuSedWu4+5lwGKgk5m1Aa4CflPD9gcDj1aZd5GZfWxm95lZh+oeZGbnmNloMxtdWlqaRTXqrqgo/i5ahMa4iIiIbAQaenDuMKLbZ2l1C82sGXAM8ETG7DuB7YiupFnAH6p7rLvf7e4D3X1g586d67XQKZWCS9UxLgouIiIija7WMS7E+JOtM6Z7JPOqW6fEzAqA9sB8YG/gRDO7CSgCKsxshbvfnjzuCOADd5+T2lDmfTP7K/DvulWp/rRvH38XLwZaqatIREQk17IJLqOA3mbWkwgog4EfVFlnJDAEeAc4EXjV3R3YP7WCmQ0DlmaEFoBTqNJNZGZd3X1WMnk8MD7r2tSzSi0uzRRcREREcq3W4OLuZWZ2EfAikA/c5+4TzOxaYLS7jwTuBR4ys8nAAiLc1MjMWhNHKp1bZdFNZtYfcGBqNcsbTaXg0lFjXERERHItmxYX3P054Lkq867JuL8COKmWbQyrMv0N0Kma9U7PpkyNoVJXUdWjipZWO2xHREREGpDOnFuDdQ7OLSxUi4uIiEgOKLjUoEWLaFypdDi0xriIiIjkjIJLLdZcr0jBRUREJOcUXGqx5grROgGdiIhIzim41GLNhRZ1kUUREZGcU3CphbqKRERENh4KLrVYq6tIwUVERCRnFFxq0b59lRYXjXERERHJGQWXWqxpcdFFFkVERHJOwaUWRUWwfDmsWlERM1LBpawMKipyWzgREZHNjIJLLdac9v/r5KVKBRfQkUUiIiKNTMGlFmtO+/91ftzJz49T/oO6i0RERBqZgkstUi0ui5Ym16PMbHFRcBEREWlUCi61SLW4LF6atLgouIiIiOSMgkst1nQVfZN0Dym4iIiI5IyCSy3WdBV9k3QVpc7jAgouIiIijUzBpRZruoqWJWFFRxWJiIjkjIJLLdq0gbw8WJQKLmpxERERyRkFl1rk5UG7dklwycuLm4KLiIhITmQVXMzscDObaGaTzWxoNcubm9mIZPl7ZlZcZfk2ZrbUzK7MmDfVzMaZ2VgzG50xv6OZ/cfMJiV/O6x/9epHUREsXt4sWltAwUVERCRHag0uZpYP3AEcAfQBTjGzPlVWOxtY6O7bA8OBG6ssvxV4vprNH+Tu/d19YMa8ocAr7t4beCWZzq
miIli0okWMbwEFFxERkRzJpsVlL2Cyu09x91XAY8CxVdY5Fngguf8kcIiZGYCZHQd8CUzIskyZ23oAOC7LxzWY9u1h0YrmCi4iIiI5lk1w6Q7MyJguSeZVu467lwGLgU5m1ga4CvhNNdt14CUzG2Nm52TM7+Lus5L7s4Eu1RXKzM4xs9FmNrq0tDSLaqy/oiJYrBYXERGRnGvowbnDgOHuvrSaZfu5++5EF9SFZnZA1RXc3YmAsxZ3v9vdB7r7wM6dO9dnmdfSvj0sWpkRXHStIhERkZwoyGKdmcDWGdM9knnVrVNiZgVAe2A+sDdwopndBBQBFWa2wt1vd/eZAO4+18yeJrqk3gTmmFlXd59lZl2BuRtQv3rRoQMsWNka2mhwroiISC5l0+IyCuhtZj3NrBkwGBhZZZ2RwJDk/onAqx72d/didy8GbgN+5+63m1lrM2sLYGatgcOA8dVsawjwz/WsW73p0QOWrm7BoryOMUPBRUREJCdqbXFx9zIzuwh4EcgH7nP3CWZ2LTDa3UcC9wIPmdlkYAERbmrSBXg6Gb9bADzi7i8ky34PPG5mZwPTgJPXo171qrg4/k71bekPCi4iIiI5kk1XEe7+HPBclXnXZNxfAZxUyzaGZdyfAuy2jvXmA4dkU67GouAiIiKycdCZc7OwJriUJ0N9dK0iERGRnFBwyUKnTtA6f/nawUUtLiIiIo1KwSULZtCz1VymlvWIGQouIiIiOaHgkqXilnOYurpbTOTnx8UWFVxEREQalYJLlopbzObLVd3x1OnwWrSApdWdV09EREQaioJLloqbz2JJeRsWLUpmbL01TJ+e0zKJiIhsbhRcslTc7CsApk5NZvTsCV9+mbPyiIiIbI4UXLJUXFgCKLiIiIjkkoJLlooL4vJMlYLLokWk+45ERESkoSm4ZKkjC2ibv6xycAG1uoiIiDQiBZcsWXkZxa3mKriIiIjkkIJLtsrKKG5dms4pCi4iIiKNTsElW2VlFLcpZepU4lwuHTpAu3YZg15ERESkoSm4ZKu8nOI28/n6a1i4kOQ6ADqySEREpDEpuGSrrIzidgsA+OCDZJ6Ci4iISKNScMlWWRkHdp/E1lvDaafBlClEcFnTdyQiIiINTcElW2VldGy1ghdfhNWr4bDDYO4WfWDZMpg7N9elExER2SwouGSrvBwKCth5Z/j3v6GkBK59+7uxTN1FIiIijULBJVtlZVBQAMC++8Lhh8PIMd1wUHARERFpJAou2coILgBHHw0zZjfjY3ZVcBEREWkkWQUXMzvczCaa2WQzG1rN8uZmNiJZ/p6ZFVdZvo2ZLTWzK5Pprc3sNTP7xMwmmNmlGesOM7OZZjY2uR25YVWsJ1WCy5FJqf7derCCi4iISCOpNbiYWT5wB3AE0Ac4xcz6VFntbGChu28PDAdurLL8VuD5jOky4Ap37wPsA1xYZZvD3b1/cnuuTjVqKOXlkJ+/ZrJrV9hzT/gXR+skdCIiIo0kmxaXvYDJ7j7F3VcBjwHHVlnnWOCB5P6TwCFmZgBmdhzwJTAhtbK7z3L3D5L7XwOfAt03pCINrkqLC8BRR8H73/RhzpiSWC4iIiINKpvg0h2YkTFdwtohY8067l4GLAY6mVkb4CrgN+vaeNKtNAB4L2P2RWb2sZndZ2Yd1vG4c8xstJmNLi0tzaIaG8C92uBy9NHg5PHcwn3gjTcatgwiIiLS4INzhxHdPkurW5gEm6eAy9x9STL7TmA7oD8wC/hDdY9197vdfaC7D+zcuXO9F7ySior4WyW49O8P3bs5/8w/AZ56qmHLICIiIlkFl5nA1hnTPZJ51a5jZgVAe2A+sDdwk5lNBS4DfmFmFyXrFRKh5WF3/0dqQ+4+x93L3b0C+CvRVZVb5eXxN2OMC8Tlik473fhn+dG8+tic9HoiIiLSILIJLqOA3mbW08yaAYOBkVXWGQkMSe6fCLzqYX93L3b3YuA24Hfufnsy/uVe4FN3vzVzQ2bWNW
PyeGB8nWtV31LjV6q0uABccw3s0HUJZywczuIX323kgomIiGxeag0uyZiVi4AXiUG0j7v7BDO71syOSVa7lxjTMhm4HFjrkOkqvg2cDhxczWHPN5nZODP7GDgI+Endq1XPaggurVrBgw8XMJPuXHrF2stFRESk/mS1p00OSX6uyrxrMu6vAE6qZRvDMu6/Bdg61js9mzI1qhqCC8DeB7Vi6A5P8rvPTuSKjyrot5vO6yciItIQtIfNxjrGuGS69PJ88injseu/aKRCiYiIbH4UXLJRS4sLwJZnHcXBLd7hsZEt8QpvpIKJiIhsXhRcspFFcKGwkMEnlTFlZQ/G/GV045RLRERkM6Pgko1sggtw/I37UsgqHrtB1y4SERFpCAou2chijAtAh64t+F6fGYyYsS8V//1fIxRMRERk86Lgko0sW1wABl/RnRK25u0z/wqrVzdwwURERDYvCi7ZqENwOeakFnRos4pff3E6fuvwBi6YiIjI5kXBJRt1CC5t28J1NzbjVQ7hiV99DF/o8GgREZH6ouCSjSzHuKScey7032UVV5T9nqVnXxpXlxYREZENpuCSjTq0uEDkmzvubkaJ9+C6N/aDBx9swMKJiIhsPhRcslHH4ALwrW/BmWc4t3AlYy+5D0pLG6hwIiIimw8Fl2ysR3ABuOUPxhadnLOW3EbZJZc3QMFEREQ2Lwou2ajjGJeUjh3hjr8U8iED+MNj3eCWWxqgcCIiIpsPBZdsrGeLC8D3vw8nHO8My/8tM386HB56qJ4LJyIisvlQcKlNRcUGBReILqMyK+R33f8MZ50F//wnAJMnw003wV131VdhRURENm0KLjUZMAB+/OMNDi49e8JZZxl/nXsM0/seyfwTz+XgXUvp3RuuugouvBCmTavHcouIiGyiFFxq0qZNNIukxrisZ3AB+OUvwcy4svgJDix8i7fHteXG08fz1lux/C9/qYfyioiIbOIUXGqy3XZx5ttUi0sdB+dm2nprOOcceOKZZnyZtx3P7nQlP3t4N7497i6OPhruuQdWrqyncouIiGyiFFxq0qsXzJwJX38d0xvQ4gJw9dVwwgnw0kvGIaNvhCOOgPPP54JW91NaCk8+WQ9lFhER2YRlFVzM7HAzm2hmk81saDXLm5vZiGT5e2ZWXGX5Nma21MyurG2bZtYz2cbkZJvN1r96G2i77eLv5MnxdwODy1ZbwVNPxcnpaN0annkGzjqL7z56FtsXlfLnP+vSACIiIjWpNbiYWT5wB3AE0Ac4xcz6VFntbGChu28PDAdurLL8VuD5LLd5IzA82dbCZNu5kQoun38efzcwuKyloADuuYe8Cy/g/EU38Pbbxo47OhdeCEOHwnnnwe23p3uq6tOqVQ2zXRERkYaUTYvLXsBkd5/i7quAx4Bjq6xzLPBAcv9J4BAzMwAzOw74EphQ2zaTxxycbINkm8fVvVr1JBVcJk6MvxswxmWdzOBPf+KSS/P4ExexXel73P+3CoYPhyeegIsvhr33hg8/rN+nPfxwGDKkfrcpIiLS0LIJLt2BGRnTJcm8atdx9zJgMdDJzNoAVwG/yXKbnYBFyTbW9VwAmNk5ZjbazEaXNtR1gLbYAtq2hUmTYrq+W1xSzCgYfjMX/W0gz5UdxtK89qz8873MK3WeeAK++gr22Qdef71+nm7qVHjtNXjxRV24WkREmpaGHpw7jOj2WVrfG3b3u919oLsP7Ny5c31vPpjFAN3U4T4NFVxSz3XGGTBuHLbXnvCjH2HHHM2J357F+PHR+HP88fDZZxv+VKlBwPPnpzOZiIhIU5BNcJkJbJ0x3SOZV+06ZlYAtAfmA3sDN5nZVOAy4BdmdlEN25wPFCXbWNdzNa5UdxE0bHBJ2XZbePll+OMf4ZVXoG9fOr36BM89B82awZFHbnh4eeIJSGW9d97Z8CKLiIg0lmyCyyigd3K0TzNgMDCyyjojgdSIiROBVz3s7+7F7l4M3Ab8zt1vX9c23d2B15
JtkGzznxtQvw2XGVwaYoxLdfLy4JJLYmDLdtvBySdTPHQw//rLTObNgz594NRT4YorYJddoHfvNVcRqNW0afD++/CTn0D79gouIiLStNQaXJLxJhcBLwKfAo+7+wQzu9bMjklWu5cY0zIZuBxY65DpbLaZLL4KuDzZVqdk27nT2C0umXbaCd5+G669Fp55hr1OKmbysVfw03MW88wzccRR9+7QsiUcd1ycI+aJJ2D6dFi0CEpK1j6pXaqbaNCgGPT79tuNWyUREZENYb4JjM4cOHCgjx49umE2/vLLcOihcX/VKigsbJjnqc3MmXDDDfDXv4I7S394AXlDf0ar7buxejXccgtcdx0sW1b5YdtvD+++C506xfQ++8Dq1TBmDAwbFplo0SJo1y4OnurVK3dVFBERATCzMe4+sLplOnNubXLRVVSd7t2jiWXyZDj7bNo8+Gda9d8BbryRQl/Fz38OCxdGN9Dtt8Ott8If/hCtL4MHxzlbbr4Z3nsvWlsA9t03jip6/3149NFo4NlmmzjD77x5uauqiIjIuqjFpTZlZdEXU14OFRUN8xzr48svY6DKP/8ZzSSXXhpHJbVrV2m1++6Ds8+Gvn1h/Hg4+WR44AFo0SJaWjp0gPPPh8ceg+LiyEfPPQff+178TVm0KMbExNl5REREGo5aXDZEQUEc6dPY41tq07NnXDLgueegS5cILt27w5VXxolfEmedFcFk/Hj4xS+iZaVFi1hWVBQDfe+8E5Yvj2X/+ld0OT3/fHQnATz0UAScbt1iLM3IkTr/i4iI5IaCSza2227jCy4pRxwRI2zffx+OPhqGD49Q8+Mfr7lUwe23x93rr48DljLtu2/8veEG2HHHuH/hhRFqrr8eZs+OA5wGDIihPh9+CMceCwceCGPHNl41RUREQMElOzvuCK1a5boUNdtzT3jkkTij3I9+BH//ewxaOfpo8h57hN5bfV3twy6+GK65JsJJSrt2Mf3003GkUqo15sEHY4jNnXfGuWQOPjgOr65ORUU0Bj3ySAPUdR1yde2l//1v4+pFFBHZlCm4ZOPqq+Hf/851KbLTqxfccUckil/8IppITj01Lk193nnwySeVVt9tN/jNb9ZuibnkEmjTJs7zMgtd8aMAACAASURBVGxYujWmsDA28/bbsbM+6aTKh1yvWgX33w+77gr/93/x1B980KA1BmIwcffucOKJsLTKeZqXLo1y1NclEzK9/DLstx/84x/1v20REVmbgks2unSJ44ibki23jMEq06fDf/8bhxbdf3+cse5b34I//Qnmzl3nwzt1iu6j446LYTNVbbcd/O1vMGoUnHkm3H13BJyePWM6Lw/uuQc6doRf/jIe4x7nkclspXnhhWgY+t//6lY998qB6Y9/jOo8/TR8+9uVnyPV8nPkkfDGG3V7npSKihjAvMsu0aCV8vDD8beu5RcRkfXk7k3+tscee7hkYe5c99//3n3XXd3BvbDQfdAg9+efd1+4cL02+bOfxaZSt0MOcX/hBfeKilh+000x/8033X/967i/++7uZWXuq1e777BDzGvZMopRnYUL3R96yH3p0piuqHD//vfde/RwLylxX7TIvX179xNOiOdu39594MD0408/3b1jR/c+fdxbt3Z/662a6zR5svuHH6anly5133PPKGfbtu55ee5ffum+fLl7u3Yxf9991+vlExGRagCjfR37/JyHjvq4Kbish/Hj3S+7zL1Dh3Tq6N3bfdgw91mz6rSp6dMjQFSXfb75xn2rrdy33DKeYq+94u8dd7j/5S9x/69/de/fP3LUqafG/Jdfdn/ttchZqSIecYT7qlXud90V03l5EVB+9auYHjMmnvP//b+YHjs2AlKnTu6nnRbV2n579+Ji92XLqq/LmDHuRUURUObOjXk335wu57Rp7vn57ldc4f6Pf8T8XXd1b97cfeXKOr1sm5wxY9yvusr97LPdzz23zh+jrDz9tPu227rPmVP/2xaRjYeCi6
zbsmXuL77ofv317t/9rq9pifnBD9zffjvddLIB7rgjNnvqqREkDjkkwsFWW7l/+9vxFAsXup9xRszLbMEB9yOPTLfWHHtstM5897uxEzNLh5qU0tKowk9+4v6//8Xyxx6LZa+9FtO/+c3a5fzwwwhJPXpEOLngAvevv3bv3Nn9sMPS6w0eHK06RxwRgezhh2Obo0Zt8EvVYOrhbazVwIHxunXrFkGuRw/3Dz6o3+cYNChe65/+tH63KyIbFwUXyd7Eie6XXpruA9l220gXgwZFU8iMGXXeZHl5dBWtXh3Tn34awQLW7rapqHD//HP3N95wf/VV948/Ti/7zW/iMR07us+cGfN+9zv3goLIWJlOOCFCxZVXxs40szXo5JPdW7Rwnzq1crW32MJ9663dp0xxv/DCeNxZZ8VzvvNOet133kmHqgsvjO2A++231/5aXHJJhKDXX6993bpaubL6Vp9VqyJUnHdedgHmm2+iu6wuJk2K1+Dmm2P6gw/itWzZ0n3vveN2551122ZVFRURIsG9VavGb3VpjPDn7r5ggXvPnvE5+cc/0v83m4PPPovvi4Y0ZEi0nsrGTcFF6u7rr2NPM3iw+0EHxc/n1N76W99yv/tu98WL13vzd9/tfvXVdXtMRYX7n/60dthZtGjtdUeOjKI2b+5+4IGVl02bFjvUww+P7qDZs2NH0blzhCb3mN+2ra/VmpOy997p4FVR4d6lS4ylyfT22+7HHRdfxu7uI0akd7oQrUZffVW316A6FRXujz8eQe3QQ9fewd5/f/qt+8tfYvktt7h37+4+YUJ6vYkT3c85JzJrXl5012Xruuti+9OmpefNnu3+wx/G69ynT4TVL75Y/3p+/HE8x9ChUb7aWl3Ky92HD9/wVp+33nLfZ58Yj5Xt+zVt2voHnVQLZZcuvqbFsSFD0/vvx//V+gaksWPdf/lL9/nzN6wcY8dGC+of/rBh26nJxInxmu62W8M9x7qsq3u6OuPHuz/3XMOVpSlQcJENV1ERe7kbbnDfeWdfM1L1yivXqxWmoa1alR5Xc8stay+/7TZfMyi4uDjCxHvvVV7n97+PL9Kq892jReicc9K/Do85JnZsmcvbtEnvgF5+OVqK9trLfcmS2KG2bh3hoWoXU0VF9juqlSvdTzwxnidV31deSS8vK4ty7bZb/IJv1ixanFJB5vjjY72lS9232SZehyFD3HfaKcqd7TiVvn2jYW5dZs6M1/qUU2reziefRMvQjjuu/boPHx5lnj49uh1btXIfN6767ZSXu595ZqzfvXu0YtTm1lsjoKTGNpWVRfclRPdX69Yxnqm2cezPPx+PGTas9ueszsCB8X6tXp0OhKmuztosWxav9aRJlVsuXn89Xo9vvqm8/mOPRbgH9/32q/u/8rx50bKW+vw9+uj6h6yLL47tdO3qvmLF+m2jNqnXE9KtttUpK1u79fLttyu30tbFW2/F5zWbVtl3303/eBg/fv2er74sXRpd8g0xXq02Ci5Svyoqor9k8OD47yosjL3duHGN156ehcsvj094qsWjqk8+iS/zTp3c//3vtZeXl2ffQnD99fFcCxbEkKFWrWLn/8or6XE7LVpULstHH0VPXIsWcZTUzTfHuJxevSJEvPnm2s/z0Ucxbiflssti29dfH41k3bunxw25x44J3J94InbIqYazoUPT44ZGjYpBteD+3//G48aNi6Bx8MHxJV6T8ePjsX/6U83rXX11rDd6dJSvpCT9K3/ChAh/qVayLbeM9+WTT9KPP+qoGD/uHq9jquVqjz0iP992m/sjj0RdTz89lp15ZnQlDhoUz/npp9FNULVOd96Z3qEdemgs/+lPY/qqq+IL/D//iY/6fvvFEWWZ9U+FrCVL4r2DCImfflr5eSZOdP+//4udU3U++ige+8c/xnRZWRyF161bbLsmr78e3Zupehx+eASVTz+NMVkQLSMpqUHn++0XLaBt2sRrPmlSzc+TUl4edWnWzP3BBy
NwQczLbHnLxvLlMb6sd29fMxB+fZWWRmvm4MFrv8+77RavJbjfe2/1j1+yJD5Tu+6aDnqPP55+Xbfdtm7l+/LLdBdnt27pUDZ9ehwtmfmVmQot220X4wC/973snyelosL9b39zv+iiyp/TuvjmmzgiNFXuwsL40dGYQUrBRRrOlCnxUym1F2nRIv6zjzwyPvnvv5+zTvpFi6oPJA3h5Zej+oMHx85j112jq8Q9up8GDKj+i3LOnNi5Fhend3ZHHBFf4AUF0cowaVJ8yZ1/frQAmcUvx2eeicdcfHF6e3/+c8x76aXoyevbNxrIUr++P/88fdj54sWxoxowIJ7rzDMrl+2++3xNV0VpacxbtWrtX6K/+lXk19p+laWer7g4flWnPi79+8fj27WLcUxz58YYmy5dImhNmxYfobZtozUm5auvopVkzz3TrQaZt2uuifVSofKQQ+J5IB6X8sgj8Zr+3/+lA8whh8Tf88+vXIdUEEx1c5aWxpe7WbwOqffomWdiR3zAAekd06pV6cPqW7Z0/9e/1n6NLrssdhKp19s9dmZmMT4qta0FC2J8V2a36Wmnxc7urrvcr7026vqtb8VOcMst431s1iw+A/feG+U4+eT0jvSzz6JVKbNV7NZb4zdJ6lQEZWURUn7721gP4ii+1LJbb42vgjZtKrcSrV4dgX7ECPe//33tYPPoo+nP7R57xOe/rCxes5paX6qOh/nii2hhLCjwtQZxp7qJbr01Av6JJ669vdWr4/8vFQDPPTc+Z506RTD705+iVa6gIL7aarNkSfwPtm+fPtrx3nvjf2jAgJhOdY19/HGst9128f9+662xvLYuo4kT4/vmtNPcn33W/aSTKofX5cvjx8lBB8X3Ruoz9MILcbBDZnexe/yw2HHHePz3vhfd7pddFv+fzZtHqG6M36cKLtLw5s2LdtArr4x2/J12Sv/3tGsX35rXXBPf6Ot5zpiN2eLF6SOcjj++9l/H1Zk9O1pN3OMlOuqoyjvivLwYN/2DH6Snd9+98hf7ihXxi79jx/QA6Jq6GVLn2enQId1FklJREW9ps2bxS/G44+KtLCpKj3+ZNi2e75BDsqvjvffG+oMHRwvJ5ZfHYy+/vPLO2j2O8mrfPoJO6sitxx+vfrsVFfH4Tz+NHUBmq0FZmfv++0dYuPLK2DG1bBnrPPts7IQOOCA9BiE1IPvAA2PHWdUPfxiPGTs2WnIKC9PddZlB8p57fM0RbCtWRNcRxBiWgQPj/bvqqnSXxcqVMUC8uh3qOefEY7ffPsJbUVFMp0Lp0qUROn70o/Rjnngiyta8eTSQzpoV71+/frFjPuywtev385/H53jcuAhMqaC3997xfhxwQLqeLVrE81XdiX35ZbT6FRREEFm5MgbLZ36W8/Pjc5w6hcGhh8bnorw8yg1Rvg4dIgRVPcfTq6/GZz8vL0LZDjtEyG3ePD77//1vDJyHGOPlnu4mmjEjDtlv3z6Cyscfx2di0KD4CxH+Uueo2nnnqGuq9Wzhwuge6907/l+//DJary6/PD47L72U/twdfXTU9aWX4nXq3z9CwS9+EdtOhfa//S3+x7p1S3dHrVwZz7HDDvFZ+sc/1v4fmT8/1ikqSreq5edHj/7dd6fLn5eX/m151lkRwFLvbevWEd5ffz1afAsKItil6pEyZ06Ee4jzVl13XbQKN1S3noKL5MasWbHXPO+8GJ2Z+k8pLIy98p//HHvA6dM3qi6m9XXFFTEupr6Oiigvj+6J+++PlpfUEVYVFfFFucsu6cHEmR5/PL4Qr7xy7aOtqvrmm9hBPPHEutf58MNoYt9229iB9u0bX25XXBE7lrZtY0fSEEaPjh1RaodX9Ys7W8uXpwdxl5TEl3y/frFD2mOPyuPMly2LcLGuwabz5kUrS6rL4be/jfkPPhgtGKnwWV4eO65UF0F+fmR691jnlFMiJBQWxq/v1GDc6n5hr1wZn4PvfCfWOeqodBD617/SLRavvVb5ca
NGVf4MpMZ27bZb9WPr58+PcHPUUbHD69EjujNatEjv5O6/v/pAl2nRonh927aN7kZwv/HG6GoYOzZ28qkxYN/5TrwOv/51PLasLB7brl10+fXvH6/drbfGr/3UWRu22SaC37nnxut+5pmx4504MbazalU8t1kE5R13jBYod/cnn4xtjBwZXbMdOkRLR4sW0XKWes1T3V+33Va5fq+/Htvdfvv0D5bWreNzlZcXg+BTwSdzXMsjj6Q/y2ecET9w+vSJ6fbtKx9F6R6BLbM1sUWLCF3/+leEhoMOih8Wb70Vn/FnnonuxpR77onlF1wQgSt1zqtUl95nn8VrkppXUBCfy3V99isq4mu7X7/0YzK7ruuTgotsHJYujf+2yy+vfJRS6pv9pJNiD/3++7V/M0rOLFoUv5Ahvtjreuh0XY0bFzv1ffapv22musF22mn9wlDqCLHdd6/5o1pREb9cDzwwdlBVGxsnT45WtMMPj1/CN91Ue/BNddWtWhW//L/znQga3bvX/tjVq2OnWlO3XioQQXQnuMeO8dRT06EgGzNmRJlSLRhVLVwY9e3ePXauX36ZXrZ8ebqeS5ZEl0WqTL16RYtCNkfpLFkSrUip1obhw9PPnZ8fYSPzdApVfz+VlETLRXWv67Bh8bX1y1/Gby/3CKRHHpkua9WuxtWr02PYUkF60qQIY9WNaUttc9q0aDU799xoLcz86nzwwZpfg6o99X//e3Sfpsb/rFwZY3aeeaZuB4rOmxcDdxvqxJs1BReL5U3bwIEDffTo0bkuhtSFO5SUxOWmJ0yIqzn+73/piwy1bw+HHQYHHhjXXerUCXbYAbp1A7OcFl1g9Wp45ZW4QnizZg3/fIsWxdW/t9iifrbnDiNGxMdrq63W7/GPPgr77w9bb10/ZVoft94KV1wR1wa7/HK4+eYN3+bixdC3LxxzTFyvdUNMnx63/fZb9zqrVkFpaVwkdV1Wr46LpPbuDcXFdS/H7NlxHbMhQ6BVq5h3wAFxGbfbboNLL637NtelrAx+9jOYMycuD1dYWHn5rFkxb30/y4sWwcSJ8PXXcS243Xff4CJvlMxsjLsPrHZZNsHFzA4H/gjkA/e4+++rLG8OPAjsAcwHBrn7VDPbC7g7tRowzN2fNrMdgREZm+gFXOPut5nZMODHQGmy7Bfu/lxN5VNw2YTMnBmXnn7ppbg64ldfVV7evj0cckh8Ax1xxNrfCiKbkSVLIjgtWRIXgu/fv362u3IlNG9eP9vaWP3nP/F76Ve/0m+hjdEGBRczywc+Bw4FSoBRwCnu/knGOhcAu7r7eWY2GDje3QeZWStglbuXmVlX4COgm7uXVdn+TGBvd5+WBJel7n5LthVUcNlEuUdwWbAgLv382Wfw0UfwzDPxEy0vDzp0gM6dYddd46fHoYfCgAH6JpLNxi23xFXPR47Ux142HRsaXPYlWkq+l0z/HMDdb8hY58VknXfMrACYDXT2jI2bWU/gXaB7leByGPBrd/92Mj0MBRepyerV8MIL8N57sHBhtL2OHQtffhnLt90Wvv3tCDRbbQW77AL9+kVbtFpoREQ2ejUFl4IsHt8dmJExXQLsva51ktaVxUAnYJ6Z7Q3cB2wLnJ4ZWhKDgUerzLvIzH4IjAaucPeFWZRTNheFhXD00XHLVFoK//53dGa/+y7Mmxdt6Jk6dYogc+CBEW5694629oJs/hVERCTXsmlxORE43N1/lEyfTnTrXJSxzvhknZJk+otknXkZ6+wMPAAc4O4rknnNgK+AXdx9TjKvCzAPcOC3QFd3P6uacp0DnAOwzTbb7DEtNahTJNOSJTB+fAwA/uqraJ0ZPToGBFRUxDrNmkVX0557xmi3igrYZhsYPBiKinJbfhGRzdCGtrjMBDLHzfdI5lW3TknSVdSeGKS7hrt/amZLgb5ESwrAEcAHqdCSrLfmvpn9Ffh3dYVy97tJBv4OHDiw6R8aJQ2jXTv41rfilmnRouhe+uKLGDszZg
w8/DB8800MFCgri8M0jjwSWrSI6e23h4EDo5WmU6e4qetJRKRRZRNcRgG9kzEqM4munR9UWWckMAR4BzgReNXdPXnMjKT7aFtgJ2BqxuNOoUo3kZl1dfdZyeTxwPi6VUkkC0VF0V104IHVLx8zBv7ylzj0ID8/wsyTT0J5eXqdggLYccdorUnd2rSJ8NOsWSzr3l0jJkVE6lGtwSUJHRcBLxKHQ9/n7hPM7FriBDEjgXuBh8xsMrCACDcA+wFDzWw1UAFckOo+MrPWxJFK51Z5ypvMrD/RVTS1muUiDW+PPeDuuyvPW748jmqaNi2OdJoxA8aNi/PPPFp1mFaiVas490zXrnEU1IoV0KsXXHhhtAIp1IiI1IlOQCdSHxYtirE0K1dC69awbFmcJWrSpBhbM3t2rNesGYwaFetvv32MqWnWLH3r3Ru++90Yb9O6dXRTNcYZ3kRENiIbfAK6jZ2CizQp33wDf/87PP98tMCsWhW3FSvgk0+iZSdTu3bQpUu03uTnRwvOkUfGmYWLi2OeiMgmRMFFpKlYuTJO5zlhQtxftiwO8549O4JNeTl8+ilMmRLrFxbG4dwdOkQLTerWpUuM3zn44GjVERFpQhRcRDYl7tEN9eabcdK9adPiAjPffJO+zZgBS5fG+kVFcSK+Ll3i75Zbxq1z5/jbsWNs46OPYsDxQQfFxWXatMltPUVks6XgIrK5Wb0a3n8/ws3MmXHFt9mz4++cOWufmA9iPE15eTy2oAD22isCzOLFMVanqCim+/SJ1qCKCthtt+iu0iBjEalHCi4iUtnKldEFVVoK8+fHYdu9e8dYm//9D157DV59NQYSFxXFsrlz05dVyNS1a5ywr1Wr9K19+/SlFrbcMlpvOnWCtm0VckSkVgouIrJ+ysoqXw5h5swILy1bRuvM6NExJqe0NMbjpG7z5sW8qlq2THdZbbFFdGctWBADjNu3h513hiuvjEPGly6Na1J17RqtPzrZn8hmQ8FFRBrfnDkxyHjBAvj662jZSXVZzZ4d4aZt2xhYXFERh4iPGhWB6OCDo+Xnm29iW23bRqtPKkj17h3hZu5cmDw5xul85zuw776www66VINIE7ehp/wXEam7Ll3iVhdffQU33AAjR8a1ok47LQLPSy9BSUm0uqxcGQHniSdigHGvXnH5hqefTm+nqCjdQtOyZbTmtGwZJwFs0SJadvr2jfurV0frz267xbby8urvNRCReqcWFxFpmioqKoeMGTPiUg2TJsVRUhUVcQTW8uUxwHj58pj++utoCapugHJBQQScFi3W/lt1XsuWsNNO0crTuXN0o82bF8+Rnw977x2HqotInanFRUQ2PVVbRrbeOvug4B6tO2Vl0TLz1VdxOPgXX8T5cpYvr/w3dX/BgvS8pUvhvvtqfp7ddku37DRvng5AVadTXWUVFdHltffeEaJWrIgytmypEw2KJBRcRGTzYxZHUqV06xZX/q6ruXPh3Xej9aZHj2h5ycuLAcqvvQbPPhvLU+FnxYro6iorq3m7qaOvMluFOnSIw9H32SdCztSpEW5Sg50z/265ZQSs0tLYVp8+6gKTTYa6ikREGltZWQSYVJAxi3E4K1fCyy9H6EmFkmbNIoRMnx7n5Zk0KeZts0200MyZkx7EvC5FRXF4+tKlEXogtp+6de4MRx8Nhx4aYeyzzyLw7LBDtAx98UW0SjVvHmdm3m+/OH+PSAPRUUUiIpuKxYsjVGS2oCxdmj65YOrWqlUEktLSOEJr4sQIR0VFEZTKytK3SZPg44/rVo59943BzFOmwMKF6dYeiPMBFRXBttvG3yVLYt6OO0b3WffucQ2uVatibNKiRemrqBeoI0AUXEREpDZffAFvvRVdXjvvHGHo88+jVWj77SNsrF4d43z+9S8YMSICy3bbxckFU2dmzsuLcUMLFsCsWTGeCCIs1ba/ycuLUNamTay7dGmM79l/fxgwAMaNi/MGuUdI6tkzus723DPK0aVLDL
7+8stoOUq1Lg0YEHXQyQ+bDAUXERFpfCtXRjdW27YxPXFihI9UqCgsjC6voqLoipoxI1pnvv46BiO3bh3h6PXXo6use/fopmrePELSZ5/FEWQpBQXrHj/Url206GyxRQyINksHmdT96m7t2sWV2I88MlqwIMr+0UdRJvcoz6GH6oKm9UhHFYmISONr3jxuKX37xq2u3KOLrH37tVtNZs2CDz+MwcozZkR4KC6OkNKhQ3RHjRkT5/qZOzfOC5Q6ND51Sz1HdbcxY+CRR2KdwsKoT+oCplXreuKJ0Vo0YUIEtm23jS6wvLz04fnu6fvNm0erUXFxbNs9Hrd4cUzvsktsY9q0aBHr0iVajzp1Sj/vokUwfny8rpvJiRcVXEREZONmtu6dcteucavJgAHr/9zuEYxefjlCwooV0fIyYEB0T+XnxziiBx+Ev/89QkrfvjHm5/PPY0B1qg5msTx1f9my6s8nVJtOndKtPxMnRhkLC6NlaNddK7cmpS6n0aFDdPUtWhSBaaedovzNm8e6M2bEWajbtIlD8jt2jND31VcxL9Wa9M030SKWuvXpE+OpGpG6ikREROpDRUXl0JCNBQuiRaW8PB6Xukjp8uXRcjN9erS6bLddhIgPP4wxPKWlESz23DMuZvrf/8JTT8UZpjNbkioq6l4PswiDs2enH59qNapqzBjYffe6P0etRdAYFxERkc1PRUW0jCxcGK0yRUXR0vPppxGAVq+Odbp1iwHM8+bF4fiTJ0dg2mab9IVTzWK8UuZt//0bpItKY1xEREQ2R3l50YLTvn16XuvW0dV0wAHVP2a//RqnbOspq1MpmtnhZjbRzCab2dBqljc3sxHJ8vfMrDiZv5eZjU1uH5nZ8RmPmWpm45JlozPmdzSz/5jZpORvhw2vpoiIiGwKag0uZpYP3AEcAfQBTjGzPlVWOxtY6O7bA8OBG5P544GB7t4fOBz4i5lltvIc5O79qzQHDQVecffewCvJtIiIiEhWLS57AZPdfYq7rwIeA46tss6xwAPJ/SeBQ8zM3H2Zu6cOqm8BZDOgJnNbDwDHZfEYERER2QxkE1y6AzMypkuSedWukwSVxUAnADPb28wmAOOA8zKCjAMvmdkYMzsnY1td3H1Wcn820KW6QpnZOWY22sxGl5aWZlENERERaeoa/HKh7v6eu+8C7An83MxaJIv2c/fdiS6oC81srVFCHoc8VdtK4+53u/tAdx/YOXU8u4iIiGzSsgkuM4GtM6Z7JPOqXScZw9IemJ+5grt/CiwF+ibTM5O/c4GniS4pgDlm1jXZVldgbvbVERERkU1ZNsFlFNDbzHqaWTNgMDCyyjojgSHJ/ROBV93dk8cUAJjZtsBOwFQza21mbZP5rYHDiIG8Vbc1BPjn+lVNRERENjW1nsfF3cvM7CLgRSAfuM/dJ5jZtcBodx8J3As8ZGaTgQVEuAHYDxhqZquBCuACd59nZr2Apy3OLlgAPOLuLySP+T3wuJmdDUwDTq6vyoqIiEjTtkmcOdfMSomQ0xC2AOY10LY3FptDHWHzqKfquOnYHOqpOm466rue27p7tQNYN4ng0pDMbPS6Tju8qdgc6gibRz1Vx03H5lBP1XHT0Zj1bPCjikRERETqi4KLiIiINBkKLrW7O9cFaASbQx1h86in6rjp2BzqqTpuOhqtnhrjIiIiIk2GWlxERESkyVBwERERkSZDwWUdzOxwM5toZpPNbGiuy1NfzGxrM3vNzD4xswlmdmkyf5iZzTSzscntyFyXdUOY2VQzG5fUZXQyr6OZ/cfMJiV/O+S6nOvLzHbMeK/GmtkSM7tsU3gfzew+M5trZuMz5lX73ln4f8n/6cdmtnvuSp69ddTxZjP7LKnH02ZWlMwvNrPlGe/pXbkred2so57r/Iya2c+T93KimX0vN6Wum3XUcURG/aaa2dhkfpN8L2vYb+Tm/9LddatyI84Q/AXQC2gGfAT0yXW56qluXYHdk/ttgc+BPsAw4M
pcl68e6zkV2KLKvJuAocn9ocCNuS5nPdU1n7iS+rabwvsIHADsDoyv7b0DjgSeBwzYB3gv9+z1BwAABbJJREFU1+XfgDoeBhQk92/MqGNx5npN6baOelb7GU2+hz4CmgM9k+/g/FzXYX3qWGX5H4BrmvJ7WcN+Iyf/l2pxqd5ewGR3n+Luq4DHgGNzXKZ64e6z3P2D5P7XwKdA99yWqtEcCzyQ3H8AOC6HZalPhwBfuHtDnT26Ubn7m8SlQzKt6707FnjQw7tAkSUXad2YVVdHd3/J3cuSyXeJC9o2aet4L9flWOAxd1/p7l8Ck0lffHejVVMdLa5rczLwaKMWqp7VsN/Iyf+lgkv1ugMzMqZL2AR37mZWDAwA3ktmXZQ0693XlLtREg68ZGZjzOycZF4Xd5+V3J8NdMlN0erdYCp/MW5K72PKut67TfV/9SziF2tKTzP70MzeMLP9c1WoelTdZ3RTfC/3B+a4+6SMeU36vayy38jJ/6WCy2bKzNoATwGXufsS4E5gO6A/MIto3mzK9nP33YEjgAvN7IDMhR7tmU3+XAAWV2w/BngimbWpvY9r2VTeu3Uxs6uBMuDhZNYsYBt3HwBcDjxiZu1yVb56sMl/RjOcQuUfFU36vaxmv7FGY/5fKrhUbyawdcZ0j2TeJsHMCokP38Pu/g8Ad5/j7uXuXgH8lSbQRFsTd5+Z/J0LPE3UZ06quTL5Ozd3Jaw3RwAfuPsc2PTexwzreu82qf9VMzsDOAo4NdkRkHSdzE/ujyHGfuyQs0JuoBo+o5vae1kAnACMSM1ryu9ldfsNcvR/qeBSvVFAbzPrmfyiHQyMzHGZ6kXS53ov8Km735oxP7P/8XhgfNXHNhVm1trM2qbuE4MexxPv4ZBktSHAP3NTwnpV6RfdpvQ+VrGu924k8MPkKIZ9gMUZTddNipkdDvwMOMbdl2XM72xm+cn9XkBvYEpuSrnhaviMjgQGm1lzM+tJ1PP9xi5fPfou8Jm7l6RmNNX3cl37DXL1f5nr0cob640YFf05kYivznV56rFe+xHNeR8DY5PbkcBDwLhk/kiga67LugF17EUcnfARMCH1/gGdgFeAScDLQMdcl3UD69kamA+0z5jX5N9HIojNAlYTfeNnr+u9I45auCP5Px0HDMx1+TegjpOJcQGp/8u7knW/n3yOxwIfAEfnuvwbWM91fkaBq5P3ciJwRK7Lv751TObfD5xXZd0m+V7WsN/Iyf+lTvkvIiIiTYa6ikRERKTJUHARERGRJkPBRURERJoMBRcRERFpMhRcREREpMlQcBGRJs/MDjSzf+e6HCLS8BRcREREpMlQcBGRRmNmp5nZ+2Y21sz+Ymb5ZrbUzIab2QQze8XMOifr9jezd5OL8T2duhifmW1vZi+b2Udm9oGZbZdsvo2ZPWlmn5nZw8nZPjGz35vZJ8l2bslR1UWknii4iEijMLOdgUHAt929P1AOnEqcAXi0u+8CvAH8OnnIg8BV7r4rcfbN1PyHgTvcfTfgW8RZSyGuWHsZ0Ic4e/K3zawTcVr5XZLtXNewtRSRhqbgIiKN5RBgD2CUmY1NpnsBFaQvRPd3YD8zaw8UufsbyfwHgAOSa1B1d/enAdx9haev6/O+u5d4XLxvLFAMLAZWAPea2QnAmmsAiUjTpOAiIo3FgAfcvX9y29Hdh1Wz3vpeh2Rlxv1yoMDdy4irDz9JXHX5hfXctohsJBRcRKSxvAKcaGZbAphZRzPblvgeOjFZ5wfAW+6+GFhoZvsn808H3nD3r4ESMzsu2UZzM2u1ric0szbERSifA34C7NYQFRORxlOQ6wKIyObB3T8xs18CL5lZHnE13QuBb4C9kmVziXEwAEOAu5JgMgU4M5l/Ov+/vTu2gRAGggB4mxFAed8IJdAFZdDb5/4AYqIX6KSZAs52ttoLXLUn2a4Zn5tjl6o6kkx1Nj7rn58FPMzv0MCrknzHGPPb9wB6sCoCANrQuAAAbWhcAIA2BB
cAoA3BBQBoQ3ABANoQXACANn4PEzegteuQHgAAAABJRU5ErkJggg==\n",
"text/plain": [
"# (3) Training with <code>tf.GradientTape()</code> function and Learning rate decay\n",
"\n",
"Calculate the loss and gradients with the <code>tf.GradientTape()</code> function, and apply the gradients to the variables.\n",
"In addition, perform Learning rate decay in the optimizer.\n",
"\n",
"[Caution] Note that if you call the <code>save_images()</code> function during training, <code>encoder.predict()</code> and <code>decoder.predict()</code> will run and the execution will be very slow.\n",
"\n",
"## (3) <code>tf.GradientTape()</code> 関数と学習率減衰を使った学習\n",
"\n",
"<code>tf.GradientTape()</code> 関数を使って loss と gradients を計算して、gradients を変数に適用する。\n",
"さらに、optimizer において Learning rate decay を行う。\n",
"\n",
"(注意) training の途中で <code>save_images()</code> 関数を呼び出すと、<code>encoder.predict()</code> と <code>decoder.predict()</code> が動作して、実行が非常に遅くなるので注意すること。"
]
},
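The training-loop pattern described above can be sketched independently of the notebook's `AutoEncoder` class. This is a minimal, hypothetical example (a scalar quadratic loss stands in for the reconstruction loss and a single `tf.Variable` for the model weights): compute the loss inside the tape, take the gradients, and let the optimizer apply them.

```python
import tensorflow as tf

# Hypothetical stand-ins: `w` for the model weights, a quadratic for the loss.
w = tf.Variable(5.0)
optimizer = tf.keras.optimizers.Adam(learning_rate=0.1)

def train_step():
    with tf.GradientTape() as tape:
        loss = tf.square(w - 2.0)               # compute the loss inside the tape
    grads = tape.gradient(loss, [w])            # gradients w.r.t. the variables
    optimizer.apply_gradients(zip(grads, [w]))  # apply them via the optimizer
    return float(loss)

losses = [train_step() for _ in range(100)]
print(losses[0], losses[-1])  # the loss shrinks toward 0 as w approaches 2
```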
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "TYuLK26XIUYl"
},
"outputs": [],
"source": [
"save_path3 = '/content/drive/MyDrive/ColabRun/AE03/'"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "5_oyibFQbiBj"
},
"outputs": [],
"source": [
"from nw.AutoEncoder import AutoEncoder\n",
"\n",
"AE3 = AutoEncoder(\n",
" input_dim = (28, 28, 1),\n",
" encoder_conv_filters = [32, 64, 64, 64],\n",
" encoder_conv_kernel_size = [3, 3, 3, 3],\n",
" encoder_conv_strides = [1, 2, 2, 1],\n",
" decoder_conv_t_filters = [64, 64, 32, 1],\n",
" decoder_conv_t_kernel_size = [3, 3, 3, 3],\n",
" decoder_conv_t_strides = [1, 2, 2, 1],\n",
" z_dim = 2\n",
")"
]
},
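A quick sanity check on the architecture above, assuming the convolutions use `padding='same'` (the `nw.AutoEncoder` internals are not shown in this cell): each stride-2 layer halves the spatial size, so the 28×28 input reaches the bottleneck as a 7×7×64 feature map before being mapped to the 2-dimensional latent space.

```python
import math

# Spatial size after the encoder's strided convolutions, assuming padding='same'.
size = 28
for stride in [1, 2, 2, 1]:          # encoder_conv_strides from the cell above
    size = math.ceil(size / stride)  # 'same' padding: ceil division per layer
print(size)  # 7
```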
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "MMkuVG5TCDzr"
},
"outputs": [],
"source": [
"# With the default staircase=False: initial_learning_rate * decay_rate ^ (step / decay_steps)\n",
"\n",
"lr_schedule = tf.keras.optimizers.schedules.ExponentialDecay(\n",
" initial_learning_rate = learning_rate,\n",
" decay_steps = 1000,\n",
" decay_rate=0.96\n",
")\n",
"\n",
"optimizer3 = tf.keras.optimizers.Adam(learning_rate=lr_schedule)"
]
},
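With the default `staircase=False`, `ExponentialDecay` decays the rate continuously rather than in steps. A sketch of the formula it applies (the initial rate `1e-3` here is hypothetical; the notebook's `learning_rate` variable is set in an earlier cell):

```python
# lr(step) = initial_learning_rate * decay_rate ** (step / decay_steps)
def decayed_lr(initial_learning_rate, decay_rate, decay_steps, step):
    return initial_learning_rate * decay_rate ** (step / decay_steps)

print(decayed_lr(1e-3, 0.96, 1000, 0))     # 0.001  (no decay yet)
print(decayed_lr(1e-3, 0.96, 1000, 1000))  # 0.00096 (one full decay period)
```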
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"executionInfo": {
"elapsed": 117330,
"status": "ok",
"timestamp": 1637567482503,
"user": {
"displayName": "Yoshihisa Nitta",
"photoUrl": "https://lh3.googleusercontent.com/a-/AOh14GgJLeg9AmjfexROvC3P0wzJdd5AOGY_VOu-nxnh=s64",
"userId": "15888006800030996813"
},
"user_tz": -540
},
"id": "YNKUQta0bm9E",
"outputId": "29c08517-aed3-4a87-82e2-b435647313d4"
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"1/3 1875 loss: 0.0493 val loss: 0.0491 0:00:38.910057\n",
"2/3 1875 loss: 0.0451 val loss: 0.0451 0:01:17.719193\n",
"3/3 1875 loss: 0.0463 val loss: 0.0439 0:01:56.448946\n"
]
}
],
"source": [
"# At first, train for a few epochs.\n",
"# まず、少ない回数 training してみる\n",
"\n",
"loss3_1, vloss3_1 = AE3.train(\n",
" x_train,\n",
" x_train,\n",
" batch_size=32,\n",
" epochs = 3, \n",
" shuffle=True,\n",
" run_folder=save_path3,\n",
" optimizer = optimizer3,\n",
" save_epoch_interval=50,\n",
" validation_data=(x_test, x_test)\n",
" )"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"executionInfo": {
"elapsed": 5,
"status": "ok",
"timestamp": 1637567482504,
"user": {
"displayName": "Yoshihisa Nitta",
"photoUrl": "https://lh3.googleusercontent.com/a-/AOh14GgJLeg9AmjfexROvC3P0wzJdd5AOGY_VOu-nxnh=s64",
"userId": "15888006800030996813"
},
"user_tz": -540
},
"id": "cUDPmvBqepKF",
"outputId": "bb83cce5-bc96-4024-8a26-d6112eb0ff67"
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"3\n"
]
}
],
"source": [
"# Load the parameters and the weights saved before.\n",
"# 保存したパラメータと、重みを読み込む。\n",
"\n",
"AE3_work = AutoEncoder.load(save_path3)\n",
"print(AE3_work.epoch)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"executionInfo": {
"elapsed": 3167903,
"status": "ok",
"timestamp": 1637570650405,
"user": {
"displayName": "Yoshihisa Nitta",
"photoUrl": "https://lh3.googleusercontent.com/a-/AOh14GgJLeg9AmjfexROvC3P0wzJdd5AOGY_VOu-nxnh=s64",
"userId": "15888006800030996813"
},
"user_tz": -540
},
"id": "UuZRByoze7hn",
"outputId": "ff1a83ed-4193-42f6-d4f4-049b99a99d11"
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"4/200 1875 loss: 0.0441 val loss: 0.0432 0:00:16.470529\n",
"5/200 1875 loss: 0.0425 val loss: 0.0424 0:00:32.578862\n",
"6/200 1875 loss: 0.0418 val loss: 0.0416 0:00:48.702078\n",
"7/200 1875 loss: 0.0413 val loss: 0.0411 0:01:04.721034\n",
"8/200 1875 loss: 0.0410 val loss: 0.0408 0:01:20.828852\n",
"9/200 1875 loss: 0.0406 val loss: 0.0408 0:01:36.866276\n",
"10/200 1875 loss: 0.0403 val loss: 0.0405 0:01:52.938620\n",
"11/200 1875 loss: 0.0401 val loss: 0.0403 0:02:08.915218\n",
"12/200 1875 loss: 0.0398 val loss: 0.0403 0:02:24.993334\n",
"13/200 1875 loss: 0.0397 val loss: 0.0402 0:02:40.943671\n",
"14/200 1875 loss: 0.0395 val loss: 0.0400 0:02:57.026531\n",
"15/200 1875 loss: 0.0393 val loss: 0.0396 0:03:13.174460\n",
"16/200 1875 loss: 0.0392 val loss: 0.0397 0:03:29.264352\n",
"17/200 1875 loss: 0.0391 val loss: 0.0397 0:03:45.378318\n",
"18/200 1875 loss: 0.0389 val loss: 0.0395 0:04:01.329874\n",
"19/200 1875 loss: 0.0388 val loss: 0.0393 0:04:17.245191\n",
"20/200 1875 loss: 0.0387 val loss: 0.0394 0:04:33.400665\n",
"21/200 1875 loss: 0.0386 val loss: 0.0392 0:04:49.561788\n",
"22/200 1875 loss: 0.0385 val loss: 0.0391 0:05:05.574827\n",
"23/200 1875 loss: 0.0384 val loss: 0.0392 0:05:21.680731\n",
"24/200 1875 loss: 0.0384 val loss: 0.0390 0:05:37.694906\n",
"25/200 1875 loss: 0.0383 val loss: 0.0391 0:05:53.796563\n",
"26/200 1875 loss: 0.0382 val loss: 0.0390 0:06:09.769820\n",
"27/200 1875 loss: 0.0382 val loss: 0.0388 0:06:25.773891\n",
"28/200 1875 loss: 0.0381 val loss: 0.0389 0:06:41.814141\n",
"29/200 1875 loss: 0.0380 val loss: 0.0389 0:06:57.785303\n",
"30/200 1875 loss: 0.0380 val loss: 0.0389 0:07:13.735852\n",
"31/200 1875 loss: 0.0379 val loss: 0.0388 0:07:29.669668\n",
"32/200 1875 loss: 0.0379 val loss: 0.0388 0:07:45.531500\n",
"33/200 1875 loss: 0.0379 val loss: 0.0388 0:08:01.576522\n",
"34/200 1875 loss: 0.0378 val loss: 0.0387 0:08:17.534843\n",
"35/200 1875 loss: 0.0378 val loss: 0.0387 0:08:33.530811\n",
"36/200 1875 loss: 0.0378 val loss: 0.0387 0:08:49.452516\n",
"37/200 1875 loss: 0.0377 val loss: 0.0386 0:09:05.483624\n",
"38/200 1875 loss: 0.0377 val loss: 0.0387 0:09:21.607403\n",
"39/200 1875 loss: 0.0377 val loss: 0.0386 0:09:37.595950\n",
"40/200 1875 loss: 0.0376 val loss: 0.0386 0:09:53.711715\n",
"41/200 1875 loss: 0.0376 val loss: 0.0386 0:10:09.573170\n",
"42/200 1875 loss: 0.0376 val loss: 0.0386 0:10:25.672371\n",
"43/200 1875 loss: 0.0376 val loss: 0.0386 0:10:41.658941\n",
"44/200 1875 loss: 0.0376 val loss: 0.0386 0:10:57.703071\n",
"45/200 1875 loss: 0.0375 val loss: 0.0386 0:11:13.708545\n",
"46/200 1875 loss: 0.0375 val loss: 0.0386 0:11:29.633509\n",
"47/200 1875 loss: 0.0375 val loss: 0.0386 0:11:45.618107\n",
"48/200 1875 loss: 0.0375 val loss: 0.0386 0:12:01.542282\n",
"49/200 1875 loss: 0.0375 val loss: 0.0386 0:12:17.577870\n",
"50/200 1875 loss: 0.0375 val loss: 0.0386 0:12:34.355703\n",
"51/200 1875 loss: 0.0375 val loss: 0.0386 0:12:50.369939\n",
"52/200 1875 loss: 0.0374 val loss: 0.0385 0:13:06.322242\n",
"53/200 1875 loss: 0.0374 val loss: 0.0385 0:13:22.294449\n",
"54/200 1875 loss: 0.0374 val loss: 0.0386 0:13:38.360678\n",
"55/200 1875 loss: 0.0374 val loss: 0.0385 0:13:54.381169\n",
"56/200 1875 loss: 0.0374 val loss: 0.0385 0:14:10.436163\n",
"57/200 1875 loss: 0.0374 val loss: 0.0385 0:14:26.374065\n",
"58/200 1875 loss: 0.0374 val loss: 0.0385 0:14:42.461288\n",
"59/200 1875 loss: 0.0374 val loss: 0.0385 0:14:58.804114\n",
"60/200 1875 loss: 0.0374 val loss: 0.0385 0:15:14.948368\n",
"61/200 1875 loss: 0.0374 val loss: 0.0385 0:15:31.003709\n",
"62/200 1875 loss: 0.0374 val loss: 0.0385 0:15:47.036744\n",
"63/200 1875 loss: 0.0374 val loss: 0.0385 0:16:02.927857\n",
"64/200 1875 loss: 0.0374 val loss: 0.0385 0:16:19.033550\n",
"65/200 1875 loss: 0.0374 val loss: 0.0385 0:16:35.146949\n",
"66/200 1875 loss: 0.0374 val loss: 0.0385 0:16:51.198622\n",
"67/200 1875 loss: 0.0374 val loss: 0.0385 0:17:07.332641\n",
"68/200 1875 loss: 0.0374 val loss: 0.0385 0:17:23.460586\n",
"69/200 1875 loss: 0.0373 val loss: 0.0385 0:17:39.446418\n",
"70/200 1875 loss: 0.0373 val loss: 0.0385 0:17:55.528039\n",
"71/200 1875 loss: 0.0373 val loss: 0.0385 0:18:11.502721\n",
"72/200 1875 loss: 0.0373 val loss: 0.0385 0:18:27.473277\n",
"73/200 1875 loss: 0.0373 val loss: 0.0385 0:18:43.449839\n",
"74/200 1875 loss: 0.0373 val loss: 0.0385 0:18:59.486046\n",
"75/200 1875 loss: 0.0373 val loss: 0.0385 0:19:15.552210\n",
"76/200 1875 loss: 0.0373 val loss: 0.0385 0:19:31.626925\n",
"77/200 1875 loss: 0.0373 val loss: 0.0385 0:19:47.513213\n",
"78/200 1875 loss: 0.0373 val loss: 0.0385 0:20:03.722407\n",
"79/200 1875 loss: 0.0373 val loss: 0.0385 0:20:19.804345\n",
"80/200 1875 loss: 0.0373 val loss: 0.0385 0:20:35.791387\n",
"81/200 1875 loss: 0.0373 val loss: 0.0385 0:20:51.787138\n",
"82/200 1875 loss: 0.0373 val loss: 0.0385 0:21:07.730577\n",
"83/200 1875 loss: 0.0373 val loss: 0.0385 0:21:23.787720\n",
"84/200 1875 loss: 0.0373 val loss: 0.0385 0:21:39.687327\n",
"85/200 1875 loss: 0.0373 val loss: 0.0385 0:21:55.584867\n",
"86/200 1875 loss: 0.0373 val loss: 0.0385 0:22:11.523757\n",
"87/200 1875 loss: 0.0373 val loss: 0.0385 0:22:27.390298\n",
"88/200 1875 loss: 0.0373 val loss: 0.0385 0:22:43.472842\n",
"89/200 1875 loss: 0.0373 val loss: 0.0385 0:22:59.672602\n",
"90/200 1875 loss: 0.0373 val loss: 0.0385 0:23:15.707643\n",
"91/200 1875 loss: 0.0373 val loss: 0.0385 0:23:31.828762\n",
"92/200 1875 loss: 0.0373 val loss: 0.0385 0:23:47.825785\n",
"93/200 1875 loss: 0.0373 val loss: 0.0385 0:24:03.841613\n",
"94/200 1875 loss: 0.0373 val loss: 0.0385 0:24:19.741862\n",
"95/200 1875 loss: 0.0373 val loss: 0.0385 0:24:35.758435\n",
"96/200 1875 loss: 0.0373 val loss: 0.0385 0:24:51.756826\n",
"97/200 1875 loss: 0.0373 val loss: 0.0385 0:25:07.796537\n",
"98/200 1875 loss: 0.0373 val loss: 0.0385 0:25:24.006386\n",
"99/200 1875 loss: 0.0373 val loss: 0.0385 0:25:39.953159\n",
"100/200 1875 loss: 0.0373 val loss: 0.0385 0:25:56.680088\n",
"101/200 1875 loss: 0.0373 val loss: 0.0385 0:26:12.840496\n",
"102/200 1875 loss: 0.0373 val loss: 0.0385 0:26:28.849194\n",
"103/200 1875 loss: 0.0373 val loss: 0.0385 0:26:44.841663\n",
"104/200 1875 loss: 0.0373 val loss: 0.0385 0:27:00.859847\n",
"105/200 1875 loss: 0.0373 val loss: 0.0385 0:27:17.004825\n",
"106/200 1875 loss: 0.0373 val loss: 0.0385 0:27:33.131367\n",
"107/200 1875 loss: 0.0373 val loss: 0.0385 0:27:49.215559\n",
"108/200 1875 loss: 0.0373 val loss: 0.0385 0:28:05.210165\n",
"109/200 1875 loss: 0.0373 val loss: 0.0385 0:28:21.219173\n",
"110/200 1875 loss: 0.0373 val loss: 0.0385 0:28:37.222607\n",
"111/200 1875 loss: 0.0373 val loss: 0.0385 0:28:53.464457\n",
"112/200 1875 loss: 0.0373 val loss: 0.0385 0:29:09.544785\n",
"113/200 1875 loss: 0.0373 val loss: 0.0385 0:29:25.686480\n",
"114/200 1875 loss: 0.0373 val loss: 0.0385 0:29:41.597519\n",
"115/200 1875 loss: 0.0373 val loss: 0.0385 0:29:57.662729\n",
"116/200 1875 loss: 0.0373 val loss: 0.0385 0:30:13.863950\n",
"117/200 1875 loss: 0.0373 val loss: 0.0385 0:30:30.316802\n",
"118/200 1875 loss: 0.0373 val loss: 0.0385 0:30:46.468128\n",
"119/200 1875 loss: 0.0373 val loss: 0.0385 0:31:02.499419\n",
"120/200 1875 loss: 0.0373 val loss: 0.0385 0:31:18.614042\n",
"121/200 1875 loss: 0.0373 val loss: 0.0385 0:31:35.006121\n",
"122/200 1875 loss: 0.0373 val loss: 0.0385 0:31:51.303524\n",
"123/200 1875 loss: 0.0373 val loss: 0.0385 0:32:07.364468\n",
"124/200 1875 loss: 0.0373 val loss: 0.0385 0:32:23.438019\n",
"125/200 1875 loss: 0.0373 val loss: 0.0385 0:32:39.503072\n",
"126/200 1875 loss: 0.0373 val loss: 0.0385 0:32:55.600306\n",
"127/200 1875 loss: 0.0373 val loss: 0.0385 0:33:11.714923\n",
"128/200 1875 loss: 0.0373 val loss: 0.0385 0:33:27.757243\n",
"129/200 1875 loss: 0.0373 val loss: 0.0385 0:33:43.846899\n",
"130/200 1875 loss: 0.0373 val loss: 0.0385 0:33:59.986440\n",
"131/200 1875 loss: 0.0373 val loss: 0.0385 0:34:16.036559\n",
"132/200 1875 loss: 0.0373 val loss: 0.0385 0:34:32.108351\n",
"133/200 1875 loss: 0.0373 val loss: 0.0385 0:34:48.126711\n",
"134/200 1875 loss: 0.0373 val loss: 0.0385 0:35:04.243871\n",
"135/200 1875 loss: 0.0373 val loss: 0.0385 0:35:20.342651\n",
"136/200 1875 loss: 0.0373 val loss: 0.0385 0:35:36.834930\n",
"137/200 1875 loss: 0.0373 val loss: 0.0385 0:35:53.240132\n",
"138/200 1875 loss: 0.0373 val loss: 0.0385 0:36:09.494644\n",
"139/200 1875 loss: 0.0373 val loss: 0.0385 0:36:25.711864\n",
"140/200 1875 loss: 0.0373 val loss: 0.0385 0:36:41.715972\n",
"141/200 1875 loss: 0.0373 val loss: 0.0385 0:36:57.798700\n",
"142/200 1875 loss: 0.0373 val loss: 0.0385 0:37:14.012655\n",
"143/200 1875 loss: 0.0373 val loss: 0.0385 0:37:29.964565\n",
"144/200 1875 loss: 0.0373 val loss: 0.0385 0:37:45.947853\n",
"145/200 1875 loss: 0.0373 val loss: 0.0385 0:38:01.922393\n",
"146/200 1875 loss: 0.0373 val loss: 0.0385 0:38:18.113317\n",
"147/200 1875 loss: 0.0373 val loss: 0.0385 0:38:34.302843\n",
"148/200 1875 loss: 0.0373 val loss: 0.0385 0:38:50.460746\n",
"149/200 1875 loss: 0.0373 val loss: 0.0385 0:39:06.514157\n",
"150/200 1875 loss: 0.0373 val loss: 0.0385 0:39:23.229584\n",
"151/200 1875 loss: 0.0373 val loss: 0.0385 0:39:39.470055\n",
"152/200 1875 loss: 0.0373 val loss: 0.0385 0:39:55.751577\n",
"153/200 1875 loss: 0.0373 val loss: 0.0385 0:40:12.016928\n",
"154/200 1875 loss: 0.0373 val loss: 0.0385 0:40:28.212051\n",
"155/200 1875 loss: 0.0373 val loss: 0.0385 0:40:44.578358\n",
"156/200 1875 loss: 0.0373 val loss: 0.0385 0:41:00.970150\n",
"157/200 1875 loss: 0.0373 val loss: 0.0385 0:41:17.138623\n",
"158/200 1875 loss: 0.0373 val loss: 0.0385 0:41:33.213235\n",
"159/200 1875 loss: 0.0373 val loss: 0.0385 0:41:49.222445\n",
"160/200 1875 loss: 0.0373 val loss: 0.0385 0:42:05.256580\n",
"161/200 1875 loss: 0.0373 val loss: 0.0385 0:42:21.305456\n",
"162/200 1875 loss: 0.0373 val loss: 0.0385 0:42:37.293515\n",
"163/200 1875 loss: 0.0373 val loss: 0.0385 0:42:53.218787\n",
"164/200 1875 loss: 0.0373 val loss: 0.0385 0:43:09.221011\n",
"165/200 1875 loss: 0.0373 val loss: 0.0385 0:43:25.241022\n",
"166/200 1875 loss: 0.0373 val loss: 0.0385 0:43:41.364001\n",
"167/200 1875 loss: 0.0373 val loss: 0.0385 0:43:57.468188\n",
"168/200 1875 loss: 0.0373 val loss: 0.0385 0:44:13.478234\n",
"169/200 1875 loss: 0.0373 val loss: 0.0385 0:44:29.480388\n",
"170/200 1875 loss: 0.0373 val loss: 0.0385 0:44:45.507111\n",
"171/200 1875 loss: 0.0373 val loss: 0.0385 0:45:01.481223\n",
"172/200 1875 loss: 0.0373 val loss: 0.0385 0:45:17.528489\n",
"173/200 1875 loss: 0.0373 val loss: 0.0385 0:45:33.518117\n",
"174/200 1875 loss: 0.0373 val loss: 0.0385 0:45:49.637809\n",
"175/200 1875 loss: 0.0373 val loss: 0.0385 0:46:05.890984\n",
"176/200 1875 loss: 0.0373 val loss: 0.0385 0:46:21.956460\n",
"177/200 1875 loss: 0.0373 val loss: 0.0385 0:46:38.062078\n",
"178/200 1875 loss: 0.0373 val loss: 0.0385 0:46:54.053728\n",
"179/200 1875 loss: 0.0373 val loss: 0.0385 0:47:09.917701\n",
"180/200 1875 loss: 0.0373 val loss: 0.0385 0:47:25.930845\n",
"181/200 1875 loss: 0.0373 val loss: 0.0385 0:47:41.949047\n",
"182/200 1875 loss: 0.0373 val loss: 0.0385 0:47:57.975498\n",
"183/200 1875 loss: 0.0373 val loss: 0.0385 0:48:13.821878\n",
"184/200 1875 loss: 0.0373 val loss: 0.0385 0:48:29.675598\n",
"185/200 1875 loss: 0.0373 val loss: 0.0385 0:48:45.563435\n",
"186/200 1875 loss: 0.0373 val loss: 0.0385 0:49:01.501470\n",
"187/200 1875 loss: 0.0373 val loss: 0.0385 0:49:17.551920\n",
"188/200 1875 loss: 0.0373 val loss: 0.0385 0:49:33.465461\n",
"189/200 1875 loss: 0.0373 val loss: 0.0385 0:49:49.437168\n",
"190/200 1875 loss: 0.0373 val loss: 0.0385 0:50:05.438445\n",
"191/200 1875 loss: 0.0373 val loss: 0.0385 0:50:21.633043\n",
"192/200 1875 loss: 0.0373 val loss: 0.0385 0:50:37.783884\n",
"193/200 1875 loss: 0.0373 val loss: 0.0385 0:50:53.888186\n",
"194/200 1875 loss: 0.0373 val loss: 0.0385 0:51:10.185000\n",
"195/200 1875 loss: 0.0373 val loss: 0.0385 0:51:26.449233\n",
"196/200 1875 loss: 0.0373 val loss: 0.0385 0:51:42.700071\n",
"197/200 1875 loss: 0.0373 val loss: 0.0385 0:51:58.913375\n",
"198/200 1875 loss: 0.0373 val loss: 0.0385 0:52:15.079322\n",
"199/200 1875 loss: 0.0373 val loss: 0.0385 0:52:30.994392\n",
"200/200 1875 loss: 0.0373 val loss: 0.0385 0:52:47.570349\n"
]
}
],
"source": [
"# Additional training.\n",
"# 追加でtrainingする。\n",
"\n",
"# train_tf() compiles the loss and gradient computation into a TensorFlow 2 graph, so it is a little over twice as fast as train(). However, it is still nearly twice as slow as fit().\n",
"# train_tf() は loss と gradients を求める部分を tf の graph にコンパイルしているので、train() よりも2倍強高速になっている。しかし、それでも fit() よりは2倍近く遅い。\n",
"\n",
"loss3_2, vloss3_2 = AE3_work.train_tf(\n",
"    x_train,\n",
"    x_train,\n",
"    batch_size=32,\n",
"    epochs=MAX_EPOCHS,\n",
"    shuffle=True,\n",
"    run_folder=save_path3,\n",
"    optimizer=optimizer3,\n",
"    save_epoch_interval=50,\n",
"    validation_data=(x_test, x_test)\n",
")"
]
},
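{
"cell_type": "markdown",
"metadata": {},
"source": [
"The speed-up of `train_tf()` comes from compiling the per-batch loss and gradient computation into a TensorFlow graph with `@tf.function`. A minimal sketch of such a compiled train step is shown below; the model, optimizer, and MSE reconstruction loss here are assumptions for illustration, not the actual code of `AE3_work`.\n",
"\n",
"`train_tf()` の高速化は、バッチ毎の loss と gradient の計算を `@tf.function` で TensorFlow の graph にコンパイルすることによる。以下は説明用の最小スケッチである（`AE3_work` の実際のコードではない）。\n",
"\n",
"```python\n",
"import tensorflow as tf\n",
"\n",
"@tf.function  # traced once into a graph; later calls skip Python eager overhead\n",
"def train_step(model, optimizer, x, y):\n",
"    with tf.GradientTape() as tape:\n",
"        y_pred = model(x, training=True)\n",
"        loss = tf.reduce_mean(tf.square(y - y_pred))  # MSE reconstruction loss (assumed)\n",
"    grads = tape.gradient(loss, model.trainable_variables)\n",
"    optimizer.apply_gradients(zip(grads, model.trainable_variables))\n",
"    return loss\n",
"```\n",
"\n",
"`model.fit()` is still faster because Keras also compiles the whole epoch loop and input pipeline, not just the per-batch step."
]
},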
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"executionInfo": {
"elapsed": 16,
"status": "ok",
"timestamp": 1637570650405,
"user": {
"displayName": "Yoshihisa Nitta",
"photoUrl": "https://lh3.googleusercontent.com/a-/AOh14GgJLeg9AmjfexROvC3P0wzJdd5AOGY_VOu-nxnh=s64",
"userId": "15888006800030996813"
},
"user_tz": -540
},
"id": "XLDhBSeefgX_",
"outputId": "c7b046cc-9b3c-4098-e586-2295fd4bea23"
},
"outputs": [
{
"data": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAAAicAAAEGCAYAAAC+SDXTAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4yLjIsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy+WH4yJAAAgAElEQVR4nO3deXxV9Z3/8dcn+4WwiYhKQHBFlgI2Yhexi06LW9GqxZ06tk6tS3G3VSt1tB1Fxfm1jNZRW7RWoNRO6dSqHZeqo0OJEARElCJIgEKkgAQSyPL5/fE9gUtI4JLlnpvk/Xw87iP3fs/JyefLJdw33/M932PujoiIiEimyIq7ABEREZFkCiciIiKSURROREREJKMonIiIiEhGUTgRERGRjJITdwH748ADD/SBAwfGXYaIiIi0grfffvtjd+/TsL1dhZOBAwdSUlISdxkiIiLSCsxsZWPtOq0jIiIiGUXhRERERDKKwomIiIhklHY150RERCSTVFdXU1ZWRlVVVdylZLSCggKKiorIzc1NaX+FExERkWYqKyujW7duDBw4EDOLu5yM5O5s2LCBsrIyBg0alNL36LSOiIhIM1VVVdG7d28Fk70wM3r37r1fo0sKJyIiIi2gYLJv+/tnpHDyzjtwxx2wYUPclYiIiAgKJ7BkCdx9N6xfH3clIiIi+62wsDDuElqdwklBQfhaWRlvHSIiIgIonLCusjsv8WW2bdoRdykiIiLN5u7cdNNNDBs2jOHDhzNjxgwA1q5dy0knncTIkSMZNmwYr7/+OrW1tXzzm9/cue+UKVNirn53nf5S4pcWH8xFvMR7H73FMXEXIyIi7dfEiVBa2rrHHDkSHnoopV2fffZZSktLWbBgAR9//DHHH388J510Er/+9a/56le/ym233UZtbS3btm2jtLSU1atXs2jRIgA2bdrUunW3UKcfOUl0C/mscktNzJWIiIg03xtvvMEFF1xAdnY2ffv25Qtf+AJz587l+OOP5xe/+AWTJk1i4cKFdOvWjcMPP5zly5dzzTXX8Pzzz9O9e/e4y99Npx852RlOPqmOuRIREWnXUhzhSLeTTjqJ1157jT/+8Y9885vf5Prrr+fSSy9lwYIFvPDCCzzyyCPMnDmTJ554Iu5Sd9LISfewlG5lRW3MlYiIiDTfmDFjmDFjBrW1tZSXl/Paa68xevRoVq5cSd++ffn2t7/Nt771LebNm8fHH39MXV0d55xzDnfffTfz5s2Lu/zdaORE4URERDqAs88+m7feeosRI0ZgZtx3330cfPDBTJs2jcmTJ5Obm0thYSFPPvkkq1ev5rLLLqOurg6An/zkJzFXvzuFkx55AFRurYu5EhERkf1XUVEBhFVYJ0+ezOTJk3fbPmHCBCZMmLDH92XaaEkyndbpmQ9A5TaFExERkUygcLIznMRciIiIiAAKJyQKswEtECsiIpIpFE4S4avCiYiISGbo9OFk5611qnTLaxERkUzQ6cOJGRRYlcKJiIhIhuj04QQgYVVUbtcfhYiISCbQJzKQyNpB5Y7suMsQERFpU4WFhU1uW7FiBcOGDUtjNU1TOAES2dsVTkRERDJESivEmtlY4N+BbOAxd/+3BtvzgSeBTwMbgPHuviJp+wDgXWCSu98ftV0HfAtwYCFwmbtXtbRDzZHIqaayutMvlisiIi0wcSKUlrbuMUeO3Pv9BG+99Vb69+/PVVddBcCkSZPIycnhlVdeYePGjVRXV3P33Xczbty4/fq5VVVVXHnllZSUlJCTk8ODDz7Il770JRYvXsxll13Gjh07qKur47e//S2HHnoo3/jGNygrK6O2tpY77riD8ePHt6Tb+x45MbNsYCpwKjAEuMDMhjTY7XJgo7sfCUwB7m2w/UHgT0nH7AdcCxS7+zBC6Dm/uZ1oqURONZU1CiciItK+jB8/npkzZ+58PXPmTCZMmMDvfvc75s2bxyuvvMINN9
yAu+/XcadOnYqZsXDhQp555hkmTJhAVVUVjzzyCN/73vcoLS2lpKSEoqIinn/+eQ499FAWLFjAokWLGDt2bIv7lcon8mhgmbsvBzCz6cA4wkhIvXHApOj5LOBnZmbu7mZ2FvAhsLWRn50ws2qgC7Cm2b1ooURODZWVuXH9eBER6QD2NsLRVkaNGsX69etZs2YN5eXl9OrVi4MPPpjrrruO1157jaysLFavXs26des4+OCDUz7uG2+8wTXXXAPA4MGDOeyww3j//ff57Gc/yz333ENZWRlf//rXOeqooxg+fDg33HADt9xyC2eccQZjxoxpcb9SmXPSD1iV9Losamt0H3evATYDvc2sELgF+FHyzu6+Grgf+AhYC2x29xeb04HWkMitobI2L64fLyIi0mznnXces2bNYsaMGYwfP56nn36a8vJy3n77bUpLS+nbty9VVa0za+LCCy9k9uzZJBIJTjvtNF5++WWOPvpo5s2bx/Dhw7n99tu56667Wvxz2npC7CRgirtXJDeaWS/CaMsg4FCgq5ld3NgBzOwKMysxs5Ly8vI2KTKRV6twIiIi7dL48eOZPn06s2bN4rzzzmPz5s0cdNBB5Obm8sorr7By5cr9PuaYMWN4+umnAXj//ff56KOPOOaYY1i+fDmHH3441157LePGjeOdd95hzZo1dOnShYsvvpibbrqpVe52nMppndVA/6TXRVFbY/uUmVkO0IMwMfYE4Fwzuw/oCdSZWRWwDvjQ3csBzOxZ4HPArxr+cHd/FHgUoLi4eP9OmqUokV9LZV1+WxxaRESkTQ0dOpQtW7bQr18/DjnkEC666CLOPPNMhg8fTnFxMYMHD97vY373u9/lyiuvZPjw4eTk5PDLX/6S/Px8Zs6cyVNPPUVubi4HH3wwP/jBD5g7dy433XQTWVlZ5Obm8vDDD7e4T7avSTJR2HgfOJkQQuYCF7r74qR9rgKGu/t3zOx84Ovu/o0Gx5kEVLj7/WZ2AvAEcDxQCfwSKHH3n+6tluLiYi8pKdm/Hqbg8uFzeGFxEWV1Dc9WiYiING3JkiUce+yxcZfRLjT2Z2Vmb7t7ccN99zly4u41ZnY18ALhqpon3H2xmd1FCBSzgceBp8xsGfAP9nHljbvPMbNZwDygBphPNDoSh0S+U+kF4B7WsxcREZHYpHT9rLs/BzzXoO2HSc+rgPP2cYxJDV7fCdyZaqFtKVHgVJKAmhrI1VU7IiLScS1cuJBLLrlkt7b8/HzmzJkTU0V70uIeQCIBlXTBt32C9VA4ERGR1Lk71o5G3YcPH05pa68Wtw/7u86Klq8nhBOA7ZtjWaBWRETaqYKCAjZs2LDfH76dibuzYcMGCgoKUv4ejZwAia4h8VZu2k7BgJiLERGRdqOoqIiysjLaaqmLjqKgoICioqKU91c4ARJdwgBS5abt9Iq5FhERaT9yc3MZNGhQ3GV0ODqtAyS6RuFk846YKxERERGFEyBRmA1A5SfVMVciIiIiCidAols4u6VwIiIiEj+FE5JGTrbUxFyJiIiIKJwAie5hbZPKitqYKxERERGFExROREREMonCCUnhZGtdzJWIiIiIwgmQ6JkPKJyIiIhkAoUTksLJNi0/LCIiEjeFEyDRK6z3r3AiIiISP4UTktY5qYy5EBEREVE4AcjJNXKoplI3JRYREYmdwkkkYVVUVumPQ0REJG76NI4krIrK7RZ3GSIiIp2ewkkkkbWdyu3ZcZchIiLS6SmcRBLZO6isVjgRERGJm8JJJJG9g8odOXGXISIi0ukpnEQSOdVUViuciIiIxE3hJJLIqaGyJtxjh1dfhddfj7UeERGRzkpDBZFEbjX/qAwrxXLDDdC1K7z2WrxFiYiIdEIpjZyY2VgzW2pmy8zs1ka255vZjGj7HDMb2GD7ADOrMLMbk9p6mtksM3vPzJaY2Wdb2pmWSOTVUlmbF16sXAkbN8ZZjoiISKe1z3BiZtnAVOBUYAhwgZkNabDb5cBGdz
8SmALc22D7g8CfGrT9O/C8uw8GRgBL9r/81rMznGzdChs2wKZNcZYjIiLSaaUycjIaWObuy919BzAdGNdgn3HAtOj5LOBkMzMAMzsL+BBYXL+zmfUATgIeB3D3He4eaxpI5NdRWVcAH30UGjRyIiIiEotUwkk/YFXS67KordF93L0G2Az0NrNC4BbgRw32HwSUA78ws/lm9piZdW1G/a0mke9Uev6ucLJ1K1RXx1mSiIhIp9TWV+tMAqa4e0WD9hzgOOBhdx8FbAX2mMsCYGZXmFmJmZWUl5e3WaGJAqeSxK5wArB5c5v9PBEREWlcKuFkNdA/6XVR1NboPmaWA/QANgAnAPeZ2QpgIvADM7uaMPpS5u5zou+fRQgre3D3R9292N2L+/Tpk1KnmiNR4NSQS82yFbsadWpHREQk7VK5lHgucJSZDSKEkPOBCxvsMxuYALwFnAu87O4OjKnfwcwmARXu/rPo9SozO8bdlwInA++2sC8tkugSbvpXuWQF3eobNSlWREQk7fYZTty9JhrteAHIBp5w98VmdhdQ4u6zCRNbnzKzZcA/CAFmX64BnjazPGA5cFlzO9EaEonwtfL9VXTLzQ3zTRRORERE0i6lRdjc/TnguQZtP0x6XgWct49jTGrwuhQoTrXQtpboGs5wVX74dxgyBBYs0GkdERGRGGj5+sjOcLIjCz71qdCokRMREZG0UziJ7AwnJGDEiNCocCIiIpJ2CieRRGE2EIWTwYMhN1endURERGKgcBJJdAvTbypJwGGHQc+eGjkRERGJgcJJJNE9F4jCyYAB0KuXwomIiEgMFE4iO0dOuvSG7t3DyIlO64iIiKSdwkkk0SMPgMoDikKDTuuIiIjEQuEksjOc9Do0NOi0joiISCwUTiJdeuUDsLXbwaFBp3VERERioXAS6XFoV/LYzt/zDwsN9ad13OMtTEREpJNROIlYt0KK+jllB44MDb16wY4dUFUVb2EiIiKdjMJJkv5HFrBqbXS7oZ49w1ed2hEREUkrhZMk/fvDqlXRi/pwokmxIiIiaaVwkqR/f1i9GmprCad1QOFEREQkzRROkvTvDzU1sG4dOq0jIiISE4WTJEXR+murVqHTOiIiIjFROEnSv3/4WlaGwomIiEhMFE6S1IeT3UZOdFpHREQkrRROkhxwACQSUTjJy4MuXTRyIiIikmYKJ0nMGrmcWOFEREQkrRROGtgtnPTqpdM6IiIiaaZw0kBRkUZORERE4qRw0kD//rB2bVjvROFEREQk/RROGujfH+rqQkDRaR0REZH0SymcmNlYM1tqZsvM7NZGtueb2Yxo+xwzG9hg+wAzqzCzGxu0Z5vZfDP775Z0ojXtcTmxRk5ERETSap/hxMyyganAqcAQ4AIzG9Jgt8uBje5+JDAFuLfB9geBPzVy+O8BS/a36La0RzjZvDkMpYiIiEhapDJyMhpY5u7L3X0HMB0Y12CfccC06Pks4GQzMwAzOwv4EFic/A1mVgScDjzW/PJb327hpHfvEEw0eiIiIpI2qYSTfsCqpNdlUVuj+7h7DbAZ6G1mhcAtwI8aOe5DwM1ARg1LdO8OhYVRODnooNC4fn2sNYmIiHQmbT0hdhIwxd0rkhvN7Axgvbu/va8DmNkVZlZiZiXl5eVtVGbyzwujJ2VlQN++oXHdujb/uSIiIhLkpLDPaqB/0uuiqK2xfcrMLAfoAWwATgDONbP7gJ5AnZlVEUZavmZmpwEFQHcz+5W7X9zwh7v7o8CjAMXFxb4/nWuuI46ApUvZNXKicCIiIpI2qYSTucBRZjaIEELOBy5ssM9sYALwFnAu8LK7OzCmfgczmwRUuPvPoqbvR+1fBG5sLJjEZeRI+NOfoLJ7XxKg0zoiIiJptM/TOtEckquBFwhX1sx098VmdpeZfS3a7XHCHJNlwPXAHpcbtyejRkFtLSxc0xuysjRyIiIikkapjJzg7s8BzzVo+2HS8yrgvH0cY1IT7a8Cr6ZSR7qMGh
W+zn8nm9EHHqhwIiIikkZaIbYRAweGJU7mzydMitVpHRERkbRROGmEWZh3Mn8+YVKsRk5ERETSRuGkCaNGwTvvQE2fQzRyIiIikkYKJ00YNQqqqmBp7jCNnIiIiKSRwkkTdk6K3T4Etm4NDxEREWlzCidNGDwYCgpg/qZBoUGndkRERNJC4aQJOTkwfDjM//vBoUGndkRERNJC4WQvRo2C+R/2xEHhREREJE0UTvZi5EjYtCWHMop0WkdERCRNFE72YujQ8HUxQzVyIiIikiYKJ3uxM5wUFCuciIiIpInCyV707h1Wr1+cN0qndURERNJE4WQfhg6FxXXHauREREQkTRRO9mHoUHi3ahC+TiMnIiIi6aBwsg9Dh0JFTYKP1ubGXYqIiEinoHCyDzsnxW7uB9XV8RYjIiLSCSic7MNulxOXl8dbjIiISCegcLIPvXrBIb0qtdaJiIhImiicpGDoEdsVTkRERNJE4SQFQ0fk8C5DqFv6QdyliIiIdHgKJykYOror2+jKytdWxl2KiIhIh6dwkoKhwwyARSWVMVciIiLS8SmcpGDYsPD1nVW9YOvWeIsRERHp4BROUtC9Oxxx8FZKfQSUlsZdjoiISIeWUjgxs7FmttTMlpnZrY1szzezGdH2OWY2sMH2AWZWYWY3Rq/7m9krZvaumS02s++1Rmfa0sjjsljACHj77bhLERER6dD2GU7MLBuYCpwKDAEuMLMhDXa7HNjo7kcCU4B7G2x/EPhT0usa4AZ3HwJ8BriqkWNmlBGfSbCMI9ny1qK4SxEREenQUhk5GQ0sc/fl7r4DmA6Ma7DPOGBa9HwWcLKZGYCZnQV8CCyu39nd17r7vOj5FmAJ0K8lHWlrI0eCk8XC/9OcExERkbaUSjjpB6xKel3GnkFi5z7uXgNsBnqbWSFwC/Cjpg4enQIaBcxJteg4jBwZvpau6AkVFfEWIyIi0oG19YTYScAUd2/00zwKL78FJrr7J03sc4WZlZhZSXmM97YpKoJehTtYwKc0KVZERKQNpRJOVgP9k14XRW2N7mNmOUAPYANwAnCfma0AJgI/MLOro/1yCcHkaXd/tqkf7u6Punuxuxf36dMnpU61BTMYOcIpZSSUlMRWh4iISEeXSjiZCxxlZoPMLA84H5jdYJ/ZwITo+bnAyx6McfeB7j4QeAj4sbv/LJqP8jiwxN0fbJWepMHI0fkstE9R+2ZGn4ESERFp1/YZTqI5JFcDLxAmrs5098VmdpeZfS3a7XHCHJNlwPXAHpcbN/B54BLgy2ZWGj1Oa3Yv0mTECKj0BB/8+UOoq4u7HBERkQ4pJ5Wd3P054LkGbT9Mel4FnLePY0xKev4GYPtTaCbYOSl200AGz5sHxcXxFiQiItIBaYXY/XDssZCb68xnFLz4YtzliIiIdEgKJ/shLw+OO874S5fTFE5ERETaiMLJfjr1VPjrtqGUv7EUtmyJuxwREZEOR+FkP51+elgp9vnaU+DVV+MuR0REpMNRONlPxx0Hffs6f8wep1M7IiIibUDhZD9lZcFppxkv2FhqnnsR3OMuSUREpENROGmG00+HTTWFvLX8IHjzzbjLERER6VAUTprhlFMgJ8f5Y+7Z8PjjcZcjIiLSoSicNEOPHjBmjPHfheNhxgz4pNF7FoqIiEgzKJw003nnweKN/Zi7bUgIKCIiItIqFE6a6aKLoGtX55Ge39epHRERkVakcNJM3bvDRRcZz2w9k01z3oNFi+IuSUREpENQOGmB73wHKqtzeSrrmxo9ERERaSUKJy0wahSMHg0/63IzdzzSj5NOrOMPf4i7KhERkfZN4aSFrrwS3q84lB9XXce8klruvFPrsomIiLSEwkkLXXopvPJSHeX9RjF50MPMnw9z58ZdlYiISPulcNJCWVnwxS9nccC3z+Gi9+6ga5c6Hnkk7qpERETaL4WT1nLZZXS3LVx89FymT4eNG+MuSEREpH1SOGktAwbAOefwLx/cSGUlPPVU3A
WJiIi0Twonren22xm19Q1OKCpj8mTYsCHugkRERNofhZPWNGIEnHkmUz+5lPXrnUsvhbq6uIsSERFpXxROWtttt/HpT17hodP/h+eegx//WAFFRERkfyictLYTToCvfIXvvHo+55+9nTvugF694Ktfheeei7s4ERGRzKdw0hYeeAD7ZDO/7H0D06bBhRfCBx/A6afDuHGwcmXcBYqIiGSulMKJmY01s6VmtszMbm1ke76ZzYi2zzGzgQ22DzCzCjO7MdVjtmvDhsE115D/+H9w6dC3efhheO89uPdeeOkl+OxnQ1gRERGRPe0znJhZNjAVOBUYAlxgZkMa7HY5sNHdjwSmAPc22P4g8Kf9PGb7NmkSHHQQXHUV1NWRlwc33wxz5kB1NXzpS7BsWdxFioiIZJ5URk5GA8vcfbm77wCmA+Ma7DMOmBY9nwWcbGYGYGZnAR8Ci/fzmO1bjx5w330hjUybtrN56FB4+WWoqoKvfAVqa2OsUUREJAOlEk76AauSXpdFbY3u4+41wGagt5kVArcAP2rGMdu/iy+Gz30ObrlltyVjhw+Hn/0MPvwQ3nwzxvpEREQyUFtPiJ0ETHH3iuYewMyuMLMSMyspLy9vvcrSISsLpk4Nq7Hdeedum047DXJz4fe/j6k2ERGRDJVKOFkN9E96XRS1NbqPmeUAPYANwAnAfWa2ApgI/MDMrk7xmAC4+6PuXuzuxX369Emh3AwzciRceWUIKfPm7Wzu3h2+/OUQTtxjrE9ERCTDpBJO5gJHmdkgM8sDzgdmN9hnNjAhen4u8LIHY9x9oLsPBB4CfuzuP0vxmB3Hv/4r9O0L558Pn3yys3ncuDApdsmSGGsTERHJMPsMJ9EckquBF4AlwEx3X2xmd5nZ16LdHifMMVkGXA/s9dLgpo7Z/G5kuF69YPp0WL4cLr9851DJ16I/vfpTO5WVUNHsE2AiIiIdg3k7OqdQXFzsJSUlcZfRfJMnh+uJp0yBiRMBGD0azOBf/gWuuy4MrPToAf/0T+HOxgUFMdcsIiLSRszsbXcvbtieE0cxndaNN8Jbb8ENN8DgwTB2LOPGwe23w1//Cl/4Apx6ajjV89hjUFMDv/kN5OhdEhGRTkQfe+lkBk8+CWPGwDe+AW++yYUXDuOZZ+CKK+Dqq8MFPhAuN/7e98KIymOPhW8VERHpDBRO0q2wEP7wh3CDwDPOYNCcOSxa1HeP3a69Ftavh3vugTPPhLPOiqFWERGRGOjGf3EoKoLZs6G8PFyyU1nZ6G6TJsHRR8Mdd2glWRER6TwUTuLy6U/Dr34Vlre/7DKoq9tjl5yccBXyokXhYp+tW8Npnttvj6FeERGRNNHVOnG791649dZw9c6DD+4xuaSuLuSYzZvDVTylpaH997/fdSmyiIhIe9TU1ToaOYnbzTeHCSYPPbTHEvcQJsjec0+4D8+HH4ZQMnIkfPvb4axQskWL4OOP01S3iIhIG1E4iZtZWPfkn/85nMO566491rM/9dRwkU9JSRgtefJJ2LQp3Ffw5Zdh8eKw+Ozw4eGePZqfIiIi7ZnCSSbIyoJHH4VLLw2jJ9/97m4JwwwuuQSOPDK8Hj4c7r8fXnwRTj4Zhg0L82vPOw/mzoWf/zzst2YN3H03rG70rkUiIiKZSZcSZ4rsbPjlL+HQQ+Hf/g3WroVf/xq6dGl092uugQsvDKMp778fLjUuKoKNG+H73w9B5vLLoawM7rsvHPI739m1joqIiEim0kdVJjGDn/wEfvrTMBRyyimwYUOTu/fuDV/9aggq/fuHb586FaqqQntdXZijcsIJcNVV4ebIIiIimU7hJBNdfTXMmgXz5sHnPw8rVqT8rUcfDQ88EE73zJkT5qi8+GJYOf/RR+Hxx8N+f/5zmFT7wAPw6qth9GXduj2mu4iIiKSdLiXOZG+8EdJFfj489xyMGtXsQ9XWwtix8PrrcMYZ8NvfQteuYe2UZCeeGEZbDjighbWLiIjsgy4lbo9OPD
EElNxcOOkk+OMfm32o7Gx45hno2zeEj9tuC5cd//3v8PzzYT24e+7ZdQPC+km0W7fCE0+EuyTfcQdUV7dS30RERJqgkZP2YPXqcIOd0lL44Q/Do5kzW9euhW3b4IgjGt/+8sthRf3t2yGRCF+3bw9zWlatCmeZ7rknXAm0aVO4QujAA1vQNxER6bSaGjlROGkvKivDjNZp08IwRv2VPW3gnXfCSEp1dRhxOeusEEqmTw93T66o2LVv167hKqCvfx0+9amQn+6/P6y98t3vhm2JRJuUKSIi7ZzCSUfgDv/5n2Gp+0QCHnkEzj13jyXv29LKlbBgQRh5qamByZPD6aLkWwMdcAAMHgxvvhlOIxUXQ79+ux7V1WGy7tKlYRrNF78YzlytWQO9esHpp0P37ru6XFGxa/n+bt3S1lUREWljCicdydKlYVW2uXPDNcP//u9wzDGxlbN2bVhvpbQ0hJGLLw7Ls/zlL+Gq6L/9LZyZSl5uv0+fcGVRaemek3Lz8+H448PVQytW7D7PZeDAEHwOPTT8rIKCcIPEbdvCaaauXcMoz6hR4bgbNoS5NRs2QGEhjBgR1oAxC5OE6+rC16yscJzs7KaznntYR6a8PGTDwsLw8/Ly0poPRUQ6DIWTjqa6Oixqcued4ZTPD34QHnl5cVfWpO3bQ5BxDyHDLHRj/vwQDA45JNw/aObMMDG3f38YNCjMaenWLYSMRYvggw/Ccdat27WQrhn07BlGWVo6aTc7O4zkHHhgCFF1deG469fDli177p+TE8LSsceGhfC2bQvBKCsrHCcnZ9+P3Nzw1rnDP/4RQhCE9r09duwIb39d3a5jNNyWnR3+/Nrr6bV29E+USIc0cGC4NUpbUDjpqNatg+uvD6vJjhgBDz8Mn/1s3FWlTW1tCCN5eSEMVFaGUZxFi8JpoN69Q8jo3Tt84C9YEAJQVlb40M7ODs/dw2mq+sf27SEMffxx2KewMBxj4EA46KCw0N3WrSG0bNkSJgu/+264+qmwMIwc1dXtfsymHtXV4Wu9wsJwegvCtoaP5H0hhJusrLCtHf06i0g7ccYZ8Ic/tM2xFU46uv/6rzD7dN26cNnxbbfBV74Sd1WSIvdd4SI/f9/71oeavLwQTi0dIVcAAAycSURBVOrV1oYRk5qasC0vb9fIT2Vl+z391F7rFukI8vLCyHRbUDjpDCoq4LHH4MEHw3/lzz4bHnoIBgyIuzIREZE9aBG2zqCwMFzJs2xZuNPf88+HiRATJ4bLbERERNoBhZOOKC8PbrkFliwJlxpPnRqu/T3nnLAMfv0sUhERkQyUUjgxs7FmttTMlpnZrY1szzezGdH2OWY2MGofbWal0WOBmZ2d9D3XmdliM1tkZs+YWUFrdUoihx0WFm1bvhyuuy7cWOf008M1vE8/vfviJCIiIhlin+HEzLKBqcCpwBDgAjMb0mC3y4GN7n4kMAW4N2pfBBS7+0hgLPBzM8sxs37AtdG2YUA2cH5rdEga0b9/WC2trCzc7bh797AYyfDhof2jj+KuUEREZKdURk5GA8vcfbm77wCmA+Ma7DMOmBY9nwWcbGbm7tvcvf7CxwIgefZtDpAwsxygC7CmuZ2QFOXlhVM7b78d1qIvLISbbw4jLCeeGE7/JK+UJiIiEoNUwkk/YFXS67KordF9ojCyGegNYGYnmNliYCHwHXevcffVwP3AR8BaYLO7v9jYDzezK8ysxMxKyvXB2TqysmD8+LCG/LJlcPfdYXnVq68Oq4mdfTY8++zuN9ERERFJkzafEOvuc9x9KHA88H0zKzCzXoTRlkHAoUBXM7u4ie9/1N2L3b24T58+bV1u53PEEWFNlEWLwh3/Jk6Et94KIyy9e4fl8X/60zBvRUREJA1SCSergf5Jr4uitkb3iU7T9AA2JO/g7kuACmAYcArwobuXu3s18CzwueZ0QFpR/RyUVavg5ZfDSMrKlXDttSHEHHkkfPvbYTXatWvjrlZERDqoVMLJXO
AoMxtkZnmEiauzG+wzG5gQPT8XeNndPfqeHAAzOwwYDKwgnM75jJl1MTMDTgaWtLg30jpyc+FLX4IHHoD33gs3s3noIRg6FH7zG7joonD6Z/BguPLKcDOc9evjrlpERDqIlFaINbPTgIcIV9U84e73mNldQIm7z44uA34KGAX8Azjf3Zeb2SXArUA1UAfc5e7/FR3zR8B4oAaYD3zL3bfvrQ6tEJsBamvDrYRfeSU8Xn99193whgwJ9/U59lgYNiw879493npFRCRjafl6aRs1NTBv3q6wMn/+rlGUrCwYOTIElaOPhmOOCV+POqr93iJXRERajcKJpM+GDSGkvP46/O//hlNDqxtMUxowYPfAUv98wIBwG2AREenwmgonOY3tLNIivXvDKaeER72KijB35f33w2Pp0vD1qafgk0927ZeXFybeHnVUmNdyyCF7Pg46SAFGRKQDUziR9CgshFGjwiOZezgNVB9a6oPL3/4Gb7wRRmEaysoKAeWQQ5oOMH37hvku3bqFCb4iItJuKJxIvMxCkOjbF8aM2XP79u2wbh2sWRMuX274WLMmrHi7fn3T9wrKywvhqFu33b82t61r1xCQRESkTSicSGbLzw/zUAYM2Pt+NTVh6f36ELNuXTiVVFERriZq7Hn9PvVtVVWp19Wly+6BpaAAcnJ2f+Tm7tm2v+1tdQyz8MjK2vV8f9tERNqIwol0DDk5u07pNFdNza7gkhxamgo3ydu3bw/fX1MD27btel7/qK5Ora09aRhWmht4Gjvu3l535n1a67gi++OLX4T/+I+0/kiFE5F6OTnQs2d4xME9rCPT3GCzP23uux51dbu/bmlbqvs2dhqu4dWDjV1N2Fn3aa3jiuyvfY1ctwGFE5FMYbbrtIuISCemWX0iIiKSURROREREJKMonIiIiEhGUTgRERGRjKJwIiIiIhlF4UREREQyisKJiIiIZBSFExEREcko5u1o9UAzKwdWtsGhDwQ+boPjZprO0E/1sePoDP3sDH2EztFP9bF5DnP3Pg0b21U4aStmVuLuxXHX0dY6Qz/Vx46jM/SzM/QROkc/1cfWpdM6IiIiklEUTkRERCSjKJwEj8ZdQJp0hn6qjx1HZ+hnZ+gjdI5+qo+tSHNOREREJKNo5EREREQyisKJiIiIZJROH07MbKyZLTWzZWZ2a9z1tAYz629mr5jZu2a22My+F7VPMrPVZlYaPU6Lu9aWMrMVZrYw6k9J1HaAmf3ZzD6IvvaKu87mMrNjkt6vUjP7xMwmdoT30syeMLP1ZrYoqa3R986C/xf9nr5jZsfFV3nqmujjZDN7L+rH78ysZ9Q+0Mwqk97TR+KrPHVN9LHJv59m9v3ofVxqZl+Np+r900QfZyT1b4WZlUbt7fJ9hL1+dqT/99LdO+0DyAb+BhwO5AELgCFx19UK/ToEOC563g14HxgCTAJujLu+Vu7rCuDABm33AbdGz28F7o27zlbqazbwd+CwjvBeAicBxwGL9vXeAacBfwIM+AwwJ+76W9DHrwA50fN7k/o4MHm/9vJooo+N/v2M/h1aAOQDg6J/f7Pj7kNz+thg+wPAD9vz+xjV3tRnR9p/Lzv7yMloYJm7L3f3HcB0YFzMNbWYu69193nR8y3AEqBfvFWl1ThgWvR8GnBWjLW0ppOBv7l7W6ySnHbu/hrwjwbNTb1344AnPfg/oKeZHZKeSpuvsT66+4vuXhO9/D+gKO2FtaIm3semjAOmu/t2d/8QWEb4dzij7a2PZmbAN4Bn0lpUG9jLZ0fafy87ezjpB6xKel1GB/sQN7OBwChgTtR0dTT89kR7Pt2RxIEXzextM7siauvr7muj538H+sZTWqs7n93/Aexo7yU0/d511N/Vfyb8z7PeIDObb2Z/MbMxcRXVShr7+9kR38cxwDp3/yCprd2/jw0+O9L+e9nZw0mHZmaFwG+Bie7+CfAwcAQwElhLGIps70509+OAU4GrzOyk5I0exh7b/fXyZp
YHfA34TdTUEd/L3XSU964pZnYbUAM8HTWtBQa4+yjgeuDXZtY9rvpaqMP//UxyAbv/p6Hdv4+NfHbslK7fy84eTlYD/ZNeF0Vt7Z6Z5RL+cj3t7s8CuPs6d6919zrgP2kHw6n74u6ro6/rgd8R+rSufmgx+ro+vgpbzanAPHdfBx3zvYw09d51qN9VM/smcAZwUfSPPdGpjg3R87cJ8zGOjq3IFtjL38+O9j7mAF8HZtS3tff3sbHPDmL4vezs4WQucJSZDYr+Z3o+MDvmmlosOgf6OLDE3R9Mak8+F3g2sKjh97YnZtbVzLrVPydMNFxEeA8nRLtNAH4fT4Wtarf/nXW09zJJU+/dbODS6OqAzwCbk4aZ2xUzGwvcDHzN3bcltfcxs+zo+eHAUcDyeKpsmb38/ZwNnG9m+WY2iNDHv6a7vlZ0CvCeu5fVN7Tn97Gpzw7i+L2Me3Zw3A/CbOP3Cen2trjraaU+nUgYdnsHKI0epwFPAQuj9tnAIXHX2sJ+Hk6Y+b8AWFz//gG9gZeAD4D/AQ6Iu9YW9rMrsAHokdTW7t9LQthaC1QTzlVf3tR7R7gaYGr0e7oQKI67/hb0cRnhPH397+Yj0b7nRH+PS4F5wJlx19+CPjb59xO4LXoflwKnxl1/c/sYtf8S+E6Dfdvl+xjV3tRnR9p/L7V8vYiIiGSUzn5aR0RERDKMwomIiIhkFIUTERERySgKJyIiIpJRFE5EREQkoyiciEi7YGZfNLP/jrsOEWl7CiciIiKSURRORKRVmdnFZvZXMys1s5+bWbaZVZjZFDNbbGYvmVmfaN+RZvZ/0Q3ifld/gzgzO9LM/sfMFpjZPDM7Ijp8oZnNMrP3zOzpaEVLzOzfzOzd6Dj3x9R1EWklCici0mrM7FhgPPB5dx8J1AIXEVa5LXH3ocBfgDujb3kSuMXdP0VYYbK+/WlgqruPAD5HWJ0Twl1SJwJDCCsEf97MehOWSB8aHefutu2liLQ1hRMRaU0nA58G5ppZafT6cKCOXTdH+xVwopn1AHq6+1+i9mnASdH9kvq5++8A3L3Kd92D5q/uXubhhnKlwEBgM1AFPG5mXwd23q9GRNonhRMRaU0GTHP3kdHjGHef1Mh+zb1vxvak57VAjrvXEO56O4twp9/nm3lsEckQCici0ppeAs41s4MAzOwAMzuM8G/NudE+FwJvuPtmYKOZjYnaLwH+4u5bgDIzOys6Rr6ZdWnqB5pZIeGmiM8B1wEj2qJjIpI+OXEXICIdh7u/a2a3Ay+aWRbhLq5XAVuB0dG29YR5KRBuv/5IFD6WA5dF7ZcAPzezu6JjnLeXH9sN+L2ZFRBGbq5v5W6JSJrprsQi0ubMrMLdC+OuQ0TaB53WERERkYyikRMRERHJKBo5ERERkYyicCIiIiIZReFEREREMorCiYiIiGQUhRMRERHJKP8f4PCQ6IDgZLgAAAAASUVORK5CYII=\n",
"text/plain": [
"