diff --git a/notebooks/basics/0_getting_started.ipynb b/notebooks/basics/0_getting_started.ipynb
deleted file mode 100644
index 07b2a2ba6d5a21be15de5c4061500d83b2aefdf3..0000000000000000000000000000000000000000
--- a/notebooks/basics/0_getting_started.ipynb
+++ /dev/null
@@ -1,129 +0,0 @@
-{
- "cells": [
-  {
-   "cell_type": "markdown",
-   "metadata": {},
-   "source": [
-    "# FINN Basics\n",
-    "\n",
-    "\n",
-    "## What is FINN?\n",
-    "\n",
-    "'FINN' is colloquially used to refer to two separate but highly related things:\n",
-    "\n",
-    "* The [FINN project](https://xilinx.github.io/finn/), which includes tools for training quantized neural networks such as [Brevitas](github.com/Xilinx/brevitas), the FINN compiler, and the [finn-hlslib](github.com/Xilinx/finn-hlslib) Vivado HLS library of FPGA components for QNNs.\n",
-    "* This repository, referred to as the *FINN compiler*, which is the centerpiece of the FINN project.\n",
-    "\n",
-    "## How to use the FINN compiler?\n",
-    "\n",
-    "The FINN compiler should not be thought of a single pushbutton tool that does everything for you, but rather as a collection of scripts/tools that will help you convert a QNN into a custom FPGA accelerator that performs high-performance inference. We do provide several examples of taking trained networks all the way down to FPGA bitfiles, but if you are trying to do this for custom networks you will have to write your own Python scripts that call the appropriate FINN Compiler functions that process your design correctly, or adding new functions as required."
-   ]
-  },
-  {
-   "cell_type": "markdown",
-   "metadata": {},
-   "source": [
-    "## Requirements\n",
-    "\n",
-    "* Ubuntu 18.04\n",
-    "* Docker\n",
-    "* A working Vivado installation\n",
-    "* A `VIVADO_PATH` environment variable pointing to the Vivado installation directory (e.g. the directory where settings64.sh is located)\n",
-    "\n",
-    "\n",
-    "## Running FINN with Docker\n",
-    "\n",
-    "We use Docker extensively for developing and deploying FINN. If you are not familiar with Docker, there are many excellent [online resources]( https://docker-curriculum.com/) to get started. There is a Dockerfile in the root of the repository, as well as a `run-docker.sh` script that can be launched in the following modes:\n",
-    "\n",
-    "### Getting an interactive shell for development or experimentation\n",
-    "\n",
-    "Simply running `sh run-docker.sh` without any additional arguments will clone the dependency repos, create a Docker container and give you a terminal with you can use for development for experimentation. \n",
-    "\n",
-    "*Important:* the Docker container is spawned with the `--rm` option, so make sure that any important files you created inside the container are either in the /workspace/finn folder (which is mounted from the host cinoyter) or otherwise backed up.\n",
-    "\n",
-    "*Develop from host, run inside container:* The FINN repository directory will be mounted from the host, so that you can use a text editor on your host computer to develop and the changes will be reflected directly inside the container.\n",
-    "\n",
-    "### Running the Jupyter notebooks\n",
-    "\n",
-    "```sh run-docker.sh notebook```\n",
-    "\n",
-    "This will launch the Jupyter notebook server inside a Docker container, and print a link on the terminal that you can open in your browser to run the FINN notebooks or create new ones. The link will look something like this (the token you get will be different):\n",
-    "\n",
-    "`http://127.0.0.1:8888/?token=f5c6bd32ae93ec103a88152214baedff4ce1850d81065bfc`\n",
-    "\n",
-    "The `run-docker.sh` script forwards ports 8888 for Jupyter and 8081 for Netron, and launches the notebook server with appropriate arguments.\n",
-    "\n",
-    "### Running the test suite\n",
-    "\n",
-    "FINN comes with a set of tests which you can easily launch in Docker as follows:\n",
-    "\n",
-    "```sh run-docker.sh test```\n",
-    "\n",
-    "Note that some of the tests involve extra compilation and the entire test suite may take some time to complete. "
-   ]
-  },
-  {
-   "cell_type": "markdown",
-   "metadata": {},
-   "source": [
-    "## Intermediate Representation: FINN-ONNX\n",
-    "\n",
-    "FINN uses [ONNX](onnx.ai) as an intermediate representation (IR) for neural networks. As such, almost every component inside FINN uses ONNX and its [Python API](https://github.com/onnx/onnx/blob/master/docs/PythonAPIOverview.md), so you may want to familiarize yourself with how ONNX represents DNNs. Specifically, the [ONNX protobuf description](https://github.com/onnx/onnx/blob/master/onnx/onnx.proto) (or its [human-readable documentation](https://github.com/onnx/onnx/blob/master/docs/IR.md) and the [operator schemas](https://github.com/onnx/onnx/blob/master/docs/Operators.md)  are useful as reference documents.\n",
-    "\n",
-    "FINN uses ONNX is a specific way that we refer to as FINN-ONNX, and not all ONNX graphs are supported by FINN (and vice versa). Here is a list of key points to keep in mind:\n",
-    "\n",
-    "* *Custom quantization annotations but data stored as float.* ONNX does not support datatypes smaller than 8-bit integers, whereas in FINN we are interested in smaller integers down to ternary and bipolar. To make this work, FINN uses the `quantization_annotation` field in ONNX to annotate tensors with their [FINN DataType](https://github.com/Xilinx/finn/blob/master/src/finn/core/datatype.py) information. However, all tensors are expected to use single-precision floating point (float32) storage in FINN. This means we store even a 1-bit value as floating point for the purposes of representation. The FINN compiler flow is responsible for eventually producing a packed representation for the target hardware, where the 1-bit is actually stored as 1-bit.\n",
-    "\n",
-    "* *Custom operations/nodes.* FINN uses many custom operations (`op_type` in ONNX NodeProto) that are not defined in the ONNX operator schema. These custom nodes are marked with `domain=\"finn\"` in the protobuf to identify them as such. These nodes can represent specific operations that we need for low-bit networks, or operations that are specific to a particular hardware backend.\n",
-    "\n",
-    "* *Custom ONNX execution flow* To verify correct operation of FINN-ONNX graphs, FINN provides its own [ONNX execution flow](https://github.com/Xilinx/finn/blob/master/src/finn/core/onnx_exec.py). This flow supports the standard set of ONNX operations as well as the custom FINN operations.  *Important:* this execution flow is *only* meant for checking the correctness of models after applying transformations, and *not* for high performance inference. \n",
-    "\n",
-    "* *ModelWrapper* FINN provides a [`ModelWrapper`](https://github.com/Xilinx/finn/blob/master/src/finn/core/modelwrapper.py) class as a thin wrapper around ONNX to make it easier to analyze and manipulate ONNX graphs. This wrapper provides many helper functions, while still giving full access to the ONNX protobuf representation. \n",
-    "\n",
-    "[Netron](https://lutzroeder.github.io/netron/) is very useful for visualizing ONNX models, including FINN-ONNX models."
-   ]
-  },
-  {
-   "cell_type": "markdown",
-   "metadata": {},
-   "source": [
-    "## More FINN Resources\n",
-    "\n",
-    "* **[List of publications](https://github.com/Xilinx/finn/blob/master/docs/publications.md)**\n",
-    "* **[Roadmap](https://github.com/Xilinx/finn/projects/1)**\n",
-    "* **[Status of example networks](https://github.com/Xilinx/finn/blob/master/docs/example-networks.md)**\n",
-    "\n",
-    "\n",
-    "\n"
-   ]
-  },
-  {
-   "cell_type": "code",
-   "execution_count": null,
-   "metadata": {},
-   "outputs": [],
-   "source": []
-  }
- ],
- "metadata": {
-  "kernelspec": {
-   "display_name": "Python 3",
-   "language": "python",
-   "name": "python3"
-  },
-  "language_info": {
-   "codemirror_mode": {
-    "name": "ipython",
-    "version": 3
-   },
-   "file_extension": ".py",
-   "mimetype": "text/x-python",
-   "name": "python",
-   "nbconvert_exporter": "python",
-   "pygments_lexer": "ipython3",
-   "version": "3.6.8"
-  }
- },
- "nbformat": 4,
- "nbformat_minor": 2
-}
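The deleted getting-started notebook above describes the FINN-ONNX conventions (float32 tensor storage plus FINN DataType annotations) and the `ModelWrapper` helper. Below is a minimal, hedged sketch of what that looks like in practice; module paths follow the files linked in the notebook text (`finn.core.modelwrapper`, `finn.core.datatype`), the filename is a placeholder, and exact signatures may differ between FINN versions.

```python
# Sketch only (not part of this diff): annotating a FINN-ONNX tensor with a
# FINN DataType via ModelWrapper. Assumes the modules referenced in the
# notebook text above; adjust paths/names to your FINN version.
from finn.core.datatype import DataType
from finn.core.modelwrapper import ModelWrapper

# wrap an existing ONNX file; the wrapper keeps full access to the protobuf
model = ModelWrapper("model.onnx")  # placeholder filename

# tensors are stored as float32, but can carry a FINN DataType annotation
inp_name = model.graph.input[0].name
print("current annotation:", model.get_tensor_datatype(inp_name))

# mark the input as bipolar (-1/+1); values stay float32 in the protobuf,
# packing to actual 1-bit storage happens later in the hardware flow
model.set_tensor_datatype(inp_name, DataType.BIPOLAR)
model.save("model_annotated.onnx")
```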
diff --git a/notebooks/basics/1_how_to_work_with_onnx.ipynb b/notebooks/basics/0_how_to_work_with_onnx.ipynb
similarity index 92%
rename from notebooks/basics/1_how_to_work_with_onnx.ipynb
rename to notebooks/basics/0_how_to_work_with_onnx.ipynb
index 29b2751aff73706d5590c6641b86104368816922..4b4fc456944a49c0f2478766dae3c5fb8b4e3da6 100644
--- a/notebooks/basics/1_how_to_work_with_onnx.ipynb
+++ b/notebooks/basics/0_how_to_work_with_onnx.ipynb
@@ -6,7 +6,7 @@
    "source": [
     "# FINN - How to work with ONNX\n",
     "\n",
-    "This notebook should give an overview of ONNX ProtoBuf, help to create and manipulate an ONNX model and use FINN functions to work with it. There may be overlaps to other notebooks, like [ModelWrapper](2_modelwrapper.ipynb) and [CustomOps](../internals/2_custom_op.ipynb), but this notebook will give an overview about the handling of ONNX models in FINN."
+    "This notebook should give an overview of ONNX ProtoBuf, help to create and manipulate an ONNX model and use FINN functions to work with it. There may be overlaps to notebook [ModelWrapper](2_modelwrapper.ipynb), but this notebook will give an overview about the handling of ONNX models in FINN."
    ]
   },
   {
@@ -173,18 +173,9 @@
    "cell_type": "code",
    "execution_count": 6,
    "metadata": {},
-   "outputs": [
-    {
-     "name": "stdout",
-     "output_type": "stream",
-     "text": [
-      "Serving 'simple_model.onnx' at http://0.0.0.0:8081\n"
-     ]
-    }
-   ],
+   "outputs": [],
    "source": [
-    "import netron\n",
-    "netron.start('simple_model.onnx', port=8081, host=\"0.0.0.0\")"
+    "from finn.util.visualization import showInNetron"
    ]
   },
   {
@@ -194,22 +185,37 @@
     "scrolled": true
    },
    "outputs": [
+    {
+     "name": "stdout",
+     "output_type": "stream",
+     "text": [
+      "Serving 'simple_model.onnx' at http://0.0.0.0:8081\n"
+     ]
+    },
     {
      "data": {
       "text/html": [
-       "<iframe src=\"http://0.0.0.0:8081/\" style=\"position: relative; width: 100%;\" height=\"400\"></iframe>\n"
+       "\n",
+       "        <iframe\n",
+       "            width=\"100%\"\n",
+       "            height=\"400\"\n",
+       "            src=\"http://0.0.0.0:8081/\"\n",
+       "            frameborder=\"0\"\n",
+       "            allowfullscreen\n",
+       "        ></iframe>\n",
+       "        "
       ],
       "text/plain": [
-       "<IPython.core.display.HTML object>"
+       "<IPython.lib.display.IFrame at 0x7fb9303c7b38>"
       ]
      },
+     "execution_count": 7,
      "metadata": {},
-     "output_type": "display_data"
+     "output_type": "execute_result"
     }
    ],
    "source": [
-    "%%html\n",
-    "<iframe src=\"http://0.0.0.0:8081/\" style=\"position: relative; width: 100%;\" height=\"400\"></iframe>"
+    "showInNetron('simple_model.onnx')"
    ]
   },
   {
@@ -310,16 +316,16 @@
      "output_type": "stream",
      "text": [
       "The output of the ONNX model is: \n",
-      "[[12.  9. 14.  8.]\n",
-      " [ 9.  9.  4.  6.]\n",
-      " [ 3. 19.  9.  5.]\n",
-      " [ 8. 22.  7.  2.]]\n",
+      "[[ 1. 16.  3. 10.]\n",
+      " [ 5. 17. 17. 13.]\n",
+      " [ 3. 11. 10. 17.]\n",
+      " [ 9.  2.  4.  8.]]\n",
       "\n",
       "The output of the reference function is: \n",
-      "[[12.  9. 14.  8.]\n",
-      " [ 9.  9.  4.  6.]\n",
-      " [ 3. 19.  9.  5.]\n",
-      " [ 8. 22.  7.  2.]]\n",
+      "[[ 1. 16.  3. 10.]\n",
+      " [ 5. 17. 17. 13.]\n",
+      " [ 3. 11. 10. 17.]\n",
+      " [ 9.  2.  4.  8.]]\n",
       "\n",
       "The results are the same!\n"
      ]
@@ -349,7 +355,7 @@
    "source": [
     "### How to manipulate an ONNX model\n",
     "\n",
-    "In the model there are two successive adder nodes. An adder node in ONNX can only add two inputs, but there is also the [**sum**](https://github.com/onnx/onnx/blob/master/docs/Operators.md#Sum) node, which can process more than one input. So it would be a reasonable change of the graph to combine the two successive adder nodes to one sum node."
+    "In the model there are two successive adder nodes. An adder node in ONNX can only add two inputs, but there is also the [**sum**](https://github.com/onnx/onnx/blob/master/docs/Operators.md#Sum) node, which can process more than two inputs. So it would be a reasonable change of the graph to combine the two successive adder nodes to one sum node."
    ]
   },
   {
@@ -363,7 +369,7 @@
   },
   {
    "cell_type": "code",
-   "execution_count": 14,
+   "execution_count": 13,
    "metadata": {},
    "outputs": [],
    "source": [
@@ -380,7 +386,7 @@
   },
   {
    "cell_type": "code",
-   "execution_count": 15,
+   "execution_count": 14,
    "metadata": {},
    "outputs": [],
    "source": [
@@ -404,7 +410,7 @@
   },
   {
    "cell_type": "code",
-   "execution_count": 16,
+   "execution_count": 15,
    "metadata": {},
    "outputs": [],
    "source": [
@@ -427,7 +433,7 @@
   },
   {
    "cell_type": "code",
-   "execution_count": 17,
+   "execution_count": 16,
    "metadata": {},
    "outputs": [
     {
@@ -455,7 +461,7 @@
   },
   {
    "cell_type": "code",
-   "execution_count": 18,
+   "execution_count": 17,
    "metadata": {},
    "outputs": [],
    "source": [
@@ -484,7 +490,7 @@
   },
   {
    "cell_type": "code",
-   "execution_count": 19,
+   "execution_count": 18,
    "metadata": {},
    "outputs": [],
    "source": [
@@ -514,7 +520,7 @@
   },
   {
    "cell_type": "code",
-   "execution_count": 20,
+   "execution_count": 19,
    "metadata": {},
    "outputs": [
     {
@@ -550,7 +556,7 @@
   },
   {
    "cell_type": "code",
-   "execution_count": 21,
+   "execution_count": 20,
    "metadata": {},
    "outputs": [
     {
@@ -585,7 +591,7 @@
   },
   {
    "cell_type": "code",
-   "execution_count": 22,
+   "execution_count": 21,
    "metadata": {},
    "outputs": [],
    "source": [
@@ -601,7 +607,7 @@
   },
   {
    "cell_type": "code",
-   "execution_count": 23,
+   "execution_count": 22,
    "metadata": {},
    "outputs": [],
    "source": [
@@ -622,7 +628,7 @@
   },
   {
    "cell_type": "code",
-   "execution_count": 24,
+   "execution_count": 23,
    "metadata": {},
    "outputs": [],
    "source": [
@@ -650,7 +656,7 @@
   },
   {
    "cell_type": "code",
-   "execution_count": 25,
+   "execution_count": 24,
    "metadata": {},
    "outputs": [],
    "source": [
@@ -660,7 +666,7 @@
   },
   {
    "cell_type": "code",
-   "execution_count": 26,
+   "execution_count": 25,
    "metadata": {},
    "outputs": [
     {
@@ -671,34 +677,31 @@
       "Stopping http://0.0.0.0:8081\n",
       "Serving 'simple_model1.onnx' at http://0.0.0.0:8081\n"
      ]
-    }
-   ],
-   "source": [
-    "import netron\n",
-    "netron.start('simple_model1.onnx', port=8081, host=\"0.0.0.0\")"
-   ]
-  },
-  {
-   "cell_type": "code",
-   "execution_count": 27,
-   "metadata": {},
-   "outputs": [
+    },
     {
      "data": {
       "text/html": [
-       "<iframe src=\"http://0.0.0.0:8081/\" style=\"position: relative; width: 100%;\" height=\"400\"></iframe>\n"
+       "\n",
+       "        <iframe\n",
+       "            width=\"100%\"\n",
+       "            height=\"400\"\n",
+       "            src=\"http://0.0.0.0:8081/\"\n",
+       "            frameborder=\"0\"\n",
+       "            allowfullscreen\n",
+       "        ></iframe>\n",
+       "        "
       ],
       "text/plain": [
-       "<IPython.core.display.HTML object>"
+       "<IPython.lib.display.IFrame at 0x7fb93018f9e8>"
       ]
      },
+     "execution_count": 25,
      "metadata": {},
-     "output_type": "display_data"
+     "output_type": "execute_result"
     }
    ],
    "source": [
-    "%%html\n",
-    "<iframe src=\"http://0.0.0.0:8081/\" style=\"position: relative; width: 100%;\" height=\"400\"></iframe>"
+    "showInNetron('simple_model1.onnx')"
    ]
   },
   {
@@ -710,7 +713,7 @@
   },
   {
    "cell_type": "code",
-   "execution_count": 28,
+   "execution_count": 26,
    "metadata": {},
    "outputs": [],
    "source": [
@@ -720,7 +723,7 @@
   },
   {
    "cell_type": "code",
-   "execution_count": 29,
+   "execution_count": 27,
    "metadata": {},
    "outputs": [
     {
@@ -728,16 +731,16 @@
      "output_type": "stream",
      "text": [
       "The output of the manipulated ONNX model is: \n",
-      "[[12.  9. 14.  8.]\n",
-      " [ 9.  9.  4.  6.]\n",
-      " [ 3. 19.  9.  5.]\n",
-      " [ 8. 22.  7.  2.]]\n",
+      "[[ 1. 16.  3. 10.]\n",
+      " [ 5. 17. 17. 13.]\n",
+      " [ 3. 11. 10. 17.]\n",
+      " [ 9.  2.  4.  8.]]\n",
       "\n",
       "The output of the reference function is: \n",
-      "[[12.  9. 14.  8.]\n",
-      " [ 9.  9.  4.  6.]\n",
-      " [ 3. 19.  9.  5.]\n",
-      " [ 8. 22.  7.  2.]]\n",
+      "[[ 1. 16.  3. 10.]\n",
+      " [ 5. 17. 17. 13.]\n",
+      " [ 3. 11. 10. 17.]\n",
+      " [ 9.  2.  4.  8.]]\n",
       "\n",
       "The results are the same!\n"
      ]
@@ -752,13 +755,6 @@
     "else:\n",
     "    raise Exception(\"Something went wrong, the output of the model doesn't match the expected output!\")"
    ]
-  },
-  {
-   "cell_type": "code",
-   "execution_count": null,
-   "metadata": {},
-   "outputs": [],
-   "source": []
   }
  ],
  "metadata": {
diff --git a/notebooks/basics/3_brevitas_network_import.ipynb b/notebooks/basics/1_brevitas_network_import.ipynb
similarity index 62%
rename from notebooks/basics/3_brevitas_network_import.ipynb
rename to notebooks/basics/1_brevitas_network_import.ipynb
index 30026e7aaa541641d4068ca0a7a0a3cf7c14088f..ca148e9ebd23efc37da89fc47e2b289ac6066565 100644
--- a/notebooks/basics/3_brevitas_network_import.ipynb
+++ b/notebooks/basics/1_brevitas_network_import.ipynb
@@ -22,10 +22,7 @@
    "outputs": [],
    "source": [
     "import onnx\n",
-    "import inspect\n",
-    "\n",
-    "def showSrc(what):\n",
-    "    print(\"\".join(inspect.getsourcelines(what)[0]))"
+    "from finn.util.visualization import showSrc, showInNetron"
    ]
   },
   {
@@ -49,8 +46,9 @@
       "class LFC(Module):\n",
       "\n",
       "    def __init__(self, num_classes=10, weight_bit_width=None, act_bit_width=None,\n",
-      "                 in_bit_width=None, in_ch=1, in_features=(28, 28)):\n",
+      "                 in_bit_width=None, in_ch=1, in_features=(28, 28), device=\"cpu\"):\n",
       "        super(LFC, self).__init__()\n",
+      "        self.device = device\n",
       "\n",
       "        weight_quant_type = get_quant_type(weight_bit_width)\n",
       "        act_quant_type = get_quant_type(act_bit_width)\n",
@@ -72,20 +70,29 @@
       "            self.features.append(BatchNorm1d(num_features=in_features))\n",
       "            self.features.append(get_act_quant(act_bit_width, act_quant_type))\n",
       "            self.features.append(Dropout(p=HIDDEN_DROPOUT))\n",
-      "        self.fc = get_quant_linear(in_features=in_features,\n",
+      "        self.features.append(get_quant_linear(in_features=in_features,\n",
       "                                   out_features=num_classes,\n",
       "                                   per_out_ch_scaling=LAST_FC_PER_OUT_CH_SCALING,\n",
       "                                   bit_width=weight_bit_width,\n",
       "                                   quant_type=weight_quant_type,\n",
-      "                                   stats_op=stats_op)\n",
+      "                                   stats_op=stats_op))\n",
+      "        self.features.append(BatchNorm1d(num_features=num_classes))\n",
+      "\n",
+      "        for m in self.modules():\n",
+      "          if isinstance(m, QuantLinear):\n",
+      "            torch.nn.init.uniform_(m.weight.data, -1, 1)\n",
       "\n",
+      "    def clip_weights(self, min_val, max_val):\n",
+      "        for mod in self.features:\n",
+      "            if isinstance(mod, QuantLinear):\n",
+      "                mod.weight.data.clamp_(min_val, max_val)\n",
+      "    \n",
       "    def forward(self, x):\n",
       "        x = x.view(x.shape[0], -1)\n",
-      "        x = 2.0 * x - torch.tensor([1.0])\n",
+      "        x = 2.0 * x - torch.tensor([1.0]).to(self.device)\n",
       "        for mod in self.features:\n",
       "            x = mod(x)\n",
-      "        out = self.fc(x)\n",
-      "        return out\n",
+      "        return x\n",
       "\n"
      ]
     }
@@ -135,22 +142,12 @@
        "      (weight_reg): WeightReg()\n",
        "      (weight_quant): WeightQuantProxy(\n",
        "        (tensor_quant): BinaryQuant(\n",
-       "          (scaling_impl): ParameterStatsScaling(\n",
-       "            (parameter_list_stats): ParameterListStats(\n",
-       "              (first_tracked_param): _ViewParameterWrapper()\n",
-       "              (stats): Stats(\n",
-       "                (stats_impl): AbsAve()\n",
-       "              )\n",
-       "            )\n",
-       "            (stats_scaling_impl): StatsScaling(\n",
-       "              (affine_rescaling): Identity()\n",
-       "              (restrict_scaling): RestrictValue(\n",
-       "                (forward_impl): Sequential(\n",
-       "                  (0): PowerOfTwo()\n",
-       "                  (1): ClampMin()\n",
-       "                )\n",
+       "          (scaling_impl): StandaloneScaling(\n",
+       "            (restrict_value): RestrictValue(\n",
+       "              (forward_impl): Sequential(\n",
+       "                (0): PowerOfTwo()\n",
+       "                (1): Identity()\n",
        "              )\n",
-       "              (restrict_scaling_preprocess): LogTwo()\n",
        "            )\n",
        "          )\n",
        "        )\n",
@@ -181,22 +178,12 @@
        "      (weight_reg): WeightReg()\n",
        "      (weight_quant): WeightQuantProxy(\n",
        "        (tensor_quant): BinaryQuant(\n",
-       "          (scaling_impl): ParameterStatsScaling(\n",
-       "            (parameter_list_stats): ParameterListStats(\n",
-       "              (first_tracked_param): _ViewParameterWrapper()\n",
-       "              (stats): Stats(\n",
-       "                (stats_impl): AbsAve()\n",
-       "              )\n",
-       "            )\n",
-       "            (stats_scaling_impl): StatsScaling(\n",
-       "              (affine_rescaling): Identity()\n",
-       "              (restrict_scaling): RestrictValue(\n",
-       "                (forward_impl): Sequential(\n",
-       "                  (0): PowerOfTwo()\n",
-       "                  (1): ClampMin()\n",
-       "                )\n",
+       "          (scaling_impl): StandaloneScaling(\n",
+       "            (restrict_value): RestrictValue(\n",
+       "              (forward_impl): Sequential(\n",
+       "                (0): PowerOfTwo()\n",
+       "                (1): Identity()\n",
        "              )\n",
-       "              (restrict_scaling_preprocess): LogTwo()\n",
        "            )\n",
        "          )\n",
        "        )\n",
@@ -227,22 +214,12 @@
        "      (weight_reg): WeightReg()\n",
        "      (weight_quant): WeightQuantProxy(\n",
        "        (tensor_quant): BinaryQuant(\n",
-       "          (scaling_impl): ParameterStatsScaling(\n",
-       "            (parameter_list_stats): ParameterListStats(\n",
-       "              (first_tracked_param): _ViewParameterWrapper()\n",
-       "              (stats): Stats(\n",
-       "                (stats_impl): AbsAve()\n",
-       "              )\n",
-       "            )\n",
-       "            (stats_scaling_impl): StatsScaling(\n",
-       "              (affine_rescaling): Identity()\n",
-       "              (restrict_scaling): RestrictValue(\n",
-       "                (forward_impl): Sequential(\n",
-       "                  (0): PowerOfTwo()\n",
-       "                  (1): ClampMin()\n",
-       "                )\n",
+       "          (scaling_impl): StandaloneScaling(\n",
+       "            (restrict_value): RestrictValue(\n",
+       "              (forward_impl): Sequential(\n",
+       "                (0): PowerOfTwo()\n",
+       "                (1): Identity()\n",
        "              )\n",
-       "              (restrict_scaling_preprocess): LogTwo()\n",
        "            )\n",
        "          )\n",
        "        )\n",
@@ -268,33 +245,24 @@
        "      )\n",
        "    )\n",
        "    (13): Dropout(p=0.2)\n",
-       "  )\n",
-       "  (fc): QuantLinear(\n",
-       "    in_features=1024, out_features=10, bias=False\n",
-       "    (weight_reg): WeightReg()\n",
-       "    (weight_quant): WeightQuantProxy(\n",
-       "      (tensor_quant): BinaryQuant(\n",
-       "        (scaling_impl): ParameterStatsScaling(\n",
-       "          (parameter_list_stats): ParameterListStats(\n",
-       "            (first_tracked_param): _ViewParameterWrapper()\n",
-       "            (stats): Stats(\n",
-       "              (stats_impl): AbsAve()\n",
-       "            )\n",
-       "          )\n",
-       "          (stats_scaling_impl): StatsScaling(\n",
-       "            (affine_rescaling): Identity()\n",
-       "            (restrict_scaling): RestrictValue(\n",
+       "    (14): QuantLinear(\n",
+       "      in_features=1024, out_features=10, bias=False\n",
+       "      (weight_reg): WeightReg()\n",
+       "      (weight_quant): WeightQuantProxy(\n",
+       "        (tensor_quant): BinaryQuant(\n",
+       "          (scaling_impl): StandaloneScaling(\n",
+       "            (restrict_value): RestrictValue(\n",
        "              (forward_impl): Sequential(\n",
        "                (0): PowerOfTwo()\n",
-       "                (1): ClampMin()\n",
+       "                (1): Identity()\n",
        "              )\n",
        "            )\n",
-       "            (restrict_scaling_preprocess): LogTwo()\n",
        "          )\n",
        "        )\n",
        "      )\n",
+       "      (bias_quant): BiasQuantProxy()\n",
        "    )\n",
-       "    (bias_quant): BiasQuantProxy()\n",
+       "    (15): BatchNorm1d(10, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)\n",
        "  )\n",
        ")"
       ]
@@ -323,12 +291,12 @@
   },
   {
    "cell_type": "code",
-   "execution_count": 10,
+   "execution_count": 5,
    "metadata": {},
    "outputs": [
     {
      "data": {
-      "image/png": "iVBORw0KGgoAAAANSUhEUgAAAPsAAAD4CAYAAAAq5pAIAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4xLjMsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy+AADFEAAARX0lEQVR4nO3dfYyVZXrH8d/FoDAw8iYRCaisG/5QqmUbgk1KyOKmxlUMbKJm/aPauAmarMmqTVqz/UOSaqJVa/pH3YStL9CsmiWoq0a7a82mWo1GNFQQW1CULGR4E5H3t+HqH/NgZ3We6549z3nOc9z7+0kmM3Ouec65OTM/zsv13Pdt7i4Af/xGNT0AAJ1B2IFMEHYgE4QdyARhBzIxupM3Zma89Z+ZUaPKH09OnTpV23VXvf6enp6wPjAw0PJ1183dbbjLK4XdzK6U9M+SeiT9q7vfV+X6cmU27O/mS6k/6ip/eKNHx38CqcCk6r29vaW1Q4cOhcem9PX1hfUDBw6U1lIt50mTJoX1zz77LKx3o5afxptZj6R/kfR9SRdLusHMLm7XwAC0V5XX7PMlfeTuW9z9uKSnJS1pz7AAtFuVsM+Q9Lsh328rLvs9ZrbMzNaa2doKtwWgotrfoHP3FZJWSLxBBzSpyiP7dknnDfl+ZnEZgC5UJezvSJptZt8yszMl/VDS8+0ZFoB2a/lpvLufNLPbJP1ag623x9z9g7aNLCPjx48P6wcPHmz5useMGRPWjx07FtZTbcFx48aF9ai9lmoppqSOj9prqT76vn37WhpTN6v0mt3dX5L0UpvGAqBGnC4LZIKwA5kg7EAmCDuQCcIOZIKwA5mwTq4um+vpsqled6qXffTo0bA+duzYlo9Nia676vWfffbZYb3qNNLofp06dWp47O7du8N6amrwyZMnw3qdyuaz88gOZIKwA5kg7EAmCDuQCcIOZIKwA5mg9fYNkGrNVfkd1nnddUtNDa6yem1q6m5qanCTS03TegMyR9iBTBB2IBOEHcgEYQcyQdiBTBB2IBP02TvgrLPOCuvRbqOSNHHixLB+4sSJ0lpqN9LUFNbPP/88rC9YsCCs33rrraW1VC/6jjvuCOtbt24N601OM20SfXYgc4QdyARhBzJB2IFMEHYgE4QdyARhBzJBn/0b4JFHHgnrUS871Wuuuox1b29vWI+ktk2+5JJLwvqmTZvC+vHjx0trZ5xxRnhsdO6ClP53HzlyJKzXqazPXmnLZjP7VNIBSQOSTrr7vCrXB6A+lcJeWOTue9pwPQBqxGt2IBNVw+6SfmNm75rZsuF+wMyWmdlaM1tb8bYAVFD1afwCd99uZudIesXM/sfdXxv6A+6+QtIKiTfogCZVemR39+3F512SnpU0vx2DAtB+LYfdzMab2Vmnv5Z0haQN7RoYgPaq8jR+mqRniz7taElPuvu/t2VUf2RSWzYvWrQorF922WVhPeqVHzx4MDw21W/u6+sL66nzNKI566m11x999NGWr1uS7rzzztLaW2+9FR5b93bSTWg57O6+RdKftnEsAGpE6w3IBGEHMkHYgUwQdiAThB3IBFNcu0Bqqubs2bPD+v79+0trEyZMCI+NpoFK6SmwVbZ8TrX9UlJLcO/du7e0tnTp0vDYdevWhfVUSzLV8qwTS0kDmSPsQCYIO5AJwg5kgrADmSDsQCYIO5CJdiw42TFRT7fOfnBK6thU/ZZbbgnrq1atCuszZ85s+bZTffZ77rknrK9evTqsn3nmmaW1K664Ijz2wQcfDOuprbCj2168eHF47LZt28L6nj3fvDVWeWQHMkHYgUwQdiAThB3IBGEHMkHYgUwQdiATHZ/Pnup3Rzo51naqOvd54cKFYf2iiy4qrY0bNy48dvTo+FSLNWvWhPUtW7aE9SpSyz3PmTMnrKfu90jq75T57AC6FmEHMkHYgUwQdiAThB3IBGEHMkHYgUx0vM8+alT5/y9V54XXqcpc+lOnTlW67eg+S9VPnjwZHjt+/PiwfujQobCe2o46+p2l5tJfffXVYf3pp58O61X67Kk17VP3a5Na7rOb2WNmtsvMNgy5bIqZvWJmm4vPk9s5WADtN5Kn8U9IuvIrl90l6VV3ny3p1eJ7AF0sGXZ3f03SV/fRWSJpZfH1SknxXjoAGtfqGnTT3L2/+HqHpGllP2hmyyQta/F2ALRJ5QUn3d2jDRvdfYWkFRIbOwJNarX1ttPMpktS8XlX+4YEoA6thv15STcVX98k6VftGQ6AuiT77Gb2lKTvSpoqaaekuyU9J+mXks6XtFXS9e5evhn2/19XbU/jq64bX7UeSfVkU3uoR/uvV9Xb2xvWjxw5EtZT5wBUOcfgwgsvDOsff/xxy9edGldqTfqUw4cPVzq+irI+e/I1u7vfUFL6XqURAegoTpcFMkHYgUwQdiAThB3IBGEHMsGWzYVUC3JgYCCsR3p6esJ61WWHozZRqsWUmsKakrr+aNvkqCZJixYtamlMp0W/0xMnToTHpqa4Vvl7aAqP7EAmCDuQCcIOZIKwA5kg7EAmCDuQCcIOZKKr+ux1budcdTnnKuq+7QMHDpTWUv3iVK87dXyqTx8tF51axvq6664L60ePHg3rY8eOLa2l+uyp31mTWzK3ikd2IBOEHcgEYQcyQdiBTBB2IBOEHcgEYQcy0fE+ezS3u5t75dGSyanllFPq3Fb50ksvDY+dM2dOWE8tJf3cc8+F9UjUB5ekhQsXhvUqW3inlqGOzl2Qqi/B3QQe2YFMEHYgE4QdyARhBzJB2IFMEHYgE4QdyETH++zRnPU6++ipufKped1RT3j06PhuXLp0aVhPHb9kyZKwPmbMmNLa3Llzw2MnTZoU1lO97Ndff73l42fPnh0em1qbPdXrXr9+fWnt8ssvD4+N7lOpO/voKclHdjN7zMx2mdmGIZctN7PtZrau+Liq3mECqGokT+OfkHTlMJc/7O5zi4+X2jssAO2WDLu7vyZpbwfGAqBGVd6gu83M3i+e5k8u+yEzW2Zma81sbYXbAlBRq2H/maRvS5orqV/SQ2U/6O4r3H2eu89r8bYAtEFLYXf3ne4+4O6nJP1c0vz2DgtAu7UUdjObPuTbH0jaUPazALqDpfqoZvaUpO9Kmippp6S7i+/nSnJJn0q6xd37kzdmFt5Yqt+cmvcdmTVrVli/5pprwvrixYtLa6l516l526m509H+61K8hnlfX194bErVed3R7/SLL74Ij504cWJYT9m8eXNpbdWqVeGxDz1U+spUUnf32d192JNKkifVuPsNw1z8aOURAegoTpcFMkHYgUwQdiAThB3IBGEHMpFsvbX1xsw8Wna5zimud999d1hfvnx5WN+zZ09pberUqa0M6UuprYf37o2nJkT1Cy64IDw21RZMbdmccuzYsdJaahpp6u8h1YqNpi2ntlx++eWXw/rNN98c1pvc0rms9cYjO5AJwg5kgrADmSDsQCYIO5AJwg5kgrADmeh4nz2qV9maODXVMtX3rLLt8q5du8
L61q1bw/oDDzwQ1levXh3W580rXwTo4YcfDo9Nbdk8eXLpimOSpG3btoX16Hf6xBNPhMd+8sknYf3aa68N69HU46rTa1988cWwnpoyXSf67EDmCDuQCcIOZIKwA5kg7EAmCDuQCcIOZKKjffZRo0Z5ND/6+PHj4fHnnHNOaW337t3hsak+e2rudNQvTm0HvWnTprA+ZcqUsJ5atjha7vn8888Pj03NZ08t771v376wfuONN5bWXnjhhfDYlNQ6AtFy0YsWLQqPTa0xkLpfUst/14k+O5A5wg5kgrADmSDsQCYIO5AJwg5kgrADmeiq+exVpPqeK1euDOvXX399y9d/+PDh8Nhx48aF9dS2yKl5/gMDA6W11Lrvb775Zlh/8sknw/q6devC+htvvFFaS51fkOrhp37n0Xkb8+fPD499++23w/rjjz8e1lPrytep5T67mZ1nZr81s41m9oGZ/aS4fIqZvWJmm4vP8SoHABo1kqfxJyX9jbtfLOnPJf3YzC6WdJekV919tqRXi+8BdKlk2N29393fK74+IOlDSTMkLZF0+rnxSklL6xokgOriFz1fYWazJH1H0tuSprl7f1HaIWlayTHLJC1rfYgA2mHE78abWZ+kNZJud/f9Q2s++C7fsG++ufsKd5/n7uWrIgKo3YjCbmZnaDDov3D3Z4qLd5rZ9KI+XVK8xCqARiVbbzY4f3OlpL3ufvuQyx+Q9Jm732dmd0ma4u5/m7iu8MbOPffccCw7duwI65Fo+15JmjlzZli/9957S2szZswIj01tuZzaujjaLlqS7r///tLaxo0bw2NTU1xT2yKnpKYtR1JtwxMnToT1aOpx6u9+woQJYb3qlOk6lbXeRvKa/S8k/ZWk9WZ2uqn6U0n3Sfqlmf1I0lZJcaMaQKOSYXf3/5JU9l/k99o7HAB14XRZIBOEHcgEYQcyQdiBTBB2IBMdneLa09PjUV83NVU06n3u37+/tCZJfX19YT3VN416vlX6vVK655s6RyDqZad6+MeOHQvrVUW/79Ryzampwam/lyq/s5SqY6sTS0kDmSPsQCYIO5AJwg5kgrADmSDsQCYIO5CJrlpKOjWHOOqlp5YVrjove/r06aW1/v7+0tpI9Pb2hvXUls11XndqGetDhw6F9SpzylNGjYofq6rMKW/6/IQq6LMDmSPsQCYIO5AJwg5kgrADmSDsQCYIO5CJruqzA6iOPjuQOcIOZIKwA5kg7EAmCDuQCcIOZIKwA5lIht3MzjOz35rZRjP7wMx+Uly+3My2m9m64uOq+ocLoFXJk2rMbLqk6e7+npmdJeldSUs1uB/7QXd/cMQ3xkk1QO3KTqoZyf7s/ZL6i68PmNmHkma0d3gA6vYHvWY3s1mSviPp7eKi28zsfTN7zMwmlxyzzMzWmtnaSiMFUMmIz403sz5J/ynpXnd/xsymSdojySX9gwaf6t+cuA6exgM1K3saP6Kwm9kZkl6U9Gt3/6dh6rMkvejuf5K4HsIO1KzliTA2uDzoo5I+HBr04o27034gaUPVQQKoz0jejV8g6XVJ6yWdXpv3p5JukDRXg0/jP5V0S/FmXnRdPLIDNav0NL5dCDtQP+azA5kj7EAmCDuQCcIOZIKwA5kg7EAmCDuQCcIOZIKwA5kg7EAmCDuQCcIOZIKwA5kg7EAmkgtOttkeSVuHfD+1uKwbdevYunVcEmNrVTvHdkFZoaPz2b9242Zr3X1eYwMIdOvYunVcEmNrVafGxtN4IBOEHchE02Ff0fDtR7p1bN06LomxtaojY2v0NTuAzmn6kR1AhxB2IBONhN3MrjSz/zWzj8zsribGUMbMPjWz9cU21I3uT1fsobfLzDYMuWyKmb1iZpuLz8PusdfQ2LpiG+9gm/FG77umtz/v+Gt2M+uRtEnSX0raJukdSTe4+8aODqSEmX0qaZ67N34ChpktlHRQ0qrTW2uZ2T9K2uvu9xX/UU5297/rkrEt1x+4jXdNYyvbZvyv1eB9187tz1vRxCP7fEkfufsWdz8u6WlJSxoYR9dz99ck7f3KxUskrSy+XqnBP5aOKxlbV3D3fnd/r/j6gKTT24w3et8F4+qIJsI+Q9Lvhny/Td2137tL+o2ZvWtmy5oezDCmDdlma4ekaU0OZhjJbbw76SvbjHfNfdfK9udV8Qbd1y1w9z+T9H1JPy6ernYlH3wN1k29059J+rYG9wDsl/RQk4MpthlfI+l2d98/tNbkfTfMuDpyvzUR9u2Szhvy/czisq7g7tuLz7skPavBlx3dZOfpHXSLz7saHs+X3H2nuw+4+ylJP1eD912xzfgaSb9w92eKixu/74YbV6futybC/o6k2Wb2LTM7U9IPJT3fwDi+xszGF2+cyMzGS7pC3bcV9fOSbiq+vknSrxocy+/plm28y7YZV8P3XePbn7t7xz8kXaXBd+Q/lvT3TYyhZFwXSvrv4uODpscm6SkNPq07ocH3Nn4k6WxJr0raLOk/JE3porH9mwa39n5fg8Ga3tDYFmjwKfr7ktYVH1c1fd8F4+rI/cbpskAmeIMOyARhBzJB2IFMEHYgE4QdyARhBzJB2IFM/B+tIjCppYWKvAAAAABJRU5ErkJggg==\n",
+      "image/png": "iVBORw0KGgoAAAANSUhEUgAAAPsAAAD4CAYAAAAq5pAIAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4yLjEsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy+j8jraAAARX0lEQVR4nO3dfYyVZXrH8d/FoDAw8iYRCaisG/5QqmUbgk1KyOKmxlUMbKJm/aPauAmarMmqTVqz/UOSaqJVa/pH3YStL9CsmiWoq0a7a82mWo1GNFQQW1CULGR4E5H3t+HqH/NgZ3We6549z3nOc9z7+0kmM3Ouec65OTM/zsv13Pdt7i4Af/xGNT0AAJ1B2IFMEHYgE4QdyARhBzIxupM3Zma89Z+ZUaPKH09OnTpV23VXvf6enp6wPjAw0PJ1183dbbjLK4XdzK6U9M+SeiT9q7vfV+X6cmU27O/mS6k/6ip/eKNHx38CqcCk6r29vaW1Q4cOhcem9PX1hfUDBw6U1lIt50mTJoX1zz77LKx3o5afxptZj6R/kfR9SRdLusHMLm7XwAC0V5XX7PMlfeTuW9z9uKSnJS1pz7AAtFuVsM+Q9Lsh328rLvs9ZrbMzNaa2doKtwWgotrfoHP3FZJWSLxBBzSpyiP7dknnDfl+ZnEZgC5UJezvSJptZt8yszMl/VDS8+0ZFoB2a/lpvLufNLPbJP1ag623x9z9g7aNLCPjx48P6wcPHmz5useMGRPWjx07FtZTbcFx48aF9ai9lmoppqSOj9prqT76vn37WhpTN6v0mt3dX5L0UpvGAqBGnC4LZIKwA5kg7EAmCDuQCcIOZIKwA5mwTq4um+vpsqled6qXffTo0bA+duzYlo9Nia676vWfffbZYb3qNNLofp06dWp47O7du8N6amrwyZMnw3qdyuaz88gOZIKwA5kg7EAmCDuQCcIOZIKwA5mg9fYNkGrNVfkd1nnddUtNDa6yem1q6m5qanCTS03TegMyR9iBTBB2IBOEHcgEYQcyQdiBTBB2IBP02TvgrLPOCuvRbqOSNHHixLB+4sSJ0lpqN9LUFNbPP/88rC9YsCCs33rrraW1VC/6jjvuCOtbt24N601OM20SfXYgc4QdyARhBzJB2IFMEHYgE4QdyARhBzJBn/0b4JFHHgnrUS871Wuuuox1b29vWI+ktk2+5JJLwvqmTZvC+vHjx0trZ5xxRnhsdO6ClP53HzlyJKzXqazPXmnLZjP7VNIBSQOSTrr7vCrXB6A+lcJeWOTue9pwPQBqxGt2IBNVw+6SfmNm75rZsuF+wMyWmdlaM1tb8bYAVFD1afwCd99uZudIesXM/sfdXxv6A+6+QtIKiTfogCZVemR39+3F512SnpU0vx2DAtB+LYfdzMab2Vmnv5Z0haQN7RoYgPaq8jR+mqRniz7taElPuvu/t2VUf2RSWzYvWrQorF922WVhPeqVHzx4MDw21W/u6+sL66nzNKI566m11x999NGWr1uS7rzzztLaW2+9FR5b93bSTWg57O6+RdKftnEsAGpE6w3IBGEHMkHYgUwQdiAThB3IBFNcu0Bqqubs2bPD+v79+0trEyZMCI+NpoFK6SmwVbZ8TrX9UlJLcO/du7e0tnTp0vDYdevWhfVUSzLV8qwTS0kDmSPsQCYIO5AJwg5kgrADmSDsQCYIO5CJdiw42TFRT7fOfnBK6thU/ZZbbgnrq1atCuszZ85s+bZTffZ77rknrK9evTqsn3nmmaW1K664Ijz2wQcfDOuprbCj2168eHF47LZt28L6nj3fvDVWeWQHMkHYgUwQdiAThB3IBGEHMkHYgUwQdiATHZ/Pnup3Rzo51naqOvd54cKFYf2iiy4qrY0bNy48dvTo+FSLNWvWhPUtW7aE9SpSyz3PmTMnrKfu90jq75T57AC6FmEHMkHYgUwQdiAThB3IBGEHMkHYgUx0vM8+alT5/y9V54XXqcpc+lOnTlW67eg+S9VPnjwZHjt+/PiwfujQobCe2o46+p2l5tJfffXVYf3pp58O61X67Kk17VP3a5Na7rOb2WNmtsvMNgy5bIqZvWJmm4vPk9s5WADtN5Kn8U9IuvIrl90l6VV3ny3p1eJ7AF0sGXZ3f03SV/fRWSJpZfH1SknxXjoAGtfqGnTT3L2/+HqHpGllP2hmyyQta/F2ALRJ5QUn3d2jDRvdfYWkFRIbOwJNarX1ttPMpktS8XlX+4YEoA6thv15STcVX98k6VftGQ6AuiT77Gb2lKTvSpoqaaekuyU9J+mXks6XtFXS9e5evhn2/19XbU/jq64bX7UeSfVkU3uoR/uvV9Xb2xvWjxw5EtZT5wBUOcfgwgsvDOsff/xxy9edGldqTfqUw4cPVzq+irI+e/I1u7vfUFL6XqURAegoTpcFMkHYgUwQdiAThB3IBGEHMsGWzYVUC3JgYCCsR3p6esJ61WWHozZRqsWUmsKakrr+aNvkqCZJixYtamlMp0W/0xMnToTHpqa4Vvl7aAqP7EAmCDuQCcIOZIKwA5kg7EAmCDuQCcIOZKKr+ux1budcdTnnKuq+7QMHDpTWUv3iVK87dXyqTx8tF51axvq6664L60ePHg3rY8eOLa2l+uyp31mTWzK3ikd2IBOEHcgEYQcyQdiBTBB2IBOEHcgEYQcy0fE+ezS3u5t75dGSyanllFPq3Fb50ksvDY+dM2dOWE8tJf3cc8+F9UjUB5ekhQsXhvUqW3inlqGOzl2Qqi/B3QQe2YFMEHYgE4QdyARhBzJB2IFMEHYgE4QdyETH++zRnPU6++ipufKped1RT3j06PhuXLp0aVhPHb9kyZKwPmbMmNLa3Llzw2MnTZoU1lO97Ndff73l42fPnh0em1qbPdXrXr9+fWnt8ssvD4+N7lOpO/voKclHdjN7zMx2mdmGIZctN7PtZrau+Liq3mECqGokT+OfkHTlMJc/7O5zi4+X2jssAO2WDLu7vyZpbwfGAqBGVd6gu83M3i+e5k8u+yEzW2Zma81sbYXbAlBRq2H/maRvS5orqV/SQ2U/6O4r3H2eu89r8bYAtEFLYXf3ne4+4O6nJP1c0vz2DgtAu7UUdjObPuTbH0jaUPazALqDpfqoZvaUpO9Kmippp6S7i+/nSnJJn0q6xd37kzdmFt5Yqt+cmvcdmTVrVli/5pprwvrixYtLa6l516l526m509H+61K8hnlfX194bErVed3R7/SLL74Ij504cWJYT9m8eXNpbdWqVeGxDz1U+spUUnf32d192JNKkifVuPsNw1z8aOURAegoTpcFMkHYgUwQdiAThB3IBGEHMpFsvbX1xsw8Wna5zimud999d1hfvnx5WN+zZ09pberUqa0M6UuprYf37o2nJkT1Cy64IDw21RZMbdmccuzYsdJaahpp6u8h1YqNpi2ntlx++eWXw/rNN98c1pvc0rms9cYjO5AJwg5kgrADmSDsQCYIO5AJwg5kgrADmeh4nz2qV9maODXVMtX3rLLt8q5du8
L61q1bw/oDDzwQ1levXh3W580rXwTo4YcfDo9Nbdk8eXLpimOSpG3btoX16Hf6xBNPhMd+8sknYf3aa68N69HU46rTa1988cWwnpoyXSf67EDmCDuQCcIOZIKwA5kg7EAmCDuQCcIOZKKjffZRo0Z5ND/6+PHj4fHnnHNOaW337t3hsak+e2rudNQvTm0HvWnTprA+ZcqUsJ5atjha7vn8888Pj03NZ08t771v376wfuONN5bWXnjhhfDYlNQ6AtFy0YsWLQqPTa0xkLpfUst/14k+O5A5wg5kgrADmSDsQCYIO5AJwg5kgrADmeiq+exVpPqeK1euDOvXX399y9d/+PDh8Nhx48aF9dS2yKl5/gMDA6W11Lrvb775Zlh/8sknw/q6devC+htvvFFaS51fkOrhp37n0Xkb8+fPD499++23w/rjjz8e1lPrytep5T67mZ1nZr81s41m9oGZ/aS4fIqZvWJmm4vP8SoHABo1kqfxJyX9jbtfLOnPJf3YzC6WdJekV919tqRXi+8BdKlk2N29393fK74+IOlDSTMkLZF0+rnxSklL6xokgOriFz1fYWazJH1H0tuSprl7f1HaIWlayTHLJC1rfYgA2mHE78abWZ+kNZJud/f9Q2s++C7fsG++ufsKd5/n7uWrIgKo3YjCbmZnaDDov3D3Z4qLd5rZ9KI+XVK8xCqARiVbbzY4f3OlpL3ufvuQyx+Q9Jm732dmd0ma4u5/m7iu8MbOPffccCw7duwI65Fo+15JmjlzZli/9957S2szZswIj01tuZzaujjaLlqS7r///tLaxo0bw2NTU1xT2yKnpKYtR1JtwxMnToT1aOpx6u9+woQJYb3qlOk6lbXeRvKa/S8k/ZWk9WZ2uqn6U0n3Sfqlmf1I0lZJcaMaQKOSYXf3/5JU9l/k99o7HAB14XRZIBOEHcgEYQcyQdiBTBB2IBMdneLa09PjUV83NVU06n3u37+/tCZJfX19YT3VN416vlX6vVK655s6RyDqZad6+MeOHQvrVUW/79Ryzampwam/lyq/s5SqY6sTS0kDmSPsQCYIO5AJwg5kgrADmSDsQCYIO5CJrlpKOjWHOOqlp5YVrjove/r06aW1/v7+0tpI9Pb2hvXUls11XndqGetDhw6F9SpzylNGjYofq6rMKW/6/IQq6LMDmSPsQCYIO5AJwg5kgrADmSDsQCYIO5CJruqzA6iOPjuQOcIOZIKwA5kg7EAmCDuQCcIOZIKwA5lIht3MzjOz35rZRjP7wMx+Uly+3My2m9m64uOq+ocLoFXJk2rMbLqk6e7+npmdJeldSUs1uB/7QXd/cMQ3xkk1QO3KTqoZyf7s/ZL6i68PmNmHkma0d3gA6vYHvWY3s1mSviPp7eKi28zsfTN7zMwmlxyzzMzWmtnaSiMFUMmIz403sz5J/ynpXnd/xsymSdojySX9gwaf6t+cuA6exgM1K3saP6Kwm9kZkl6U9Gt3/6dh6rMkvejuf5K4HsIO1KzliTA2uDzoo5I+HBr04o27034gaUPVQQKoz0jejV8g6XVJ6yWdXpv3p5JukDRXg0/jP5V0S/FmXnRdPLIDNav0NL5dCDtQP+azA5kj7EAmCDuQCcIOZIKwA5kg7EAmCDuQCcIOZIKwA5kg7EAmCDuQCcIOZIKwA5kg7EAmkgtOttkeSVuHfD+1uKwbdevYunVcEmNrVTvHdkFZoaPz2b9242Zr3X1eYwMIdOvYunVcEmNrVafGxtN4IBOEHchE02Ff0fDtR7p1bN06LomxtaojY2v0NTuAzmn6kR1AhxB2IBONhN3MrjSz/zWzj8zsribGUMbMPjWz9cU21I3uT1fsobfLzDYMuWyKmb1iZpuLz8PusdfQ2LpiG+9gm/FG77umtz/v+Gt2M+uRtEnSX0raJukdSTe4+8aODqSEmX0qaZ67N34ChpktlHRQ0qrTW2uZ2T9K2uvu9xX/UU5297/rkrEt1x+4jXdNYyvbZvyv1eB9187tz1vRxCP7fEkfufsWdz8u6WlJSxoYR9dz99ck7f3KxUskrSy+XqnBP5aOKxlbV3D3fnd/r/j6gKTT24w3et8F4+qIJsI+Q9Lvhny/Td2137tL+o2ZvWtmy5oezDCmDdlma4ekaU0OZhjJbbw76SvbjHfNfdfK9udV8Qbd1y1w9z+T9H1JPy6ernYlH3wN1k29059J+rYG9wDsl/RQk4MpthlfI+l2d98/tNbkfTfMuDpyvzUR9u2Szhvy/czisq7g7tuLz7skPavBlx3dZOfpHXSLz7saHs+X3H2nuw+4+ylJP1eD912xzfgaSb9w92eKixu/74YbV6futybC/o6k2Wb2LTM7U9IPJT3fwDi+xszGF2+cyMzGS7pC3bcV9fOSbiq+vknSrxocy+/plm28y7YZV8P3XePbn7t7xz8kXaXBd+Q/lvT3TYyhZFwXSvrv4uODpscm6SkNPq07ocH3Nn4k6WxJr0raLOk/JE3porH9mwa39n5fg8Ga3tDYFmjwKfr7ktYVH1c1fd8F4+rI/cbpskAmeIMOyARhBzJB2IFMEHYgE4QdyARhBzJB2IFM/B+tIjCppYWKvAAAAABJRU5ErkJggg==\n",
       "text/plain": [
        "<Figure size 432x288 with 1 Axes>"
       ]
@@ -353,17 +321,17 @@
   },
   {
    "cell_type": "code",
-   "execution_count": 11,
+   "execution_count": 6,
    "metadata": {},
    "outputs": [
     {
      "data": {
       "text/plain": [
-       "tensor([2.4663e-03, 6.8211e-06, 8.9177e-01, 2.1330e-05, 3.6883e-04, 3.0418e-06,\n",
-       "        1.1795e-04, 5.0158e-05, 1.0517e-01, 2.4597e-05])"
+       "tensor([0.0602, 0.0147, 0.5844, 0.0445, 0.0270, 0.0185, 0.0595, 0.0082, 0.1689,\n",
+       "        0.0141])"
       ]
      },
-     "execution_count": 11,
+     "execution_count": 6,
      "metadata": {},
      "output_type": "execute_result"
     }
@@ -378,12 +346,12 @@
   },
   {
    "cell_type": "code",
-   "execution_count": 12,
+   "execution_count": 7,
    "metadata": {},
    "outputs": [
     {
      "data": {
-      "image/png": "iVBORw0KGgoAAAANSUhEUgAAAYIAAAEICAYAAABS0fM3AAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4xLjMsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy+AADFEAAAa3klEQVR4nO3debxdZXn28d9FEgRCBCFxgASIGCxxeAuNgKJCRSqoQB1qwRcrVkXfFkVBK1oriIqvI7WKVgQnZCgC2qABxFdE3yKBMBsmQwQSQAlzGEoYrv6xni2bk3P2WSRZ68BZ1/fz2Z+z13jfe59z1r3X86z1bNkmIiK6a62xTiAiIsZWCkFERMelEEREdFwKQUREx6UQRER0XApBRETHpRBE1CTpMEk/KM83k3SvpAmrsJ+PSTpmzWc4MOaOkn5Xcv7rNmPHk18KwTgk6XpJrx5m/s6SHi0Hg97j9L7lW0n6oaTbJN0t6XJJB63KwW5I3AMkLZD0oKTvPsFt3yLpPEn3S/rlKOv2v77lkq6R9I7VyX0ktm+0vb7tR2rktHTItkfYflcTeQ1wOPC1kvOPV3dnkr4r6dNrIK94Ekgh6J6by8Gg99gDQNKWwHxgCfAi2xsAfwPMAaasbkzg08C3V2HbO4B/Bf5v3Vi21weeDnwE+Jak2UNXkjRxFXJ5KtscWLgqG3bwveqcFILo+SRwnu2DbN8CYPsa22+1fdfQlSX9paQr+qbPlnRh3/Sve00Qtk8rn0JvH2Y/z5D0E0nLJN1Znk/vLbf9c9snUxWT2lz5MXAnMFvSFpIs6Z2SbgR+UeLvUM447pJ0maSd+3KbKenccnZxNjC1b1lvfxPL9EaSviPp5vI6fixpMnAGsEnfGdgm/U1MZds9JS0sOfxS0tZ9y66X9KFydna3pP+QtE5ZNrW8X3dJuqO85yv9T0u6DngucHrJ4Wklj7llu0WS3t23/mGSTpH0A0n3APsNeq/73ot3SFpSXv97Jb2k5H2XpK/1rb+lpF9Iur2cfR4vacO+5dtKuqS87z8sr/nTfctfL+nSst/zJL14UH4xuhSC6Hk1cMoTWP98YFY5GE0CXkx1wJsiaV2qM4lf19jPWsB3qD6xbgY8AHxt4BY1SFpL0huADYEr+hbtBGwNvEbSpsBPqc5WNgI+BJwqaVpZ9wTgIqoC8Cng7QNCHgesB7wAeCZwpO37gN15/FnY4wqapK2AE4EPANOAeVQH7LX7VnsLsBswk+p93q/MPxhYWrZ7FvAxYKUxY2xvCdwI7FFyeBA4qWy7CfBm4AhJr+rbbC+qv4cNgeMHvO5+2wOzgL+lOov7Z6q/qxcAb5G0U+9lA58tsbcGZgCHlfdjbeBHwHepficnAm/oBZC0DdWZ5XuAjYFvAnMlPa1mjjGMFILu2aR8kuo93lLmbwzcUncnth8ALgReCfwFcBnwX8COwA7A72yvdAYwzH5ut32q7fttLwc+Q3WwXlWbSLoLuA04FHib7Wv6lh9m+76S/77APNvzbD9q+2xgAfBaSZsBLwH+xfaDtn8FnM4wJD2H6oD/Xtt32n7I9rk18/1b4Ke2z7b9EPBFYF3gZX3r/Jvtm23fUXL48zL/IeA5wOYl5q9dY/AwSTOofk8fsf3fti8FjgH+rm+139j+cXlfHqj5Wj5V9vcz4D7gRNu32r6J6kPBNgC2F5XX+6DtZcCXeex3vgMwsbzmh2yfBlzQF2N/4Ju259t+xPb3gAfLdrGK0vbXPTfbnj7M/NupDirDkvTvVAdOgCNsHwGcC+xM9cnyXKpmmJ2o/jFrHQglrQccSfWJ9xll9hRJE0briB3BSK+vZ0nf882Bv5G0R9+8ScA5VJ9W7yyf6ntuoPr0OtQM4A7bd65CvpuU/QJg+1FJS4BN+9b5Q9/z+8s2AF+g+iT9M0kAR9uu05eyScl3ed+8G6jO4nqW8MT9se/5A8NMrw8g6VnAV4BXUPU/rUX1t9PL7aYhBW3o7+ztkt7XN29tHntPYhXkjCB6fg68aaSFtt/b17xxRJndKwSvLM/PpSoEO1GzEFA1bzwf2N7208u+oGo+aMLQA8xxtjfse0wuB9NbgGeUdv6ezUbY5xJgo/527hHiDedmqoMbAKqO6DOAm0Z9IfZy2wfbfi6wJ3CQpF1G267E3EhS/0UAmw2J2eSwxEeU/b+o/M735bHf9y3ApuV96OkvvkuAzwz5na1n+8QG8x33UgjGr0mS1ul7jHb2dyjwMklfkPRsAEnPKx2Gwx3gAM6jOohvB1xgeyHVQW174Fe9lSRNLB2cE4AJQ/KZQvVp8S5JG5U86Nt2Qtl2IrBW2XbSE3gfBvkBsIek1/TiqLrcc7rtG6iaiT4paW1JLwf2GG4npXP9DODrqjq/J0nqFbQ/AhtL2mCEHE4GXidpl/K6DqY6ozpvtORLp+nzykHzbuAR4NHRtrO9pOz/s+U1vxh4Z3k/2jAFuBe4u/TTfLhv2W+oXscB5e9mL6q/r55vAe+VtL0qkyW9bkhRiycohWD8mkd1gO09Dhu0su3rgJcCWwALJd0NnEp1MFw+wjb3ARcDC22vKLN/A9xg+9a+VT9ecjiE6tPfA2UeVJ2K61K16Z8PnDkkzNvK+t+gakp4gOpgsNrKAXEvqk7WZVSfNj/MY/8Xb6UqandQFajvD9jd26ja7K8GbqXq/MX21VQdnotLn8zjmjBK/8W+wFep3oM9qDp1VzC6WVRncvdSve9ft31Oje0A9qH6Xd9M1Tl7qO2f19x2dX0S2JaqeP0UOK23oLzuN1IVpruo3pufUBVHbC8A3k11QcGdwCJGuaopRqd8MU1EPJlJmg/8u+3vjHUu41XOCCLiSUXSTpKeXZqG3k51yezQM8VYg3LVUEQ82Tyfqu9kMrAYeHPvJsdoRpqGIiI6Lk1DEREd95RrGpo6daq32GKLsU4jIuIp5aKLLrrN9rThlj3lCsEWW2zBggULxjqNiIinFEk3jLQsTUMRER2XQhAR0XEpBBERHZdCEBHRcSkEEREdl0IQEdFxKQQRER2XQhAR0XEpBBERHfeUu7M4nrgjz7628Rgf3HWrxmNERDNyRhAR0XEpBBERHZdCEBHRcSkEEREdl0IQEdFxKQQRER2XQhAR0XEpBBERHZdCEBHRcSkEEREdl0IQEdFxKQQRER2XQhAR0XEpBBERHZdCEBHRcSkEEREdl0IQEdFxjRYCSbtJukbSIkmHDLN8M0nnSLpE0uWSXttkPhERsbLGCoGkCcBRwO7AbGAfSbOHrPZx4GTb2wB7A19vKp+IiBhek2cE2wGLbC+2vQI4CdhryDoGnl6ebwDc3GA+ERExjCYLwabAkr7ppWVev8OAfSUtBeYB7xtuR5L2l7RA0oJly5Y1kWtERGeNdWfxPsB3bU8HXgscJ2mlnGwfbXuO7TnTpk1rPcmIiPGsyUJwEzCjb3p6mdfvncDJALZ/A6wDTG0wp4iIGKLJQnAhMEvSTElrU3UGzx2yzo3ALgCStqYqBGn7iYhoUW
OFwPbDwAHAWcBVVFcHLZR0uKQ9y2oHA++WdBlwIrCfbTeVU0RErGxikzu3PY+qE7h/3if6nl8J7NhkDhERMdhYdxZHRMQYSyGIiOi4FIKIiI5LIYiI6LgUgoiIjkshiIjouBSCiIiOSyGIiOi4FIKIiI5LIYiI6LgUgoiIjkshiIjouBSCiIiOSyGIiOi4FIKIiI5LIYiI6LgUgoiIjkshiIjouBSCiIiOSyGIiOi4FIKIiI4btRBIelEbiURExNioc0bwdUkXSPoHSRs0nlFERLRq1EJg+xXA/wZmABdJOkHSro1nFhERrajVR2D7d8DHgY8AOwH/JulqSW9sMrmIiGhenT6CF0s6ErgKeBWwh+2ty/MjG84vIiIaNrHGOl8FjgE+ZvuB3kzbN0v6eGOZRUREK+o0Df3I9nH9RUDSgQC2j2sss4iIaEWdQvB3w8zbbw3nERERY2TEpiFJ+wBvBWZKmtu3aApwR9OJRUREOwb1EZwH3AJMBb7UN385cHmTSUVERHtGLAS2bwBuAF7aXjoREdG2QU1D/9/2yyUtB9y/CLDtpzeeXURENG7QGcHLy88p7aUTERFtG3RGsNGgDW2nwzgiYhwY1Fl8EVWTkIZZZuC5jWQUERGtGtQ0NLPNRCIiYmyMeEOZpD8rP7cd7lFn55J2k3SNpEWSDhlhnbdIulLSQkknrNrLiIiIVTWoaeggYH8efw9Bj6kGnRuRpAnAUcCuwFLgQklzbV/Zt84s4KPAjrbvlPTMJ5h/RESspkFNQ/uXn3+5ivveDlhkezGApJOAvYAr+9Z5N3CU7TtLrFtXMVZERKyiOsNQryPpIEmnSTpV0gckrVNj35sCS/qml5Z5/bYCtpL0X5LOl7TbCDnsL2mBpAXLli2rEToiIuqqM+jc94EXUA1H/bXyfE2NOjoRmAXsDOwDfEvShkNXsn207Tm250ybNm0NhY6ICKj3fQQvtD27b/ocSVeOuPZjbqL6esue6WVev6XAfNsPAb+XdC1VYbiwxv4jImINqHNGcLGkHXoTkrYHFtTY7kJglqSZktYG9gbmDlnnx1RnA0iaStVUtLjGviMiYg0ZdGfxFVRXB00CzpN0Y5neHLh6tB3bfljSAcBZwATg27YXSjocWGB7bln2V+UM4xHgw7ZvX90XFRER9Q1qGnr96u7c9jxg3pB5n+h7bqrLVA9a3VgREbFqRhuG+k/KNf51rhaKiIinkDqXj+4p6XfA74FzgeuBMxrOKyIiWlKns/hTwA7AtWX8oV2A8xvNKiIiWlOnEDxUOnDXkrSW7XOAOQ3nFRERLalzH8FdktYHfg0cL+lW4L5m04qIiLbUOSPYC3gA+ABwJnAdsEeTSUVERHtGPSOwfZ+kZ1MNIncHcFau9Y+IGD/qXDX0LuAC4I3Am4HzJf1904lFREQ76vQRfBjYpncWIGlj4Dzg200mFhER7ajTR3A7sLxvenmZFxER48CgsYZ6wz4sAuZL+k+qsYb2Ai5vIbeIiGjBoKahKeXndeXR85/NpRMREW0bNNbQJ/uny70E2L636aQiIqI9da4aeqGkS4CFwEJJF0l6QfOpRUREG+p0Fh8NHGR7c9ubAwcD32o2rYiIaEudQjC5jC8EgO1fApMbyygiIlpV5z6CxZL+hce+sH5f8nWSERHjRp0zgr8HpgGnAacCU8u8iIgYBwaeEUiaAPyz7fe3lE9ERLRs4BmB7UeAl7eUS0REjIE6fQSXSJoL/JC+7yGwfVpjWUVERGvqFIJ1qMYWelXfPFP1GURExFNcrdFHbd/WeCYRETEmRuwjkLSHpGXA5ZKWSnpZi3lFRERLBnUWfwZ4he1NgDcBn20npYiIaNOgQvCw7asBbM/nsdFIIyJiHBnUR/DMvu8kWGna9pebSysiItoyqBB8i8efBQydjoiIcaD29xFERMT4VGesoYiIGMdSCCIiOi6FICKi40bsIxhyxdBKctVQRMT4MOiqod4VQs8HXgLMLdN7ABc0mVRERLRn1KuGJP0K2Nb28jJ9GPDTVrKLiIjG1ekjeBawom96RZkXERHjQJ3RR78PXCDpR2X6r4HvNZdSRES0adRCYPszks4AXlFmvcP2Jc2mFRERbal7+eh6wD22vwIslTSzzkaSdpN0jaRFkg4ZsN6bJFnSnJr5RETEGjJqIZB0KPAR4KNl1iTgBzW2mwAcBewOzAb2kTR7mPWmAAcC8+unHRERa0qdM4I3AHtSvq/Y9s3UG3xuO2CR7cW2VwAnAXsNs96ngM8B/10r44iIWKPqFIIVtk31PcVImlxz35sCS/qml5Z5fyJpW2CG7YGXo0raX9ICSQuWLVtWM3xERNRRpxCcLOmbwIaS3g38HDhmdQNLWgv4MnDwaOvaPtr2HNtzpk2btrqhIyKiT52rhr4oaVfgHqq7jD9h++wa+74JmNE3Pb3M65kCvBD4pSSAZwNzJe1pe0HN/CMiYjWNWggkfc72R4Czh5k3yIXArHKF0U3A3sBbewtt3w1M7dvnL4EPpQhERLSrTtPQrsPM2320jWw/DBwAnAVcBZxse6GkwyXt+cTSjIiIpgwaffT/AP8AbCnp8r5FU4Dz6uzc9jxg3pB5nxhh3Z3r7DMiItasQU1DJwBnAJ8F+m8GW277jkazioiI1ozYNGT7btvXA18B7rB9g+0bgIclbd9WghER0aw6fQTfAO7tm763zIuIiHGgTiFQuaEMANuPUm/U0oiIeAqoUwgWS3q/pEnlcSCwuOnEIiKiHXUKwXuBl1HdC7AU2B7Yv8mkIiKiPXXuLL6V6mawiIgYhwbdR/BPtj8v6auUAef62X5/o5lFREQrBp0RXFV+ZsiHiIhxbMRCYPv08jPfTxwRMY4Naho6nWGahHpsZ7ygiIhxYFDT0BfLzzdSDRHd+3rKfYA/NplURES0Z1DT0LkAkr5ku/9L5U+XlH6DiIhxos59BJMlPbc3Ub5foO7XVUZExJNcnaEiPkj1LWKLAQGbA+9pNKuIiGhNnRvKzpQ0C/izMutq2w82m1ZERLRl1KYhSesBHwYOsH0ZsJmk1zeeWUREtKJOH8F3gBXAS8v0TcCnG8soIiJaVacQbGn788BDALbvp+oriIiIcaBOIVghaV3KzWWStgTSRxARMU7UuWroUOBMYIak44Edgf2aTCoiItozsBBIEnA11d3FO1A1CR1o+7YWcouIiBYMLAS2LWme7RcBP20pp4iIaFGdPoKLJb2k8UwiImJM1Okj2B7YV9L1wH1UzUO2/eImE4uIiHbUKQSvaTyLiIgYM4O+j2Adqi+ufx5wBXCs7YfbSiwiItoxqI/ge8AcqiKwO/ClVjKKiIhWDWoaml2uFkLSscAF7aQUERFtGnRG8FDvSZqEIiLGr0FnBP9L0j3luYB1y3TvqqGnN55dREQ0btBXVU5oM5GIiBgbdW4oi4iIcSyFICKi41IIIiI6LoUgIqLjUggiIjqu0UIgaTdJ10haJOmQYZYfJOlKSZdL+n+SNm8yn4iIWFljhUDSBOAoquEpZgP7SJo9ZLVLgDllJNNTgM83lU9ER
AyvyTOC7YBFthfbXgGcBOzVv4Ltc2zfXybPB6Y3mE9ERAyjyUKwKbCkb3ppmTeSdwJnDLdA0v6SFkhasGzZsjWYYkREPCk6iyXtSzXS6ReGW277aNtzbM+ZNm1au8lFRIxzdb6YZlXdBMzom55e5j2OpFcD/wzsZPvBBvOJiIhhNHlGcCEwS9JMSWsDewNz+1eQtA3wTWBP27c2mEtERIygsUJQhq4+ADgLuAo42fZCSYdL2rOs9gVgfeCHki6VNHeE3UVEREOabBrC9jxg3pB5n+h7/uom40dExOieFJ3FERExdlIIIiI6LoUgIqLjUggiIjouhSAiouNSCCIiOi6FICKi41IIIiI6LoUgIqLjUggiIjouhSAiouNSCCIiOi6FICKi41IIIiI6LoUgIqLjUggiIjqu0S+miYho05FnX9t4jA/uulXjMdqWM4KIiI5LIYiI6LgUgoiIjkshiIjouBSCiIiOSyGIiOi4FIKIiI5LIYiI6LgUgoiIjkshiIjouBSCiIiOSyGIiOi4FIKIiI5LIYiI6LgUgoiIjkshiIjouBSCiIiOSyGIiOi4FIKIiI5LIYiI6LgUgoiIjmu0EEjaTdI1khZJOmSY5U+T9B9l+XxJWzSZT0RErKyxQiBpAnAUsDswG9hH0uwhq70TuNP284Ajgc81lU9ERAxvYoP73g5YZHsxgKSTgL2AK/vW2Qs4rDw/BfiaJNl2Ewkdefa1Tez2cT6461aNx4iIWJOaLASbAkv6ppcC24+0ju2HJd0NbAzc1r+SpP2B/cvkvZKuaSTj4U0dms8gB41h7DUsrzuxE3sYa/Bvve3XvflIC5osBGuM7aOBo8citqQFtuckdmIndmKPl9hDNdlZfBMwo296epk37DqSJgIbALc3mFNERAzRZCG4EJglaaaktYG9gblD1pkLvL08fzPwi6b6ByIiYniNNQ2VNv8DgLOACcC3bS+UdDiwwPZc4FjgOEmLgDuoisWTzZg0SSV2Yid2YrdF+QAeEdFtubM4IqLjUggiIjouhWAEow2P0XDsb0u6VdJvW447Q9I5kq6UtFDSgS3GXkfSBZIuK7E/2VbsvhwmSLpE0k/GIPb1kq6QdKmkBS3H3lDSKZKulnSVpJe2FPf55fX2HvdI+kAbsUv8D5a/td9KOlHSOi3GPrDEXdjmax6R7TyGPKg6t68DngusDVwGzG4x/iuBbYHftvy6nwNsW55PAa5t63UDAtYvzycB84EdWn79BwEnAD9pM26JfT0wte24Jfb3gHeV52sDG45BDhOAPwCbtxRvU+D3wLpl+mRgv5ZivxD4LbAe1QU7PweeNxa/+94jZwTD+9PwGLZXAL3hMVph+1dUV1G1yvYtti8uz5cDV1H9w7QR27bvLZOTyqO1KxkkTQdeBxzTVswnA0kbUH3wOBbA9grbd41BKrsA19m+ocWYE4F1yz1M6wE3txR3a2C+7fttPwycC7yxpdjDSiEY3nDDY7RyQHyyKCPBbkP1ybytmBMkXQrcCpxtu7XYwL8C/wQ82mLMfgZ+JumiMqRKW2YCy4DvlGaxYyRNbjF+z97AiW0Fs30T8EXgRuAW4G7bP2sp/G+BV0jaWNJ6wGt5/M23rUshiJVIWh84FfiA7Xvaimv7Edt/TnUX+naSXthGXEmvB261fVEb8UbwctvbUo3W+4+SXtlS3IlUzZDfsL0NcB/Qdp/Y2sCewA9bjPkMqrP8mcAmwGRJ+7YR2/ZVVCMt/ww4E7gUeKSN2CNJIRheneExxiVJk6iKwPG2TxuLHErTxDnAbi2F3BHYU9L1VM2Ar5L0g5ZiA3/6hIrtW4EfUTVPtmEpsLTv7OsUqsLQpt2Bi23/scWYrwZ+b3uZ7YeA04CXtRXc9rG2/8L2K4E7qfrjxkwKwfDqDI8x7kgSVVvxVba/3HLsaZI2LM/XBXYFrm4jtu2P2p5uewuq3/UvbLfy6RBA0mRJU3rPgb+iaj5onO0/AEskPb/M2oXHDxXfhn1osVmouBHYQdJ65e9+F6o+sVZIemb5uRlV/8AJbcUezlNi9NG2eYThMdqKL+lEYGdgqqSlwKG2j20h9I7A24ArSls9wMdsz2sh9nOA75UvNFoLONl265dxjpFnAT+qjkdMBE6wfWaL8d8HHF8+9CwG3tFW4FL4dgXe01ZMANvzJZ0CXAw8DFxCu0M+nCppY+Ah4B/HqIP+TzLEREREx6VpKCKi41IIIiI6LoUgIqLjUggiIjouhSAiouNSCCIiOi6FICKi4/4HEHMv4f97kiwAAAAASUVORK5CYII=\n",
+      "image/png": "iVBORw0KGgoAAAANSUhEUgAAAYIAAAEICAYAAABS0fM3AAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4yLjEsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy+j8jraAAAclUlEQVR4nO3debgdVZ3u8e9LIDIFgiQqJEDCaMcJ8RgQFZGhO3QreBERrmjHK9NtwkXx2oIitrRTO+GErUAQlEZuiGhHjQRolBYVSEAEkoCEEEgYD5Mg+gCB9/5RdXBz2GefnaHq5Jx6P8+zn1PDqlq/vXeyf7VWVa2SbSIiornWG+oAIiJiaCURREQ0XBJBRETDJRFERDRcEkFERMMlEURENFwSQUSXJFnSjuX0tyV9YjX38ydJ26/d6DrWJ0nflfSIpGvrqjeGjySCEUjSMkn7tVm+t6Rnyx+ivtdPWtbvLOkiSQ9K+qOkGyWdKGnUGsYzQ9ICSU9KOncVtz1U0m8k/VnSLwcp2/r+Hpd0q6T3r0nsA7F9rO1/HaycpF9KOrLftpvaXlpFXAN4E7A/MNH21DXdmaRJZVJcf81Di3VBEkHz3FP+EPW93g4gaQfgGmA58CrbmwPvAnqAMWtaJ/Bp4JzV2PZh4KvA57uty/amwGbAR4GzJE3pX6hhP2LbActsP7GqGzbsc2qsJILo8yngN7ZPtH0vgO1bbf9P24/2LyzprZJuapm/TNL8lvlfSXpHuZ+Lbf8YeKjNfraQ9FNJvWXXxU8lTexbb/ty27MokknXXPgx8AgwRdJ0Sb+WdLqkh4B/kfQiSV+SdJek+8vuno1aYvuIpHsl3SPpf/WL+1xJn26ZP0jSDZIek3S7pGmSPgO8Gfhm2Ur5Zlm2tYtpc0nfK9//nZJOkbReuW66pKvKGB+RdIekA1rqnC5padn6uUPSe9p8vh8AzgbeUMbwqXL5UZKWSHpY0hxJW7dsY0nHSboNuG2wz7r8LL4l6edlHb+W9DJJXy3jvkXSa1vKn1R+Ro9LWiTpf7SsGyXpy2Wr9I6yNflc66P8vGaW38vdkj69pi3WSCKIv9oPmL0K5a8GdpI0TtIGwKuBrSWNKX9Me4BfdbGf9YDvUhy1bgv8BfjmKkXehqT1yh+YsUBfwtodWAq8FPgMRStjZ2BXYEdgAnBquf004P9SdKnsRPH5DFTXVOB7wEfK+vaiOAL/OMVnMKNsfc1os/k3gM2B7YG3AO8DWruzdgduBcYBXwBmqrAJ8HXgANtjgD2BG/rv3PZM4Fjgt2UMn5S0D/A54FBgK+BO4MJ+m76jrPsFrakBHAqcUsb5JPBb4PpyfjbwlZayt1MkyM0pDkDOl7RVue4o4ACK72S3Mo5W5wIrKb6v1wJ/CxxJrBnbeY2wF7AM2K/N8r2BZ4FHW16HluueBqatYj2/Ag4G9gAuBWYB04C3Aje2Kf9p4NxB9rkr8Eib5UcCvxxk29b39zDFD+Nh5brpwF0tZQU8AezQsuwNwB3l9DnA51vW7QwY2LGcPxf4dDn9HeD0AWL6JXBkv2Wm+CEbBTwFTGlZd0zf+yxjXtKybuNy25cBm5Tv853ARoN8LtOBq1rmZwJfaJnftPz+J7XEt0+H/U0qy6zf8lmc1bL+eGBxy/yrgEc77O8G4KBy+grgmJZ1+/XVRZHAn2x9v8DhwC+G8v/bSHil/6957rE9sc3yhyiODtuS9G3giHL2s7Y/C1xJ8eO7opx+hOKo9slyflCSNgZOp0ggW5SLx0gaZfuZbvbRz0DvD4rzH33GU/ywXifpuXAofpwBtgauayl/Z4c6twHmrnqojAM26LfvOylaJn3u65uw/ecy1k1t3yfp3RStlpmSfg182PYtXdS7NcXRet9+/1R2l02gOIiA539W3bi/ZfovbeY37ZuR9D7gRIqEQrluXEtsrXW3Tm9H8Xnd2/KdrbcasUY/6RqKPpdTHF225eIqmb4TzJ8tF/clgr3K6SspEsFb6DIRAB8GdgF2t71ZuS8ofpTXttahdh+k+IF6he2x5WtzFyeaAe6l+IHvs22H/S4Hduiizv4epDgS365fPXd32OavO7bn2d6fIoHfApzVzXYU51ueq7PsZtqyX72VDEssaTuKOGcAW9oeC9zMX7/ve4HWRN76HSynOMgY1/KdbWb7FVXE2iRJBCPXBpI2bHkN1vr7JLCnpC9KehmApB0lnS9p7ADb/IbiR3wqcK3thRQ/MLsD/91XSNL6kjakONoe1S+eMRQ/yI9KenEZBy3bjiq3XR9Yr9x2g1X4HNqy/SzFD9Lpkl5S1jVB0t+VRWYB0yVNKVstnxxgV1B0tbxf0r7luYkJkl5errufov+/XQzPlPV8pjy3sh3FkfL5g8Uv6aXlCepNKH4c/0TRLdaNH5Tx7irpRcBngWtsL+ty+zWxCUWS6QVQcXnvK1vWzwJOKD/DsRRXfgHg4iKGS4EvS9qs/Kx3kPSWGuIe0ZIIRq65FD+wfa9/6VTY9u0UfeSTgIWS/gj8EFgAPD7ANk9QdDEstP1Uufi3wJ22H2gpekoZw0kU3Ut/KZdBcWnoRhRHx1cDl/Sr5r1l+X+nOMH4F7o/8h3MR4ElwNWSHqNoFe1Svrefl7FdUZa5YqCd2L6W4gTv6cAfKVpDfUfcXwMOKa+e+XqbzY+nOFexFLgKuIDuLrNdjyJp3ENxPuQtwP/uYjtsXw58guL7vZeiNXNYN9uuKduLgC9T/Du5n+L8wa9bipxF8WN/I/A7in/HK4G+bsL3AaOBRRRdkbPp0KUZ3VF5wiUiYp1TXi77bdvbDVo4VltaBBGxzpC0kaS/L7sTJ1B0yf1oqOMa6dIiiIh1Rnk+5krg5RTdgD8DTrD92JAGNsIlEURENFy6hiIiGm7Y3VA2btw4T5o0aajDiIgYVq677roHbY9vt27YJYJJkyaxYMGCoQ4jImJYkTTg3fGVdg2pGIHx1nKUw5MGKHNoOQLhQkkXVBlPRES8UGUtgnJo2DMoRm9cAcyXNKe8oaSvzE7AycAbbT/Sd4dnRETUp8oWwVSKkROXlnedXggc1K/MUcAZth8B6Hc3akRE1KDKRDCB548KuILnj6oIxdC+O5cPsri6HAP+BSQdreJRhwt6e3srCjciopmG+vLR9Ske+rE3xbjiZ7Ub4Mz2mbZ7bPeMH9/2pHdERKymKhPB3Tx/CNmJvHB43RXAHNtP274D+ANFYoiIiJpUmQjmUzzKcLKk0RSjG87pV+bHFK0BJI2j6CpaWmFMERHRT2WJwPZKiodPzAMWA7NsL5R0mqQDy2LzgIckLQJ+AXzE9gsecB4REdUZdmMN9fT0ODeURUSsGknX2e5pt27Y3Vkcq+70y/5QeR0f2n/nyuuIiGoM9VVDERExxJIIIiIaLokgIqLhkggiIhouiSAiouGSCCIiGi6JICKi4ZIIIiIaLokgIqLhkggiIhouiSAiouGSCCIiGi6JICKi4ZIIIiIaLokgIqLhkggiIhouiSAiouGSCC
IiGi6JICKi4ZIIIiIaLokgIqLhkggiIhouiSAiouGSCCIiGi6JICKi4SpNBJKmSbpV0hJJJ7VZP11Sr6QbyteRVcYTEREvtH5VO5Y0CjgD2B9YAcyXNMf2on5F/5/tGVXFERERnVXZIpgKLLG91PZTwIXAQRXWFxERq6HKRDABWN4yv6Jc1t87Jd0oabakbSqMJyIi2hjqk8U/ASbZfjVwGXBeu0KSjpa0QNKC3t7eWgOMiBjpqkwEdwOtR/gTy2XPsf2Q7SfL2bOB17Xbke0zbffY7hk/fnwlwUZENFWViWA+sJOkyZJGA4cBc1oLSNqqZfZAYHGF8URERBuVXTVke6WkGcA8YBRwju2Fkk4DFtieA/wfSQcCK4GHgelVxRMREe1VlggAbM8F5vZbdmrL9MnAyVXGEBERnQ31yeKIiBhiSQQREQ2XRBAR0XBJBBERDZdEEBHRcEkEERENl0QQEdFwSQQREQ2XRBAR0XBJBBERDZdEEBHRcEkEERENl0QQEdFwgyYCScdL2qKOYCIion7dtAheCsyXNEvSNEmqOqiIiKjPoInA9inATsBMigfH3Cbps5J2qDi2iIioQVfnCGwbuK98rQS2AGZL+kKFsUVERA0GfUKZpBOA9wEPUjxg/iO2n5a0HnAb8M/VhhgREVXq5lGVLwYOtn1n60Lbz0p6WzVhRUREXbrpGtq+fxKQ9H0A24sriSoiImrTTSJ4ReuMpFHA66oJJyIi6jZgIpB0sqTHgVdLeqx8PQ48APxnbRFGRESlBkwEtj9newzwRdubla8xtre0fXKNMUZERIUGPFks6eW2bwEukrRb//W2r680soiIqEWnq4Y+DBwFfLnNOgP7VBJRRETUasBEYPuo8u9b6wsnIiLq1qlr6OBOG9q+eO2HExERdevUNfT2DusMDJoIJE0DvgaMAs62/fkByr0TmA283vaCwfYbERFrT6euofevyY7L+w3OAPYHVlCMYDrH9qJ+5cYAJwDXrEl9ERGxejp1DR1h+3xJJ7Zbb/srg+x7KrDE9tJyfxcCBwGL+pX7V+DfgI90HXVERKw1ne4s3qT8O2aA12AmAMtb5leUy55TXpa6je2fddqRpKMlLZC0oLe3t4uqIyKiW526hr5T/v1UFRWXo5d+heIZBx3ZPhM4E6Cnp8dVxBMR0VTdPKpye0k/kdQr6QFJ/ylp+y72fTewTcv8xHJZnzHAK4FfSloG7AHMkdTTffgREbGmuhl07gJgFrAVsDVwEfCDLrabD+wkabKk0cBhwJy+lbb/aHuc7Um2JwFXAwfmqqGIiHp1kwg2tv192yvL1/nAhoNtZHslMAOYBywGZtleKOk0SQeuWdgREbG2dLpq6MXl5M8lnQRcSHH/wLuBud3s3Pbc/mVtnzpA2b272WdERKxdnW4ou47ih1/l/DEt6wxkBNKIiBGg01VDk+sMJCIihkY3zyxG0iuBKbScG7D9vaqCioiI+gyaCCR9EtibIhHMBQ4ArgKSCCIiRoBurho6BNgXuK8cf+g1wOaVRhUREbXpJhH8xfazwEpJm1E8s3ibQbaJiIhhoptzBAskjQXOoriS6E/AbyuNKiIiajNoIrD9T+XktyVdAmxm+8Zqw4qIiLp0e9XQwcCbKO4fuApIIoiIGCG6GXTuW8CxwE3AzcAxks6oOrCIiKhHNy2CfYC/sW0ASecBCyuNKiIiatPNVUNLgG1b5rcpl0VExAjQadC5n1CcExgDLJZ0bblqKnDtQNtFRMTw0qlr6Eu1RREREUOm06BzV/ZNS3op8Ppy9lrbD1QdWERE1KObq4YOpegKehdwKHCNpEOqDiwiIurRzVVDHwde39cKkDQeuByYXWVgERFRj26uGlqvX1fQQ11uFxERw0A3LYJLJM3jrw+s7/pRlRERse7rmAgkCfg6xYniN5WLz7T9o6oDi4iIenRMBLYtaa7tVwEX1xRTRETUqJu+/uslvX7wYhERMRx1c45gd+AIScuAJwBRNBZeXWVgERFRj24Swd9VHkVERAyZTmMNvQT4GLAjxRDUn7P9WF2BRUREPTqdI/geRVfQN4BNKa4eioiIEaZTItjK9sdtz7N9PLDK5wQkTZN0q6Qlkk5qs/5YSTdJukHSVZKmrGodERGxZjpeNSRpC0kvlvRiYFS/+Y4kjQLOAA4ApgCHt/mhv8D2q2zvCnwB+MrqvY2IiFhdnU4Wbw5cR3GVUJ/ry78Gth9k31OBJbaXAki6EDgIWNRXoN85h03K/UZERI06DUM9aQ33PQFY3jK/guJS1OeRdBxwIjCa4rGYLyDpaOBogG233bZdkYiIWE1DPnic7TNs7wB8FDhlgDJn2u6x3TN+/Ph6A4yIGOGqTAR3UzzfuM/EctlALgTeUWE8ERHRRpWJYD6wk6TJkkYDhwFzWgtI2qll9h+A2yqMJyIi2uh0Q1nHK4NsPzzI+pWSZgDzgFHAObYXSjoNWGB7DjBD0n7A08AjwD+u6huIiIg10+mqoesoruIRsC3FD7WAscBdwOTBdm57Lv2eXWD71JbpE1Y95IiIWJsG7BqyPdn29hSPpXy77XG2twTeBlxaV4AREVGtbs4R7FEe2QNg++fAntWFFBERdepm9NF7JJ0CnF/Ovwe4p7qQIiKiTt20CA4HxgM/onhK2fhyWUREjACDtgjKq4NOkLSJ7SdqiCkiImo0aItA0p6SFgGLy/nXSPpW5ZFFREQtuukaOp3iKWUPAdj+PbBXlUFFRER9urqz2PbyfoueqSCWiIgYAt1cNbRc0p6AJW0AnEDZTRQREcNfNy2CY4HjKIaVvhvYFfinKoOKiIj6dNMi2MX2e1oXSHoj8OtqQoqIiDp10yL4RpfLIiJiGOo0+ugbKIaSGC/pxJZVm1GMJhoRESNAp66h0cCmZZkxLcsfAw6pMqiIiKhPp2cWXwlcKelc23fWGFNERNSom3MEZ0sa2zcjaQtJ8yqMKSIiatRNIhhn+9G+GduPAC+pLqSIiKhTN4ngWUnb9s1I2o7iyWURETECdHMfwceBqyRdSfGoyjcDR1caVURE1KabYagvkbQbsEe56IO2H6w2rIiIqMuAXUOSXl7+3Y3i4fX3lK9ty2URETECdGoRfBg4Cvhym3UG9qkkooiIqFWn+wiOKv++tb5wIiKibp2GmDi404a2L1774URERN06dQ29vfz7Eooxh64o598K/IbiQfYRETHMdeoaej+ApEuBKbbvLee3As6tJbqIiKhcNzeUbdOXBEr3U1xFFBERI0A3ieC/JM2TNF3SdOBnwOXd7FzSNEm3Sloi6aQ260+UtEjSjZL+q7xrOSIiajRoIrA9A/g28Jrydabt4wfbTtIo4AzgAGAKcLikKf2K/Q7osf1qYDbwhVULPyIi1lQ3Q0wAXA88bvtySRtLGmP78UG2mQossb0UQNKFwEHAor4Ctn/RUv5q4IjuQ4+IiLVh0BaBpKMojta/Uy6aAPy4i31PAJa3zK8olw3kA8DPB4jhaEkLJC3o7e3touqIiOhWN+cIjgPeSPFkMmzfxloehlrSEUAP8MV26
22fabvHds/48ePXZtUREY3XTdfQk7afkgSApPXpbhjqu4FtWuYnlsueR9J+FCOcvsX2k13sNyIi1qJuWgRXSvoYsJGk/YGLgJ90sd18YCdJkyWNBg4D5rQWkPRaii6nA20/sGqhR0TE2tBNIvgo0AvcBBwDzAVOGWwj2yuBGcA8YDEwy/ZCSadJOrAs9kVgU+AiSTdImjPA7iIioiIdu4bKS0AX2n45cNaq7tz2XIrE0brs1Jbp/VZ1nxERsXZ1bBHYfga4tfVRlRERMbJ0c7J4C2ChpGuBJ/oW2j5w4E0iImK46CYRfKLyKCIiYsh0eh7BhsCxwI4UJ4pnlieAIyJiBOl0juA8ipu8bqIYL6jdIysjImKY69Q1NMX2qwAkzQSurSekiIjVc/plf6i8jg/tv3PlddStU4vg6b6JdAlFRIxcnVoEr5H0WDktijuLHyunbXuzyqOLiIjKdXpU5ag6A4mIiKHRzRATERExgiURREQ0XBJBRETDJRFERDRcEkFERMMlEURENFwSQUREwyURREQ0XBJBRETDJRFERDRcEkFERMMlEURENFwSQUREwyURREQ0XBJBRETDJRFERDRcEkFERMNVmggkTZN0q6Qlkk5qs34vSddLWinpkCpjiYiI9ipLBJJGAWcABwBTgMMlTelX7C5gOnBBVXFERERnnR5ev6amAktsLwWQdCFwELCor4DtZeW6ZyuMIyIiOqiya2gCsLxlfkW5bJVJOlrSAkkLent710pwERFRGBYni22fabvHds/48eOHOpyIiBGlykRwN7BNy/zEcllERKxDqkwE84GdJE2WNBo4DJhTYX0REbEaKksEtlcCM4B5wGJglu2Fkk6TdCCApNdLWgG8C/iOpIVVxRMREe1VedUQtucCc/stO7Vlej5Fl1FERAyRYXGyOCIiqpNEEBHRcEkEERENl0QQEdFwSQQREQ2XRBAR0XBJBBERDZdEEBHRcJXeULauOf2yP1Rex4f237nyOiIi1qZGJYKIpqj6oCcHPCNLuoYiIhouiSAiouGSCCIiGi6JICKi4XKyOCqVK7Ui1n1pEURENFwSQUREwyURREQ0XBJBRETDJRFERDRcrhqKES1DLUQMLi2CiIiGSyKIiGi4dA1FVCTdUjFcpEUQEdFwaRHUJEMtRIxsw/n/eFoEERENV2mLQNI04GvAKOBs25/vt/5FwPeA1wEPAe+2vazKmCKiWsP5yLipKmsRSBoFnAEcAEwBDpc0pV+xDwCP2N4ROB34t6riiYiI9qrsGpoKLLG91PZTwIXAQf3KHAScV07PBvaVpApjioiIfmS7mh1LhwDTbB9Zzr8X2N32jJYyN5dlVpTzt5dlHuy3r6OBo8vZXYBbKwm6vXHAg4OWSt2pO3Wn7nW77u1sj2+3YlhcNWT7TODMoahb0gLbPak7dafu1D1S6u6vyq6hu4FtWuYnlsvalpG0PrA5xUnjiIioSZWJYD6wk6TJkkYDhwFz+pWZA/xjOX0IcIWr6quKiIi2Kusasr1S0gxgHsXlo+fYXijpNGCB7TnATOD7kpYAD1Mki3XNkHRJpe7UnbpTd10qO1kcERHDQ+4sjohouCSCiIiGSyIYgKRpkm6VtETSSTXXfY6kB8r7LOqsdxtJv5C0SNJCSSfUWPeGkq6V9Puy7k/VVXdLDKMk/U7ST4eg7mWSbpJ0g6QFNdc9VtJsSbdIWizpDTXVu0v5fvtej0n6YB11l/V/qPy3drOkH0jasMa6TyjrXVjnex6Q7bz6vShObt8ObA+MBn4PTKmx/r2A3YCba37fWwG7ldNjgD/U9b4BAZuW0xsA1wB71Pz+TwQuAH5aZ71l3cuAcXXXW9Z9HnBkOT0aGDsEMYwC7qO46amO+iYAdwAblfOzgOk11f1K4GZgY4oLdi4HdhyK777vlRZBe90Mj1EZ2/9NcRVVrWzfa/v6cvpxYDHFf5g66rbtP5WzG5Sv2q5kkDQR+Afg7LrqXBdI2pziwGMmgO2nbD86BKHsC9xu+84a61wf2Ki8h2lj4J6a6v0b4Brbf7a9ErgSOLimuttKImhvArC8ZX4FNf0griskTQJeS3FkXledoyTdADwAXGa7trqBrwL/DDxbY52tDFwq6bpySJW6TAZ6ge+W3WJnS9qkxvr7HAb8oK7KbN8NfAm4C7gX+KPtS2uq/mbgzZK2lLQx8Pc8/+bb2iURxAtI2hT4IfBB24/VVa/tZ2zvSnEX+lRJr6yjXklvAx6wfV0d9Q3gTbZ3oxit9zhJe9VU7/oU3ZD/bvu1wBNA3efERgMHAhfVWOcWFK38ycDWwCaSjqijbtuLKUZavhS4BLgBeKaOugeSRNBeN8NjjEiSNqBIAv9h++KhiKHsmvgFMK2mKt8IHChpGUU34D6Szq+pbuC5I1RsPwD8iKJ7sg4rgBUtra/ZFImhTgcA19u+v8Y69wPusN1r+2ngYmDPuiq3PdP262zvBTxCcT5uyCQRtNfN8BgjTjkE+Exgse2v1Fz3eEljy+mNgP2BW+qo2/bJtifankTxXV9hu5ajQwBJm0ga0zcN/C1F90HlbN8HLJe0S7loX2BRHXW3OJwau4VKdwF7SNq4/He/L8U5sVpIekn5d1uK8wMX1FV3O8Ni9NG6eYDhMeqqX9IPgL2BcZJWAJ+0PbOGqt8IvBe4qeyrB/iY7bk11L0VcF75QKP1gFm2a7+Mc4i8FPhR+SiO9YELbF9SY/3HA/9RHvQsBd5fV8Vl4tsfOKauOgFsXyNpNnA9sBL4HfUO+fBDSVsCTwPHDdEJ+udkiImIiIZL11BERMMlEURENFwSQUREwyURREQ0XBJBRETDJRFERDRcEkFERMP9f9TbtPnh8O6mAAAAAElFTkSuQmCC\n",
       "text/plain": [
        "<Figure size 432x288 with 1 Axes>"
       ]
@@ -422,9 +390,18 @@
   },
   {
    "cell_type": "code",
-   "execution_count": 13,
+   "execution_count": 8,
    "metadata": {},
-   "outputs": [],
+   "outputs": [
+    {
+     "name": "stderr",
+     "output_type": "stream",
+     "text": [
+      "/workspace/brevitas_cnv_lfc/training_scripts/models/LFC.py:85: TracerWarning: torch.tensor results are registered as constants in the trace. You can safely ignore this warning if you use this function to create tensors out of constant variables that would be the same every time you call this function. In any other case, this might cause the trace to be incorrect.\n",
+      "  x = 2.0 * x - torch.tensor([1.0]).to(self.device)\n"
+     ]
+    }
+   ],
    "source": [
     "import brevitas.onnx as bo\n",
     "export_onnx_path = \"/tmp/LFCW1A1.onnx\"\n",
@@ -441,45 +418,40 @@
   },
   {
    "cell_type": "code",
-   "execution_count": 14,
+   "execution_count": 9,
    "metadata": {},
    "outputs": [
     {
      "name": "stdout",
      "output_type": "stream",
      "text": [
-      "\n",
-      "Stopping http://0.0.0.0:8081\n",
       "Serving '/tmp/LFCW1A1.onnx' at http://0.0.0.0:8081\n"
      ]
-    }
-   ],
-   "source": [
-    "import netron\n",
-    "netron.start(export_onnx_path, port=8081, host=\"0.0.0.0\")"
-   ]
-  },
-  {
-   "cell_type": "code",
-   "execution_count": 15,
-   "metadata": {},
-   "outputs": [
+    },
     {
      "data": {
       "text/html": [
-       "<iframe src=\"http://0.0.0.0:8081/\" style=\"position: relative; width: 100%;\" height=\"400\"></iframe>\n"
+       "\n",
+       "        <iframe\n",
+       "            width=\"100%\"\n",
+       "            height=\"400\"\n",
+       "            src=\"http://0.0.0.0:8081/\"\n",
+       "            frameborder=\"0\"\n",
+       "            allowfullscreen\n",
+       "        ></iframe>\n",
+       "        "
       ],
       "text/plain": [
-       "<IPython.core.display.HTML object>"
+       "<IPython.lib.display.IFrame at 0x7f86cdb6e5f8>"
       ]
      },
+     "execution_count": 9,
      "metadata": {},
-     "output_type": "display_data"
+     "output_type": "execute_result"
     }
    ],
    "source": [
-    "%%html\n",
-    "<iframe src=\"http://0.0.0.0:8081/\" style=\"position: relative; width: 100%;\" height=\"400\"></iframe>"
+    "showInNetron('/tmp/LFCW1A1.onnx')"
    ]
   },
   {
@@ -500,19 +472,19 @@
   },
   {
    "cell_type": "code",
-   "execution_count": 16,
+   "execution_count": 10,
    "metadata": {},
    "outputs": [
     {
      "data": {
       "text/plain": [
-       "input: \"32\"\n",
-       "input: \"33\"\n",
-       "output: \"35\"\n",
+       "input: \"37\"\n",
+       "input: \"38\"\n",
+       "output: \"40\"\n",
        "op_type: \"MatMul\""
       ]
      },
-     "execution_count": 16,
+     "execution_count": 10,
      "metadata": {},
      "output_type": "execute_result"
     }
@@ -532,22 +504,22 @@
   },
   {
    "cell_type": "code",
-   "execution_count": 17,
+   "execution_count": 11,
    "metadata": {},
    "outputs": [
     {
      "data": {
       "text/plain": [
-       "array([[ 1.,  1.,  1., ...,  1.,  1., -1.],\n",
-       "       [ 1.,  1., -1., ...,  1.,  1., -1.],\n",
-       "       [-1.,  1., -1., ..., -1.,  1., -1.],\n",
+       "array([[-1., -1., -1., ..., -1., -1.,  1.],\n",
+       "       [-1.,  1., -1., ...,  1., -1., -1.],\n",
+       "       [ 1., -1.,  1., ..., -1., -1., -1.],\n",
        "       ...,\n",
-       "       [-1.,  1., -1., ..., -1., -1.,  1.],\n",
-       "       [ 1.,  1., -1., ...,  1.,  1., -1.],\n",
-       "       [-1.,  1.,  1., ..., -1., -1.,  1.]], dtype=float32)"
+       "       [ 1.,  1., -1., ...,  1.,  1.,  1.],\n",
+       "       [-1., -1.,  1., ...,  1.,  1., -1.],\n",
+       "       [ 1.,  1., -1., ...,  1., -1., -1.]], dtype=float32)"
       ]
      },
-     "execution_count": 17,
+     "execution_count": 11,
      "metadata": {},
      "output_type": "execute_result"
     }
@@ -565,7 +537,7 @@
   },
   {
    "cell_type": "code",
-   "execution_count": 18,
+   "execution_count": 12,
    "metadata": {},
    "outputs": [
     {
@@ -574,7 +546,7 @@
        "<DataType.BIPOLAR: 8>"
       ]
      },
-     "execution_count": 18,
+     "execution_count": 12,
      "metadata": {},
      "output_type": "execute_result"
     }
@@ -585,7 +557,7 @@
   },
   {
    "cell_type": "code",
-   "execution_count": 19,
+   "execution_count": 13,
    "metadata": {},
    "outputs": [
     {
@@ -594,7 +566,7 @@
        "[784, 1024]"
       ]
      },
-     "execution_count": 19,
+     "execution_count": 13,
      "metadata": {},
      "output_type": "execute_result"
     }
@@ -612,50 +584,56 @@
   },
   {
    "cell_type": "code",
-   "execution_count": 20,
+   "execution_count": 14,
    "metadata": {},
-   "outputs": [
-    {
-     "name": "stdout",
-     "output_type": "stream",
-     "text": [
-      "\n",
-      "Stopping http://0.0.0.0:8081\n",
-      "Serving '/tmp/LFCW1A1-clean.onnx' at http://0.0.0.0:8081\n"
-     ]
-    }
-   ],
+   "outputs": [],
    "source": [
     "from finn.transformation.fold_constants import FoldConstants\n",
     "from finn.transformation.infer_shapes import InferShapes\n",
     "model = model.transform(InferShapes())\n",
     "model = model.transform(FoldConstants())\n",
     "export_onnx_path_transformed = \"/tmp/LFCW1A1-clean.onnx\"\n",
-    "model.save(export_onnx_path_transformed)\n",
-    "netron.start(export_onnx_path_transformed, port=8081, host=\"0.0.0.0\")"
+    "model.save(export_onnx_path_transformed)"
    ]
   },
   {
    "cell_type": "code",
-   "execution_count": 21,
+   "execution_count": 15,
    "metadata": {},
    "outputs": [
+    {
+     "name": "stdout",
+     "output_type": "stream",
+     "text": [
+      "\n",
+      "Stopping http://0.0.0.0:8081\n",
+      "Serving '/tmp/LFCW1A1-clean.onnx' at http://0.0.0.0:8081\n"
+     ]
+    },
     {
      "data": {
       "text/html": [
-       "<iframe src=\"http://0.0.0.0:8081/\" style=\"position: relative; width: 100%;\" height=\"400\"></iframe>\n"
+       "\n",
+       "        <iframe\n",
+       "            width=\"100%\"\n",
+       "            height=\"400\"\n",
+       "            src=\"http://0.0.0.0:8081/\"\n",
+       "            frameborder=\"0\"\n",
+       "            allowfullscreen\n",
+       "        ></iframe>\n",
+       "        "
       ],
       "text/plain": [
-       "<IPython.core.display.HTML object>"
+       "<IPython.lib.display.IFrame at 0x7f86cdb6ec18>"
       ]
      },
+     "execution_count": 15,
      "metadata": {},
-     "output_type": "display_data"
+     "output_type": "execute_result"
     }
    ],
    "source": [
-    "%%html\n",
-    "<iframe src=\"http://0.0.0.0:8081/\" style=\"position: relative; width: 100%;\" height=\"400\"></iframe>"
+    "showInNetron('/tmp/LFCW1A1-clean.onnx')"
    ]
   },
   {
@@ -667,18 +645,18 @@
   },
   {
    "cell_type": "code",
-   "execution_count": 22,
+   "execution_count": 16,
    "metadata": {},
    "outputs": [
     {
      "data": {
       "text/plain": [
-       "array([[ 3.3252678 , -2.5652065 ,  9.215742  , -1.4251148 ,  1.4251148 ,\n",
-       "        -3.3727715 ,  0.28502294, -0.5700459 ,  7.07807   , -1.2826033 ]],\n",
+       "array([[-1.5095654 , -2.915617  ,  0.764004  , -1.8118242 , -2.308991  ,\n",
+       "        -2.6900144 , -1.520713  , -3.4965858 , -0.47711682, -2.9628415 ]],\n",
        "      dtype=float32)"
       ]
      },
-     "execution_count": 22,
+     "execution_count": 16,
      "metadata": {},
      "output_type": "execute_result"
     }
@@ -694,7 +672,7 @@
   },
   {
    "cell_type": "code",
-   "execution_count": 23,
+   "execution_count": 17,
    "metadata": {},
    "outputs": [
     {
@@ -703,7 +681,7 @@
        "True"
       ]
      },
-     "execution_count": 23,
+     "execution_count": 17,
      "metadata": {},
      "output_type": "execute_result"
     }
@@ -718,13 +696,6 @@
    "source": [
     "We have succesfully verified that the transformed and cleaned-up FINN graph still produces the same output, and can now use this model for further processing in FINN."
    ]
-  },
-  {
-   "cell_type": "code",
-   "execution_count": null,
-   "metadata": {},
-   "outputs": [],
-   "source": []
   }
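The output-comparison cells above are abbreviated in this diff, so the following is only a hedged sketch of how such a check is typically written against the cleaned-up model. The variable `lfc` (the trained Brevitas network from earlier cells), the 1x1x28x28 input shape and the `atol` value are assumptions for illustration; `execute_onnx` comes from `finn.core.onnx_exec`.

```python
import numpy as np
import torch
import finn.core.onnx_exec as oxe
from finn.core.modelwrapper import ModelWrapper

# load the transformed model saved earlier in the notebook
model = ModelWrapper("/tmp/LFCW1A1-clean.onnx")
in_name = model.graph.input[0].name
out_name = model.graph.output[0].name

# random test input (assumed shape: 1x1x28x28, as for the LFC MNIST network)
x = np.random.uniform(low=-1.0, high=+1.0, size=(1, 1, 28, 28)).astype(np.float32)

# execute the FINN-ONNX graph ...
finn_out = oxe.execute_onnx(model, {in_name: x})[out_name]

# ... and the original Brevitas/PyTorch model (here `lfc` is assumed to be
# the trained network imported in earlier cells)
torch_out = lfc.forward(torch.from_numpy(x)).detach().numpy()

# the two outputs should agree up to floating-point tolerance
assert np.isclose(torch_out, finn_out, atol=1e-3).all()
```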
  ],
  "metadata": {
diff --git a/notebooks/basics/2_modelwrapper.ipynb b/notebooks/basics/2_modelwrapper.ipynb
deleted file mode 100644
index 6b3cd0337d938c100e0f71e61f8505a5b7377505..0000000000000000000000000000000000000000
--- a/notebooks/basics/2_modelwrapper.ipynb
+++ /dev/null
@@ -1,365 +0,0 @@
-{
- "cells": [
-  {
-   "cell_type": "markdown",
-   "metadata": {},
-   "source": [
-    "# FINN - ModelWrapper\n",
-    "--------------------------------------\n",
-    "<font size=\"3\"> This notebook is about the ModelWrapper class within FINN. \n",
-    "\n",
-    "Following showSrc function is used to print the source code of function calls in the Jupyter notebook:</font>"
-   ]
-  },
-  {
-   "cell_type": "code",
-   "execution_count": 2,
-   "metadata": {},
-   "outputs": [],
-   "source": [
-    "import inspect\n",
-    "\n",
-    "def showSrc(what):\n",
-    "    print(\"\".join(inspect.getsourcelines(what)[0]))"
-   ]
-  },
-  {
-   "cell_type": "markdown",
-   "metadata": {},
-   "source": [
-    "## General Information\n",
-    "------------------------------\n",
-    "* <font size=\"3\"> wrapper around ONNX ModelProto that exposes several utility\n",
-    "    functions for graph manipulation and exploration </font>\n",
-    "* <font size=\"3\"> ModelWrapper instance takes ONNX ModelProto and `make_deepcopy` flag as input </font>\n",
-    "* <font size=\"3\"> ONNX ModelProto can either be a string with the path to a stored .onnx file on disk, or serialized bytes </font>\n",
-    "* <font size=\"3\"> `make_deepcopy` is by default False but can be set to True if a (deep) copy should be created </font>"
-   ]
-  },
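A minimal sketch of the constructor described in the bullets above; the file path is the example model used later in this notebook, and passing `make_deepcopy` as a keyword argument is an assumption based on the text.

```python
from finn.core.modelwrapper import ModelWrapper

# wrap a stored .onnx file; make_deepcopy defaults to False
model = ModelWrapper("../LFCW1A1.onnx")

# request a (deep) copy of the underlying ModelProto at construction time
model_copy = ModelWrapper("../LFCW1A1.onnx", make_deepcopy=True)
```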
-  {
-   "cell_type": "markdown",
-   "metadata": {},
-   "source": [
-    "### Create a ModelWrapper instance\n",
-    "\n",
-    "<font size=\"3\">Here we use a premade ONNX file on disk to load up the ModelWrapper, but this could have been produced from e.g. a trained Brevitas PyTorch model. See [this notebook](3_brevitas_network_import.ipynb) for more details.</font>"
-   ]
-  },
-  {
-   "cell_type": "code",
-   "execution_count": 3,
-   "metadata": {},
-   "outputs": [],
-   "source": [
-    "from finn.core.modelwrapper import ModelWrapper\n",
-    "model = ModelWrapper(\"../LFCW1A1.onnx\")"
-   ]
-  },
-  {
-   "cell_type": "markdown",
-   "metadata": {},
-   "source": [
-    "### Access the ONNX GraphProto through ModelWrapper\n",
-    "\n",
-    "<font size=\"3\">ModelWrapper is a thin wrapper around the ONNX protobuf, and it offers a range of helper functions as well as direct access to the underlying protobuf. The `.model` member gives access to the full ONNX ModelProto, whereas `.graph` gives access to the GraphProto, as follows:</font>"
-   ]
-  },
-  {
-   "cell_type": "code",
-   "execution_count": 4,
-   "metadata": {},
-   "outputs": [
-    {
-     "name": "stdout",
-     "output_type": "stream",
-     "text": [
-      "ModelProto IR version is 4\n",
-      "GraphProto top-level outputs are [name: \"60\"\n",
-      "type {\n",
-      "  tensor_type {\n",
-      "    elem_type: 1\n",
-      "    shape {\n",
-      "      dim {\n",
-      "        dim_value: 1\n",
-      "      }\n",
-      "      dim {\n",
-      "        dim_value: 10\n",
-      "      }\n",
-      "    }\n",
-      "  }\n",
-      "}\n",
-      "]\n",
-      "There are 29 nodes in the graph\n",
-      "The first node is \n",
-      "input: \"0\"\n",
-      "output: \"21\"\n",
-      "op_type: \"Shape\"\n",
-      "\n"
-     ]
-    }
-   ],
-   "source": [
-    "# access the ONNX ModelProto\n",
-    "modelproto = model.model\n",
-    "print(\"ModelProto IR version is %d\" % modelproto.ir_version)\n",
-    "\n",
-    "# the graph\n",
-    "graphproto = model.graph\n",
-    "print(\"GraphProto top-level outputs are %s\" % str(graphproto.output))\n",
-    "\n",
-    "# the node list\n",
-    "nodes = model.graph.node\n",
-    "print(\"There are %d nodes in the graph\" % len(nodes))\n",
-    "print(\"The first node is \\n%s\" % str(nodes[0]))"
-   ]
-  },
-  {
-   "cell_type": "markdown",
-   "metadata": {},
-   "source": [
-    "### Helper functions for tensors\n",
-    "<font size=\"3\"> Every input and output of every node in the onnx model is represented as tensor with several properties (i.e. name, shape, data type). ModelWrapper provides some utility functions to work with the tensors. </font>"
-   ]
-  },
-  {
-   "cell_type": "markdown",
-   "metadata": {},
-   "source": [
-    "##### Get all tensor names\n",
-    "\n",
-    "<font size=\"3\">Produces a list of all tensor names (inputs, activations, weights, outputs...) in the graph.</font>"
-   ]
-  },
-  {
-   "cell_type": "code",
-   "execution_count": 5,
-   "metadata": {},
-   "outputs": [
-    {
-     "name": "stdout",
-     "output_type": "stream",
-     "text": [
-      "['0', 'features.3.weight', 'features.3.bias', 'features.3.running_mean', 'features.3.running_var', 'features.7.weight', 'features.7.bias', 'features.7.running_mean', 'features.7.running_var', 'features.11.weight', 'features.11.bias', 'features.11.running_mean', 'features.11.running_var', '20', '23', '28', '30', '33', '34', '41', '42', '49', '50', '57', '58', '60']\n"
-     ]
-    }
-   ],
-   "source": [
-    "# get all tensor names\n",
-    "tensor_list = model.get_all_tensor_names()\n",
-    "print(tensor_list)"
-   ]
-  },
-  {
-   "cell_type": "markdown",
-   "metadata": {},
-   "source": [
-    "##### Producer and consumer of a tensor\n",
-    "\n",
-    "<font size=\"3\">A tensor can have a producer node and/or a consumer node in the onnx model. ModelWrapper provides two helper functions to access these nodes, they are shown in the following.\n",
-    "\n",
-    "It may be that a tensor does not have a producer or consumer node, for example if the tensor represents a constant that is already set. In that case `None` will be returned.</font>"
-   ]
-  },
-  {
-   "cell_type": "code",
-   "execution_count": 6,
-   "metadata": {},
-   "outputs": [
-    {
-     "name": "stdout",
-     "output_type": "stream",
-     "text": [
-      "Producer node of tensor 60:\n",
-      "input: \"59\"\n",
-      "input: \"58\"\n",
-      "output: \"60\"\n",
-      "op_type: \"Mul\"\n",
-      "\n",
-      "Consumer node of tensor 0:\n",
-      "input: \"0\"\n",
-      "output: \"21\"\n",
-      "op_type: \"Shape\"\n",
-      "\n",
-      "Producer of tensor 0: None\n"
-     ]
-    }
-   ],
-   "source": [
-    "# get random tensor and find producer and consumer (returns node)\n",
-    "\n",
-    "tensor_name = tensor_list[25]\n",
-    "print(\"Producer node of tensor {}:\".format(tensor_name))\n",
-    "print(model.find_producer(tensor_name))\n",
-    "\n",
-    "tensor_name = tensor_list[0]\n",
-    "print(\"Consumer node of tensor {}:\".format(tensor_name))\n",
-    "print(model.find_consumer(tensor_name))\n",
-    "\n",
-    "print(\"Producer of tensor 0: %s\" % str(model.find_producer(\"0\")))\n"
-   ]
-  },
-  {
-   "cell_type": "markdown",
-   "metadata": {},
-   "source": [
-    "##### Tensor shape\n",
-    "<font size=\"3\">Each tensor has a specific shape which can be accessed with the following ModelWrapper helper functions.</font>"
-   ]
-  },
-  {
-   "cell_type": "code",
-   "execution_count": 7,
-   "metadata": {},
-   "outputs": [
-    {
-     "name": "stdout",
-     "output_type": "stream",
-     "text": [
-      "Shape of tensor 0 is [1, 1, 28, 28]\n"
-     ]
-    }
-   ],
-   "source": [
-    "# get tensor_shape\n",
-    "\n",
-    "print(\"Shape of tensor 0 is %s\" % str(model.get_tensor_shape(\"0\")))"
-   ]
-  },
-  {
-   "cell_type": "markdown",
-   "metadata": {},
-   "source": [
-    "<font size=\"3\">It is also possible to set the tensor shape with a helper function. The syntax would be the following:\n",
-    "    \n",
-    "`onnx_model.set_tensor_shape(tensor_name, tensor_shape)`\n",
-    "\n",
-    "Optionally, the dtype (container datatype) of the tensor can also be specified as third argument. By default it is set to TensorProto.FLOAT. \n",
-    "    \n",
-    "**Important:** dtype should not be confused with FINN data type, which specifies the quantization annotation. See the remarks about FINN-ONNX in [this notebook](0_getting_started.ipynb). It is safest to use floating point tensors as the container data type for best compatibility inside FINN.</font>"
-   ]
-  },
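A minimal sketch of the setter described above, reusing the input tensor "0" from the previous cell; passing the container dtype positionally as the third argument is an assumption based on the text.

```python
from onnx import TensorProto

# set the shape of the graph input tensor "0";
# the container dtype defaults to TensorProto.FLOAT
model.set_tensor_shape("0", [1, 1, 28, 28])

# equivalent call with the container dtype given explicitly
model.set_tensor_shape("0", [1, 1, 28, 28], TensorProto.FLOAT)
```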
-  {
-   "cell_type": "markdown",
-   "metadata": {},
-   "source": [
-    "##### Tensor FINN DataType"
-   ]
-  },
-  {
-   "cell_type": "markdown",
-   "metadata": {},
-   "source": [
-    "<font size=\"3\">FINN introduces its [own data types](https://github.com/Xilinx/finn/blob/master/src/finn/core/datatype.py) because ONNX does not natively support precisions less than 8 bits. FINN is about quantized neural networks, so precision of i.e. 4 bits, 3 bits, 2 bits or 1 bit are of interest. To represent the data within FINN, float tensors are used with additional annotation to specify the quantized data type of a tensor. The following helper functions are about this quantization annotation.</font>"
-   ]
-  },
-  {
-   "cell_type": "code",
-   "execution_count": 8,
-   "metadata": {},
-   "outputs": [
-    {
-     "name": "stdout",
-     "output_type": "stream",
-     "text": [
-      "The FINN DataType of tensor 0 is DataType.FLOAT32\n",
-      "The FINN DataType of tensor 32 is DataType.BIPOLAR\n"
-     ]
-    }
-   ],
-   "source": [
-    "# get tensor data type (FINN data type)\n",
-    "print(\"The FINN DataType of tensor 0 is \" + str(model.get_tensor_datatype(\"0\")))\n",
-    "print(\"The FINN DataType of tensor 32 is \" + str(model.get_tensor_datatype(\"32\")))"
-   ]
-  },
-  {
-   "cell_type": "markdown",
-   "metadata": {},
-   "source": [
-    "<font size=\"3\">In addition to the get_tensor_datatatype() function, the (FINN) datatype of a tensor can be set using the `set_tensor_datatype(tensor_name, datatype)` function.</font>"
-   ]
-  },
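A minimal sketch of the setter, reusing tensor "32" and the FINN `DataType` enum shown in the cell above:

```python
from finn.core.datatype import DataType

# annotate tensor "32" as bipolar (+1/-1) and read the annotation back
model.set_tensor_datatype("32", DataType.BIPOLAR)
assert model.get_tensor_datatype("32") == DataType.BIPOLAR
```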
-  {
-   "cell_type": "markdown",
-   "metadata": {},
-   "source": [
-    "##### Tensor initializers\n",
-    "<font size=\"3\">Some tensors have *initializers*, like tensors that represent constants or i.e. the trained weight values. \n",
-    "\n",
-    "ModelWrapper contains two helper functions for this case, one to determine the current initializer and one to set the initializer of a tensor. If there is no initializer, `None` is returned.</font>"
-   ]
-  },
-  {
-   "cell_type": "code",
-   "execution_count": 9,
-   "metadata": {},
-   "outputs": [
-    {
-     "name": "stdout",
-     "output_type": "stream",
-     "text": [
-      "Initializer for tensor 33:\n",
-      "[[ 1.  1.  1. ...  1.  1. -1.]\n",
-      " [ 1.  1. -1. ...  1.  1. -1.]\n",
-      " [-1.  1. -1. ... -1.  1. -1.]\n",
-      " ...\n",
-      " [-1.  1. -1. ... -1. -1.  1.]\n",
-      " [ 1.  1. -1. ...  1.  1. -1.]\n",
-      " [-1.  1.  1. ... -1. -1.  1.]]\n",
-      "Initializer for tensor 0:\n",
-      "None\n"
-     ]
-    }
-   ],
-   "source": [
-    "# get tensor initializer\n",
-    "tensor_name = tensor_list[1]\n",
-    "print(\"Initializer for tensor 33:\\n\" + str(model.get_initializer(\"33\")))\n",
-    "print(\"Initializer for tensor 0:\\n\" + str(model.get_initializer(\"0\")))"
-   ]
-  },
-  {
-   "cell_type": "markdown",
-   "metadata": {},
-   "source": [
-    "<font size=\"3\">Like for the other tensor helper functions there is a corresponding set function (`set_initializer(tensor_name, tensor_value)`).</font>"
-   ]
-  },
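A minimal sketch of the setter; the replacement value is a random bipolar matrix purely for illustration.

```python
import numpy as np

# fetch the current initializer of weight tensor "33" and replace it with a
# random +1/-1 matrix of the same shape, keeping float32 as the container dtype
w = model.get_initializer("33")
w_new = np.where(np.random.randn(*w.shape) >= 0, 1.0, -1.0).astype(np.float32)
model.set_initializer("33", w_new)
```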
-  {
-   "cell_type": "markdown",
-   "metadata": {},
-   "source": [
-    "### More helper functions\n",
-    "<font size=\"3\">ModelWrapper contains more useful functions, if you are interested please have a look at the [Python code](https://github.com/Xilinx/finn/blob/master/src/finn/core/modelwrapper.py) directly. "
-   ]
-  }
- ],
- "metadata": {
-  "kernelspec": {
-   "display_name": "Python 3",
-   "language": "python",
-   "name": "python3"
-  },
-  "language_info": {
-   "codemirror_mode": {
-    "name": "ipython",
-    "version": 3
-   },
-   "file_extension": ".py",
-   "mimetype": "text/x-python",
-   "name": "python",
-   "nbconvert_exporter": "python",
-   "pygments_lexer": "ipython3",
-   "version": "3.6.8"
-  }
- },
- "nbformat": 4,
- "nbformat_minor": 2
-}