{
  "cells": [
    {
      "cell_type": "markdown",
      "metadata": {
        "colab_type": "text",
        "id": "tqrD7Yzlmlsk"
      },
      "source": [
        "##### Copyright 2019 The TensorFlow Authors."
      ]
    },
    {
      "cell_type": "code",
      "execution_count": 0,
      "metadata": {
        "cellView": "form",
        "colab": {},
        "colab_type": "code",
        "id": "2k8X1C1nmpKv"
      },
      "outputs": [],
      "source": [
        "#@title Licensed under the Apache License, Version 2.0 (the \"License\");\n",
        "# you may not use this file except in compliance with the License.\n",
        "# You may obtain a copy of the License at\n",
        "#\n",
        "# https://www.apache.org/licenses/LICENSE-2.0\n",
        "#\n",
        "# Unless required by applicable law or agreed to in writing, software\n",
        "# distributed under the License is distributed on an \"AS IS\" BASIS,\n",
        "# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n",
        "# See the License for the specific language governing permissions and\n",
        "# limitations under the License."
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "colab_type": "text",
        "id": "32xflLc4NTx-"
      },
      "source": [
        "# Custom Federated Algorithms, Part 2: Implementing Federated Averaging"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "colab_type": "text",
        "id": "jtATV6DlqPs0"
      },
      "source": [
        "\u003ctable class=\"tfo-notebook-buttons\" align=\"left\"\u003e\n",
        "  \u003ctd\u003e\n",
        "    \u003ca target=\"_blank\" href=\"https://www.tensorflow.org/federated/tutorials/custom_federated_algorithms_2\"\u003e\u003cimg src=\"https://www.tensorflow.org/images/tf_logo_32px.png\" /\u003eView on TensorFlow.org\u003c/a\u003e\n",
        "  \u003c/td\u003e\n",
        "  \u003ctd\u003e\n",
        "    \u003ca target=\"_blank\" href=\"https://colab.research.google.com/github/tensorflow/federated/blob/v0.8.0/docs/tutorials/custom_federated_algorithms_2.ipynb\"\u003e\u003cimg src=\"https://www.tensorflow.org/images/colab_logo_32px.png\" /\u003eRun in Google Colab\u003c/a\u003e\n",
        "  \u003c/td\u003e\n",
        "  \u003ctd\u003e\n",
        "    \u003ca target=\"_blank\" href=\"https://github.com/tensorflow/federated/blob/v0.8.0/docs/tutorials/custom_federated_algorithms_2.ipynb\"\u003e\u003cimg src=\"https://www.tensorflow.org/images/GitHub-Mark-32px.png\" /\u003eView source on GitHub\u003c/a\u003e\n",
        "  \u003c/td\u003e\n",
        "\u003c/table\u003e"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "colab_type": "text",
        "id": "_igJ2sfaNWS8"
      },
      "source": [
        "This tutorial is the second part of a two-part series that demonstrates how to\n",
        "implement custom types of federated algorithms in TFF using the\n",
        "[Federated Core (FC)](../federated_core.md), which serves as a foundation for\n",
        "the [Federated Learning (FL)](../federated_learning.md) layer (`tff.learning`).\n",
        "\n",
        "We encourage you to first read the\n",
        "[first part of this series](custom_federated_algorithms_1.ipynb), which\n",
        "introduces some of the key concepts and programming abstractions used here.\n",
        "\n",
        "This second part of the series uses the mechanisms introduced in the first part\n",
        "to implement a simple version of federated training and evaluation algorithms.\n",
        "\n",
        "We encourage you to review the\n",
        "[image classification](federated_learning_for_image_classification.ipynb) and\n",
        "[text generation](federated_learning_for_text_generation.ipynb) tutorials for a\n",
        "higher-level and more gentle introduction to TFF's Federated Learning APIs, as\n",
        "they will help you put the concepts we describe here in context."
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "colab_type": "text",
        "id": "cuJuLEh2TfZG"
      },
      "source": [
        "## Before we start\n",
        "\n",
        "Before we start, try to run the following \"Hello World\" example to make sure\n",
        "your environment is correctly set up. If it doesn't work, please refer to the\n",
        "[Installation](../install.md) guide for instructions."
      ]
    },
    {
      "cell_type": "code",
      "execution_count": 0,
      "metadata": {
        "colab": {},
        "colab_type": "code",
        "id": "rB1ovcX1mBxQ"
      },
      "outputs": [],
      "source": [
        "# NOTE: If you are running a Jupyter notebook, and installing a locally built\n",
        "# pip package, you may need to edit the following to point to the '.whl' file\n",
        "# on your local filesystem.\n",
        "\n",
        "!pip install --quiet --upgrade tensorflow_federated\n",
        "!pip install --quiet --upgrade tf-nightly"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": 0,
      "metadata": {
        "colab": {},
        "colab_type": "code",
        "id": "-skNC6aovM46"
      },
      "outputs": [],
      "source": [
        "from __future__ import absolute_import, division, print_function\n",
        "\n",
        "import collections\n",
        "\n",
        "import numpy as np\n",
        "from six.moves import range\n",
        "import tensorflow as tf\n",
        "import tensorflow_federated as tff\n",
        "\n",
        "tf.compat.v1.enable_v2_behavior()"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": 0,
      "metadata": {
        "colab": {
          "height": 34
        },
        "colab_type": "code",
        "executionInfo": {
          "elapsed": 420,
          "status": "ok",
          "timestamp": 1570209774342,
          "user": {
            "displayName": "",
            "photoUrl": "",
            "userId": ""
          },
          "user_tz": 420
        },
        "id": "zzXwGnZamIMM",
        "outputId": "a339c5a0-a5e0-4b52-d92e-c6c2b1c333d7"
      },
      "outputs": [
        {
          "data": {
            "text/plain": [
              "'Hello, World!'"
            ]
          },
          "execution_count": 139,
          "metadata": {
            "tags": []
          },
          "output_type": "execute_result"
        }
      ],
      "source": [
        "@tff.federated_computation\n",
        "def hello_world():\n",
        "  return 'Hello, World!'\n",
        "\n",
        "\n",
        "hello_world()"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "colab_type": "text",
        "id": "iu5Gd8D6W33s"
      },
      "source": [
        "## Implementing Federated Averaging\n",
        "\n",
        "As in\n",
        "[Federated Learning for Image Classification](federated_learning_for_image_classification.md),\n",
        "we are going to use the MNIST example, but since this is intended as a low-level\n",
        "tutorial, we are going to bypass the Keras API and `tff.simulation`, write raw\n",
        "model code, and construct a federated data set from scratch.\n"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "colab_type": "text",
        "id": "b6qCjef350c_"
      },
      "source": [
        "\n",
        "### Preparing federated data sets\n",
        "\n",
        "For the sake of a demonstration, we're going to simulate a scenario in which we\n",
        "have data from 10 users, and each user contributes knowledge of how to\n",
        "recognize a different digit. This is about as\n",
        "non-[i.i.d.](https://en.wikipedia.org/wiki/Independent_and_identically_distributed_random_variables)\n",
        "as it gets.\n",
        "\n",
        "First, let's load the standard MNIST data:"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": 0,
      "metadata": {
        "colab": {},
        "colab_type": "code",
        "id": "uThZM4Ds-KDQ"
      },
      "outputs": [],
      "source": [
        "mnist_train, mnist_test = tf.keras.datasets.mnist.load_data()"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": 0,
      "metadata": {
        "colab": {
          "height": 34
        },
        "colab_type": "code",
        "executionInfo": {
          "elapsed": 419,
          "status": "ok",
          "timestamp": 1570209775660,
          "user": {
            "displayName": "",
            "photoUrl": "",
            "userId": ""
          },
          "user_tz": 420
        },
        "id": "PkJc5rHA2no_",
        "outputId": "391477e3-180a-4d76-c1da-116c0fb3bf46"
      },
      "outputs": [
        {
          "data": {
            "text/plain": [
              "[(dtype('uint8'), (60000, 28, 28)), (dtype('uint8'), (60000,))]"
            ]
          },
          "execution_count": 141,
          "metadata": {
            "tags": []
          },
          "output_type": "execute_result"
        }
      ],
      "source": [
        "[(x.dtype, x.shape) for x in mnist_train]"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "colab_type": "text",
        "id": "mFET4BKJFbkP"
      },
      "source": [
        "The data comes as NumPy arrays, one with images and another with digit labels,\n",
        "both with the first dimension ranging over the individual examples. Let's write\n",
        "a helper function that formats it in a way compatible with how we feed\n",
        "federated sequences into TFF computations, i.e., as a list of lists: the outer\n",
        "list ranging over the users (digits), the inner ones ranging over batches of\n",
        "data in each client's sequence. As is customary, we will structure each batch\n",
        "as a pair of tensors named `x` and `y`, each with a leading batch dimension.\n",
        "While we're at it, we'll also flatten each image into a 784-element vector and\n",
        "rescale its pixels into the `0..1` range, so that we don't have to clutter the\n",
        "model logic with data conversions."
      ]
    },
    {
      "cell_type": "code",
      "execution_count": 0,
      "metadata": {
        "colab": {},
        "colab_type": "code",
        "id": "XTaTLiq5GNqy"
      },
      "outputs": [],
      "source": [
        "NUM_EXAMPLES_PER_USER = 1000\n",
        "BATCH_SIZE = 100\n",
        "\n",
        "def get_data_for_digit(source, digit):\n",
        "  output_sequence = []\n",
        "  all_samples = [i for i, d in enumerate(source[1]) if d == digit]\n",
        "  for i in range(0, min(len(all_samples), NUM_EXAMPLES_PER_USER), BATCH_SIZE):\n",
        "    batch_samples = all_samples[i:i + BATCH_SIZE]\n",
        "    output_sequence.append({\n",
        "        'x': np.array([source[0][i].flatten() / 255.0 for i in batch_samples],\n",
        "                      dtype=np.float32),\n",
        "        'y': np.array([source[1][i] for i in batch_samples], dtype=np.int32)})\n",
        "  return output_sequence\n",
        "\n",
        "federated_train_data = [get_data_for_digit(mnist_train, d) for d in range(10)]\n",
        "\n",
        "federated_test_data = [get_data_for_digit(mnist_test, d) for d in range(10)]"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "colab_type": "text",
        "id": "xpNdBimWaMHD"
      },
      "source": [
        "As a quick sanity check, let's look at the `y` tensor in the last batch of data\n",
        "contributed by the client at index `5` (the one corresponding to the digit `5`)."
      ]
    },
    {
      "cell_type": "code",
      "execution_count": 0,
      "metadata": {
        "colab": {
          "height": 101
        },
        "colab_type": "code",
        "executionInfo": {
          "elapsed": 433,
          "status": "ok",
          "timestamp": 1570209778170,
          "user": {
            "displayName": "",
            "photoUrl": "",
            "userId": ""
          },
          "user_tz": 420
        },
        "id": "bTNuL1W4bcuc",
        "outputId": "43b62cf9-777c-48f5-d568-59aa41b85bf9"
      },
      "outputs": [
        {
          "data": {
            "text/plain": [
              "array([5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5,\n",
              "       5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5,\n",
              "       5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5,\n",
              "       5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5,\n",
              "       5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5], dtype=int32)"
            ]
          },
          "execution_count": 143,
          "metadata": {
            "tags": []
          },
          "output_type": "execute_result"
        }
      ],
      "source": [
        "federated_train_data[5][-1]['y']"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "colab_type": "text",
        "id": "Xgvcwv7Obhat"
      },
      "source": [
        "Just to be sure, let's also look at the image corresponding to the last element of that batch."
      ]
    },
    {
      "cell_type": "code",
      "execution_count": 0,
      "metadata": {
        "colab": {
          "height": 275
        },
        "colab_type": "code",
        "executionInfo": {
          "elapsed": 613,
          "status": "ok",
          "timestamp": 1570209778830,
          "user": {
            "displayName": "",
            "photoUrl": "",
            "userId": ""
          },
          "user_tz": 420
        },
        "id": "cI4aat1za525",
        "outputId": "ae6c35e8-c05e-4d9d-f5db-e510c0709da3"
      },
      "outputs": [
        {
          "data": {
            "image/png": "iVBORw0KGgoAAAANSUhEUgAAAQUAAAECCAYAAAD3k8IpAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAIABJREFUeJztnU1sm1X2/7+245fYSRynVdOoQCvKorBCIMSis6EDEhIF\nqYvRoBZRZjEbVJAqVVUlQEAHFgGNGKkVCHZddDpCM2KGgDQCqavZDNJoVlMWvwqVlpC20MRx/BLb\nsZ//gv95eu7xvY/t+D05H+nqeewmzvVTP1+fc+4554Y8z/OgKIry/wkPegKKogwXQyMKuVwO586d\nQy6XG/RUrOj8OkPnt3n6PbehEoXz588P5X8KoPPrFJ3f5un33IZGFBRFGQ5UFBRFMVBRUBTFoGNR\nuHbtGp5//nk8/fTTeP7553H9+vVNvU4kEsGePXsQiUQ6nVJP0Pl1hs5v8/R7bqFO8xSOHz+O3/zm\nNzh8+DA+//xz/O1vf8OFCxe6NT9FUfpMR6KwvLyMp59+Gv/+978RCoVQr9fx+OOP46uvvkImk2np\nNX71q1/hhx9+2OwUFEVpk3vuuQf/+te/nP8+1smLLy0tYXZ2FqFQCAAQDoexa9cu3Lx5s2VR+OGH\nH/D99993Mg1FUbpIR6LQKrlcrmGNNRKJYG5urh9/XlEUC0tLS6jVasZzU1NTnYnC3Nwcbt26Bc/z\nfPfh9u3b2L17t/FzFy5cwPnz543n9uzZg8uXL3fy5xVF6YBjx45hcXHReO7EiROdicLMzAwOHDiA\nhYUFPPfcc1hYWMBDDz3U4DocP34cR44cMZ4bxiivomwnLl68aLUUOl59+O6773DmzBnkcjmk02nM\nz89j3759Lf/+vn37NKagKH1k7969uHbtmvPfO44p3H///fj00087fRlFUYYEzWhUFMVARUFRFAMV\nBUVRDFQUFEUxUFFQFMVARUFRFAMVBUVRDFQUFEUxUFFQFMVARUFRFAMVBUVRDFQUFEUxUFFQFMVA\nRUFRFAMVBUVRDFQUFEUxUFFQFMVARUFRFAMVBUVRDPqy74MyOtDGPqFQyBj83wDA1e/X8zz/3+R5\nr+ba7s/L92R7f7a/Ybs28jXq9To8z/OP8hyA8zgsqCgoPqFQCOFwGJFIxHp0iQL/cNfrddTrddRq\nNf+cRjc+/EE3pu3nbO/PNZqJBv2cvC507nkeqtUqNjY2sLGx0XDOr4PtOCyoKCg+oVAIkUgEY2Nj\nDSMajfo3R5AgbGxsoFar+TcDnXfzQ083MA0pWC4Lgm5eeo/yXL6uHEHXZmxsDPV6Hevr6yiXy/7g\nj0kYSDBrtRpCoZB/HBaLQUVB8aEbbGxsDLFYzB/RaBSxWMz/NrSZv57noVaroVqt+iMcDqNSqaBe\nr3ftQy+FgH/T07+7oJs6Go36NzI/58JgO0YiEet1oVGr1VAsFlEoFFAsFo1Bv8+FMhQKYWNjQy0F\nZbihGycWiyEejyORSPhHmyjwUavV/G9F7mPTN2Gn2L65pfkf9Hek2MkxNjYW6GJEo1HE4/GG60LP\n1Wo1rK2tYW1tDblcDmtra4hGowiHw/7WipFIBNVq1bC6uima3UBFQfGhDy19AyYSCYyPj/tjbGys\nQQi4QFSrVUQiEeMDT9+K3Zwjv3G5fx8kCqFQyCp2/JxuYFu8IBwOIxaLGddDjo2NDWSzWWSzWSQS\nCd+6AuBbBJVKpUEQyMoZFlQUFB/pPpAopFIppFIppyjQkB94ciekz98p8hucTH+XKNBzUuy46PGb\nmMce+ON4PI5UKoVkMulfE/64Wq0ilUohkUgYFsLGxgbK5bJvEdD1qdfrvkvRzevTKSoKig8PptE3\nKn3gJycnEY1GjeU1udxWLpcBmIIwNjbWVVGwWQo0Z5so8Mdk/pMYJJNJY5Ao8CAkH4lEApOTk5iY\nmMDExIR/TsdKpeIUhGKxiGq1
6l8fcqs2NjbUUlCGF24p8BuIhCEWizldBxIF+mblz/PAGj1P2Pxo\n15IgDxTajjbx4Y9jsViDEPARj8cNEZDi0EwUaLWBgo3cAolGo23HQAaFioLiw6PsUhjGx8cRj8ed\nyTwAUC6X/RuA36x0U5XLZavbQdZGszwCsmBoxUCeN1uajEajRgxAxkzoxnW5EGQ50bWgn6cgocw7\n4IOWIW2iOmyoKCgGdDPQjcZFYXx8PHDJjouCbZmP/GqZxEODuy+2I4kUrRbIcwrqEVIgxsbGnEHG\nRCJhzNcWbKR4BP0Oj2UAcAqCbQyzOKgoKD7SUqAbjlyIZDLpDMJFIhFUKhVfDGwJQevr60biDh3p\nPBQK+YLCB1kdNBe6oeUNPjY2ZrwXec7zDLig8CVJbtoHLUlyl0BaCi7RGwVBALogCocOHfL9plAo\nhFOnTuHgwYPdmJvSZ5q5D8lk0prRR6NSqVgFgY6U4EMJPPycAm7yZuWP6VtaDnIFSBRccQWyNnjS\nEn8sRUzOX/6eFIVWBKGbKd+9omNRCIVCOHfuHPbv39+N+SgDhn/46aYkQUilUg03BXcXKpWKYUHw\nG8zzPESjUT/92VYfEA6HDfOc3/g0D9cgwaL3wN8PwZcvm6U521KeuWBKawlAQ5wkyEqQFsMw0bEo\nDOObUjaHzX3glgKtQLh8eikK/IYkUeBp0CQKNCKRSNMbP2jw+gx6P/y8WSCTp0rLIKpNLKTotRJL\n2BbuAwCcOnUKnufh0UcfxcmTJzE5OWn8ey6XQy6XM56LRCKYm5vrxp9XugQXBfr258uSyWSyIb2X\nHyuVSkPFIeF5nu9iVKtVVCoV45y+fckicR2DRjNRoKNryTNo5ULiqv9oVxjk6/WTpaUl1Go147mp\nqanOReHSpUuYnZ1FtVrFu+++i7Nnz+L99983fubChQs4f/688dyePXtw+fLlTv+80mVkYhCvBORW\nATfvyacnS4C7B7yEmlwMlzBEIhGnELjOpSjQe+Dvx3ZuIyhbs5WYwdraml8MVSqV/ApJeo8UR5Fl\n5YOyFo4dO4bFxUXjuRMnTnQuCrOzswB+WQM+evQoXn755YafOX78OI4cOWI8J5ePlNGH107Qmj4v\nm47H44HuQzgc9vMAbK4Df0x5AjJpqZNkIMpA5DevfCxvbP44n8/jp59+wp07d7CysoJcLod8Po9S\nqWSIAxfMQboQFy9e7L6lUCqVUKvVMDExAQD48ssv8eCDDzb83NTUFKampjr5U8oIQKJAKwVcEChe\nIIOLfNDqg1xdcD1Hy4iuuod2BYKnZ9usGfq2l5YQnRcKBSwvL2N5eRkrKytYXV1FPp9HsVj0RcFm\nQQ1KGFzue0ei8PPPP+PVV1/139z+/fvx5ptvdvKSygjDVy7i8bghCOQ68BvJtiTJA5i2RCV+bDW9\nuVXom79SqWB9fd03/+mcf9PbLJ5CoYDV1VW/UlJaCvSzNjdimOhIFO6991589tln3ZqLMuLwzD9K\nRqKkn1gs5n/TyqQlOnclL7mGtBSIzboQ5D6QKMhGKfRt74qLFItFv58C9VSQloJ878OYs6AZjUrX\n4DEF4G5eAJn/tr6NrjRnV6qz/DduKXRaXMTdBxKFfD6PfD6PtbU1lEolVCoVv5EMPyeLolAo+MFG\nfs7bsTVbiRg0KgpK1yD3AbgrCDI45yq7pptC1hzIYfs3aSlsFpulkM/nsbq6ilwuh0Kh0OBS8HM5\naAWCfq5arTrf/zChoqB0Dfqmp14CQV2a6MjPm+UStJJr0AlkKZAFQO5ALpfDysoK1tbWjJu9VCoZ\n59y94K4FDR7pl9dimFBR2Ia4knWCzHTZcERmAdJrdLokuNnfaefoGqVSyY8HrK6u+oMChxQ05ELA\nxYGvLthWWIZRAGyoKGwzggp+bFmKcvBaBy4OvULeSEFZgDyhyHUelGtQLBYNIeDnNveBjrb8g2
GN\nF7SCisI2gsRAftPzJiJBacy8ZLjbQT6J7eZvdrSVZPPHfClRLitWq1WUSiW/CzOl5vPOzHIFQroL\nruVGFQVlaJEpzNI1cJUm84Qh2UCFVwl2i3ZcAP64Xq9bk6LoZuU3sm0FoVQq+asNtrG+vm4kNsnh\nEgQVBWWoIWGw9UMIshJo8HhDPywF283vCl6SNeC6acncl3EBOpd5CXJjl3K5bBUcbiWMwpJjM1QU\nthG8CpI3UpEVka60Yt7YlL9ONwVBug0uUbCVIHOXwGbi2/IIeD6BLYDIj7SCEFQLwec2iq4DoKKw\n7bB1V+Kdjbgg2BqdyG7ErlLpTnBZAlIM5Dm3FGx7OdISIyUjySMtK7qSk6iWw1Ul2WwJdlRQUdhG\nBAmCLId2ndPryFyBXiJFwHXklgJ3FSiBKJ/PI5fL+clI8kh7M7gG5Rm4BIvO5dxHDRWFLY5sR8ZL\nm2Wb83Q6jcnJSb8/AZUo8+Ij12u3StDNZPv2DWpSIp+jFQSXCxAkCqurq37VrytYOWyFS71CRWGL\nYktQomIlaq02MTFhHKenp7Fjxw5kMhlfHKghqlxhsAlCs29FV+6AzCGQZcn8aFt2pHOyEKTbQOe8\nYCmfz/txhKAKxlELEnYDFYUtiKvVGO91MDEx4fe5SKfT/jGTyWB6ehpTU1N++7WgNmetYvP9ZaCO\nzH+5euDKA5CPZWqxrGoslUqBxUpkDQxLZ6RBoaKwxXAJAsUTuKUwNTWFTCaDmZkZZDIZ332Ympry\nLQW+GzN//VaQN5NMIpJLerY8Aj5srd74uSsWQKIilyRp0Gu7WqVtN2tBRWEL4ioYonhCIpEwRGHH\njh3YuXMn0ul0Q49E7j5sRhDkEqPMLOTFQ5RA5Co64nkCNlFpNrh7IVcZuKUgx3ZDRWGLI/csIEth\nYmLCdxd27tyJ2dlZpNNpa/KSdB+a0SwNmd+gcvmQJwvJ5CHKKHT1ebS5JDJW4bImeMzClQuxXVBR\n2ILYNjGRMQVuKZAoTE1NBe7o3A6u1GRKRSbrgPcf4E1NaBQKBf+8VCpZu0DT0Wb68/OgxCPZ62GU\n8ww6RUVhi2HbAEW6DzKmsHPnTuzatQvpdNpaPdlOJaSrkEkGGaXLQEFA3srMVpDkCiJS2bLt7/Lz\noCHnb3u8HVBR2EI02wVJ7vjE90+YnJz0u3JvlqBcA8/zfAGw1RiQleASBEoukkLABWI7+v+9QEVh\niyH3O+TVjLR3gmyP3q0qR1uVIg8ClsvlhpgBjx3QkO4DL1m29S3Yjt/mvURFYQshKyDlzspSFIJa\npG8GSjWWS4t05IFEeZTnfPCCJBIamWCkdA8VhS2E3AuSbwYbi8X8JUbqoCR3W+4UCuZRAFGWKMuk\nIW4d8BUGV+NT274R2zXrsJeoKGwxpCjwsmebpdBN94FbCnw1gd/8QYO3UJfxAspRcJUrK91DRWEL\nId0HCixS4RPfNVqKQjcsBbnDklxN4EuMMoZA/Q9tLdO4y7AVmpgMOyoKWwib+0CrDZShKAONm8lB\ncMEtBb7EyJufSmHgx3K5HNhjUZZK89UOpXuoKGwhbKLAlx9JFHhX5l7GFMh9yGazWF5e9jdclUlJ\nNCqVSst5BLZ8CKU7qCiMIK6NUHhLNZ6HQDkIVOTEhcG2F6OLoGQgAEbBkcw7oJ4F0n3g59VqtUdX\nTGkHFYURwpZlyAeVRPNKR36k8ujJyUk/vhCLxdoSBdnPgPdC4DEE2e6M5xzw1QQNFg4fKgojhqs9\nOyUncVGQ/RJoTExMGL0S2hGFoM5EJAJSGGSbdLmBiorCcKGiMELImAEvWopGow2ikE6nkU6nMT09\n7VsI1GUpmUwaAcd2LAUKJso9E11CwC0FvsTIqxOV4aFp2Hl+fh6//vWvceDAAVy9etV//tq1a3j+\n+efx9NNP4/nnn8f169d7OlHFXHK0BRKptRoXhenpaUxPTy
OTySCTyfjuxGYsBQANHZNplYHHD6S1\nINufycYmaikMF01F4amnnsKf//xn7Nmzx3j+zTffxAsvvIB//vOfOHr0KN54442eTVK5C69t4EuO\nXBS420CCwLsrcfdhMzEFyhugFQYZS3BZDDKmoO7DcNJUFB555BHMzs4a/2nLy8v49ttv8cwzzwAA\nDh8+jCtXrmBlZaV3M1WaWgpkJUhLgUSBei92Igq8FyKverTFE6QwcFHg5c6aZzBcbCqmsLS0hNnZ\nWf+DFA6HsWvXLty8eROZTKbh56kUlhOJRDA3N7eZP79tcSUnudwHKQzJZLJhdyc630xMgZYfbZYC\nFwm+2Qq9znZvZDIMLC0tNcRzpqam+hNovHDhAs6fP288t2fPHly+fLkff37LQI1SyHUgK0H2RJCt\n2ymwmEwmG7ox8XNXghAdbRYCCQLvgcCtAl7hqHkIw8WxY8ewuLhoPHfixInNicLc3Bxu3boFz/MQ\nCoVQr9dx+/Zt7N692/rzx48fx5EjR4znIpHIZv70tkJ+e5PrwC0EGUegIKKtb4KroSsnaBcmHkOw\nJSatrq5ibW2toXU6tTpThouLFy92binQf+zMzAwOHDiAhYUFPPfcc1hYWMBDDz1kdR3oD01NTW1y\n6tsT12YuNiuBuwtkHcisRW4ZuLZ7I3PetdkKtw6kKGSzWV8UqJ+iisJw43Lfm4rCO++8g6+//hp3\n7tzB7373O2QyGSwsLOCtt97CmTNn8OGHHyKdTmN+fr7rk96uSEHgZj4FGbmlMDk56ScoUSoz5SHw\nRip87waXOMjux/zYTBRyuZxRBq2iMJqEvAH/b+3btw/ff//9IKcwVARt5pJMJrFjxw7MzMxYj7Rv\ng4wr0HksFnMKDgB/P0ZZskzn+Xwey8vLDWNlZQXLy8vI5XIN+zXwQY1VlcGyd+9eXLt2zfnvmtE4\nhNgEQe7dIC0Fch9s1ZC8EtLmlnB4K3TZQt1lKWSzWaysrPjlz3JXJ7UURgsVhSHGtn8DuQ88YYm7\nDyQGfCMXV4NWKQy8E7Pc1l2uONjch3w+b91wRUVhtFBRGFKkIHBRcK0+pNNp/994b8ZWm7PyQCMV\nOfGmq0Fl0dlsFoVCwbozk2YsjhYqCkMEdxEoMMjPyTWQg+ci8A7OtuXIZnArQW7Iynsf2Doul0ol\n55KmisLooKIwBPDMUFn5yLduox2dqOKRbwLLW7bLLs2tFjtxK8HWfJWSk/jqAu+wLNukabbiaKKi\nMEDkFm/kHlD3ZT5isZi1DFq2VyNR4MuQrQoDr20IqoAkUaANX3lRk607kzJaqCgMCNsqAFkKFC/g\nI5lMBloKFFTkTVfa7b9oEwWyFCiN2bZrk223JhWD0UVFYQDYcgUAM42ZBxF5MJEqHclSkO4DL3Li\nlkIryCpI3qadAorSUuCVjmopbA1UFAaETRgojdmVg0CiQMuPMp2ZkpOk29CJ+0CWwurqqmEpuGIK\n9Dq2ozIaqCj0GRlHoCPdzNx94NYBjyXIdGbuPtiWMjcTaOR7P3L3wVb9KBulaPv10UZFYQBIYbDl\nIci6hpmZGb9rEh/SfbC9fqtI94EsBV4azdOWZfckbZayNVBR6DNSBKSVYOumxN0H3hvBlrXYCfyb\nXm7NZktE4i4PWSmu1+SPbTEHtSiGBxWFPiJrGGSCEokBX3GQCUquHZ66NT/XtnPJZBIbGxu+CMit\n7qPRaEM6szznYmPbF1KFYThQUegzrn0bxsbGDFFwZSzKDWJbTWFuBS4KsjHs+Pi4UxQoyElVkK5A\nIy/JpkGP1fUYHlQU+oiMHfBsRRIFKQwkBnKvhl7sBQncXRa19YCs1+tOQYjH49jY2LBuK0ePZeVl\nOBxGpVJBvV432sEpg0VFoc9wS0HWKdgSlrilkEqljN9xVT9uFhItVwt5EoVoNIpKpYJYLGYcyX1w\nDVrV4HMmV6JboqZ0jo
pCH5E3Hd14NLilQD0RpPtAQsCPvXAfSHjIUlhfX4fneYaIUZv2eDxuiIKt\nBoIauNgEQZuvDBcqCn2GBxf5DSZdB1dcwbaxbLfdB5elAMAQhFgs1tA3IahKslKpWAWhm/NXOkdF\noY+4LAUqenKJArcUbB2Zujk/W0whkUigWq0iFAr5AkCBRVszFb6awM/L5TIAUxC6HRNROkdFoY+4\nlvxIBGjJkQZZD9zF6PX8eLyDBIFWB8LhcMNO05S8JIuipDB4nof19XVnTYbneahWq9bcBVdeg+Y5\n9AYVhT4ig3i8pRpvssqzFHmTlH7MjydRyRWFaDRqLCPKcy4E0nWo1+sol8vOYGoymfRXIlzCYstx\n4OcqDN1BRaHPcCtB7t1A9QyujVx6jRQtcgfITSGRoHwDOWxxBD5IFFz5GOVy2drOjQ9pqZD1ohWZ\n3UNFoY/I+gbZfNXVTalfPrcUBUooouer1ap1kxg6Bi1HUqCRi4FNFGxuCX+OBgUtaU4ak+geKgp9\nxOY+UDWktBQG6T6MjY3537o8BmJLTZYDcMcAuCiQIFDPx0Kh4Hdy4glO/DE1kS2Xy8b+l2RBaAJU\nd1BR6CM2n91mKQxSFEgA+EoJBR1teQjSZSBsAUASBbIMisWiscJSKpWMm58fqekLz8sgC4VWRug9\nqDB0hopCH3FZCjb3gdc39NN9IEuB5tqKewAEd1mi56vVKsbHx30xoGMqlTJ6NPCW8nReLpd9QaDX\n5MuaslxchWHzqCj0kaDVB95ebdCWArkMruU/eo5juwnlnKvVqi8GNEgQeKt4ue1cPB5HqVTyS8N5\nHYUro1OFYfOoKAwYV78BWWZcq9WMGodm27+183ddN0+zv2G7EYOOZObLSlGe2RmPx7G+vm50si6V\nSlaria6RjCm4Vj+U1lBR6CP8G65cLvtbsK2trSEWi/kJPDygB9z9BufLg7ZmLa3OIWjQz8gj/W2a\nT9C5a36yNJsHL+n1bdmeZFElEgmjIEx2rV5fX7cuk9K5CkNrqCj0EaoUlPsqkJtAosGDdvwm8jzP\nWvdAP9dMGGzBwWa5BVIwbA1h5c3PMxbD4bCfDQk0ulD0urKLE9VXJBIJP+BIosD7UPA0aUqu4jtl\nU5akJje1jopCH+GWAvnN+Xze94v5jQo0pkUDMG4GfkO1MwfbUmKrKwuuG58/R3PjgsXfDwUzuYXA\ne0yQhcCXJUkg+A5YfDs8EgW5asFdDKU1morC/Pw8vvrqKywuLuKLL77AAw88AAA4dOiQHwwLhUI4\ndeoUDh482PMJjzLSfSiVSkZvRfmNzIuT4vE4QqGQcTPRKgEJSquWAr9RpKltSyumx3xetsHFit4T\n+fnS8uEuiawHsSUubWxsYHx83OghIXe/otcolUr+cyoI7dNUFJ566im89NJLOHr0qPF8KBTCuXPn\nsH///p5NbqtBokDfZMVi0SglljEEmRJNNz+3EGi9vtWmrTJ4KVuiufonclGQG87QkQSEC4Ks5KT3\nRuf0+67UaT4SiYRTEPjcuCBoaXb7NBWFRx55BEBwV16lNXiyDWXl0fO8EpH71hRoowpDLgh0c7T6\nf+Fa1SDf25a6zI+A6b7QqNVqxtxoftRRiVsy0pWIRCJWAbK5OePj49YqS2mJ8GtKjV1UFFqno5jC\nqVOn4HkeHn30UZw8eRKTk5PWn6M9AziRSARzc3Od/PmRQ8YUgLsfXqoQdHVTHh8fN/Z14DdVO+Is\nRcFW/uwaPBgom8/KQB7NUQoWz4VwJT+5zkulUoOFYFty5ILAU6IVk6WlpQbXampqavOicOnSJczO\nzqJareLdd9/F2bNn8f7771t/9sKFCzh//rzx3J49e3D58uXN/vmRRZq03L8PhUJG7wS+gzTFEvi/\n8yausVisqQtRr9eNegJZW8DdCdsAYAT65JFWBmQfSTq3rVwApsjRY9eRgo+2Hg4yXkJi
S1mR9B5c\nS6/bjWPHjmFxcdF47sSJE5sXhdnZWQC/1NgfPXoUL7/8svNnjx8/jiNHjhjPdbpxySgiTWKedAPA\njzOQINDzZF3k83lDBORo1sBVioKsPJSCIIORAKxWAj/axIB3rOZxCNu5a3WDznktxvj4uNHxCYBh\nQXBLqFKp+PEX23IsuUfbiYsXL3bPUiiVSqjVapiYmAAAfPnll3jwwQedP0+7GylosAz485FIxBcE\nMq95DCKZTBpt1fl5O6LAXQYuCPybN8h9kC4ED4raNolxiQVvcS+TkfgAzFZ20WjU6AgF3F0VoffJ\ney9wUXBZQNsxRuZy35uKwjvvvIOvv/4ad+7cwUsvvYRMJoOPPvoIr7zyiq+w+/fvx5tvvtn1SW81\npHnLRYHiCcVi0XcrZPYjxRVIDOSxFVGQMQQuDi4LgS/r2cTAlrJsEwf6hueuDz9yi4KXb3P3gfIY\nSMAoH4KE1CYI5D5Q8hgNblUod2kqCq+//jpef/31huc/++yznkxoqyOTaUgo6LFNECjzkW4cuV8E\n99mDIFEI6rFoW3lotvrARSHIUuBBU35MJBKo1Wq+xcODlnL1grsPgLl5TTgctgoCFwVqziLrJ7SA\n6i6a0dhnuAhwgaAPPnDXzCdBoMIgci1s38CbEQXZ3swlCtyykXEA/jgoyEgmP++fQC4RtVOTORF8\nhQUwG77wx5THEYlEnIJALi93MzTb0Y6KQh/hmYF85YHXCPCS4HZ98lZEgQuBTRRcCUx047gChEGW\nAhcFKhGnNGQSBFvmo8x+5KLABYHmTqJgE4RisWi4DPx9ySSo7Y6KQp/hwgCY/jJ9E9PauswY5P42\n97/pvB1RkMuPUhRswgCgYU5yfkFBxmQy6fdLkILATXpZTSkzPUlAuatBWZ0uQUgmk6hWq8b/Ab3v\nZtdtu6GiMAC47yr9WDJxbVWI/NvYNloVBT5sy5CuwZcHpTBw0XJZMryFO88toHOebi0LpahFHGCm\nSstrJzfRKRaLmJiYMCwF+ptkVWhyk4mKQo8J+rC5AltBz7uWM1stiApaYZBJQK7X4DczR5riMq2a\nVg944pYMqJJrQV2agbvBRKCxfwN/jrsXfLdsek1eucn/tqZBm6go9AhZBMSxJdq4kOm7dAPw4Fi7\nomBzD2yZgVIc+N9wiYMUK1tMgs+FCwL1YuSCwJchuQUl053pXK50UECTvyYvTOtnu7tRQUWhB7jS\ndCVcHGwFZ/LfSBT4z8iAWRDcVHe5CDZRcM2NB0oJvjELnxsFDCmrUAoC7WxtsxDoBufuCn3ry1iE\ntBRIELj+tTMDAAAWHUlEQVTokSCUy+W+NsYdFVQUeoT8BnNhsxJ4NF7Cv5Up4YmOrYiCFAZ+bCYG\ncr7ccpG/xwWBbl5+U9qWXNfX1xsEgTe4lWnSdH35igUvJuNp0Hw+JAilUsnPpFRRuIuKQpeRboNL\nFKQpLkXA9RwduSDwYqJm2G58nizExaCZpWCbq2tuPMW4Wq0iGo1ifX3dWKmgylHuMpAgVCoVRKNR\neF5jvwZeSMUtBUrMIqGh/A9alYjH4+o+WFBR6AGuSkBJsww6m0hwX5t/U7f6obbd9PxvyaPt97lY\ncTeHP29bPaEaBFuq9NjYGNbX1xtchmQy6S9hUhYjf/886UsGGnmhFImU7I1JloJyFxWFPiAj8kEC\n0SzoKH+3k2+4Zu5Bs99zxURcQVZbkJBXQtIOUNxCmJyc9EWB8gzotWydqHigUboi9XrdFwTaxFdj\nCo2oKHQRvmZvS/CxBfi4v8tpdbmylZu417Q7J1cgNhKJGKsQvHeCa9lUihHP54hGo8bPJxKJht28\nZbBSUVHYNLZvw2b5/wAC+xnwJbNWlitHGVsg1uZy2EYnf48/ViGwo6KwCVwJNJSLT9F0+maiAcD/\nFuSpuACMYB8RtAoxqgSJALeyggShVZGQ/65C0Boq
Cm0S9OHkFXuUasvTbj3PM/ZNpAAXRcVJAORx\nK9LMGrCJg/w9/lq2c/m3Wv357Y6KwiaxfZB5V6BUKoWJiQljeJ6HfD5vRLxpic7m1wYlN40yQSIQ\nZCXYBML1+s2eVzFwo6KwCeSHWVoKZB1MTk5iamoK6XQaU1NT8DzPKHGmAqByuWyslXMx4I+3As1W\nI1yrEy5BCBKAdoREuYuKQpsE+cN8OY22l5+enkYmk0Emk/HLewGzkYprO/WtJAacIJchSAw6vcFd\ngqJCYaKisAlcpi93H8hSSKfTmJmZwY4dO/zVBXIZ1tfXUSwWA9fKt4sw8HyFVsRAXqtWXSwVgOao\nKLSBLYbAfWFKmpFluyQO9XodlUrFFwOeQCM3VeGZi8DWEAfZxk0OCszyfAJ+bYJiDkG4RESxo6LQ\nJvRtJlN0KZ5A7oMc4+PjqNfr/mO5kQt9+HnvAZ6zP4rI+AFZUrLhLJ2n02lkMhlMT0/7bdtIJGx7\nR8jyaaU7qCi0ATd1bV2QeE4C5SnwrsUkCvxn+M0RjUaNPRaA0VyBcPnsZE1xYeTnJArpdBqTk5NI\npVJWa4r3hlRB6D4qCm3C4wey/ViQILRqKZDwyL4Eo4bN/5e9DlKplJ/DkUwmMTU1hZmZGX+1xiYK\nPH2cxyCU7qGi0CbSfeAmsMxm5MLARYFnO3JR4H0IuSCM0ofeFhDkMRgeiKX8DXIV0uk0pqenffeB\nREOKgisoqXQHFYU2cLkPNkGwWQy1Ws1qKfA6CcAUBP7BHxX3AQhO7orH474opNNpf/CcDpf74Fqy\nVLqHikKb0IdbFj9xUbBZCkExBS4MvJJyVANprhyEcDhspIGTKFAeRzqdbsgClaJge32lu6gotIHM\nSeDCIK0FmzC0YinwJUne7n1UkCsO/FtdWgqU8ZnJZLBjxw6k02mjPTstUdK1kh2SRum6jBIqCptA\nuhHN9mOgmz4SiRiCQR9+8qupMxBlOdJr04efdxJydVDq9fvm75+fy2ti6ykxMTFhuAh0TrGEyclJ\nP/5C4kmiSa8RBHe7+L4W1PHJtcP2KLll/UBFoY+EQr90BpJdhai5KDUvdQ3quSC7L/PzXs+/lc1g\nXMKYSqWQyWQwMzPT4DJIq0AmLbViFfCmsJQkViqV/JHP51EoFFAqlfwuz3z3auUXVBT6CO8/mEql\nDEEAgEQi4X+Ai8WicU47LMkdnegx0PskJ37jc+vI5kbZRjKZ9K0CHmCcnJxEMpl0ZjIGxQ5kd2m6\nJlwUCoUCCoUC1tbW/OvK97JUa8FERaGPhEIhw6embdQAYGxsDOPj4/4HOJFIoFgs+q3DqLEp79bE\nN0ztx4fatV8k3cQym1PGVcg6IneJBrcUZNcqshRsyMa20lLgTVrz+bxhKfD9LNVSMGkqCtlsFqdP\nn8aNGzcQi8Wwd+9evP3228hkMrh27RrOnDmDbDaL6elpvPfee7jvvvv6Me+RhFsKyWTS2N6dSq7z\n+byR/89vjLGxMVQqFVQqFUQiEVQqFQDmJi+9hOdm8NUTHiehmg/eXIafy0AiT2Ci98uHy32wtb63\nWQrU0IZEgVsKsgW88gtNRSEUCuH3v/89HnvsMQDAe++9hz/+8Y9455138Oabb+KFF17A4cOH8fnn\nn+ONN97AhQsXej7pUYUsBVqeBNDQg4H71TLYSGY6v0no25G3Ou8V3FKQVgBfZkylUn4AlZ/zAKJc\nrqX6BpnG3KogtGMp8Maw6j400lQU0um0LwgA8PDDD+Mvf/kLlpeX8e233+KZZ54BABw+fBh/+MMf\nsLKygkwm07sZjzDcUgBMQSATmkfc+bcktWfjPjbdBGNjY0bj117Pnxd+cWtAugU0KBlpfHzccDXk\nuQxc8sBmUFl5kKVAorC2ttZgKaj7YKetmILnebh06RKefPJJLC0tYXZ21ih22bVrF27evNkgCrlc\nDrlcznguEolg
bm6uw+mPFrT6AJiCQEtkqVSqwWWg68v3MADuWgh8O/VWd4naLDygKJdVqakMDyDK\nkUgkjMCkrDKV+0LSsRVBsFkKXBRsMQXuPmxHUVhaWjI2KgaAqamp9kTh7NmzSKVSOHbsGP73v/+1\n/HsXLlzA+fPnjef27NmDy5cvt/Pnh4Yg89W1zRr/pgfufuvy/QwikYi/qiBfz/M8v8rQNTY2Nnr6\nvmOxmC8ANvegFVGQe2Pw0QweO5FLsp7n+Ss1xWLRD9jSsMUUKpWKf/23I8eOHcPi4qLx3IkTJ1oX\nhfn5eVy/fh0ff/wxAGBubg63bt0ytjO7ffs2du/e3fC7x48fx5EjR4znRn2rLn6z2jYoabZpK0/2\nIXg7NxkEC4fDfh4DLVXK816LQjQadQYOpfsg6xa4JdBKirJtxyy5DMuTkDY2NlAoFLCysoJsNovV\n1VXfQiXXQa48bPc8hYsXL27eUvjggw9w5coVfPLJJ775OzMzgwMHDmBhYQHPPfccFhYW8NBDD1nj\nCZS9tpWwiYAUAvmY4DECLgy8RbwUhGg06m+tTvtFyHP5H9xtKPFKBgzpMXclgkQhqJjJ5RYAMFwD\nuqn5eT6fN0SBhEHGE+h6bXf3weW+NxWFq1ev4pNPPsG+ffvw29/+FgBw77334ty5c3jrrbdw5swZ\nfPjhh0in05ifn+/urIcQeYO3YyFIYQDgrxpQvIFEge/YTD58KpXyzV7b6LUZzNO05XIkX5Lko5ml\n4BIG23Xj28hTlic/z+fzyGazvjDYLAV+/fg29cpdmorCAw88gG+//db6b/fffz8+/fTTrk9qFAiy\nFFz7HcpvIx5M8zzPD+IlEgkAdwN71JRE5u/LLej6sfpga6NmazLDh0sUXLgElTe85fEDOl9bW8Pq\n6qphKaytrRmWAl0v7j5sV0vBhWY0toErqGizFJoJgxQEnsQkLQT+AeZpzvK8H3kKtuIv2x6acrS6\nuiCvMb+OJArlctkPJvJVBbIMuOvALYVSqWSNR6gomKgotInNdZAFSu24D3QuKw1pZYLf9LIYylYc\n1Utk6bg8D+rUzBukyNeUuISWrCNKSsrn81hbW/NvfhIA/jiXy/nCsb6+bt3tW90HExWFTdDsxufi\nIIXCVnZM57zasBVxCXJNekVQ2bRrtFPlGHRNeUyB8g9yuZzTXaBzelwul63WHj1WfkFFoU2kdcC/\nybmfTx9eCoSVSqWGjD3bPgZBS7VbramIzeri1pF0kaiGgW50bg1QYNGVqESrDUpzVBTaQH5rbWxs\nGN+CMoOOFzZFo1FUq1VrnwGXab1VsX1DkyC4lhsrlQoKhYJvEdAgK2FtbQ2FQsFa8LRd8xA2i4pC\nm0hR4IVJsiqPC0IkEkG1WjWW8mKxmP9hHfVkrlYJcoN4zQIJLG8yw4OJcpAocMuAFzwpraOi0CYk\nCPybnYSC+7q8sIki9BsbG771QB9WvtJASUxb3WJwxQxIFOga0gqDXGmQsQI6LxaLvsumyUmbR0Wh\nDaT7QM/R47GxMf8DLfcqCIVC/oeelg95M1PpX29VYXAtOZIocGHlNz8JAMUKbEeevqx5CJtHRaFN\n6ANM51wUIpEISqWS0Q+BxxwoGCnTl+PxeIOJu9WFgQdr5eoCuQpra2t+zCCbzfoCwIue+LFcLvvX\nmA+NKbSHikIb2L7dyAUgN6BUKhkWAkEfeh5DoOxFvkWcTOTZSsLgWm6knAHpPtCqwvLyMlZWVpDL\n5YxGrLKXJaV68zwETWNuHxWFNuHbuIVCId+NIGGQDUe5aWyrZ5Dlu1tNCGzYkr5sokCWwvLyMn7+\n+WfkcrnAbtdcXF35HEpzVBQ2gSvhhUfPpevA26/JVGG+47St4xDvNyAzITm9FhMZ95DnQc9xF8GV\nhyBzDyhVmactUxCR54HQ6HWV6HZBRaGLSN+YWwzA3cYqXCh4kU8ymfQtDZ7DQO
e2gqJWCoy6/R5d\n2Zs210omelGCl+xKXa1WUSgUcOfOHSwvL2N5ednIP+CNUfjvae1C91FR6CLkH/Mcfd4+DbjbTo3M\nZV71R+3YbMPzPOsGq/1exrSZ/XRTcgvAZRXwUm+eoETJSdls1hi5XA7FYtFfWZDNUVQQuo+KQhfh\n8QPqmwigIeDFf4aW4IrForF/Iu9oRLEIz/N8V4LOARgBzV6/P24NySNZPbYVAC6A0uynQXEEnpAk\nLQVuaXBRUGHoHioKXYSLgnQR+IeYlwDztGhqfjoxMeHfACQIZCVQNSLBE6j6YS3IwKCt9oOb+Pwx\nrwORWYskjLKvIp3z7lL872qVY/dRUegylO0ImO4E7/TDXYZCoYBkMolcLofJyUnjG5FudGq0IgUB\ngOFC9BppKXAx4K4BDwTy53gauK1JilxulIPa2HMx0ByE7qOi0EW4pUCCUK1W/epI/o1JNwLfXo3K\ne+nDzxuvUHt0DglCP28Kmyhwi8DWJo1bCLLLMrcIeM2CXGHgmaCuRjZKd1BR6CL8g8rrI3hZNAmC\nbQPWfD7f4DLQ3hCpVAqxWMz/W+FweCA3BQ80SpeB3/yy27Tcf4GnLdM55RrwuAF/zHM5bEelO6go\ndJmgG5RuZH4j0bIj5SnYliR5HgN/Th577UY0K20mwXMN6qNIYiDFgWIGcqib0F9UFPqM9Mt5cLBS\nqfhl11Q7AfwS3KtWq0gmkw39GORGrL2E5uEaLveBHsuaBVvfAx4vUEtgMKgo9BGZ3itFgfcepIAi\nD0yOj487t1zrpyi4lhspXmILNkrXgobc09HW65KundIfVBT6DBcG+RxVWVIWJN2E9C0bj8cNAZDn\nvV6SlAFGuTwolyDluW0jG7IUuJUgMyJVEPqLikKfkYLAn+M9GkkQaBkvl8shFos19HcM2rK928i+\nlPLcZUHIgKQtJsFLnHVlYbCoKPQRWSMA/HKjyf0QuCDwVGdeZGU79mP+tpoGKRZBw7W6wJuh2Cod\nlf6hotBnuFVAyUn0mG4uV4NX1wat/SqKst2wtoIoW12Eq8+BXF2wlTurMPQXFYU+Qx92Wwl0KBTy\nayZs1oDcXUn+fr9oViptyyOQ5zZrwFWarfQXFYUBoTeAMqz0J2leUZSRQUVBURSDpu5DNpvF6dOn\ncePGDcRiMezduxdvv/02MpkMDh065O9vEAqFcOrUKRw8eLAf81YUpVd4Tchms94333zjP56fn/de\ne+01z/M874knnvCuXr3a7CUC2bt3rwdAhw4dfRp79+4NvCebug/pdBqPPfaY//jhhx/Gjz/+6D/W\nIJmibC3aWn3wPA+XLl3Ck08+6T936tQpeJ6HRx99FCdPnsTk5GTXJ6koSv8IeW181b/99tv46aef\ncP78eQDArVu3MDs7i2q1infffReFQgHvv/9+w+9Rzz1OJBLB3Nwc9u3bh++//77Dt6EoSqvs3bsX\n165dw9LSUkNb/KmpqdYthfn5eVy/fh0ff/yx/9zs7CwAIBqN4ujRo3j55Zetv3vhwgVfSIg9e/bg\n8uXLLb8RRVG6y7Fjx7C4uGg8d+LEidZE4YMPPsCVK1fwySef+DX+pVIJtVoNExMTAIAvv/wSDz74\noPX3jx8/jiNHjhjPbZet1xVlWLl48aLVUmjqPly9ehXPPvss9u3bh3g8DgC49957cfr0abz66qt+\nTvv+/fvx+uuvY+fOnW1NTN0HRekv5D64aCum0AtUFBSlvzQTBc1oVBTFQEVBURQDFQVFUQxUFBRF\nMVBRUBTFQEVBURQDFQVFUQxUFBRFMVBRUBTFQEVBURQDFQVFUQxUFBRFMVBRUBTFQEVBURQDFQVF\nUQwGvm3cPffcM+gpKMq2otk9N/AmK4qiDBdD4z4sLS3h0KFDWFpaGvRUrOj8OkPnt3n6PbehEYVa\nrYbFxcWGRpLDgs6vM3R+m6ffcxsaUVAUZT
hQUVAUxUBFQVEUg8hbb7311qAnQcTjcTz++OP+/hLD\nhs6vM3R+m6efc9MlSUVRDNR9UBTFQEVBURSDgac5A8C1a9dw5swZZLNZTE9P47333sN999036Gn5\nHDp0CIlEArFYDKFQCKdOncLBgwcHNp/5+Xl89dVXWFxcxBdffIEHHngAwPBcR9f8huE6ZrNZnD59\nGjdu3EAsFsPevXvx9ttvI5PJDMX1C5pf366fNwS8+OKL3sLCgud5nvePf/zDe/HFFwc8I5NDhw55\nV69eHfQ0fP7zn/94N2/e9A4dOuT93//9n//8sFxH1/yG4Tpms1nvm2++8R/Pz897r732mud5w3H9\ngub3xBNP9OX6Ddx9WF5exrfffotnnnkGAHD48GFcuXIFKysrA57ZXTzPgzdE8dhHHnkEs7OzxpyG\n6Tra5gcMx3VMp9N47LHH/McPP/wwfvzxx6G5fq75Ef24fgN3H5aWljA7O4tQKAQACIfD2LVrF27e\nvIlMJjPg2d3l1KlT8DwPjz76KE6ePInJyclBT8lAr2P7eJ6HS5cu4cknnxzK68fnR/Tj+g3cUhgF\nLl26hL///e/461//inq9jrNnzw56SiPJsF3Hs2fPIpVK4dixYwOdhws5v35dv4GLwtzcHG7duuWb\nRfV6Hbdv38bu3bsHPLO7zM7OAgCi0SiOHj2K//73vwOeUSN6Hdtjfn4e169fx5/+9CcAw3f95PyA\n/l2/gYvCzMwMDhw4gIWFBQDAwsICHnrooaExeUulEvL5vP/4yy+/xIMPPjjAGZnQh1ivY+t88MEH\nuHLlCj788EOMjf3iQQ/T9bPNr5/XbygyGr/77jucOXMGuVwO6XQa8/Pz2Ldv36CnBQC4ceMGXn31\nVdTrddTrdezfvx+vv/46du7cObA5vfPOO/j6669x584dTE9PI5PJYGFhYWiuo21+H330EV555ZWB\nX8erV6/i2Wefxb59+/yU4XvvvRfnzp0biuvnmt/p06f79jkcClFQFGV4GLj7oCjKcKGioCiKgYqC\noigGKgqKohioKCiKYqCioCiKgYqCoigGKgqKohj8P9TxOUEZnVRqAAAAAElFTkSuQmCC\n",
            "text/plain": [
              "\u003cmatplotlib.figure.Figure at 0xe994c5abe10\u003e"
            ]
          },
          "metadata": {
            "tags": []
          },
          "output_type": "display_data"
        }
      ],
      "source": [
        "from matplotlib import pyplot as plt\n",
        "\n",
        "plt.imshow(federated_train_data[5][-1]['x'][-1].reshape(28, 28), cmap='gray')\n",
        "plt.grid(False)\n",
        "plt.show()"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "colab_type": "text",
        "id": "J-ox58PA56f8"
      },
      "source": [
        "### On combining TensorFlow and TFF\n",
        "\n",
        "In this tutorial, for compactness we immediately decorate functions that\n",
        "introduce TensorFlow logic with `tff.tf_computation`. However, for more complex\n",
        "logic, this is not the pattern we recommend. Debugging TensorFlow is already\n",
        "challenging, and debugging it after it has been fully serialized and then\n",
        "re-imported necessarily loses some metadata and limits interactivity, making\n",
        "it even more of a challenge.\n",
        "\n",
        "Therefore, **we strongly recommend writing complex TF logic as stand-alone\n",
        "Python functions** (that is, without `tff.tf_computation` decoration). This way\n",
        "the TensorFlow logic can be developed and tested using TF best practices and\n",
        "tools (like eager mode), before serializing the computation for TFF (e.g., by invoking `tff.tf_computation` with a Python function as the argument)."
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "colab_type": "text",
        "id": "RSd6UatXbzw-"
      },
      "source": [
        "### Defining a loss function\n",
        "\n",
        "Now that we have the data, let's define a loss function that we can use for\n",
        "training. First, let's define the type of input as a TFF named tuple. Since the\n",
        "size of data batches may vary, we set the batch dimension to `None` to indicate\n",
        "that the size of this dimension is unknown."
      ]
    },
    {
      "cell_type": "code",
      "execution_count": 0,
      "metadata": {
        "colab": {
          "height": 34
        },
        "colab_type": "code",
        "executionInfo": {
          "elapsed": 419,
          "status": "ok",
          "timestamp": 1570209779265,
          "user": {
            "displayName": "",
            "photoUrl": "",
            "userId": ""
          },
          "user_tz": 420
        },
        "id": "653xv5NXd4fy",
        "outputId": "28264cf1-2b98-4f2f-f7ad-5f9c690e5e71"
      },
      "outputs": [
        {
          "data": {
            "text/plain": [
              "'\u003cx=float32[?,784],y=int32[?]\u003e'"
            ]
          },
          "execution_count": 145,
          "metadata": {
            "tags": []
          },
          "output_type": "execute_result"
        }
      ],
      "source": [
        "BATCH_SPEC = collections.OrderedDict([\n",
        "    ('x', tf.TensorSpec(shape=[None, 784], dtype=tf.float32)),\n",
        "    ('y', tf.TensorSpec(shape=[None], dtype=tf.int32)),\n",
        "])\n",
        "BATCH_TYPE = tff.to_type(BATCH_SPEC)\n",
        "\n",
        "str(BATCH_TYPE)"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "colab_type": "text",
        "id": "pb6qPUvyh5A1"
      },
      "source": [
        "You may be wondering why we can't just define an ordinary Python type. Recall\n",
        "the discussion in [part 1](custom_federated_algorithms_1.ipynb), where we\n",
        "explained that while we can express the logic of TFF computations using Python,\n",
        "under the hood TFF computations *are not* Python. The symbol `BATCH_TYPE`\n",
        "defined above represents an abstract TFF type specification. It is important to\n",
        "distinguish this *abstract* TFF type from concrete Python *representation*\n",
        "types, e.g., containers such as `dict` or `collections.namedtuple` that may be\n",
        "used to represent the TFF type in the body of a Python function. Unlike Python,\n",
        "TFF has a single abstract type constructor `tff.NamedTupleType` for tuple-like\n",
        "containers, with elements that can be individually named or left unnamed. This\n",
        "type is also used to model formal parameters of computations, as TFF\n",
        "computations can formally only declare one parameter and one result - you will\n",
        "see examples of this shortly.\n",
        "\n",
        "Let's now define the TFF type of model parameters, again as a TFF named tuple of\n",
        "*weights* and *bias*."
      ]
    },
    {
      "cell_type": "code",
      "execution_count": 0,
      "metadata": {
        "colab": {
          "height": 34
        },
        "colab_type": "code",
        "executionInfo": {
          "elapsed": 447,
          "status": "ok",
          "timestamp": 1570209779730,
          "user": {
            "displayName": "",
            "photoUrl": "",
            "userId": ""
          },
          "user_tz": 420
        },
        "id": "Og7VViafh-30",
        "outputId": "7c2584c1-7340-4923-e211-6c4087813ca5"
      },
      "outputs": [
        {
          "name": "stdout",
          "output_type": "stream",
          "text": [
            "\u003cweights=float32[784,10],bias=float32[10]\u003e\n"
          ]
        }
      ],
      "source": [
        "MODEL_SPEC = collections.OrderedDict([\n",
        "    ('weights', tf.TensorSpec(shape=[784, 10], dtype=tf.float32)),\n",
        "    ('bias', tf.TensorSpec(shape=[10], dtype=tf.float32)),\n",
        "])\n",
        "MODEL_TYPE = tff.to_type(MODEL_SPEC)\n",
        "print(MODEL_TYPE)"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "colab_type": "text",
        "id": "iHhdaWSpfQxo"
      },
      "source": [
        "With those definitions in place, we can now define the loss for a given model\n",
        "over a single batch. Note how in the body of `forward_pass`, we access the\n",
        "named tuple elements using standard Python dictionary (`X['Y']`) syntax."
      ]
    },
    {
      "cell_type": "code",
      "execution_count": 0,
      "metadata": {
        "colab": {},
        "colab_type": "code",
        "id": "4EObiz_Ke0uK"
      },
      "outputs": [],
      "source": [
        "# NOTE: `forward_pass` is defined separately from `batch_loss` so that it can\n",
        "# later be called from within another tf.function. This is necessary because a\n",
        "# @tf.function decorated method cannot invoke a @tff.tf_computation.\n",
        "\n",
        "@tf.function\n",
        "def forward_pass(model, batch):\n",
        "  predicted_y = tf.nn.softmax(\n",
        "      tf.matmul(batch['x'], model['weights']) + model['bias'])\n",
        "  return -tf.reduce_mean(\n",
        "      tf.reduce_sum(\n",
        "          tf.one_hot(batch['y'], 10) * tf.math.log(predicted_y), axis=[1]))\n",
        "\n",
        "@tff.tf_computation(MODEL_TYPE, BATCH_TYPE)\n",
        "def batch_loss(model, batch):\n",
        "  return forward_pass(model, batch)"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "colab_type": "text",
        "id": "8K0UZHGnr8SB"
      },
      "source": [
        "As expected, computation `batch_loss` returns `float32` loss given the model and\n",
        "a single data batch. Note how the `MODEL_TYPE` and `BATCH_TYPE` have been lumped\n",
        "together into a 2-tuple of formal parameters; you can recognize the type of\n",
        "`batch_loss` as `(\u003cMODEL_TYPE,BATCH_TYPE\u003e -\u003e float32)`."
      ]
    },
    {
      "cell_type": "code",
      "execution_count": 0,
      "metadata": {
        "colab": {
          "height": 34
        },
        "colab_type": "code",
        "executionInfo": {
          "elapsed": 451,
          "status": "ok",
          "timestamp": 1570209780675,
          "user": {
            "displayName": "",
            "photoUrl": "",
            "userId": ""
          },
          "user_tz": 420
        },
        "id": "4WXEAY8Nr89V",
        "outputId": "fbd8ad63-ca6f-49bf-f0c8-0adf4c35a6c1"
      },
      "outputs": [
        {
          "data": {
            "text/plain": [
              "'(\u003c\u003cweights=float32[784,10],bias=float32[10]\u003e,\u003cx=float32[?,784],y=int32[?]\u003e\u003e -\u003e float32)'"
            ]
          },
          "execution_count": 148,
          "metadata": {
            "tags": []
          },
          "output_type": "execute_result"
        }
      ],
      "source": [
        "str(batch_loss.type_signature)"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "colab_type": "text",
        "id": "pAnt_UcdnvGa"
      },
      "source": [
        "As a sanity check, let's construct an initial model filled with zeros and\n",
        "compute the loss over the batch of data we visualized above."
      ]
    },
    {
      "cell_type": "code",
      "execution_count": 0,
      "metadata": {
        "colab": {
          "height": 34
        },
        "colab_type": "code",
        "executionInfo": {
          "elapsed": 420,
          "status": "ok",
          "timestamp": 1570209781117,
          "user": {
            "displayName": "",
            "photoUrl": "",
            "userId": ""
          },
          "user_tz": 420
        },
        "id": "U8Ne8igan3os",
        "outputId": "d0b9d7aa-4e01-49cd-91ff-8f76aa82257f"
      },
      "outputs": [
        {
          "data": {
            "text/plain": [
              "2.3025854"
            ]
          },
          "execution_count": 149,
          "metadata": {
            "tags": []
          },
          "output_type": "execute_result"
        }
      ],
      "source": [
        "initial_model = {\n",
        "    'weights': np.zeros([784, 10], dtype=np.float32),\n",
        "    'bias': np.zeros([10], dtype=np.float32)\n",
        "}\n",
        "\n",
        "sample_batch = federated_train_data[5][-1]\n",
        "\n",
        "batch_loss(initial_model, sample_batch)"
      ]
    },
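    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "The value `2.3025854` is no accident: with an all-zeros model, the softmax\n",
        "produces a uniform distribution over the 10 digit classes, so the\n",
        "cross-entropy loss is `-log(1/10) = log(10)`, about `2.3026`, regardless of\n",
        "the labels. The plain-NumPy sketch below (independent of TFF, using a\n",
        "made-up random batch) confirms this."
      ]
    },
    {
      "cell_type": "code",
      "execution_count": 0,
      "metadata": {},
      "outputs": [],
      "source": [
        "# Plain-NumPy sketch: an all-zeros model yields uniform predictions,\n",
        "# so the loss equals log(10) for any batch.\n",
        "logits = np.zeros((32, 10), dtype=np.float32)  # x @ weights + bias == 0\n",
        "uniform = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)\n",
        "labels = np.random.randint(0, 10, size=32)\n",
        "one_hot = np.eye(10, dtype=np.float32)[labels]\n",
        "-np.mean(np.sum(one_hot * np.log(uniform), axis=1))  # approximately np.log(10)"
      ]
    },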
    {
      "cell_type": "markdown",
      "metadata": {
        "colab_type": "text",
        "id": "ckigEAyDAWFV"
      },
      "source": [
        "Note that we feed the TFF computation with the initial model defined as a\n",
        "`dict`, even though the body of the Python function that defines it consumes\n",
        "model parameters as `model['weights']` and `model['bias']`. The arguments of\n",
        "the call to `batch_loss` aren't simply passed to the body of that function.\n",
        "\n",
        "\n",
        "What happens when we invoke `batch_loss`?\n",
        "The Python body of `batch_loss` has already been traced and serialized in the\n",
        "cell above where it was defined. TFF acts as the caller to `batch_loss`\n",
        "at the computation definition time, and as the target of invocation at the time\n",
        "`batch_loss` is invoked. In both roles, TFF serves as the bridge between TFF's\n",
        "abstract type system and Python representation types. At the invocation time,\n",
        "TFF will accept most standard Python container types (`dict`, `list`, `tuple`,\n",
        "`collections.namedtuple`, etc.) as concrete representations of abstract TFF\n",
        "tuples. Also, although as noted above, TFF computations formally only accept a\n",
        "single parameter, you can use the familiar Python call syntax with positional\n",
        "and/or keyword arguments when the type of the parameter is a tuple - it\n",
        "works as expected."
      ]
    },
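    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "To build intuition for that last point: a call like\n",
        "`batch_loss(initial_model, sample_batch)` is effectively sugar for passing a\n",
        "single 2-tuple. A hypothetical plain-Python sketch of the packing involved\n",
        "(the helper `pack_args` is made up for illustration, not a TFF API):"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": 0,
      "metadata": {},
      "outputs": [],
      "source": [
        "def pack_args(arg_names, *args, **kwargs):\n",
        "  # Pack positional and keyword arguments into a single named tuple-like\n",
        "  # value, in the spirit of how TFF binds them to its one formal parameter.\n",
        "  packed = collections.OrderedDict(zip(arg_names, args))\n",
        "  packed.update(kwargs)\n",
        "  return packed\n",
        "\n",
        "pack_args(['model', 'batch'], initial_model, batch=sample_batch).keys()"
      ]
    },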
    {
      "cell_type": "markdown",
      "metadata": {
        "colab_type": "text",
        "id": "eB510nILYbId"
      },
      "source": [
        "### Gradient descent on a single batch\n",
        "\n",
        "Now, let's define a computation that uses this loss function to perform a single\n",
        "step of gradient descent. Note how in defining this function, we use\n",
        "`batch_loss` as a subcomponent. You can invoke a computation constructed with\n",
        "`tff.tf_computation` inside the body of another computation, though typically\n",
        "this is not necessary - as noted above, because serialization loses some\n",
        "debugging information, it is often preferable for more complex computations to\n",
        "write and test all the TensorFlow code without the `tff.tf_computation` decorator."
      ]
    },
    {
      "cell_type": "code",
      "execution_count": 0,
      "metadata": {
        "colab": {},
        "colab_type": "code",
        "id": "O4uaVxw3AyYS"
      },
      "outputs": [],
      "source": [
        "@tff.tf_computation(MODEL_TYPE, BATCH_TYPE, tf.float32)\n",
        "def batch_train(initial_model, batch, learning_rate):\n",
        "  # Define a group of model variables and set them to `initial_model`.\n",
        "  model_vars = collections.OrderedDict([\n",
        "      (name, tf.Variable(name=name, initial_value=value))\n",
        "      for name, value in initial_model.items()\n",
        "  ])\n",
        "  @tf.function\n",
        "  def _train_on_batch(model_vars, batch):\n",
        "    # Perform one step of gradient descent using loss from `batch_loss`.\n",
        "    optimizer = tf.keras.optimizers.SGD(learning_rate)\n",
        "    with tf.GradientTape() as tape:\n",
        "      loss = forward_pass(model_vars, batch)\n",
        "    grads = tape.gradient(loss, model_vars)\n",
        "    optimizer.apply_gradients(\n",
        "        zip(tf.nest.flatten(grads), tf.nest.flatten(model_vars)))\n",
        "    return model_vars\n",
        "  return _train_on_batch(model_vars, batch)"
      ]
    },
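    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "To make the mechanics of that step concrete, here is a hypothetical\n",
        "plain-NumPy sketch of the same SGD update, with the softmax cross-entropy\n",
        "gradients written out by hand instead of using `tf.GradientTape`. It is an\n",
        "illustration only, not part of the TFF pipeline."
      ]
    },
    {
      "cell_type": "code",
      "execution_count": 0,
      "metadata": {},
      "outputs": [],
      "source": [
        "def numpy_sgd_step(weights, bias, x, y, lr):\n",
        "  # One SGD step on softmax cross-entropy, mirroring `batch_train`.\n",
        "  logits = x @ weights + bias\n",
        "  exps = np.exp(logits - logits.max(axis=1, keepdims=True))\n",
        "  probs = exps / exps.sum(axis=1, keepdims=True)\n",
        "  one_hot = np.eye(10, dtype=np.float32)[y]\n",
        "  grad_logits = (probs - one_hot) / len(y)  # d(mean loss)/d(logits)\n",
        "  return (weights - lr * (x.T @ grad_logits),\n",
        "          bias - lr * grad_logits.sum(axis=0))"
      ]
    },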
    {
      "cell_type": "code",
      "execution_count": 0,
      "metadata": {
        "colab": {
          "height": 34
        },
        "colab_type": "code",
        "executionInfo": {
          "elapsed": 417,
          "status": "ok",
          "timestamp": 1570209782394,
          "user": {
            "displayName": "",
            "photoUrl": "",
            "userId": ""
          },
          "user_tz": 420
        },
        "id": "Y84gQsaohC38",
        "outputId": "c35a322d-15cd-415c-f046-c88853b5f6ba"
      },
      "outputs": [
        {
          "data": {
            "text/plain": [
              "'(\u003c\u003cweights=float32[784,10],bias=float32[10]\u003e,\u003cx=float32[?,784],y=int32[?]\u003e,float32\u003e -\u003e \u003cweights=float32[784,10],bias=float32[10]\u003e)'"
            ]
          },
          "execution_count": 151,
          "metadata": {
            "tags": []
          },
          "output_type": "execute_result"
        }
      ],
      "source": [
        "str(batch_train.type_signature)"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "colab_type": "text",
        "id": "ID8xg9FCUL2A"
      },
      "source": [
        "When you invoke a Python function decorated with `tff.tf_computation` within the\n",
        "body of another such function, the logic of the inner TFF computation is\n",
        "embedded (essentially, inlined) in the logic of the outer one. As noted above,\n",
        "if you are writing both computations, it is likely preferable to make the inner\n",
        "function (`batch_loss` in this case) a regular Python or `tf.function` rather\n",
        "than a `tff.tf_computation`. However, here we illustrate that calling one\n",
        "`tff.tf_computation` inside another basically works as expected. This may be\n",
        "necessary if, for example, you do not have the Python code defining\n",
        "`batch_loss`, but only its serialized TFF representation.\n",
        "\n",
        "Now, let's apply this function a few times to the initial model to see whether\n",
        "the loss decreases."
      ]
    },
    {
      "cell_type": "code",
      "execution_count": 0,
      "metadata": {
        "colab": {},
        "colab_type": "code",
        "id": "8edcJTlXUULm"
      },
      "outputs": [],
      "source": [
        "model = initial_model\n",
        "losses = []\n",
        "for _ in range(5):\n",
        "  model = batch_train(model, sample_batch, 0.1)\n",
        "  losses.append(batch_loss(model, sample_batch))"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": 0,
      "metadata": {
        "colab": {
          "height": 34
        },
        "colab_type": "code",
        "executionInfo": {
          "elapsed": 416,
          "status": "ok",
          "timestamp": 1570209783978,
          "user": {
            "displayName": "",
            "photoUrl": "",
            "userId": ""
          },
          "user_tz": 420
        },
        "id": "3n1onojT1zHv",
        "outputId": "24ed4703-fa57-4613-d8a1-99fcd3cebcc1"
      },
      "outputs": [
        {
          "data": {
            "text/plain": [
              "[0.19690022, 0.13176313, 0.10113226, 0.082738124, 0.0703014]"
            ]
          },
          "execution_count": 153,
          "metadata": {
            "tags": []
          },
          "output_type": "execute_result"
        }
      ],
      "source": [
        "losses"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "colab_type": "text",
        "id": "EQk4Ha8PU-3P"
      },
      "source": [
        "### Gradient descent on a sequence of local data\n",
        "\n",
        "Now, since `batch_train` appears to work, let's write a similar training\n",
        "function `local_train` that consumes the entire sequence of all batches from one\n",
        "user instead of just a single batch. The new computation will now need to\n",
        "consume `tff.SequenceType(BATCH_TYPE)` instead of `BATCH_TYPE`."
      ]
    },
    {
      "cell_type": "code",
      "execution_count": 0,
      "metadata": {
        "colab": {},
        "colab_type": "code",
        "id": "EfPD5a6QVNXM"
      },
      "outputs": [],
      "source": [
        "LOCAL_DATA_TYPE = tff.SequenceType(BATCH_TYPE)\n",
        "\n",
        "@tff.federated_computation(MODEL_TYPE, tf.float32, LOCAL_DATA_TYPE)\n",
        "def local_train(initial_model, learning_rate, all_batches):\n",
        "\n",
        "  # Mapping function to apply to each batch.\n",
        "  @tff.federated_computation(MODEL_TYPE, BATCH_TYPE)\n",
        "  def batch_fn(model, batch):\n",
        "    return batch_train(model, batch, learning_rate)\n",
        "\n",
        "  return tff.sequence_reduce(all_batches, initial_model, batch_fn)"
      ]
    },
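    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "The call to `tff.sequence_reduce(all_batches, initial_model, batch_fn)` folds\n",
        "`batch_fn` over the batches, threading the model through as the accumulator,\n",
        "much like Python's `functools.reduce`. A hypothetical local analogue in plain\n",
        "Python (outside TFF, with `batch_train_fn` standing in for `batch_train`)\n",
        "might look like this:"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": 0,
      "metadata": {},
      "outputs": [],
      "source": [
        "import functools\n",
        "\n",
        "def local_train_sketch(initial_model, learning_rate, all_batches,\n",
        "                       batch_train_fn):\n",
        "  # Fold the per-batch training step over the sequence of batches.\n",
        "  return functools.reduce(\n",
        "      lambda model, batch: batch_train_fn(model, batch, learning_rate),\n",
        "      all_batches, initial_model)"
      ]
    },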
    {
      "cell_type": "code",
      "execution_count": 0,
      "metadata": {
        "colab": {
          "height": 34
        },
        "colab_type": "code",
        "executionInfo": {
          "elapsed": 420,
          "status": "ok",
          "timestamp": 1570209784877,
          "user": {
            "displayName": "",
            "photoUrl": "",
            "userId": ""
          },
          "user_tz": 420
        },
        "id": "sAhkS5yKUgjC",
        "outputId": "3d098970-5cc5-456f-92b5-77e502d208b1"
      },
      "outputs": [
        {
          "data": {
            "text/plain": [
              "'(\u003c\u003cweights=float32[784,10],bias=float32[10]\u003e,float32,\u003cx=float32[?,784],y=int32[?]\u003e*\u003e -\u003e \u003cweights=float32[784,10],bias=float32[10]\u003e)'"
            ]
          },
          "execution_count": 155,
          "metadata": {
            "tags": []
          },