{ "cells": [ { "cell_type": "markdown", "metadata": { "comet_cell_id": "226b8e9ebe71d" }, "source": [ "# Evaluating Machine Learning Models\n", "\n", "First, we will use **fairkit-learn** to train and evaluate models using the ProPublica COMPAS Dataset.\n", "\n", "Next, we will use **AI Fairness 360** to train and evaluate models using the German Credit Dataset. \n", "\n", "Finally, we will use **scikit-learn** to train and evaluate models using the Adult Census Income Dataset. \n", "\n", "Along with the provided tooling and resources within this notebook, you will be allowed to use outside resources (e.g. Google) to help you complete this exercise.\n", "\n", "Please plan to complete the entire exercise in one sitting. Make sure you have time and your computer is plugged into power before you start; you'll be running machine learning algorithms, which will wear your battery down.\n", "\n", "Responses for this exercise will be entered in the Evaluating ML Models Exercise Response Form. You will first be asked some demographic questions then each page that follows maps to each task you complete. You will be expected to enter responses regarding each task and will have to submit for your assignment to be graded.\n", "\n", "You may go back and forth between the form and Jupyter Notebook while completing the exercise, but once you've submitted the form you will not be able to go back. So please make sure you're happy with all your responses before submitting the form.\n", "\n", "\n", "## Models\n", "\n", "Because there are a variety of models provided by scikit-learn and AI Fairness 360, we will only use a subset for this assignment. The models you will be evaluating are as follows:\n", "\n", "* **Logistic Regression**: a Machine Learning algorithm which is used for the classification problems, it is a predictive analysis algorithm and based on the concept of probability. 
[More info here.](https://machinelearningmastery.com/logistic-regression-for-machine-learning/) [Scikit-learn documentation](https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LogisticRegression.html)\n", "* **K Nearest Neighbor Classifier**: a model that classifies data points based on the points that are most similar to them. It uses the labeled points nearest to an unclassified point to make an “educated guess” about how that point should be classified. [More info here.](https://towardsdatascience.com/machine-learning-basics-with-the-k-nearest-neighbors-algorithm-6a6e71d01761) [Scikit-learn documentation](https://scikit-learn.org/stable/modules/generated/sklearn.neighbors.KNeighborsClassifier.html)\n", "* **Random Forest**: an ensemble machine learning algorithm used for classification and regression problems. Random forest applies the technique of bagging (bootstrap aggregating) to decision tree learners. [More info here.](https://towardsdatascience.com/understanding-random-forest-58381e0602d2) [Scikit-learn documentation](https://scikit-learn.org/stable/modules/generated/sklearn.ensemble.RandomForestClassifier.html)\n", "* **Support Vector Classifier**: a discriminative classifier formally defined by a separating hyperplane. In other words, given labeled training data (supervised learning), the algorithm outputs an optimal hyperplane that categorizes new examples. In two-dimensional space, this hyperplane is a line dividing the plane into two parts, with one class on each side. [More info here.](https://medium.com/machine-learning-101/chapter-2-svm-support-vector-machine-theory-f0812effc72) [Scikit-learn documentation](https://scikit-learn.org/stable/modules/generated/sklearn.svm.SVC.html)\n", "* **Adversarial Debiasing**: learns a classifier to maximize prediction accuracy and simultaneously reduce an adversary's ability to determine the protected attribute from the predictions. 
[Documentation.](https://aif360.readthedocs.io/en/latest/modules/inprocessing.html#adversarial-debiasing)\n", "\n", "The Adversarial Debiasing model is only available when using AI Fairness 360 or fairkit-learn.\n", "\n", "\n", "## Bias Mitigating Algorithms\n", "\n", "When using AI Fairness 360 and fairkit-learn, you will have access to the following bias mitigating pre- and post-processing algorithms:\n", "\n", "* **Pre-processing algorithms**\n", " - *Disparate Impact Remover*: a preprocessing technique that edits feature values to increase group fairness while preserving rank-ordering within groups\n", " - *Reweighing*: a preprocessing technique that weights the examples in each (group, label) combination differently to ensure fairness before classification\n", " \n", " \n", "* **Post-processing algorithms**\n", " - *Calibrated Equalized Odds*: a post-processing technique that optimizes over calibrated classifier score outputs to find probabilities with which to change output labels with an equalized odds objective\n", " - *Reject Option Classification*: a post-processing technique that gives favorable outcomes to unprivileged groups and unfavorable outcomes to privileged groups in a confidence band around the decision boundary with the highest uncertainty \n", "\n", "\n", "## Model Evaluation Metrics\n", "\n", "To evaluate your trained models, you will be using one or more of the following metrics:\n", "\n", "* **Performance metrics**:\n", " - *Accuracy Score* (UnifiedMetricLibrary.accuracy_score). When evaluating a model with this metric, the goal is to *maximize* the value.\n", " \n", " \n", "* **Fairness Metrics**:\n", " - *Equal Opportunity Difference* (UnifiedMetricLibrary.equal_opportunity_difference), also known as \"true positive rate difference\". 
When evaluating a model with this metric, the goal is to *minimize* the absolute value.\n", " - *Average Odds Difference* (UnifiedMetricLibrary.average_odds_difference). When evaluating a model with this metric, the goal is to *minimize* the absolute value.\n", " - *Statistical Parity Difference* (UnifiedMetricLibrary.mean_difference), also known as \"mean difference\". When evaluating a model with this metric, the goal is to *minimize* the absolute value.\n", " - *Disparate Impact* (UnifiedMetricLibrary.disparate_impact). When evaluating a model with this metric, the goal is to *maximize* the value.\n", " \n", " \n", "* **Overall Model Quality**:\n", " - *Classifier Quality Score* (classifier_quality_score). When evaluating a model with this metric, the goal is to *maximize* the value." ] }, { "cell_type": "markdown", "metadata": { "comet_cell_id": "b4bcefbe0a41f" }, "source": [ "## Getting started\n", "\n", "Before beginning the exercise, make sure to run the following cells to install and import all necessary packages. If you need any additional packages, add the installation and import statement(s) to the cell below and re-run the cell before adding and running code that uses the additional packages.\n", "\n", "The exercise instructions include information on how to use the Jupyter Notebook (e.g., how to add and run code cells). 
" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "comet_cell_id": "38bcbb4c093c5" }, "outputs": [], "source": [ "!pip install -q numpy==1.15.4\n", "!pip install -q pandas==0.23.3\n", "!pip install -q six==1.11.0\n", "!pip install -q scikit_learn==0.20.0\n", "!pip install -q aif360==0.2.0" ] }, { "cell_type": "code", "execution_count": 4, "metadata": { "comet_cell_id": "2c7cd3e562e7" }, "outputs": [], "source": [ "# Load all necessary packages\n", "import numpy as np\n", "import sklearn as skl\n", "import six\n", "import tensorflow as tf\n", "\n", "# dataset\n", "from aif360.datasets import CompasDataset\n", "\n", "# metrics\n", "from fklearn.metric_library import UnifiedMetricLibrary, classifier_quality_score\n", "\n", "# models\n", "from fklearn.scikit_learn_wrapper import LogisticRegression, KNeighborsClassifier, RandomForestClassifier, SVC\n", "from aif360.algorithms.inprocessing import AdversarialDebiasing\n", "\n", "# pre/post-processing algorithms\n", "from aif360.algorithms.preprocessing import DisparateImpactRemover, Reweighing\n", "from aif360.algorithms.postprocessing import CalibratedEqOddsPostprocessing, RejectOptionClassification\n", "\n", "# search\n", "from fklearn.fair_selection_aif import ModelSearch, DEFAULT_ADB_PARAMS" ] }, { "cell_type": "markdown", "metadata": { "comet_cell_id": "46991f420accb" }, "source": [ "# Tutorial 1: fairkit-learn\n", "\n", "First, we show you how to train and evaluate models using fairkit-learn. You will use the knowledge from this tutorial to complete Task 1, so please read thoroughly and execute the code cells in order.\n", "\n", "## Step 1: Import the dataset\n", "\n", "First we need to import the dataset we will use for training and testing our model.\n", "\n", "Below, we provide code that imports the COMPAS dataset. \n", "**Note: a warning may pop up when you run this cell. 
As long as you don't see any errors in the code, it is fine to continue.**" ] }, { "cell_type": "code", "execution_count": 5, "metadata": { "comet_cell_id": "936840797dfba" }, "outputs": [ { "name": "stderr", "output_type": "stream", "text": [ "WARNING:root:Missing Data: 5 rows removed from CompasDataset.\n" ] } ], "source": [ "data_orig = CompasDataset()" ] }, { "cell_type": "markdown", "metadata": { "comet_cell_id": "1f7cb8ab1c822" }, "source": [ "## Step 2: Set protected attributes\n", "\n", "To use the grid search functionality provided by fairkit-learn, we need to specify the privileged and unprivileged (protected) attributes. \n", "\n", "Below we provide code that stores the protected attributes (*race* is 0 for \"Not Caucasian\", *sex* is 0 for \"Male\")." ] }, { "cell_type": "code", "execution_count": 6, "metadata": { "comet_cell_id": "8f3e98f0712d1" }, "outputs": [], "source": [ "unprivileged = [{'race': 0, 'sex': 0}]\n", "privileged = [{'race': 1, 'sex': 1}]" ] }, { "cell_type": "markdown", "metadata": { "comet_cell_id": "8e6392efce817" }, "source": [ "## Step 3: Specify parameters for grid search\n", "\n", "Now we need to specify the various parameters required for the grid search provided by fairkit-learn. Each search parameter is a dictionary of options to include in the search; for each parameter, you can input one or multiple options to consider. \n", "\n", "Below we provide code that sets parameters for a simple grid search over the Logistic Regression, Random Forest, and Adversarial Debiasing models, including different hyper-parameter values for Logistic Regression, with and without the specified pre-/post-processing algorithms. We specify all performance and fairness metrics for the search; given the way the classifier quality score is calculated, it cannot be added to the grid search and will be calculated later."
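, "\n", "To make the grid concrete: each hyper-parameter dictionary maps constructor arguments to the candidate values to try, and the search considers every combination of those values. A rough sketch of that expansion (a hypothetical illustration using `itertools`, not fairkit-learn code):\n", "\n", "```python\n", "from itertools import product\n", "\n", "# every (penalty, C) pair becomes one candidate model configuration\n", "grid = {'penalty': ['l1', 'l2'], 'C': [0.1, 0.5, 1]}\n", "configs = [dict(zip(grid, values)) for values in product(*grid.values())]\n", "# 2 penalty options x 3 C values -> 6 candidate configurations\n", "```"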
] }, { "cell_type": "code", "execution_count": 11, "metadata": { "comet_cell_id": "e89b66337a2a6" }, "outputs": [], "source": [ "models = {'LogisticRegression': LogisticRegression, 'RandomForestClassifier': RandomForestClassifier,\n", " 'AdversarialDebiasing': AdversarialDebiasing}\n", "\n", "metrics = {'UnifiedMetricLibrary': [UnifiedMetricLibrary,\n", " 'accuracy_score',\n", " 'average_odds_difference',\n", " 'statistical_parity_difference',\n", " 'equal_opportunity_difference',\n", " 'disparate_impact'\n", " ]\n", " }\n", "\n", "hyperparameters = {'LogisticRegression':{'penalty': ['l1', 'l2'], 'C': [0.1, 0.5, 1]},\n", " 'RandomForestClassifier':{},\n", " 'AdversarialDebiasing': DEFAULT_ADB_PARAMS(unprivileged=unprivileged, privileged=privileged)\n", " }\n", "\n", "thresholds = [i * 10.0/100 for i in range(5)]\n", "\n", "processor_args = {'unprivileged_groups': unprivileged, 'privileged_groups': privileged}\n", "\n", "preprocessors=[DisparateImpactRemover()]\n", "postprocessors=[CalibratedEqOddsPostprocessing(**processor_args)]\n", "\n", "\n", "\n" ] }, { "cell_type": "markdown", "metadata": { "comet_cell_id": "7bba44eb7e995" }, "source": [ "## Step 4: Run the grid search\n", "\n", "Now that we've set all the parameters necessary for the grid search, we're ready to run it. The output of the grid search is saved to a .csv file.\n", "\n", "Below we provide code that creates and uses the `ModelSearch` object to run a grid search over the parameters we specified and saves the output to a .csv file in the specified directory. **The search take a while to complete. 
Wait until the search completes before attempting to execute more cells.**\n", "\n", "**Note: warnings may appear during search, however, as long as you don’t see any code errors it is fine to continue.**" ] }, { "cell_type": "code", "execution_count": 12, "metadata": { "comet_cell_id": "470c3e83a0934" }, "outputs": [ { "name": "stderr", "output_type": "stream", "text": [ "/Users/bjohnson/anaconda3/envs/fairkit-learn/lib/python3.7/site-packages/sklearn/linear_model/logistic.py:432: FutureWarning: Default solver will be changed to 'lbfgs' in 0.22. Specify a solver to silence this warning.\n", " FutureWarning)\n", "/Users/bjohnson/anaconda3/envs/fairkit-learn/lib/python3.7/site-packages/sklearn/linear_model/logistic.py:432: FutureWarning: Default solver will be changed to 'lbfgs' in 0.22. 
Specify a solver to silence this warning.\n", " FutureWarning)\n", "/Users/bjohnson/anaconda3/envs/fairkit-learn/lib/python3.7/site-packages/sklearn/linear_model/logistic.py:432: FutureWarning: Default solver will be changed to 'lbfgs' in 0.22. Specify a solver to silence this warning.\n", " FutureWarning)\n", "/Users/bjohnson/anaconda3/envs/fairkit-learn/lib/python3.7/site-packages/sklearn/linear_model/logistic.py:432: FutureWarning: Default solver will be changed to 'lbfgs' in 0.22. Specify a solver to silence this warning.\n", " FutureWarning)\n", "/Users/bjohnson/anaconda3/envs/fairkit-learn/lib/python3.7/site-packages/sklearn/linear_model/logistic.py:432: FutureWarning: Default solver will be changed to 'lbfgs' in 0.22. Specify a solver to silence this warning.\n", " FutureWarning)\n", "/Users/bjohnson/anaconda3/envs/fairkit-learn/lib/python3.7/site-packages/sklearn/linear_model/logistic.py:432: FutureWarning: Default solver will be changed to 'lbfgs' in 0.22. Specify a solver to silence this warning.\n", " FutureWarning)\n", "/Users/bjohnson/anaconda3/envs/fairkit-learn/lib/python3.7/site-packages/sklearn/linear_model/logistic.py:432: FutureWarning: Default solver will be changed to 'lbfgs' in 0.22. Specify a solver to silence this warning.\n", " FutureWarning)\n", "/Users/bjohnson/anaconda3/envs/fairkit-learn/lib/python3.7/site-packages/sklearn/linear_model/logistic.py:432: FutureWarning: Default solver will be changed to 'lbfgs' in 0.22. Specify a solver to silence this warning.\n", " FutureWarning)\n", "/Users/bjohnson/anaconda3/envs/fairkit-learn/lib/python3.7/site-packages/sklearn/linear_model/logistic.py:432: FutureWarning: Default solver will be changed to 'lbfgs' in 0.22. 
Specify a solver to silence this warning.\n", " FutureWarning)\n" ] }, { "name": "stderr", "output_type": "stream", "text": [ "/Users/bjohnson/anaconda3/envs/fairkit-learn/lib/python3.7/site-packages/sklearn/linear_model/logistic.py:432: FutureWarning: Default solver will be changed to 'lbfgs' in 0.22. Specify a solver to silence this warning.\n", " FutureWarning)\n", "/Users/bjohnson/anaconda3/envs/fairkit-learn/lib/python3.7/site-packages/sklearn/linear_model/logistic.py:432: FutureWarning: Default solver will be changed to 'lbfgs' in 0.22. Specify a solver to silence this warning.\n", " FutureWarning)\n", "/Users/bjohnson/anaconda3/envs/fairkit-learn/lib/python3.7/site-packages/sklearn/linear_model/logistic.py:432: FutureWarning: Default solver will be changed to 'lbfgs' in 0.22. Specify a solver to silence this warning.\n", " FutureWarning)\n", "/Users/bjohnson/anaconda3/envs/fairkit-learn/lib/python3.7/site-packages/sklearn/linear_model/logistic.py:432: FutureWarning: Default solver will be changed to 'lbfgs' in 0.22. Specify a solver to silence this warning.\n", " FutureWarning)\n", "/Users/bjohnson/anaconda3/envs/fairkit-learn/lib/python3.7/site-packages/sklearn/linear_model/logistic.py:432: FutureWarning: Default solver will be changed to 'lbfgs' in 0.22. Specify a solver to silence this warning.\n", " FutureWarning)\n", "/Users/bjohnson/anaconda3/envs/fairkit-learn/lib/python3.7/site-packages/sklearn/linear_model/logistic.py:432: FutureWarning: Default solver will be changed to 'lbfgs' in 0.22. Specify a solver to silence this warning.\n", " FutureWarning)\n", "/Users/bjohnson/anaconda3/envs/fairkit-learn/lib/python3.7/site-packages/sklearn/linear_model/logistic.py:432: FutureWarning: Default solver will be changed to 'lbfgs' in 0.22. 
Specify a solver to silence this warning.\n", " FutureWarning)\n", "/Users/bjohnson/anaconda3/envs/fairkit-learn/lib/python3.7/site-packages/sklearn/linear_model/logistic.py:432: FutureWarning: Default solver will be changed to 'lbfgs' in 0.22. Specify a solver to silence this warning.\n", " FutureWarning)\n", "/Users/bjohnson/anaconda3/envs/fairkit-learn/lib/python3.7/site-packages/sklearn/linear_model/logistic.py:432: FutureWarning: Default solver will be changed to 'lbfgs' in 0.22. Specify a solver to silence this warning.\n", " FutureWarning)\n", "/Users/bjohnson/anaconda3/envs/fairkit-learn/lib/python3.7/site-packages/sklearn/linear_model/logistic.py:432: FutureWarning: Default solver will be changed to 'lbfgs' in 0.22. Specify a solver to silence this warning.\n", " FutureWarning)\n", "/Users/bjohnson/anaconda3/envs/fairkit-learn/lib/python3.7/site-packages/sklearn/linear_model/logistic.py:432: FutureWarning: Default solver will be changed to 'lbfgs' in 0.22. Specify a solver to silence this warning.\n", " FutureWarning)\n", "/Users/bjohnson/anaconda3/envs/fairkit-learn/lib/python3.7/site-packages/sklearn/linear_model/logistic.py:432: FutureWarning: Default solver will be changed to 'lbfgs' in 0.22. Specify a solver to silence this warning.\n", " FutureWarning)\n", "/Users/bjohnson/anaconda3/envs/fairkit-learn/lib/python3.7/site-packages/sklearn/linear_model/logistic.py:432: FutureWarning: Default solver will be changed to 'lbfgs' in 0.22. 
Specify a solver to silence this warning.\n", " FutureWarning)\n", "/Users/bjohnson/anaconda3/envs/fairkit-learn/lib/python3.7/site-packages/sklearn/ensemble/forest.py:248: FutureWarning: The default value of n_estimators will change from 10 in version 0.20 to 100 in 0.22.\n", " \"10 in version 0.20 to 100 in 0.22.\", FutureWarning)\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "epoch 0; iter: 0; batch classifier loss: 0.877071; batch adversarial loss: 0.712803\n", "epoch 1; iter: 0; batch classifier loss: 0.727612; batch adversarial loss: 0.719875\n", "epoch 2; iter: 0; batch classifier loss: 0.679733; batch adversarial loss: 0.705100\n", "epoch 3; iter: 0; batch classifier loss: 0.630754; batch adversarial loss: 0.695460\n", "epoch 4; iter: 0; batch classifier loss: 0.603920; batch adversarial loss: 0.676134\n", "epoch 5; iter: 0; batch classifier loss: 0.638403; batch adversarial loss: 0.686755\n", "epoch 6; iter: 0; batch classifier loss: 0.603781; batch adversarial loss: 0.651624\n", "epoch 7; iter: 0; batch classifier loss: 0.581655; batch adversarial loss: 0.679451\n", "epoch 8; iter: 0; batch classifier loss: 0.563415; batch adversarial loss: 0.662766\n", "epoch 9; iter: 0; batch classifier loss: 0.588290; batch adversarial loss: 0.646921\n", "epoch 10; iter: 0; batch classifier loss: 0.550787; batch adversarial loss: 0.660158\n", "epoch 11; iter: 0; batch classifier loss: 0.566986; batch adversarial loss: 0.662246\n", "epoch 12; iter: 0; batch classifier loss: 0.585916; batch adversarial loss: 0.665605\n", "epoch 13; iter: 0; batch classifier loss: 0.584854; batch adversarial loss: 0.677492\n", "epoch 14; iter: 0; batch classifier loss: 0.596661; batch adversarial loss: 0.640089\n", "epoch 15; iter: 0; batch classifier loss: 0.604081; batch adversarial loss: 0.653237\n", "epoch 16; iter: 0; batch classifier loss: 0.618483; batch adversarial loss: 0.652368\n", "epoch 17; iter: 0; batch classifier loss: 0.552163; batch adversarial loss: 
0.639925\n", "epoch 18; iter: 0; batch classifier loss: 0.561280; batch adversarial loss: 0.665192\n", "epoch 19; iter: 0; batch classifier loss: 0.578862; batch adversarial loss: 0.645024\n", "epoch 20; iter: 0; batch classifier loss: 0.610292; batch adversarial loss: 0.675460\n", "epoch 21; iter: 0; batch classifier loss: 0.612907; batch adversarial loss: 0.616270\n", "epoch 22; iter: 0; batch classifier loss: 0.607450; batch adversarial loss: 0.647144\n", "epoch 23; iter: 0; batch classifier loss: 0.600425; batch adversarial loss: 0.637906\n", "epoch 24; iter: 0; batch classifier loss: 0.562078; batch adversarial loss: 0.632928\n", "epoch 25; iter: 0; batch classifier loss: 0.585174; batch adversarial loss: 0.670254\n", "epoch 26; iter: 0; batch classifier loss: 0.583103; batch adversarial loss: 0.633709\n", "epoch 27; iter: 0; batch classifier loss: 0.606510; batch adversarial loss: 0.645707\n", "epoch 28; iter: 0; batch classifier loss: 0.582080; batch adversarial loss: 0.695471\n", "epoch 29; iter: 0; batch classifier loss: 0.598255; batch adversarial loss: 0.624126\n", "epoch 30; iter: 0; batch classifier loss: 0.585683; batch adversarial loss: 0.615283\n", "epoch 31; iter: 0; batch classifier loss: 0.507424; batch adversarial loss: 0.642617\n", "epoch 32; iter: 0; batch classifier loss: 0.536420; batch adversarial loss: 0.622242\n", "epoch 33; iter: 0; batch classifier loss: 0.520318; batch adversarial loss: 0.654869\n", "epoch 34; iter: 0; batch classifier loss: 0.610233; batch adversarial loss: 0.639153\n", "epoch 35; iter: 0; batch classifier loss: 0.554781; batch adversarial loss: 0.661504\n", "epoch 36; iter: 0; batch classifier loss: 0.572235; batch adversarial loss: 0.625446\n", "epoch 37; iter: 0; batch classifier loss: 0.555038; batch adversarial loss: 0.650518\n", "epoch 38; iter: 0; batch classifier loss: 0.591223; batch adversarial loss: 0.622105\n", "epoch 39; iter: 0; batch classifier loss: 0.520023; batch adversarial loss: 0.669646\n", "epoch 
40; iter: 0; batch classifier loss: 0.586169; batch adversarial loss: 0.624770\n", "epoch 41; iter: 0; batch classifier loss: 0.602615; batch adversarial loss: 0.650531\n", "epoch 42; iter: 0; batch classifier loss: 0.592935; batch adversarial loss: 0.650475\n", "epoch 43; iter: 0; batch classifier loss: 0.499415; batch adversarial loss: 0.642423\n", "epoch 44; iter: 0; batch classifier loss: 0.563787; batch adversarial loss: 0.623816\n", "epoch 45; iter: 0; batch classifier loss: 0.526961; batch adversarial loss: 0.640259\n", "epoch 46; iter: 0; batch classifier loss: 0.610158; batch adversarial loss: 0.627969\n", "epoch 47; iter: 0; batch classifier loss: 0.565620; batch adversarial loss: 0.654067\n", "epoch 48; iter: 0; batch classifier loss: 0.555428; batch adversarial loss: 0.600727\n", "epoch 49; iter: 0; batch classifier loss: 0.659450; batch adversarial loss: 0.631347\n", "epoch 0; iter: 0; batch classifier loss: 0.980618; batch adversarial loss: 0.636991\n", "epoch 1; iter: 0; batch classifier loss: 0.772691; batch adversarial loss: 0.691015\n", "epoch 2; iter: 0; batch classifier loss: 0.714705; batch adversarial loss: 0.652715\n", "epoch 3; iter: 0; batch classifier loss: 0.667691; batch adversarial loss: 0.711316\n", "epoch 4; iter: 0; batch classifier loss: 0.722761; batch adversarial loss: 0.670542\n", "epoch 5; iter: 0; batch classifier loss: 0.631507; batch adversarial loss: 0.671619\n", "epoch 6; iter: 0; batch classifier loss: 0.623865; batch adversarial loss: 0.700565\n", "epoch 7; iter: 0; batch classifier loss: 0.602710; batch adversarial loss: 0.672841\n", "epoch 8; iter: 0; batch classifier loss: 0.563048; batch adversarial loss: 0.669574\n", "epoch 9; iter: 0; batch classifier loss: 0.585108; batch adversarial loss: 0.650693\n", "epoch 10; iter: 0; batch classifier loss: 0.520814; batch adversarial loss: 0.714467\n", "epoch 11; iter: 0; batch classifier loss: 0.597176; batch adversarial loss: 0.668571\n", "epoch 12; iter: 0; batch classifier 
loss: 0.643618; batch adversarial loss: 0.676159\n", "epoch 13; iter: 0; batch classifier loss: 0.607157; batch adversarial loss: 0.635873\n", "epoch 14; iter: 0; batch classifier loss: 0.590296; batch adversarial loss: 0.665441\n", "epoch 15; iter: 0; batch classifier loss: 0.571676; batch adversarial loss: 0.686577\n", "epoch 16; iter: 0; batch classifier loss: 0.538355; batch adversarial loss: 0.721953\n", "epoch 17; iter: 0; batch classifier loss: 0.562292; batch adversarial loss: 0.634421\n", "epoch 18; iter: 0; batch classifier loss: 0.620733; batch adversarial loss: 0.679446\n", "epoch 19; iter: 0; batch classifier loss: 0.582858; batch adversarial loss: 0.652675\n", "epoch 20; iter: 0; batch classifier loss: 0.631856; batch adversarial loss: 0.662723\n", "epoch 21; iter: 0; batch classifier loss: 0.605410; batch adversarial loss: 0.631887\n", "epoch 22; iter: 0; batch classifier loss: 0.581333; batch adversarial loss: 0.655450\n", "epoch 23; iter: 0; batch classifier loss: 0.592902; batch adversarial loss: 0.631428\n", "epoch 24; iter: 0; batch classifier loss: 0.647945; batch adversarial loss: 0.694012\n", "epoch 25; iter: 0; batch classifier loss: 0.568416; batch adversarial loss: 0.630440\n", "epoch 26; iter: 0; batch classifier loss: 0.583250; batch adversarial loss: 0.635410\n", "epoch 27; iter: 0; batch classifier loss: 0.561080; batch adversarial loss: 0.638434\n", "epoch 28; iter: 0; batch classifier loss: 0.643283; batch adversarial loss: 0.610751\n", "epoch 29; iter: 0; batch classifier loss: 0.579853; batch adversarial loss: 0.672730\n", "epoch 30; iter: 0; batch classifier loss: 0.563068; batch adversarial loss: 0.653030\n", "epoch 31; iter: 0; batch classifier loss: 0.547667; batch adversarial loss: 0.639428\n", "epoch 32; iter: 0; batch classifier loss: 0.616764; batch adversarial loss: 0.662658\n", "epoch 33; iter: 0; batch classifier loss: 0.541108; batch adversarial loss: 0.624902\n", "epoch 34; iter: 0; batch classifier loss: 0.582482; 
batch adversarial loss: 0.642642\n", "epoch 35; iter: 0; batch classifier loss: 0.487433; batch adversarial loss: 0.591660\n", "epoch 36; iter: 0; batch classifier loss: 0.572450; batch adversarial loss: 0.646861\n", "epoch 37; iter: 0; batch classifier loss: 0.586598; batch adversarial loss: 0.628944\n", "epoch 38; iter: 0; batch classifier loss: 0.574248; batch adversarial loss: 0.658884\n", "epoch 39; iter: 0; batch classifier loss: 0.552582; batch adversarial loss: 0.633940\n", "epoch 40; iter: 0; batch classifier loss: 0.531395; batch adversarial loss: 0.618732\n", "epoch 41; iter: 0; batch classifier loss: 0.594319; batch adversarial loss: 0.693685\n", "epoch 42; iter: 0; batch classifier loss: 0.577085; batch adversarial loss: 0.643947\n", "epoch 43; iter: 0; batch classifier loss: 0.629182; batch adversarial loss: 0.596142\n", "epoch 44; iter: 0; batch classifier loss: 0.586320; batch adversarial loss: 0.657772\n", "epoch 45; iter: 0; batch classifier loss: 0.619961; batch adversarial loss: 0.625644\n", "epoch 46; iter: 0; batch classifier loss: 0.561095; batch adversarial loss: 0.628183\n", "epoch 47; iter: 0; batch classifier loss: 0.545862; batch adversarial loss: 0.673661\n", "epoch 48; iter: 0; batch classifier loss: 0.611978; batch adversarial loss: 0.629328\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "epoch 49; iter: 0; batch classifier loss: 0.556630; batch adversarial loss: 0.647700\n", "epoch 0; iter: 0; batch classifier loss: 1.906523; batch adversarial loss: 0.692811\n", "epoch 1; iter: 0; batch classifier loss: 0.840882; batch adversarial loss: 0.662183\n", "epoch 2; iter: 0; batch classifier loss: 0.700949; batch adversarial loss: 0.737353\n", "epoch 3; iter: 0; batch classifier loss: 0.623127; batch adversarial loss: 0.713265\n", "epoch 4; iter: 0; batch classifier loss: 0.643547; batch adversarial loss: 0.702123\n", "epoch 5; iter: 0; batch classifier loss: 0.682118; batch adversarial loss: 0.708217\n", "epoch 6; iter: 0; 
batch classifier loss: 0.655846; batch adversarial loss: 0.683245\n", "epoch 7; iter: 0; batch classifier loss: 0.637736; batch adversarial loss: 0.635302\n", "epoch 8; iter: 0; batch classifier loss: 0.635454; batch adversarial loss: 0.651661\n", "epoch 9; iter: 0; batch classifier loss: 0.637401; batch adversarial loss: 0.697933\n", "epoch 10; iter: 0; batch classifier loss: 0.516233; batch adversarial loss: 0.730907\n", "epoch 11; iter: 0; batch classifier loss: 0.572001; batch adversarial loss: 0.632851\n", "epoch 12; iter: 0; batch classifier loss: 0.619478; batch adversarial loss: 0.705136\n", "epoch 13; iter: 0; batch classifier loss: 0.649814; batch adversarial loss: 0.686819\n", "epoch 14; iter: 0; batch classifier loss: 0.554341; batch adversarial loss: 0.648599\n", "epoch 15; iter: 0; batch classifier loss: 0.630471; batch adversarial loss: 0.676725\n", "epoch 16; iter: 0; batch classifier loss: 0.572000; batch adversarial loss: 0.647445\n", "epoch 17; iter: 0; batch classifier loss: 0.586704; batch adversarial loss: 0.673816\n", "epoch 18; iter: 0; batch classifier loss: 0.550577; batch adversarial loss: 0.656888\n", "epoch 19; iter: 0; batch classifier loss: 0.590844; batch adversarial loss: 0.645773\n", "epoch 20; iter: 0; batch classifier loss: 0.552860; batch adversarial loss: 0.624596\n", "epoch 21; iter: 0; batch classifier loss: 0.562224; batch adversarial loss: 0.654888\n", "epoch 22; iter: 0; batch classifier loss: 0.564140; batch adversarial loss: 0.686775\n", "epoch 23; iter: 0; batch classifier loss: 0.549911; batch adversarial loss: 0.635536\n", "epoch 24; iter: 0; batch classifier loss: 0.598656; batch adversarial loss: 0.642810\n", "epoch 25; iter: 0; batch classifier loss: 0.576286; batch adversarial loss: 0.636326\n", "epoch 26; iter: 0; batch classifier loss: 0.569018; batch adversarial loss: 0.639561\n", "epoch 27; iter: 0; batch classifier loss: 0.597283; batch adversarial loss: 0.709730\n", "epoch 28; iter: 0; batch classifier loss: 
0.662572; batch adversarial loss: 0.628857\n", "epoch 29; iter: 0; batch classifier loss: 0.640725; batch adversarial loss: 0.673747\n", "epoch 30; iter: 0; batch classifier loss: 0.537371; batch adversarial loss: 0.652736\n", "epoch 31; iter: 0; batch classifier loss: 0.646521; batch adversarial loss: 0.659063\n", "epoch 32; iter: 0; batch classifier loss: 0.621189; batch adversarial loss: 0.624136\n", "epoch 33; iter: 0; batch classifier loss: 0.604486; batch adversarial loss: 0.652873\n", "epoch 34; iter: 0; batch classifier loss: 0.611144; batch adversarial loss: 0.637682\n", "epoch 35; iter: 0; batch classifier loss: 0.598278; batch adversarial loss: 0.665317\n", "epoch 36; iter: 0; batch classifier loss: 0.626691; batch adversarial loss: 0.651652\n", "epoch 37; iter: 0; batch classifier loss: 0.604883; batch adversarial loss: 0.641420\n", "epoch 38; iter: 0; batch classifier loss: 0.578888; batch adversarial loss: 0.625179\n", "epoch 39; iter: 0; batch classifier loss: 0.585398; batch adversarial loss: 0.652768\n", "epoch 40; iter: 0; batch classifier loss: 0.602085; batch adversarial loss: 0.665900\n", "epoch 41; iter: 0; batch classifier loss: 0.577875; batch adversarial loss: 0.659617\n", "epoch 42; iter: 0; batch classifier loss: 0.552104; batch adversarial loss: 0.636055\n", "epoch 43; iter: 0; batch classifier loss: 0.623055; batch adversarial loss: 0.638046\n", "epoch 44; iter: 0; batch classifier loss: 0.598822; batch adversarial loss: 0.640457\n", "epoch 45; iter: 0; batch classifier loss: 0.564554; batch adversarial loss: 0.634101\n", "epoch 46; iter: 0; batch classifier loss: 0.580493; batch adversarial loss: 0.650770\n", "epoch 47; iter: 0; batch classifier loss: 0.567602; batch adversarial loss: 0.642866\n", "epoch 48; iter: 0; batch classifier loss: 0.600937; batch adversarial loss: 0.631129\n", "epoch 49; iter: 0; batch classifier loss: 0.558536; batch adversarial loss: 0.652978\n", "epoch 0; iter: 0; batch classifier loss: 0.862767; batch 
adversarial loss: 0.995026\n", "epoch 1; iter: 0; batch classifier loss: 1.015677; batch adversarial loss: 1.036755\n", "epoch 2; iter: 0; batch classifier loss: 0.972327; batch adversarial loss: 1.090168\n", "epoch 3; iter: 0; batch classifier loss: 0.769327; batch adversarial loss: 0.962409\n", "epoch 4; iter: 0; batch classifier loss: 0.852513; batch adversarial loss: 0.976979\n", "epoch 5; iter: 0; batch classifier loss: 0.778908; batch adversarial loss: 0.919723\n", "epoch 6; iter: 0; batch classifier loss: 0.749292; batch adversarial loss: 0.915335\n", "epoch 7; iter: 0; batch classifier loss: 0.696477; batch adversarial loss: 0.858897\n", "epoch 8; iter: 0; batch classifier loss: 0.682704; batch adversarial loss: 0.821878\n", "epoch 9; iter: 0; batch classifier loss: 0.697151; batch adversarial loss: 0.850032\n", "epoch 10; iter: 0; batch classifier loss: 0.683152; batch adversarial loss: 0.823052\n", "epoch 11; iter: 0; batch classifier loss: 0.766225; batch adversarial loss: 0.832651\n", "epoch 12; iter: 0; batch classifier loss: 0.758398; batch adversarial loss: 0.795057\n", "epoch 13; iter: 0; batch classifier loss: 0.654123; batch adversarial loss: 0.736719\n", "epoch 14; iter: 0; batch classifier loss: 0.676862; batch adversarial loss: 0.774385\n", "epoch 15; iter: 0; batch classifier loss: 0.691933; batch adversarial loss: 0.749204\n", "epoch 16; iter: 0; batch classifier loss: 0.596564; batch adversarial loss: 0.748302\n", "epoch 17; iter: 0; batch classifier loss: 0.637721; batch adversarial loss: 0.744950\n", "epoch 18; iter: 0; batch classifier loss: 0.683758; batch adversarial loss: 0.726230\n", "epoch 19; iter: 0; batch classifier loss: 0.625166; batch adversarial loss: 0.688270\n", "epoch 20; iter: 0; batch classifier loss: 0.608840; batch adversarial loss: 0.723509\n", "epoch 21; iter: 0; batch classifier loss: 0.628701; batch adversarial loss: 0.716399\n", "epoch 22; iter: 0; batch classifier loss: 0.631409; batch adversarial loss: 
0.749333\n", "epoch 23; iter: 0; batch classifier loss: 0.612819; batch adversarial loss: 0.659810\n", "epoch 24; iter: 0; batch classifier loss: 0.636820; batch adversarial loss: 0.701821\n", "epoch 25; iter: 0; batch classifier loss: 0.617702; batch adversarial loss: 0.684841\n", "epoch 26; iter: 0; batch classifier loss: 0.576829; batch adversarial loss: 0.694599\n", "epoch 27; iter: 0; batch classifier loss: 0.615575; batch adversarial loss: 0.681354\n", "epoch 28; iter: 0; batch classifier loss: 0.571081; batch adversarial loss: 0.693325\n", "epoch 29; iter: 0; batch classifier loss: 0.626128; batch adversarial loss: 0.670374\n", "epoch 30; iter: 0; batch classifier loss: 0.693191; batch adversarial loss: 0.684600\n", "epoch 31; iter: 0; batch classifier loss: 0.589311; batch adversarial loss: 0.701679\n", "epoch 32; iter: 0; batch classifier loss: 0.571186; batch adversarial loss: 0.677862\n", "epoch 33; iter: 0; batch classifier loss: 0.551196; batch adversarial loss: 0.719874\n", "epoch 34; iter: 0; batch classifier loss: 0.581493; batch adversarial loss: 0.670230\n", "epoch 35; iter: 0; batch classifier loss: 0.633269; batch adversarial loss: 0.666200\n", "epoch 36; iter: 0; batch classifier loss: 0.610058; batch adversarial loss: 0.639056\n", "epoch 37; iter: 0; batch classifier loss: 0.588288; batch adversarial loss: 0.690228\n", "epoch 38; iter: 0; batch classifier loss: 0.624657; batch adversarial loss: 0.670120\n", "epoch 39; iter: 0; batch classifier loss: 0.614995; batch adversarial loss: 0.673911\n", "epoch 40; iter: 0; batch classifier loss: 0.594897; batch adversarial loss: 0.645674\n", "epoch 41; iter: 0; batch classifier loss: 0.566901; batch adversarial loss: 0.660891\n", "epoch 42; iter: 0; batch classifier loss: 0.555223; batch adversarial loss: 0.682021\n", "epoch 43; iter: 0; batch classifier loss: 0.639825; batch adversarial loss: 0.649632\n", "epoch 44; iter: 0; batch classifier loss: 0.611916; batch adversarial loss: 0.721161\n", "epoch 
45; iter: 0; batch classifier loss: 0.566475; batch adversarial loss: 0.692769\n", "epoch 46; iter: 0; batch classifier loss: 0.615430; batch adversarial loss: 0.631117\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "epoch 47; iter: 0; batch classifier loss: 0.509616; batch adversarial loss: 0.664190\n", "epoch 48; iter: 0; batch classifier loss: 0.523164; batch adversarial loss: 0.630367\n", "epoch 49; iter: 0; batch classifier loss: 0.587686; batch adversarial loss: 0.606352\n", "epoch 0; iter: 0; batch classifier loss: 1.501014; batch adversarial loss: 0.817721\n", "epoch 1; iter: 0; batch classifier loss: 0.857626; batch adversarial loss: 0.788193\n", "epoch 2; iter: 0; batch classifier loss: 0.793460; batch adversarial loss: 0.784312\n", "epoch 3; iter: 0; batch classifier loss: 0.712284; batch adversarial loss: 0.750757\n", "epoch 4; iter: 0; batch classifier loss: 0.742962; batch adversarial loss: 0.732566\n", "epoch 5; iter: 0; batch classifier loss: 0.637667; batch adversarial loss: 0.718790\n", "epoch 6; iter: 0; batch classifier loss: 0.612970; batch adversarial loss: 0.709093\n", "epoch 7; iter: 0; batch classifier loss: 0.604810; batch adversarial loss: 0.693429\n", "epoch 8; iter: 0; batch classifier loss: 0.627262; batch adversarial loss: 0.688469\n", "epoch 9; iter: 0; batch classifier loss: 0.738526; batch adversarial loss: 0.677621\n", "epoch 10; iter: 0; batch classifier loss: 0.641201; batch adversarial loss: 0.671810\n", "epoch 11; iter: 0; batch classifier loss: 0.629877; batch adversarial loss: 0.656107\n", "epoch 12; iter: 0; batch classifier loss: 0.601683; batch adversarial loss: 0.664947\n", "epoch 13; iter: 0; batch classifier loss: 0.588363; batch adversarial loss: 0.643415\n", "epoch 14; iter: 0; batch classifier loss: 0.613469; batch adversarial loss: 0.627996\n", "epoch 15; iter: 0; batch classifier loss: 0.594271; batch adversarial loss: 0.648144\n", "epoch 16; iter: 0; batch classifier loss: 0.628050; batch 
adversarial loss: 0.648448\n", "epoch 17; iter: 0; batch classifier loss: 0.522780; batch adversarial loss: 0.676669\n", "epoch 18; iter: 0; batch classifier loss: 0.631481; batch adversarial loss: 0.625192\n", "epoch 19; iter: 0; batch classifier loss: 0.578748; batch adversarial loss: 0.644324\n", "epoch 20; iter: 0; batch classifier loss: 0.558233; batch adversarial loss: 0.657580\n", "epoch 21; iter: 0; batch classifier loss: 0.598591; batch adversarial loss: 0.665631\n", "epoch 22; iter: 0; batch classifier loss: 0.589780; batch adversarial loss: 0.627765\n", "epoch 23; iter: 0; batch classifier loss: 0.606864; batch adversarial loss: 0.588875\n", "epoch 24; iter: 0; batch classifier loss: 0.567765; batch adversarial loss: 0.641853\n", "epoch 25; iter: 0; batch classifier loss: 0.611223; batch adversarial loss: 0.632550\n", "epoch 26; iter: 0; batch classifier loss: 0.617759; batch adversarial loss: 0.684127\n", "epoch 27; iter: 0; batch classifier loss: 0.585525; batch adversarial loss: 0.676529\n", "epoch 28; iter: 0; batch classifier loss: 0.593098; batch adversarial loss: 0.610949\n", "epoch 29; iter: 0; batch classifier loss: 0.588491; batch adversarial loss: 0.653075\n", "epoch 30; iter: 0; batch classifier loss: 0.542846; batch adversarial loss: 0.693415\n", "epoch 31; iter: 0; batch classifier loss: 0.564689; batch adversarial loss: 0.658845\n", "epoch 32; iter: 0; batch classifier loss: 0.584894; batch adversarial loss: 0.630757\n", "epoch 33; iter: 0; batch classifier loss: 0.628377; batch adversarial loss: 0.662309\n", "epoch 34; iter: 0; batch classifier loss: 0.609364; batch adversarial loss: 0.625341\n", "epoch 35; iter: 0; batch classifier loss: 0.544045; batch adversarial loss: 0.664349\n", "epoch 36; iter: 0; batch classifier loss: 0.575548; batch adversarial loss: 0.631110\n", "epoch 37; iter: 0; batch classifier loss: 0.564394; batch adversarial loss: 0.624849\n", "epoch 38; iter: 0; batch classifier loss: 0.574940; batch adversarial loss: 
0.717950\n", "epoch 39; iter: 0; batch classifier loss: 0.573482; batch adversarial loss: 0.650203\n", "epoch 40; iter: 0; batch classifier loss: 0.568365; batch adversarial loss: 0.663334\n", "epoch 41; iter: 0; batch classifier loss: 0.555803; batch adversarial loss: 0.616476\n", "epoch 42; iter: 0; batch classifier loss: 0.603914; batch adversarial loss: 0.631273\n", "epoch 43; iter: 0; batch classifier loss: 0.614373; batch adversarial loss: 0.654205\n", "epoch 44; iter: 0; batch classifier loss: 0.505885; batch adversarial loss: 0.654238\n", "epoch 45; iter: 0; batch classifier loss: 0.586571; batch adversarial loss: 0.632091\n", "epoch 46; iter: 0; batch classifier loss: 0.570634; batch adversarial loss: 0.611606\n", "epoch 47; iter: 0; batch classifier loss: 0.556927; batch adversarial loss: 0.678882\n", "epoch 48; iter: 0; batch classifier loss: 0.587705; batch adversarial loss: 0.624998\n", "epoch 49; iter: 0; batch classifier loss: 0.592716; batch adversarial loss: 0.643393\n", "epoch 0; iter: 0; batch classifier loss: 1.046204; batch adversarial loss: 0.660347\n", "epoch 1; iter: 0; batch classifier loss: 0.735974; batch adversarial loss: 0.688973\n", "epoch 2; iter: 0; batch classifier loss: 0.625763; batch adversarial loss: 0.673552\n", "epoch 3; iter: 0; batch classifier loss: 0.634703; batch adversarial loss: 0.687942\n", "epoch 4; iter: 0; batch classifier loss: 0.643348; batch adversarial loss: 0.697454\n", "epoch 5; iter: 0; batch classifier loss: 0.633260; batch adversarial loss: 0.698689\n", "epoch 6; iter: 0; batch classifier loss: 0.660761; batch adversarial loss: 0.676040\n", "epoch 7; iter: 0; batch classifier loss: 0.655106; batch adversarial loss: 0.739734\n", "epoch 8; iter: 0; batch classifier loss: 0.583651; batch adversarial loss: 0.696957\n", "epoch 9; iter: 0; batch classifier loss: 0.591898; batch adversarial loss: 0.700386\n", "epoch 10; iter: 0; batch classifier loss: 0.560254; batch adversarial loss: 0.732268\n", "epoch 11; iter: 
0; batch classifier loss: 0.608907; batch adversarial loss: 0.700659\n", "epoch 12; iter: 0; batch classifier loss: 0.602107; batch adversarial loss: 0.656127\n", "epoch 13; iter: 0; batch classifier loss: 0.635230; batch adversarial loss: 0.720145\n", "epoch 14; iter: 0; batch classifier loss: 0.549650; batch adversarial loss: 0.674980\n", "epoch 15; iter: 0; batch classifier loss: 0.618191; batch adversarial loss: 0.671966\n", "epoch 16; iter: 0; batch classifier loss: 0.578032; batch adversarial loss: 0.653592\n", "epoch 17; iter: 0; batch classifier loss: 0.617681; batch adversarial loss: 0.666002\n", "epoch 18; iter: 0; batch classifier loss: 0.594393; batch adversarial loss: 0.660972\n", "epoch 19; iter: 0; batch classifier loss: 0.565911; batch adversarial loss: 0.635590\n", "epoch 20; iter: 0; batch classifier loss: 0.589836; batch adversarial loss: 0.649445\n", "epoch 21; iter: 0; batch classifier loss: 0.589444; batch adversarial loss: 0.654227\n", "epoch 22; iter: 0; batch classifier loss: 0.545455; batch adversarial loss: 0.658150\n", "epoch 23; iter: 0; batch classifier loss: 0.576481; batch adversarial loss: 0.655856\n", "epoch 24; iter: 0; batch classifier loss: 0.602988; batch adversarial loss: 0.650196\n", "epoch 25; iter: 0; batch classifier loss: 0.709599; batch adversarial loss: 0.692910\n", "epoch 26; iter: 0; batch classifier loss: 0.604488; batch adversarial loss: 0.648564\n", "epoch 27; iter: 0; batch classifier loss: 0.624722; batch adversarial loss: 0.665295\n", "epoch 28; iter: 0; batch classifier loss: 0.573751; batch adversarial loss: 0.651518\n", "epoch 29; iter: 0; batch classifier loss: 0.620761; batch adversarial loss: 0.662715\n", "epoch 30; iter: 0; batch classifier loss: 0.619415; batch adversarial loss: 0.606341\n", "epoch 31; iter: 0; batch classifier loss: 0.597153; batch adversarial loss: 0.629168\n", "epoch 32; iter: 0; batch classifier loss: 0.542483; batch adversarial loss: 0.639895\n", "epoch 33; iter: 0; batch classifier 
loss: 0.622910; batch adversarial loss: 0.640877\n", "epoch 34; iter: 0; batch classifier loss: 0.596589; batch adversarial loss: 0.666764\n", "epoch 35; iter: 0; batch classifier loss: 0.578334; batch adversarial loss: 0.643121\n", "epoch 36; iter: 0; batch classifier loss: 0.517053; batch adversarial loss: 0.653589\n", "epoch 37; iter: 0; batch classifier loss: 0.546902; batch adversarial loss: 0.613802\n", "epoch 38; iter: 0; batch classifier loss: 0.640115; batch adversarial loss: 0.676277\n", "epoch 39; iter: 0; batch classifier loss: 0.611142; batch adversarial loss: 0.677764\n", "epoch 40; iter: 0; batch classifier loss: 0.586480; batch adversarial loss: 0.646376\n", "epoch 41; iter: 0; batch classifier loss: 0.546239; batch adversarial loss: 0.654433\n", "epoch 42; iter: 0; batch classifier loss: 0.503816; batch adversarial loss: 0.655020\n", "epoch 43; iter: 0; batch classifier loss: 0.538433; batch adversarial loss: 0.654345\n", "epoch 44; iter: 0; batch classifier loss: 0.590383; batch adversarial loss: 0.668953\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "epoch 45; iter: 0; batch classifier loss: 0.590787; batch adversarial loss: 0.644131\n", "epoch 46; iter: 0; batch classifier loss: 0.530629; batch adversarial loss: 0.654872\n", "epoch 47; iter: 0; batch classifier loss: 0.533250; batch adversarial loss: 0.610568\n", "epoch 48; iter: 0; batch classifier loss: 0.564394; batch adversarial loss: 0.615931\n", "epoch 49; iter: 0; batch classifier loss: 0.618759; batch adversarial loss: 0.638546\n", "epoch 0; iter: 0; batch classifier loss: 1.407643; batch adversarial loss: 0.688694\n", "epoch 1; iter: 0; batch classifier loss: 0.738437; batch adversarial loss: 0.704925\n", "epoch 2; iter: 0; batch classifier loss: 0.619612; batch adversarial loss: 0.683771\n", "epoch 3; iter: 0; batch classifier loss: 0.672768; batch adversarial loss: 0.647484\n", "epoch 4; iter: 0; batch classifier loss: 0.689663; batch adversarial loss: 0.675743\n", 
"epoch 5; iter: 0; batch classifier loss: 0.609593; batch adversarial loss: 0.657576\n", "epoch 6; iter: 0; batch classifier loss: 0.622752; batch adversarial loss: 0.677839\n", "epoch 7; iter: 0; batch classifier loss: 0.612981; batch adversarial loss: 0.708891\n", "epoch 8; iter: 0; batch classifier loss: 0.570167; batch adversarial loss: 0.665061\n", "epoch 9; iter: 0; batch classifier loss: 0.589414; batch adversarial loss: 0.651380\n", "epoch 10; iter: 0; batch classifier loss: 0.627403; batch adversarial loss: 0.633828\n", "epoch 11; iter: 0; batch classifier loss: 0.603636; batch adversarial loss: 0.667230\n", "epoch 12; iter: 0; batch classifier loss: 0.552787; batch adversarial loss: 0.624296\n", "epoch 13; iter: 0; batch classifier loss: 0.570329; batch adversarial loss: 0.692457\n", "epoch 14; iter: 0; batch classifier loss: 0.561762; batch adversarial loss: 0.675442\n", "epoch 15; iter: 0; batch classifier loss: 0.618151; batch adversarial loss: 0.625080\n", "epoch 16; iter: 0; batch classifier loss: 0.582873; batch adversarial loss: 0.684486\n", "epoch 17; iter: 0; batch classifier loss: 0.553802; batch adversarial loss: 0.665596\n", "epoch 18; iter: 0; batch classifier loss: 0.665350; batch adversarial loss: 0.665729\n", "epoch 19; iter: 0; batch classifier loss: 0.563647; batch adversarial loss: 0.646476\n", "epoch 20; iter: 0; batch classifier loss: 0.686074; batch adversarial loss: 0.658628\n", "epoch 21; iter: 0; batch classifier loss: 0.571004; batch adversarial loss: 0.635422\n", "epoch 22; iter: 0; batch classifier loss: 0.509560; batch adversarial loss: 0.663705\n", "epoch 23; iter: 0; batch classifier loss: 0.640952; batch adversarial loss: 0.670341\n", "epoch 24; iter: 0; batch classifier loss: 0.587306; batch adversarial loss: 0.674035\n", "epoch 25; iter: 0; batch classifier loss: 0.549023; batch adversarial loss: 0.641436\n", "epoch 26; iter: 0; batch classifier loss: 0.601045; batch adversarial loss: 0.652118\n", "epoch 27; iter: 0; 
batch classifier loss: 0.487973; batch adversarial loss: 0.666464\n", "epoch 28; iter: 0; batch classifier loss: 0.655029; batch adversarial loss: 0.613275\n", "epoch 29; iter: 0; batch classifier loss: 0.593322; batch adversarial loss: 0.653018\n", "epoch 30; iter: 0; batch classifier loss: 0.589598; batch adversarial loss: 0.640741\n", "epoch 31; iter: 0; batch classifier loss: 0.559717; batch adversarial loss: 0.636658\n", "epoch 32; iter: 0; batch classifier loss: 0.599122; batch adversarial loss: 0.647996\n", "epoch 33; iter: 0; batch classifier loss: 0.611614; batch adversarial loss: 0.655075\n", "epoch 34; iter: 0; batch classifier loss: 0.584445; batch adversarial loss: 0.666081\n", "epoch 35; iter: 0; batch classifier loss: 0.539033; batch adversarial loss: 0.624313\n", "epoch 36; iter: 0; batch classifier loss: 0.596741; batch adversarial loss: 0.608516\n", "epoch 37; iter: 0; batch classifier loss: 0.592865; batch adversarial loss: 0.651269\n", "epoch 38; iter: 0; batch classifier loss: 0.561591; batch adversarial loss: 0.648676\n", "epoch 39; iter: 0; batch classifier loss: 0.569907; batch adversarial loss: 0.652209\n", "epoch 40; iter: 0; batch classifier loss: 0.615716; batch adversarial loss: 0.631905\n", "epoch 41; iter: 0; batch classifier loss: 0.593096; batch adversarial loss: 0.659411\n", "epoch 42; iter: 0; batch classifier loss: 0.582182; batch adversarial loss: 0.683472\n", "epoch 43; iter: 0; batch classifier loss: 0.506200; batch adversarial loss: 0.649818\n", "epoch 44; iter: 0; batch classifier loss: 0.540876; batch adversarial loss: 0.645240\n", "epoch 45; iter: 0; batch classifier loss: 0.593076; batch adversarial loss: 0.589605\n", "epoch 46; iter: 0; batch classifier loss: 0.607502; batch adversarial loss: 0.683193\n", "epoch 47; iter: 0; batch classifier loss: 0.588726; batch adversarial loss: 0.624911\n", "epoch 48; iter: 0; batch classifier loss: 0.570990; batch adversarial loss: 0.609731\n", "epoch 49; iter: 0; batch classifier 
loss: 0.541064; batch adversarial loss: 0.632150\n", "epoch 0; iter: 0; batch classifier loss: 1.140840; batch adversarial loss: 0.686592\n", "epoch 1; iter: 0; batch classifier loss: 0.690314; batch adversarial loss: 0.646984\n", "epoch 2; iter: 0; batch classifier loss: 0.680523; batch adversarial loss: 0.680034\n", "epoch 3; iter: 0; batch classifier loss: 0.598954; batch adversarial loss: 0.676844\n", "epoch 4; iter: 0; batch classifier loss: 0.624966; batch adversarial loss: 0.677777\n", "epoch 5; iter: 0; batch classifier loss: 0.630695; batch adversarial loss: 0.678083\n", "epoch 6; iter: 0; batch classifier loss: 0.605373; batch adversarial loss: 0.662384\n", "epoch 7; iter: 0; batch classifier loss: 0.643467; batch adversarial loss: 0.666940\n", "epoch 8; iter: 0; batch classifier loss: 0.630116; batch adversarial loss: 0.665797\n", "epoch 9; iter: 0; batch classifier loss: 0.626258; batch adversarial loss: 0.663837\n", "epoch 10; iter: 0; batch classifier loss: 0.575881; batch adversarial loss: 0.636281\n", "epoch 11; iter: 0; batch classifier loss: 0.602486; batch adversarial loss: 0.676965\n", "epoch 12; iter: 0; batch classifier loss: 0.573321; batch adversarial loss: 0.655087\n", "epoch 13; iter: 0; batch classifier loss: 0.578274; batch adversarial loss: 0.661448\n", "epoch 14; iter: 0; batch classifier loss: 0.583333; batch adversarial loss: 0.673713\n", "epoch 15; iter: 0; batch classifier loss: 0.605180; batch adversarial loss: 0.670577\n", "epoch 16; iter: 0; batch classifier loss: 0.539716; batch adversarial loss: 0.642051\n", "epoch 17; iter: 0; batch classifier loss: 0.584636; batch adversarial loss: 0.684088\n", "epoch 18; iter: 0; batch classifier loss: 0.587304; batch adversarial loss: 0.644112\n", "epoch 19; iter: 0; batch classifier loss: 0.544410; batch adversarial loss: 0.648348\n", "epoch 20; iter: 0; batch classifier loss: 0.612893; batch adversarial loss: 0.615456\n", "epoch 21; iter: 0; batch classifier loss: 0.662384; batch 
adversarial loss: 0.642103\n", "epoch 22; iter: 0; batch classifier loss: 0.561414; batch adversarial loss: 0.652324\n", "epoch 23; iter: 0; batch classifier loss: 0.559410; batch adversarial loss: 0.658444\n", "epoch 24; iter: 0; batch classifier loss: 0.589119; batch adversarial loss: 0.644808\n", "epoch 25; iter: 0; batch classifier loss: 0.571991; batch adversarial loss: 0.667835\n", "epoch 26; iter: 0; batch classifier loss: 0.588403; batch adversarial loss: 0.648233\n", "epoch 27; iter: 0; batch classifier loss: 0.590847; batch adversarial loss: 0.630980\n", "epoch 28; iter: 0; batch classifier loss: 0.529487; batch adversarial loss: 0.641323\n", "epoch 29; iter: 0; batch classifier loss: 0.520567; batch adversarial loss: 0.648073\n", "epoch 30; iter: 0; batch classifier loss: 0.635515; batch adversarial loss: 0.617541\n", "epoch 31; iter: 0; batch classifier loss: 0.569916; batch adversarial loss: 0.692057\n", "epoch 32; iter: 0; batch classifier loss: 0.669616; batch adversarial loss: 0.646807\n", "epoch 33; iter: 0; batch classifier loss: 0.563257; batch adversarial loss: 0.640939\n", "epoch 34; iter: 0; batch classifier loss: 0.632970; batch adversarial loss: 0.680366\n", "epoch 35; iter: 0; batch classifier loss: 0.595709; batch adversarial loss: 0.638286\n", "epoch 36; iter: 0; batch classifier loss: 0.569776; batch adversarial loss: 0.618864\n", "epoch 37; iter: 0; batch classifier loss: 0.527300; batch adversarial loss: 0.657241\n", "epoch 38; iter: 0; batch classifier loss: 0.577031; batch adversarial loss: 0.634663\n", "epoch 39; iter: 0; batch classifier loss: 0.582849; batch adversarial loss: 0.625791\n", "epoch 40; iter: 0; batch classifier loss: 0.590501; batch adversarial loss: 0.632568\n", "epoch 41; iter: 0; batch classifier loss: 0.582391; batch adversarial loss: 0.632342\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "epoch 42; iter: 0; batch classifier loss: 0.573780; batch adversarial loss: 0.606970\n", "epoch 43; iter: 0; 
batch classifier loss: 0.507100; batch adversarial loss: 0.655658\n", "epoch 44; iter: 0; batch classifier loss: 0.615199; batch adversarial loss: 0.656611\n", "epoch 45; iter: 0; batch classifier loss: 0.585887; batch adversarial loss: 0.677781\n", "epoch 46; iter: 0; batch classifier loss: 0.604253; batch adversarial loss: 0.622422\n", "epoch 47; iter: 0; batch classifier loss: 0.544315; batch adversarial loss: 0.663204\n", "epoch 48; iter: 0; batch classifier loss: 0.525675; batch adversarial loss: 0.574582\n", "epoch 49; iter: 0; batch classifier loss: 0.506985; batch adversarial loss: 0.645881\n", "epoch 0; iter: 0; batch classifier loss: 0.838736; batch adversarial loss: 0.666675\n", "epoch 1; iter: 0; batch classifier loss: 0.666485; batch adversarial loss: 0.679395\n", "epoch 2; iter: 0; batch classifier loss: 0.647497; batch adversarial loss: 0.753328\n", "epoch 3; iter: 0; batch classifier loss: 0.613311; batch adversarial loss: 0.636765\n", "epoch 4; iter: 0; batch classifier loss: 0.617195; batch adversarial loss: 0.685886\n", "epoch 5; iter: 0; batch classifier loss: 0.624632; batch adversarial loss: 0.687409\n", "epoch 6; iter: 0; batch classifier loss: 0.614397; batch adversarial loss: 0.679180\n", "epoch 7; iter: 0; batch classifier loss: 0.572779; batch adversarial loss: 0.685229\n", "epoch 8; iter: 0; batch classifier loss: 0.577053; batch adversarial loss: 0.690509\n", "epoch 9; iter: 0; batch classifier loss: 0.602793; batch adversarial loss: 0.642608\n", "epoch 10; iter: 0; batch classifier loss: 0.626855; batch adversarial loss: 0.645161\n", "epoch 11; iter: 0; batch classifier loss: 0.606157; batch adversarial loss: 0.650430\n", "epoch 12; iter: 0; batch classifier loss: 0.608706; batch adversarial loss: 0.660868\n", "epoch 13; iter: 0; batch classifier loss: 0.556847; batch adversarial loss: 0.618347\n", "epoch 14; iter: 0; batch classifier loss: 0.506001; batch adversarial loss: 0.642298\n", "epoch 15; iter: 0; batch classifier loss: 
0.578482; batch adversarial loss: 0.670636\n", "epoch 16; iter: 0; batch classifier loss: 0.587824; batch adversarial loss: 0.650949\n", "epoch 17; iter: 0; batch classifier loss: 0.528919; batch adversarial loss: 0.659597\n", "epoch 18; iter: 0; batch classifier loss: 0.628401; batch adversarial loss: 0.650817\n", "epoch 19; iter: 0; batch classifier loss: 0.641597; batch adversarial loss: 0.653471\n", "epoch 20; iter: 0; batch classifier loss: 0.586883; batch adversarial loss: 0.630066\n", "epoch 21; iter: 0; batch classifier loss: 0.615888; batch adversarial loss: 0.637745\n", "epoch 22; iter: 0; batch classifier loss: 0.567656; batch adversarial loss: 0.667348\n", "epoch 23; iter: 0; batch classifier loss: 0.602220; batch adversarial loss: 0.660717\n", "epoch 24; iter: 0; batch classifier loss: 0.555222; batch adversarial loss: 0.627921\n", "epoch 25; iter: 0; batch classifier loss: 0.639615; batch adversarial loss: 0.699474\n", "epoch 26; iter: 0; batch classifier loss: 0.542405; batch adversarial loss: 0.665122\n", "epoch 27; iter: 0; batch classifier loss: 0.612478; batch adversarial loss: 0.623808\n", "epoch 28; iter: 0; batch classifier loss: 0.604135; batch adversarial loss: 0.663868\n", "epoch 29; iter: 0; batch classifier loss: 0.545118; batch adversarial loss: 0.621345\n", "epoch 30; iter: 0; batch classifier loss: 0.597467; batch adversarial loss: 0.649204\n", "epoch 31; iter: 0; batch classifier loss: 0.633824; batch adversarial loss: 0.660972\n", "epoch 32; iter: 0; batch classifier loss: 0.658961; batch adversarial loss: 0.641589\n", "epoch 33; iter: 0; batch classifier loss: 0.538498; batch adversarial loss: 0.638642\n", "epoch 34; iter: 0; batch classifier loss: 0.590526; batch adversarial loss: 0.612643\n", "epoch 35; iter: 0; batch classifier loss: 0.539063; batch adversarial loss: 0.697168\n", "epoch 36; iter: 0; batch classifier loss: 0.568123; batch adversarial loss: 0.608830\n", "epoch 37; iter: 0; batch classifier loss: 0.564549; batch 
adversarial loss: 0.625751\n", "epoch 38; iter: 0; batch classifier loss: 0.631033; batch adversarial loss: 0.602425\n", "epoch 39; iter: 0; batch classifier loss: 0.572021; batch adversarial loss: 0.653783\n", "epoch 40; iter: 0; batch classifier loss: 0.536810; batch adversarial loss: 0.626990\n", "epoch 41; iter: 0; batch classifier loss: 0.590729; batch adversarial loss: 0.649266\n", "epoch 42; iter: 0; batch classifier loss: 0.565029; batch adversarial loss: 0.644310\n", "epoch 43; iter: 0; batch classifier loss: 0.582824; batch adversarial loss: 0.626057\n", "epoch 44; iter: 0; batch classifier loss: 0.571911; batch adversarial loss: 0.618611\n", "epoch 45; iter: 0; batch classifier loss: 0.568840; batch adversarial loss: 0.632556\n", "epoch 46; iter: 0; batch classifier loss: 0.553110; batch adversarial loss: 0.586160\n", "epoch 47; iter: 0; batch classifier loss: 0.542814; batch adversarial loss: 0.593464\n", "epoch 48; iter: 0; batch classifier loss: 0.557397; batch adversarial loss: 0.641091\n", "epoch 49; iter: 0; batch classifier loss: 0.597960; batch adversarial loss: 0.619575\n", "epoch 0; iter: 0; batch classifier loss: 0.911208; batch adversarial loss: 0.655766\n", "epoch 1; iter: 0; batch classifier loss: 0.698189; batch adversarial loss: 0.706118\n", "epoch 2; iter: 0; batch classifier loss: 0.644348; batch adversarial loss: 0.674534\n", "epoch 3; iter: 0; batch classifier loss: 0.638047; batch adversarial loss: 0.723701\n", "epoch 4; iter: 0; batch classifier loss: 0.610060; batch adversarial loss: 0.667164\n", "epoch 5; iter: 0; batch classifier loss: 0.589690; batch adversarial loss: 0.696241\n", "epoch 6; iter: 0; batch classifier loss: 0.643558; batch adversarial loss: 0.636932\n", "epoch 7; iter: 0; batch classifier loss: 0.646285; batch adversarial loss: 0.722483\n", "epoch 8; iter: 0; batch classifier loss: 0.633880; batch adversarial loss: 0.709699\n", "epoch 9; iter: 0; batch classifier loss: 0.581411; batch adversarial loss: 
0.701032\n", "epoch 10; iter: 0; batch classifier loss: 0.657780; batch adversarial loss: 0.674959\n", "epoch 11; iter: 0; batch classifier loss: 0.637098; batch adversarial loss: 0.696028\n", "epoch 12; iter: 0; batch classifier loss: 0.545748; batch adversarial loss: 0.674760\n", "epoch 13; iter: 0; batch classifier loss: 0.593925; batch adversarial loss: 0.663701\n", "epoch 14; iter: 0; batch classifier loss: 0.593469; batch adversarial loss: 0.675462\n", "epoch 15; iter: 0; batch classifier loss: 0.679858; batch adversarial loss: 0.655358\n", "epoch 16; iter: 0; batch classifier loss: 0.563178; batch adversarial loss: 0.688439\n", "epoch 17; iter: 0; batch classifier loss: 0.597847; batch adversarial loss: 0.672614\n", "epoch 18; iter: 0; batch classifier loss: 0.617874; batch adversarial loss: 0.660724\n", "epoch 19; iter: 0; batch classifier loss: 0.555598; batch adversarial loss: 0.676152\n", "epoch 20; iter: 0; batch classifier loss: 0.603996; batch adversarial loss: 0.662527\n", "epoch 21; iter: 0; batch classifier loss: 0.592283; batch adversarial loss: 0.656674\n", "epoch 22; iter: 0; batch classifier loss: 0.656438; batch adversarial loss: 0.632851\n", "epoch 23; iter: 0; batch classifier loss: 0.605694; batch adversarial loss: 0.658802\n", "epoch 24; iter: 0; batch classifier loss: 0.568512; batch adversarial loss: 0.640082\n", "epoch 25; iter: 0; batch classifier loss: 0.508674; batch adversarial loss: 0.621530\n", "epoch 26; iter: 0; batch classifier loss: 0.551606; batch adversarial loss: 0.634183\n", "epoch 27; iter: 0; batch classifier loss: 0.643667; batch adversarial loss: 0.657815\n", "epoch 28; iter: 0; batch classifier loss: 0.589535; batch adversarial loss: 0.652845\n", "epoch 29; iter: 0; batch classifier loss: 0.625577; batch adversarial loss: 0.642786\n", "epoch 30; iter: 0; batch classifier loss: 0.578845; batch adversarial loss: 0.639652\n", "epoch 31; iter: 0; batch classifier loss: 0.537222; batch adversarial loss: 0.660958\n", "epoch 
32; iter: 0; batch classifier loss: 0.600689; batch adversarial loss: 0.601508\n", "epoch 33; iter: 0; batch classifier loss: 0.571539; batch adversarial loss: 0.628760\n", "epoch 34; iter: 0; batch classifier loss: 0.583623; batch adversarial loss: 0.643776\n", "epoch 35; iter: 0; batch classifier loss: 0.601953; batch adversarial loss: 0.707454\n", "epoch 36; iter: 0; batch classifier loss: 0.589697; batch adversarial loss: 0.687439\n", "epoch 37; iter: 0; batch classifier loss: 0.578248; batch adversarial loss: 0.649939\n", "epoch 38; iter: 0; batch classifier loss: 0.566996; batch adversarial loss: 0.654727\n", "epoch 39; iter: 0; batch classifier loss: 0.558173; batch adversarial loss: 0.652345\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "epoch 40; iter: 0; batch classifier loss: 0.573320; batch adversarial loss: 0.646042\n", "epoch 41; iter: 0; batch classifier loss: 0.589315; batch adversarial loss: 0.652977\n", "epoch 42; iter: 0; batch classifier loss: 0.594077; batch adversarial loss: 0.612197\n", "epoch 43; iter: 0; batch classifier loss: 0.588144; batch adversarial loss: 0.645024\n", "epoch 44; iter: 0; batch classifier loss: 0.614177; batch adversarial loss: 0.657203\n", "epoch 45; iter: 0; batch classifier loss: 0.556107; batch adversarial loss: 0.595363\n", "epoch 46; iter: 0; batch classifier loss: 0.576794; batch adversarial loss: 0.600886\n", "epoch 47; iter: 0; batch classifier loss: 0.601720; batch adversarial loss: 0.618873\n", "epoch 48; iter: 0; batch classifier loss: 0.560436; batch adversarial loss: 0.661748\n", "epoch 49; iter: 0; batch classifier loss: 0.612796; batch adversarial loss: 0.661554\n", "epoch 0; iter: 0; batch classifier loss: 0.886866; batch adversarial loss: 0.622805\n", "epoch 1; iter: 0; batch classifier loss: 0.730050; batch adversarial loss: 0.640855\n", "epoch 2; iter: 0; batch classifier loss: 0.635540; batch adversarial loss: 0.680868\n", "epoch 3; iter: 0; batch classifier loss: 0.658583; batch 
adversarial loss: 0.659219\n", "epoch 4; iter: 0; batch classifier loss: 0.638787; batch adversarial loss: 0.666629\n", "epoch 5; iter: 0; batch classifier loss: 0.594908; batch adversarial loss: 0.647343\n", "epoch 6; iter: 0; batch classifier loss: 0.598754; batch adversarial loss: 0.657415\n", "epoch 7; iter: 0; batch classifier loss: 0.606996; batch adversarial loss: 0.646558\n", "epoch 8; iter: 0; batch classifier loss: 0.574839; batch adversarial loss: 0.683717\n", "epoch 9; iter: 0; batch classifier loss: 0.573825; batch adversarial loss: 0.632728\n", "epoch 10; iter: 0; batch classifier loss: 0.559482; batch adversarial loss: 0.653235\n", "epoch 11; iter: 0; batch classifier loss: 0.586685; batch adversarial loss: 0.669887\n", "epoch 12; iter: 0; batch classifier loss: 0.602329; batch adversarial loss: 0.681094\n", "epoch 13; iter: 0; batch classifier loss: 0.532661; batch adversarial loss: 0.676962\n", "epoch 14; iter: 0; batch classifier loss: 0.621807; batch adversarial loss: 0.676730\n", "epoch 15; iter: 0; batch classifier loss: 0.622083; batch adversarial loss: 0.649516\n", "epoch 16; iter: 0; batch classifier loss: 0.643093; batch adversarial loss: 0.666333\n", "epoch 17; iter: 0; batch classifier loss: 0.663006; batch adversarial loss: 0.710759\n", "epoch 18; iter: 0; batch classifier loss: 0.553718; batch adversarial loss: 0.667455\n", "epoch 19; iter: 0; batch classifier loss: 0.591099; batch adversarial loss: 0.662411\n", "epoch 20; iter: 0; batch classifier loss: 0.545536; batch adversarial loss: 0.647575\n", "epoch 21; iter: 0; batch classifier loss: 0.591465; batch adversarial loss: 0.637581\n", "epoch 22; iter: 0; batch classifier loss: 0.594111; batch adversarial loss: 0.651019\n", "epoch 23; iter: 0; batch classifier loss: 0.575965; batch adversarial loss: 0.637820\n", "epoch 24; iter: 0; batch classifier loss: 0.494803; batch adversarial loss: 0.626893\n", "epoch 25; iter: 0; batch classifier loss: 0.586715; batch adversarial loss: 
0.638419\n", "epoch 26; iter: 0; batch classifier loss: 0.479582; batch adversarial loss: 0.652787\n", "epoch 27; iter: 0; batch classifier loss: 0.562443; batch adversarial loss: 0.618872\n", "epoch 28; iter: 0; batch classifier loss: 0.582884; batch adversarial loss: 0.649493\n", "epoch 29; iter: 0; batch classifier loss: 0.632209; batch adversarial loss: 0.640120\n", "epoch 30; iter: 0; batch classifier loss: 0.599355; batch adversarial loss: 0.654148\n", "epoch 31; iter: 0; batch classifier loss: 0.606835; batch adversarial loss: 0.679417\n", "epoch 32; iter: 0; batch classifier loss: 0.517798; batch adversarial loss: 0.642111\n", "epoch 33; iter: 0; batch classifier loss: 0.614916; batch adversarial loss: 0.680163\n", "epoch 34; iter: 0; batch classifier loss: 0.519407; batch adversarial loss: 0.662842\n", "epoch 35; iter: 0; batch classifier loss: 0.510940; batch adversarial loss: 0.606887\n", "epoch 36; iter: 0; batch classifier loss: 0.532039; batch adversarial loss: 0.659477\n", "epoch 37; iter: 0; batch classifier loss: 0.493318; batch adversarial loss: 0.666816\n", "epoch 38; iter: 0; batch classifier loss: 0.556205; batch adversarial loss: 0.653106\n", "epoch 39; iter: 0; batch classifier loss: 0.450466; batch adversarial loss: 0.655879\n", "epoch 40; iter: 0; batch classifier loss: 0.558647; batch adversarial loss: 0.601934\n", "epoch 41; iter: 0; batch classifier loss: 0.552668; batch adversarial loss: 0.603492\n", "epoch 42; iter: 0; batch classifier loss: 0.494646; batch adversarial loss: 0.626402\n", "epoch 43; iter: 0; batch classifier loss: 0.619058; batch adversarial loss: 0.600598\n", "epoch 44; iter: 0; batch classifier loss: 0.610288; batch adversarial loss: 0.619251\n", "epoch 45; iter: 0; batch classifier loss: 0.590397; batch adversarial loss: 0.626207\n", "epoch 46; iter: 0; batch classifier loss: 0.546436; batch adversarial loss: 0.604851\n", "epoch 47; iter: 0; batch classifier loss: 0.527937; batch adversarial loss: 0.604883\n", "epoch 
48; iter: 0; batch classifier loss: 0.494694; batch adversarial loss: 0.652526\n", "epoch 49; iter: 0; batch classifier loss: 0.567262; batch adversarial loss: 0.616521\n", "epoch 0; iter: 0; batch classifier loss: 1.366953; batch adversarial loss: 0.684904\n", "epoch 1; iter: 0; batch classifier loss: 0.861545; batch adversarial loss: 0.649295\n", "epoch 2; iter: 0; batch classifier loss: 0.553931; batch adversarial loss: 0.658084\n", "epoch 3; iter: 0; batch classifier loss: 0.612118; batch adversarial loss: 0.696921\n", "epoch 4; iter: 0; batch classifier loss: 0.671592; batch adversarial loss: 0.680620\n", "epoch 5; iter: 0; batch classifier loss: 0.576223; batch adversarial loss: 0.649479\n", "epoch 6; iter: 0; batch classifier loss: 0.639621; batch adversarial loss: 0.657427\n", "epoch 7; iter: 0; batch classifier loss: 0.564470; batch adversarial loss: 0.641909\n", "epoch 8; iter: 0; batch classifier loss: 0.608648; batch adversarial loss: 0.640619\n", "epoch 9; iter: 0; batch classifier loss: 0.575191; batch adversarial loss: 0.644073\n", "epoch 10; iter: 0; batch classifier loss: 0.527697; batch adversarial loss: 0.646188\n", "epoch 11; iter: 0; batch classifier loss: 0.618659; batch adversarial loss: 0.684329\n", "epoch 12; iter: 0; batch classifier loss: 0.618544; batch adversarial loss: 0.648391\n", "epoch 13; iter: 0; batch classifier loss: 0.575944; batch adversarial loss: 0.645933\n", "epoch 14; iter: 0; batch classifier loss: 0.578738; batch adversarial loss: 0.647000\n", "epoch 15; iter: 0; batch classifier loss: 0.549334; batch adversarial loss: 0.646977\n", "epoch 16; iter: 0; batch classifier loss: 0.574853; batch adversarial loss: 0.653769\n", "epoch 17; iter: 0; batch classifier loss: 0.544335; batch adversarial loss: 0.644094\n", "epoch 18; iter: 0; batch classifier loss: 0.638052; batch adversarial loss: 0.622953\n", "epoch 19; iter: 0; batch classifier loss: 0.628507; batch adversarial loss: 0.632000\n", "epoch 20; iter: 0; batch classifier 
loss: 0.564795; batch adversarial loss: 0.631001\n", "epoch 21; iter: 0; batch classifier loss: 0.544283; batch adversarial loss: 0.615053\n", "epoch 22; iter: 0; batch classifier loss: 0.565379; batch adversarial loss: 0.664297\n", "epoch 23; iter: 0; batch classifier loss: 0.515936; batch adversarial loss: 0.616117\n", "epoch 24; iter: 0; batch classifier loss: 0.585832; batch adversarial loss: 0.671486\n", "epoch 25; iter: 0; batch classifier loss: 0.551794; batch adversarial loss: 0.634444\n", "epoch 26; iter: 0; batch classifier loss: 0.580613; batch adversarial loss: 0.633833\n", "epoch 27; iter: 0; batch classifier loss: 0.533140; batch adversarial loss: 0.657908\n", "epoch 28; iter: 0; batch classifier loss: 0.559138; batch adversarial loss: 0.652791\n", "epoch 29; iter: 0; batch classifier loss: 0.579454; batch adversarial loss: 0.624757\n", "epoch 30; iter: 0; batch classifier loss: 0.553526; batch adversarial loss: 0.613291\n", "epoch 31; iter: 0; batch classifier loss: 0.466150; batch adversarial loss: 0.641193\n", "epoch 32; iter: 0; batch classifier loss: 0.526578; batch adversarial loss: 0.652227\n", "epoch 33; iter: 0; batch classifier loss: 0.528630; batch adversarial loss: 0.639022\n", "epoch 34; iter: 0; batch classifier loss: 0.551455; batch adversarial loss: 0.662071\n", "epoch 35; iter: 0; batch classifier loss: 0.621607; batch adversarial loss: 0.662804\n", "epoch 36; iter: 0; batch classifier loss: 0.572662; batch adversarial loss: 0.658348\n", "epoch 37; iter: 0; batch classifier loss: 0.551317; batch adversarial loss: 0.664989\n", "epoch 38; iter: 0; batch classifier loss: 0.648123; batch adversarial loss: 0.630487\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "epoch 39; iter: 0; batch classifier loss: 0.546520; batch adversarial loss: 0.659174\n", "epoch 40; iter: 0; batch classifier loss: 0.520883; batch adversarial loss: 0.607577\n", "epoch 41; iter: 0; batch classifier loss: 0.556713; batch adversarial loss: 
0.630373\n", "epoch 42; iter: 0; batch classifier loss: 0.476367; batch adversarial loss: 0.618990\n", "epoch 43; iter: 0; batch classifier loss: 0.520872; batch adversarial loss: 0.665385\n", "epoch 44; iter: 0; batch classifier loss: 0.587816; batch adversarial loss: 0.650942\n", "epoch 45; iter: 0; batch classifier loss: 0.597513; batch adversarial loss: 0.644970\n", "epoch 46; iter: 0; batch classifier loss: 0.594896; batch adversarial loss: 0.639536\n", "epoch 47; iter: 0; batch classifier loss: 0.557839; batch adversarial loss: 0.646471\n", "epoch 48; iter: 0; batch classifier loss: 0.472494; batch adversarial loss: 0.683319\n", "epoch 49; iter: 0; batch classifier loss: 0.537420; batch adversarial loss: 0.640076\n", "epoch 0; iter: 0; batch classifier loss: 1.723255; batch adversarial loss: 0.692471\n", "epoch 1; iter: 0; batch classifier loss: 0.699054; batch adversarial loss: 0.673260\n", "epoch 2; iter: 0; batch classifier loss: 0.618161; batch adversarial loss: 0.685280\n", "epoch 3; iter: 0; batch classifier loss: 0.730788; batch adversarial loss: 0.673476\n", "epoch 4; iter: 0; batch classifier loss: 0.722526; batch adversarial loss: 0.657213\n", "epoch 5; iter: 0; batch classifier loss: 0.624558; batch adversarial loss: 0.666966\n", "epoch 6; iter: 0; batch classifier loss: 0.694149; batch adversarial loss: 0.701506\n", "epoch 7; iter: 0; batch classifier loss: 0.613050; batch adversarial loss: 0.643393\n", "epoch 8; iter: 0; batch classifier loss: 0.572187; batch adversarial loss: 0.736883\n", "epoch 9; iter: 0; batch classifier loss: 0.560458; batch adversarial loss: 0.680824\n", "epoch 10; iter: 0; batch classifier loss: 0.669657; batch adversarial loss: 0.662213\n", "epoch 11; iter: 0; batch classifier loss: 0.635056; batch adversarial loss: 0.691387\n", "epoch 12; iter: 0; batch classifier loss: 0.628881; batch adversarial loss: 0.675876\n", "epoch 13; iter: 0; batch classifier loss: 0.591996; batch adversarial loss: 0.708616\n", "epoch 14; iter: 
0; batch classifier loss: 0.564751; batch adversarial loss: 0.645577\n", "epoch 15; iter: 0; batch classifier loss: 0.559689; batch adversarial loss: 0.692578\n", "epoch 16; iter: 0; batch classifier loss: 0.549127; batch adversarial loss: 0.644654\n", "epoch 17; iter: 0; batch classifier loss: 0.651314; batch adversarial loss: 0.686403\n", "epoch 18; iter: 0; batch classifier loss: 0.617027; batch adversarial loss: 0.664349\n", "epoch 19; iter: 0; batch classifier loss: 0.588793; batch adversarial loss: 0.654775\n", "epoch 20; iter: 0; batch classifier loss: 0.607558; batch adversarial loss: 0.681368\n", "epoch 21; iter: 0; batch classifier loss: 0.617326; batch adversarial loss: 0.629687\n", "epoch 22; iter: 0; batch classifier loss: 0.553121; batch adversarial loss: 0.627001\n", "epoch 23; iter: 0; batch classifier loss: 0.573780; batch adversarial loss: 0.633167\n", "epoch 24; iter: 0; batch classifier loss: 0.624649; batch adversarial loss: 0.636766\n", "epoch 25; iter: 0; batch classifier loss: 0.550134; batch adversarial loss: 0.671877\n", "epoch 26; iter: 0; batch classifier loss: 0.593751; batch adversarial loss: 0.596348\n", "epoch 27; iter: 0; batch classifier loss: 0.549859; batch adversarial loss: 0.692848\n", "epoch 28; iter: 0; batch classifier loss: 0.610467; batch adversarial loss: 0.659677\n", "epoch 29; iter: 0; batch classifier loss: 0.638320; batch adversarial loss: 0.702952\n", "epoch 30; iter: 0; batch classifier loss: 0.476523; batch adversarial loss: 0.634830\n", "epoch 31; iter: 0; batch classifier loss: 0.562073; batch adversarial loss: 0.659132\n", "epoch 32; iter: 0; batch classifier loss: 0.552323; batch adversarial loss: 0.701090\n", "epoch 33; iter: 0; batch classifier loss: 0.594959; batch adversarial loss: 0.595250\n", "epoch 34; iter: 0; batch classifier loss: 0.530601; batch adversarial loss: 0.641101\n", "epoch 35; iter: 0; batch classifier loss: 0.605526; batch adversarial loss: 0.633169\n", "epoch 36; iter: 0; batch classifier 
loss: 0.543247; batch adversarial loss: 0.626984\n", "epoch 37; iter: 0; batch classifier loss: 0.581434; batch adversarial loss: 0.644459\n", "epoch 38; iter: 0; batch classifier loss: 0.501990; batch adversarial loss: 0.648758\n", "epoch 39; iter: 0; batch classifier loss: 0.553095; batch adversarial loss: 0.634617\n", "epoch 40; iter: 0; batch classifier loss: 0.549651; batch adversarial loss: 0.615262\n", "epoch 41; iter: 0; batch classifier loss: 0.531289; batch adversarial loss: 0.628039\n", "epoch 42; iter: 0; batch classifier loss: 0.588487; batch adversarial loss: 0.671525\n", "epoch 43; iter: 0; batch classifier loss: 0.494682; batch adversarial loss: 0.671941\n", "epoch 44; iter: 0; batch classifier loss: 0.552737; batch adversarial loss: 0.655566\n", "epoch 45; iter: 0; batch classifier loss: 0.592400; batch adversarial loss: 0.659078\n", "epoch 46; iter: 0; batch classifier loss: 0.568672; batch adversarial loss: 0.590417\n", "epoch 47; iter: 0; batch classifier loss: 0.552401; batch adversarial loss: 0.696433\n", "epoch 48; iter: 0; batch classifier loss: 0.672048; batch adversarial loss: 0.639207\n", "epoch 49; iter: 0; batch classifier loss: 0.543037; batch adversarial loss: 0.625622\n", "epoch 0; iter: 0; batch classifier loss: 0.845600; batch adversarial loss: 1.045207\n", "epoch 1; iter: 0; batch classifier loss: 1.419765; batch adversarial loss: 1.134988\n", "epoch 2; iter: 0; batch classifier loss: 0.919405; batch adversarial loss: 1.114787\n", "epoch 3; iter: 0; batch classifier loss: 0.695837; batch adversarial loss: 0.960570\n", "epoch 4; iter: 0; batch classifier loss: 0.727688; batch adversarial loss: 0.978377\n", "epoch 5; iter: 0; batch classifier loss: 0.738481; batch adversarial loss: 0.947548\n", "epoch 6; iter: 0; batch classifier loss: 0.734012; batch adversarial loss: 0.937154\n", "epoch 7; iter: 0; batch classifier loss: 0.669971; batch adversarial loss: 0.918565\n", "epoch 8; iter: 0; batch classifier loss: 0.747264; batch 
adversarial loss: 0.909925\n", "epoch 9; iter: 0; batch classifier loss: 0.681988; batch adversarial loss: 0.880940\n", "epoch 10; iter: 0; batch classifier loss: 0.621899; batch adversarial loss: 0.814539\n", "epoch 11; iter: 0; batch classifier loss: 0.650619; batch adversarial loss: 0.797759\n", "epoch 12; iter: 0; batch classifier loss: 0.622968; batch adversarial loss: 0.794087\n", "epoch 13; iter: 0; batch classifier loss: 0.599750; batch adversarial loss: 0.773155\n", "epoch 14; iter: 0; batch classifier loss: 0.596399; batch adversarial loss: 0.763104\n", "epoch 15; iter: 0; batch classifier loss: 0.637463; batch adversarial loss: 0.759039\n", "epoch 16; iter: 0; batch classifier loss: 0.581280; batch adversarial loss: 0.737816\n", "epoch 17; iter: 0; batch classifier loss: 0.648799; batch adversarial loss: 0.738482\n", "epoch 18; iter: 0; batch classifier loss: 0.562859; batch adversarial loss: 0.715484\n", "epoch 19; iter: 0; batch classifier loss: 0.659535; batch adversarial loss: 0.719314\n", "epoch 20; iter: 0; batch classifier loss: 0.595509; batch adversarial loss: 0.704961\n", "epoch 21; iter: 0; batch classifier loss: 0.584952; batch adversarial loss: 0.705091\n", "epoch 22; iter: 0; batch classifier loss: 0.673003; batch adversarial loss: 0.693465\n", "epoch 23; iter: 0; batch classifier loss: 0.637615; batch adversarial loss: 0.692873\n", "epoch 24; iter: 0; batch classifier loss: 0.607681; batch adversarial loss: 0.671395\n", "epoch 25; iter: 0; batch classifier loss: 0.586003; batch adversarial loss: 0.680013\n", "epoch 26; iter: 0; batch classifier loss: 0.663231; batch adversarial loss: 0.661352\n", "epoch 27; iter: 0; batch classifier loss: 0.579609; batch adversarial loss: 0.644944\n", "epoch 28; iter: 0; batch classifier loss: 0.593411; batch adversarial loss: 0.669653\n", "epoch 29; iter: 0; batch classifier loss: 0.640926; batch adversarial loss: 0.663450\n", "epoch 30; iter: 0; batch classifier loss: 0.570141; batch adversarial loss: 
0.678751\n", "epoch 31; iter: 0; batch classifier loss: 0.577521; batch adversarial loss: 0.635783\n", "epoch 32; iter: 0; batch classifier loss: 0.552512; batch adversarial loss: 0.635597\n", "epoch 33; iter: 0; batch classifier loss: 0.600135; batch adversarial loss: 0.671453\n", "epoch 34; iter: 0; batch classifier loss: 0.589070; batch adversarial loss: 0.633546\n", "epoch 35; iter: 0; batch classifier loss: 0.609119; batch adversarial loss: 0.620446\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "epoch 36; iter: 0; batch classifier loss: 0.624269; batch adversarial loss: 0.668979\n", "epoch 37; iter: 0; batch classifier loss: 0.575568; batch adversarial loss: 0.629863\n", "epoch 38; iter: 0; batch classifier loss: 0.581189; batch adversarial loss: 0.666735\n", "epoch 39; iter: 0; batch classifier loss: 0.640068; batch adversarial loss: 0.613165\n", "epoch 40; iter: 0; batch classifier loss: 0.475990; batch adversarial loss: 0.651583\n", "epoch 41; iter: 0; batch classifier loss: 0.560293; batch adversarial loss: 0.640624\n", "epoch 42; iter: 0; batch classifier loss: 0.552296; batch adversarial loss: 0.661160\n", "epoch 43; iter: 0; batch classifier loss: 0.538795; batch adversarial loss: 0.649450\n", "epoch 44; iter: 0; batch classifier loss: 0.503779; batch adversarial loss: 0.658026\n", "epoch 45; iter: 0; batch classifier loss: 0.521745; batch adversarial loss: 0.659502\n", "epoch 46; iter: 0; batch classifier loss: 0.603993; batch adversarial loss: 0.631452\n", "epoch 47; iter: 0; batch classifier loss: 0.530964; batch adversarial loss: 0.644663\n", "epoch 48; iter: 0; batch classifier loss: 0.550096; batch adversarial loss: 0.610205\n", "epoch 49; iter: 0; batch classifier loss: 0.517033; batch adversarial loss: 0.654509\n", "epoch 0; iter: 0; batch classifier loss: 0.918163; batch adversarial loss: 1.121837\n", "epoch 1; iter: 0; batch classifier loss: 1.090690; batch adversarial loss: 1.223462\n", "epoch 2; iter: 0; batch classifier 
loss: 1.020082; batch adversarial loss: 1.083617\n", "epoch 3; iter: 0; batch classifier loss: 0.821867; batch adversarial loss: 1.110684\n", "epoch 4; iter: 0; batch classifier loss: 0.741849; batch adversarial loss: 1.070193\n", "epoch 5; iter: 0; batch classifier loss: 0.722208; batch adversarial loss: 1.059134\n", "epoch 6; iter: 0; batch classifier loss: 0.759922; batch adversarial loss: 0.979503\n", "epoch 7; iter: 0; batch classifier loss: 0.679335; batch adversarial loss: 0.970793\n", "epoch 8; iter: 0; batch classifier loss: 0.586555; batch adversarial loss: 0.989170\n", "epoch 9; iter: 0; batch classifier loss: 0.592407; batch adversarial loss: 0.940391\n", "epoch 10; iter: 0; batch classifier loss: 0.633289; batch adversarial loss: 0.948801\n", "epoch 11; iter: 0; batch classifier loss: 0.640219; batch adversarial loss: 0.825924\n", "epoch 12; iter: 0; batch classifier loss: 0.632782; batch adversarial loss: 0.831365\n", "epoch 13; iter: 0; batch classifier loss: 0.617190; batch adversarial loss: 0.840702\n", "epoch 14; iter: 0; batch classifier loss: 0.611827; batch adversarial loss: 0.807484\n", "epoch 15; iter: 0; batch classifier loss: 0.719152; batch adversarial loss: 0.790937\n", "epoch 16; iter: 0; batch classifier loss: 0.660686; batch adversarial loss: 0.757750\n", "epoch 17; iter: 0; batch classifier loss: 0.591136; batch adversarial loss: 0.770263\n", "epoch 18; iter: 0; batch classifier loss: 0.612309; batch adversarial loss: 0.752555\n", "epoch 19; iter: 0; batch classifier loss: 0.593822; batch adversarial loss: 0.759102\n", "epoch 20; iter: 0; batch classifier loss: 0.643711; batch adversarial loss: 0.717411\n", "epoch 21; iter: 0; batch classifier loss: 0.544119; batch adversarial loss: 0.739149\n", "epoch 22; iter: 0; batch classifier loss: 0.611292; batch adversarial loss: 0.698020\n", "epoch 23; iter: 0; batch classifier loss: 0.619005; batch adversarial loss: 0.707082\n", "epoch 24; iter: 0; batch classifier loss: 0.610061; batch 
adversarial loss: 0.687057\n", "epoch 25; iter: 0; batch classifier loss: 0.535445; batch adversarial loss: 0.686328\n", "epoch 26; iter: 0; batch classifier loss: 0.572604; batch adversarial loss: 0.691706\n", "epoch 27; iter: 0; batch classifier loss: 0.599103; batch adversarial loss: 0.673632\n", "epoch 28; iter: 0; batch classifier loss: 0.536421; batch adversarial loss: 0.647013\n", "epoch 29; iter: 0; batch classifier loss: 0.564136; batch adversarial loss: 0.664776\n", "epoch 30; iter: 0; batch classifier loss: 0.629590; batch adversarial loss: 0.647939\n", "epoch 31; iter: 0; batch classifier loss: 0.507973; batch adversarial loss: 0.683617\n", "epoch 32; iter: 0; batch classifier loss: 0.678681; batch adversarial loss: 0.633108\n", "epoch 33; iter: 0; batch classifier loss: 0.538671; batch adversarial loss: 0.636028\n", "epoch 34; iter: 0; batch classifier loss: 0.587393; batch adversarial loss: 0.667845\n", "epoch 35; iter: 0; batch classifier loss: 0.589084; batch adversarial loss: 0.610247\n", "epoch 36; iter: 0; batch classifier loss: 0.551045; batch adversarial loss: 0.648164\n", "epoch 37; iter: 0; batch classifier loss: 0.540973; batch adversarial loss: 0.616787\n", "epoch 38; iter: 0; batch classifier loss: 0.564719; batch adversarial loss: 0.642311\n", "epoch 39; iter: 0; batch classifier loss: 0.531106; batch adversarial loss: 0.640725\n", "epoch 40; iter: 0; batch classifier loss: 0.464275; batch adversarial loss: 0.618715\n", "epoch 41; iter: 0; batch classifier loss: 0.585698; batch adversarial loss: 0.672758\n", "epoch 42; iter: 0; batch classifier loss: 0.519452; batch adversarial loss: 0.663340\n", "epoch 43; iter: 0; batch classifier loss: 0.564000; batch adversarial loss: 0.609444\n", "epoch 44; iter: 0; batch classifier loss: 0.487953; batch adversarial loss: 0.616552\n", "epoch 45; iter: 0; batch classifier loss: 0.577049; batch adversarial loss: 0.644169\n", "epoch 46; iter: 0; batch classifier loss: 0.558715; batch adversarial loss: 
0.644790\n", "epoch 47; iter: 0; batch classifier loss: 0.602108; batch adversarial loss: 0.658633\n", "epoch 48; iter: 0; batch classifier loss: 0.585761; batch adversarial loss: 0.610999\n", "epoch 49; iter: 0; batch classifier loss: 0.534292; batch adversarial loss: 0.640719\n", "epoch 0; iter: 0; batch classifier loss: 0.743298; batch adversarial loss: 0.676641\n", "epoch 1; iter: 0; batch classifier loss: 0.749337; batch adversarial loss: 0.658863\n", "epoch 2; iter: 0; batch classifier loss: 0.625587; batch adversarial loss: 0.642971\n", "epoch 3; iter: 0; batch classifier loss: 0.622441; batch adversarial loss: 0.673972\n", "epoch 4; iter: 0; batch classifier loss: 0.680124; batch adversarial loss: 0.684832\n", "epoch 5; iter: 0; batch classifier loss: 0.651064; batch adversarial loss: 0.661338\n", "epoch 6; iter: 0; batch classifier loss: 0.596764; batch adversarial loss: 0.649120\n", "epoch 7; iter: 0; batch classifier loss: 0.582848; batch adversarial loss: 0.675613\n", "epoch 8; iter: 0; batch classifier loss: 0.607242; batch adversarial loss: 0.639142\n", "epoch 9; iter: 0; batch classifier loss: 0.577521; batch adversarial loss: 0.660881\n", "epoch 10; iter: 0; batch classifier loss: 0.576464; batch adversarial loss: 0.626788\n", "epoch 11; iter: 0; batch classifier loss: 0.585102; batch adversarial loss: 0.661102\n", "epoch 12; iter: 0; batch classifier loss: 0.540015; batch adversarial loss: 0.654309\n", "epoch 13; iter: 0; batch classifier loss: 0.486927; batch adversarial loss: 0.641817\n", "epoch 14; iter: 0; batch classifier loss: 0.535599; batch adversarial loss: 0.655543\n", "epoch 15; iter: 0; batch classifier loss: 0.676292; batch adversarial loss: 0.698293\n", "epoch 16; iter: 0; batch classifier loss: 0.574275; batch adversarial loss: 0.678741\n", "epoch 17; iter: 0; batch classifier loss: 0.526684; batch adversarial loss: 0.652343\n", "epoch 18; iter: 0; batch classifier loss: 0.616025; batch adversarial loss: 0.629848\n", "epoch 19; iter: 
0; batch classifier loss: 0.578963; batch adversarial loss: 0.674080\n", "epoch 20; iter: 0; batch classifier loss: 0.553202; batch adversarial loss: 0.597479\n", "epoch 21; iter: 0; batch classifier loss: 0.595007; batch adversarial loss: 0.588820\n", "epoch 22; iter: 0; batch classifier loss: 0.519966; batch adversarial loss: 0.716527\n", "epoch 23; iter: 0; batch classifier loss: 0.678478; batch adversarial loss: 0.676951\n", "epoch 24; iter: 0; batch classifier loss: 0.528380; batch adversarial loss: 0.707704\n", "epoch 25; iter: 0; batch classifier loss: 0.588301; batch adversarial loss: 0.616020\n", "epoch 26; iter: 0; batch classifier loss: 0.582160; batch adversarial loss: 0.660825\n", "epoch 27; iter: 0; batch classifier loss: 0.545429; batch adversarial loss: 0.667421\n", "epoch 28; iter: 0; batch classifier loss: 0.547449; batch adversarial loss: 0.680079\n", "epoch 29; iter: 0; batch classifier loss: 0.625149; batch adversarial loss: 0.662606\n", "epoch 30; iter: 0; batch classifier loss: 0.560578; batch adversarial loss: 0.654763\n", "epoch 31; iter: 0; batch classifier loss: 0.555794; batch adversarial loss: 0.681805\n", "epoch 32; iter: 0; batch classifier loss: 0.536116; batch adversarial loss: 0.640845\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "epoch 33; iter: 0; batch classifier loss: 0.579689; batch adversarial loss: 0.688077\n", "epoch 34; iter: 0; batch classifier loss: 0.518525; batch adversarial loss: 0.662092\n", "epoch 35; iter: 0; batch classifier loss: 0.603105; batch adversarial loss: 0.646027\n", "epoch 36; iter: 0; batch classifier loss: 0.571237; batch adversarial loss: 0.670297\n", "epoch 37; iter: 0; batch classifier loss: 0.582125; batch adversarial loss: 0.639729\n", "epoch 38; iter: 0; batch classifier loss: 0.607465; batch adversarial loss: 0.640020\n", "epoch 39; iter: 0; batch classifier loss: 0.556892; batch adversarial loss: 0.630613\n", "epoch 40; iter: 0; batch classifier loss: 0.526206; batch 
adversarial loss: 0.651281\n", "epoch 41; iter: 0; batch classifier loss: 0.506331; batch adversarial loss: 0.642716\n", "epoch 42; iter: 0; batch classifier loss: 0.510201; batch adversarial loss: 0.663794\n", "epoch 43; iter: 0; batch classifier loss: 0.589719; batch adversarial loss: 0.654443\n", "epoch 44; iter: 0; batch classifier loss: 0.567297; batch adversarial loss: 0.675311\n", "epoch 45; iter: 0; batch classifier loss: 0.570290; batch adversarial loss: 0.645293\n", "epoch 46; iter: 0; batch classifier loss: 0.499623; batch adversarial loss: 0.635917\n", "epoch 47; iter: 0; batch classifier loss: 0.593236; batch adversarial loss: 0.628945\n", "epoch 48; iter: 0; batch classifier loss: 0.537393; batch adversarial loss: 0.652359\n", "epoch 49; iter: 0; batch classifier loss: 0.535231; batch adversarial loss: 0.608432\n", "epoch 0; iter: 0; batch classifier loss: 0.836799; batch adversarial loss: 0.651056\n", "epoch 1; iter: 0; batch classifier loss: 0.652681; batch adversarial loss: 0.702795\n", "epoch 2; iter: 0; batch classifier loss: 0.684329; batch adversarial loss: 0.682053\n", "epoch 3; iter: 0; batch classifier loss: 0.669780; batch adversarial loss: 0.659006\n", "epoch 4; iter: 0; batch classifier loss: 0.603148; batch adversarial loss: 0.661384\n", "epoch 5; iter: 0; batch classifier loss: 0.650812; batch adversarial loss: 0.659330\n", "epoch 6; iter: 0; batch classifier loss: 0.582362; batch adversarial loss: 0.682533\n", "epoch 7; iter: 0; batch classifier loss: 0.695651; batch adversarial loss: 0.651090\n", "epoch 8; iter: 0; batch classifier loss: 0.616374; batch adversarial loss: 0.660435\n", "epoch 9; iter: 0; batch classifier loss: 0.632366; batch adversarial loss: 0.655866\n", "epoch 10; iter: 0; batch classifier loss: 0.610634; batch adversarial loss: 0.693365\n", "epoch 11; iter: 0; batch classifier loss: 0.629878; batch adversarial loss: 0.625211\n", "epoch 12; iter: 0; batch classifier loss: 0.605541; batch adversarial loss: 
0.663159\n", "epoch 13; iter: 0; batch classifier loss: 0.613881; batch adversarial loss: 0.673000\n", "epoch 14; iter: 0; batch classifier loss: 0.535501; batch adversarial loss: 0.681575\n", "epoch 15; iter: 0; batch classifier loss: 0.525353; batch adversarial loss: 0.642047\n", "epoch 16; iter: 0; batch classifier loss: 0.529286; batch adversarial loss: 0.673208\n", "epoch 17; iter: 0; batch classifier loss: 0.576529; batch adversarial loss: 0.631682\n", "epoch 18; iter: 0; batch classifier loss: 0.670363; batch adversarial loss: 0.654382\n", "epoch 19; iter: 0; batch classifier loss: 0.569610; batch adversarial loss: 0.598402\n", "epoch 20; iter: 0; batch classifier loss: 0.526719; batch adversarial loss: 0.653636\n", "epoch 21; iter: 0; batch classifier loss: 0.530099; batch adversarial loss: 0.630364\n", "epoch 22; iter: 0; batch classifier loss: 0.512786; batch adversarial loss: 0.674982\n", "epoch 23; iter: 0; batch classifier loss: 0.604977; batch adversarial loss: 0.641433\n", "epoch 24; iter: 0; batch classifier loss: 0.642969; batch adversarial loss: 0.662454\n", "epoch 25; iter: 0; batch classifier loss: 0.545622; batch adversarial loss: 0.648392\n", "epoch 26; iter: 0; batch classifier loss: 0.637099; batch adversarial loss: 0.652766\n", "epoch 27; iter: 0; batch classifier loss: 0.515785; batch adversarial loss: 0.619144\n", "epoch 28; iter: 0; batch classifier loss: 0.612310; batch adversarial loss: 0.631125\n", "epoch 29; iter: 0; batch classifier loss: 0.525689; batch adversarial loss: 0.666162\n", "epoch 30; iter: 0; batch classifier loss: 0.604267; batch adversarial loss: 0.627052\n", "epoch 31; iter: 0; batch classifier loss: 0.540902; batch adversarial loss: 0.646573\n", "epoch 32; iter: 0; batch classifier loss: 0.547852; batch adversarial loss: 0.646974\n", "epoch 33; iter: 0; batch classifier loss: 0.612487; batch adversarial loss: 0.631401\n", "epoch 34; iter: 0; batch classifier loss: 0.544861; batch adversarial loss: 0.648442\n", "epoch 
35; iter: 0; batch classifier loss: 0.555689; batch adversarial loss: 0.657496\n", "epoch 36; iter: 0; batch classifier loss: 0.576813; batch adversarial loss: 0.625443\n", "epoch 37; iter: 0; batch classifier loss: 0.504120; batch adversarial loss: 0.632210\n", "epoch 38; iter: 0; batch classifier loss: 0.560379; batch adversarial loss: 0.684928\n", "epoch 39; iter: 0; batch classifier loss: 0.562528; batch adversarial loss: 0.677446\n", "epoch 40; iter: 0; batch classifier loss: 0.529946; batch adversarial loss: 0.635687\n", "epoch 41; iter: 0; batch classifier loss: 0.591911; batch adversarial loss: 0.613083\n", "epoch 42; iter: 0; batch classifier loss: 0.547849; batch adversarial loss: 0.619833\n", "epoch 43; iter: 0; batch classifier loss: 0.579394; batch adversarial loss: 0.640527\n", "epoch 44; iter: 0; batch classifier loss: 0.583798; batch adversarial loss: 0.628325\n", "epoch 45; iter: 0; batch classifier loss: 0.506970; batch adversarial loss: 0.674767\n", "epoch 46; iter: 0; batch classifier loss: 0.568930; batch adversarial loss: 0.665351\n", "epoch 47; iter: 0; batch classifier loss: 0.538762; batch adversarial loss: 0.629402\n", "epoch 48; iter: 0; batch classifier loss: 0.516253; batch adversarial loss: 0.636964\n", "epoch 49; iter: 0; batch classifier loss: 0.561682; batch adversarial loss: 0.668488\n", "epoch 0; iter: 0; batch classifier loss: 0.820005; batch adversarial loss: 0.722675\n", "epoch 1; iter: 0; batch classifier loss: 0.766418; batch adversarial loss: 0.735107\n", "epoch 2; iter: 0; batch classifier loss: 0.663670; batch adversarial loss: 0.683751\n", "epoch 3; iter: 0; batch classifier loss: 0.627521; batch adversarial loss: 0.724878\n", "epoch 4; iter: 0; batch classifier loss: 0.658958; batch adversarial loss: 0.711914\n", "epoch 5; iter: 0; batch classifier loss: 0.616108; batch adversarial loss: 0.703247\n", "epoch 6; iter: 0; batch classifier loss: 0.631988; batch adversarial loss: 0.686020\n", "epoch 7; iter: 0; batch 
classifier loss: 0.612958; batch adversarial loss: 0.643690\n", "epoch 8; iter: 0; batch classifier loss: 0.594714; batch adversarial loss: 0.710108\n", "epoch 9; iter: 0; batch classifier loss: 0.604984; batch adversarial loss: 0.685677\n", "epoch 10; iter: 0; batch classifier loss: 0.583938; batch adversarial loss: 0.722176\n", "epoch 11; iter: 0; batch classifier loss: 0.630401; batch adversarial loss: 0.625929\n", "epoch 12; iter: 0; batch classifier loss: 0.555562; batch adversarial loss: 0.677659\n", "epoch 13; iter: 0; batch classifier loss: 0.577612; batch adversarial loss: 0.650003\n", "epoch 14; iter: 0; batch classifier loss: 0.578795; batch adversarial loss: 0.673109\n", "epoch 15; iter: 0; batch classifier loss: 0.633555; batch adversarial loss: 0.675850\n", "epoch 16; iter: 0; batch classifier loss: 0.561541; batch adversarial loss: 0.640132\n", "epoch 17; iter: 0; batch classifier loss: 0.580910; batch adversarial loss: 0.672168\n", "epoch 18; iter: 0; batch classifier loss: 0.609921; batch adversarial loss: 0.681990\n", "epoch 19; iter: 0; batch classifier loss: 0.582073; batch adversarial loss: 0.676439\n", "epoch 20; iter: 0; batch classifier loss: 0.554300; batch adversarial loss: 0.655638\n", "epoch 21; iter: 0; batch classifier loss: 0.574455; batch adversarial loss: 0.703369\n", "epoch 22; iter: 0; batch classifier loss: 0.616503; batch adversarial loss: 0.672330\n", "epoch 23; iter: 0; batch classifier loss: 0.596940; batch adversarial loss: 0.627144\n", "epoch 24; iter: 0; batch classifier loss: 0.526252; batch adversarial loss: 0.636385\n", "epoch 25; iter: 0; batch classifier loss: 0.567764; batch adversarial loss: 0.637352\n", "epoch 26; iter: 0; batch classifier loss: 0.565771; batch adversarial loss: 0.640122\n", "epoch 27; iter: 0; batch classifier loss: 0.612459; batch adversarial loss: 0.621020\n", "epoch 28; iter: 0; batch classifier loss: 0.628073; batch adversarial loss: 0.672986\n", "epoch 29; iter: 0; batch classifier loss: 
0.564179; batch adversarial loss: 0.609661\n", "epoch 30; iter: 0; batch classifier loss: 0.571793; batch adversarial loss: 0.670722\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "epoch 31; iter: 0; batch classifier loss: 0.511612; batch adversarial loss: 0.648284\n", "epoch 32; iter: 0; batch classifier loss: 0.544786; batch adversarial loss: 0.646542\n", "epoch 33; iter: 0; batch classifier loss: 0.592674; batch adversarial loss: 0.619197\n", "epoch 34; iter: 0; batch classifier loss: 0.516194; batch adversarial loss: 0.649289\n", "epoch 35; iter: 0; batch classifier loss: 0.556044; batch adversarial loss: 0.643912\n", "epoch 36; iter: 0; batch classifier loss: 0.549158; batch adversarial loss: 0.681171\n", "epoch 37; iter: 0; batch classifier loss: 0.526406; batch adversarial loss: 0.651180\n", "epoch 38; iter: 0; batch classifier loss: 0.530251; batch adversarial loss: 0.661557\n", "epoch 39; iter: 0; batch classifier loss: 0.585511; batch adversarial loss: 0.624180\n", "epoch 40; iter: 0; batch classifier loss: 0.538059; batch adversarial loss: 0.661109\n", "epoch 41; iter: 0; batch classifier loss: 0.507525; batch adversarial loss: 0.656621\n", "epoch 42; iter: 0; batch classifier loss: 0.497432; batch adversarial loss: 0.651942\n", "epoch 43; iter: 0; batch classifier loss: 0.575450; batch adversarial loss: 0.669939\n", "epoch 44; iter: 0; batch classifier loss: 0.566490; batch adversarial loss: 0.630970\n", "epoch 45; iter: 0; batch classifier loss: 0.531581; batch adversarial loss: 0.616198\n", "epoch 46; iter: 0; batch classifier loss: 0.570508; batch adversarial loss: 0.602386\n", "epoch 47; iter: 0; batch classifier loss: 0.547381; batch adversarial loss: 0.657618\n", "epoch 48; iter: 0; batch classifier loss: 0.558624; batch adversarial loss: 0.622465\n", "epoch 49; iter: 0; batch classifier loss: 0.593583; batch adversarial loss: 0.627919\n", "epoch 0; iter: 0; batch classifier loss: 1.161748; batch adversarial loss: 0.704076\n", 
"epoch 1; iter: 0; batch classifier loss: 0.755109; batch adversarial loss: 0.768507\n", "epoch 2; iter: 0; batch classifier loss: 0.694703; batch adversarial loss: 0.787294\n", "epoch 3; iter: 0; batch classifier loss: 0.713644; batch adversarial loss: 0.730444\n", "epoch 4; iter: 0; batch classifier loss: 0.650497; batch adversarial loss: 0.739268\n", "epoch 5; iter: 0; batch classifier loss: 0.661025; batch adversarial loss: 0.708553\n", "epoch 6; iter: 0; batch classifier loss: 0.567720; batch adversarial loss: 0.687943\n", "epoch 7; iter: 0; batch classifier loss: 0.718481; batch adversarial loss: 0.694921\n", "epoch 8; iter: 0; batch classifier loss: 0.610949; batch adversarial loss: 0.705375\n", "epoch 9; iter: 0; batch classifier loss: 0.598705; batch adversarial loss: 0.683368\n", "epoch 10; iter: 0; batch classifier loss: 0.553869; batch adversarial loss: 0.672155\n", "epoch 11; iter: 0; batch classifier loss: 0.640992; batch adversarial loss: 0.686772\n", "epoch 12; iter: 0; batch classifier loss: 0.593011; batch adversarial loss: 0.673066\n", "epoch 13; iter: 0; batch classifier loss: 0.608318; batch adversarial loss: 0.691243\n", "epoch 14; iter: 0; batch classifier loss: 0.559907; batch adversarial loss: 0.701043\n", "epoch 15; iter: 0; batch classifier loss: 0.624393; batch adversarial loss: 0.645846\n", "epoch 16; iter: 0; batch classifier loss: 0.594353; batch adversarial loss: 0.655844\n", "epoch 17; iter: 0; batch classifier loss: 0.612468; batch adversarial loss: 0.708583\n", "epoch 18; iter: 0; batch classifier loss: 0.620018; batch adversarial loss: 0.668019\n", "epoch 19; iter: 0; batch classifier loss: 0.598801; batch adversarial loss: 0.636265\n", "epoch 20; iter: 0; batch classifier loss: 0.635403; batch adversarial loss: 0.656513\n", "epoch 21; iter: 0; batch classifier loss: 0.587358; batch adversarial loss: 0.702326\n", "epoch 22; iter: 0; batch classifier loss: 0.584986; batch adversarial loss: 0.657859\n", "epoch 23; iter: 0; batch 
classifier loss: 0.537419; batch adversarial loss: 0.673981\n", "epoch 24; iter: 0; batch classifier loss: 0.559243; batch adversarial loss: 0.667986\n", "epoch 25; iter: 0; batch classifier loss: 0.599423; batch adversarial loss: 0.628164\n", "epoch 26; iter: 0; batch classifier loss: 0.562937; batch adversarial loss: 0.652360\n", "epoch 27; iter: 0; batch classifier loss: 0.557364; batch adversarial loss: 0.636395\n", "epoch 28; iter: 0; batch classifier loss: 0.537692; batch adversarial loss: 0.645858\n", "epoch 29; iter: 0; batch classifier loss: 0.545602; batch adversarial loss: 0.618875\n", "epoch 30; iter: 0; batch classifier loss: 0.586505; batch adversarial loss: 0.689386\n", "epoch 31; iter: 0; batch classifier loss: 0.633933; batch adversarial loss: 0.666522\n", "epoch 32; iter: 0; batch classifier loss: 0.571702; batch adversarial loss: 0.638385\n", "epoch 33; iter: 0; batch classifier loss: 0.599029; batch adversarial loss: 0.630923\n", "epoch 34; iter: 0; batch classifier loss: 0.599588; batch adversarial loss: 0.674220\n", "epoch 35; iter: 0; batch classifier loss: 0.629427; batch adversarial loss: 0.657447\n", "epoch 36; iter: 0; batch classifier loss: 0.602158; batch adversarial loss: 0.656446\n", "epoch 37; iter: 0; batch classifier loss: 0.570137; batch adversarial loss: 0.673896\n", "epoch 38; iter: 0; batch classifier loss: 0.580427; batch adversarial loss: 0.637193\n", "epoch 39; iter: 0; batch classifier loss: 0.518466; batch adversarial loss: 0.642166\n", "epoch 40; iter: 0; batch classifier loss: 0.511679; batch adversarial loss: 0.572230\n", "epoch 41; iter: 0; batch classifier loss: 0.589481; batch adversarial loss: 0.636720\n", "epoch 42; iter: 0; batch classifier loss: 0.595775; batch adversarial loss: 0.658395\n", "epoch 43; iter: 0; batch classifier loss: 0.493201; batch adversarial loss: 0.670631\n", "epoch 44; iter: 0; batch classifier loss: 0.571037; batch adversarial loss: 0.658096\n", "epoch 45; iter: 0; batch classifier loss: 
0.549227; batch adversarial loss: 0.624626\n", "epoch 46; iter: 0; batch classifier loss: 0.537962; batch adversarial loss: 0.648099\n", "epoch 47; iter: 0; batch classifier loss: 0.496648; batch adversarial loss: 0.610983\n", "epoch 48; iter: 0; batch classifier loss: 0.542883; batch adversarial loss: 0.683463\n", "epoch 49; iter: 0; batch classifier loss: 0.610309; batch adversarial loss: 0.674872\n", "epoch 0; iter: 0; batch classifier loss: 1.030058; batch adversarial loss: 0.694517\n", "epoch 1; iter: 0; batch classifier loss: 0.779704; batch adversarial loss: 0.691069\n", "epoch 2; iter: 0; batch classifier loss: 0.716739; batch adversarial loss: 0.682716\n", "epoch 3; iter: 0; batch classifier loss: 0.706991; batch adversarial loss: 0.673462\n", "epoch 4; iter: 0; batch classifier loss: 0.616680; batch adversarial loss: 0.668715\n", "epoch 5; iter: 0; batch classifier loss: 0.648628; batch adversarial loss: 0.663882\n", "epoch 6; iter: 0; batch classifier loss: 0.657879; batch adversarial loss: 0.672087\n", "epoch 7; iter: 0; batch classifier loss: 0.606218; batch adversarial loss: 0.639501\n", "epoch 8; iter: 0; batch classifier loss: 0.647747; batch adversarial loss: 0.670788\n", "epoch 9; iter: 0; batch classifier loss: 0.637523; batch adversarial loss: 0.665442\n", "epoch 10; iter: 0; batch classifier loss: 0.627855; batch adversarial loss: 0.652970\n", "epoch 11; iter: 0; batch classifier loss: 0.668981; batch adversarial loss: 0.654432\n", "epoch 12; iter: 0; batch classifier loss: 0.531797; batch adversarial loss: 0.649315\n", "epoch 13; iter: 0; batch classifier loss: 0.524006; batch adversarial loss: 0.651782\n", "epoch 14; iter: 0; batch classifier loss: 0.560182; batch adversarial loss: 0.625848\n", "epoch 15; iter: 0; batch classifier loss: 0.654811; batch adversarial loss: 0.647128\n", "epoch 16; iter: 0; batch classifier loss: 0.578116; batch adversarial loss: 0.647392\n", "epoch 17; iter: 0; batch classifier loss: 0.533814; batch adversarial 
loss: 0.633563\n", "epoch 18; iter: 0; batch classifier loss: 0.568312; batch adversarial loss: 0.673429\n", "epoch 19; iter: 0; batch classifier loss: 0.528856; batch adversarial loss: 0.642570\n", "epoch 20; iter: 0; batch classifier loss: 0.565710; batch adversarial loss: 0.654657\n", "epoch 21; iter: 0; batch classifier loss: 0.537325; batch adversarial loss: 0.690365\n", "epoch 22; iter: 0; batch classifier loss: 0.581344; batch adversarial loss: 0.678076\n", "epoch 23; iter: 0; batch classifier loss: 0.594040; batch adversarial loss: 0.658689\n", "epoch 24; iter: 0; batch classifier loss: 0.536327; batch adversarial loss: 0.622362\n", "epoch 25; iter: 0; batch classifier loss: 0.534556; batch adversarial loss: 0.626238\n", "epoch 26; iter: 0; batch classifier loss: 0.524584; batch adversarial loss: 0.629111\n", "epoch 27; iter: 0; batch classifier loss: 0.663187; batch adversarial loss: 0.640692\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "epoch 28; iter: 0; batch classifier loss: 0.594791; batch adversarial loss: 0.668779\n", "epoch 29; iter: 0; batch classifier loss: 0.597656; batch adversarial loss: 0.613658\n", "epoch 30; iter: 0; batch classifier loss: 0.601195; batch adversarial loss: 0.627271\n", "epoch 31; iter: 0; batch classifier loss: 0.580568; batch adversarial loss: 0.687125\n", "epoch 32; iter: 0; batch classifier loss: 0.534202; batch adversarial loss: 0.658943\n", "epoch 33; iter: 0; batch classifier loss: 0.602238; batch adversarial loss: 0.641595\n", "epoch 34; iter: 0; batch classifier loss: 0.559796; batch adversarial loss: 0.616791\n", "epoch 35; iter: 0; batch classifier loss: 0.604800; batch adversarial loss: 0.647952\n", "epoch 36; iter: 0; batch classifier loss: 0.533659; batch adversarial loss: 0.622971\n", "epoch 37; iter: 0; batch classifier loss: 0.559279; batch adversarial loss: 0.657430\n", "epoch 38; iter: 0; batch classifier loss: 0.536190; batch adversarial loss: 0.621344\n", "epoch 39; iter: 0; batch 
classifier loss: 0.610820; batch adversarial loss: 0.656518\n", "epoch 40; iter: 0; batch classifier loss: 0.565097; batch adversarial loss: 0.652956\n", "epoch 41; iter: 0; batch classifier loss: 0.550960; batch adversarial loss: 0.634046\n", "epoch 42; iter: 0; batch classifier loss: 0.579663; batch adversarial loss: 0.654857\n", "epoch 43; iter: 0; batch classifier loss: 0.535663; batch adversarial loss: 0.658840\n", "epoch 44; iter: 0; batch classifier loss: 0.554223; batch adversarial loss: 0.612743\n", "epoch 45; iter: 0; batch classifier loss: 0.565461; batch adversarial loss: 0.615858\n", "epoch 46; iter: 0; batch classifier loss: 0.515072; batch adversarial loss: 0.611399\n", "epoch 47; iter: 0; batch classifier loss: 0.555440; batch adversarial loss: 0.599392\n", "epoch 48; iter: 0; batch classifier loss: 0.560522; batch adversarial loss: 0.693490\n", "epoch 49; iter: 0; batch classifier loss: 0.538099; batch adversarial loss: 0.629652\n" ] } ], "source": [ "Search = ModelSearch(models, metrics, hyperparameters, thresholds)\n", "Search.grid_search(data_orig, privileged=privileged, unprivileged=unprivileged, preprocessors=preprocessors, postprocessors=postprocessors)\n", "\n", "Search.to_csv(\"fklearn/interface/static/data/test-file.csv\")" ] }, { "cell_type": "markdown", "metadata": { "comet_cell_id": "2c6e7f3a299b" }, "source": [ "## Step 5: Render visualization of search results\n", "\n", "In addition to running a grid search, fairkit-learn provides functionality to visualize the search results. Fairkit-learn uses Bokeh to render a visualization within the notebook, which you can use when completing the next task to explore trained models' performance and fairness.\n", "\n", "The visualization includes a graph that plots the search results that are Pareto optimal. Each data point in the graph is a model with its own settings (e.g., hyper-parameters, pre/post processing). 
Each model class has its own color to make it easier to see which models are being shown in the visualization. To get more information on a model's settings, hover over the data point of interest; a tooltip with that model's settings will pop up.\n", "\n", "Within the visualization, you can control which metrics and models are included. The drop-down menus allow you to specify the x and y axes for the graph. The checklist below the list of models allows you to select which metrics are considered in the graph.\n", "\n", "To view the *Pareto frontier* for any two metrics (e.g., accuracy and disparate impact), select those two metrics from the drop-down menus and **only** check those boxes in the checklist.\n", "\n", "Below we provide code to load Bokeh and plot the results from the search in the interactive plot." ] }, { "cell_type": "code", "execution_count": 13, "metadata": { "comet_cell_id": "d75c4def39e12" }, "outputs": [ { "data": { "text/html": [ "\n", "
" ] }, "metadata": {}, "output_type": "display_data" }, { "data": { "application/javascript": [ "\n", "(function(root) {\n", " function now() {\n", " return new Date();\n", " }\n", "\n", " var force = true;\n", "\n", " if (typeof root._bokeh_onload_callbacks === \"undefined\" || force === true) {\n", " root._bokeh_onload_callbacks = [];\n", " root._bokeh_is_loading = undefined;\n", " }\n", "\n", " var JS_MIME_TYPE = 'application/javascript';\n", " var HTML_MIME_TYPE = 'text/html';\n", " var EXEC_MIME_TYPE = 'application/vnd.bokehjs_exec.v0+json';\n", " var CLASS_NAME = 'output_bokeh rendered_html';\n", "\n", " /**\n", " * Render data to the DOM node\n", " */\n", " function render(props, node) {\n", " var script = document.createElement(\"script\");\n", " node.appendChild(script);\n", " }\n", "\n", " /**\n", " * Handle when an output is cleared or removed\n", " */\n", " function handleClearOutput(event, handle) {\n", " var cell = handle.cell;\n", "\n", " var id = cell.output_area._bokeh_element_id;\n", " var server_id = cell.output_area._bokeh_server_id;\n", " // Clean up Bokeh references\n", " if (id != null && id in Bokeh.index) {\n", " Bokeh.index[id].model.document.clear();\n", " delete Bokeh.index[id];\n", " }\n", "\n", " if (server_id !== undefined) {\n", " // Clean up Bokeh references\n", " var cmd = \"from bokeh.io.state import curstate; print(curstate().uuid_to_server['\" + server_id + \"'].get_sessions()[0].document.roots[0]._id)\";\n", " cell.notebook.kernel.execute(cmd, {\n", " iopub: {\n", " output: function(msg) {\n", " var id = msg.content.text.trim();\n", " if (id in Bokeh.index) {\n", " Bokeh.index[id].model.document.clear();\n", " delete Bokeh.index[id];\n", " }\n", " }\n", " }\n", " });\n", " // Destroy server and session\n", " var cmd = \"import bokeh.io.notebook as ion; ion.destroy_server('\" + server_id + \"')\";\n", " cell.notebook.kernel.execute(cmd);\n", " }\n", " }\n", "\n", " /**\n", " * Handle when a new output is added\n", " */\n", " 
function handleAddOutput(event, handle) {\n", " var output_area = handle.output_area;\n", " var output = handle.output;\n", "\n", " // limit handleAddOutput to display_data with EXEC_MIME_TYPE content only\n", " if ((output.output_type != \"display_data\") || (!output.data.hasOwnProperty(EXEC_MIME_TYPE))) {\n", " return\n", " }\n", "\n", " var toinsert = output_area.element.find(\".\" + CLASS_NAME.split(' ')[0]);\n", "\n", " if (output.metadata[EXEC_MIME_TYPE][\"id\"] !== undefined) {\n", " toinsert[toinsert.length - 1].firstChild.textContent = output.data[JS_MIME_TYPE];\n", " // store reference to embed id on output_area\n", " output_area._bokeh_element_id = output.metadata[EXEC_MIME_TYPE][\"id\"];\n", " }\n", " if (output.metadata[EXEC_MIME_TYPE][\"server_id\"] !== undefined) {\n", " var bk_div = document.createElement(\"div\");\n", " bk_div.innerHTML = output.data[HTML_MIME_TYPE];\n", " var script_attrs = bk_div.children[0].attributes;\n", " for (var i = 0; i < script_attrs.length; i++) {\n", " toinsert[toinsert.length - 1].firstChild.setAttribute(script_attrs[i].name, script_attrs[i].value);\n", " }\n", " // store reference to server id on output_area\n", " output_area._bokeh_server_id = output.metadata[EXEC_MIME_TYPE][\"server_id\"];\n", " }\n", " }\n", "\n", " function register_renderer(events, OutputArea) {\n", "\n", " function append_mime(data, metadata, element) {\n", " // create a DOM node to render to\n", " var toinsert = this.create_output_subarea(\n", " metadata,\n", " CLASS_NAME,\n", " EXEC_MIME_TYPE\n", " );\n", " this.keyboard_manager.register_events(toinsert);\n", " // Render to node\n", " var props = {data: data, metadata: metadata[EXEC_MIME_TYPE]};\n", " render(props, toinsert[toinsert.length - 1]);\n", " element.append(toinsert);\n", " return toinsert\n", " }\n", "\n", " /* Handle when an output is cleared or removed */\n", " events.on('clear_output.CodeCell', handleClearOutput);\n", " events.on('delete.Cell', handleClearOutput);\n", "\n", " /* 
Handle when a new output is added */\n", " events.on('output_added.OutputArea', handleAddOutput);\n", "\n", " /**\n", " * Register the mime type and append_mime function with output_area\n", " */\n", " OutputArea.prototype.register_mime_type(EXEC_MIME_TYPE, append_mime, {\n", " /* Is output safe? */\n", " safe: true,\n", " /* Index of renderer in `output_area.display_order` */\n", " index: 0\n", " });\n", " }\n", "\n", " // register the mime type if in Jupyter Notebook environment and previously unregistered\n", " if (root.Jupyter !== undefined) {\n", " var events = require('base/js/events');\n", " var OutputArea = require('notebook/js/outputarea').OutputArea;\n", "\n", " if (OutputArea.prototype.mime_types().indexOf(EXEC_MIME_TYPE) == -1) {\n", " register_renderer(events, OutputArea);\n", " }\n", " }\n", "\n", " \n", " if (typeof (root._bokeh_timeout) === \"undefined\" || force === true) {\n", " root._bokeh_timeout = Date.now() + 5000;\n", " root._bokeh_failed_load = false;\n", " }\n", "\n", " var NB_LOAD_WARNING = {'data': {'text/html':\n", " \"\\n\"+\n", " \"BokehJS does not appear to have successfully loaded. If loading BokehJS from CDN, this \\n\"+\n", " \"may be due to a slow or bad network connection. Possible fixes:\\n\"+\n", " \"
\\n\"+\n", " \"\\n\"+\n",
" \"from bokeh.resources import INLINE\\n\"+\n",
" \"output_notebook(resources=INLINE)\\n\"+\n",
" \"</code>\\n\"+\n", " \"</div>\"}};\n",