Add CycleGAN colab notebook
bilal2vec committed Oct 4, 2019
1 parent 43521b0 commit b90e1f0
Showing 1 changed file with 255 additions and 0 deletions.
255 changes: 255 additions & 0 deletions CycleGAN.ipynb
@@ -0,0 +1,255 @@
{
"nbformat": 4,
"nbformat_minor": 0,
"metadata": {
"colab": {
"name": "CycleGAN",
"provenance": [],
"collapsed_sections": [],
"include_colab_link": true
},
"kernelspec": {
"name": "python3",
"display_name": "Python 3"
},
"accelerator": "GPU"
},
"cells": [
{
"cell_type": "markdown",
"metadata": {
"id": "view-in-github",
"colab_type": "text"
},
"source": [
"<a href=\"https://colab.research.google.com/github/bkkaggle/pytorch-CycleGAN-and-pix2pix/blob/master/CycleGAN.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "5VIGyIus8Vr7",
"colab_type": "text"
},
"source": [
"Take a look at the [repository](https://github.com/junyanz/pytorch-CycleGAN-and-pix2pix) for more information."
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "7wNjDKdQy35h",
"colab_type": "text"
},
"source": [
"# Install"
]
},
{
"cell_type": "code",
"metadata": {
"id": "TRm-USlsHgEV",
"colab_type": "code",
"colab": {}
},
"source": [
"!git clone https://github.com/junyanz/pytorch-CycleGAN-and-pix2pix"
],
"execution_count": 0,
"outputs": []
},
{
"cell_type": "code",
"metadata": {
"id": "Pt3igws3eiVp",
"colab_type": "code",
"colab": {}
},
"source": [
"import os\n",
"os.chdir('pytorch-CycleGAN-and-pix2pix/')"
],
"execution_count": 0,
"outputs": []
},
{
"cell_type": "code",
"metadata": {
"id": "z1EySlOXwwoa",
"colab_type": "code",
"colab": {}
},
"source": [
"!pip install -r requirements.txt"
],
"execution_count": 0,
"outputs": []
},
{
"cell_type": "markdown",
"metadata": {
"id": "8daqlgVhw29P",
"colab_type": "text"
},
"source": [
"# Datasets\n",
"\n",
"Download one of the official datasets with:\n",
"\n",
"- `bash ./datasets/download_cyclegan_dataset.sh [apple2orange, orange2apple, summer2winter_yosemite, winter2summer_yosemite, horse2zebra, zebra2horse, monet2photo, style_monet, style_cezanne, style_ukiyoe, style_vangogh, sat2map, map2sat, cityscapes_photo2label, cityscapes_label2photo, facades_photo2label, facades_label2photo, iphone2dslr_flower]`\n",
"\n",
"Or use your own dataset by creating the appropriate folders and adding in the images.\n",
"\n",
"- Create a folder for your dataset under `./datasets`.\n",
"- Create subfolders `testA`, `testB`, `trainA`, and `trainB` under your dataset's folder. Place the images you want to transform from A to B (e.g. cat2dog) in `testA`, the images you want to transform from B to A (e.g. dog2cat) in `testB`, and do the same for `trainA` and `trainB`. A folder-creation sketch follows the download cell below."
]
},
{
"cell_type": "code",
"metadata": {
"id": "vrdOettJxaCc",
"colab_type": "code",
"colab": {}
},
"source": [
"!bash ./datasets/download_cyclegan_dataset.sh horse2zebra"
],
"execution_count": 0,
"outputs": []
},
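{
"cell_type": "markdown",
"metadata": {
"id": "xD1sKtF2yQ9a",
"colab_type": "text"
},
"source": [
"A minimal sketch (not part of the original repo) of the custom-dataset layout described above, assuming a hypothetical `cat2dog` dataset; replace the name and copy in your own images."
]
},
{
"cell_type": "code",
"metadata": {
"id": "xD2sKtF2yQ9b",
"colab_type": "code",
"colab": {}
},
"source": [
"import os\n",
"\n",
"# Sketch: create the folder layout described above for a custom dataset.\n",
"# 'cat2dog' is a placeholder name; copy your own images into the\n",
"# corresponding folders afterwards.\n",
"for split in ['trainA', 'trainB', 'testA', 'testB']:\n",
"    os.makedirs(os.path.join('datasets', 'cat2dog', split), exist_ok=True)"
],
"execution_count": 0,
"outputs": []
},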
{
"cell_type": "markdown",
"metadata": {
"id": "gdUz4116xhpm",
"colab_type": "text"
},
"source": [
"# Pretrained models\n",
"\n",
"Download one of the official pretrained models with:\n",
"\n",
"- `bash ./scripts/download_cyclegan_model.sh [apple2orange, orange2apple, summer2winter_yosemite, winter2summer_yosemite, horse2zebra, zebra2horse, monet2photo, style_monet, style_cezanne, style_ukiyoe, style_vangogh, sat2map, map2sat, cityscapes_photo2label, cityscapes_label2photo, facades_photo2label, facades_label2photo, iphone2dslr_flower]`\n",
"\n",
"Or add your own pretrained generator at `./checkpoints/{NAME}_pretrained/latest_net_G.pth` (a staging sketch follows the download cell below)."
]
},
{
"cell_type": "code",
"metadata": {
"id": "B75UqtKhxznS",
"colab_type": "code",
"colab": {}
},
"source": [
"!bash ./scripts/download_cyclegan_model.sh horse2zebra"
],
"execution_count": 0,
"outputs": []
},
{
"cell_type": "markdown",
"metadata": {
"id": "yFw1kDQBx3LN",
"colab_type": "text"
},
"source": [
"# Training\n",
"\n",
"- `python train.py --dataroot ./datasets/horse2zebra --name horse2zebra --model cycle_gan`\n",
"\n",
"Change `--dataroot` and `--name` to your own dataset's path and model name. Use `--gpu_ids 0,1,..` to train on multiple GPUs and `--batch_size` to change the batch size. I've found that a batch size of 16 fits onto 4 V100s and can finish training an epoch in ~90s.\n",
"\n",
"Once your model has trained, copy the latest checkpoint to the filename that the test script expects (the A-to-B copy is run in a cell below):\n",
"\n",
"Use `cp ./checkpoints/horse2zebra/latest_net_G_A.pth ./checkpoints/horse2zebra/latest_net_G.pth` if you want to transform images from class A to class B and `cp ./checkpoints/horse2zebra/latest_net_G_B.pth ./checkpoints/horse2zebra/latest_net_G.pth` if you want to transform images from class B to class A.\n"
]
},
{
"cell_type": "code",
"metadata": {
"id": "0sp7TCT2x9dB",
"colab_type": "code",
"colab": {}
},
"source": [
"!python train.py --dataroot ./datasets/horse2zebra --name horse2zebra --model cycle_gan"
],
"execution_count": 0,
"outputs": []
},
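{
"cell_type": "markdown",
"metadata": {
"id": "qN1eMvH4aS1e",
"colab_type": "text"
},
"source": [
"As described above, copy the latest A-to-B generator checkpoint to the filename the test script expects (use `latest_net_G_B.pth` instead for the B-to-A direction):"
]
},
{
"cell_type": "code",
"metadata": {
"id": "qN2eMvH4aS1f",
"colab_type": "code",
"colab": {}
},
"source": [
"!cp ./checkpoints/horse2zebra/latest_net_G_A.pth ./checkpoints/horse2zebra/latest_net_G.pth"
],
"execution_count": 0,
"outputs": []
},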
{
"cell_type": "markdown",
"metadata": {
"id": "9UkcaFZiyASl",
"colab_type": "text"
},
"source": [
"# Testing\n",
"\n",
"- `python test.py --dataroot datasets/horse2zebra/testA --name horse2zebra_pretrained --model test --no_dropout`\n",
"\n",
"Change `--dataroot` and `--name` to be consistent with your trained model's configuration.\n",
"\n",
"> From https://github.com/junyanz/pytorch-CycleGAN-and-pix2pix:\n",
"> The option `--model test` is used for generating results of CycleGAN only for one side. This option will automatically set `--dataset_mode single`, which only loads the images from one set. On the contrary, using `--model cycle_gan` requires loading and generating results in both directions, which is sometimes unnecessary. The results will be saved at `./results/`. Use `--results_dir {directory_path_to_save_result}` to specify the results directory.\n",
"\n",
"> For your own experiments, you might want to specify `--netG`, `--norm`, and `--no_dropout` to match the generator architecture of the trained model."
]
},
{
"cell_type": "code",
"metadata": {
"id": "uCsKkEq0yGh0",
"colab_type": "code",
"colab": {}
},
"source": [
"!python test.py --dataroot datasets/horse2zebra/testA --name horse2zebra_pretrained --model test --no_dropout"
],
"execution_count": 0,
"outputs": []
},
{
"cell_type": "markdown",
"metadata": {
"id": "OzSKIPUByfiN",
"colab_type": "text"
},
"source": [
"# Visualize"
]
},
{
"cell_type": "code",
"metadata": {
"id": "9Mgg8raPyizq",
"colab_type": "code",
"colab": {}
},
"source": [
"import matplotlib.pyplot as plt\n",
"\n",
"# Display a translated (fake) image produced by the generator\n",
"img = plt.imread('./results/horse2zebra_pretrained/test_latest/images/n02381460_1010_fake.png')\n",
"plt.imshow(img)"
],
"execution_count": 0,
"outputs": []
},
{
"cell_type": "code",
"metadata": {
"id": "0G3oVH9DyqLQ",
"colab_type": "code",
"colab": {}
},
"source": [
"import matplotlib.pyplot as plt\n",
"\n",
"# Display the corresponding original (real) input image\n",
"img = plt.imread('./results/horse2zebra_pretrained/test_latest/images/n02381460_1010_real.png')\n",
"plt.imshow(img)"
],
"execution_count": 0,
"outputs": []
}
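,
{
"cell_type": "markdown",
"metadata": {
"id": "sP1gOxJ6cU3i",
"colab_type": "text"
},
"source": [
"As a final sketch, show the real image and its translation side by side; the filename assumes the `horse2zebra` test image used above."
]
},
{
"cell_type": "code",
"metadata": {
"id": "sP2gOxJ6cU3j",
"colab_type": "code",
"colab": {}
},
"source": [
"import matplotlib.pyplot as plt\n",
"\n",
"# Show the input (real) image and its translation (fake) side by side.\n",
"fig, axes = plt.subplots(1, 2, figsize=(10, 5))\n",
"for ax, kind in zip(axes, ['real', 'fake']):\n",
"    img = plt.imread(f'./results/horse2zebra_pretrained/test_latest/images/n02381460_1010_{kind}.png')\n",
"    ax.imshow(img)\n",
"    ax.set_title(kind)\n",
"    ax.axis('off')\n",
"plt.show()"
],
"execution_count": 0,
"outputs": []
}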
]
}
