{"id":360,"date":"2020-08-13T11:09:29","date_gmt":"2020-08-13T02:09:29","guid":{"rendered":"http:\/\/cedartrees.co.kr\/?p=360"},"modified":"2021-04-03T19:16:17","modified_gmt":"2021-04-03T10:16:17","slug":"cnn-fashion-mnist-test","status":"publish","type":"post","link":"http:\/\/blog.cedartrees.co.kr\/index.php\/2020\/08\/13\/cnn-fashion-mnist-test\/","title":{"rendered":"CNN Fashion-MNIST Test (PyTorch)"},"content":{"rendered":"\n<p>For a description of Fashion-MNIST, I will defer to the quoted overview and links below.<\/p>\n\n\n\n<p><code>Fashion-MNIST<\/code>&nbsp;is a dataset of&nbsp;<a href=\"https:\/\/jobs.zalando.com\/tech\/\">Zalando<\/a>&#8217;s article images\u2014consisting of a training set of 60,000 examples and a test set of 10,000 examples. Each example is a 28&#215;28 grayscale image, associated with a label from 10 classes. We intend&nbsp;<code>Fashion-MNIST<\/code>&nbsp;to serve as a direct&nbsp;<strong>drop-in replacement<\/strong>&nbsp;for the original&nbsp;<a href=\"http:\/\/yann.lecun.com\/exdb\/mnist\/\">MNIST dataset<\/a>&nbsp;for benchmarking machine learning algorithms. 
It shares the same image size and structure of training and testing splits.<\/p>\n\n\n\n<figure class=\"wp-block-embed-youtube wp-block-embed is-type-video is-provider-youtube wp-embed-aspect-16-9 wp-has-aspect-ratio\"><div class=\"wp-block-embed__wrapper\">\n<iframe loading=\"lazy\" title=\"Machine Learning Meets Fashion\" width=\"525\" height=\"295\" src=\"https:\/\/www.youtube.com\/embed\/RJudqel8DVA?feature=oembed\" frameborder=\"0\" allow=\"accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture\" allowfullscreen><\/iframe>\n<\/div><\/figure>\n\n\n\n<figure class=\"wp-block-image size-large\"><img loading=\"lazy\" width=\"800\" height=\"561\" src=\"http:\/\/cedartrees.co.kr\/wp-content\/uploads\/2020\/08\/embedding.gif\" alt=\"\" class=\"wp-image-362\"\/><\/figure>\n\n\n\n<p>This example code trains a model on the dataset and predicts which class an input image belongs to. 
When the Fashion-MNIST dataset is projected into a vector space, the images can be clustered as shown above.<\/p>\n\n\n\n<p>Now we download the dataset for training.<\/p>\n\n\n\n<pre class=\"EnlighterJSRAW\" data-enlighter-language=\"generic\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">import torch\nfrom torchvision import datasets, transforms\n\n# Define a transform to normalize the data\ntransform = transforms.Compose([transforms.ToTensor(),\n                                transforms.Normalize((0.5,), (0.5,))])\n\n# Download and load the training data\ntrain_loader = torch.utils.data.DataLoader(datasets.FashionMNIST('..\/F_MNIST_data\/', download=True, train=True, transform=transform), batch_size=128, shuffle=True)\n\n# Download and load the test data\ntest_loader = torch.utils.data.DataLoader(datasets.FashionMNIST('..\/F_MNIST_data\/', download=True, train=False, transform=transform), batch_size=128, shuffle=True)<\/pre>\n\n\n\n<p>To see what kinds of images the downloaded data contains, we will randomly draw a few samples and display them. The images are fashion items belonging to 10 classes: [&#8216;t-shirt&#8217;, &#8216;trouser&#8217;, &#8216;pullover&#8217;, &#8216;dress&#8217;, &#8216;coat&#8217;, &#8216;sandal&#8217;, &#8216;shirt&#8217;, &#8216;sneaker&#8217;, &#8216;bag&#8217;, &#8216;ankle boot&#8217;]. 
<\/p>\n\n\n\n<pre class=\"EnlighterJSRAW\" data-enlighter-language=\"generic\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">import numpy\nimport matplotlib.pyplot as plt\n\n# Class names for the 10 Fashion-MNIST labels\nlabel = ['t-shirt', 'trouser', 'pullover', 'dress', 'coat', 'sandal', 'shirt', 'sneaker', 'bag', 'ankle boot']\n\nx_train, y_train = next(iter(train_loader))\nx_valid, y_valid = next(iter(test_loader))\n\nfig, ax = plt.subplots(5,5)\nfig.set_size_inches((20,14))\nfor i in range(5):\n    for j in range(5):\n        idx = numpy.random.randint(128)  # random index within the batch\n        ax[i][j].imshow(x_train[idx,0,:])\n        ax[i][j].set_xlabel(label[y_train[idx].item()])\n        ax[i][j].set_xticklabels([])\n        ax[i][j].set_yticklabels([])<\/pre>\n\n\n\n<figure class=\"wp-block-image size-large\"><img loading=\"lazy\" width=\"1024\" height=\"760\" src=\"http:\/\/cedartrees.co.kr\/wp-content\/uploads\/2020\/08\/download-2-1024x760.png\" alt=\"\" class=\"wp-image-361\" srcset=\"http:\/\/blog.cedartrees.co.kr\/wp-content\/uploads\/2020\/08\/download-2-1024x760.png 1024w, http:\/\/blog.cedartrees.co.kr\/wp-content\/uploads\/2020\/08\/download-2-300x223.png 300w, http:\/\/blog.cedartrees.co.kr\/wp-content\/uploads\/2020\/08\/download-2-768x570.png 768w, http:\/\/blog.cedartrees.co.kr\/wp-content\/uploads\/2020\/08\/download-2.png 1072w\" sizes=\"(max-width: 706px) 89vw, (max-width: 767px) 82vw, 740px\" \/><\/figure>\n\n\n\n<p>Next we define the model for training. 
We reuse the same model as in the earlier MNIST dataset test.<\/p>\n\n\n\n<pre class=\"EnlighterJSRAW\" data-enlighter-language=\"generic\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">import torch\nfrom torch import nn\n\nclass Net(nn.Module):\n    def __init__(self):\n        super(Net, self).__init__()\n        \n        self.convs = nn.Sequential(\n            nn.Conv2d(1, 10, kernel_size=3), # input_channel, output_channel, kernel_size\n            nn.ReLU(),\n            nn.BatchNorm2d(10),\n            nn.Conv2d(10, 20, kernel_size=3, stride=2),\n            nn.ReLU(),\n            nn.BatchNorm2d(20),\n            nn.Conv2d(20, 40, kernel_size=3, stride=2)\n        )\n        \n        self.layers = nn.Sequential(\n            nn.Linear(40*5*5, 500),\n            nn.Dropout(p=0.2),\n            nn.ReLU(),\n            nn.BatchNorm1d(500),\n            nn.Linear(500,250),\n            nn.Linear(250,100),\n            nn.Dropout(p=0.2),\n            nn.ReLU(),\n            nn.BatchNorm1d(100),\n            nn.Linear(100,50),\n            nn.Linear(50, 10) # raw logits: CrossEntropyLoss applies log-softmax internally\n        )\n\n    def forward(self, x):\n        x = self.convs(x)\n        x = x.view(-1, 40*5*5)\n        return self.layers(x)\n    \nDEVICE = torch.device('cuda' if torch.cuda.is_available() else 'cpu')\ncnn = Net().to(DEVICE)<\/pre>\n\n\n\n<p>Whereas the earlier MNIST code trained on only a single mini-batch, this time we train on the entire dataset. 
Training longer would raise the model's accuracy, but since this is not a performance-tuning test, we run only a minimal number of training epochs.<\/p>\n\n\n\n<pre class=\"EnlighterJSRAW\" data-enlighter-language=\"generic\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">from torch import optim\n\noptimizer = optim.Adam(cnn.parameters())\ncriterion = nn.CrossEntropyLoss()\n\nhist_loss = []\nhist_accr = []\n\nepochs = 30\n\ncnn.train()  # enable dropout and batch-norm updates\nfor epoch in range(epochs):\n\n    for idx, (data, label) in enumerate(train_loader):\n        data, label = data.to(DEVICE), label.to(DEVICE)\n        output = cnn(data)\n        loss = criterion(output, label)\n        \n        predict = torch.argmax(output, dim=-1) == label\n        accuracy = predict.float().mean().item()\n\n        optimizer.zero_grad()\n        loss.backward()\n        optimizer.step()\n        \n        hist_loss.append(loss.item())\n        hist_accr.append(accuracy)\n\n        if idx % 100 == 0:\n            print('Epoch {}, idx {}, Loss : {:.5f}, Accuracy : {:.5f}'.format(epoch, idx, loss.item(), accuracy))\n<\/pre>\n\n\n\n<p>After training completes, we visualize how it progressed, using the hist_loss and hist_accr lists defined earlier.<\/p>\n\n\n\n<pre class=\"EnlighterJSRAW\" data-enlighter-language=\"generic\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">fig, ax = 
plt.subplots(2,1)\nfig.set_size_inches((12,8))\n\nax[0].set_title('Loss')\nax[0].plot(hist_loss, color='red')\nax[0].set_ylabel('Loss')\nax[1].set_title('Accuracy')\nax[1].plot(hist_accr, color='blue')\nax[1].set_ylabel('Accuracy')\nax[1].set_xlabel('Iterations')  # one point per mini-batch, not per epoch<\/pre>\n\n\n\n<figure class=\"wp-block-image size-large\"><img loading=\"lazy\" width=\"720\" height=\"496\" src=\"http:\/\/cedartrees.co.kr\/wp-content\/uploads\/2020\/08\/download-1-1.png\" alt=\"\" class=\"wp-image-364\" srcset=\"http:\/\/blog.cedartrees.co.kr\/wp-content\/uploads\/2020\/08\/download-1-1.png 720w, http:\/\/blog.cedartrees.co.kr\/wp-content\/uploads\/2020\/08\/download-1-1-300x207.png 300w\" sizes=\"(max-width: 720px) 100vw, 720px\" \/><\/figure>\n\n\n\n<p>After training was complete, we checked the model's accuracy on the test data and obtained Accuracy : 0.93750 as the result.<\/p>\n\n\n\n<pre class=\"EnlighterJSRAW\" data-enlighter-language=\"generic\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">cnn.eval()  # disable dropout; use running batch-norm statistics\n\ncorrect = 0\ntotal = 0\nwith torch.no_grad():\n    for data, label in test_loader:\n        data, label = data.to(DEVICE), label.to(DEVICE)\n        output = cnn(data)\n    \n        predict = torch.argmax(output, dim=-1) == label\n        correct += predict.sum().item()\n        total += label.size(0)\n\nprint('Accuracy : {:.5f}'.format(correct \/ total))<\/pre>\n","protected":false},"excerpt":{"rendered":"<p>For a description of Fashion-MNIST, I will defer to the quoted overview and links below. 
Fashion-MNIST&nbsp;is a dataset of&nbsp;Zalando&#8217;s article images\u2014consisting of a training set of 60,000 examples and a test set of 10,000 examples. Each example is a 28&#215;28 grayscale image, associated with a label from 10 classes. We intend&nbsp;Fashion-MNIST&nbsp;to serve as a direct&nbsp;drop-in replacement&nbsp;for the original&nbsp;MNIST dataset&nbsp;for benchmarking machine learning algorithms. It &hellip; <\/p>\n<p class=\"link-more\"><a href=\"http:\/\/blog.cedartrees.co.kr\/index.php\/2020\/08\/13\/cnn-fashion-mnist-test\/\" class=\"more-link\">Read more<span class=\"screen-reader-text\"> &#8220;CNN Fashion-MNIST Test (PyTorch)&#8221;<\/span><\/a><\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":[],"categories":[41,14],"tags":[37,39,38,6],"_links":{"self":[{"href":"http:\/\/blog.cedartrees.co.kr\/index.php\/wp-json\/wp\/v2\/posts\/360"}],"collection":[{"href":"http:\/\/blog.cedartrees.co.kr\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"http:\/\/blog.cedartrees.co.kr\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"http:\/\/blog.cedartrees.co.kr\/index.php\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"http:\/\/blog.cedartrees.co.kr\/index.php\/wp-json\/wp\/v2\/comments?post=360"}],"version-history":[{"count":5,"href":"http:\/\/blog.cedartrees.co.kr\/index.php\/wp-json\/wp\/v2\/posts\/360\/revisions"}],"predecessor-version":[{"id":370,"href":"http:\/\/blog.cedartrees.co.kr\/index.php\/wp-json\/wp\/v2\/posts\/360\/revisions\/370"}],"wp:attachment":[{"href":"http:\/\/blog.cedartrees.co.kr\/index.php\/wp-json\/wp\/v2\/media?parent=360"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"http:\/\/blog.cedartrees.co.kr\/index.php\/wp-json\/wp\/v2\/categories?post=360"},{"taxonomy":"post_tag","embeddable":true,"href":"http:\/\/blog.cedartrees.co.kr
\/index.php\/wp-json\/wp\/v2\/tags?post=360"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}