Google Colab is very slow compared to my PC

Problem description

I've recently started to use Google Colab, and wanted to train my first Convolutional NN. I imported the images from my Google Drive thanks to the answer I got here.

Then I pasted my code to create the CNN into Colab and started the process. Here is the complete code:

(part 1 is copied from here as it worked as expected for me)

Step 1:

!apt-get install -y -qq software-properties-common python-software-properties module-init-tools
!add-apt-repository -y ppa:alessandro-strada/ppa 2>&1 > /dev/null
!apt-get update -qq 2>&1 > /dev/null
!apt-get -y install -qq google-drive-ocamlfuse fuse

Step 2:

from google.colab import auth
auth.authenticate_user()

Step 3:

from oauth2client.client import GoogleCredentials
creds = GoogleCredentials.get_application_default()
import getpass
!google-drive-ocamlfuse -headless -id={creds.client_id} -secret={creds.client_secret} < /dev/null 2>&1 | grep URL
vcode = getpass.getpass()
!echo {vcode} | google-drive-ocamlfuse -headless -id={creds.client_id} -secret={creds.client_secret}

Step 4:

!mkdir -p drive
!google-drive-ocamlfuse drive

Step 5:

print('Files in Drive:')
!ls drive/

Part 2: Copy pasting my CNN

I created this CNN with tutorials from a Udemy course. It uses Keras with TensorFlow as backend. For the sake of simplicity I uploaded a really simple version, which is more than enough to show my problem.

from keras.models import Sequential
from keras.layers import Conv2D
from keras.layers import MaxPooling2D
from keras.layers import Flatten 
from keras.layers import Dense 
from keras.layers import Dropout
from keras.optimizers import Adam 
from keras.preprocessing.image import ImageDataGenerator 

Parameters

imageSize=32

batchSize=64

epochAmount=50

CNN

classifier=Sequential() 

classifier.add(Conv2D(32, (3, 3), input_shape = (imageSize, imageSize, 3), activation = 'relu')) #convolutional layer

classifier.add(MaxPooling2D(pool_size = (2, 2))) #pooling layer

classifier.add(Flatten())

Neural network

classifier.add(Dense(units=64, activation='relu')) #hidden layer

classifier.add(Dense(units=1, activation='sigmoid')) #output layer

classifier.compile(optimizer = "adam", loss = 'binary_crossentropy', metrics = ['accuracy']) #training method

Image preprocessing

train_datagen = ImageDataGenerator(rescale = 1./255,
                               shear_range = 0.2,
                               zoom_range = 0.2,
                               horizontal_flip = True)

test_datagen = ImageDataGenerator(rescale = 1./255) 

training_set = train_datagen.flow_from_directory('drive/School/sem-2-2018/BSP2/UdemyCourse/CNN/dataset/training_set',
                                             target_size = (imageSize, imageSize),
                                             batch_size = batchSize,
                                             class_mode = 'binary')

test_set = test_datagen.flow_from_directory('drive/School/sem-2-2018/BSP2/UdemyCourse/CNN/dataset/test_set',
                                        target_size = (imageSize, imageSize),
                                        batch_size = batchSize,
                                        class_mode = 'binary')
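As an aside, flow_from_directory with class_mode = 'binary' infers the two classes from the subfolder names, so both directories above are assumed to contain one subfolder per class. A minimal sketch of that expected layout (the cats/dogs folder names here are illustrative, not taken from the question):

```python
import os
import tempfile

# Hypothetical dataset layout that flow_from_directory expects:
# <root>/<split>/<class>/*.jpg, with one subfolder per class.
root = tempfile.mkdtemp()
for split in ('training_set', 'test_set'):
    for cls in ('cats', 'dogs'):
        os.makedirs(os.path.join(root, split, cls))

print(sorted(os.listdir(os.path.join(root, 'training_set'))))  # ['cats', 'dogs']
```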

classifier.fit_generator(training_set,
                     steps_per_epoch = (8000//batchSize),
                     epochs = epochAmount,
                     validation_data = test_set,
                     validation_steps = (2000//batchSize))
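The steps_per_epoch and validation_steps arithmetic above is simply samples divided by batch size, which is where the 125 steps per epoch in the progress logs come from:

```python
batchSize = 64

# 8000 training images and 2000 test images, as stated in the question.
steps_per_epoch = 8000 // batchSize
validation_steps = 2000 // batchSize

print(steps_per_epoch)   # 125, matching the "x/125" progress bars
print(validation_steps)  # 31
```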

Now comes my Problem

First off, the training set I used is a database with 10000 dog and cat pictures of various resolutions (8000 in the training_set, 2000 in the test_set).

I ran this CNN on Google Colab (with GPU support enabled) and on my PC (tensorflow-gpu on a GTX 1060).

This is an intermediate result from my PC:

Epoch 2/50
63/125 [==============>...............] - ETA: 2s - loss: 0.6382 - acc: 0.6520

And this from Colab:

Epoch 1/50
13/125 [==>...........................] - ETA: 1:00:51 - loss: 0.7265 - acc: 0.4916
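Back-of-the-envelope, the two ETAs imply how much slower each step runs on Colab here. This is a rough sketch only, since Keras ETAs fluctuate early in an epoch, so treat the ratio as an order of magnitude:

```python
# Remaining steps = 125 total minus steps already completed.
pc_seconds_per_step = 2 / (125 - 63)                  # ETA 2s with 62 steps left
colab_seconds_per_step = (60 * 60 + 51) / (125 - 13)  # ETA 1:00:51 with 112 steps left

ratio = colab_seconds_per_step / pc_seconds_per_step
print(round(ratio))  # on the order of 1000x slower per step
```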

Why is Google Colab so slow in my case?

Personally, I suspect a bottleneck consisting of pulling and then reading the images from my Drive, but I don't know how to solve this other than choosing a different method to import the database.

Answer

As @Feng has already noted, reading files from Drive is very slow. This tutorial suggests using some sort of memory-mapped file, like HDF5 or LMDB, to overcome this issue. That way the I/O operations are much faster (for a complete explanation of the speed gain of the HDF5 format, see this).
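A minimal sketch of the memory-mapped idea using numpy.memmap (HDF5 via h5py works analogously); the file name, image count, and shapes below are placeholders, not taken from the question:

```python
import os
import tempfile

import numpy as np

n, h, w, c = 8, 32, 32, 3  # placeholder dataset dimensions
path = os.path.join(tempfile.mkdtemp(), 'train_images.dat')

# One-time packing step: write all images into a single flat binary file.
packed = np.memmap(path, dtype='uint8', mode='w+', shape=(n, h, w, c))
packed[:] = 0  # in practice: the decoded image data
packed.flush()

# Training-time reads then hit one local memory-mapped file instead of
# issuing a separate (slow) Drive round-trip per image.
images = np.memmap(path, dtype='uint8', mode='r', shape=(n, h, w, c))
batch = images[:4]
print(batch.shape)  # (4, 32, 32, 3)
```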
