How to iterate over two dataloaders simultaneously using pytorch?
Question
I am trying to implement a Siamese network that takes in two images. I load these images and create two separate dataloaders.
In my loop I want to go through both dataloaders simultaneously so that I can train the network on both images.
for i, data in enumerate(zip(dataloaders1, dataloaders2)):
    # get the inputs
    # (non_blocking=True replaces the old `async=True`, which is no
    # longer valid since `async` became a reserved word in Python 3.7)
    inputs1 = data[0][0].cuda(non_blocking=True)
    labels1 = data[0][1].cuda(non_blocking=True)
    inputs2 = data[1][0].cuda(non_blocking=True)
    labels2 = data[1][1].cuda(non_blocking=True)

    labels1 = labels1.view(batchSize, 1)
    labels2 = labels2.view(batchSize, 1)

    # zero the parameter gradients
    optimizer.zero_grad()

    # forward + backward + optimize
    outputs1 = alexnet(inputs1)
    outputs2 = alexnet(inputs2)
The return value of the dataloader is a tuple. However, when I try to use zip to iterate over them, I get the following error:
OSError: [Errno 24] Too many open files
Exception NameError: "global name 'FileNotFoundError' is not defined" in <bound method _DataLoaderIter.__del__ of <torch.utils.data.dataloader._DataLoaderIter object at 0x7f2d3c00c190>> ignored
Shouldn't zip work on all iterable items? But it seems like here I can't use it on dataloaders.

Is there any other way to pursue this? Or am I approaching the implementation of a Siamese network incorrectly?
Answer
I see you are struggling to write a proper dataloader. Instead of two separate loaders, I would build one Dataset that returns both images at once:
class Siamese(Dataset):
    def __init__(self, transform=None):
        # init data here

    def __len__(self):
        return  # length of the data

    def __getitem__(self, idx):
        # get images and labels here
        # returned images must be tensors
        # labels should be ints
        return img1, img2, label1, label2
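A minimal runnable sketch of that skeleton, assuming the two underlying datasets are equally long and indexed in matching pairs (the `Siamese` wrapper and the random toy tensors here are illustrative stand-ins, not the asker's actual data):

```python
import torch
from torch.utils.data import Dataset, DataLoader

class Siamese(Dataset):
    """Wraps two equally long datasets so that a single DataLoader
    yields both images and both labels for each index."""
    def __init__(self, dataset1, dataset2):
        assert len(dataset1) == len(dataset2)
        self.dataset1 = dataset1
        self.dataset2 = dataset2

    def __len__(self):
        return len(self.dataset1)

    def __getitem__(self, idx):
        img1, label1 = self.dataset1[idx]
        img2, label2 = self.dataset2[idx]
        return img1, img2, label1, label2

# toy stand-in data: 8 random 3x32x32 "images" with integer labels
data1 = [(torch.randn(3, 32, 32), 0) for _ in range(8)]
data2 = [(torch.randn(3, 32, 32), 1) for _ in range(8)]

loader = DataLoader(Siamese(data1, data2), batch_size=4)
for img1, img2, label1, label2 in loader:
    # one loop now delivers both image batches together,
    # stacked by the default collate function
    print(img1.shape, img2.shape, label1.shape)
```

With this layout the training loop needs only a single `enumerate(loader)`, which sidesteps the file-handle issue entirely.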