U-net low contrast test images, predict output is grey box


Problem description

I am running the U-net from https://github.com/zhixuhao/unet, but the predicted images are all grey. I also get an error about low contrast images for my test data. Has anyone had or resolved this problem?

I am training with 50 ultrasound images, which becomes around 2000-3000 images after augmentation, for 5 epochs with 300 steps per epoch and a batch size of 2.

Many thanks, Helena

Answer

First, make sure that your data pipeline is correct (a quick sanity check is sketched below). Beyond that, there are a few things to consider; I hope one of the points below helps:
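As a quick sanity check of the pipeline, you can pull a single batch from trainGenerator before training and save an image/mask pair to disk. This is only a sketch; the paths and folder names assume the data/membrane layout from the repo and should be replaced by whatever you pass in your own training script.

import numpy as np
from skimage import io, img_as_ubyte
from data import trainGenerator   # data.py from the repo

# Placeholder arguments: batch size, train path, image/mask folders, no augmentation.
gen = trainGenerator(2, 'data/membrane/train', 'image', 'label', dict())
img_batch, mask_batch = next(gen)

print(img_batch.shape, img_batch.min(), img_batch.max())  # images should be floats in [0, 1]
print(mask_batch.shape, np.unique(mask_batch))            # masks should be binary

# Write one pair to disk and inspect it with an image viewer.
io.imsave('check_image.png', img_as_ubyte(img_batch[0, :, :, 0]))
io.imsave('check_mask.png', img_as_ubyte(mask_batch[0, :, :, 0]))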

1. Choose the right loss function. Binary cross-entropy might lead your network to optimize for all labels equally; if the labels in your images are unbalanced, it can drive the network to return only white, grey, or black predictions. Try using the dice coefficient loss instead (see the sketch below).
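A minimal sketch of such a dice coefficient loss using the Keras backend (the repo is a Keras implementation). The smoothing constant and the function names are my own choices, not something from the repo:

from keras import backend as K

def dice_coef(y_true, y_pred, smooth=1.0):
    # 2*|A ∩ B| / (|A| + |B|), computed on the flattened masks.
    y_true_f = K.flatten(y_true)
    y_pred_f = K.flatten(y_pred)
    intersection = K.sum(y_true_f * y_pred_f)
    return (2.0 * intersection + smooth) / (K.sum(y_true_f) + K.sum(y_pred_f) + smooth)

def dice_coef_loss(y_true, y_pred):
    # Minimizing this loss maximizes the dice coefficient.
    return 1.0 - dice_coef(y_true, y_pred)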

2. Change the line in testGenerator. One thing that seems to be an issue in data.py, in the testGenerator method, is the following line:

img = img / 255

Change it to:

img /= 255.
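Either way, it is worth checking what value range a test image actually has after loading, so you can see whether the division does what you expect. A small check along these lines (the path is just a placeholder):

import numpy as np
from skimage import io

img = io.imread('data/membrane/test/0.png', as_gray=True)
print(img.dtype, img.min(), img.max())

# With as_gray=True skimage usually returns floats already in [0, 1];
# only rescale when the image really comes back as 0-255 integers.
if img.dtype == np.uint8:
    img = img / 255.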

3. Reduce the learning rate. If your learning rate is too high, you might converge to a poor optimum, which also tends to produce only grey, black, or white predictions. Try a learning rate around Adam(lr = 3e-5) and train for a sufficient number of epochs; monitor the dice loss rather than accuracy to check convergence (see the compile sketch below).
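Combined with point 1, compiling the model could then look roughly like the sketch below. dice_coef_loss and dice_coef are the functions sketched above; older Keras versions take lr=, newer ones learning_rate=. Recompiling simply overrides whatever unet() may have compiled with internally.

from keras.optimizers import Adam
from model import unet   # model.py from the repo

model = unet()
model.compile(optimizer=Adam(lr=3e-5),
              loss=dice_coef_loss,     # from the sketch in point 1
              metrics=[dice_coef])     # report dice instead of accuracy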

4. Do not use activation functions for the last set of convolutions. For the last set of convolutions, that is 128 -> 64 -> 64 -> 1, no activation function should be used, because the activation causes the values to vanish.
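As a sketch of what that change could look like in Keras: leaving activation unset (the default, i.e. a linear output) on the last convolutions. The filter sizes follow the 128 -> 64 -> 64 -> 1 sequence mentioned above; the Input layer is only a placeholder standing in for the output of the previous decoder block.

from keras.layers import Input, Conv2D

# Placeholder input standing in for the output of the previous decoder block.
x = Input(shape=(256, 256, 128))
x = Conv2D(128, 3, padding='same', kernel_initializer='he_normal')(x)
x = Conv2D(64, 3, padding='same', kernel_initializer='he_normal')(x)
x = Conv2D(64, 3, padding='same', kernel_initializer='he_normal')(x)
output = Conv2D(1, 1)(x)  # no activation on the final 1x1 convolution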

5. Your saving method could have a "bug". Make sure you scale your image to values between 0 and 255 before saving; skimage usually warns you about this with a low contrast image warning.

import os
from skimage import io, img_as_uint

io.imsave(os.path.join(save_path, "%d_predict.tif" % (i)), img_as_uint(img))
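A slightly fuller sketch of a save loop, assuming results comes from model.predict(...) as float predictions of shape (N, H, W, 1); the min/max stretch to 0-255 is my addition to illustrate the point, and save_path is a placeholder:

import os
import numpy as np
from skimage import io

save_path = 'results'   # placeholder output folder
# results = model.predict(...), float predictions of shape (N, H, W, 1)
for i, img in enumerate(results):
    img = img[:, :, 0]
    # Stretch the raw prediction to the full 0-255 range before saving;
    # this also sidesteps skimage's low contrast warning.
    img = (img - img.min()) / (img.max() - img.min() + 1e-8)
    io.imsave(os.path.join(save_path, "%d_predict.tif" % i),
              (img * 255).astype(np.uint8))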

6. Your saving format could have a "bug". Make sure you save your image in a proper format. In my experience, saving as .png gave only black or grey images, whereas .tif worked like a charm.

7. You might just not train enough. Often you will panic when your network does not behave as you would like and abort the training. Chances are, additional training epochs are exactly what it needed.
