Python sklearn show loss values during training


Problem Description

I want to check my loss values during training so I can observe the loss at each iteration. So far I haven't found an easy way for scikit-learn to give me a history of loss values, nor did I find functionality already within scikit to plot the loss for me.

If there is no way to plot this, it'd be great if I could simply fetch the final loss values at the end of classifier.fit.

Note: I am aware of the fact that some solutions are closed form. I'm using several classifiers which do not have analytical solutions, such as logistic regression and SVM.

Does anyone have any suggestions?

Recommended Answer

So I couldn't find very good documentation on directly fetching the loss values per iteration, but I hope this will help someone in the future:

import sys
from io import StringIO

import matplotlib.pyplot as plt
import numpy as np
from sklearn.linear_model import SGDClassifier

# Redirect stdout so the per-iteration lines printed by verbose=1 can be captured.
old_stdout = sys.stdout
sys.stdout = mystdout = StringIO()

# kwargs holds your SGDClassifier arguments; X_tr and y_tr are your training data.
clf = SGDClassifier(**kwargs, verbose=1)
clf.fit(X_tr, y_tr)

# Restore stdout, then parse the captured verbose output for the loss values.
sys.stdout = old_stdout
loss_history = mystdout.getvalue()

loss_list = []
for line in loss_history.split('\n'):
    if len(line.split("loss: ")) == 1:
        continue  # this line does not report a loss value
    loss_list.append(float(line.split("loss: ")[-1]))

# Label the axes before saving so the labels end up in the saved figure.
plt.figure()
plt.plot(np.arange(len(loss_list)), loss_list)
plt.xlabel("Time in epochs")
plt.ylabel("Loss")
plt.savefig("warmstart_plots/pure_SGD:" + str(kwargs) + ".png")
plt.close()

This code takes a normal SGDClassifier (just about any linear classifier), intercepts the output produced by the verbose=1 flag, and then splits that verbose printing to extract the loss values. Obviously this is slower, but it gives us the loss so we can print or plot it.
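If you would rather not capture stdout, a common alternative (not part of the original answer, so treat it as a sketch) is to drive the epochs yourself with partial_fit and compute the training loss explicitly after each pass. The sketch below assumes the same X_tr and y_tr arrays as above and uses the logistic-regression loss (loss="log_loss" in recent scikit-learn, loss="log" in older releases) so that predict_proba and sklearn.metrics.log_loss fit together; swap in whatever loss/metric pair matches your own classifier.

import numpy as np
import matplotlib.pyplot as plt
from sklearn.linear_model import SGDClassifier
from sklearn.metrics import log_loss

# Sketch only: X_tr and y_tr are assumed to be the same training arrays as above.
clf = SGDClassifier(loss="log_loss", random_state=0)  # use loss="log" on older scikit-learn
classes = np.unique(y_tr)            # partial_fit needs the full class list up front
loss_per_epoch = []

for epoch in range(50):              # 50 manual passes over the training data
    clf.partial_fit(X_tr, y_tr, classes=classes)
    probs = clf.predict_proba(X_tr)  # available because we chose a log loss
    loss_per_epoch.append(log_loss(y_tr, probs))  # training log-loss after this epoch

plt.figure()
plt.plot(loss_per_epoch)
plt.xlabel("Epoch")
plt.ylabel("Training log-loss")
plt.show()

The loss computed this way is evaluated on the full training set after each pass, so it will not match the averaged per-iteration loss printed by verbose=1 exactly, but the shape of the curve is usually what matters.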
