Keras verbose training progress bar writing a new line on each batch issue

Problem description

I'm running a dense feed-forward neural net in Keras. There are class_weights for two of the outputs, and sample_weights for a third output. For some reason it prints the verbose progress display on a new line for each batch computed, instead of updating the print on the same line as it is supposed to. Has this ever happened to you? How is it fixed? From the shell:

42336/747322 [====>.........................] - ETA: 79s - loss: 20.7154 - x1_loss: 9.5913 - x2_loss: 10.0536 - x3_loss: 1.0705 - x1_acc: 0.6930 - x2_acc: 0.4433 - x3_acc: 0.6821
143360/747322 [====>.........................] - ETA: 78s - loss: 20.7387 - x1_loss: 9.6131 - x2_loss: 10.0555 - x3_loss: 1.0702 - x1_acc: 0.6930 - x2_acc: 0.4432 - x3_acc: 0.6820
144384/747322 [====>.........................] - ETA: 78s - loss: 20.7362 - x1_loss: 9.6067 - x2_loss: 10.0608 - x3_loss: 1.0687 - x1_acc: 0.6930 - x2_acc: 0.4429 - x3_acc: 0.6817
145408/747322 [====>.........................] - ETA: 78s - loss: 20.7257 - x1_loss: 9.5985 - x2_loss: 10.0571 - x3_loss: 1.0702 - x1_acc: 0.6929 - x2_acc: 0.4428 - x3_acc: 0.6815
146432/747322 [====>.........................] - ETA: 78s - loss: 20.7145 - x1_loss: 9.5849 - x2_loss: 10.0605 - x3_loss: 1.0691 - x1_acc: 0.6932 - x2_acc: 0.4429 - x3_acc: 0.6816
147456/747322 [====>.........................] - ETA: 78s - loss: 20.7208 - x1_loss: 9.5859 - x2_loss: 10.0662 - x3_loss: 1.0688 - x1_acc: 0.6931 - x2_acc: 0.4429 - x3_acc: 0.6815
148480/747322 [====>.........................] - ETA: 78s - loss: 20.7078 - x1_loss: 9.5762 - x2_loss: 10.0636 - x3_loss: 1.0680 - x1_acc: 0.6932 - x2_acc: 0.4430 - x3_acc: 0.6815
149504/747322 [=====>........................] - ETA: 77s - loss: 20.6987 - x1_loss: 9.5749 - x2_loss: 10.0555 - x3_loss: 1.0683 - x1_acc: 0.6931 - x2_acc: 0.4430 - x3_acc: 0.6817
150528/747322 [=====>........................] - ETA: 77s - loss: 20.9883 - x1_loss: 9.5688 - x2_loss: 10.3509 - x3_loss: 1.0686 - x1_acc: 0.6928 - x2_acc: 0.4428 - x3_acc: 0.6819
151552/747322 [=====>........................] - ETA: 77s - loss: 20.9721 - x1_loss: 9.5606 - x2_loss: 10.3435 - x3_loss: 1.0679 - x1_acc: 0.6927 - x2_acc: 0.4426 - x3_acc: 0.6821
152576/747322 [=====>........................] - ETA: 77s - loss: 20.9585 - x1_loss: 9.5558 - x2_loss: 10.3355 - x3_loss: 1.0672 - x1_acc: 0.6926 - x2_acc: 0.4425 - x3_acc: 0.6822
153600/747322 [=====>........................] - ETA: 77s - loss: 20.9409 - x1_loss: 9.5447 - x2_loss: 10.3300 - x3_loss: 1.0662 - x1_acc: 0.6925 - x2_acc: 0.4426 - x3_acc: 0.6822
154624/747322 [=====>........................] - ETA: 77s - loss: 20.9254 - x1_loss: 9.5341 - x2_loss: 10.3250 - x3_loss: 1.0663 - x1_acc: 0.6924 - x2_acc: 0.4425 - x3_acc: 0.6825
155648/747322 [=====>........................] - ETA: 77s - loss: 20.9189 - x1_loss: 9.5270 - x2_loss: 10.3249 - x3_loss: 1.0670 - x1_acc: 0.6925 - x2_acc: 0.4425 - x3_acc: 0.6825
156672/747322 [=====>........................] - ETA: 76s - loss: 20.9069 - x1_loss: 9.5155 - x2_loss: 10.3256 - x3_loss: 1.0658 - x1_acc: 0.6927 - x2_acc: 0.4423 - x3_acc: 0.6827
157696/747322 [=====>........................] - ETA: 76s - loss: 20.9275 - x1_loss: 9.5461 - x2_loss: 10.3163 - x3_loss: 1.0651 - x1_acc: 0.6927 - x2_acc: 0.4422 - x3_acc: 0.6828
158720/747322 [=====>........................] - ETA: 76s - loss: 21.4809 - x1_loss: 10.1018 - x2_loss: 10.3133 - x3_loss: 1.0659 - x1_acc: 0.6928 - x2_acc: 0.4422 - x3_acc: 0.6829
159744/747322 [=====>........................] - ETA: 76s - loss: 21.4617 - x1_loss: 10.0871 - x2_loss: 10.3093 - x3_loss: 1.0653 - x1_acc: 0.6928 - x2_acc: 0.4421 - x3_acc: 0.6830
160768/747322 [=====>........................] - ETA: 76s - loss: 21.5462 - x1_loss: 10.1705 - x2_loss: 10.3105 - x3_loss: 1.0652 - x1_acc: 0.6928 - x2_acc: 0.4420 - x3_acc: 0.6832
161792/747322 [=====>........................] - ETA: 76s - loss: 21.5642 - x1_loss: 10.1849 - x2_loss: 10.3138 - x3_loss: 1.0655 - x1_acc: 0.6928 - x2_acc: 0.4418 - x3_acc: 0.6832
162816/747322 [=====>........................] - ETA: 76s - loss: 21.5508 - x1_loss: 10.1739 - x2_loss: 10.3118 - x3_loss: 1.0651 - x1_acc: 0.6928 - x2_acc: 0.4418 - x3_acc: 0.6833
163840/747322 [=====>........................] - ETA: 76s - loss: 21.5323 - x1_loss: 10.1606 - x2_loss: 10.3057 - x3_loss: 1.0659 - x1_acc: 0.6927 - x2_acc: 0.4419 - x3_acc: 0.6833
164864/747322 [=====>........................] - ETA: 75s - loss: 21.5282 - x1_loss: 10.1607 - x2_loss: 10.3016 - x3_loss: 1.0659 - x1_acc: 0.6926 - x2_acc: 0.4418 - x3_acc: 0.6834
165888/747322 [=====>........................] - ETA: 75s - loss: 21.5321 - x1_loss: 10.1696 - x2_loss: 10.2963 - x3_loss: 1.0662 - x1_acc: 0.6927 - x2_acc: 0.4417 - x3_acc: 0.6834
166912/747322 [=====>........................] - ETA: 75s - loss: 21.5131 - x1_loss: 10.1554 - x2_loss: 10.2912 - x3_loss: 1.0664 - x1_acc: 0.6927 - x2_acc: 0.4416 - x3_acc: 0.6833
167936/747322 [=====>........................] - ETA: 75s - loss: 21.5211 - x1_loss: 10.1649 - x2_loss: 10.2886 - x3_loss: 1.0676 - x1_acc: 0.6929 - x2_acc: 0.4415 - x3_acc: 0.6835
168960/747322 [=====>........................] - ETA: 75s - loss: 21.5049 - x1_loss: 10.1504 - x2_loss: 10.2870 - x3_loss: 1.0676 - x1_acc: 0.6930 - x2_acc: 0.4414 - x3_acc: 0.6835
169984/747322 [=====>........................] - ETA: 75s - loss: 21.5171 - x1_loss: 10.1684 - x2_loss: 10.2818 - x3_loss: 1.0670 - x1_acc: 0.6931 - x2_acc: 0.4414 - x3_acc: 0.6832
171008/747322 [=====>........................] - ETA: 75s - loss: 21.5036 - x1_loss: 10.1541 - x2_loss: 10.2816 - x3_loss: 1.0678 - x1_acc: 0.6931 - x2_acc: 0.4413 - x3_acc: 0.6828
172032/747322 [=====>........................] - ETA: 75s - loss: 21.4870 - x1_loss: 10.1377 - x2_loss: 10.2816 - x3_loss: 1.0677 - x1_acc: 0.6931 - x2_acc: 0.4413 - x3_acc: 0.6827
173056/747322 [=====>........................] - ETA: 75s - loss: 21.4729 - x1_loss: 10.1210 - x2_loss: 10.2836 - x3_loss: 1.0683 - x1_acc: 0.6931 - x2_acc: 0.4413 - x3_acc: 0.6824
174080/747322 [=====>........................] - ETA: 74s - loss: 21.4512 - x1_loss: 10.1085 - x2_loss: 10.2742 - x3_loss: 1.0685 - x1_acc: 0.6931 - x2_acc: 0.4414 - x3_acc: 0.6821
175104/747322 [======>.......................] - ETA: 74s - loss: 21.4315 - x1_loss: 10.0977 - x2_loss: 10.2647 - x3_loss: 1.0690 - x1_acc: 0.6931 - x2_acc: 0.4414 - x3_acc: 0.6817
176128/747322 [======>.......................] - ETA: 74s - loss: 21.4231 - x1_loss: 10.0880 - x2_loss: 10.2656 - x3_loss: 1.0695 - x1_acc: 0.6932 - x2_acc: 0.4412 - x3_acc: 0.6813
177152/747322 [======>.......................] - ETA: 74s - loss: 21.4059 - x1_loss: 10.0732 - x2_loss: 10.2639 - x3_loss: 1.0688 - x1_acc: 0.6931 - x2_acc: 0.4412 - x3_acc: 0.6809
178176/747322 [======>.......................] - ETA: 74s - loss: 21.4289 - x1_loss: 10.0967 - x2_loss: 10.2634 - x3_loss: 1.0688 - x1_acc: 0.6930 - x2_acc: 0.4413 - x3_acc: 0.6807
179200/747322 [======>.......................] - ETA: 74s - loss: 21.4329 - x1_loss: 10.1092 - x2_loss: 10.2557 - x3_loss: 1.0681 - x1_acc: 0.6930 - x2_acc: 0.4414 - x3_acc: 0.6807
180224/747322 [======>.......................] - ETA: 74s - loss: 21.4277 - x1_loss: 10.1099 - x2_loss: 10.2503 - x3_loss: 1.0675 - x1_acc: 0.6930 - x2_acc: 0.4415 - x3_acc: 0.6807
181248/747322 [======>.......................] - ETA: 73s - loss: 21.4088 - x1_loss: 10.0975 - x2_loss: 10.2441 - x3_loss: 1.0671 - x1_acc: 0.6929 - x2_acc: 0.4416 - x3_acc: 0.6808
182272/747322 [======>.......................] - ETA: 73s - loss: 21.3909 - x1_loss: 10.0841 - x2_loss: 10.2405 - x3_loss: 1.0663 - x1_acc: 0.6929 - x2_acc: 0.4415 - x3_acc: 0.6811
183296/747322 [======>.......................] - ETA: 73s - loss: 21.3775 - x1_loss: 10.0699 - x2_loss: 10.2416 - x3_loss: 1.0660 - x1_acc: 0.6927 - x2_acc: 0.4415 - x3_acc: 0.6813
184320/747322 [======>.......................] - ETA: 73s - loss: 21.3682 - x1_loss: 10.0664 - x2_loss: 10.2355 - x3_loss: 1.0662 - x1_acc: 0.6928 - x2_acc: 0.4417 - x3_acc: 0.6818
185344/747322 [======>.......................] - ETA: 73s - loss: 21.4162 - x1_loss: 10.1213 - x2_loss: 10.2291 - x3_loss: 1.0658 - x1_acc: 0.6927 - x2_acc: 0.4417 - x3_acc: 0.6821
186368/747322 [======>.......................] - ETA: 73s - loss: 21.3981 - x1_loss: 10.1050 - x2_loss: 10.2259 - x3_loss: 1.0672 - x1_acc: 0.6928 - x2_acc: 0.4418 - x3_acc: 0.6825
187392/747322 [======>.......................] - ETA: 73s - loss: 21.3793 - x1_loss: 10.0909 - x2_loss: 10.2212 - x3_loss: 1.0673 - x1_acc: 0.6928 - x2_acc: 0.4417 - x3_acc: 0.6827
188416/747322 [======>.......................] - ETA: 73s - loss: 21.3614 - x1_loss: 10.0784 - x2_loss: 10.2163 - x3_loss: 1.0668 - x1_acc: 0.6930 - x2_acc: 0.4418 - x3_acc: 0.6830
189440/747322 [======>.......................] - ETA: 72s - loss: 21.3736 - x1_loss: 10.0909 - x2_loss: 10.2169 - x3_loss: 1.0659 - x1_acc: 0.6930 - x2_acc: 0.4417 - x3_acc: 0.6833
190464/747322 [======>.......................] - ETA: 72s - loss: 21.4615 - x1_loss: 10.0802 - x2_loss: 10.3165 - x3_loss: 1.0648 - x1_acc: 0.6930 - x2_acc: 0.4418 - x3_acc: 0.6836
191488/747322 [======>.......................] - ETA: 72s - loss: 21.4493 - x1_loss: 10.0653 - x2_loss: 10.3194 - x3_loss: 1.0646 - x1_acc: 0.6930 - x2_acc: 0.4417 - x3_acc: 0.6837
192512/747322 [======>.......................] - ETA: 72s - loss: 21.4863 - x1_loss: 10.0997 - x2_loss: 10.3207 - x3_loss: 1.0659 - x1_acc: 0.6927 - x2_acc: 0.4416 - x3_acc: 0.6837
193536/747322 [======>.......................] - ETA: 72s - loss: 21.4750 - x1_loss: 10.0895 - x2_loss: 10.3198 - x3_loss: 1.0657 - x1_acc: 0.6929 - x2_acc: 0.4416 - x3_acc: 0.6839
194560/747322 [======>.......................] - ETA: 72s - loss: 21.4577 - x1_loss: 10.0755 - x2_loss: 10.3168 - x3_loss: 1.0654 - x1_acc: 0.6929 - x2_acc: 0.4416 - x3_acc: 0.6839
195584/747322 [======>.......................] - ETA: 72s - loss: 21.4429 - x1_loss: 10.0627 - x2_loss: 10.3148 - x3_loss: 1.0655 - x1_acc: 0.6929 - x2_acc: 0.4417 - x3_acc: 0.6838
196608/747322 [======>.......................] - ETA: 71s - loss: 21.4307 - x1_loss: 10.0558 - x2_loss: 10.3089 - x3_loss: 1.0660 - x1_acc: 0.6929 - x2_acc: 0.4418 - x3_acc: 0.6834
197632/747322 [======>.......................] - ETA: 71s - loss: 21.4446 - x1_loss: 10.0669 - x2_loss: 10.3107 - x3_loss: 1.0670 - x1_acc: 0.6929 - x2_acc: 0.4418 - x3_acc: 0.6830
198656/747322 [======>.......................] - ETA: 71s - loss: 21.4287 - x1_loss: 10.0552 - x2_loss: 10.3071 - x3_loss: 1.0665 - x1_acc: 0.6930 - x2_acc: 0.4418 - x3_acc: 0.6827
199680/747322 [=======>......................] - ETA: 71s - loss: 21.4168 - x1_loss: 10.0474 - x2_loss: 10.3034 - x3_loss: 1.0660 - x1_acc: 0.6931 - x2_acc: 0.4417 - x3_acc: 0.6823
200704/747322 [=======>......................] - ETA: 71s - loss: 21.4064 - x1_loss: 10.0385 - x2_loss: 10.3015 - x3_loss: 1.0664 - x1_acc: 0.6931 - x2_acc: 0.4417 - x3_acc: 0.6819
201728/747322 [=======>......................] - ETA: 71s - loss: 21.3954 - x1_loss: 10.0320 - x2_loss: 10.2974 - x3_loss: 1.0659 - x1_acc: 0.6931 - x2_acc: 0.4416 - x3_acc: 0.6817
202752/747322 [=======>......................] - ETA: 71s - loss: 21.3870 - x1_loss: 10.0243 - x2_loss: 10.2965 - x3_loss: 1.0662 - x1_acc: 0.6931 - x2_acc: 0.4415 - x3_acc: 0.6816
203776/747322 [=======>......................] - ETA: 70s - loss: 21.3782 - x1_loss: 10.0155 - x2_loss: 10.2954 - x3_loss: 1.0673 - x1_acc: 0.6929 - 

etc...
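For context, the setup described above looks roughly like the following minimal sketch. The layer sizes, output names x1/x2/x3, and toy data are illustrative assumptions, not the asker's actual code; the class/sample weights are omitted since they don't affect the display issue. One common cause of the symptom is that Keras only redraws the bar in place when it detects an interactive terminal:

import numpy as np
from keras.models import Model
from keras.layers import Input, Dense

# Three-output dense feed-forward net, mirroring the x1/x2/x3
# metrics in the log above (all sizes are made up).
inp = Input(shape=(100,))
h = Dense(64, activation="relu")(inp)
x1 = Dense(10, activation="softmax", name="x1")(h)
x2 = Dense(10, activation="softmax", name="x2")(h)
x3 = Dense(2, activation="softmax", name="x3")(h)
model = Model(inp, [x1, x2, x3])
model.compile(optimizer="adam", loss="categorical_crossentropy",
              metrics=["accuracy"])

n = 2048
X = np.random.rand(n, 100)
y = {"x1": np.eye(10)[np.random.randint(10, size=n)],
     "x2": np.eye(10)[np.random.randint(10, size=n)],
     "x3": np.eye(2)[np.random.randint(2, size=n)]}

# With verbose=1, Keras redraws the bar in place only when it thinks
# stdout is an interactive terminal; otherwise each update is written
# on a fresh line, producing output like the log above.
model.fit(X, y, batch_size=1024, epochs=1, verbose=1)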

Recommended answer

I've added built-in support for Keras to tqdm, so you can use it instead (pip install "tqdm>=4.41.0"):

from tqdm.keras import TqdmCallback
...
model.fit(..., verbose=0, callbacks=[TqdmCallback(verbose=2)])

This turns off Keras' progress display (verbose=0) and uses tqdm instead. For the callback, verbose=2 means separate progress bars for epochs and batches, 1 means the batch bars are cleared when done, and 0 means only epochs are shown (batch bars never appear).
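Put together, a minimal runnable sketch (the toy model and data here are assumptions for illustration, not part of the original answer):

import numpy as np
from keras.models import Sequential
from keras.layers import Dense
from tqdm.keras import TqdmCallback

# Toy binary classifier, purely for illustration.
model = Sequential([Dense(8, activation="relu", input_shape=(4,)),
                    Dense(1, activation="sigmoid")])
model.compile(optimizer="adam", loss="binary_crossentropy")

X = np.random.rand(256, 4)
y = np.random.randint(2, size=(256, 1))

# verbose=0 silences Keras' own (line-spamming) progress display;
# TqdmCallback(verbose=2) draws a tqdm epoch bar plus a batch bar.
model.fit(X, y, epochs=3, batch_size=32, verbose=0,
          callbacks=[TqdmCallback(verbose=2)])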

If there are problems, please open an issue at https://github.com/tqdm/tqdm
