Result changes every time I run Neural Network code


Problem Description

I got the results by running the code provided in this link: Neural Network – Predicting Values of Multiple Variables. I was able to compute losses, accuracy, etc. However, every time I run this code, I get a new result. Is it possible to get the same (consistent) result?

Recommended Answer

The code is full of random.randint() calls everywhere! Furthermore, the weights are in most cases randomly initialized as well, and the batch_size also has an influence (although a pretty minor one) on the result.

  1. Y_train, X_test, X_train are generated randomly (a short sketch illustrating this follows the list).
  2. Using adam as the optimizer means you will be performing stochastic gradient descent, iterating from a random starting point in order to converge.
  3. A batch_size of 8 means you will run batches consisting of 8 randomly selected samples.
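For instance, a minimal sketch of the first point (the array shapes here are hypothetical; the linked code uses its own dimensions), showing why unseeded randint()-style data generation produces a new dataset on every run:

```python
import numpy as np

# Unseeded: these arrays come out different on every run of the script.
X_train = np.random.randint(0, 100, size=(1000, 5))
Y_train = np.random.randint(0, 2, size=(1000, 1))

# Seeded: rerunning the script now regenerates identical arrays.
np.random.seed(42)
X_train = np.random.randint(0, 100, size=(1000, 5))
Y_train = np.random.randint(0, 2, size=(1000, 1))
```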

Solutions:

  1. Set a random seed in the code, using np.random.seed(), so that the same random values are always generated (see the sketch after this list).
  2. Although there is a small deviation, it won't cause a big problem.
  3. Same as 2.
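A minimal sketch of such seeding, assuming the linked code uses NumPy together with Keras/TensorFlow (tf.keras.utils.set_random_seed is my assumption about the stack and requires TensorFlow 2.7+):

```python
import numpy as np
import tensorflow as tf

# Fix NumPy's generator so randint()-based data generation repeats exactly.
np.random.seed(0)

# One call that seeds Python's `random`, NumPy, and TensorFlow together;
# this also pins the random initialization of Keras layer weights.
tf.keras.utils.set_random_seed(0)
```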

If I find a way to have consistent sampling methods for the batch_size/epoch issue, I will edit my answer.
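One possible workaround (my own addition, not part of the original answer): Keras's model.fit accepts a shuffle argument, and passing shuffle=False stops the training data from being reshuffled each epoch, so together with the seeds above the batches of 8 should be identical across runs. A sketch, assuming a compiled Keras model named model and the data from the linked code:

```python
# `model`, X_train and Y_train are assumed to come from the linked code;
# epochs=100 is a placeholder. shuffle=False keeps the sample order fixed,
# so every epoch draws the same batches of 8 in the same order on each run.
model.fit(X_train, Y_train, batch_size=8, epochs=100, shuffle=False)
```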

