How can I complete the following GRU-based RNN written in TensorFlow?


Problem description

So far I have written the following code:

import pickle
import numpy as np
import pandas as pd
import tensorflow as tf

# load pickled objects (x and y)
x_input, y_actual = pickle.load(open('sample_input.pickle', 'rb'))
x_input = np.reshape(x_input, (50, 1))
y_actual = np.reshape(y_actual, (50, 1))

# parameters
batch_size = 50
hidden_size = 100

# create network graph
input_data = tf.placeholder(tf.float32, [batch_size, 1])
output_data = tf.placeholder(tf.float32, [batch_size, 1])

cell = tf.nn.rnn_cell.GRUCell(hidden_size)

initial_state = cell.zero_state(batch_size, tf.float32)

hidden_state = initial_state

output_of_cell, hidden_state = cell(inputs=input_data, state=hidden_state)

init_op = tf.initialize_all_variables()

softmax_w = tf.get_variable("softmax_w", [hidden_size, 1], )
softmax_b = tf.get_variable("softmax_b", [1])

logits = tf.matmul(output_of_cell, softmax_w) + softmax_b

probabilities = tf.nn.softmax(logits)

sess = tf.Session()
sess.run(init_op)

something = sess.run([probabilities, hidden_state], feed_dict={input_data:x_input, output_data:y_actual})

#cost = tf.nn.sigmoid_cross_entropy_with_logits(logits, output_data)


#sess.close()

But I am getting an error for softmax_w/b being uninitialized variables.

I don't understand how I should use these W and b and carry out the training operation.

Something like the following:

## some cost function
## training operation minimizing cost function using gradient descent optimizer

Answer

tf.initialize_all_variables() gets the "current" set of variables from the graph. Since you are creating softmax_w and softmax_b after your call to tf.initialize_all_variables(), they are not in the list that tf.initialize_all_variables() consults, and hence not initialized when you run sess.run(init_op). The following should work:

softmax_w = tf.get_variable("softmax_w", [hidden_size, 1], )
softmax_b = tf.get_variable("softmax_b", [1])

init_op = tf.initialize_all_variables()

