Activation values for all nodes in a PyBrain network
Question
I feel like this should be trivial, but I've struggled to find anything useful in the PyBrain documentation, on here, or elsewhere.
The problem is this:
I have a three-layer (input, hidden, output) feedforward network built and trained in PyBrain. Each layer has three nodes. I want to activate the network with novel inputs and store the resultant activation values of the nodes at the hidden layer. As far as I can tell, net.activate() and net.activateOnDataset() will only return the activation values of output layer nodes, and are the only ways to activate a network.
How do I get at the hidden layer activations of a PyBrain network?
I'm not sure example code will help that much in this case, but here's some anyway (with a cut-down training set):
from pybrain.tools.shortcuts import buildNetwork
from pybrain.datasets import SupervisedDataSet
from pybrain.supervised.trainers import BackpropTrainer
net = buildNetwork(3, 3, 3)

dataSet = SupervisedDataSet(3, 3)
dataSet.addSample((0, 0, 0), (0, 0, 0))
dataSet.addSample((1, 1, 1), (0, 0, 0))
dataSet.addSample((1, 0, 0), (1, 0, 0))
dataSet.addSample((0, 1, 0), (0, 1, 0))
dataSet.addSample((0, 0, 1), (0, 0, 1))
trainer = BackpropTrainer(net, dataSet)

trained = False
acceptableError = 0.001

# train until acceptable error reached
while not trained:
    error = trainer.train()
    if error < acceptableError:
        trained = True

result = net.activate([0.5, 0.4, 0.7])
print(result)
In this case, the desired functionality is to print a list of the hidden layer's activation values.
Answer
It looks like this should work:
net['in'].outputbuffer[net['in'].offset]
net['hidden0'].outputbuffer[net['hidden0'].offset]
This is based purely on looking at the source code.
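For context, the values held in a layer's outputbuffer are just the squashed weighted sums of that layer's inputs. Here is a minimal NumPy sketch of what buildNetwork(3, 3, 3) computes at the hidden layer (by default, PyBrain's buildNetwork uses sigmoid hidden units with a bias); the weights and bias below are made up for illustration, not taken from a trained network:

```python
import numpy as np

def sigmoid(x):
    # default squashing function for hidden units created by buildNetwork
    return 1.0 / (1.0 + np.exp(-x))

# hypothetical 3->3 input-to-hidden weights and hidden bias (illustrative only)
W_in_hidden = np.array([[ 0.5, -0.2,  0.1],
                        [ 0.3,  0.8, -0.5],
                        [-0.1,  0.4,  0.9]])
bias_hidden = np.array([0.1, -0.3, 0.2])

x = np.array([0.5, 0.4, 0.7])

# these three numbers are what net['hidden0'].outputbuffer would hold
# after net.activate(x), for a network with these particular weights
hidden_activations = sigmoid(W_in_hidden @ x + bias_hidden)
print(hidden_activations)
```

Note that activate() must be called before reading the buffers, since they only reflect the most recent forward pass.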