Neural Network sigmoid function

Question

I'm trying to make a neural network and I have a couple of questions:

My sigmoid function looks something like this:

s = 1/(1+(2.7183**(-self.values)))  # 2.7183 approximates e
if s > self.weight:
    self.value = 1
else:
    self.value = 0

self.values is an array of the connected nodes; for instance, the HNs (hidden nodes) in HL (hidden layer) 1 are connected to all input nodes, so their self.values is sum(inputnodes.values).

The HNs in HL2 are connected to all HNs in HL1, and their self.values is sum(HL.values).

The problem is, every node ends up with a value of 1, no matter its weight (unless the weight is very high, like 0.90~0.99).

My neural network is set up like so:

(inputs, num_hidden_layers, num_hidden_nodes_per_layer, num_output_nodes), where inputs is a list of binary values:

Here's a log that shows this behavior.

>>NeuralNetwork([1,0,1,1,1,0,0],3,3,1)# 3 layers, 3 nodes each, 1 output
Layer1
Node: y1 Sum: 4, Sigmoid: 0.98, Weight: 0.10, self.value: 1
Node: y2 Sum: 4, Sigmoid: 0.98, Weight: 0.59, self.value: 1
Node: y3 Sum: 4, Sigmoid: 0.98, Weight: 0.74, self.value: 1
Layer2
Node: y1 Sum: 3, Sigmoid: 0.95, Weight: 0.30, self.value: 1
Node: y2 Sum: 3, Sigmoid: 0.95, Weight: 0.37, self.value: 1
Node: y3 Sum: 3, Sigmoid: 0.95, Weight: 0.80, self.value: 1
Layer3
Node: y1 Sum: 3, Sigmoid: 0.95, Weight: 0.70, self.value: 1
Node: y2 Sum: 3, Sigmoid: 0.95, Weight: 0.56, self.value: 1
Node: y3 Sum: 3, Sigmoid: 0.95, Weight: 0.28, self.value: 1

Even if I try using floating-point values in the input, it turns out the same:

>>NeuralNetwork([0.64, 0.57, 0.59, 0.87, 0.56],3,3,1)
Layer1
Node: y1 Sum: 3.23, Sigmoid: 0.96, Weight: 0.77, self.value: 1
Node: y2 Sum: 3.23, Sigmoid: 0.96, Weight: 0.45, self.value: 1
Node: y3 Sum: 3.23, Sigmoid: 0.96, Weight: 0.83, self.value: 1
Layer2
Node: y1 Sum: 3, Sigmoid: 0.95, Weight: 0.26, self.value: 1
Node: y2 Sum: 3, Sigmoid: 0.95, Weight: 0.39, self.value: 1
Node: y3 Sum: 3, Sigmoid: 0.95, Weight: 0.53, self.value: 1
Layer3
Node: y1 Sum: 3, Sigmoid: 0.95, Weight: 0.43, self.value: 1
Node: y2 Sum: 3, Sigmoid: 0.95, Weight: 0.52, self.value: 1
Node: y3 Sum: 3, Sigmoid: 0.95, Weight: 0.96, self.value: 0

Note node y3 in Layer3, the only one that returned a 0 after the sigmoid.

What am I doing wrong?

Also, is it really necessary to connect every node with every other node in the previous layer? Isn't it better to let it be random?

Forgot to mention, this is an in-development NN; I'll be using a genetic algorithm to train the network.

class NeuralNetwork:
    def __init__(self, inputs, num_hidden_layers, num_hidden_nodes_per_layer, num_output):
        self.input_nodes = inputs
        self.num_inputs = len(inputs)
        self.num_hidden_layers = num_hidden_layers
        self.num_hidden_nodes_per_layer = num_hidden_nodes_per_layer
        self.num_output = num_output

        self.createNodes()
        self.weights = self.getWeights()
        self.connectNodes()
        self.updateNodes()

    def createNodes(self):
        # Wrap each raw input value in an InputNode (class not shown here).
        self._input_nodes = []
        for i, v in enumerate(self.input_nodes):
            node = InputNode("x"+str(i+1), v)
            self._input_nodes.append(node)

        # Build the hidden layers (HiddenLayer is also not shown here).
        self._hidden_layers = []
        for n in xrange(self.num_hidden_layers):
            layer = HiddenLayer("Layer"+str(n+1), self.num_hidden_nodes_per_layer)
            self._hidden_layers.append(layer)

    def getWeights(self):
        # Collect every node's weight into one flat list.
        weights = []
        for node in self._input_nodes:
            weights.append(node.weight)

        for layer in self._hidden_layers:
            for node in layer.hidden_nodes:
                weights.append(node.weight)
        return weights

    def connectNodes(self):
        # Fully connect each hidden node to every node in the previous layer.
        for i, layer in enumerate(self._hidden_layers):
            for hidden_node in layer.hidden_nodes:
                if i == 0:
                    for input_node in self._input_nodes:
                        hidden_node.connections.append(input_node)
                else:
                    for previous_node in self._hidden_layers[i-1].hidden_nodes:
                        hidden_node.connections.append(previous_node)

    def updateNodes(self):
        # Propagate values forward through the hidden layers.
        for layer in self._hidden_layers:
            for node in layer.hidden_nodes:
                node.updateValue()

And here's the updateValue() method of the nodes:

def updateValue(self):
    value = 0
    for node in self.connections:
        value += node.value
    self.sigmoid(value) # the function at the beginning of the question.

The nodes created just have a value, a name, and a weight (random at start).
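
For context, a node along the lines of the sketch below is implied by the description above; the class itself isn't shown in the question, so this reconstruction is hypothetical:

import random

class Node:
    # Hypothetical reconstruction: a node holds just a name, a value,
    # a random starting weight, and its incoming connections.
    def __init__(self, name, value=0):
        self.name = name
        self.value = value
        self.weight = random.random()  # random at start, as described
        self.connections = []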

Answer

You are mashing together several different NN concepts.

The logistic function (which is the generalized form of the sigmoid) already serves as a threshold. Specifically, it is a differentiable threshold which is essential for the backpropagation learning algorithm. So you don't need that piecewise threshold function (if statement).
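
For illustration, here is a minimal sketch of a sigmoid activation used directly as the (differentiable) threshold, with no if statement; the function names are mine, not from the question:

import math

def sigmoid(x):
    # Logistic function: squashes any real input into (0, 1) and is
    # smooth, so it needs no separate piecewise cutoff.
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_derivative(x):
    # The derivative backpropagation relies on; it can be written in
    # terms of the sigmoid's own output: s * (1 - s).
    s = sigmoid(x)
    return s * (1.0 - s)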

The weights are analogues for synaptic strength and are applied during summation (or feedforward propagation). So each connection between a pair of nodes has a weight that is multiplied by the sending node's activation level (the output of the threshold function).
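
A corrected updateValue() might then look like the sketch below. It assumes each entry in self.connections is a (node, weight) pair, which is an assumption on my part; the question's code stores a single weight on each receiving node instead:

import math

def updateValue(self):
    # Weighted sum: scale each sending node's activation by the
    # weight of its connection before adding it to the total.
    total = 0.0
    for node, weight in self.connections:  # assumed (node, weight) pairs
        total += weight * node.value
    # The sigmoid output itself is the node's activation level; no
    # extra if/else threshold is applied afterwards.
    self.value = 1.0 / (1.0 + math.exp(-total))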

Finally, even with these changes, a fully-connected neural network with all positive weights will probably still produce all 1's for the output. You can either include negative weights corresponding to inhibitory nodes, or reduce connectivity significantly (e.g. with a 0.1 probability that a node in layer n connects to a node in layer n+1).
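
A minimal sketch of both options, with the weight range and the connection probability chosen purely for illustration:

import random

def random_weight():
    # Weights drawn from [-1, 1]: negative weights act as inhibitory
    # connections that can pull a node's input sum below zero.
    return random.uniform(-1.0, 1.0)

def should_connect(probability=0.1):
    # Sparse connectivity: link a node in layer n to a node in
    # layer n+1 only with the given probability.
    return random.random() < probability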
