Neural network activation function


Question

This is a beginner-level question. I have several training inputs in binary, and for the neural network I am using a sigmoid thresholding function SigmoidFn(Input1*Weights), where

SigmoidFn(x) =  1./(1+exp(-1.*x));
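For reference, the same logistic sigmoid can be sketched in Python/NumPy (the name sigmoid here is just a stand-in for the MATLAB SigmoidFn above):

```python
import numpy as np

def sigmoid(x):
    # Logistic sigmoid: squashes any real input into the open interval (0, 1).
    # sigmoid(0) = 0.5, which is why zero-valued net inputs round up to 1 below.
    return 1.0 / (1.0 + np.exp(-np.asarray(x, dtype=float)))
```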

The above function gives continuous real numbers, but I want the output to be binary, since the network is a Hopfield neural net (a single layer with 5 input nodes and 5 output nodes). The problem I am facing is that I am unable to correctly understand the usage and implementation of the various thresholding functions. The weights given below are the true weights, provided in the paper. I am using these weights to generate several training examples and several output samples while keeping the weights fixed, that is, by simply running the neural network several times.

Weights = [0.0  0.5  0.0  0.2  0.0
           0.0  0.0  1.0  0.0  0.0
           0.0  0.0  0.0  1.0  0.0
           0.0  1.0  0.0  0.0  0.0
           0.0  0.0  0.0 -0.6  0.0];


Input1 = [0,1,0,0,0]

x = Input1*Weights;   % x = 0 0 1 0 0

  1. As can be seen, the result of the multiplication is the second row of Weights. Is this a mere coincidence?

Next,

SigmoidFn  =  1./(1+exp(-1.*x))

SigmoidFn =

0.5000    0.5000    0.7311    0.5000    0.5000

  • round(SigmoidFn)
    
    ans =
    
         1     1     1     1     1
    

  • Input2 = [1,0,0,0,0]
    
    x = Input2*Weights
    
    x =  0  0.5000  0  0.2000  0
    SigmoidFn  =  1./(1+exp(-1.*x))
    
    SigmoidFn =  0.5000    0.6225    0.5000    0.5498    0.5000
    
    >> round(SigmoidFn)
    
    ans =
    
          1     1     1     1     1
    

    Is it good practice to use the round function, round(SigmoidFn(x))? The result obtained is not correct. Or how should I obtain a binary result when I use any of these threshold functions: (a) hard limit, (b) logistic sigmoid, (c) tanh?

    Can somebody please show the proper code for thresholding, along with a brief explanation of when to use which activation function? I mean, there should be some logic to it, otherwise why are there different kinds of functions?

    EDIT: Implementation of Hopfield to recall the input pattern by successive iterations while keeping the weights fixed.

    Training1 = [1,0,0,0,0];
    offset = 0;
    t = 1;
    X(t,:) = Training1;
    err = 1;
    while (err ~= 0)
        Out = X(t,:)*Weights > offset;
        err = ((Out - temp)*(Out - temp).')/numel(temp);
        t = t+1;
        X(t,:) = temp;
    end
    

    Answer

    Hopfield networks do not use a sigmoid nonlinearity; the state of a node is simply updated according to whether its weighted input is greater than or equal to its offset.

    You want something like

    output2 = Weights * Input1' >= offsets;
    

    where offsets is the same size as Input1. I used Weights * Input1' instead of Input1 * Weights because most examples I have seen use left-multiplication for updating (that is, the rows of the weight matrix label the input nodes and the columns label the output nodes), but you will have to check wherever you got your weight matrix to be sure.

    You should be aware that you will have to perform this update operation many times before you converge to a fixed point which represents a stored pattern.

    In response to your further questions, the weight matrix you have chosen does not store any memories that can be recalled with a Hopfield network. It contains a cycle 2 -> 3 -> 4 -> 2 ... that will not allow the network to converge.
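That cycle can be checked directly. A small Python/NumPy sketch (using a strict > 0 hard limit, as in the code from the edit) shows the state revisiting itself every three updates instead of settling:

```python
import numpy as np

W = np.array([[0.0, 0.5, 0.0, 0.2, 0.0],
              [0.0, 0.0, 1.0, 0.0, 0.0],
              [0.0, 0.0, 0.0, 1.0, 0.0],
              [0.0, 1.0, 0.0, 0.0, 0.0],
              [0.0, 0.0, 0.0, -0.6, 0.0]])

x = np.array([0, 1, 0, 0, 0])          # start with node 2 active
states = [tuple(x)]
for _ in range(6):
    x = (x @ W > 0).astype(int)        # hard-limit update, strict threshold
    states.append(tuple(x))

# Activity passes node 2 -> node 3 -> node 4 -> node 2 -> ... forever,
# so the network never reaches a fixed point.
```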

    In general you would recover a memory in a way similar to what you wrote in your edit:

    X = [1,0,0,0,0];
    offset = 0;
    t = 1;
    err = 1;
    nIter = 100;
    
    while err ~= 0 && t <= nIter
       prev = X;
       X = X * Weights >= offset;
       err = ~isequal(X, prev);
       t = t + 1;
    end
    
    if ~err
        disp(X);
    end
    

    If you refer to the Wikipedia page, this is what's referred to as the synchronous update method.
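The MATLAB recall loop above can be sketched equivalently in Python/NumPy (same synchronous update, >= threshold, zero offset). Note that with this particular weight matrix and a zero offset, every zero-valued net input also passes the >= test, so the state immediately saturates to all ones, another sign that no useful pattern is stored:

```python
import numpy as np

def recall(W, x0, offset=0.0, n_iter=100):
    # Synchronous Hopfield-style update until a fixed point or the iteration cap.
    x = np.asarray(x0, dtype=int)
    for _ in range(n_iter):
        new = (x @ W >= offset).astype(int)
        if np.array_equal(new, x):
            return new                 # converged to a fixed point
        x = new
    return None                        # no fixed point reached within n_iter

W = np.array([[0.0, 0.5, 0.0, 0.2, 0.0],
              [0.0, 0.0, 1.0, 0.0, 0.0],
              [0.0, 0.0, 0.0, 1.0, 0.0],
              [0.0, 1.0, 0.0, 0.0, 0.0],
              [0.0, 0.0, 0.0, -0.6, 0.0]])

result = recall(W, [1, 0, 0, 0, 0])
```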
