Adding regularizer to skflow


Problem description

I recently switched from tensorflow to skflow. In tensorflow we would add lambda * tf.nn.l2_loss(weights) to our loss. Now I have the following code in skflow:

import tensorflow as tf
import skflow

def deep_psi(X, y):
    layers = skflow.ops.dnn(X, [5, 10, 20, 10, 5], keep_prob=0.5)
    preds, loss = skflow.models.logistic_regression(layers, y)
    return preds, loss

def exp_decay(global_step):
    return tf.train.exponential_decay(learning_rate=0.01,
                                      global_step=global_step,
                                      decay_steps=1000,
                                      decay_rate=0.005)

deep_cd = skflow.TensorFlowEstimator(model_fn=deep_psi,
                                    n_classes=2,
                                    steps=10000,
                                    batch_size=10,
                                    learning_rate=exp_decay,
                                    verbose=True,)

How and where do I add a regularizer here? Illia hints at something here, but I couldn't figure it out.

Answer

You can still add additional components to the loss; you just need to retrieve the weights from dnn / logistic_regression and add them to the loss:

def regularize_loss(loss, weights, reg_lambda):
    # "lambda" is a reserved keyword in Python, so the regularization
    # coefficient is named reg_lambda here.
    for weight in weights:
        loss = loss + reg_lambda * tf.nn.l2_loss(weight)
    return loss


def deep_psi(X, y):
    layers = skflow.ops.dnn(X, [5, 10, 20, 10, 5], keep_prob=0.5)
    preds, loss = skflow.models.logistic_regression(layers, y)

    weights = []
    for layer in range(5):  # n layers you passed to dnn
        weights.append(tf.get_variable("dnn/layer%d/linear/Matrix" % layer))
        # biases are also available at dnn/layer%d/linear/Bias
    weights.append(tf.get_variable('logistic_regression/weights'))

    return preds, regularize_loss(loss, weights, reg_lambda=0.01)  # example coefficient

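Numerically, tf.nn.l2_loss(w) computes sum(w ** 2) / 2, so the loop above adds reg_lambda * sum(w ** 2) / 2 to the loss for each weight tensor. A minimal NumPy sketch of the same computation (illustrative only, not skflow code; the names and values here are made up):

```python
import numpy as np

def l2_loss(w):
    # Mirrors tf.nn.l2_loss: half the sum of squared entries.
    return np.sum(w ** 2) / 2.0

def regularize_loss(loss, weights, reg_lambda):
    # Add an L2 penalty for each weight tensor, as in the answer above.
    for w in weights:
        loss = loss + reg_lambda * l2_loss(w)
    return loss

base_loss = 1.0
weights = [np.array([[1.0, 2.0], [3.0, 4.0]]),  # sum of squares = 30
           np.array([2.0, 2.0])]                # sum of squares = 8
total = regularize_loss(base_loss, weights, reg_lambda=0.1)
# 1.0 + 0.1 * 30 / 2 + 0.1 * 8 / 2 = 2.9
```

Larger reg_lambda values penalize large weights more aggressively, which is the usual L2 trade-off between fit and weight magnitude.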

Note, the path to the variables can be found here.

Also, we want to add regularizer support to all layers with variables (like dnn, conv2d or fully_connected), so maybe next week's nightly build of Tensorflow will have something like dnn(.., regularize=tf.contrib.layers.l2_regularizer(lambda)). I'll update this answer when this happens.
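In that style, the regularize argument is just a function mapping a weight tensor to a scalar penalty. A pure-Python sketch of such a regularizer factory (hypothetical standalone version, mirroring what tf.contrib.layers.l2_regularizer returns):

```python
import numpy as np

def l2_regularizer(scale):
    # Returns a function that maps a weight array to
    # scale * sum(w ** 2) / 2, i.e. a scaled l2_loss.
    def regularizer(weights):
        return scale * np.sum(weights ** 2) / 2.0
    return regularizer

penalty_fn = l2_regularizer(scale=0.01)
penalty = penalty_fn(np.array([3.0, 4.0]))
# 0.01 * (9 + 16) / 2 = 0.125
```

A layer that accepts such a function can then add penalty_fn(weights) to the loss for each of its variables without the caller retrieving the variables by name.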
