Applying tf.nn.softmax() only to positive elements of a tensor


Problem description

I tried far too long to solve this problem and did not find anything useful on the Internet, so I have to ask:

Given a tensor T, let's say T = tf.random_normal([100]), I want to apply softmax() only to the positive elements of the tensor. Something like T = tf.nn.softmax(T[T>0]), which of course does not work in TensorFlow.

In short: I want the softmax computed over, and applied to, only the elements where T > 0.

How can I do that in TensorFlow?

Recommended answer

If you want the softmax computed over, and applied to, only the elements T > 0:

An idea could be to create two partitions based on your condition (T > 0), apply the operation (softmax) to the target partition, then stitch them back together.

Using tf.dynamic_partition and tf.dynamic_stitch:

import tensorflow as tf

T = tf.random_normal(shape=(2, 3, 4))

# Creating partition based on condition:
condition_mask = tf.cast(tf.greater(T, 0.), tf.int32)
partitioned_T = tf.dynamic_partition(T, condition_mask, 2)
# Applying the operation to the target partition:
partitioned_T[1] = tf.nn.softmax(partitioned_T[1])

# Stitching back together, flattening T and its indices to make things easier:
condition_indices = tf.dynamic_partition(tf.range(tf.size(T)), tf.reshape(condition_mask, [-1]), 2)
res_T = tf.dynamic_stitch(condition_indices, partitioned_T)
res_T = tf.reshape(res_T, tf.shape(T))

with tf.Session() as sess:
    t, res = sess.run([T, res_T])
    print(t)
    # [[[-1.92647386  0.7442674   1.86053932 -0.95315439]
    #  [-0.38296485  1.19349718 -1.27562618 -0.73016083]
    #  [-0.36333972 -0.90614134 -0.15798278 -0.38928652]]
    # 
    # [[-0.42384467  0.69428021  1.94177043 -0.13672788]
    #  [-0.53473723  0.94478583 -0.52320045  0.36250541]
    #  [ 0.59011376 -0.77091616 -0.12464728  1.49722672]]]
    print(res)
    # [[[-1.92647386  0.06771058  0.20675084 -0.95315439]
    #  [-0.38296485  0.10610957 -1.27562618 -0.73016083]
    #  [-0.36333972 -0.90614134 -0.15798278 -0.38928652]]
    # 
    # [[-0.42384467  0.06440912  0.22424641 -0.13672788]
    #  [-0.53473723  0.08274478 -0.52320045  0.04622314]
    #  [ 0.05803747 -0.77091616 -0.12464728  0.14376813]]]
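
Alternatively, the same masked softmax can be written without tf.dynamic_partition, using tf.where() and a manual normalization over only the positive entries (a minimal sketch under the same TF 1.x graph-mode assumptions as above; the names mask, pos_max, exp_T, and res_T_alt are introduced here for illustration):

mask = tf.greater(T, 0.)
# For numerical stability, subtract the max over the positive entries.
pos_max = tf.reduce_max(tf.where(mask, T, tf.fill(tf.shape(T), T.dtype.min)))
# exp() of the positive entries, zeros elsewhere.
exp_T = tf.where(mask, tf.exp(T - pos_max), tf.zeros_like(T))
# Normalize over the positive entries only; leave the rest of T unchanged.
res_T_alt = tf.where(mask, exp_T / tf.reduce_sum(exp_T), T)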



Previous answer

This answer is valid only if you want softmax to be computed over all elements of T but applied only to those greater than 0.

Using tf.where():

T = tf.where(tf.greater(T, 0.), tf.nn.softmax(T), T)
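
For example (a minimal usage sketch, mirroring the session setup above):

T = tf.random_normal([100])
res_T = tf.where(tf.greater(T, 0.), tf.nn.softmax(T), T)

with tf.Session() as sess:
    print(sess.run(res_T))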
