Is there any way to convert a PyTorch tensor to a TensorFlow tensor?


Question

https://github.com//blob/ec67cbdc411278dd29e8888e9fd6451695efc26c/context_fusion/self_attn.py#L29

I need to use multi_dimensional_attention from the above link, which is implemented in TensorFlow, but I am using PyTorch. Can I convert a PyTorch tensor to a TensorFlow tensor, or do I have to implement it in PyTorch?

The code I am trying to use is below. I have to pass 'rep_tensor' as a TensorFlow tensor, but I have a PyTorch tensor:

def multi_dimensional_attention(rep_tensor, rep_mask=None, scope=None,
                                keep_prob=1., is_train=None, wd=0., activation='elu',
                                tensor_dict=None, name=None):

    # bs, sl, vec = tf.shape(rep_tensor)[0], tf.shape(rep_tensor)[1], tf.shape(rep_tensor)[2]

    ivec = rep_tensor.shape[2]
    with tf.variable_scope(scope or 'multi_dimensional_attention'):
        map1 = bn_dense_layer(rep_tensor, ivec, True, 0., 'bn_dense_map1', activation,
                              False, wd, keep_prob, is_train)
        map2 = bn_dense_layer(map1, ivec, True, 0., 'bn_dense_map2', 'linear',
                              False, wd, keep_prob, is_train)
        # map2_masked = exp_mask_for_high_rank(map2, rep_mask)

        soft = tf.nn.softmax(map2, 1)  # bs,sl,vec
        attn_output = tf.reduce_sum(soft * rep_tensor, 1)  # bs, vec

        # save attn
        if tensor_dict is not None and name is not None:
            tensor_dict[name] = soft

        return attn_output

Answer

You can convert a PyTorch tensor to a NumPy array, convert that to a TensorFlow tensor, and vice versa:

import torch
import tensorflow as tf

pytorch_tensor = torch.zeros(10)
np_tensor = pytorch_tensor.numpy()
tf_tensor = tf.convert_to_tensor(np_tensor)
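
For completeness, here is a minimal sketch of the round trip in both directions, assuming TensorFlow 2.x eager execution and a hypothetical PyTorch tensor rep_tensor_pt shaped like the rep_tensor in the question. If the tensor lives on the GPU or requires gradients, it has to be detached and moved to the CPU first:

import torch
import tensorflow as tf

# hypothetical PyTorch activations: (batch_size, sequence_length, vector_size)
rep_tensor_pt = torch.randn(8, 20, 64, requires_grad=True)

# PyTorch -> NumPy -> TensorFlow
# .detach() drops the autograd graph, .cpu() moves a GPU tensor to host memory
rep_tensor_tf = tf.convert_to_tensor(rep_tensor_pt.detach().cpu().numpy())

# TensorFlow -> NumPy -> PyTorch (the "vice versa" direction)
tf_result = tf.zeros((8, 64))
torch_result = torch.from_numpy(tf_result.numpy())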

That being said, if you want to train a model that uses a combination of PyTorch and TensorFlow, that's going to be awkward, slow, and buggy, and take a long time to write, to say the least, since the libraries have to figure out how to backpropagate the cost across the boundary between them.

So unless the PyTorch attention block you have is pre-trained, I'd recommend just sticking to one library or the other. There are plenty of examples for implementing anything you want in either, and plenty of pretrained models for both. TensorFlow is usually a bit faster, but the speed differences aren't that significant, and the kind of "hack" I presented above will likely make the whole thing slower than using either library on its own.
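As a rough illustration of why that boundary is painful (a sketch, not from the original answer): PyTorch's autograd cannot see anything that happens on the TensorFlow side, so the conversion cuts the computation graph.

import torch
import tensorflow as tf

x = torch.ones(3, requires_grad=True)

# numpy() cannot even be called on a tensor that requires grad,
# so the graph has to be cut explicitly with detach()
x_tf = tf.convert_to_tensor(x.detach().numpy())

# any TensorFlow ops on x_tf are invisible to PyTorch's autograd;
# gradients of a TensorFlow loss will never reach x.grad
y_tf = tf.reduce_sum(x_tf * 2.0)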
