Optimizing TensorFlow for CPU use


Question

I have a model that needs to be optimized for CPU.

Currently the model takes 1024 x 1024 bytes of data:

images = img[y:y+1024,x:x+1024,:]

As per this document, they want to change the default TensorFlow data format from NHCW to NCHW format.

How can I transform from NHCW to NCHW format?

https://software.intel.com/en-us/articles/tensorflow-optimizations-on-modern-intel-architecture

Answer

As per this document, they want to change the default TensorFlow data format from NHCW to NCHW format.

Actually, I've never seen any TensorFlow function that supports an NHCW format. For example, tf.nn.conv2d and tf.nn.conv2d_transpose support NHWC (the current default) and NCHW. tf.nn.max_pool supports NHWC, NCHW and NCHW_VECT_C (the last one is the most performant tensor format for cuDNN 6's quantized convolution, similar to NCHW).
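For illustration, here is a minimal TF 1.x-style sketch of passing data_format="NCHW" to these ops; the placeholder shape, filter shape, strides and pooling window are made up for the example. Note that with NCHW the strides and ksize lists are also given in N, C, H, W order:

import tensorflow as tf

# NCHW input: batch, channels, height, width
x = tf.placeholder(tf.float32, shape=[None, 3, 1024, 1024])
# Filter: [filter_height, filter_width, in_channels, out_channels]
w = tf.get_variable("w", shape=[3, 3, 3, 16])
# Both ops accept data_format="NCHW"; strides/ksize follow the same axis order
conv = tf.nn.conv2d(x, w, strides=[1, 1, 1, 1], padding="SAME", data_format="NCHW")
pool = tf.nn.max_pool(conv, ksize=[1, 1, 2, 2], strides=[1, 1, 2, 2], padding="SAME", data_format="NCHW")

Depending on the TensorFlow build, the stock CPU kernels may only accept NHWC for some of these ops; the MKL-optimized build referenced in the Intel article is what makes NCHW the recommended layout on CPU.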

How can I transform from NHCW to NCHW format?

But this transformation is possible, e.g. via tf.transpose, which works with higher-dimensional tensors as well:

import tensorflow as tf

# NHCW
original = tf.placeholder(dtype=tf.float32, shape=[None, 1024, 3, 1024])
# NCHW: swap axes 1 and 2
transformed = tf.transpose(original, perm=[0, 2, 1, 3])

You can also do this in numpy via np.swapaxes(array, 1, 2).
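As a quick sketch with a dummy array (the shapes here are just for illustration), swapping axes 1 and 2 turns an NHCW-shaped array into NCHW:

import numpy as np

batch = np.zeros((2, 1024, 3, 1024), dtype=np.float32)  # dummy NHCW batch
nchw = np.swapaxes(batch, 1, 2)                         # returns a view, no copy
print(nchw.shape)                                       # (2, 3, 1024, 1024)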
