create_training_graph() fails when converting MobileFacenet to a quantize-aware model with TF-Lite
Problem description
I am trying to quantize MobileFacenet (code from sirius-ai) following the suggestion, and I think I have hit the same issue as this one.
When I add tf.contrib.quantize.create_training_graph() to the training graph (in train_nets.py, line 187, before train_op = train(...), or inside train() in utils/common.py, line 38, before the gradients), it does not add quantize-aware ops to the graph to collect the dynamic-range max/min.
I assumed I would see some additional nodes in TensorBoard, but I did not, so I believe the quantize-aware ops were not successfully added to the training graph. I also tried tracing through TensorFlow and found that _FindLayersToQuantize() matched nothing.
However, when I add tf.contrib.quantize.create_eval_graph() to refine the training graph, I can see some quantize-aware ops such as act_quant.... Since the ops were not added to the training graph successfully, I have no weights to load into the eval graph, and I get error messages such as:
Key MobileFaceNet/Logits/LinearConv1x1/act_quant/max not found in checkpoint
or
tensorflow.python.framework.errors_impl.FailedPreconditionError: Attempting to use uninitialized value MobileFaceNet/Logits/LinearConv1x1/act_quant/max
Does anyone know how to fix this error? Or how to get a quantized MobileFacenet with good accuracy?
Thanks!
Answer
Hi,
Unfortunately, the contrib/quantize tool is now deprecated. It won't be able to support newer models, and we are not working on it anymore.
If you are interested in QAT, I would recommend trying the new TF/Keras QAT API. We are actively developing that and providing support for it.