Check perplexity of a Language Model
Question
I created a language model with a Keras LSTM, and now I want to assess whether it's good, so I want to calculate perplexity.
What is the best way to calculate the perplexity of a model in Python?
Answer
I've come up with two versions and attached their corresponding sources; please feel free to check the links out.
from keras import backend as K

def perplexity_raw(y_true, y_pred):
    """
    The perplexity metric. Why isn't this part of Keras yet?!
    https://stackoverflow.com/questions/41881308/how-to-calculate-perplexity-of-rnn-in-tensorflow
    https://github.com/keras-team/keras/issues/8267
    """
    # Per-token cross-entropy; y_true holds integer token ids.
    cross_entropy = K.sparse_categorical_crossentropy(y_true, y_pred)
    # Exponentiate per token and let Keras average the results.
    # Note: averaging per-token perplexities overestimates the true
    # corpus perplexity, which is exp(mean cross-entropy).
    perplexity = K.exp(cross_entropy)
    return perplexity
def perplexity(y_true, y_pred):
    """
    The perplexity metric. Why isn't this part of Keras yet?!
    https://stackoverflow.com/questions/41881308/how-to-calculate-perplexity-of-rnn-in-tensorflow
    https://github.com/keras-team/keras/issues/8267
    """
    cross_entropy = K.sparse_categorical_crossentropy(y_true, y_pred)
    # Standard definition: the exponential of the mean cross-entropy.
    perplexity = K.exp(K.mean(cross_entropy))
    return perplexity
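As a sanity check outside Keras, the same quantity can be computed directly with NumPy. This is a minimal sketch with made-up toy probabilities (the values of `y_true` and `y_pred` are assumptions, not from the model above), assuming integer token ids and row-normalized predicted distributions:

```python
import numpy as np

# Toy data: vocabulary of 4 tokens, 3 predicted positions.
# y_true holds the true token ids; y_pred the predicted distributions.
y_true = np.array([0, 2, 1])
y_pred = np.array([
    [0.7, 0.1, 0.1, 0.1],
    [0.1, 0.1, 0.6, 0.2],
    [0.2, 0.5, 0.2, 0.1],
])

# Cross-entropy of each true token under its predicted distribution.
cross_entropy = -np.log(y_pred[np.arange(len(y_true)), y_true])

# Perplexity is the exponential of the mean cross-entropy.
perplexity = np.exp(cross_entropy.mean())
print(round(perplexity, 3))  # → 1.682
```

A perplexity of 1 would mean the model assigns probability 1 to every true token; higher values mean the model is, on average, "as confused" as a uniform choice among that many tokens.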