Training UBM with SIDEKIT from custom data

Problem Description

I am trying to train a GMM-UBM model with SIDEKIT from data that I have already extracted for emotion recognition (pretty much the same as speaker recognition; I also don't understand the HDF5 feature file system). My data is an ndarray with shape (1101, 78), where 78 is the number of acoustic features and 1101 the number of feature vectors (frames).

ubm = sidekit.Mixture()

# `anger` holds the (1101, 78) feature array; `distribNb` is the
# number of mixture components (512, judging by the error below)
llks = ubm.EM_uniform(anger, distribNb,
                      iteration_min=3, iteration_max=10,
                      llk_gain=0.01, do_init=True)

The error raised is:

line 394, in _compute_all
    self.A = (numpy.square(self.mu) * self.invcov).sum(1) - 2.0 * (numpy.log(self.w) + numpy.log(self.cst))

ValueError: operands could not be broadcast together with shapes (512,78) (512,0)

This means that the covariance matrix is of shape (512, 0). Is that wrong? Should it be (512, 78)? I may be wrong. Please give me a hint.
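
To make the mismatch visible, the shapes can be inspected right after the failed call. A small diagnostic sketch, assuming the Mixture attributes named in the traceback (mu, invcov) are already populated when _compute_all raises; the random array and the distrib_nb value are stand-ins for the question's setup:

import sidekit
import numpy as np

anger = np.random.rand(1101, 78)  # stand-in for the extracted feature array
ubm = sidekit.Mixture()
try:
    ubm.EM_uniform(anger, distrib_nb=512,
                   iteration_min=3, iteration_max=10,
                   llk_gain=0.01, do_init=True)
except ValueError as e:
    print(e)                               # the broadcast error from _compute_all
    print(ubm.mu.shape, ubm.invcov.shape)  # mu should be (512, 78); invcov comes out (512, 0)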

Answer

You might have figured it out already, but I thought I might as well post a possible solution to this.

The following code creates random data with dimensions (2, 100) and tries to train a 128-mixture GMM using the EM_uniform algorithm:

import sidekit
import numpy as np
import random as rn

gmm = sidekit.Mixture()
data = np.array([[rn.random() for i in range(100)],[rn.random() for i in range(100)]])
gmm.EM_uniform(data,
               distrib_nb=128,
               iteration_min=3,
               iteration_max=10,
               llk_gain=0.01,
               do_init=True)

However, this results in the same error as you have reported: ValueError: operands could not be broadcast together with shapes (128,100) (128,0)

I suspect there is some bug in how gmm.invcov is calculated in Sidekit.Mixture._init_uniform(), so I have figured out a manual initialization of the mixture with code from Sidekit.Mixture._init() (the initialization function for the EM_split() algorithm).

The following code ran without errors on my computer:

import sidekit
import numpy as np
import random as rn
import copy

gmm = sidekit.Mixture()
data = np.array([[rn.random() for i in range(100)],[rn.random() for i in range(100)]])

# Initialize the Mixture with code from Sidekit.Mixture._init()
mu = data.mean(0)
cov = (data**2).mean(0)
gmm.mu = mu[None]
gmm.invcov = 1./cov[None]
gmm.w = np.asarray([1.0])
gmm.cst = np.zeros(gmm.w.shape)
gmm.det = np.zeros(gmm.w.shape)
gmm.cov_var_ctl = 1.0 / copy.deepcopy(gmm.invcov)
gmm._compute_all()

# Now run EM without initialization
gmm.EM_uniform(data,
               distrib_nb=128,
               iteration_min=3,
               iteration_max=10,
               llk_gain=0.01,
               do_init=False)

This gave the following output: [-31.419146414931213, 54.759037708692404, 54.759037708692404, 54.759037708692404], which is the log-likelihood value after each iteration (convergence after 4 iterations). Do note that this example data is way too small to train a GMM on.
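
The same recipe should carry over to the data from the question. A sketch, assuming a random (1101, 78) array in place of the extracted features and the 512 components implied by the error message, not verified on real data:

import sidekit
import numpy as np
import copy

features = np.random.rand(1101, 78)  # stand-in for the extracted (1101, 78) features

gmm = sidekit.Mixture()

# Manual initialization, exactly as above, just on the new data
mu = features.mean(0)
cov = (features**2).mean(0)
gmm.mu = mu[None]
gmm.invcov = 1./cov[None]
gmm.w = np.asarray([1.0])
gmm.cst = np.zeros(gmm.w.shape)
gmm.det = np.zeros(gmm.w.shape)
gmm.cov_var_ctl = 1.0 / copy.deepcopy(gmm.invcov)
gmm._compute_all()

# EM without re-initialization; returns the per-iteration log-likelihoods
llks = gmm.EM_uniform(features,
                      distrib_nb=512,
                      iteration_min=3,
                      iteration_max=10,
                      llk_gain=0.01,
                      do_init=False)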

I cannot guarantee that this won't lead to any errors later on; leave a comment if that is the case!

As for HDF5 files, check out the h5py documentation for tutorials. Also, HDFView allows you to look into the contents of h5 files, which is pretty convenient for debugging later on when you get to scoring.
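
As a starting point, here is a minimal h5py sketch for storing and loading a feature matrix; the file and dataset names are illustrative choices, not SIDEKIT's own HDF5 layout:

import h5py
import numpy as np

features = np.random.rand(1101, 78)  # stand-in for extracted acoustic features

# Write the array to an HDF5 file under a named dataset
with h5py.File('features.h5', 'w') as f:
    f.create_dataset('anger', data=features)

# Read it back
with h5py.File('features.h5', 'r') as f:
    restored = f['anger'][()]

print(restored.shape)  # (1101, 78)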
