Merge several boost serialized OpenCV Mats
Question
Follow-up to the question here about serializing OpenCV Mat_.
My task is this: I have multiple OpenCV Mats that are serialized, and now I want to merge all of them. I could do this by deserializing the binaries into Mats and merging them with the push_back method. However, for my own reasons, I have to merge them in binary form first, before deserializing.
How can I merge these binaries so that, in the end, I can call my same deserialization code to get the whole big Mat?
Thanks
Answer
You can do this without using boost. Following the serialization approach described here, you can append the data of each matrix at the end of the file, taking care to increase the row count of the final matrix accordingly.
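The key observation is that the row count lives at a fixed offset (byte 0) of the header, so appending means writing the new rows' bytes at the end of the file and patching that one integer in place. Here is a minimal sketch of that pattern using only `fstream` and plain `int` payloads, with no OpenCV involved (the `append_ints`/`read_ints` names and the `[count][payload]` layout are just for illustration; the real format below has a four-int header):

```cpp
#include <cassert>
#include <cstdio>
#include <fstream>
#include <string>
#include <vector>

// Append `vals` to a binary file whose layout is: [int count][count ints].
// If the file is empty or missing, the header is created; otherwise the
// payload is appended and the count at offset 0 is patched in place.
void append_ints(const std::string& filename, const std::vector<int>& vals)
{
    std::fstream fs(filename, std::fstream::binary | std::fstream::in);
    if (!fs || fs.peek() == std::fstream::traits_type::eof())
    {
        fs.close();
        std::ofstream out(filename, std::ofstream::binary);
        int count = static_cast<int>(vals.size());
        out.write((char*)&count, sizeof(int));
        out.write((char*)vals.data(), count * sizeof(int));
        return;
    }
    fs.close();
    fs.open(filename, std::fstream::binary | std::fstream::in | std::fstream::out);

    int count;
    fs.read((char*)&count, sizeof(int));
    count += static_cast<int>(vals.size());

    fs.seekp(0, std::fstream::beg);          // back to the header
    fs.write((char*)&count, sizeof(int));    // patch the count
    fs.seekp(0, std::fstream::end);          // jump to the end
    fs.write((char*)vals.data(), vals.size() * sizeof(int));
}

std::vector<int> read_ints(const std::string& filename)
{
    std::ifstream fs(filename, std::ifstream::binary);
    int count;
    fs.read((char*)&count, sizeof(int));
    std::vector<int> vals(count);
    fs.read((char*)vals.data(), count * sizeof(int));
    return vals;
}
```

The full OpenCV version below follows exactly this structure, with cols/type/channels added to the header and the raw pixel bytes as the payload.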
Here's a working example, where matappend does the job. For completeness, I'll also include the matread and matwrite functions:
#include <opencv2/opencv.hpp>
#include <iostream>
#include <fstream>

using namespace std;
using namespace cv;
void matwrite(const string& filename, const Mat& mat)
{
    ofstream fs(filename, fstream::binary);

    // Header
    int type = mat.type();
    int channels = mat.channels();
    fs.write((char*)&mat.rows, sizeof(int));    // rows
    fs.write((char*)&mat.cols, sizeof(int));    // cols
    fs.write((char*)&type, sizeof(int));        // type
    fs.write((char*)&channels, sizeof(int));    // channels

    // Data
    if (mat.isContinuous())
    {
        fs.write(mat.ptr<char>(0), (mat.dataend - mat.datastart));
    }
    else
    {
        int rowsz = CV_ELEM_SIZE(type) * mat.cols;
        for (int r = 0; r < mat.rows; ++r)
        {
            fs.write(mat.ptr<char>(r), rowsz);
        }
    }
}
Mat matread(const string& filename)
{
    ifstream fs(filename, fstream::binary);

    // Header
    int rows, cols, type, channels;
    fs.read((char*)&rows, sizeof(int));         // rows
    fs.read((char*)&cols, sizeof(int));         // cols
    fs.read((char*)&type, sizeof(int));         // type
    fs.read((char*)&channels, sizeof(int));     // channels

    // Data
    Mat mat(rows, cols, type);
    fs.read((char*)mat.data, CV_ELEM_SIZE(type) * rows * cols);

    return mat;
}
void matappend(const string& filename, const Mat& mat)
{
    fstream fs(filename, fstream::binary | fstream::in);

    // https://stackoverflow.com/a/2390938/5008845
    if (fs.peek() == fstream::traits_type::eof())
    {
        // The file is empty or missing: write from scratch (same as matwrite)
        fs.close();
        fs.open(filename, fstream::binary | fstream::out);

        // Header
        int type = mat.type();
        int channels = mat.channels();
        fs.write((char*)&mat.rows, sizeof(int));    // rows
        fs.write((char*)&mat.cols, sizeof(int));    // cols
        fs.write((char*)&type, sizeof(int));        // type
        fs.write((char*)&channels, sizeof(int));    // channels
    }
    else
    {
        // The file is not empty: append
        fs.close();
        fs.open(filename, fstream::binary | fstream::out | fstream::in);

        // Read header
        int rows, cols, type, channels;
        fs.read((char*)&rows, sizeof(int));         // rows
        fs.read((char*)&cols, sizeof(int));         // cols
        fs.read((char*)&type, sizeof(int));         // type
        fs.read((char*)&channels, sizeof(int));     // channels

        // Consistency check: only matrices with the same layout can be appended
        CV_Assert((cols == mat.cols) && (type == mat.type()) && (channels == mat.channels()));

        // Go back to the beginning of the file and overwrite the number of rows
        fs.seekp(0, fstream::beg);
        rows += mat.rows;
        fs.write((char*)&rows, sizeof(int));        // rows

        // Go to the end of the file
        fs.seekp(0, fstream::end);
    }

    // Write data
    if (mat.isContinuous())
    {
        fs.write(mat.ptr<char>(0), (mat.dataend - mat.datastart));
    }
    else
    {
        int rowsz = CV_ELEM_SIZE(mat.type()) * mat.cols;
        for (int r = 0; r < mat.rows; ++r)
        {
            fs.write(mat.ptr<char>(r), rowsz);
        }
    }

    fs.close();
}
int main()
{
    // Create two small matrices and append both to the same file
    Mat1b m1 = (Mat1b(2, 2) << 1, 2, 3, 4);
    Mat1b m2 = (Mat1b(3, 2) << 5, 6, 7, 8, 9, 10);

    matappend("raw.bin", m1);
    matappend("raw.bin", m2);

    Mat m3 = matread("raw.bin");
    // m3:
    // 1 2
    // 3 4
    // 5 6
    // 7 8
    // 9 10

    return 0;
}
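Since the question specifically asks about merging files that are *already* serialized, note that the same header-patching trick also works file-to-file, with no OpenCV and no deserialization at all: read both headers, check that cols/type/channels match, append the second file's payload bytes to the first, and patch the row count. A sketch under that assumption (the `merge_files` name is mine; it operates on any pair of files written by `matwrite`/`matappend` above):

```cpp
#include <cassert>
#include <cstdio>
#include <fstream>
#include <string>

// Merge the serialized matrix in `src` into `dst` at the byte level.
// Both files must use the matwrite layout:
//   [int rows][int cols][int type][int channels][raw pixel bytes]
// Returns false if a file can't be opened or the headers are incompatible.
bool merge_files(const std::string& dst, const std::string& src)
{
    std::fstream d(dst, std::fstream::binary | std::fstream::in | std::fstream::out);
    std::ifstream s(src, std::ifstream::binary);
    if (!d || !s) return false;

    int dh[4], sh[4];                       // rows, cols, type, channels
    d.read((char*)dh, sizeof(dh));
    s.read((char*)sh, sizeof(sh));
    if (dh[1] != sh[1] || dh[2] != sh[2] || dh[3] != sh[3])
        return false;                       // cols/type/channels must match

    // Copy src's payload (everything after its 16-byte header) to dst's end
    d.seekp(0, std::fstream::end);
    d << s.rdbuf();

    // Patch the row count in dst's header
    dh[0] += sh[0];
    d.seekp(0, std::fstream::beg);
    d.write((char*)dh, sizeof(int));
    return true;
}
```

After merging this way, a single call to matread on `dst` yields the whole big Mat, which is exactly what the question asks for.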