Serializing data with Avro in Node.js
Problem description
I would like to serialize data from a JSON object and send it over the network, with Kafka as the endpoint. I have an Avro schema in a file that determines the fields required by the logging system for messages sent to Kafka:
{
  "namespace": "com.company.wr.messages",
  "type": "record",
  "name": "Log",
  "fields": [
    {"name": "timestamp", "type": "long"},
    {"name": "source", "type": "string"},
    {"name": "version", "type": "string"},
    {"name": "ipAddress", "type": "string"},
    {"name": "name", "type": "string"},
    {"name": "level", "type": "string"},
    {"name": "errorCode", "type": "string"},
    {"name": "message", "type": "string"}
  ]
}
I am using the node package 'avro-schema'; I have tried others, but none of them work well. I just need to serialize data the Avro way from Node.js.
Recommended answer
Use avsc:
var avro = require('avsc');

// Parse the schema.
var logType = avro.parse({
  "namespace": "com.company.wr.messages",
  "type": "record",
  "name": "Log",
  "fields": [
    {"name": "timestamp", "type": "long"},
    {"name": "source", "type": "string"},
    {"name": "version", "type": "string"},
    {"name": "ipAddress", "type": "string"},
    {"name": "name", "type": "string"},
    {"name": "level", "type": "string"},
    {"name": "errorCode", "type": "string"},
    {"name": "message", "type": "string"}
  ]
});

// A sample log record.
var obj = {
  timestamp: 2313213,
  source: 'src',
  version: '1.0',
  ipAddress: '0.0.0.0',
  name: 'foo',
  level: 'INFO',
  errorCode: '',
  message: ''
};

// And its corresponding Avro encoding.
var buf = logType.toBuffer(obj);
You can find details on the various encoding methods available in the avsc documentation.