Message encryption in Kafka streaming

Problem Description

I have recently been trying to play with Kafka streaming for some sensitive data processing. The goal I wish to achieve is that, while the sensitive data is encrypted, the power of the microservice architecture is not jeopardised, i.e., loosely coupled services and stream data processing.

My question is: in Kafka streaming, is it possible to decrypt an incoming message using one key and encrypt it again with another key? I have a rough plan, but as I am not familiar with Kafka streaming, I cannot prove that it is capable of handling this using the Streams DSL. Can anyone help me with this question and, preferably, tell me which function in the Streams DSL can handle that?

Update: Basically, what I am asking is this: I am trying to use public-key encryption twice for a single message in the streaming pipeline, once on the inbound topic and once on the outbound topic. I am just not sure whether the DSL is able to decrypt and encrypt, and where the keys should be stored.

Thanks!

Recommended Answer

If you're simply trying to prevent others from inspecting your data, Kafka provides SSL connections for encryption between clients and brokers, although the data at rest will still be unencrypted. You can additionally add SASL authentication, along with ACLs for authorization, to limit who can access the cluster. Limiting SSH access to the brokers' files would also help.
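For illustration, here is a minimal sketch of the client-side SSL settings involved; the broker address, file paths, and passwords below are placeholders, not real values:

```java
import java.util.Properties;

class SecureClientConfig {
    // Minimal client-side SSL configuration sketch. This encrypts traffic
    // between the client and the broker; it does not encrypt data at rest.
    static Properties sslProperties() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "broker1.example.com:9093"); // SSL listener (assumed port)
        props.put("security.protocol", "SSL");
        props.put("ssl.truststore.location", "/etc/kafka/client.truststore.jks");
        props.put("ssl.truststore.password", "changeit");
        // Only needed when the broker enforces ssl.client.auth=required:
        props.put("ssl.keystore.location", "/etc/kafka/client.keystore.jks");
        props.put("ssl.keystore.password", "changeit");
        return props;
    }
}
```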

What you are asking for requires a custom Serializer and Deserializer combination; this same abstraction is used by all the Kafka APIs.
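As a rough sketch of such a combination, the pair below implements Kafka's Serializer and Deserializer interfaces around a hypothetical CryptoCodec helper; the codec is an assumption standing in for whatever public-key cryptography you actually use, not part of any Kafka or standard library:

```java
import java.nio.charset.StandardCharsets;
import org.apache.kafka.common.serialization.Deserializer;
import org.apache.kafka.common.serialization.Serializer;

// Hypothetical helper, not a Kafka class: wraps the actual public-key
// encrypt/decrypt operations (e.g. implemented with javax.crypto).
interface CryptoCodec {
    byte[] encrypt(byte[] plaintext);
    byte[] decrypt(byte[] ciphertext);
}

class EncryptingSerializer implements Serializer<String> {
    private final CryptoCodec codec;
    EncryptingSerializer(CryptoCodec codec) { this.codec = codec; }

    @Override
    public byte[] serialize(String topic, String data) {
        // Encrypt before the bytes are written to the topic.
        return data == null ? null : codec.encrypt(data.getBytes(StandardCharsets.UTF_8));
    }
}

class DecryptingDeserializer implements Deserializer<String> {
    private final CryptoCodec codec;
    DecryptingDeserializer(CryptoCodec codec) { this.codec = codec; }

    @Override
    public String deserialize(String topic, byte[] data) {
        // Decrypt as the record is read from the topic.
        return data == null ? null : new String(codec.decrypt(data), StandardCharsets.UTF_8);
    }
}
```

Because the cryptography lives entirely inside the (de)serializers, the rest of the application only ever sees plaintext values.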

When using the Kafka Streams API, you would wrap these in a Serde class and provide that in your streams properties before you start the application, or pass it to individual DSL methods via Produced.with or Consumed.with.
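Putting the pieces together, here is a sketch of a topology that decrypts inbound records with one key and re-encrypts outbound records with another, using the hypothetical classes above; the topic names are made up:

```java
import org.apache.kafka.common.serialization.Serde;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.Topology;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.Produced;

class EncryptedPipeline {
    // Builds a topology that decrypts with one key pair and re-encrypts
    // with another; the two CryptoCodec instances hold the two keys.
    static Topology build(CryptoCodec inboundCodec, CryptoCodec outboundCodec) {
        Serde<String> inboundSerde = Serdes.serdeFrom(
                new EncryptingSerializer(inboundCodec),
                new DecryptingDeserializer(inboundCodec));  // decrypts inbound records
        Serde<String> outboundSerde = Serdes.serdeFrom(
                new EncryptingSerializer(outboundCodec),    // encrypts outbound records
                new DecryptingDeserializer(outboundCodec));

        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, String> stream = builder.stream(
                "sensitive-input", Consumed.with(Serdes.String(), inboundSerde));

        stream
            // Inside the topology the values are plaintext, so any DSL
            // operation (map, filter, join, ...) works as usual.
            .mapValues(value -> value.trim())
            .to("sensitive-output", Produced.with(Serdes.String(), outboundSerde));

        return builder.build();
    }
}
```

Note that no special DSL function performs the decryption or encryption; it falls out of the serde configuration at the Consumed.with and Produced.with boundaries, while everything in between operates on plaintext.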
