Best practices for daily MySQL (partial and filtered) replication?


Problem description

I have a reasonably large database with more than 40 tables. I only need to replicate a few of them (+/- 5), and each of those tables also needs to be filtered.

I'm looking for best practices for replicating this data (daily is enough), where I can select just a few tables and include a WHERE clause for each one.

I'm thinking of running mysqldump for each table (with a WHERE clause), writing each table to a separate .sql file. On the destination database I can then truncate all the tables (all data is overwritten daily) and run mysql to import each file separately.

Example:

   # dump each table 
   mysqldump -u repl_user my_database my_table -w 'id between 1000 and 1005' > my_table.sql
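
Sticking with that approach, a minimal sketch of the whole nightly job might look like the following. The host names, credentials, database, table, and WHERE clause are placeholder assumptions, and --no-create-info is used so the dump contains only INSERTs, with the TRUNCATE clearing yesterday's rows before the import:

    #!/bin/sh
    # Nightly partial copy: dump a few filtered tables from the source,
    # then truncate and re-import them on the destination.
    # Hosts, user, database, table, and filter below are example values;
    # passwords are omitted (use -p or a credentials file).
    SRC_HOST=source-db.example.com
    DST_HOST=dest-db.example.com
    DB=my_database

    # dump only the rows we want; --no-create-info keeps the dump to INSERTs
    mysqldump -h "$SRC_HOST" -u repl_user \
        --where='id between 1000 and 1005' --no-create-info \
        "$DB" my_table > my_table.sql

    # overwrite yesterday's data on the destination, then import
    mysql -h "$DST_HOST" -u repl_user "$DB" -e 'TRUNCATE TABLE my_table'
    mysql -h "$DST_HOST" -u repl_user "$DB" < my_table.sql

Repeating the dump/truncate/import pair per table (with its own WHERE clause) covers all five tables.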

I'm aware that I could replicate the full database and switch the unwanted tables to the BLACKHOLE storage engine on the replica. But since 35 of the tables are not needed, that seems like overkill. Besides, some tables only need a filtered version, and I can't solve that with BLACKHOLE.

Is there a better solution?

Answer

MySQL natively supports replication filters, but only at the database or table level. This doesn't meet your requirement to filter a subset of rows from these tables.
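
For reference, those built-in filters are configured on the replica roughly as below; they can only name whole databases or tables, which is why they cannot express the row-level WHERE conditions you need (the table names here are just examples):

    # my.cnf on the replica -- filters select whole tables/databases only;
    # there is no built-in option to replicate just a subset of rows
    [mysqld]
    replicate-do-table      = my_database.my_table
    replicate-wild-do-table = my_database.order%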

FlexViews is a tool that reads the binary log and replays only the changes relevant to keeping a materialized view up to date. You could define your materialized views in a way that implements your table filtering.

