Importing .sql file into postgresql or Mysql from Local Machine


Problem Description

I'm trying to import a 15 GB .sql file into a PostgreSQL or MySQL database. What is the fastest way or procedure to import such a big dataset in a short time?

Any suggestions will be greatly appreciated.

Solution

To start with, there's really no such thing as a ".sql file". It's like saying a ".dat file": it could be practically anything. A list of INSERTs. A script to create tables. A query that extracts information from an existing database. Etc.

The file might contain table and index definitions (DDL) and other content, or it might just be a list of INSERT statements. It could be written to use custom vendor extensions like PostgreSQL's COPY command for fast data loading, too.

You need to look at the file and see what it is. Determine whether you need to create tables to hold the data first. See if you need to change any DDL to be compatible with the target database, since, unfortunately, database vendors don't follow the standard names for SQL data types all that consistently, and they add vendor extensions for things like key generation.
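
One quick way to get a feel for the file is to sample its head and count the statement types it contains from the shell. A minimal sketch, assuming the dump is named dump.sql (a placeholder) and statements start at the beginning of a line:

    # Peek at the start of the dump: DDL, INSERTs, or COPY blocks?
    head -n 50 dump.sql

    # Rough counts of the statement types present
    grep -c '^CREATE TABLE' dump.sql
    grep -c '^INSERT INTO'  dump.sql
    grep -c '^COPY '        dump.sql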

If it's plain INSERTs into a single table and the inserts don't depend on each other, the fastest way to load it into PostgreSQL is to split it into several chunks and run each chunk with psql -1 -v ON_ERROR_ROLLBACK=1 -f chunk.sql.
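
A minimal sketch of that approach, assuming each INSERT sits on its own line and the chunks are safe to load concurrently (dump.sql and mydb are placeholders):

    # Split into one-million-line chunks (chunk_aa, chunk_ab, ...);
    # this is only safe if every line is a complete statement
    split -l 1000000 dump.sql chunk_

    # Load the chunks concurrently, one psql session per chunk
    for f in chunk_*; do
        psql -1 -v ON_ERROR_ROLLBACK=1 -f "$f" mydb &
    done
    wait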

Otherwise you'd just have to run psql -1 -v ON_ERROR_ROLLBACK=1 -f thefile.sql.

The fastest way to load data into PostgreSQL is to use pg_bulkload, but that's quite disruptive and I don't think it'll take pre-formatted SQL input. The next-best option is the COPY command, but that works with CSV/TSV data, not with SQL-formatted data written as INSERTs.
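
If you can obtain the data as CSV rather than as INSERTs, a COPY-style load can be driven from psql with \copy, which reads the file on the client side. A sketch only; mytable, mydb, and the file path are hypothetical:

    # \copy streams the local file through the client, so it needs no
    # server-side file access or superuser rights, unlike server-side COPY
    psql mydb -c "\copy mytable FROM '/path/to/data.csv' WITH (FORMAT csv, HEADER true)"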
