mysql load data local infile syntax issues with set fields
I'm trying to use mysql's LOAD DATA LOCAL INFILE syntax to load a .csv file into an existing table. Here is one record from my .csv file (with headers):
PROD, PLANT,PORD, REVN,A_CPN, A_CREV,BRDI, DTE, LTME
100100128144,12T1,2070000,04,3DB18194ACAA,05_01,ALA13320004,20130807,171442
The issue is that I want 3 extra things done during import:
- A RECORDID INT NOT NULL AUTO_INCREMENT PRIMARY KEY field should be incremented as each row gets inserted (this column and structure already exist within the mysql table)
- DTE and LTME should be concatenated and converted to a mysql DATETIME format and inserted into an existing mysql column named TRANS_OCR
- A CREATED TIMESTAMP field should be set to the current timestamp on row insertion (this column and structure also already exist within the mysql table)
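The DTE/LTME conversion the second bullet asks for is what MySQL's STR_TO_DATE does server-side; as a sanity check, the same concatenate-and-parse step can be sketched in Python (a hypothetical helper for illustration, not part of the import itself):

```python
from datetime import datetime

def combine_dte_ltme(dte: str, ltme: str) -> datetime:
    """Concatenate DTE (YYYYMMDD) and LTME (HHMMSS) and parse the result,
    mirroring STR_TO_DATE(CONCAT(@DTE, @LTME), '%Y%m%d%H%i%s') in MySQL."""
    return datetime.strptime(dte + ltme, "%Y%m%d%H%M%S")

# The sample record has DTE=20130807 and LTME=171442:
print(combine_dte_ltme("20130807", "171442"))  # 2013-08-07 17:14:42
```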
I'm trying to import this data into the mysql table with the following command:
LOAD DATA LOCAL INFILE 'myfile.csv' INTO TABLE seriallog
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(FLEX_PN, FLEX_PLANT, FLEX_ORDID, FLEX_REV, CUST_PN, CUST_REV, SERIALID)
SET CREATED = CURRENT_TIMESTAMP;
I think I have the CREATED column set properly, but the others are causing mysql warnings:
Warning: Out of range value for column 'FLEX_PN' at row 1
Warning: Row 1 was truncated; it contained more data than there were input columns
Can someone help me with the syntax? The LOAD DATA LOCAL INFILE statement is confusing to me...
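Both warnings follow from a column-count mismatch: the CSV rows carry 9 fields, but the column list in the statement names only 7, so the fields are mapped positionally and the trailing two are dropped (which is what "Row 1 was truncated" reports). A rough way to see the mismatch without touching MySQL:

```python
# Header line from the CSV and the column list from the LOAD DATA statement.
csv_header = "PROD, PLANT,PORD, REVN,A_CPN, A_CREV,BRDI, DTE, LTME"
csv_fields = [f.strip() for f in csv_header.split(",")]

column_list = ["FLEX_PN", "FLEX_PLANT", "FLEX_ORDID", "FLEX_REV",
               "CUST_PN", "CUST_REV", "SERIALID"]

# MySQL maps CSV fields onto the listed columns positionally; anything
# past the end of the column list is discarded.
print(len(csv_fields), "csv fields vs", len(column_list), "listed columns")
extra = csv_fields[len(column_list):]
print("unmapped trailing fields:", extra)  # ['DTE', 'LTME']
```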
Figured out the proper syntax to make this work:
sql = """LOAD DATA LOCAL INFILE %s INTO TABLE seriallog_dev
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"'
LINES TERMINATED BY '\\n'
IGNORE 1 LINES
(FLEX_PN, FLEX_PLANT, FLEX_ORDID, FLEX_REV, CUST_PN, CUST_REV, SERIALID, @DTE, @LTME)
SET RECORDID = NULL,
TRANS_OCR = STR_TO_DATE(CONCAT(@DTE,'',@LTME), "%%Y%%m%%d%%H%%i%%s"),
CREATED = CURRENT_TIMESTAMP;"""
params = (file,)
self.db.query(sql, params)
Mind you, this is done with python's mysqldb module.
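One detail worth noting in the working statement: because MySQLdb pushes the query text through Python %-style parameter substitution, literal % characters in the STR_TO_DATE format string have to be doubled, and the doubling collapses back to a single % before the SQL reaches the server. A minimal illustration of that escaping, using plain string formatting with repr() standing in for MySQLdb's own value quoting (no database needed):

```python
# MySQLdb substitutes %s parameters via Python %-formatting, so a literal
# '%' in the SQL text must be written as '%%'.
sql = """SELECT STR_TO_DATE(CONCAT(%s, %s), "%%Y%%m%%d%%H%%i%%s")"""

# Simulate the substitution step; repr() is a stand-in for the driver's
# real escaping, purely for illustration.
rendered = sql % (repr("20130807"), repr("171442"))
print(rendered)
```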
CAVEAT
The only issue with this solution is that for some reason my bulk insert only inserts the first 217 rows of data from my file. My total file size is 19KB so I can't imagine that it is too large for the mysql buffers... so what gives?
more info
Also, I just tried this syntax directly within the mysql-server CLI and it works for all 255 records. So, obviously it is some problem with python, the python mysqldb module, or the mysql connection that the mysqldb module makes...
DONE
I JUST figured out the problem: it had nothing to do with the LOAD DATA LOCAL INFILE command, but rather with the method I was using to convert my original .dbf file into the .csv before attempting the import. For some reason the mysql import method was running on the .csv before the .dbf-to-.csv conversion method had finished, resulting in a partial data set being found in the .csv file and imported... sorry to waste everyone's time!
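The root cause found here, the import racing ahead of the .dbf-to-.csv conversion, is avoidable by making sure the CSV is fully written and closed before the import is kicked off. A hedged sketch of that ordering (the writer function and file name are hypothetical, not from the original code):

```python
import csv
import os

def write_csv_complete(rows, path):
    """Write all rows and close the file before returning, so a
    follow-up LOAD DATA import never sees a partially written file."""
    with open(path, "w", newline="") as fh:  # context manager flushes + closes
        writer = csv.writer(fh)
        writer.writerows(rows)
    # Only after the 'with' block exits is the file guaranteed complete;
    # the import should be started strictly after this point.
    return os.path.getsize(path)

rows = [["PROD", "PLANT"], ["100100128144", "12T1"]]
size = write_csv_complete(rows, "myfile_example.csv")
print("csv fully written,", size, "bytes")
```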