BigQuery load CSV file 'successful' although the table does not update


Problem Description


I am trying to upload a CSV to a pre-existing table in Google BigQuery, but the table is not updating. It says the job loaded 'successfully' and I have no errors. Yet when I check the table, it does not contain the rows from the newly 'added' CSV.

I have the write preference set to 'Append to table', although it is not appending to the table.

Things that I have tried include:

- copying the table under a similar name so I could select 'Overwrite table' / 'Write if empty', even though I know this isn't what I want;
- increasing 'Number of errors allowed', even though there are no errors;
- enabling 'Allow quoted newlines', 'Allow jagged rows' and 'Ignore unknown values', together and separately;
- recreating the table with an auto-detected schema, then with a manually created schema, first with every value in the schema set to STRING, then with the data types the fields should have where applicable;
- changing the delimiter, only to find out that the values are separated by commas (who would have guessed!), although I noticed that when I changed the delimiter it actually would add the lines into the table;
- opening the CSV file in Numbers and exporting it once again as a CSV file.
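One detail worth checking with 'Allow quoted newlines' enabled is that a single logical CSV record can span several physical lines, so a naive line count of the file can disagree with the number of rows BigQuery actually parses. A minimal sketch of that difference, using Python's csv module on made-up data:

```python
import csv
import io

# Made-up CSV data: the second field of the first record contains a
# quoted newline, so the two data records occupy three physical lines.
data = 'id,comment\n1,"line one\nline two"\n2,ok\n'

physical_lines = data.count("\n")              # 4 physical lines in total
records = list(csv.reader(io.StringIO(data)))  # header + 2 logical records
logical_rows = len(records) - 1                # 2 data rows after the header
print(physical_lines, logical_rows)
```

If the two numbers disagree for your file, the rows that "went missing" may simply be folded into fewer records than expected.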

I am running out of ideas for different things I can try. If anyone knows what might help me, or if I have left any details out, please let me know.

I deeply apologize if this is more of a question for a different StackExchange site, or if this is just a stupid question.

Thanks all!

Edit 2018/03/19: Results from the load job:

Job Load Successful
Job ID  fire2018dgk:EU.bquijob_836dd47_1622f49b68e
Creation Time   Mar 16, 2018, 3:49:53 PM
Start Time  Mar 16, 2018, 3:49:55 PM
End Time    Mar 16, 2018, 3:49:57 PM
User    fire2018@fog.com
Destination Table   firefog:data-warehouse.dgk
Write Preference    Append to table
Source Format   CSV
Delimiter   ,
Skip Leading Rows   1
Allow Quoted Newlines   true
Allow Jagged Rows   true
Source URI  uploaded file
Schema  
id: STRING
timestamp: STRING
abandonned: STRING
campaign_type: STRING
campaign: STRING
call_type: STRING
holds: STRING
hold_time: STRING
call_time: STRING
talk_time: STRING
agents_email: STRING
priority: STRING
phone_number: STRING
date: STRING
day_of_week: STRING
time_of_call: STRING

Update 2018/03/19: The problem I am having is not limited to just this table; I also have it with another table in the database. The strange thing is that I am reusing a previous load job configuration that I know works, from a number of past uploads I have done with the same settings.

Solution

I have never used BigQuery and this is just my wild guess: the order of the columns in your CSV file doesn't match the order in your table. In MySQL there is a command, "describe table_name", with which you can see the order of the columns in your table.

The other thing is to make sure you don't have a foreign key pointing to a table that doesn't exist.

Long story short, the order of your CSV columns matters when uploading CSV files.
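That column-order guess can be checked mechanically: read the CSV header and compare it position by position against the field names you expect in the table. In BigQuery the table's field order could be read with `bq show --schema`; both sides are hard-coded made-up values here for illustration:

```python
import csv
import io

# Hypothetical CSV header and expected table column order.
csv_text = "timestamp,id,campaign\n2018-03-16,1,spring\n"
table_fields = ["id", "timestamp", "campaign"]

header = next(csv.reader(io.StringIO(csv_text)))
mismatches = [
    (i, c, t)
    for i, (c, t) in enumerate(zip(header, table_fields))
    if c != t
]
for i, csv_col, table_col in mismatches:
    print(f"position {i}: CSV has '{csv_col}', table expects '{table_col}'")
```

An empty mismatch list rules out column order as the culprit; any hits show exactly which positions disagree.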

Again, I just wasted three days on this kind of problem with MySQL, so I thought I'd share my input.

