BigQuery PHP API send json that isn't in Google Cloud Storage


Problem Description

I want to update an existing table I have in BigQuery with a single new row.

I found an example of how this could be done if I have that json in a file in Google Cloud Storage:

// Assumes $confLoad (the load configuration) and $job (the job resource) were
// created earlier, e.g. new \Google_JobConfigurationLoad and new \Google_Job.
$confLoad->setSourceUris(array(
    "gs://json_file_bucket/some_name.json"
));
$conf = new \Google_JobConfiguration;
$conf->setLoad($confLoad);
$job->setConfiguration($conf);
$service = new \Google_BigQueryService($this->client);
$running = $service->jobs->insert($this->projectId, $job);

But I don't understand how this can be achieved without an external json file.

Solution

If you're adding just a single row, your best option is to use the TableData.insertAll() method, which lets you import a single row or a few rows at a time. More information is here.

Here is the code in python (I realize you're using PHP, but it should be somewhat similar):

# 'bigquery' is an authorized BigQuery API client object (see the sketch below).
body = {"rows": [
    {"json": {"column_name": 7.7}}
]}
response = bigquery.tabledata().insertAll(
    projectId=PROJECT_ID,
    datasetId=DATASET_ID,
    tableId=TABLE_ID,
    body=body).execute()
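The snippet above assumes that bigquery is an already-authorized API client object and that PROJECT_ID, DATASET_ID and TABLE_ID are defined. As a rough sketch (an assumption, not part of the original answer), it could be set up with the google-api-python-client and oauth2client libraries like this:

import httplib2
from googleapiclient.discovery import build
from oauth2client.client import GoogleCredentials

# Authorize with Application Default Credentials; any other OAuth2 flow works too.
credentials = GoogleCredentials.get_application_default()
http = credentials.authorize(httplib2.Http())

# Build the BigQuery v2 service object used as 'bigquery' above.
bigquery = build('bigquery', 'v2', http=http)

# Placeholder identifiers; substitute your own project, dataset and table.
PROJECT_ID = 'my-project'
DATASET_ID = 'my_dataset'
TABLE_ID = 'my_table'

It is also worth checking the insertAll response for an insertErrors key, since streaming inserts report per-row failures there rather than raising an exception.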

The alternative is to use 'media upload'. This is as simple as passing a media_body parameter to the jobs.insert() call, where media_body contains the data you are loading (and you don't need to specify sourceUris). For example:

# load_config describes the load job (destination table, source format, ...).
job = {
    'configuration': {
        'load': load_config  # specify destination table here
    }
}
# 'jobs' is the jobs collection of the API client (e.g. bigquery.jobs());
# 'media_body' wraps the data being uploaded (see the sketch below).
result = jobs.insert(
    projectId=project_id,
    body=job,
    media_body=media_body).execute()
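The example above leaves load_config and media_body undefined. As a rough sketch (again an assumption, not the original answer's code), using MediaIoBaseUpload from googleapiclient and newline-delimited JSON data, they could look like this:

import io
import json
from googleapiclient.http import MediaIoBaseUpload

# Describe the load job: where the data goes and what format it is in.
load_config = {
    'destinationTable': {
        'projectId': PROJECT_ID,
        'datasetId': DATASET_ID,
        'tableId': TABLE_ID,
    },
    'sourceFormat': 'NEWLINE_DELIMITED_JSON',
}

# One JSON object per line, matching the destination table's schema.
data = json.dumps({'column_name': 7.7}) + '\n'

# Wrap the in-memory data so the client library uploads it with the job.
media_body = MediaIoBaseUpload(
    io.BytesIO(data.encode('utf-8')),
    mimetype='application/octet-stream')

Unlike insertAll, this runs as a load job, so the insert call returns a job resource whose status has to be polled until the load finishes.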

There is more information about the media upload option in PHP here.
