google apps script UrlFetchApp.fetch limitations?


Problem Description


I have a simple script that pulls about 30,000 characters of JSON.

I get SyntaxError: Unexpected token: F (line 12) when I try to parse it with JSON.parse() or Utilities.jsonParse().

function refresh() {

  var ss = SpreadsheetApp.getActiveSpreadsheet();
  var sheet = ss.getSheets()[0];
  sheet.clear();


  var text = UrlFetchApp.fetch("http://blablah/json");

  Logger.log(text.getContentText());

  //error SyntaxError: Unexpected token: F (line 12)
  json = JSON.parse(text);
}

The logger only shows about 59 lines of the JSON, but I was told that the logger has a space limit - but I'm not so sure that's it.

Running on my own server, JSON.parse parses the data just fine and so does jQuery get().

So I'm thinking UrlFetchApp.fetch() just can't get long files?

Hard to accept and I've found no documentation about it :(

Solution

You can check the UrlFetch and other service limitations on the new Apps Script dashboard. From my experience, Logger.log has a much tighter limit for a single log entry than UrlFetch does, so it's possible that UrlFetch is fetching the data fine, Logger.log just isn't showing all of it, and you're actually running into a different problem.

Try placing the getContentText result somewhere else, e.g. a spreadsheet cell, or split it into chunks and call Logger.log in a for-loop, as in the sketch below.
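Here is a minimal sketch of that suggestion, assuming the same spreadsheet setup as in the question; the URL is the placeholder from the question and the 1,000-character chunk size is an arbitrary choice:

function inspectResponse() {
  var response = UrlFetchApp.fetch("http://blablah/json");
  var text = response.getContentText();

  // Write the whole string to a cell so nothing is truncated by the logger.
  var sheet = SpreadsheetApp.getActiveSpreadsheet().getSheets()[0];
  sheet.getRange("A1").setValue(text);

  // Also log it in fixed-size chunks as a workaround for the per-entry log limit.
  for (var i = 0; i < text.length; i += 1000) {
    Logger.log(text.substring(i, i + 1000));
  }
}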

A possible error that you might be facing (besides the quota limitation) is character encoding: getContentText has an optional parameter where you can specify the charset of the page. Have you checked that?
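For the encoding angle, here is a small sketch that passes an explicit charset to getContentText (UTF-8 is an assumption; use whatever the server actually declares) and hands the resulting string, rather than the HTTPResponse object, to JSON.parse:

function refreshWithCharset() {
  var response = UrlFetchApp.fetch("http://blablah/json");
  // The argument is the charset used to decode the response body (assumed UTF-8 here).
  var text = response.getContentText("UTF-8");
  var json = JSON.parse(text);
  Logger.log(typeof json); // confirms the parse succeeded
  return json;
}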
