Proper way to deal with UrlFetch rate limit without random Utilities.sleep?


Problem description

Our add-on for Google Sheets provides users with an extended library of functions.

The problem is, each function run does a UrlFetch. So if users drag a column down > 100 times, they will likely see the error: "Error: Service invoked too many times in a short time: urlfetch".

Apparently a common solution is to add a random bit of sleep before the UrlFetch call (e.g. https://productforums.google.com/forum/#!topic/docs/_lgg9hbU6k8). But is there no other way to solve this? After testing with random sleep, I can maybe increase the limit to 200 functions at a time, max.
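For context, that random-sleep workaround looks roughly like this (a minimal sketch only; fetchQuote and the URL are made-up placeholders, not our actual add-on code):

  function fetchQuote(symbol) {
    // Random 0-1000 ms delay so simultaneous custom-function calls don't all
    // hit UrlFetch at exactly the same moment.
    Utilities.sleep(Math.floor(Math.random() * 1000));
    var url = 'https://example.com/quote?symbol=' + encodeURIComponent(symbol);
    return UrlFetchApp.fetch(url).getContentText();
  }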

The underlying problem is I do not know what the limitation actually is. For instance, is it when there's > 100 UrlFetch requests at once in Google's queue that the rate-limit hits? I'm trying to really understand what our options are, but don't even fully get the limitations!

Thanks so much for the help, especially if you're someone from Google :).

Solution

The answer to your question "is it when there's > 100 UrlFetch requests at once in Google's queue that the rate-limit hits?" is basically no. The limit is not 100 calls.

You will see that error ("Error: Service invoked too many times in a short time: urlfetch") if 1 of these conditions is met:

  • 22 MB of data is sent or received via urlfetch per minute
  • 3,000 or more calls are made per minute
  • ...or if you hit the daily maximum rates for calls and data.

In your case, it sounds like you get the error message before hitting the daily data max or daily call max, so it's probably the data-per-minute condition: 22 MB is sent or received via urlfetch per minute.

You could continually check the number of bytes you're processing via urlfetch and, using that, have the function sleep for a minute if it's close to the limit. However, that's a bit annoying.
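As a rough sketch of that idea (the 20 MB threshold, the cache key, and trackedFetch are illustrative assumptions; the 60-second cache entry only approximates a per-minute window, and a real custom function would also have to respect Apps Script's execution-time limit):

  var BYTE_LIMIT_PER_MINUTE = 20 * 1024 * 1024; // stay a bit under the 22 MB quota

  function trackedFetch(url) {
    var cache = CacheService.getScriptCache();
    var used = Number(cache.get('urlfetchBytes')) || 0;

    if (used > BYTE_LIMIT_PER_MINUTE) {
      Utilities.sleep(60 * 1000); // wait out the rest of the quota window
      used = 0;
    }

    var body = UrlFetchApp.fetch(url).getContentText();
    // Record the approximate bytes received; the cache entry expires after 60 s.
    cache.put('urlfetchBytes', String(used + body.length), 60);
    return body;
  }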

You may want to consider trying to make the function more efficient so less data is sent or fewer calls are made. How to do this depends a lot on the function, and we'd need to see the code to make specific suggestions there.
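For example, if many cells end up calling the function with the same arguments, caching responses in the script cache avoids repeated fetches entirely (again just an illustrative sketch; getQuote and the endpoint are made-up names, not your code):

  function getQuote(symbol) {
    var cache = CacheService.getScriptCache();
    var cached = cache.get('quote_' + symbol);
    if (cached !== null) {
      return cached; // served from cache, no UrlFetch call
    }
    var url = 'https://example.com/quote?symbol=' + encodeURIComponent(symbol);
    var value = UrlFetchApp.fetch(url).getContentText();
    cache.put('quote_' + symbol, value, 300); // reuse for 5 minutes
    return value;
  }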

You can find Google's quotas here: https://cloud.google.com/appengine/docs/quotas#UrlFetch
