Google app script timeout ~ 5 minutes?


Problem Description


My Google Apps Script iterates through the user's Google Drive files, copying and sometimes moving files to other folders. The script always stops after 5 minutes with no error message in the log.

I am sorting tens or sometimes thousands of files in one run.

Are there any settings or workarounds?

Solution

Quotas

The maximum execution time for a single script is 6 mins / execution
- https://developers.google.com/apps-script/guides/services/quotas

But there are other limitations to familiarize yourself with. For example, you're only allowed a total trigger runtime of 1 hour / day, so you can't just break up a long function into 12 different 5 minute blocks.

Optimization

That said, there are very few reasons why you'd really need to take six minutes to execute. JavaScript should have no problem sorting thousands of rows of data in a couple seconds. What's likely hurting your performance are service calls to Google Apps itself.

You can write scripts to take maximum advantage of the built-in caching, by minimizing the number of reads and writes. Alternating read and write commands is slow. To speed up a script, read all data into an array with one command, perform any operations on the data in the array, and write the data out with one command.
- https://developers.google.com/apps-script/best_practices
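
A rough sketch of that read-once / write-once pattern (the sheet layout and the uppercase transform here are made up purely for illustration):

var sheet = SpreadsheetApp.getActiveSheet();
var values = sheet.getDataRange().getValues();   // one read call for the whole sheet

for (var i = 0; i < values.length; i++) {
  // Plain JavaScript work on the in-memory array; no service calls inside the loop.
  values[i][0] = String(values[i][0]).toUpperCase();
}

sheet.getRange(1, 1, values.length, values[0].length).setValues(values);   // one write call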

Batching

The best thing you can possibly do is reduce the number of service calls. Google enables this by allowing batch versions of most of their API calls.

As a trivial example, instead of this:

for (var i = 1; i <= 100; i++) {
  SpreadsheetApp.getActiveSheet().deleteRow(i);
}

Do this:

SpreadsheetApp.getActiveSheet().deleteRows(1, 100);

In the first loop, not only did you need 100 calls to deleteRow on the sheet, but you also needed to get the active sheet 100 times as well. The second variation should perform several orders of magnitude better than the first.

Interweaving Reads and Writes

Additionally, be very careful not to go back and forth frequently between reading and writing. Not only will you lose potential gains from batch operations, but Google won't be able to use its built-in caching.

Every time you do a read, we must first empty (commit) the write cache to ensure that you're reading the latest data (you can force a write of the cache by calling SpreadsheetApp.flush()). Likewise, every time you do a write, we have to throw away the read cache because it's no longer valid. Therefore if you can avoid interleaving reads and writes, you'll get full benefit of the cache.
- http://googleappsscript.blogspot.com/2010/06/optimizing-spreadsheet-operations.html

For example, instead of this:

sheet.getRange("A1").setValue(1);
sheet.getRange("B1").setValue(2);
sheet.getRange("C1").setValue(3);
sheet.getRange("D1").setValue(4);

Do this:

sheet.getRange("A1:D1").setValues([[1,2,3,4]]);

Chaining Function Calls

As a last resort, if your function really can't finish in under six minutes, you can chain together calls or break up your function to work on a smaller segment of data.

You can store data in the Cache Service (temporary) or Properties Service (permanent) buckets for retrieval across executions (since Google Apps Script executions are stateless).
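
For the Drive-sorting scenario in the question, one way to carry progress across runs is to park the file iterator's continuation token in the Properties Service. A hedged sketch, where the property key, the time budget, and the per-file processing step are assumptions rather than part of the original answer:

function processSomeFiles() {
  var props = PropertiesService.getScriptProperties();
  var token = props.getProperty('DRIVE_CONTINUATION_TOKEN');   // hypothetical key
  var files = token ? DriveApp.continueFileIterator(token) : DriveApp.getFiles();

  var start = Date.now();
  while (files.hasNext() && Date.now() - start < 4.5 * 60 * 1000) {   // stop well under 6 minutes
    var file = files.next();
    // ... copy or move the file here ...
  }

  if (files.hasNext()) {
    props.setProperty('DRIVE_CONTINUATION_TOKEN', files.getContinuationToken());
  } else {
    props.deleteProperty('DRIVE_CONTINUATION_TOKEN');
  }
}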

If you want to kick off another event, you can create your own trigger with the Trigger Builder Class or set up a recurring trigger on a tight timetable.
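
A minimal sketch of that hand-off with the Trigger Builder (the one-minute delay and the processSomeFiles name tie back to the hypothetical sketch above):

ScriptApp.newTrigger('processSomeFiles')
    .timeBased()
    .after(60 * 1000)   // run again in roughly one minute
    .create();

Since triggers created this way accumulate, it is worth cleaning up spent ones (via ScriptApp.getProjectTriggers() and ScriptApp.deleteTrigger()) at the start of each run so you don't pile up against the per-script trigger limit.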
