1000+ API calls with 1 cron job?


Problem description

We have the following potential situation:

A web app that could have 1000+ users.

We want to set up a cron job that fetches data via an external service's API for ALL 1000+ users in one run (NOTE: each user has their own API credentials with that external API service)!

What would be a sensible way to do this?

Information:


  • A single API call with one user's credentials can take up to 5(!) seconds to return the data.

Possible approach:

The cron job calls a local PHP script (cronjobcall.php) that loops through all 1000 users. For each user, this script calls another local script via cURL (localfile_calls_api.php), which makes the actual API call and saves the returned data into a MySQL database.

cronjobcall.php

foreach ($ThousandsOfUsers as $UserId => $UserCredentials)
{
    $ch = curl_init();
    // URL-encode the credentials so special characters survive the query string
    curl_setopt($ch, CURLOPT_URL, "localfile_calls_api.php?UserId=$UserId&UserCredentials=" . urlencode($UserCredentials));
    curl_setopt($ch, CURLOPT_HEADER, 0);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the response instead of echoing it
    $result = curl_exec($ch);
    curl_close($ch);
}

localfile_calls_api.php

// !!! this could take up to 5(!) seconds to return the result

$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, "http://external_api_url.php?UserId=$UserId&UserCredentials=" . urlencode($UserCredentials));
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the response instead of echoing it
$result = curl_exec($ch);
curl_close($ch);

if ($result)
{
    save_to_MySql($result, $UserId);
}

So that's why I'm thinking of splitting the whole procedure into two different PHP files: the API call itself could take up to 5 seconds to return data.

Is this the right approach?

Is there a better way to do it?

Thanks a lot!

Recommended answer

If you really need to make that API call for each user periodically, I would set it up differently:


  • Add two columns to your table: lastUpdated and isBeingProcessed (or something similar);
  • Make a script that runs every X (1?) minutes using cron;
  • In your script, get the XX (10?) records with the oldest lastUpdated date that are not being processed, and set their isBeingProcessed flag;
  • As each API call finishes, update the user's information, including the lastUpdated date or time, and unset the isBeingProcessed flag.
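The steps above can be sketched as a small worker script. This is a minimal, self-contained sketch only: it uses an in-memory SQLite database via PDO so it can run anywhere, where the real setup would use your MySQL `users` table; the table name, column names, and seed data are assumptions taken from the bullet list.

```php
<?php
// Sketch of the per-minute worker: claim a batch of the stalest users,
// process them, then stamp lastUpdated and release them.
$pdo = new PDO('sqlite::memory:');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$pdo->exec("CREATE TABLE users (
    UserId INTEGER PRIMARY KEY,
    UserCredentials TEXT,
    lastUpdated TEXT,
    isBeingProcessed INTEGER DEFAULT 0
)");

// Seed a few users with different staleness (illustrative data).
$seed = $pdo->prepare("INSERT INTO users (UserId, UserCredentials, lastUpdated) VALUES (?, ?, ?)");
$seed->execute([1, 'cred1', '2013-01-01 00:00:00']);
$seed->execute([2, 'cred2', '2013-01-03 00:00:00']);
$seed->execute([3, 'cred3', '2013-01-02 00:00:00']);

// Step 1: fetch the batch with the oldest lastUpdated that is not being processed.
$batchSize = 2;
$rows = $pdo->query(
    "SELECT UserId, UserCredentials FROM users
     WHERE isBeingProcessed = 0
     ORDER BY lastUpdated ASC
     LIMIT $batchSize"
)->fetchAll(PDO::FETCH_ASSOC);

// Step 2: set the isBeingProcessed flag so overlapping runs skip these users.
$mark = $pdo->prepare("UPDATE users SET isBeingProcessed = 1 WHERE UserId = ?");
foreach ($rows as $row) {
    $mark->execute([$row['UserId']]);
}

// Step 3: make the API call for each claimed user (up to 5 s each),
// then stamp lastUpdated and unset the flag.
$done = $pdo->prepare(
    "UPDATE users SET isBeingProcessed = 0, lastUpdated = datetime('now') WHERE UserId = ?"
);
foreach ($rows as $row) {
    // ... external API call + save_to_MySql($result, $row['UserId']) would go here ...
    $done->execute([$row['UserId']]);
}

// The two stalest users (1 and 3) were claimed first.
print implode(',', array_column($rows, 'UserId')); // prints "1,3"
```

In MySQL, steps 1 and 2 should run inside one transaction (e.g. `SELECT ... FOR UPDATE`) so that two overlapping cron runs cannot claim the same users.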

Depending on what your server can handle and what the API allows, you can even set it up so that multiple jobs run simultaneously / overlapping, reducing the total update time considerably.
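Besides overlapping cron runs, the calls within one batch can also be issued concurrently from a single script using PHP's curl_multi API, so a batch of ten 5-second calls costs roughly one round trip instead of ten. A hedged sketch (the helper name and the idea of keying results by user ID are my own, not from the answer):

```php
<?php
// Fetch several URLs concurrently with curl_multi and return the
// response bodies keyed the same way as the input array.
function fetch_all(array $urls): array
{
    $mh = curl_multi_init();
    $handles = [];
    foreach ($urls as $key => $url) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // capture the body
        curl_setopt($ch, CURLOPT_TIMEOUT, 10);          // don't hang forever
        curl_multi_add_handle($mh, $ch);
        $handles[$key] = $ch;
    }

    // Drive all transfers until every handle has finished.
    do {
        $status = curl_multi_exec($mh, $running);
        if ($running) {
            curl_multi_select($mh); // wait for activity instead of busy-looping
        }
    } while ($running && $status === CURLM_OK);

    $results = [];
    foreach ($handles as $key => $ch) {
        $results[$key] = curl_multi_getcontent($ch);
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);
    return $results;
}
```

Each user's API URL (with their own credentials) would go into the input array; mind any rate limits the external API imposes before raising the concurrency.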
