Lots of PHP processes running at the same time


Problem description

Requirements

I have a web app which allows users to schedule social media tasks, such as posting on Facebook or Twitter.

Each user can tell the app to publish to his social media accounts at any time (14:00, 15:11, 17:54...).

Besides this, I need to complete other tasks for each user every day, such as getting their followers/friends or finding out who unfollowed them on Twitter.

Situation

So far, I have had one file for each task (post.php, getFollowers.php, analytics.php...). For example:

post.php

I have created a cron job for this script, which checks every minute whether some post must be published. Let's assume we run the script and it finds three users who want to tweet at this time: it will iterate over the users with a foreach loop and post to each account.
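A minimal sketch of what such a post.php might look like (the table name, column names, and the publish() helper are hypothetical, assuming scheduled posts are stored in a database table with a publish time):

```php
<?php
// post.php - run by cron every minute (* * * * *)
// Hypothetical schema: posts(id, user_id, body, publish_at, published)

$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

// Find every post whose scheduled time has arrived and is still unpublished
$stmt = $pdo->query(
    'SELECT * FROM posts WHERE publish_at <= NOW() AND published = 0'
);

foreach ($stmt->fetchAll(PDO::FETCH_ASSOC) as $post) {
    // publish() stands in for the Twitter/Facebook API call; each call
    // takes 30-40s, and the loop runs serially, so later posts fall behind
    publish($post);

    $pdo->prepare('UPDATE posts SET published = 1 WHERE id = ?')
        ->execute([$post['id']]);
}
```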

...the other scripts do the same: get every user who wants to do something, create a queue, and iterate over it.

Problems


  1. Posting tasks need to be completed on time.

  2. Long tasks, such as getting followers, need to run every day.

(1) Posting on Twitter and Facebook takes 30-40s, so if five users want to post at 14:00, posts 3, 4 and 5 will be late.

(2) Getting the followers of one user takes 40-60s, so with just 1000 users the script would run for 11-16h (1000 × 40-60s ≈ 40,000-60,000s), which is definitely not scalable. I should be able to get this task done in just 2-3h.

Possible solution?

I have thought I could solve both problems by separating the user tasks and executing one process per user.
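The "one process per user" idea can be sketched in plain PHP by forking a child per user with the pcntl extension (a rough sketch, assuming a CLI script on a POSIX system; processUser() is a hypothetical stand-in for the per-user work):

```php
<?php
// Fork one worker per user so the slow API calls run in parallel.
// Requires the pcntl extension (CLI scripts on POSIX systems only).

$userIds = [1, 2, 3, 4, 5]; // users whose posts are due right now

$children = [];
foreach ($userIds as $userId) {
    $pid = pcntl_fork();
    if ($pid === -1) {
        exit("fork failed\n");
    } elseif ($pid === 0) {
        // Child process: handle exactly one user, then exit
        processUser($userId); // hypothetical: posts to this user's accounts
        exit(0);
    }
    $children[] = $pid; // parent records the child and keeps forking
}

// Parent waits for all children so the script exits only when done
foreach ($children as $pid) {
    pcntl_waitpid($pid, $status);
}
```

Note that forking an unbounded number of children (one per user) can exhaust the machine, which is one reason the answer below favors a managed queue instead.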

Is this a correct and scalable solution? How would you solve these problems in a scalable way?

Thanks in advance.

Recommended answer

Use a managed, distributed scheduled-task service, such as the AWS Elastic Beanstalk Worker Tier or IronWorker.

With AWS EB, you would include in your project a cron.yaml file containing a config such as:

version: 1
cron:
 - name: "post"
   url: "/post"
   schedule: "* * * * *"

This will trigger a POST request to http://localhost/post every minute.

I would also suggest that the scheduled task itself not send the posts, but rather trigger multiple other tasks to do so. Using AWS EB, you would do this with the AWS SDK for PHP:

use Aws\Common\Aws;

// Bootstrap the SDK (v2) from a JSON config file and get an SQS client
$aws = Aws::factory('/path/to/my_config.json');
$client = $aws->get('Sqs');

// Queue one message per post; the worker daemon delivers each to your app
$client->sendMessage(array(
    'QueueUrl'     => $queueUrl,          // URL of the worker tier's queue
    'MessageBody'  => json_encode($post), // payload the worker receives
    'DelaySeconds' => $delay,             // optional delivery delay (0-900s)
));

This will trigger a POST request to your configured Worker Tier URL (i.e. http://localhost/worker) for every message, with the JSON-encoded data in the body.
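On the receiving side, the Worker Tier endpoint is just an ordinary PHP script that reads the message body from the incoming POST (a sketch; the fields inside $post are whatever you json_encode() on the sending side, and postToTwitter() is a hypothetical helper):

```php
<?php
// worker.php - mapped to the Worker Tier URL (e.g. /worker).
// The SQS daemon on each worker instance POSTs one message per request.

$body = file_get_contents('php://input');
$post = json_decode($body, true);

if ($post === null) {
    http_response_code(400); // malformed message body: reject it
    exit;
}

// Do the slow work here. Taking 30-40s is fine: each message is handled
// in its own request, so many posts are processed in parallel.
// postToTwitter($post);

// A 200 response tells the daemon the message was processed and can be
// deleted from the queue; any other status causes SQS to retry it later.
http_response_code(200);
```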

This approach allows you to scale better with the number of posts to be sent simultaneously.

