Multiple tasks and reading user and hatch rate from Locust config file

Problem description

How can I call separate endpoints with separate numbers of users and hatch rates, as specified in a config file? The basic code follows. Please suggest.

from locust import HttpUser, task, between, TaskSet, User
from locust.env import Environment

class MyBase(TaskSet):
    # base class called by all client calls. 
    def getToken(self):
        return token

class GetCallTasks(MyBase):
    @task
    # Needs set 1 of user count and hatch rate
    def getInfo(self):
        # verify=False gets rid of the InsecureRequestWarning
        self.client.get()

class PostAndDeleteTasks(MyBase):
    @task
    # Needs set 2 of user count and hatch rate
    def deleteStatement(self):
        response = self.client.post()
        response = self.client.delete()

class ApiUser(HttpUser):
    # how do we call PostAndDeleteTasks and GetCallTasks with separate number of users and hatch rate?
    tasks = [PostAndDeleteTasks]
    tasks = [GetCallTasks]
    wait_time = between(0.100, 1.500)

if __name__ == '__main__':
    env = Environment()
    ApiUser(env).run()

Recommended answer

It depends on your use case and exactly what you're looking for. If it'd be acceptable for each user Locust generates to first run getInfo() and then deleteStatement(), you could use a SequentialTaskSet and define the tasks to be done in order. To run deleteStatement() twice, it could be duplicated in two different tasks, or the same task could just do the same code twice. In this case, every user would run the exact same steps in the same order, and with a single steady hatch rate you'd have a steady rate for both tasks.
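
A minimal sketch of that first option, assuming Locust 1.x or later; the /info and /statement paths are made-up placeholders, since the question leaves the real URLs out:

from locust import HttpUser, SequentialTaskSet, task, between

class OrderedTasks(SequentialTaskSet):
    # Tasks in a SequentialTaskSet run in the order they are declared,
    # so every user does the getInfo-style work first, then the delete work twice.
    @task
    def get_info(self):
        self.client.get("/info")  # placeholder path

    @task
    def delete_statement_first(self):
        self.client.post("/statement")      # placeholder paths
        self.client.delete("/statement/1")

    @task
    def delete_statement_second(self):
        # Duplicated task so the same work runs twice per loop.
        self.client.post("/statement")
        self.client.delete("/statement/1")

class ApiUser(HttpUser):
    tasks = [OrderedTasks]
    wait_time = between(0.100, 1.500)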

If you don't need an exact number of tasks performed but just need the general idea of roughly 2 deleteStatement() for every 1 getInfo(), you can use weights for either tasks or users. Locust will then randomly choose a task to perform for each user but, for example, could have 2 deleteStatement() tasks and 1 getInfo() task to choose from for each user, so you'd roughly have a 2:1 ratio for the tasks you defined. As in the first scenario, you'd still have a single hatch rate for both tasks, but the exact run count for each task would be a bit fuzzy (which is typically desirable, as that is often closer to how real-world traffic behaves).
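
A sketch of the weighted approach, again with placeholder paths; the integer passed to @task is the relative weight, and the same 2:1 ratio could instead be expressed on the user class as a dict of TaskSets, e.g. tasks = {GetCallTasks: 1, PostAndDeleteTasks: 2}:

from locust import HttpUser, task, between

class ApiUser(HttpUser):
    wait_time = between(0.100, 1.500)

    @task(1)
    def get_info(self):
        self.client.get("/info")  # placeholder path

    @task(2)  # picked roughly twice as often as get_info
    def delete_statement(self):
        self.client.post("/statement")      # placeholder paths
        self.client.delete("/statement/1")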

As an aside, in both of those situations you would define tasks in your ApiUser class just once, like tasks = [PostAndDeleteTasks, GetCallTasks]. As you have it now, tasks would only have GetCallTasks in it, so anything in PostAndDeleteTasks would never be run.
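
In other words, keeping the TaskSets from the question, the user class would look roughly like this:

from locust import HttpUser, between
# PostAndDeleteTasks and GetCallTasks as defined in the question above

class ApiUser(HttpUser):
    # One list containing both TaskSets; with equal weight Locust picks
    # between them at random for each user.
    tasks = [PostAndDeleteTasks, GetCallTasks]
    wait_time = between(0.100, 1.500)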

Another option is running 2 separate Locust instances, one running only getInfo() and one running only deleteEstatement(). You could then operate each one and dynamically change the user count and hatch rate for each at will. This is really the only built-in and supported way of having different hatch rates for different TaskSets.
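
For example, with the GET-only and POST/DELETE-only work split into two hypothetical locustfiles, you might start the two instances along these lines (the -f, --host, and --web-port options are standard Locust CLI flags; users and hatch/spawn rate can then be set independently from each instance's web UI):

# Instance 1: only the getInfo() traffic, web UI on the default port 8089
locust -f get_info_locustfile.py --host https://your-api.example

# Instance 2: only the post/delete traffic, web UI on a second port
locust -f post_delete_locustfile.py --host https://your-api.example --web-port 8090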

But if you really don't want to manage multiple Locust instances, Locust is capable of running just about any code and has several different ways you can hook into it. The docs have an example of what they call custom clients that shows you one way to do that. The key part is self._locust_environment.events.request_success.fire() and self._locust_environment.events.request_failure.fire(), as that's where the code sends a message to Locust about what's happening and whether the task was a success or a failure. You can also overwrite self.tasks in a TaskSet with either a list of functions to run as tasks in a sequence or a dict of function: weight pairs. Again, this would still have one user count and hatch rate driving it, but then you're in full control of what happens from there.
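
A rough sketch of that event-firing idea, using the Locust 1.x event names mentioned in the answer (newer releases merged them into a single request event); the actual call being measured is left as a placeholder:

import time
from locust import User, task, between

class CustomClientUser(User):
    wait_time = between(0.100, 1.500)

    @task
    def get_token(self):
        start = time.monotonic()
        try:
            # ... make the real call with whatever client/protocol you need (placeholder) ...
            elapsed_ms = (time.monotonic() - start) * 1000
            self.environment.events.request_success.fire(
                request_type="CUSTOM", name="getToken",
                response_time=elapsed_ms, response_length=0,
            )
        except Exception as e:
            elapsed_ms = (time.monotonic() - start) * 1000
            self.environment.events.request_failure.fire(
                request_type="CUSTOM", name="getToken",
                response_time=elapsed_ms, response_length=0, exception=e,
            )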

And lastly, depending on what you need to do, there are also useful event hooks and ways to use Locust as a library that you could look into.
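
As a hedged sketch of the library route, built on the ApiUser class from the question: on Locust 1.x a test can be started programmatically roughly like this (older 1.x releases spell the ramp-up argument hatch_rate rather than spawn_rate):

import gevent
from locust.env import Environment
from locust.stats import stats_printer

env = Environment(user_classes=[ApiUser], host="https://your-api.example")  # placeholder host
env.create_local_runner()
gevent.spawn(stats_printer(env.stats))             # print live stats to the console
env.runner.start(user_count=10, spawn_rate=2)      # hatch_rate=2 on older releases
gevent.spawn_later(60, lambda: env.runner.quit())  # stop the test after 60 seconds
env.runner.greenlet.join()                         # wait for the run to finish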
