Time response for HTTP GET request when using goroutines

Problem description

I have a simple program that prints the GET response time for each URL listed in a text file (url_list.txt).

When the requests are fired sequentially, the returned times correspond to the expected response times of the individual URLs.

However, when the same code is executed concurrently, the returned response times are typically higher than expected.

It seems that the time_start I capture before http.Get(url) is called is not the time at which the request is actually sent. I guess the execution of http.Get(url) is queued to some extent.

Is there a better way to capture URL response time when using goroutines?

Here is my code:

Sequential requests:

package main

import ("fmt"
        "net/http"
        "io/ioutil"
        "time"
        "strings"
)

func get_resp_time(url string) {
        time_start := time.Now()
        resp, err := http.Get(url)
        if err != nil {
            panic(err)
        }
        defer resp.Body.Close()
        fmt.Println(time.Since(time_start), url)
}

func main() {
    content, _ := ioutil.ReadFile("url_list.txt")
    urls := strings.Split(string(content), "\n")

    for _, url := range urls {
        get_resp_time(url)
        //go get_resp_time(url)
    }

    //time.Sleep(20 * time.Second)
}

Concurrent requests:

package main

import ("fmt"
        "net/http"
        "io/ioutil"
        "time"
        "strings"
)

func get_resp_time(url string) {
        time_start := time.Now()
        resp, err := http.Get(url)
        if err != nil {
            panic(err)
        }
        defer resp.Body.Close()
        fmt.Println(time.Since(time_start), url)
}

func main() {
    content, _ := ioutil.ReadFile("url_list.txt")
    urls := strings.Split(string(content), "\n")

    for _, url := range urls {
        //get_resp_time(url)
        go get_resp_time(url)
    }

    time.Sleep(20 * time.Second)
} 

Solution

You are starting all the requests at once. If there are thousands of URLs in the file, then you are starting thousands of goroutines all at once. This may work, but it may also give you errors about running out of sockets or file handles. I'd recommend starting only a limited number of fetches at a time, as in the code below.

This should also help with the timing: each measured duration then reflects a request that is actually in flight, rather than one waiting behind hundreds of others for a connection.

package main

import (
    "fmt"
    "io/ioutil"
    "log"
    "net/http"
    "strings"
    "sync"
    "time"
)

func get_resp_time(url string) {
    time_start := time.Now()
    resp, err := http.Get(url)
    if err != nil {
        log.Printf("Error fetching: %v", err)
        return // resp is nil on error, so skip the Close and timing below
    }
    defer resp.Body.Close()
    fmt.Println(time.Since(time_start), url)
}

func main() {
    content, _ := ioutil.ReadFile("url_list.txt")
    urls := strings.Split(string(content), "\n")

    const workers = 25

    wg := new(sync.WaitGroup)
    in := make(chan string, 2*workers)

    for i := 0; i < workers; i++ {
        wg.Add(1)
        go func() {
            defer wg.Done()
            for url := range in {
                get_resp_time(url)
            }
        }()
    }

    for _, url := range urls {
        if url != "" {
            in <- url
        }
    }
    close(in)
    wg.Wait()
}
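
If you want the measurement to start closer to the moment the request is actually written to the connection (rather than when the goroutine happens to call http.Get), the standard net/http/httptrace package (Go 1.7+) can report per-request events. The following is a minimal sketch, not part of the original answer: the helper name timed_get and the example.com URL are made up for illustration, and the measurement is time to first response byte from the moment the request was fully written, which excludes any queuing inside the client.

package main

import (
    "fmt"
    "net/http"
    "net/http/httptrace"
    "time"
)

// timed_get issues a GET and prints how long the server took to start
// responding, measured from when the request was fully written.
func timed_get(url string) {
    req, err := http.NewRequest("GET", url, nil)
    if err != nil {
        fmt.Println("bad request:", err)
        return
    }

    var wrote, firstByte time.Time
    trace := &httptrace.ClientTrace{
        WroteRequest:         func(httptrace.WroteRequestInfo) { wrote = time.Now() },
        GotFirstResponseByte: func() { firstByte = time.Now() },
    }
    req = req.WithContext(httptrace.WithClientTrace(req.Context(), trace))

    resp, err := http.DefaultClient.Do(req)
    if err != nil {
        fmt.Println("fetch error:", err)
        return
    }
    defer resp.Body.Close()

    fmt.Println(firstByte.Sub(wrote), url) // time to first byte, excluding client-side queuing
}

func main() {
    timed_get("http://example.com/")
}

A helper like this could be dropped into the worker pool above in place of get_resp_time: the worker limit keeps the number of in-flight requests bounded, and the trace timestamps keep the measurement independent of when the goroutine was scheduled.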
