Alternative to ioutil.ReadAll in Go?


Problem description

For a program I'm making, this function is run as a goroutine in a for loop, depending on how many URLs are passed in (there is no set amount):

func makeRequest(url string, ch chan<- string, errors map[string]error){
  res, err := http.Get(url)
  if err != nil {
    errors[url] = err
    close(ch)
    return
  }

  defer res.Body.Close()
  body, _ := ioutil.ReadAll(res.Body)
  ch <- string(body)
}

The entire body of the response has to be used, so ioutil.ReadAll seemed like the perfect fit. But with no restriction on the number of URLs that can be passed in, and given that ReadAll by nature stores everything in memory, it's starting to feel less like a golden ticket. I'm fairly new to Go, so if you do decide to answer, some explanation behind your solution would be greatly appreciated!

Recommended answer

One insight I gained as I learned Go is that ReadAll is often inefficient for large readers and, as in your case, arbitrarily large input can exhaust memory. When I started out, I used to do JSON parsing like this:

data, err := ioutil.ReadAll(r)
if err != nil {
    return err
}
json.Unmarshal(data, &v)

Then I learned a much more efficient way to parse JSON, which is simply to use the Decoder type:

err := json.NewDecoder(r).Decode(&v)
if err != nil {
    return err
}

Not only is this more concise, it is much more efficient, both memory-wise and time-wise:

  • The Decoder doesn't have to allocate a huge byte slice to accommodate the data being read - it can simply reuse a tiny buffer against each Read call to fetch and parse the data incrementally. This saves a lot of time in allocations and removes pressure from the GC.
  • The Decoder can start parsing as soon as the first chunk of data comes in - it doesn't have to wait for everything to finish downloading.

Now, of course your question has nothing to do with JSON, but this example illustrates a general point: if you can use Read directly and parse the data a chunk at a time, do it. Especially with HTTP requests, parsing is faster than reading/downloading, so the parsed data can be ready almost the moment the request body finishes arriving.
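For example, here is a minimal sketch of applying the same idea to an HTTP response; the URL and the target type are hypothetical placeholders:

resp, err := http.Get("https://example.com/data.json") // placeholder URL
if err != nil {
    return err
}
defer resp.Body.Close()

// Decode straight from the network stream; no intermediate
// whole-body buffer is ever allocated by our code.
var v result // hypothetical target type
if err := json.NewDecoder(resp.Body).Decode(&v); err != nil {
    return err
}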

In your case you don't seem to be doing anything with the data yet, so there isn't much to suggest that would help you specifically. But the io.Reader and io.Writer interfaces are the Go equivalent of UNIX pipes, and so you can use them in many different places:

Writing the data to a file:

f, err := os.Create("file")
if err != nil {
    return err
}
defer f.Close()

// Copy streams all the data from Body into f without building a huge
// buffer in memory (it moves a small chunk at a time).
if _, err := io.Copy(f, resp.Body); err != nil {
    return err
}
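As a design note: io.Copy streams through a small fixed-size buffer (32 KB in the generic case, unless the source or destination supplies its own faster path), so memory use stays constant no matter how large the response body is.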

Printing everything to stdout:

io.Copy(os.Stdout, resp.Body)

Piping a response's body into a request's body:

req, err := http.NewRequest("POST", "https://example.com", resp.Body)
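A slightly fuller sketch of that last pattern, with placeholder URLs, since http.NewRequest only builds the request and a client still has to execute it:

// Stream one response's body straight into another request's body,
// never holding the full payload in memory.
resp, err := http.Get("https://example.com/source") // placeholder URL
if err != nil {
    return err
}
defer resp.Body.Close()

req, err := http.NewRequest("POST", "https://example.com/sink", resp.Body) // placeholder URL
if err != nil {
    return err
}
req.Header.Set("Content-Type", resp.Header.Get("Content-Type"))

resp2, err := http.DefaultClient.Do(req)
if err != nil {
    return err
}
defer resp2.Body.Close()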
