Limiting amount of data read in the response to a HTTP GET request
This article discusses how to limit the amount of data read from the response to an HTTP GET request; it may be a useful reference for anyone facing the same problem.
Problem description
I'm scraping HTML pages and have set up an HTTP client like so:

client := &http.Client{
    Transport: &http.Transport{
        Dial: (&net.Dialer{
            Timeout:   30 * time.Second,
            KeepAlive: 30 * time.Second,
        }).Dial,
        TLSHandshakeTimeout:   10 * time.Second,
        ResponseHeaderTimeout: 10 * time.Second,
    },
}
Now, when I make GET requests to multiple URLs, I don't want to get stuck on URLs that deliver a massive amount of data.
response, err := client.Get(page.Url)
checkErr(err)
body, err := ioutil.ReadAll(response.Body)
checkErr(err)
page.Body = string(body)
Is there a way to limit the amount of data (bytes) the GET request receives from a resource, and to stop once that limit is reached?
Solution
Use an io.LimitedReader:

"A LimitedReader reads from R but limits the amount of data returned to just N bytes."

limitedReader := &io.LimitedReader{R: response.Body, N: limit}
body, err := ioutil.ReadAll(limitedReader)
or
body, err := ioutil.ReadAll(io.LimitReader(response.Body, limit))