wget or curl from stdin


Problem description

I'd like to download web pages while supplying URLs from stdin. Essentially, one process continuously produces URLs to stdout/a file, and I want to pipe them to wget or curl. (Think of it as a simple web crawler if you want.)

This seems to work fine:

tail 1.log | wget -i - -O - -q 

But when I use 'tail -f', it doesn't work anymore (buffering, or wget is waiting for EOF?):

tail -f 1.log | wget -i - -O - -q

Could anybody provide a solution using wget, curl, or any other standard Unix tool? Ideally, I wouldn't want to restart wget in a loop; just keep it running, downloading URLs as they come.

Answer

What you need to use is xargs. E.g.

tail -f 1.log | xargs -n1 wget -O - -q
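A sketch of why this works: `xargs -n1` reads one token at a time from stdin and launches a fresh wget for each URL, so nothing sits waiting for EOF the way `wget -i -` does. The snippet below simulates the pipeline without network access; `printf` stands in for the log producer and `echo GET` stands in for wget (both stand-ins are assumptions for illustration):

```shell
# Simulate the pipeline: printf plays the role of the URL producer,
# and `echo GET` plays the role of wget, so no network is needed.
# xargs -n1 invokes the command once per URL as each one arrives.
printf 'http://example.com/a\nhttp://example.com/b\n' |
    xargs -n1 echo GET
```

With real traffic, GNU and BSD xargs also accept `-P N` to run up to N downloads in parallel, e.g. `tail -f 1.log | xargs -n1 -P4 wget -O - -q`.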
