Using CRON jobs to visit a URL?
Question
I have a web application that has to perform repeated tasks, such as sending messages and alerts. Currently I use a script page that performs those tasks when it is loaded in the browser, i.e. http://example.com/tasks.php, and I include it via an iframe in every page of my web application.
Now I want to change this to use CRON jobs, because the first approach may hurt performance. How can I create a CRON job that visits http://example.com/tasks.php? Also, I don't want this CRON job to create output files such as day.*!
I host the application on a shared hosting service that permits CRON jobs via cPanel.
Answer
* * * * * wget -O - http://yoursite.com/tasks.php >/dev/null 2>&1
That should work for you. Just have a wget script that loads the page.
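If you prefer to avoid the shell redirection entirely, wget's own flags can achieve the same effect (a sketch, assuming GNU wget is available on the host):

```shell
# -q suppresses wget's progress messages; -O /dev/null discards the
# downloaded page instead of saving it as tasks.php, tasks.php.1, ...
* * * * * wget -q -O /dev/null http://yoursite.com/tasks.php
```

Either form prevents the stray output files mentioned in the question; the redirection variant in the answer above is just the more portable habit.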
Using -O - means that the output of the web request will be sent to STDOUT (standard output).
By adding >/dev/null we redirect standard output to a black hole. By adding 2>&1 we redirect STDERR (errors) to STDOUT as well, so all output is sent to the black hole. (The job will load the website, but never write a file anywhere.)
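The redirection can be demonstrated without any network access; a minimal sketch, using a hypothetical emit function in place of wget, shows where each stream ends up:

```shell
# A tiny stand-in for wget: writes one line to stdout and one to stderr
emit() {
  echo "page body"        # goes to stdout
  echo "fetch error" >&2  # goes to stderr
}

# >/dev/null discards stdout; 2>&1 then points stderr at the
# (already discarded) stdout, so nothing is written anywhere.
emit >/dev/null 2>&1

# For comparison: 2>&1 alone merges stderr into stdout, so a command
# substitution captures both lines.
combined=$(emit 2>&1)
echo "$combined"
```

Note that the order matters: ">/dev/null 2>&1" silences both streams, while "2>&1 >/dev/null" would leave stderr pointing at the terminal.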