Dumping a dynamic web page to file?
Question
I'm a C++ programmer and I'm new to web development. I need to figure out how to log/dump the HTML of a dynamic third-party website to a static HTML file on my computer, every second. The dynamic web page refreshes every second and updates an HTML table with the latest price info. I would like a static snapshot of this table (or the whole HTML page) to be saved to disk every second. That way I can parse the file with my own program and add the updated price info to a database. How do I do this? If I can't do it this way, is there a way to eavesdrop on (and log) the POST/GET messages and replies the dynamic web page sends?
Answer
Look into the cURL library. Scraping the content from the website, running your processing/business logic, and then inserting into or updating your database directly would be more efficient than saving the file's contents to disk first.
Alternatively, PHP's file_get_contents() works pretty well, assuming you have allow_url_fopen enabled.
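If you would rather stay out of PHP, the fetch-and-snapshot loop described above can also be sketched with nothing but the Python standard library. This is a minimal illustration, not the answerer's code; the URL and filenames are placeholders:

```python
import time
import urllib.request

def snapshot(url: str, path: str) -> bytes:
    """Fetch one copy of the page and write its raw HTML to `path`."""
    with urllib.request.urlopen(url) as resp:
        html = resp.read()
    with open(path, "wb") as f:
        f.write(html)
    return html

def poll(url: str, path: str, interval: float = 1.0) -> None:
    """Snapshot the page once per `interval` seconds, forever."""
    while True:
        # Instead of (or in addition to) writing the file, you could
        # parse `snapshot`'s return value and update your database here.
        snapshot(url, path)
        time.sleep(interval)
```

Usage would look like `poll("http://example.com/prices.html", "snapshot.html")` (a hypothetical price page). Note that polling a third-party site every second may violate its terms of service, and a page that builds its table with JavaScript after load will need a different approach, since this only captures the raw HTML the server sends.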