Using YouTube API to get all comments from a video with the JSON feed
Question
I'm using the YouTube API to get comments for a video with a parameterized query like the following:
http://gdata.youtube.com/feeds/api/videos/theVideoID/comments?v=2&alt=json
The problem with this is that the maximum number of results you can get per query is 50. I want to get every comment. I'm currently using the start-index and max-results parameters to solve this. I had a bit of trouble doing iterations of 50 at a time because sometimes the iteration would have a start-index above the number of comments and I couldn't figure that out, so I just tried to fetch one at a time. It may be better to do 50 at a time, so let me know if that is the better solution. For now:
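If you do switch to pages of 50, the valid start-index values are easy to precompute from the comment count; here is a small sketch (pageStartIndices is a hypothetical helper, not part of the API):

```javascript
// Given a total comment count and a page size, compute the 1-based
// start-index values the GData feed expects, one per request.
function pageStartIndices(total, pageSize) {
    var starts = [];
    for (var i = 1; i <= total; i += pageSize) {
        starts.push(i);
    }
    return starts;
}
// pageStartIndices(211, 50) → [1, 51, 101, 151, 201]
```

No start index ever exceeds the comment count, which avoids the out-of-range requests described above.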
I'm using PHP to get the amount of comments:
<?php
$video_ID = 'gT2HYxOdxUk';
// Fetch the video's metadata feed as JSON.
$JSON = file_get_contents("https://gdata.youtube.com/feeds/api/videos/{$video_ID}?v=2&alt=json");
$JSON_Data = json_decode($JSON);
// countHint holds the total number of comments on the video.
$commentCount = $JSON_Data->{'entry'}->{'gd$comments'}->{'gd$feedLink'}->{'countHint'};
?>
And then I'm calling a JavaScript/jQuery function to load all comments into an array. For testing, it prints them into a div. For starters, here's how I'm calling the function:
<body onLoad="loadComments('<?php echo $commentCount; ?>', '<?php echo $video_ID; ?>')">
Next, the actual function:
var comments = []; // was undeclared in the original
function loadComments(count, videoID) {
    for (var i = 1; i <= count; i++) {
        $.ajax({
            url: "http://gdata.youtube.com/feeds/api/videos/" + videoID + "/comments?v=2&alt=json&max-results=1&start-index=" + i,
            dataType: "jsonp",
            success: function(data) {
                $.each(data.feed.entry, function(key, val) {
                    comments.push(val.content.$t);
                    $('#commentOutput').append(val.content.$t + '<br>'); // just for testing purposes
                });
            }
        });
    }
}
The problem is that it is really iffy. When I use the count variable as the terminating condition of the for loop like this, it only gets a fraction of the comments, for example 45 out of 211. If I manually enter 211, it will get around 195. If I put in a low number, like 1-15, it pretty much always gets them all. 20+, it's never right.
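One plausible cause (an assumption, not something confirmed by the source): the loop fires count simultaneous JSONP requests, and some of them fail silently or get throttled, so the higher the count, the more responses go missing. A sketch of the alternative, requesting each page only after the previous one has been handled; fetchPage is a hypothetical stand-in for the real $.ajax call:

```javascript
// Sketch of sequential paging. fetchPage(start, pageSize) stands in for
// the AJAX call and returns an array of comment strings for that page.
function loadAllComments(fetchPage, total, pageSize) {
    var comments = [];
    for (var start = 1; start <= total; start += pageSize) {
        // Each page is requested only after the previous one is done,
        // so requests never pile up concurrently.
        var page = fetchPage(start, pageSize);
        for (var i = 0; i < page.length; i++) {
            comments.push(page[i]);
        }
    }
    return comments;
}
```

With the real asynchronous $.ajax, the same idea would be expressed by issuing the next request from the success callback of the previous one.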
I need to figure out how to consistently get all the comments of a given video by taking advantage of the max-results and start-index parameters. Thanks!
Accepted Answer
I just came across this question and I notice that it's been quite some time since it was asked. But since nobody has answered it yet, I think I should.
What you should ideally do is use YouTube's PHP API (via Zend_Gdata) with the following code:
<?php
require_once 'Zend/Loader.php'; // the Zend dir must be in your include_path
Zend_Loader::loadClass('Zend_Gdata_YouTube');

$yt = new Zend_Gdata_YouTube();
$yt->setMajorProtocolVersion(2);

// Extract the video ID from a watch URL.
$video = parse_url("http://www.youtube.com/watch?v=K-ob8sr9ZX0");
parse_str(urldecode($video['query']), $query);
$videoId = $query['v'];

// retrieveAllEntriesForFeed() follows the feed's pagination internally.
$commentFeed = $yt->retrieveAllEntriesForFeed($yt->getVideoCommentFeed($videoId));
foreach ($commentFeed as $commentEntry) {
    echo "Full text: " . $commentEntry->content->text . "<br />";
}
The key element here is the retrieveAllEntriesForFeed() method.
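As an aside, the parse_url/parse_str step above has a straightforward JavaScript equivalent, should you need the video ID on the client side as well (videoIdFromUrl is a hypothetical helper, not part of any API):

```javascript
// Mirrors the PHP parse_url/parse_str step: pull the v parameter
// out of a YouTube watch URL; returns null if it is absent.
function videoIdFromUrl(url) {
    var query = url.split('?')[1] || '';
    var params = query.split('&');
    for (var i = 0; i < params.length; i++) {
        var kv = params[i].split('=');
        if (decodeURIComponent(kv[0]) === 'v') {
            return decodeURIComponent(kv[1] || '');
        }
    }
    return null;
}
// videoIdFromUrl("http://www.youtube.com/watch?v=K-ob8sr9ZX0") → "K-ob8sr9ZX0"
```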
Instead of echoing all the comments, you can construct a JSON response and send it back to the waiting JavaScript.
It does not use max-results or start-index, but does the job well without them.