Download all posts for Group I have Admin rights to using Facebook Graph API

Question

We are trying to retrieve ALL the posts, with associated comments and images, made to our group in the last year. I've tried using GraphAPI to do this but pagination means I have to get data, then copy the "next" link, and run again. Unfortunately, this means a LOT of work, since there are over 2 million posts to the group.

Does ANYONE know of a way to do this without spending a few days clicking? Also consider that the group has 4000+ members and is growing every day, with, on average, about 1000 posts a DAY at the moment.

For the curious, the PLAN is to cull the herd... I am HOPELESS at programming and have recently started learning Python...

Answer

I did it like this; you'll probably have to iterate through all the pages until data comes back empty. Note this is Python 2.x code.

from facepy import GraphAPI
import json

group_id = "YOUR_GROUP_ID"
access_token = "YOUR_ACCESS_TOKEN"

graph = GraphAPI(access_token)

# https://facepy.readthedocs.org/en/latest/usage/graph-api.html
# With page=True facepy follows the "next" links for you and
# yields one page of results at a time, so the loop below
# walks the entire feed instead of stopping at the first page.
pages = graph.get(group_id + "/feed", page=True, retry=3, limit=800)

posts = []
for page in pages:
    posts.extend(page.get("data", []))

with open('content.json', 'w') as outfile:
    json.dump(posts, outfile, indent=4)
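If you would rather not depend on facepy, the same loop can be written against the raw Graph API responses: each page of a /feed result carries a paging.next URL, and you follow it until a page comes back with no data. A minimal sketch of that loop, where follow_pages and the fetch callable are illustrative names (not part of any library); in practice fetch could be something like lambda url: requests.get(url).json():

```python
def follow_pages(first_page, fetch):
    """Collect every item from a Graph API feed by walking the
    paging.next links until a page has no data.

    first_page: the decoded JSON of the first /feed response
    fetch: a callable that takes a URL and returns decoded JSON
    """
    posts = []
    page = first_page
    while page and page.get("data"):
        posts.extend(page["data"])
        next_url = page.get("paging", {}).get("next")
        if not next_url:
            break
        page = fetch(next_url)
    return posts
```

For a group with millions of posts, expect to hit Graph API rate limits along the way; retrying failed requests (as facepy's retry argument does) is worth building in.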
