Scrapy Pipeline to Parse


Problem Description

I made a pipeline to put scraped data into my Parse backend:

    PARSE = 'api.parse.com'
    PORT = 443
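
These two constants are read back in the pipeline as settings['PARSE'] and settings['PORT']. A minimal sketch of how that fits together, assuming they are defined in the project's settings.py and pulled in through the old-style scrapy.conf module (the question does not show the import):

    # settings.py -- assumed location of the Parse connection constants
    PARSE = 'api.parse.com'
    PORT = 443

    # pipelines.py -- old-style global settings access (Scrapy < 1.0)
    from scrapy.conf import settings

    host = settings['PARSE']   # 'api.parse.com'
    port = settings['PORT']    # 443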

However, I can't find the right way to post the data to Parse, because every time it creates undefined objects in my Parse DB.

    class Newscrawlbotv01Pipeline(object):
        def process_item(self, item, spider):
            for data in item:
                if not data:
                    raise DropItem("Missing data!")
            connection = httplib.HTTPSConnection(
                settings['PARSE'],
                settings['PORT']
            )
            connection.connect()
            connection.request('POST', '/1/classes/articlulos', json.dumps({item}), {
                "X-Parse-Application-Id": "XXXXXXXXXXXXXXXX",
                "X-Parse-REST-API-Key": "XXXXXXXXXXXXXXXXXXX",
                "Content-Type": "application/json"
            })
            log.msg("Question added to PARSE !", level=log.DEBUG, spider=spider)
            return item

Example of the error:

TypeError: set([{'image': 'http://apps.site.lefigaro.fr/sites/apps/files/styles/large/public/thumbnails/image/sport24.png?itok=caKsKUzV',
 'language': 'FR',
 'publishedDate': datetime.datetime(2016, 3, 16, 21, 53, 10, 289000),
 'publisher': 'Le Figaro Sport',
 'theme': 'Sport',
 'title': u'Pogba aurait rencontr\xe9 les dirigeants du PSG',
 'url': u'sport24.lefigaro.fr/football/ligue-des-champions/fil-info/prolongation-entre-le-bayern-et-la-juve-796778'}]) is not JSON serializable
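
The traceback points at the cause: json.dumps({item}) wraps the Item in a set literal ({item} with no key/value pairs is a one-element set, not a dict), and the json module has no encoding for sets. Converting the Item to a plain dict first is what makes it serializable. A minimal sketch of the difference, with a hypothetical Article item class standing in for the real one:

    import json
    import scrapy

    class Article(scrapy.Item):          # hypothetical Item, for illustration only
        title = scrapy.Field()
        publisher = scrapy.Field()

    item = Article(title=u'Pogba aurait rencontr\xe9 les dirigeants du PSG',
                   publisher='Le Figaro Sport')

    # json.dumps({item}) would build a set around the Item and fail as in the
    # traceback above; a plain dict of the same fields serializes without complaint:
    print(json.dumps(dict(item)))

Note that a field kept as a datetime object, such as publishedDate in the traceback, is still not JSON serializable even after dict(item); passing default=str to json.dumps, or formatting the date in the spider, is one way around that.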

Recommended Answer

I found the solution:

    import json
    import httplib                        # Python 2 HTTP client used in this snippet

    from scrapy import log                # old-style Scrapy logging API, as in the original
    from scrapy.conf import settings      # old-style global settings access
    from scrapy.exceptions import DropItem


    class Newscrawlbotv01Pipeline(object):
        def process_item(self, item, spider):
            # intended guard against missing data (iterating an Item yields its field names)
            for data in item:
                if not data:
                    raise DropItem("Missing data!")
            connection = httplib.HTTPSConnection(
                settings['PARSE'],
                settings['PORT']
            )
            connection.connect()
            # dict(item) (instead of {item}, which builds a set) makes the Item JSON serializable
            connection.request('POST', '/1/classes/Articles', json.dumps(dict(item)), {
                "X-Parse-Application-Id": "WW",
                "X-Parse-REST-API-Key": "WW",
                "Content-Type": "application/json"
            })
            log.msg("Question added to PARSE !", level=log.DEBUG, spider=spider)
            return item
            # self.collection.update({'url': item['url']}, dict(item), upsert=True)
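
For comparison, the same POST could be written with the requests library, which also makes it easy to check Parse's response instead of ignoring it. This is only a hedged sketch, not the asker's code: the class name, the requests dependency, and the use of spider.logger and default=str are all additions.

    import json

    import requests
    from scrapy.exceptions import DropItem


    class ParsePostPipeline(object):     # hypothetical variant, for illustration only
        PARSE_URL = 'https://api.parse.com/1/classes/Articles'
        HEADERS = {
            'X-Parse-Application-Id': 'WW',
            'X-Parse-REST-API-Key': 'WW',
            'Content-Type': 'application/json',
        }

        def process_item(self, item, spider):
            # drop the item if any field value is empty
            if not all(item.values()):
                raise DropItem("Missing data!")
            # default=str keeps datetime fields (e.g. publishedDate) serializable
            response = requests.post(self.PARSE_URL,
                                     data=json.dumps(dict(item), default=str),
                                     headers=self.HEADERS)
            response.raise_for_status()  # surface Parse-side errors instead of ignoring them
            spider.logger.debug("Article added to Parse: %s", response.json())
            return item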
