python pymysql.err.OperationalError: (2013, 'Lost connection to MySQL server during query')


Problem Description

I am scraping a website and storing the data into MySQL. The code works fine, but after some time it gives the following error. I am using Python 3.5.1 and pymysql to connect to the database.

pymysql.err.OperationalError: (2013, 'Lost connection to MySQL server during query')

Here is my code:

from bs4 import BeautifulSoup
import urllib.request
import re
import json
import pymysql
import pymysql.cursors


connection = pymysql.connect(host='XXX.XXX.XXX.XX',
                             user='XXX',
                             password='XXX',
                             db='XXX',
                             charset='utf8mb4',
                             cursorclass=pymysql.cursors.DictCursor)

r = urllib.request.urlopen('http://i.cantonfair.org.cn/en/ExpExhibitorList.aspx?k=glassware')
soup = BeautifulSoup(r, "html.parser")

links = soup.find_all("a", href=re.compile(r"expexhibitorlist\.aspx\?categoryno=[0-9]+"))
linksfromcategories = ([link["href"] for link in links])

string = "http://i.cantonfair.org.cn/en/"
linksfromcategories = [string + x for x in linksfromcategories]


for link in linksfromcategories:

  response = urllib.request.urlopen(link)
  soup2 = BeautifulSoup(response, "html.parser")

  links2 = soup2.find_all("a", href=re.compile(r"ExpExhibitorList\.aspx\?categoryno=[0-9]+"))
  linksfromsubcategories = ([link["href"] for link in links2])

  linksfromsubcategories = [string + x for x in linksfromsubcategories]
  for link in linksfromsubcategories:

        response = urllib.request.urlopen(link)
        soup3 = BeautifulSoup(response, "html.parser")
        links3 = soup3.find_all("a", href=re.compile(r"ExpExhibitorList\.aspx\?categoryno=[0-9]+"))
        linksfromsubcategories2 = ([link["href"] for link in links3])

        linksfromsubcategories2 = [string + x for x in linksfromsubcategories2]
        for link in linksfromsubcategories2:

              response2 = urllib.request.urlopen(link)
              soup4 = BeautifulSoup(response2, "html.parser")
              companylink = soup4.find_all("a", href=re.compile(r"expCompany\.aspx\?corpid=[0-9]+"))
              companylink = ([link["href"] for link in companylink])
              companydetail = soup4.find_all("div", id="contact")
              companylink = [string + x for x in companylink]
              my_list = list(set(companylink))

              for link in my_list:
                  print (link)
                  response3 = urllib.request.urlopen(link)
                  soup5 = BeautifulSoup(response3, "html.parser")
                  companydetail = soup5.find_all("div", id="contact")                      
                  for d in companydetail:
                        lis = d.find_all('li')
                        companyname = lis[0].get_text().strip()
                        companyaddress = lis[1].get_text().strip()
                        companycity = lis[2].get_text().strip()
                        try:
                            companypostalcode = lis[3].get_text().strip()
                            companypostalcode = companypostalcode.replace(",", "")
                        except IndexError:
                            companypostalcode = 'null'
                        try:
                            companywebsite = lis[4].get_text().strip()
                            companywebsite = companywebsite.replace("\xEF\xBC\x8Cifl...","")
                        except IndexError:
                            companywebsite = 'null'


                        try:
                            with connection.cursor() as cursor:


                                print ('saving company details to db')
                                cursor.execute("""INSERT INTO company(
                                                                       companyname,address,city,pincode,website) 
                                                                   VALUES (%s, %s, %s, %s, %s)""",
                                                                   (companyname, companyaddress, companycity, 
                                                                    companypostalcode, companywebsite))
                            connection.commit()

                        finally:
                            print ("Company Data saved")
                  productlink = soup5.find_all("a", href=re.compile(r"ExpProduct\.aspx\?corpid=[0-9]+.categoryno=[0-9]+"))
                  productlink = ([link["href"] for link in productlink])

                  productlink = [string + x for x in productlink]
                  productlinkun = list(set(productlink))
                  for link in productlinkun:

                      print (link)
                      responseproduct = urllib.request.urlopen(link)
                      soupproduct = BeautifulSoup(responseproduct, "html.parser")
                      productname = soupproduct.select('div[class="photolist"] li a')
                      for element in productname:
                          print ("====================Product Name=======================")
                          productnames = element.get_text().strip()
                          print (productnames)
                          try:
                              with connection.cursor() as cursor:

                                  # Create a new record
                                  print ('saving products to db')
                                  cursor.execute("""INSERT INTO products(
                                                                       companyname,products) 
                                                                   VALUES (%s, %s)""",
                                                                   (companyname, productnames))
                                  connection.commit()

                          finally:
                              print ("Products Data Saved")

Now I can't figure out where my code is going wrong.

Recommended Answer

I hope this helps. Error 2013 usually means the server closed the connection, typically because it sat idle longer than MySQL's wait_timeout while the scraper was busy fetching pages. When that happens, reconnect and retry the write:

from pymysql.err import OperationalError

while True:  # keep retrying until the row is saved
    try:
        with connection.cursor() as cursor:
            print('saving company details to db')
            cursor.execute("""INSERT INTO company(
                                  companyname, address, city, pincode, website)
                              VALUES (%s, %s, %s, %s, %s)""",
                           (companyname, companyaddress, companycity,
                            companypostalcode, companywebsite))
        connection.commit()
        break  # the insert succeeded, leave the retry loop
    except OperationalError:
        connection.ping(True)  # reconnect=True re-establishes the dropped connection
print("Company Data saved")

Alternatively, use a connection pool, or read the source.
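
For the connection-pool route, one option is the third-party DBUtils package (an assumption; the answer does not name a specific library). Each write borrows a connection from the pool, and ping=1 makes the pool check a connection whenever it is handed out. A minimal sketch:

import pymysql
import pymysql.cursors
from dbutils.pooled_db import PooledDB  # pip install DBUtils

pool = PooledDB(creator=pymysql,        # pymysql provides the actual connections
                maxconnections=5,
                ping=1,                 # verify a connection each time it is borrowed
                host='XXX.XXX.XXX.XX',
                user='XXX',
                password='XXX',
                db='XXX',
                charset='utf8mb4',
                cursorclass=pymysql.cursors.DictCursor)

connection = pool.connection()          # borrow a connection from the pool
with connection.cursor() as cursor:
    cursor.execute("INSERT INTO company(companyname, address, city, pincode, website) "
                   "VALUES (%s, %s, %s, %s, %s)",
                   (companyname, companyaddress, companycity,
                    companypostalcode, companywebsite))
connection.commit()
connection.close()                      # returns the connection to the pool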
