Python - AttributeError: 'NoneType' object has no attribute 'execute'


Problem description


I am trying to run a Python script that logs into an Amazon Redshift DB and then executes a SQL command. I use a tool called Airflow for workflow management. When running the code below, I can log in to the DB fine, but when I try to execute the SQL command I get the following error:

AttributeError: 'NoneType' object has no attribute 'execute'

Code:

## Login to DB

def db_log(**kwargs):
  global db_con
  try:
    db_con = psycopg2.connect(
       " dbname = 'name' user = 'user' password = 'pass' host = 'host' port = '5439'")
  except:
    print("I am unable to connect")
    print('Connection Task Complete')
    task_instance = kwargs['task_instance']
    task_instance.xcom_push(key="dwh_connection" , value = "dwh_connection")
    return (dwh_connection)


def insert_data(**kwargs):
  task_instance = kwargs['task_instance']
  db_con_xcom = task_instance.xcom_pull(key="dwh_connection", task_ids='DWH_Connect')
  cur = db_con_xcom
  cur.execute("""insert into tbl_1 select limit 2 ;""")


Could anyone help me fix this? Thanks.

Full code:

## Third party Library Imports
import pandas as pd
import psycopg2
import airflow
from airflow import DAG
from airflow.operators.python_operator import PythonOperator
from airflow.operators.bash_operator import BashOperator
from datetime import datetime, timedelta
from sqlalchemy import create_engine
import io

# Following are defaults which can be overridden later on
default_args = {
'owner': 'airflow',
'depends_on_past': False,
'start_date': datetime(2018, 5, 29, 12),
'email': ['airflow@airflow.com']
}

dag = DAG('sample1', default_args=default_args)

## Login to DB

def db_log(**kwargs):
  global db_con
  try:
    db_con = psycopg2.connect(
       " dbname = 'name' user = 'user' password = 'pass' host = 'host' port = '5439'")
  except:
    print("I am unable to connect")
    print('Connection Task Complete')
    task_instance = kwargs['task_instance']
    task_instance.xcom_push(key="dwh_connection" , value = "dwh_connection")
    return (dwh_connection)



t1 = PythonOperator(
  task_id='DWH_Connect',
  python_callable=data_warehouse_login,provide_context=True,
  dag=dag)

#######################

def insert_data(**kwargs):
  task_instance = kwargs['task_instance']
  db_con_xcom = task_instance.xcom_pull(key="dwh_connection", task_ids='DWH_Connect')
  cur = db_con_xcom
  cur.execute("""insert into tbl_1 select limit 2 """)


##########################################

t2 = PythonOperator(
  task_id='DWH_Connect1',
  python_callable=insert_data,provide_context=True,dag=dag)

t1 >> t2


Recommended answer


Are you sure you've added the entirety of your code? You call the data_warehouse_login function in the first task's python_callable, but that function is never defined. Assuming it is meant to be db_log, and assuming the first task succeeds, you're not actually XCom-ing anything to the second task: your xcom_push sits inside the except block, so it only runs when the connection fails. The xcom_pull in insert_data therefore returns None, and calling .execute on None raises the AttributeError you're seeing.
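
To make that concrete, here is a minimal sketch (an illustration, not a verified fix) of db_log with the print/xcom_push lines de-indented so they run on success rather than only inside the except block. Note it still pushes just the placeholder string "dwh_connection"; a live psycopg2 connection object couldn't usefully be shared through XCom anyway, which leads to the next point.

import psycopg2

def db_log(**kwargs):
  try:
    db_con = psycopg2.connect(
       "dbname = 'name' user = 'user' password = 'pass' host = 'host' port = '5439'")
  except psycopg2.OperationalError:
    print("I am unable to connect")
    raise  # fail the task instead of continuing without a connection
  # These lines now run on success, not inside the except block
  print('Connection Task Complete')
  task_instance = kwargs['task_instance']
  task_instance.xcom_push(key="dwh_connection", value="dwh_connection")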


I generally wouldn't advise XCom-ing a connection object anyway. Alternatively, you may want to consider using the included PostgresHook, which should cover your use case and works equally well with Amazon Redshift:

https://github.com/apache/incubator-airflow/blob/master/airflow/hooks/postgres_hook.py
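
For example, a minimal sketch of the insert task using PostgresHook might look like this. It assumes the Airflow 1.x import paths used in the question, a Redshift connection configured in the Airflow UI under the hypothetical connection ID 'redshift_dwh', and a placeholder SQL statement (the insert into tbl_1 select limit 2 from the question is incomplete as written).

from datetime import datetime

from airflow import DAG
from airflow.hooks.postgres_hook import PostgresHook
from airflow.operators.python_operator import PythonOperator

default_args = {
  'owner': 'airflow',
  'depends_on_past': False,
  'start_date': datetime(2018, 5, 29, 12),
}

dag = DAG('sample1_hook', default_args=default_args)

def insert_data(**kwargs):
  # The hook opens and closes the DB connection itself, so nothing
  # needs to be passed between tasks via XCom.
  hook = PostgresHook(postgres_conn_id='redshift_dwh')
  hook.run("insert into tbl_1 select * from tbl_1 limit 2;")  # placeholder SQL

t1 = PythonOperator(
  task_id='DWH_Insert',
  python_callable=insert_data,
  provide_context=True,
  dag=dag)

With the hook, each task gets its own short-lived connection from Airflow's connection store, so the two-task connect-then-insert split is no longer needed.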

