Cannot modify mapred.job.name at runtime. It is not in list of params that are allowed to be modified at runtime


Problem description


I am trying to run a Hive job in Airflow. I made a custom JDBC connection, which you can see in the image. I can query Hive tables through the Airflow web UI (Data Profiling -> Ad Hoc Query). I also want to run a sample DAG file from the Internet:

# File Name: wf_incremental_load.py
from airflow import DAG
from airflow.operators import BashOperator, HiveOperator
from datetime import datetime, timedelta

default_args = {
    'owner': 'airflow',
    'start_date': datetime(2019, 3, 13),
    'retries': 1,
    'retry_delay': timedelta(minutes=5)
}

dag = DAG('hive_test', default_args=default_args, schedule_interval='* */5 * * *')

touch_job = """
 touch /root/hive.txt
"""
# Create a marker file on the local filesystem
task1 = BashOperator(
    task_id='make_file',
    bash_command=touch_job,
    dag=dag
)

# Create a Hive table from the existing ant_code table
task2 = HiveOperator(
    task_id='hive_table_create',
    hql='CREATE TABLE aaaaa AS SELECT * FROM ant_code;',
    hive_cli_conn_id='hive_jdbc',
    depends_on_past=True,
    dag=dag
)
# Define the job dependency
task2.set_upstream(task1)
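For context, Airflow's HiveOperator forwards its task context to Hive as `-hiveconf key=value` pairs, which is what triggers the whitelist check seen in the logs below. A simplified sketch of that flag construction (an illustration only, not the actual `HiveCliHook.run_cli` implementation):

```python
# Simplified sketch: flatten a hive_conf dict into the '-hiveconf key=value'
# CLI flags that Airflow passes to the Hive client. Each of these keys must be
# allowed by HiveServer2's whitelist, or the session fails to open.
def build_hiveconf_flags(hive_conf):
    """Flatten {key: value} into ['-hiveconf', 'key=value', ...]."""
    flags = []
    for key, value in hive_conf.items():
        flags += ['-hiveconf', '{}={}'.format(key, value)]
    return flags

conf = {
    'airflow.ctx.dag_id': 'hive_test',
    'airflow.ctx.task_id': 'hive_table_create',
}
print(build_hiveconf_flags(conf))
# ['-hiveconf', 'airflow.ctx.dag_id=hive_test',
#  '-hiveconf', 'airflow.ctx.task_id=hive_table_create']
```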


However, when I run this job in Airflow, I get the following errors:


[2019-03-13 13:32:25,335] {models.py:1593} INFO - Executing <Task(HiveOperator): hive_table_create> on 2019-03-13T00:00:00+00:00
[2019-03-13 13:32:25,336] {base_task_runner.py:118} INFO - Running: ['bash', '-c', u'airflow run hive_test hive_table_create 2019-03-13T00:00:00+00:00 --job_id 19 --raw -sd DAGS_FOLDER/hive_test.py --cfg_path /tmp/tmphSGJhO']
[2019-03-13 13:32:27,130] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create [2019-03-13 13:32:27,129] {__init__.py:51} INFO - Using executor SequentialExecutor
[2019-03-13 13:32:27,548] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create [2019-03-13 13:32:27,547] {models.py:273} INFO - Filling up the DagBag from /root/airflow/dags/hive_test.py
[2019-03-13 13:32:27,565] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create /usr/lib/python2.7/site-packages/airflow/utils/helpers.py:356: DeprecationWarning: Importing 'BashOperator' directly from 'airflow.operators' has been deprecated. Please import from 'airflow.operators.[operator_module]' instead. Support for direct imports will be dropped entirely in Airflow 2.0.
[2019-03-13 13:32:27,565] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create   DeprecationWarning)
[2019-03-13 13:32:27,570] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create /usr/lib/python2.7/site-packages/airflow/utils/helpers.py:356: DeprecationWarning: Importing 'HiveOperator' directly from 'airflow.operators' has been deprecated. Please import from 'airflow.operators.[operator_module]' instead. Support for direct imports will be dropped entirely in Airflow 2.0.
[2019-03-13 13:32:27,570] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create   DeprecationWarning)
[2019-03-13 13:32:27,602] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create [2019-03-13 13:32:27,602] {cli.py:520} INFO - Running <TaskInstance: hive_test.hive_table_create 2019-03-13T00:00:00+00:00 [running]> on host name02.excard.co.kr
[2019-03-13 13:32:27,625] {hive_operator.py:118} INFO - Executing: CREATE TABLE aaaaa AS SELECT * FROM ant_code;
[2019-03-13 13:32:27,634] {logging_mixin.py:95} INFO - [2019-03-13 13:32:27,634] {base_hook.py:83} INFO - Using connection to: id: hive_jdbc. Host: jdbc:hive2://192.168.0.202:10000/big_info, Port: None, Schema: None, Login: hive, Password: XXXXXXXX, extra: {u'extra__jdbc__drv_path': u'/usr/hdp/3.1.0.0-78/hive/jdbc/hive-jdbc-3.1.0.3.1.0.0-78-standalone.jar', u'extra__google_cloud_platform__scope': u'', u'extra__google_cloud_platform__project': u'', u'extra__google_cloud_platform__key_path': u'', u'extra__jdbc__drv_clsname': u'org.apache.hive.jdbc.HiveDriver', u'extra__google_cloud_platform__keyfile_dict': u''}
[2019-03-13 13:32:27,636] {hive_operator.py:133} INFO - Passing HiveConf: {'airflow.ctx.task_id': 'hive_table_create', 'airflow.ctx.dag_id': 'hive_test', 'airflow.ctx.execution_date': '2019-03-13T00:00:00+00:00', 'airflow.ctx.dag_run_id': u'scheduled__2019-03-13T00:00:00+00:00'}
[2019-03-13 13:32:27,637] {logging_mixin.py:95} INFO - [2019-03-13 13:32:27,637] {hive_hooks.py:236} INFO - hive -hiveconf airflow.ctx.task_id=hive_table_create -hiveconf airflow.ctx.dag_id=hive_test -hiveconf airflow.ctx.execution_date=2019-03-13T00:00:00+00:00 -hiveconf airflow.ctx.dag_run_id=scheduled__2019-03-13T00:00:00+00:00 -hiveconf mapred.job.name=Airflow HiveOperator task for name02.hive_test.hive_table_create.2019-03-13T00:00:00+00:00 -f /tmp/airflow_hiveop_rXXLyV/tmpdZYjMS
[2019-03-13 13:32:32,323] {logging_mixin.py:95} INFO - [2019-03-13 13:32:32,323] {hive_hooks.py:251} INFO - Connecting to jdbc:hive2://name01.excard.co.kr:2181,name02.excard.co.kr:2181,data01.excard.co.kr:2181/default;password=root;serviceDiscoveryMode=zooKeeper;user=root;zooKeeperNamespace=hiveserver2
[2019-03-13 13:32:32,738] {logging_mixin.py:95} INFO - [2019-03-13 13:32:32,738] {hive_hooks.py:251} INFO - 19/03/13 13:32:32 [main]: INFO jdbc.HiveConnection: Connected to name02:10000
[2019-03-13 13:32:32,813] {logging_mixin.py:95} INFO - [2019-03-13 13:32:32,813] {hive_hooks.py:251} INFO - 19/03/13 13:32:32 [main]: WARN jdbc.HiveConnection: Failed to connect to name02:10000
[2019-03-13 13:32:32,830] {logging_mixin.py:95} INFO - [2019-03-13 13:32:32,830] {hive_hooks.py:251} INFO - 19/03/13 13:32:32 [main]: WARN jdbc.HiveConnection: Could not open client transport with JDBC Uri: jdbc:hive2://name02:10000/default;password=root;serviceDiscoveryMode=zooKeeper;user=root;zooKeeperNamespace=hiveserver2: Failed to open new session: java.lang.IllegalArgumentException: Cannot modify mapred.job.name at runtime. It is not in list of params that are allowed to be modified at runtime Retrying 0 of 1
[2019-03-13 13:32:32,895] {logging_mixin.py:95} INFO - [2019-03-13 13:32:32,895] {hive_hooks.py:251} INFO - 19/03/13 13:32:32 [main]: INFO jdbc.HiveConnection: Connected to data01:10000
[2019-03-13 13:32:32,941] {logging_mixin.py:95} INFO - [2019-03-13 13:32:32,941] {hive_hooks.py:251} INFO - 19/03/13 13:32:32 [main]: WARN jdbc.HiveConnection: Failed to connect to data01:10000
[2019-03-13 13:32:32,959] {logging_mixin.py:95} INFO - [2019-03-13 13:32:32,959] {hive_hooks.py:251} INFO - 19/03/13 13:32:32 [main]: ERROR jdbc.Utils: Unable to read HiveServer2 configs from ZooKeeper
[2019-03-13 13:32:32,967] {logging_mixin.py:95} INFO - [2019-03-13 13:32:32,967] {hive_hooks.py:251} INFO - Error: Could not open client transport for any of the Server URI's in ZooKeeper: Failed to open new session: java.lang.IllegalArgumentException: Cannot modify mapred.job.name at runtime. It is not in list of params that are allowed to be modified at runtime (state=08S01,code=0)
[2019-03-13 13:32:32,980] {logging_mixin.py:95} INFO - [2019-03-13 13:32:32,980] {hive_hooks.py:251} INFO - beeline> USE default;
[2019-03-13 13:32:32,988] {logging_mixin.py:95} INFO - [2019-03-13 13:32:32,988] {hive_hooks.py:251} INFO - No current connection
[2019-03-13 13:32:33,035] {models.py:1788} ERROR - Connecting to jdbc:hive2://name01.excard.co.kr:2181,name02.excard.co.kr:2181,data01.excard.co.kr:2181/default;password=root;serviceDiscoveryMode=zooKeeper;user=root;zooKeeperNamespace=hiveserver2
19/03/13 13:32:32 [main]: INFO jdbc.HiveConnection: Connected to name02:10000
19/03/13 13:32:32 [main]: WARN jdbc.HiveConnection: Failed to connect to name02:10000
19/03/13 13:32:32 [main]: WARN jdbc.HiveConnection: Could not open client transport with JDBC Uri: jdbc:hive2://name02:10000/default;password=root;serviceDiscoveryMode=zooKeeper;user=root;zooKeeperNamespace=hiveserver2: Failed to open new session: java.lang.IllegalArgumentException: Cannot modify mapred.job.name at runtime. It is not in list of params that are allowed to be modified at runtime Retrying 0 of 1
19/03/13 13:32:32 [main]: INFO jdbc.HiveConnection: Connected to data01:10000
19/03/13 13:32:32 [main]: WARN jdbc.HiveConnection: Failed to connect to data01:10000
19/03/13 13:32:32 [main]: ERROR jdbc.Utils: Unable to read HiveServer2 configs from ZooKeeper
Error: Could not open client transport for any of the Server URI's in ZooKeeper: Failed to open new session: java.lang.IllegalArgumentException: Cannot modify mapred.job.name at runtime. It is not in list of params that are allowed to be modified at runtime (state=08S01,code=0)
beeline> USE default;
No current connection
Traceback (most recent call last):
  File "/usr/lib/python2.7/site-packages/airflow/models.py", line 1657, in _run_raw_task
    result = task_copy.execute(context=context)
  File "/usr/lib/python2.7/site-packages/airflow/operators/hive_operator.py", line 134, in execute
    self.hook.run_cli(hql=self.hql, schema=self.schema, hive_conf=self.hiveconfs)
  File "/usr/lib/python2.7/site-packages/airflow/hooks/hive_hooks.py", line 255, in run_cli
    raise AirflowException(stdout)
AirflowException: Connecting to jdbc:hive2://name01.excard.co.kr:2181,name02.excard.co.kr:2181,data01.excard.co.kr:2181/default;password=root;serviceDiscoveryMode=zooKeeper;user=root;zooKeeperNamespace=hiveserver2
19/03/13 13:32:32 [main]: INFO jdbc.HiveConnection: Connected to name02:10000
19/03/13 13:32:32 [main]: WARN jdbc.HiveConnection: Failed to connect to name02:10000
19/03/13 13:32:32 [main]: WARN jdbc.HiveConnection: Could not open client transport with JDBC Uri: jdbc:hive2://name02:10000/default;password=root;serviceDiscoveryMode=zooKeeper;user=root;zooKeeperNamespace=hiveserver2: Failed to open new session: java.lang.IllegalArgumentException: Cannot modify mapred.job.name at runtime. It is not in list of params that are allowed to be modified at runtime Retrying 0 of 1
19/03/13 13:32:32 [main]: INFO jdbc.HiveConnection: Connected to data01:10000
19/03/13 13:32:32 [main]: WARN jdbc.HiveConnection: Failed to connect to data01:10000
19/03/13 13:32:32 [main]: ERROR jdbc.Utils: Unable to read HiveServer2 configs from ZooKeeper
Error: Could not open client transport for any of the Server URI's in ZooKeeper: Failed to open new session: java.lang.IllegalArgumentException: Cannot modify mapred.job.name at runtime. It is not in list of params that are allowed to be modified at runtime (state=08S01,code=0)
beeline> USE default;
No current connection

[2019-03-13 13:32:33,037] {models.py:1817} INFO - All retries failed; marking task as FAILED
[2019-03-13 13:32:33,546] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create Traceback (most recent call last):
[2019-03-13 13:32:33,546] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create   File "/usr/bin/airflow", line 32, in <module>
[2019-03-13 13:32:33,547] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create     args.func(args)
[2019-03-13 13:32:33,547] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create   File "/usr/lib/python2.7/site-packages/airflow/utils/cli.py", line 74, in wrapper
[2019-03-13 13:32:33,547] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create     return f(*args, **kwargs)
[2019-03-13 13:32:33,547] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create   File "/usr/lib/python2.7/site-packages/airflow/bin/cli.py", line 526, in run
[2019-03-13 13:32:33,547] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create     _run(args, dag, ti)
[2019-03-13 13:32:33,548] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create   File "/usr/lib/python2.7/site-packages/airflow/bin/cli.py", line 445, in _run
[2019-03-13 13:32:33,548] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create     pool=args.pool,
[2019-03-13 13:32:33,548] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create   File "/usr/lib/python2.7/site-packages/airflow/utils/db.py", line 73, in wrapper
[2019-03-13 13:32:33,548] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create     return func(*args, **kwargs)
[2019-03-13 13:32:33,548] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create   File "/usr/lib/python2.7/site-packages/airflow/models.py", line 1657, in _run_raw_task
[2019-03-13 13:32:33,549] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create     result = task_copy.execute(context=context)
[2019-03-13 13:32:33,549] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create   File "/usr/lib/python2.7/site-packages/airflow/operators/hive_operator.py", line 134, in execute
[2019-03-13 13:32:33,549] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create     self.hook.run_cli(hql=self.hql, schema=self.schema, hive_conf=self.hiveconfs)
[2019-03-13 13:32:33,549] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create   File "/usr/lib/python2.7/site-packages/airflow/hooks/hive_hooks.py", line 255, in run_cli
[2019-03-13 13:32:33,549] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create     raise AirflowException(stdout)
[2019-03-13 13:32:33,550] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create airflow.exceptions.AirflowException: Connecting to jdbc:hive2://name01.excard.co.kr:2181,name02.excard.co.kr:2181,data01.excard.co.kr:2181/default;password=root;serviceDiscoveryMode=zooKeeper;user=root;zooKeeperNamespace=hiveserver2
[2019-03-13 13:32:33,550] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create 19/03/13 13:32:32 [main]: INFO jdbc.HiveConnection: Connected to name02:10000
[2019-03-13 13:32:33,550] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create 19/03/13 13:32:32 [main]: WARN jdbc.HiveConnection: Failed to connect to name02:10000
[2019-03-13 13:32:33,551] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create 19/03/13 13:32:32 [main]: WARN jdbc.HiveConnection: Could not open client transport with JDBC Uri: jdbc:hive2://name02:10000/default;password=root;serviceDiscoveryMode=zooKeeper;user=root;zooKeeperNamespace=hiveserver2: Failed to open new session: java.lang.IllegalArgumentException: Cannot modify mapred.job.name at runtime. It is not in list of params that are allowed to be modified at runtime Retrying 0 of 1
[2019-03-13 13:32:33,551] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create 19/03/13 13:32:32 [main]: INFO jdbc.HiveConnection: Connected to data01:10000
[2019-03-13 13:32:33,551] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create 19/03/13 13:32:32 [main]: WARN jdbc.HiveConnection: Failed to connect to data01:10000
[2019-03-13 13:32:33,552] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create 19/03/13 13:32:32 [main]: ERROR jdbc.Utils: Unable to read HiveServer2 configs from ZooKeeper
[2019-03-13 13:32:33,552] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create Error: Could not open client transport for any of the Server URI's in ZooKeeper: Failed to open new session: java.lang.IllegalArgumentException: Cannot modify mapred.job.name at runtime. It is not in list of params that are allowed to be modified at runtime (state=08S01,code=0)
[2019-03-13 13:32:33,552] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create beeline> USE default;
[2019-03-13 13:32:33,552] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create No current connection
[2019-03-13 13:32:33,552] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create 
[2019-03-13 13:32:35,201] {logging_mixin.py:95} INFO - [2019-03-13 13:32:35,201] {jobs.py:2527} INFO - Task exited with return code 1

Please help me solve this problem.

UPDATE:


I added hive.security.authorization.sqlstd.confwhitelist.append : mapred.job.name* in hive-site.xml.


So now I get a slightly different error:


[2019-03-13 14:54:31,946] {models.py:1593} INFO - Executing <Task(HiveOperator): hive_table_create> on 2019-03-13T00:00:00+00:00
[2019-03-13 14:54:31,947] {base_task_runner.py:118} INFO - Running: ['bash', '-c', u'airflow run hive_test hive_table_create 2019-03-13T00:00:00+00:00 --job_id 11 --raw -sd DAGS_FOLDER/hive_test.py --cfg_path /tmp/tmpGDjT7j']
[2019-03-13 14:54:33,793] {base_task_runner.py:101} INFO - Job 11: Subtask hive_table_create [2019-03-13 14:54:33,792] {__init__.py:51} INFO - Using executor SequentialExecutor
[2019-03-13 14:54:34,189] {base_task_runner.py:101} INFO - Job 11: Subtask hive_table_create [2019-03-13 14:54:34,189] {models.py:273} INFO - Filling up the DagBag from /root/airflow/dags/hive_test.py
[2019-03-13 14:54:34,192] {base_task_runner.py:101} INFO - Job 11: Subtask hive_table_create /usr/lib/python2.7/site-packages/airflow/utils/helpers.py:356: DeprecationWarning: Importing 'BashOperator' directly from 'airflow.operators' has been deprecated. Please import from 'airflow.operators.[operator_module]' instead. Support for direct imports will be dropped entirely in Airflow 2.0.
[2019-03-13 14:54:34,193] {base_task_runner.py:101} INFO - Job 11: Subtask hive_table_create   DeprecationWarning)
[2019-03-13 14:54:34,195] {base_task_runner.py:101} INFO - Job 11: Subtask hive_table_create /usr/lib/python2.7/site-packages/airflow/utils/helpers.py:356: DeprecationWarning: Importing 'HiveOperator' directly from 'airflow.operators' has been deprecated. Please import from 'airflow.operators.[operator_module]' instead. Support for direct imports will be dropped entirely in Airflow 2.0.
[2019-03-13 14:54:34,195] {base_task_runner.py:101} INFO - Job 11: Subtask hive_table_create   DeprecationWarning)
[2019-03-13 14:54:34,219] {base_task_runner.py:101} INFO - Job 11: Subtask hive_table_create [2019-03-13 14:54:34,218] {cli.py:520} INFO - Running <TaskInstance: hive_test.hive_table_create 2019-03-13T00:00:00+00:00 [running]> on host name02.excard.co.kr
[2019-03-13 14:54:34,240] {hive_operator.py:118} INFO - Executing: CREATE TABLE aaaaa AS SELECT * FROM ant_code;
[2019-03-13 14:54:34,249] {logging_mixin.py:95} INFO - [2019-03-13 14:54:34,249] {base_hook.py:83} INFO - Using connection to: id: hive_jdbc. Host: jdbc:hive2://192.168.0.202:10000/big_info, Port: None, Schema: None, Login: hive, Password: XXXXXXXX, extra: {u'extra__jdbc__drv_path': u'/usr/hdp/3.1.0.0-78/hive/jdbc/hive-jdbc-3.1.0.3.1.0.0-78-standalone.jar', u'extra__google_cloud_platform__scope': u'', u'extra__google_cloud_platform__project': u'', u'extra__google_cloud_platform__key_path': u'', u'extra__jdbc__drv_clsname': u'org.apache.hive.jdbc.HiveDriver', u'extra__google_cloud_platform__keyfile_dict': u''}
[2019-03-13 14:54:34,251] {hive_operator.py:133} INFO - Passing HiveConf: {'airflow.ctx.task_id': 'hive_table_create', 'airflow.ctx.dag_id': 'hive_test', 'airflow.ctx.execution_date': '2019-03-13T00:00:00+00:00', 'airflow.ctx.dag_run_id': u'scheduled__2019-03-13T00:00:00+00:00'}
[2019-03-13 14:54:34,253] {logging_mixin.py:95} INFO - [2019-03-13 14:54:34,252] {hive_hooks.py:236} INFO - hive -hiveconf airflow.ctx.task_id=hive_table_create -hiveconf airflow.ctx.dag_id=hive_test -hiveconf airflow.ctx.execution_date=2019-03-13T00:00:00+00:00 -hiveconf airflow.ctx.dag_run_id=scheduled__2019-03-13T00:00:00+00:00 -hiveconf mapred.job.name=Airflow HiveOperator task for name02.hive_test.hive_table_create.2019-03-13T00:00:00+00:00 -f /tmp/airflow_hiveop_wNbQlL/tmpFN6MGy
[2019-03-13 14:54:39,061] {logging_mixin.py:95} INFO - [2019-03-13 14:54:39,060] {hive_hooks.py:251} INFO - Connecting to jdbc:hive2://name01.excard.co.kr:2181,name02.excard.co.kr:2181,data01.excard.co.kr:2181/default;password=root;serviceDiscoveryMode=zooKeeper;user=root;zooKeeperNamespace=hiveserver2
[2019-03-13 14:54:39,443] {logging_mixin.py:95} INFO - [2019-03-13 14:54:39,443] {hive_hooks.py:251} INFO - 19/03/13 14:54:39 [main]: INFO jdbc.HiveConnection: Connected to data01:10000
[2019-03-13 14:54:39,532] {logging_mixin.py:95} INFO - [2019-03-13 14:54:39,532] {hive_hooks.py:251} INFO - 19/03/13 14:54:39 [main]: WARN jdbc.HiveConnection: Failed to connect to data01:10000
[2019-03-13 14:54:39,552] {logging_mixin.py:95} INFO - [2019-03-13 14:54:39,551] {hive_hooks.py:251} INFO - 19/03/13 14:54:39 [main]: WARN jdbc.HiveConnection: Could not open client transport with JDBC Uri: jdbc:hive2://data01:10000/default;password=root;serviceDiscoveryMode=zooKeeper;user=root;zooKeeperNamespace=hiveserver2: Failed to open new session: java.lang.IllegalArgumentException: Cannot modify airflow.ctx.task_id at runtime. It is not in list of params that are allowed to be modified at runtime Retrying 0 of 1
[2019-03-13 14:54:39,664] {logging_mixin.py:95} INFO - [2019-03-13 14:54:39,664] {hive_hooks.py:251} INFO - 19/03/13 14:54:39 [main]: INFO jdbc.HiveConnection: Connected to name02:10000
[2019-03-13 14:54:39,856] {logging_mixin.py:95} INFO - [2019-03-13 14:54:39,856] {hive_hooks.py:251} INFO - 19/03/13 14:54:39 [main]: WARN jdbc.HiveConnection: Failed to connect to name02:10000
[2019-03-13 14:54:41,134] {logging_mixin.py:95} INFO - [2019-03-13 14:54:41,134] {hive_hooks.py:251} INFO - 19/03/13 14:54:41 [main]: ERROR jdbc.Utils: Unable to read HiveServer2 configs from ZooKeeper
[2019-03-13 14:54:41,147] {logging_mixin.py:95} INFO - [2019-03-13 14:54:41,146] {hive_hooks.py:251} INFO - Error: Could not open client transport for any of the Server URI's in ZooKeeper: Failed to open new session: java.lang.IllegalArgumentException: Cannot modify airflow.ctx.task_id at runtime. It is not in list of params that are allowed to be modified at runtime (state=08S01,code=0)
[2019-03-13 14:54:41,167] {logging_mixin.py:95} INFO - [2019-03-13 14:54:41,167] {hive_hooks.py:251} INFO - beeline> USE default;
[2019-03-13 14:54:41,180] {logging_mixin.py:95} INFO - [2019-03-13 14:54:41,180] {hive_hooks.py:251} INFO - No current connection
[2019-03-13 14:54:41,253] {models.py:1788} ERROR - Connecting to jdbc:hive2://name01.excard.co.kr:2181,name02.excard.co.kr:2181,data01.excard.co.kr:2181/default;password=root;serviceDiscoveryMode=zooKeeper;user=root;zooKeeperNamespace=hiveserver2
19/03/13 14:54:39 [main]: INFO jdbc.HiveConnection: Connected to data01:10000
19/03/13 14:54:39 [main]: WARN jdbc.HiveConnection: Failed to connect to data01:10000
19/03/13 14:54:39 [main]: WARN jdbc.HiveConnection: Could not open client transport with JDBC Uri: jdbc:hive2://data01:10000/default;password=root;serviceDiscoveryMode=zooKeeper;user=root;zooKeeperNamespace=hiveserver2: Failed to open new session: java.lang.IllegalArgumentException: Cannot modify airflow.ctx.task_id at runtime. It is not in list of params that are allowed to be modified at runtime Retrying 0 of 1
19/03/13 14:54:39 [main]: INFO jdbc.HiveConnection: Connected to name02:10000
19/03/13 14:54:39 [main]: WARN jdbc.HiveConnection: Failed to connect to name02:10000
19/03/13 14:54:41 [main]: ERROR jdbc.Utils: Unable to read HiveServer2 configs from ZooKeeper
Error: Could not open client transport for any of the Server URI's in ZooKeeper: Failed to open new session: java.lang.IllegalArgumentException: Cannot modify airflow.ctx.task_id at runtime. It is not in list of params that are allowed to be modified at runtime (state=08S01,code=0)
beeline> USE default;
No current connection
Traceback (most recent call last):
  File "/usr/lib/python2.7/site-packages/airflow/models.py", line 1657, in _run_raw_task
    result = task_copy.execute(context=context)
  File "/usr/lib/python2.7/site-packages/airflow/operators/hive_operator.py", line 134, in execute
    self.hook.run_cli(hql=self.hql, schema=self.schema, hive_conf=self.hiveconfs)
  File "/usr/lib/python2.7/site-packages/airflow/hooks/hive_hooks.py", line 255, in run_cli
    raise AirflowException(stdout)
AirflowException: Connecting to jdbc:hive2://name01.excard.co.kr:2181,name02.excard.co.kr:2181,data01.excard.co.kr:2181/default;password=root;serviceDiscoveryMode=zooKeeper;user=root;zooKeeperNamespace=hiveserver2
19/03/13 14:54:39 [main]: INFO jdbc.HiveConnection: Connected to data01:10000
19/03/13 14:54:39 [main]: WARN jdbc.HiveConnection: Failed to connect to data01:10000
19/03/13 14:54:39 [main]: WARN jdbc.HiveConnection: Could not open client transport with JDBC Uri: jdbc:hive2://data01:10000/default;password=root;serviceDiscoveryMode=zooKeeper;user=root;zooKeeperNamespace=hiveserver2: Failed to open new session: java.lang.IllegalArgumentException: Cannot modify airflow.ctx.task_id at runtime. It is not in list of params that are allowed to be modified at runtime Retrying 0 of 1
19/03/13 14:54:39 [main]: INFO jdbc.HiveConnection: Connected to name02:10000
19/03/13 14:54:39 [main]: WARN jdbc.HiveConnection: Failed to connect to name02:10000
19/03/13 14:54:41 [main]: ERROR jdbc.Utils: Unable to read HiveServer2 configs from ZooKeeper
Error: Could not open client transport for any of the Server URI's in ZooKeeper: Failed to open new session: java.lang.IllegalArgumentException: Cannot modify airflow.ctx.task_id at runtime. It is not in list of params that are allowed to be modified at runtime (state=08S01,code=0)
beeline> USE default;
No current connection


Answer


Please add the parameters listed below to hive.security.authorization.sqlstd.confwhitelist.append:

airflow.ctx.dag_id
airflow.ctx.task_id
airflow.ctx.execution_date
airflow.ctx.dag_run_id
airflow.ctx.dag_owner
airflow.ctx.dag_email
mapred.job.name

or simply:

airflow.ctx.*
mapred.job.name




Airflow passes these params at runtime by default, so once they are whitelisted this should work.
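Applying this answer, the whitelist append property can be set in hive-site.xml (or the hiveserver2-site override, depending on your distribution). A sketch under the assumption that the property value is interpreted as regexes joined with `|`; verify the exact syntax against your Hive version, and restart HiveServer2 after the change:

```xml
<property>
  <name>hive.security.authorization.sqlstd.confwhitelist.append</name>
  <!-- Regexes joined with "|": allows all airflow.ctx.* keys plus mapred.job.name -->
  <value>airflow\.ctx\..*|mapred\.job\.name</value>
</property>
```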
