Error logging in Python not working with Azure Databricks


Problem description

A question related to this problem was asked before but never answered.

I tried implementing error logging with Python in Azure Databricks. If I run the code below in plain Python (PyCharm) it works as expected, but when I run the same code in Azure Databricks (Python) it does not create the file and does not write anything into it. I am trying to create the file in Azure Data Lake Gen2, and I have given the path via the mount point of the Data Lake Store Gen2.

Can you please help explain why the Python code is not working as expected in Azure Databricks (Python)?

# importing module
import logging

dbutils.fs.mkdirs('/dbfs/mnt/sales/region/country/sample/newfile.txt')

# Create and configure logger
logging.basicConfig(filename="/dbfs/mnt/sales/region/country/sample/newfile.txt",
                    format='%(asctime)s %(message)s',
                    filemode='a')

# Creating an object
logger = logging.getLogger()

# Setting the threshold of logger to DEBUG
logger.setLevel(logging.DEBUG)

# Test messages
logger.debug("Harmless debug Message")
logger.info("Just an information")
logger.warning("Its a Warning")
logger.error("Did you try to divide by zero")
logger.critical("Internet is down")


If I open the file I expect the output below, which is what happens with plain Python but not with Azure Databricks (Python):

2019-06-06 00:19:23,881 Harmless debug Message
2019-06-06 00:19:23,881 Just an information
2019-06-06 00:19:23,881 Its a Warning
2019-06-06 00:19:23,881 Did you try to divide by zero
2019-06-06 00:19:23,881 Internet is down
2019-06-06 00:19:33,447 Harmless debug Message
2019-06-06 00:19:33,447 Just an information
2019-06-06 00:19:33,447 Its a Warning
2019-06-06 00:19:33,447 Did you try to divide by zero
2019-06-06 00:19:33,447 Internet is down

Recommended answer

In Databricks, you have mounted a Blob storage container (or an ADLS Gen2 file system) at the path /dbfs/mnt/sales. You cannot make random writes to files backed by blob storage, and the Python logging library simply fails silently.

https://docs.databricks.com/data/databricks-file-system.html#local-file-api-limitations

To test this:

# this works
with open('/dbfs/mnt/container-name/my-app.log', 'w') as fid:
    fid.write('this is a message')

# this fails, since append to existing file is a random write operation
with open('/dbfs/mnt/container-name/my-app.log', 'a') as fid:
    fid.write('this message will not work')
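
One way to keep using the logging module despite this limitation is to write the log to the driver's local disk and then copy the finished file onto the mount as a single whole-file write. This is a minimal sketch, not part of the original answer; the /tmp path and the /dbfs/mnt/sales target are only illustrative:

# illustrative workaround: log to the driver's local filesystem, then copy the
# whole file to the mounted storage (a full-file copy is not a random write)
import logging
import shutil

local_log = '/tmp/my-app.log'               # local driver disk (assumed path)
mounted_log = '/dbfs/mnt/sales/my-app.log'  # mounted storage (assumed path)

logging.basicConfig(filename=local_log,
                    format='%(asctime)s %(message)s',
                    filemode='a',
                    level=logging.DEBUG)
logger = logging.getLogger()

logger.debug("Harmless debug Message")
logger.error("Did you try to divide by zero")

# flush and close all handlers, then push the complete file to the mount
logging.shutdown()
shutil.copy(local_log, mounted_log)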

