Is there any way to check whether the Hadoop file is already opened for write?


Question


Multiple Java instances are running on my machine and I want to check whether the Hadoop file is already opened in write (fs.create(file) or fs.append(file)) mode in any of the instances.

I looked in the FileStatus of the Hadoop file but found nothing that indicates this.

Is there any way to check whether the Hadoop file is already opened for write?


One way is to try to create/append the file again and catch the exception, but I have thousands of files and don't want to try every one. Also, if the create/append succeeds, I then have to close the file, and its lastModifiedTime will change. I don't want to modify the FileStatus of a Hadoop file.
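For comparison, a minimal sketch of that rejected probe approach, assuming an already-configured HDFS cluster (the path is illustrative, and the exact exception is an assumption: when another writer holds the lease, the NameNode typically rejects the append with an AlreadyBeingCreatedException wrapped in a RemoteException):

```java
import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hdfs.protocol.AlreadyBeingCreatedException;
import org.apache.hadoop.ipc.RemoteException;

public class AppendProbe {
    public static void main(String[] args) throws IOException {
        Configuration conf = new Configuration();
        Path path = new Path("/path/to/hadoop/file"); // illustrative path
        FileSystem fs = path.getFileSystem(conf);
        try (FSDataOutputStream out = fs.append(path)) {
            // Append succeeded: no other writer held the lease.
            // Downside: closing this stream changes lastModifiedTime.
        } catch (RemoteException e) {
            if (e.unwrapRemoteException() instanceof AlreadyBeingCreatedException) {
                System.out.println("File " + path + " is already open for write elsewhere");
            } else {
                throw e;
            }
        }
    }
}
```

This illustrates why the probe is unattractive: it costs one RPC round trip per file and mutates metadata on every file that is not currently open.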

Answer


DistributedFileSystem provides the method isFileClosed(path) to check this: it returns true when the file has been closed, so a false result means the file is still open for write.

import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hdfs.DistributedFileSystem;

try {
    Configuration conf = new Configuration();
    Path path = new Path("/path/to/hadoop/file");
    FileSystem fs = path.getFileSystem(conf);

    // isFileClosed() is HDFS-specific, so check that the path
    // really resolves to a DistributedFileSystem before casting
    if (fs instanceof DistributedFileSystem) {
        DistributedFileSystem dfs = (DistributedFileSystem) fs;
        if (dfs.exists(path) && !dfs.isFileClosed(path)) {
            System.out.println("File " + path + " already opened in write mode");
        }
    }
} catch (IOException e) {
    e.printStackTrace();
}
