Load a lot of properties files from a single text file and insert into a LinkedHashMap
Question
I have a file that lists properties files line by line, maybe around 1000 of them, and each properties file has around 5000 key-value pairs. For example, a sample abc.txt:
abc1.properties
abc2.properties
abc3.properties
abc4.properties
abc5.properties
So I open this file, and as each line is read I load that properties file in the loadProperties method, storing its key-value pairs in a LinkedHashMap.
import java.io.BufferedReader;
import java.io.FileNotFoundException;
import java.io.FileReader;
import java.io.IOException;
import java.io.InputStream;
import java.util.HashMap;
import java.util.LinkedHashMap;
import java.util.Properties;

public class Project {
    public static HashMap<String, String> hashMap;

    public static void main(String[] args) {
        BufferedReader br = null;
        hashMap = new LinkedHashMap<String, String>();
        try {
            br = new BufferedReader(new FileReader("C:\\apps\\apache\\tomcat7\\webapps\\examples\\WEB-INF\\classes\\abc.txt"));
            String line = null;
            while ((line = br.readLine()) != null) {
                loadProperties(line); // loads abc1.properties the first time
            }
        } catch (FileNotFoundException e1) {
            e1.printStackTrace();
        } catch (IOException e) {
            e.printStackTrace();
        } finally {
            try {
                if (br != null) {
                    br.close();
                }
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    }
    // I load each properties file in this method, and check whether the key
    // already exists in the hashMap. If it does, I concatenate the new value
    // with the previous value, and keep doing so every time a duplicate key
    // is found.
    private static void loadProperties(String line) {
        Properties prop = new Properties();
        InputStream in = Project.class.getResourceAsStream(line);
        String value = null;
        try {
            prop.load(in);
            for (Object str : prop.keySet()) {
                if (hashMap.containsKey(str.toString())) {
                    StringBuilder sb = new StringBuilder().append(hashMap.get(str)).append("-").append(prop.getProperty((String) str));
                    hashMap.put(str.toString(), sb.toString());
                } else {
                    value = prop.getProperty((String) str);
                    hashMap.put(str.toString(), value);
                    System.out.println(str + " - " + value);
                }
            }
        } catch (IOException e) {
            e.printStackTrace();
        } finally {
            try {
                if (in != null) {
                    in.close();
                }
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    }
}
So my question is: I have more than 1000 properties files, and each one has more than 5000 key-value pairs. Most of the files share the same keys but with different values, so when a key repeats I have to concatenate the new value with the previous one. Is there any limitation on the size of the LinkedHashMap as the number of properties files, and the key-value pairs in them, keeps growing? And is this code optimized enough to handle this kind of problem?
Solution

A Map has no limitation other than the size of the memory heap you allocate to your JVM, which you can control using the -Xmx option.
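To see how much heap the JVM actually received from -Xmx, Runtime.maxMemory() reports the limit at run time; a minimal sketch (the class name is illustrative):

```java
public class HeapCheck {
    // Reports the maximum heap the JVM will attempt to use, as bounded by -Xmx.
    static long maxHeapBytes() {
        return Runtime.getRuntime().maxMemory();
    }

    public static void main(String[] args) {
        System.out.println("Max heap (MB): " + maxHeapBytes() / (1024 * 1024));
    }
}
```

Running with, say, java -Xmx512m HeapCheck should report a value close to the requested limit.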
Your code is OK from a performance perspective, but I can suggest the following improvements.
Avoid using hashMap.containsKey(str.toString()) and then hashMap.get(str). containsKey(key) is implemented as return get(key) != null, so you actually call get() twice. You can say something like the following instead:

String value = map.get(key);
if (value != null) {
    map.put(key, value + "-" + newValue);
} else {
    map.put(key, newValue);
}
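On Java 8 and later, the lookup-then-concatenate pattern above can be collapsed into a single call with Map.merge, which does one lookup and applies the concatenation only when the key already exists; a small sketch (class and key names are illustrative):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class MergeDemo {
    // Inserts the value, or joins it onto an existing one, with a single map lookup.
    static void put(Map<String, String> map, String key, String value) {
        map.merge(key, value, (oldV, newV) -> oldV + "-" + newV);
    }

    public static void main(String[] args) {
        Map<String, String> map = new LinkedHashMap<>();
        put(map, "host", "abc1");
        put(map, "host", "abc2"); // duplicate key: values are joined with "-"
        System.out.println(map.get("host")); // prints "abc1-abc2"
    }
}
```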
Do not call str.toString(). This call just creates yet another String instance equal to the original one. Since the Properties class is not parameterized, use a cast instead, i.e. (String) str.
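An alternative that avoids both toString() and the cast is Properties.stringPropertyNames(), which returns the keys as a Set of String directly; a sketch (the helper method is illustrative):

```java
import java.util.Properties;

public class KeysDemo {
    // Iterates property keys as Strings, so no cast or toString() is needed.
    static String describe(Properties prop) {
        StringBuilder sb = new StringBuilder();
        for (String key : prop.stringPropertyNames()) {
            sb.append(key).append('=').append(prop.getProperty(key)).append('\n');
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        Properties p = new Properties();
        p.setProperty("a", "1");
        System.out.print(describe(p)); // prints "a=1"
    }
}
```

Note that stringPropertyNames() makes no ordering guarantee, just like keySet() on a Properties object.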
If you still have performance problems, you can merge all the properties files first and then load them once using Properties.load(). You will probably get some performance benefit.
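One caveat if you try this: when the same key appears in more than one of the merged files, Properties.load() keeps only the last value instead of concatenating, so this shortcut only preserves all the data when keys are unique across files. A small demonstration, using in-memory streams to stand in for the real files:

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.SequenceInputStream;
import java.nio.charset.StandardCharsets;
import java.util.Properties;

public class MergedLoadDemo {
    // Loads two property "files" through one concatenated stream;
    // duplicate keys overwrite rather than concatenate.
    static Properties loadMerged(String first, String second) throws IOException {
        Properties prop = new Properties();
        prop.load(new SequenceInputStream(
                new ByteArrayInputStream(first.getBytes(StandardCharsets.ISO_8859_1)),
                new ByteArrayInputStream(second.getBytes(StandardCharsets.ISO_8859_1))));
        return prop;
    }

    public static void main(String[] args) throws IOException {
        Properties p = loadMerged("host=abc1\n", "host=abc2\n");
        System.out.println(p.getProperty("host")); // prints "abc2": the first value is lost
    }
}
```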