Why does a HashMap rehash the hashcode supplied by the key object?


Question

I am reading the code of the HashMap class provided by the Java 1.6 API and am unable to fully understand the need for the following operation (found in the body of the put and get methods):

int hash = hash(key.hashCode());

where the method hash() has the following body:

private static int hash(int h) {
    h ^= (h >>> 20) ^ (h >>> 12);
    return h ^ (h >>> 7) ^ (h >>> 4);
}

This effectively recalculates the hash by executing bit operations on the supplied hashcode. I'm unable to understand the need to do so even though the API states it as follows:


This is critical because HashMap uses power-of-two length hash tables, that otherwise encounter collisions for hashCodes that do not differ in lower bits.
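
Concretely, with a power-of-two table the bucket index keeps only the low bits of the hashCode, so two hashCodes that differ only in their upper bits collide. A minimal sketch (the class name and sample values below are made up for illustration):

public class LowBitCollisionDemo {
    public static void main(String[] args) {
        int length = 16;                      // power-of-two table length, as in HashMap
        int a = 0x10000;                      // two hashCodes that differ only above the low 4 bits
        int b = 0x20000;

        // index = hashCode & (length - 1), so only the low 4 bits matter here
        System.out.println(a & (length - 1)); // 0
        System.out.println(b & (length - 1)); // 0 -> same bucket: a collision
    }
}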

I do understand that the key-value pairs are stored in an array of data structures, and that the index location of an item in this array is determined by its hash. What I fail to understand is how this function would add any value to the hash distribution.

Answer

As Helper wrote, it is there just in case the existing hash function for the key objects is faulty and does not do a good-enough job of mixing the lower bits. According to the source quoted by pgras,

 /**
  * Returns index for hash code h.
  */
 static int indexFor(int h, int length) {
     return h & (length-1);
 }

The hash is ANDed with length-1 (and because the table length is a power of two, length-1 is guaranteed to be a sequence of 1 bits). Because of this ANDing, only the lower bits of h are used; the rest of h is ignored. Imagine that, for whatever reason, the original hash only returns numbers divisible by 2. If it were used directly, the odd-numbered positions of the hashmap would never be used, doubling the number of collisions. In a truly pathological case, a bad hash function can make a hashmap behave more like a list than like an O(1) container.
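
As a rough sketch of this paragraph (the class name, table length, and loop bounds are arbitrary choices for the demo), the following code counts how many of 16 buckets are touched by a pathological hashCode that only returns even numbers, both with and without the JDK 1.6 bit-spreading function quoted above:

import java.util.HashSet;
import java.util.Set;

public class EvenHashDemo {
    // Copy of the JDK 1.6 supplemental hash, reproduced here for the demo.
    private static int hash(int h) {
        h ^= (h >>> 20) ^ (h >>> 12);
        return h ^ (h >>> 7) ^ (h >>> 4);
    }

    public static void main(String[] args) {
        int length = 16;                                  // power-of-two table length
        Set<Integer> rawBuckets = new HashSet<Integer>();
        Set<Integer> spreadBuckets = new HashSet<Integer>();

        for (int i = 0; i < 1000; i++) {
            int h = i * 2;                                // pathological hashCode: always even
            rawBuckets.add(h & (length - 1));             // index without the extra mixing
            spreadBuckets.add(hash(h) & (length - 1));    // index after the extra mixing
        }

        // Without mixing, only the 8 even-numbered buckets are ever used;
        // after mixing, the indices typically cover all 16 buckets.
        System.out.println("raw:    " + rawBuckets.size());
        System.out.println("spread: " + spreadBuckets.size());
    }
}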

Sun engineers must have run tests showing that too many hash functions are not random enough in their lower bits, and that many hashmaps are never large enough to use the higher bits. Under these circumstances, the bit operations in HashMap's hash(int h) can provide a net improvement for most expected use cases (due to lower collision rates), even though extra computation is required.
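
A classic example of hashCodes that are not random enough in their lower bits (later JDK versions cite sets of Float keys holding consecutive whole numbers) is Float, whose hashCode() returns the raw IEEE 754 bit pattern; for small whole-number values the low-order bits are all zero. A small illustrative sketch (class name and value range chosen arbitrarily):

import java.util.HashSet;
import java.util.Set;

public class FloatKeyDemo {
    // Copy of the JDK 1.6 supplemental hash, reproduced here for the demo.
    private static int hash(int h) {
        h ^= (h >>> 20) ^ (h >>> 12);
        return h ^ (h >>> 7) ^ (h >>> 4);
    }

    public static void main(String[] args) {
        int length = 16;                                  // HashMap's default table length
        Set<Integer> rawBuckets = new HashSet<Integer>();
        Set<Integer> spreadBuckets = new HashSet<Integer>();

        for (float f = 1.0f; f <= 64.0f; f += 1.0f) {
            int h = Float.valueOf(f).hashCode();          // == Float.floatToIntBits(f)
            rawBuckets.add(h & (length - 1));             // low bits are all zero -> bucket 0
            spreadBuckets.add(hash(h) & (length - 1));    // mixing pulls in the higher bits
        }

        System.out.println("raw:    " + rawBuckets.size()); // 1 (everything in one bucket)
        System.out.println("spread: " + spreadBuckets.size());
    }
}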
