Structural Optimization of HashMap in JDK 7 and JDK 8


JDK 8 made substantial changes and optimizations to HashMap. The earlier implementation was based on hash mapping plus a load factor: each bucket heads a linked list, and when the hash distribution is uneven, many keys end up chained in the same bucket. Once the number of elements reaches the threshold, the map is considered crowded, collisions become more likely, and a rehash is triggered.

The JDK 7 implementation relies on a few key fields:

threshold: the resize threshold, i.e. capacity * loadFactor; it is recomputed from the doubled capacity on every resize.

loadFactor: the load factor, 0.75f by default.

size: the actual number of key-value pairs in the map.

capacity: the size of the map's Entry[] array, 16 initially.
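To make these defaults concrete, here is a minimal sketch of my own (not JDK source) showing how capacity, loadFactor and threshold relate:

    // A minimal illustration (not JDK source) of how the default values fit together.
    class FieldsSketch {
        static final int DEFAULT_INITIAL_CAPACITY = 16;  // capacity of the Entry[] table
        static final float DEFAULT_LOAD_FACTOR = 0.75f;  // loadFactor

        public static void main(String[] args) {
            int threshold = (int) (DEFAULT_INITIAL_CAPACITY * DEFAULT_LOAD_FACTOR);
            System.out.println(threshold); // 12: once size reaches this, the table is doubled
        }
    }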

Briefly, when the map is initialized the Entry[] table array has a size of 16 and the load factor is 0.75f. On put, the key's hash is computed first, and the bucket index is derived from it as i = hash & (table.length - 1). The linked list at that bucket is then traversed; if an entry with the same key is found, its value is replaced and the old value is returned. If no such entry exists, a new Entry is created, but before creating it the following check is performed:

    public V put(K key, V value) {
        if (key == null)
            return putForNullKey(value);
        int hash = hash(key);
        int i = indexFor(hash, table.length);
        for (Entry<K, V> e = table[i]; e != null; e = e.next) {
            Object k;
            if (e.hash == hash && ((k = e.key) == key || key.equals(k))) {
                V oldValue = e.value;
                e.value = value;
                e.recordAccess(this);
                return oldValue;
            }
        }
        modCount++;
        addEntry(hash, key, value, i);
        return null;
    }
    void addEntry(int hash, K key, V value, int bucketIndex) {
        if ((size >= threshold) && (null != table[bucketIndex])) {
            resize(2 * table.length);
            hash = (null != key) ? hash(key) : 0;
            bucketIndex = indexFor(hash, table.length);
        }
        createEntry(hash, key, value, bucketIndex);
    }

addEntry first checks whether the number of elements has reached the threshold (capacity * loadFactor) and the target bucket is already occupied. If so, the table capacity is doubled (doubling minimizes the number of entries that have to move), all existing entries are rehashed into the new table using head insertion, and the new element is then also inserted at the head of the list at bucket index i; head insertion avoids traversing the list again. Resizing keeps the average chain length short, but it cannot help in the extreme case where many colliding keys all map to the same bucket.
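The rehash itself is done by a transfer step. The sketch below (simplified, not verbatim JDK 7 source) shows the head-insertion pattern: each entry is unlinked from its old bucket and pushed onto the front of its new bucket, so the relative order within a bucket ends up reversed.

    // Simplified sketch of JDK 7-style transfer with head insertion
    // (assumes an Entry node with hash, key, value and next fields, and a table field).
    void transfer(Entry<K, V>[] newTable) {
        int newCapacity = newTable.length;
        for (Entry<K, V> e : table) {                 // walk every bucket of the old table
            while (e != null) {
                Entry<K, V> next = e.next;            // remember the rest of the old chain
                int i = e.hash & (newCapacity - 1);   // recompute the bucket index
                e.next = newTable[i];                 // head insertion: link to the current head...
                newTable[i] = e;                      // ...and become the new head
                e = next;
            }
        }
    }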


(The original post included a figure here: a rough, purely illustrative depiction of the JDK 7 HashMap structure. It was not exact; the number of hash buckets and the positions after rehashing were not actually computed.)

Now let's look at the JDK 8 improvements to HashMap. The key fields remain the same; what changed is the structure. In short, a new TreeNode node type was added: when a bucket's linked list grows beyond a certain length, it is converted into a red-black tree, which brings the worst-case cost in such a bucket down to O(log N).

The newly added fields are:

    /**
     * The bin count threshold for using a tree rather than list for a
     * bin.  Bins are converted to trees when adding an element to a
     * bin with at least this many nodes. The value must be greater
     * than 2 and should be at least 8 to mesh with assumptions in
     * tree removal about conversion back to plain bins upon
     * shrinkage.
     */
    static final int TREEIFY_THRESHOLD = 8;

    /**
     * The bin count threshold for untreeifying a (split) bin during a
     * resize operation. Should be less than TREEIFY_THRESHOLD, and at
     * most 6 to mesh with shrinkage detection under removal.
     */
    static final int UNTREEIFY_THRESHOLD = 6;
As the comments show, TREEIFY_THRESHOLD controls when a bucket's linked list is converted into a red-black tree, and UNTREEIFY_THRESHOLD controls when a (split) tree bin is converted back into a plain list during a resize.
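One related constant is also worth mentioning: JDK 8 additionally defines MIN_TREEIFY_CAPACITY (64), and treeifyBin only converts a bin into a tree when the table has at least that many slots; for smaller tables it resizes instead. Roughly (paraphrased, not the verbatim source comment):

    /**
     * (Paraphrased) The smallest table capacity for which bins may be
     * treeified; below this, treeifyBin() resizes the table instead of
     * converting the crowded bin into a tree.
     */
    static final int MIN_TREEIFY_CAPACITY = 64;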

Let's start with the constructor.

    public HashMap() {
        this.loadFactor = DEFAULT_LOAD_FACTOR; // all other fields defaulted
    }
This only initializes the load factor to 0.75f; the rest of the initialization is deferred until the first put.
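A trivial usage example (my own illustration, not from the original post) shows the lazy allocation in action: the constructor allocates no table at all, and the 16-slot table with its threshold of 12 only comes into existence on the first put.

    import java.util.HashMap;
    import java.util.Map;

    public class LazyInitDemo {
        public static void main(String[] args) {
            Map<String, Integer> map = new HashMap<>(); // no table allocated yet
            map.put("answer", 42);        // first put calls resize(), which allocates
                                          // the 16-slot table and sets threshold to 12
            System.out.println(map.get("answer")); // prints 42
        }
    }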

    public V put(K key, V value) {
        return putVal(hash(key), key, value, false, true);
    }
    static final int hash(Object key) {
        int h;
        return (key == null) ? 0 : (h = key.hashCode()) ^ (h >>> 16);
    }
    final V putVal(int hash, K key, V value, boolean onlyIfAbsent,
                   boolean evict) {
        Node<K,V>[] tab; Node<K,V> p; int n, i;
        if ((tab = table) == null || (n = tab.length) == 0)
            n = (tab = resize()).length;
        if ((p = tab[i = (n - 1) & hash]) == null)
            tab[i] = newNode(hash, key, value, null);
        else {
            Node<K,V> e; K k;
            if (p.hash == hash &&
                ((k = p.key) == key || (key != null && key.equals(k))))
                e = p;
            else if (p instanceof TreeNode)
                e = ((TreeNode<K,V>)p).putTreeVal(this, tab, hash, key, value);
            else {
                for (int binCount = 0; ; ++binCount) {
                    if ((e = p.next) == null) {
                        p.next = newNode(hash, key, value, null);
                        if (binCount >= TREEIFY_THRESHOLD - 1) // -1 for 1st
                            treeifyBin(tab, hash);
                        break;
                    }
                    if (e.hash == hash &&
                        ((k = e.key) == key || (key != null && key.equals(k))))
                        break;
                    p = e;
                }
            }
            if (e != null) { // existing mapping for key
                V oldValue = e.value;
                if (!onlyIfAbsent || oldValue == null)
                    e.value = value;
                afterNodeAccess(e);
                return oldValue;
            }
        }
        ++modCount;
        if (++size > threshold)
            resize();
        afterNodeInsertion(evict);
        return null;
    }


    final Node<K,V>[] resize() {
        Node<K,V>[] oldTab = table;
        int oldCap = (oldTab == null) ? 0 : oldTab.length;
        int oldThr = threshold;
        int newCap, newThr = 0;
        if (oldCap > 0) {
            if (oldCap >= MAXIMUM_CAPACITY) {
                threshold = Integer.MAX_VALUE;
                return oldTab;
            }
            else if ((newCap = oldCap << 1) < MAXIMUM_CAPACITY &&
                     oldCap >= DEFAULT_INITIAL_CAPACITY)
                newThr = oldThr << 1; // double threshold
        }
        else if (oldThr > 0) // initial capacity was placed in threshold
            newCap = oldThr;
        else {               // zero initial threshold signifies using defaults
            newCap = DEFAULT_INITIAL_CAPACITY;
            newThr = (int)(DEFAULT_LOAD_FACTOR * DEFAULT_INITIAL_CAPACITY);
        }
        if (newThr == 0) {
            float ft = (float)newCap * loadFactor;
            newThr = (newCap < MAXIMUM_CAPACITY && ft < (float)MAXIMUM_CAPACITY ?
                      (int)ft : Integer.MAX_VALUE);
        }
        threshold = newThr;
        @SuppressWarnings({"rawtypes","unchecked"})
            Node<K,V>[] newTab = (Node<K,V>[])new Node[newCap];
        table = newTab;
        if (oldTab != null) {
            for (int j = 0; j < oldCap; ++j) {
                Node<K,V> e;
                if ((e = oldTab[j]) != null) {
                    oldTab[j] = null;
                    if (e.next == null)
                        newTab[e.hash & (newCap - 1)] = e;
                    else if (e instanceof TreeNode)
                        ((TreeNode<K,V>)e).split(this, newTab, j, oldCap);
                    else { // preserve order
                        Node<K,V> loHead = null, loTail = null;
                        Node<K,V> hiHead = null, hiTail = null;
                        Node<K,V> next;
                        do {
                            next = e.next;
                            if ((e.hash & oldCap) == 0) {
                                if (loTail == null)
                                    loHead = e;
                                else
                                    loTail.next = e;
                                loTail = e;
                            }
                            else {
                                if (hiTail == null)
                                    hiHead = e;
                                else
                                    hiTail.next = e;
                                hiTail = e;
                            }
                        } while ((e = next) != null);
                        if (loTail != null) {
                            loTail.next = null;
                            newTab[j] = loHead;
                        }
                        if (hiTail != null) {
                            hiTail.next = null;
                            newTab[j + oldCap] = hiHead;
                        }
                    }
                }
            }
        }
        return newTab;
    }


On the first put, the Node[] table array is null, so it is initialized to a length of 16 and the threshold is set to 16 * 0.75f = 12; every time the array is doubled, the threshold is recomputed. After this initialization, the hash is computed and, just as in JDK 7, hash & (table.length - 1) gives the array index; a list node is created and its reference is stored in table[i]. On each subsequent put, if slot i is empty, a new node is created and stored in table[i]. Otherwise, if table[i] has the same hash and an equal key as the node being put, its value is simply replaced. If not, the type of table[i] is examined: if it is a tree node, putTreeVal is called to insert the node into the tree; if it is a list node, the list hanging off table[i] is traversed and the new node is appended at the tail. If the chain has now grown to TREEIFY_THRESHOLD (8), treeifyBin is called to convert the list into a red-black tree; otherwise the bucket simply remains a list.
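As a small worked example (the key "foo" and the 16-slot table are illustrative choices of mine, not from the post), the hash spreading and index computation look like this:

    public class IndexDemo {
        public static void main(String[] args) {
            int h = "foo".hashCode();        // 101574
            int spread = h ^ (h >>> 16);     // 101574 ^ 1 = 101575: XOR the high 16 bits downward
            int index = (16 - 1) & spread;   // keep the low 4 bits for a 16-slot table
            System.out.println(index);       // prints 7, i.e. bucket 7
        }
    }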
    if (++size > threshold)
        resize();

Finally, putVal checks whether the total number of elements now exceeds the threshold; if so, resize() doubles the array. Replacing long lists with trees improves the worst-case complexity of the JDK 7 HashMap from O(n) to O(log n).
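To see why this matters, consider a deliberately bad key type whose hashCode always collides (a hypothetical example of mine, not from the post): in JDK 7 every lookup in that bucket degrades to an O(n) list scan, while in JDK 8 the bucket is treeified once it passes TREEIFY_THRESHOLD and lookups stay around O(log n). Implementing Comparable lets the tree order the keys directly instead of falling back to tie-breaking.

    import java.util.HashMap;
    import java.util.Map;

    public class CollisionDemo {
        // Every BadKey reports the same hashCode, forcing all entries into one bucket.
        static final class BadKey implements Comparable<BadKey> {
            final int id;
            BadKey(int id) { this.id = id; }
            @Override public int hashCode() { return 42; }
            @Override public boolean equals(Object o) {
                return o instanceof BadKey && ((BadKey) o).id == id;
            }
            @Override public int compareTo(BadKey other) { return Integer.compare(id, other.id); }
        }

        public static void main(String[] args) {
            Map<BadKey, Integer> map = new HashMap<>();
            for (int i = 0; i < 10_000; i++) {
                map.put(new BadKey(i), i);
            }
            // In JDK 8 this lookup walks a red-black tree, not a 10,000-node list.
            System.out.println(map.get(new BadKey(9_999))); // prints 9999
        }
    }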

(The original post also included a diagram of the JDK 8 HashMap found online as a supplementary illustration; it is not reproduced here.)



