Please help with the questions below:
- How good is the provided hash function--are we really getting constant time operations with our hashmap? Explain.
- What is one other way you could implement the hash function? Anything creative is acceptable in this answer, but justify how your hashing function works.
- If I have to resize the hashmap to add more buckets, what is the Big-O complexity when you resize? (Explain why)
- What is 'open addressing' in regards to hash maps and hash tables?
What is a Hash Function?
A hash function converts a given large key, such as a phone number, into a small practical integer value. The resulting integer is used as an index in the hash table. In simple terms, a hash function maps a big number or string to a small number that can be used as the index into the hash table.
What is meant by a Good Hash Function?
A good hash function should have the following properties:
- Efficiently computable.
- Should uniformly distribute the keys (each table position equally likely for each key).
- For example: for phone numbers, a bad hash function is to take the first three digits, while a better function is to take the last three digits (see the sketch after this list). Please note that this may not be the best hash function; there may be better approaches.
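As a quick illustration, here is a minimal sketch (Python is assumed, and the sample phone numbers are made up for illustration) of why the last three digits spread keys better than the first three, which are shared by every number in the same area code:

```python
# Hypothetical phone numbers sharing an area code and prefix.
phones = [9875551234, 9875555678, 9875559012]

first_three = [int(str(p)[:3]) for p in phones]  # area code: identical for all
last_three = [p % 1000 for p in phones]          # line digits: well spread

print(first_three)  # [987, 987, 987] -- every key lands in the same bucket
print(last_three)   # [234, 678, 12] -- keys spread across buckets
```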
In practice, we can often use heuristic techniques to create a hash function that performs well. Qualitative information about the distribution of the keys may be useful in this design process. In general, a hash function should depend on every single bit of the key, so that two keys that differ in just one bit or one group of bits (whether the group is at the beginning, end, or middle of the key, or spread throughout it) hash into different values. Therefore, a hash function that simply extracts a portion of a key is not suitable. Likewise, if two keys are simply digit or character permutations of one another (for example, 139 and 319), they should also hash into different values.
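One common way to make the hash depend on every character and on its position is a polynomial rolling hash. The sketch below is illustrative only (Python assumed; the base and table size are arbitrary choices, not values from the text above):

```python
def polynomial_hash(key: str, table_size: int, base: int = 31) -> int:
    """Position-weighted hash: every character and its position affect the result."""
    h = 0
    for ch in key:
        h = (h * base + ord(ch)) % table_size
    return h

# Permuted keys such as 139 and 319 now land in different buckets.
print(polynomial_hash("139", 101))  # 45
print(polynomial_hash("319", 101))  # 87
```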
Two heuristic methods are hashing by division and hashing by multiplication, which are as follows:
- The mod method:
In this method of creating hash functions, we map a key into one of the slots of the table by taking the remainder of the key divided by table_size. That is, the hash function is
h(key) = key mod table_size
i.e., key % table_size
Since it requires only a single division operation, hashing by division is quite fast.
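As a minimal sketch (Python assumed), the division method is a one-liner; the key below is taken from the worked example further down:

```python
def division_hash(key: int, table_size: int) -> int:
    """Map a key to a slot index using the remainder of key / table_size."""
    return key % table_size

print(division_hash(37599, 17))  # 12, matching the worked example below
```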
When using the division method, we usually avoid certain values of table_size. For example, table_size should not be a power of a number, say r, since if table_size = r^p, then h(key) is just the p lowest-order bits of the key. Unless we know that all low-order p-bit patterns are equally likely, we are better off designing the hash function to depend on all the bits of the key.
It has been observed that the best results with the division method are achieved when the table size is prime. However, even if table_size is prime, an additional restriction is called for. If r is the number of possible character codes on a computer, and if table_size is a prime such that r % table_size equals 1, then the hash function h(key) = key % table_size is simply the sum of the binary representations of the characters in the key mod table_size.
Suppose r = 256 and table_size = 17, for which r % table_size, i.e., 256 % 17, equals 1.
So for key = 37599, its hash is
37599 % 17 = 12
However, for key = 573, the hash is also
573 % 17 = 12
Thus it can be seen that with this hash function, many keys can end up with the same hash. This is called a collision.
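The weakness above can be demonstrated directly. In the sketch below (Python assumed; the byte strings are made-up examples), keys are interpreted as base-256 integers, so with table_size = 17 the hash collapses to the sum of the character codes mod 17, and permuted keys collide:

```python
TABLE_SIZE = 17  # prime, but 256 % 17 == 1

def string_key(s: bytes) -> int:
    """Interpret a byte string as a base-256 integer, as described above."""
    key = 0
    for b in s:
        key = key * 256 + b
    return key

# Permutations of the same characters collide, and the hash equals the
# plain sum of the character codes mod TABLE_SIZE.
for s in (b"ab", b"ba"):
    print(s, string_key(s) % TABLE_SIZE, sum(s) % TABLE_SIZE)
# b'ab' 8 8
# b'ba' 8 8
```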
A prime not too close to an exact power of 2 is often a good choice for table_size.