2019-04-09 in class

Claudio Maggioni 2019-04-09 13:38:46 +02:00
parent 9ed14f6cb9
commit 0151274467
2 changed files with 29 additions and 4 deletions

@@ -216,7 +216,7 @@ Instead of using keys directly, the index returned by the hash function is used.
**Pigeon hole problem:** more keys (pigeons) in the universe than holes (indexes
of table T).

## The solution: the *chained hash table*
For every cell in table T, store not a True/False flag but a linked list of the
keys that hash to that index. This structure is called a *chained hash table*.
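As a concrete sketch (the `ChainedHashTable` class and its method names below
are invented for illustration, not taken from the lecture):

```python
class ChainedHashTable:
  def __init__(self, size):
    # Table T: one chain (here a Python list) per index.
    self.table = [[] for _ in range(size)]

  def _hash(self, key):
    # Any hash function works, as long as it is reduced to a valid index.
    return hash(key) % len(self.table)

  def insert(self, key):
    chain = self.table[self._hash(key)]
    if key not in chain:  # keep each key at most once per chain
      chain.append(key)

  def search(self, key):
    # A key can only live in the chain at its hashed index.
    return key in self.table[self._hash(key)]
```

With a good hash function the chains stay short, so both operations run in
expected O(1 + *alpha*) time.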
@@ -244,14 +244,16 @@ If *n*, the number of elements that we want to store in the hash table, grows,
then *|T|* must also grow. *alpha* represents the time complexity of both
insertion and search.
## Growing a *chained hash table*
In order to grow a table, a new table must be created. The hash function (or
its range parameters) must be changed as well.
### Rehashing
*Rehashing* is the process of putting all the elements of the old table into
the new table according to the new hash function. The complexity is O(n), since
`Chained-hash-insert` is constant.
@@ -265,3 +267,16 @@ then the complexity of insertion is O(n^2) due to all the *rehashing* needed.
If the table size is doubled when *overflowing*, then the total cost of n
insertions becomes linear again (amortized constant time per insertion), at
the price of some extra memory.
# Binary search trees
Implementation of a dynamic set over a *totally ordered* universe of keys
(i.e. with an order relation that is total, antisymmetric, and transitive).
## Interface
- `Tree-Insert(T, k)` adds a key K to tree T;
- `Tree-Delete(T, k)` deletes the key K from tree T;
- `Tree-Search(T, k)` returns whether key K is in the tree T;
- `Tree-Minimum(T)` finds the smallest element in the tree;
- `Tree-Maximum(T)` finds the biggest element in the tree;
- `Tree-Successor(T, k)` and `Tree-Predecessor(T, k)` find the keys that come
  immediately after and before K in sorted order (a sketch of some of these
  operations follows the `tree.py` listing below).

tree.py (new file, 10 lines)

@@ -0,0 +1,10 @@
#!/usr/bin/env python3
# vim: set ts=2 sw=2 et tw=80:
# A node of a binary search tree: a key plus links to its left and right
# children (None when the child is absent).
class Node:
  def __init__(self, k):
    self.key = k
    self.left = None
    self.right = None
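
# What follows is a hypothetical continuation (not part of this commit):
# a sketch of some interface operations from the notes, implemented on
# top of Node. Duplicate keys are ignored.

def tree_insert(root, k):
  # Insert key k and return the (possibly new) root of the subtree.
  if root is None:
    return Node(k)
  if k < root.key:
    root.left = tree_insert(root.left, k)
  elif k > root.key:
    root.right = tree_insert(root.right, k)
  return root

def tree_search(root, k):
  # Return True iff key k occurs in the tree rooted at root.
  if root is None:
    return False
  if k == root.key:
    return True
  if k < root.key:
    return tree_search(root.left, k)
  return tree_search(root.right, k)

def tree_minimum(root):
  # The smallest key lives in the leftmost node.
  while root.left is not None:
    root = root.left
  return root.key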