From 9ed14f6cb9215e9982e95227bc7a394fece19421 Mon Sep 17 00:00:00 2001
From: Claudio Maggioni
Date: Tue, 9 Apr 2019 11:20:37 +0200
Subject: [PATCH] Partial lesson 2019-04-09

---
 notes.md | 45 +++++++++++++++++++++++++++++++++++++++++----
 1 file changed, 41 insertions(+), 4 deletions(-)

diff --git a/notes.md b/notes.md
index cb61c61..2871c25 100644
--- a/notes.md
+++ b/notes.md
@@ -143,7 +143,7 @@ def right(x):
 
 **Max heap property**: for all i > 1 A[parent(i)] >= A[i]
 
-# Data structures
+# Some data structures
 
 Way to organize information
 
@@ -160,7 +160,7 @@ A data structure has data and meta-data (like size, length).
 
 ## Queue (FIFO)
 
-## Structure
+### Structure
 
 - Based on array
 - `length`
@@ -175,7 +175,7 @@ Data structure for fast search
 
 A way to implement a dictionary is a *Direct-access table*
 
-## API of dictionary
+### API of dictionary
 
 - `Insert(D, k)` inserts a key `k` into dictionary `D`
 - `Delete(D, k)` removes key `k`
@@ -183,12 +183,14 @@
 
 Many different implementations
 
-## Direct-access table
+# Direct-access tables
 
 - universe of keys = {1,2,...,M}
 - array `T` of size M
 - each key has its own position in T
 
+## The 'dumb' approach
+
 ```python
 def Insert(D, x):
 
@@ -228,3 +230,38 @@ def Chained_hash_search(T, k):
     return List_search(T[hash(k)], k)
 ```
 
+
+Elements are spread evenly across the table if the hash function is good.
+
+*alpha* = *n / |T|*
+
+is the average length of the linked lists inside
+the table (where *n* is the number of elements in the table and *|T|* is the
+size of the table).
+
+A good hash table implementation keeps *alpha* at O(1).
+If *n*, the number of elements that we want to store in the hash table, grows,
+then *|T|* must also grow. *alpha* gives the average-case time complexity of
+both insertion and search.
+
+## Growing a Chained hash table
+
+In order to grow a table, a new table must be created. The hash function (or its
+range parameters) must be changed as well.
+
+### Rehashing
+
+*Rehashing* is the process of putting all the elements of the old table into the
+new table according to the new hash function. The complexity is O(n), since each
+`Chained-hash-insert` takes constant time.
+
+### Growing the table every time
+
+If the table is grown by a constant number of slots every time it *overflows*,
+then the total cost of n insertions is O(n^2) due to all the *rehashing* needed.
+
+### Growing the table by doubling the size
+
+If the table size is doubled on *overflow*, then the total cost of n insertions
+becomes linear again (amortized O(1) per insertion), at the cost of some memory.
+
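The direct-access table described in the patch (the 'dumb' approach) can be sketched as follows. This is a minimal sketch, not the notes' own code: the `da_*` names and the fixed universe size `M` are assumptions. Key `k` simply occupies slot `k - 1` of the array.

```python
# Minimal direct-access table sketch (hypothetical names), assuming
# a key universe {1, ..., M}: slot k - 1 of T is reserved for key k.
M = 10

def da_make():
    return [None] * M            # O(M) space, even when empty

def da_insert(T, k):
    T[k - 1] = k                 # O(1): key k has its own position

def da_delete(T, k):
    T[k - 1] = None              # O(1)

def da_search(T, k):
    return T[k - 1] is not None  # O(1), no hashing involved
```

Every operation is O(1), but the table always occupies O(M) space, which is the 'dumb' part whenever M is much larger than the number of keys actually stored.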
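The chained-hashing scheme above can be sketched like this. It is a sketch under two assumptions not stated in the notes: division hashing (`k mod |T|`) as the hash function, and Python lists standing in for the linked lists (so the notes' `List_search` helper is replaced by a plain chain scan).

```python
def make_table(size):
    # one (initially empty) chain per slot
    return [[] for _ in range(size)]

def chained_hash_insert(T, k):
    # head insertion into k's chain: O(1)
    T[k % len(T)].insert(0, k)

def chained_hash_search(T, k):
    # scan only k's chain: O(1 + alpha) on average
    return k in T[k % len(T)]

def load_factor(T):
    # alpha = n / |T|, the average chain length
    return sum(len(chain) for chain in T) / len(T)
```

With a good hash function the chains stay near length *alpha*; with a bad one (e.g. all keys congruent mod `|T|`) everything lands in one chain and search degrades to O(n).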
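The doubling strategy can be sketched on top of the same chained scheme. This is a sketch under the assumption that "overflow" means *alpha* reaching 1 (element count equal to table size) and that division hashing is used; the function names are hypothetical.

```python
def rehash(T, new_size):
    # O(n): reinsert every stored key under the new table size
    new = [[] for _ in range(new_size)]
    for chain in T:
        for k in chain:
            new[k % new_size].insert(0, k)
    return new

def insert_doubling(T, n, k):
    # double |T| once the element count n reaches |T| (alpha == 1),
    # then insert k at the head of its chain as usual
    if n == len(T):
        T = rehash(T, 2 * len(T))
    T[k % len(T)].insert(0, k)
    return T, n + 1
```

Rehashes now happen only at sizes |T|, 2|T|, 4|T|, ..., so the total rehashing work over n insertions is a geometric sum bounded by O(n): linear overall, amortized O(1) per insertion, at the price of a table that can be up to twice as large as needed.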