


CS 217: Algorithm Design and Analysis, Fall 2019

Jiao Tong University


Time: every Monday and Thursday, 14:00 - 15:40, week 1 to 12.
Location: Shangyuan Building (上院), Room 409
Teacher: Dominik Scheder.
Teaching Assistant: Tang Shuyang (汤舒扬, htftsy@gmail.com).
Textbook: Algorithms by Sanjoy Dasgupta, Christos Papadimitriou, and Umesh Vazirani
Material: This is mostly a blackboard class, but some supplementary material is posted here.

Grading: 50% final exam, 50% homework. Students will form groups of 4-5; each group hands in one joint homework solution.

Contents and Goals: This class covers classical algorithms and algorithmic design paradigms (e.g., divide and conquer, greedy algorithms, dynamic programming); maximum flow and minimum cut; linear programming (duality, the simplex method, applications). We will also discuss some important data structures and how to use them in algorithms. Time permitting, we will cover more advanced topics such as randomized, approximation, exponential, and streaming algorithms. We will devote ample time to (i) the path from a basic idea to the final algorithm, (ii) proofs of correctness, and (iii) running time analysis.
Goals: A deep understanding of the mathematical structure underlying the algorithms we treat in class, and the ability to apply the learned methods and paradigms to new problems.

Lecture log (most recent first). Each entry gives the date, the week, the lecture number, and the content.

Tuesday, November 19, 2019 (Week 11, Lecture 21)

Homework 8 handed out; here are the latex sources

Thursday, November 14, 2019 (Week 10, Lecture 20)

Monday, November 11, 2019 (Week 10, Lecture 19)

Thursday, November 7, 2019 (Week 9, Lecture 18)

Homework 7 handed out; here are the latex sources

Saturday, November 2, 2019 (Week 9, Lecture 17)

Thursday, October 31, 2019 (Week 8, Lecture 16)

Fourier-Motzkin Elimination and Farkas Lemma more formally. Proof of the Strong LP Duality Theorem.
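One Fourier-Motzkin elimination step can be sketched in a few lines of Python, assuming each inequality a·x ≤ b is stored as a (coefficient list, bound) pair; the representation and function name are mine, not from class:

```python
def eliminate(ineqs, j):
    """Fourier-Motzkin: eliminate variable j from a system of linear
    inequalities a . x <= b, each given as (coefficients, bound).
    Returns an equivalent system in which variable j has coefficient 0."""
    pos = [(a, b) for a, b in ineqs if a[j] > 0]
    neg = [(a, b) for a, b in ineqs if a[j] < 0]
    zero = [(a, b) for a, b in ineqs if a[j] == 0]
    out = list(zero)
    # Combine every positive-coefficient inequality with every
    # negative-coefficient one, using positive multipliers -a_neg[j]
    # and a_pos[j] so that the j-th coefficient cancels.
    for ap, bp in pos:
        for an, bn in neg:
            coeff = [-an[j] * ap[k] + ap[j] * an[k] for k in range(len(ap))]
            out.append((coeff, -an[j] * bp + ap[j] * bn))
    return out
```

Repeating this step for every variable decides feasibility: the system is infeasible iff some derived inequality reads 0 ≤ (negative number). The number of inequalities can square at each step, which is why the procedure is mainly of theoretical interest (e.g., for proving Farkas' Lemma).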

Monday, October 28, 2019 (Week 8, Lecture 15)

How to dualize programs with some equality constraints and unbounded variables.

Fourier-Motzkin elimination and Farkas lemma, informally.

Thursday, October 24, 2019 (Week 7, Lecture 14)

The Vertex Cover LP: proof that there is an optimal solution that is also a 0/1-solution. Existence of optimal solutions for every feasible and bounded linear program.

Homework 6 handed out; here are the tex sources.

Monday, October 28, 2019 (Week 7, Lecture 13)

Linear programs, formally (variables, constraints, feasible solutions, value). The dual of a linear program. The Weak Duality Theorem.

Example: A linear program for Minimum Vertex Cover. In-class exercise: find the dual program; it is the linear program for Maximum Matching.
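The in-class exercise can be illustrated numerically on the triangle K3, where the optimal fractional vertex cover and the optimal fractional matching both have value 3/2; a small sketch (helper names are mine):

```python
# Weak LP duality on the triangle K3: every fractional matching has
# value at most that of every fractional vertex cover.
edges = [(0, 1), (1, 2), (0, 2)]

def is_fractional_cover(x):
    """Feasible for the Vertex Cover LP: x[v] >= 0 and x[u] + x[v] >= 1
    for every edge (u, v)."""
    return all(xv >= 0 for xv in x) and all(x[u] + x[v] >= 1 for u, v in edges)

def is_fractional_matching(y):
    """Feasible for the Matching LP: y[e] >= 0 and the total y-weight
    incident to each vertex is at most 1."""
    loads = [0.0] * 3
    for (u, v), ye in zip(edges, y):
        loads[u] += ye
        loads[v] += ye
    return all(ye >= 0 for ye in y) and all(l <= 1 for l in loads)

x = [0.5, 0.5, 0.5]   # fractional vertex cover of value 1.5
y = [0.5, 0.5, 0.5]   # fractional matching of value 1.5
```

Since the matching value equals the cover value here, weak duality certifies that both solutions are optimal for their respective LPs, and it also shows the integrality gap: the minimum integral vertex cover of K3 has size 2 > 3/2.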

Thursday, October 24, 2019 (Week 6, Lecture 12)

Homework 4.4: an O(m log*(m)) algorithm for the Maximum Capacity Path problem in directed graphs. An O(m) algorithm for undirected graphs.
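For comparison, the Maximum Capacity Path problem also has a simple O(m log n) Dijkstra-style solution; a sketch under my own naming and graph representation (not the faster algorithms from the homework):

```python
import heapq

def max_capacity_path(graph, s, t):
    """Maximum-bottleneck s-t path via a Dijkstra-style search that
    maximizes the minimum edge capacity along the path.
    graph: dict mapping u -> list of (v, capacity) pairs."""
    best = {s: float('inf')}          # best bottleneck known per vertex
    heap = [(-float('inf'), s)]       # max-heap via negated capacities
    visited = set()
    while heap:
        negcap, u = heapq.heappop(heap)
        if u in visited:
            continue
        visited.add(u)
        if u == t:
            return -negcap
        for v, c in graph.get(u, []):
            w = min(-negcap, c)       # bottleneck of extending the path by (u, v)
            if w > best.get(v, 0):
                best[v] = w
                heapq.heappush(heap, (-w, v))
    return 0                          # t unreachable from s
```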

Introduction to linear programming. Here are the slides. Here are my lecture notes on linear programming.

Monday, October 21, 2019 (Week 6, Lecture 11)

Application of Hall's Theorem:

  • every doubly-stochastic matrix is a convex combination of permutation matrices
  • matching markets: market clearing prices always exist (envy-free allocation, maximizing total valuation); here are the pdf slides
Homework 5 handed out; here are the tex sources.

Thursday, October 17, 2019 (Week 5, Lecture 10)

Example where Ford-Fulkerson with maximum capacity path augmentation fails to terminate (pdf slides).

Hall's Theorem and a short proof using Kőnig's Theorem.

Applications of Hall's Theorem: completing a partially filled-in Latin square; doubly-stochastic matrices.
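Hall's condition can also be tested algorithmically: the left side of a bipartite graph can be fully matched iff |N(S)| ≥ |S| for all S, and a maximum matching reveals which. A minimal augmenting-path matching sketch (function and variable names are mine):

```python
def max_bipartite_matching(adj, n_left, n_right):
    """Maximum matching via augmenting paths.
    adj[u] lists the right-side vertices adjacent to left-side vertex u.
    By Hall's / Kőnig's Theorems, the returned size equals n_left exactly
    when Hall's condition |N(S)| >= |S| holds for every S."""
    match_right = [-1] * n_right      # match_right[v] = left partner of v, or -1

    def try_augment(u, seen):
        # Try to match u, possibly re-matching previously matched vertices.
        for v in adj[u]:
            if v in seen:
                continue
            seen.add(v)
            if match_right[v] == -1 or try_augment(match_right[v], seen):
                match_right[v] = u
                return True
        return False

    return sum(try_augment(u, set()) for u in range(n_left))
```

This simple version runs in O(V · E); it is enough to check, e.g., that a partially filled-in Latin square row can be completed.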

Monday, October 14, 2019 (Week 5, Lecture 9)

Running time of Dinic's algorithm on unit-capacity networks. Here are the pdf slides.

Thursday, October 10, 2019 (Week 5, Lecture 8)

The Ford-Fulkerson method (FF). An example where FF takes a long time to terminate if it is unlucky in choosing augmenting paths.

The classical pathological example where FF fails to terminate (slides).

Termination of Edmonds-Karp. Better implementation: Dinic's algorithm (slides).
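Edmonds-Karp (Ford-Fulkerson with shortest augmenting paths found by BFS) can be sketched compactly; a minimal version assuming edges are given as (u, v, capacity) triples (the function name and representation are mine, not the class code):

```python
from collections import deque, defaultdict

def edmonds_karp(edges, s, t):
    """Maximum s-t flow via BFS augmenting paths (Edmonds-Karp).
    edges: list of (u, v, capacity) triples.  Runs in O(V * E^2) and,
    unlike plain Ford-Fulkerson, is guaranteed to terminate."""
    cap = defaultdict(lambda: defaultdict(int))   # residual capacities
    adj = defaultdict(set)                        # residual adjacency (both directions)
    for u, v, c in edges:
        cap[u][v] += c
        adj[u].add(v)
        adj[v].add(u)
    flow = 0
    while True:
        # BFS for a shortest augmenting path in the residual network.
        parent = {s: None}
        queue = deque([s])
        while queue and t not in parent:
            u = queue.popleft()
            for v in adj[u]:
                if v not in parent and cap[u][v] > 0:
                    parent[v] = u
                    queue.append(v)
        if t not in parent:
            return flow                           # no augmenting path left
        # Recover the path and its bottleneck capacity.
        path, v = [], t
        while parent[v] is not None:
            path.append((parent[v], v))
            v = parent[v]
        bottleneck = min(cap[u][v] for u, v in path)
        for u, v in path:                         # augment along the path
            cap[u][v] -= bottleneck
            cap[v][u] += bottleneck
        flow += bottleneck
```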

Homework 4 handed out; here are the tex sources.

Monday, September 30, 2019 (Week 4, Lecture 7)

Network flows. Two definitions of network flow. Characterization of maximum flows: no s-t-path in the residual network. The Max-Flow Min-Cut Theorem.

Thursday, September 26, 2019 (Week 3, Lecture 6)

Union-by-rank with path compression; running time analysis with inverse Ackermann function.

Kirchhoff's Matrix Tree Theorem.

Homework 3 handed out; here are the tex sources.

Monday, September 23, 2019 (Week 3, Lecture 5)

Minimum Spanning Trees: Kruskal's algorithm.

Improving the running time: several union-find data structures. We will continue this topic on Thursday.

Here are my slides on the union by rank data structure.
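The union-by-rank structure with path compression can be sketched as follows (class and function names are mine); with it, Kruskal's algorithm is only a few lines:

```python
class UnionFind:
    """Union by rank with path compression.  Any sequence of m operations
    on n elements takes O(m * alpha(n)) time, alpha being the inverse
    Ackermann function."""
    def __init__(self, n):
        self.parent = list(range(n))
        self.rank = [0] * n

    def find(self, x):
        if self.parent[x] != x:
            self.parent[x] = self.find(self.parent[x])   # path compression
        return self.parent[x]

    def union(self, x, y):
        rx, ry = self.find(x), self.find(y)
        if rx == ry:
            return False                  # already in the same component
        if self.rank[rx] < self.rank[ry]:
            rx, ry = ry, rx
        self.parent[ry] = rx              # attach lower-rank root under higher
        if self.rank[rx] == self.rank[ry]:
            self.rank[rx] += 1
        return True

def kruskal(n, edges):
    """Weight of a minimum spanning forest.  edges: (weight, u, v) triples."""
    uf, total = UnionFind(n), 0
    for w, u, v in sorted(edges):         # consider edges by increasing weight
        if uf.union(u, v):
            total += w
    return total
```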

Thursday, September 19, 2019 (Week 2, Lecture 4)

Quicksort: analyzing the expected running time via indicator variables A_{i,j}, which are 1 if i is an ancestor of j in the recursion tree.
Quickselect: idea, pseudocode, and a suboptimal running time analysis.
Deterministic selection: the median-of-medians algorithm.
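Quickselect can be sketched as follows; this version uses a random pivot (expected linear time) rather than the deterministic median-of-medians pivot from class, which would make the worst case linear as well:

```python
import random

def quickselect(arr, k):
    """Return the k-th smallest element of arr (k = 0 gives the minimum).
    Random pivot: expected linear time."""
    arr = list(arr)
    while True:
        pivot = random.choice(arr)
        smaller = [x for x in arr if x < pivot]
        equal = [x for x in arr if x == pivot]
        if k < len(smaller):
            arr = smaller                         # answer is below the pivot
        elif k < len(smaller) + len(equal):
            return pivot                          # pivot is the k-th smallest
        else:
            k -= len(smaller) + len(equal)        # answer is above the pivot
            arr = [x for x in arr if x > pivot]
```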

Here are some notes about Quicksort and its analysis.

Homework 2 handed out. Here are the tex sources.

Monday, September 16, 2019 (Week 2, Lecture 3)

Mergesort, worst case analysis. Determining the exact number of comparisons in the worst case.
Quicksort: worst case running time. Idea of the analysis of the expected number of comparisons via indicator variables.
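The exact worst-case comparison count of Mergesort (for n a power of two it comes out to n lg n − n + 1) can be explored with an instrumented implementation; a minimal sketch, with the comparison counter added by me:

```python
def merge_sort(a):
    """Mergesort that also returns the number of element comparisons made."""
    if len(a) <= 1:
        return a, 0
    mid = len(a) // 2
    left, cl = merge_sort(a[:mid])
    right, cr = merge_sort(a[mid:])
    merged, comparisons = [], cl + cr
    i = j = 0
    while i < len(left) and j < len(right):
        comparisons += 1                  # one comparison per merge step
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])               # leftover elements need no comparison
    merged.extend(right[j:])
    return merged, comparisons
```

For n = 4 the worst case is 4·2 − 4 + 1 = 5 comparisons, which inputs like [5, 3, 8, 1] already achieve.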

Thursday, September 12, 2019 (Week 1, Lecture 2)

Computing Fibonacci numbers via fast matrix multiplication. Computing them modulo k. How does Python multiply big integers? How to solve recurrences without remembering the Master Theorem. Homework 1 handed out. Here are the latex source files of Homework 1.
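The matrix-power computation of F_n modulo k can be sketched as follows (the function is my illustration, not the code handed out in class); it uses repeated squaring of the matrix [[1, 1], [1, 0]], whose n-th power contains F_n:

```python
def fib_mod(n, k):
    """Compute F_n mod k by fast exponentiation of [[1, 1], [1, 0]].
    Uses O(log n) 2x2 matrix multiplications, all modulo k."""
    def mat_mult(A, B):
        return [[(A[0][0]*B[0][0] + A[0][1]*B[1][0]) % k,
                 (A[0][0]*B[0][1] + A[0][1]*B[1][1]) % k],
                [(A[1][0]*B[0][0] + A[1][1]*B[1][0]) % k,
                 (A[1][0]*B[0][1] + A[1][1]*B[1][1]) % k]]
    result = [[1, 0], [0, 1]]             # 2x2 identity matrix
    M = [[1, 1], [1, 0]]
    while n > 0:                          # square-and-multiply on the exponent
        if n & 1:
            result = mat_mult(result, M)
        M = mat_mult(M, M)
        n >>= 1
    return result[0][1]                   # M^n = [[F_{n+1}, F_n], [F_n, F_{n-1}]]
```

Working modulo k keeps every entry below k, so each matrix multiplication has the same cost regardless of n; without the modulus, the big-integer entries dominate the running time.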

See my brief slides on the "school method" for integer division.

See my slides on the homework assignment policy of this class.

Monday, September 9, 2019 (Week 1, Lecture 1)

Introduction, administrative stuff.
Euclid's algorithm. Running time and exact bit complexity. We observed that the exact bit complexity is not obvious. Our best upper bound estimate so far is O(n^3), but experimental data suggests O(n^2).
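A minimal iterative sketch of Euclid's algorithm as analyzed in class (the O(n^3) bound comes from O(n) iterations, each costing up to O(n^2) for a big-integer division; the variable names are mine):

```python
def gcd(a, b):
    """Euclid's algorithm, using gcd(a, b) = gcd(b, a mod b)."""
    # The remainder at least halves every two iterations, so for
    # n-bit inputs the loop runs O(n) times.
    while b != 0:
        a, b = b, a % b
    return a
```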

Fibonacci numbers and algorithms for computing them. Recursive and dynamic programming algorithms. Running time of the DP algorithm is O(n^2), which we explained theoretically and observed experimentally.
Fast matrix exponentiation algorithm for computing Fibonacci numbers. We have not yet determined its asymptotic complexity, but experiments with our Python implementation suggest that computing F_{2n} takes three times as long as computing F_n.

Here is the python code.

