Week 1.pdf - 17COC104 Computer Science, Loughborough University, United Kingdom - Robert Mercaş (via StudyBlue, uploaded by Gareth N.)
Introduction
17COC104 - Algorithm Analysis, Dr Robert Mercaş
Outline: Module Introduction, Algorithms, Asymptotic Analysis, Asymptotics, Examples

Module Admin
- Dr Robert Mercaş, Room N.318, third floor, Haslegrave Building.
- Office hours: please email to make an appointment.
- Main references (copies are available in the library):
  - Computer Algorithms: Introduction to Design and Analysis, S. Baase and A. Van Gelder, Addison Wesley (any ed.)
  - Introduction to Algorithms, T. H. Cormen, C. E. Leiserson, R. L. Rivest and C. Stein, MIT Press (any ed.)
  - The Algorithm Design Manual, S. S. Skiena, Springer (any ed.)

Module Information
- Teaching and assessment:
  - Approx. 33 lectures/tutorials (2 lectures and 1 tutorial each week).
  - Assessment: 100% exam (2 hours); example exam paper(s) will appear towards the end of the module.
  - Where to get help: tutorials, and email/meetings (R.G.Mercas@lboro.ac.uk).
- Tutorials (feedback sessions):
  - Tutorial sheets appear in advance on Learn.
  - Please try the questions in advance if you can; we will go through the questions and solutions during the tutorials.

Why study algorithms?
- Algorithmic design is a foundation for efficient and effective programs.
- The design techniques we will study help solve a variety of Computer Science and Mathematics problems.
- Being able to analyse algorithms in terms of efficiency and correctness is a fundamental part of Computer Science.
- Many of these skills carry over to problem solving in future careers.
Module Aims
The aim of this module is to enable students to analyse the amount of resources needed to solve a given computational problem and to compare the efficiency of different algorithms for that problem. The design of algorithms for solving a variety of computational problems will be studied. Techniques for proving statements about the complexity of computational problems are also explored.

Learning Outcomes
- Knowledge and understanding. Upon completion, students should be able to:
  - decide whether a given problem can be solved exactly and in what time-scale, or whether, as is the case in many real-world problems, only approximations to its exact solution can be obtained;
  - select the correct algorithmic design technique for a given problem;
  - demonstrate how particular algorithms work to give correct solutions to problems.
- Computer-related cognitive abilities:
  - analyse an algorithm and determine its complexity class;
  - use a variety of strategies to design algorithms;
  - describe how particular algorithms work for given inputs;
  - prove results about the limits of computation and the limits of efficient computation.

Main topics
- Algorithmic analysis and asymptotic notations: the complexity classes O, Ω, Θ.
- Induction recap and structural induction.
- Graph theory and related algorithms: the Floyd-Warshall algorithm.
- Sorting algorithms and their analysis: insertion sort, merge sort, quicksort, heapsort, radix sort.
- String matching: brute force, finite automata, Knuth-Morris-Pratt (the KMP algorithm).
- Dynamic programming and applications.
- Complexity theory: lower bounds, NP-hardness, PSPACE-hardness, approximations.
- Undecidability.
- Parallel algorithms.

What is an algorithm?
An algorithm is a sequence of unambiguous instructions for solving a problem, i.e., for obtaining a required output for any legitimate input in a finite amount of time.

[Diagram: a problem and its input go into an algorithm running on a computer, which produces the output.]

Why study algorithms?
Once we have designed an algorithm and described it in pseudocode, we have to check its:
- correctness;
- termination;
- efficiency (time/space complexity).
We may also be interested in its:
- simplicity (easy to understand and implement);
- optimality (is it the best possible algorithm?).
Two very important tools: the RAM model of computation and asymptotic analysis of worst-case complexity.

The RAM model
There are many possible models of computation (Turing machine, RAM, PRAM, RASP, etc.). We will use the RAM (Random Access Machine) model.

[Figure: Random Access Machine model]

- A machine-independent, language-independent, generic model of computation.
- Operations are executed one after another, not concurrently:
  - each memory access (read/write) takes one step;
  - each arithmetic, logical, or comparison operation takes one step;
  - each step (basic operation) takes constant time;
  - loops and subroutine calls are not basic operations.
- Running time (or time complexity) = the number of basic operations executed (the amount of work done), expressed as a function of the size n of the problem.
- Space complexity = the number of memory cells needed to solve the problem, expressed as a function of the size n of the problem.
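As a minimal sketch of RAM-style cost accounting (not part of the slides), the function below sums a list while charging one step per assignment, read, and addition; the total comes out as a function of the input size n:

```python
def sum_list_with_count(a):
    """Sum a list while counting RAM-model basic operations."""
    steps = 0
    total = 0          # 1 assignment
    steps += 1
    for x in a:        # per element: 1 read + 1 addition + 1 assignment
        total = total + x
        steps += 3
    return total, steps

total, steps = sum_list_with_count(list(range(10)))
print(total, steps)  # prints: 45 31, i.e. steps = 3n + 1 for n = 10
```

The exact constants depend on how finely you charge each operation; what matters for the analysis is that the count is linear in n.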
Criticism of the RAM model: it is too simple. For example, it charges multiplication the same as addition, and it ignores memory access times. We make an abstraction that is useful in general.

Complexity
- For both time complexity and space complexity we can consider:
  - worst-case complexity;
  - average-case complexity;
  - best-case complexity (less common).
- For space complexity:
  - space complexity lets us measure the amount of space used with respect to the input size;
  - an in-place algorithm is one whose amount of extra space is constant with respect to the input size.

Example: Find max
- Input: a list of n integers a[1], a[2], a[3], ..., a[n].
- Output: the largest element of the list.
- Algorithm (Find max):
  1. Assign a[1] to a variable max.
  2. Compare each remaining element of the list with max.
  3. If the element is larger, update max.

ALGORITHM Find max
  1: // Input: list of n integers a[1], a[2], ..., a[n]
  2: // Output: maximal element of the list
  3: max ← a[1]
  4: for i ← 2 to n do
  5:   if a[i] > max then
  6:     max ← a[i]
  7:   // i ← i + 1 (implicit increment)
  8: return max

Analysis of Find max
- Time: if an assignment takes time c1, an increment takes time c2, and a comparison takes time c3, then the maximum total time is
  c1·n + c2·(n − 1) + c3·(n − 1) = (c1 + c2 + c3)·n − (c2 + c3).
- On the RAM model, c1 = c2 = c3 = 1, so we get 3n − 2 time steps.
- Space: n + 2 memory cells.

Optimality
- An algorithm is optimal if no algorithm for that particular problem performs fewer basic operations (in the worst case); it is thus the best possible.
- A lower bound for time complexity is a minimum number of operations proven to be needed to solve a problem (hence any algorithm has to execute at least this many operations). Similar notions exist for space complexity.

Optimality: worst-case analysis examples
1. Finding the maximal element of a list of n elements: lower bound of n − 1 comparisons; thus Find max is optimal!
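The pseudocode above can be transcribed directly into Python (a sketch, not part of the slides), instrumented to count RAM-model basic operations. On a strictly increasing input, every comparison triggers an update, which is exactly the worst case that yields the 3n − 2 count derived above:

```python
def find_max(a):
    """Return the largest element of a non-empty list, with an op count."""
    ops = 1                     # max <- a[1]
    maximum = a[0]
    for i in range(1, len(a)):
        ops += 2                # one comparison + one increment of i
        if a[i] > maximum:
            ops += 1            # max <- a[i]
            maximum = a[i]
    return maximum, ops

# Worst case: a strictly increasing list forces an update on every step.
n = 100
maximum, ops = find_max(list(range(n)))
print(maximum, ops)  # prints: 99 298, and 298 = 3*100 - 2
```

On inputs that are not increasing, fewer updates fire and the count drops below 3n − 2, which is why the slide's figure is a worst-case bound.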
2. Matrix multiplication of two n × n matrices:
   - best known algorithm: c1·n^2.3728 operations;
   - best known lower bound: c2·n² operations;
   - there is a gap between the two.
3. Sorting a list of n elements:
   - best known algorithm: c1·n·log n operations;
   - best known lower bound: c2·n·log n operations;
   - the bounds match, so the algorithm is optimal.

Time Complexity Analysis
Values of some important functions as n → ∞ (approximate):

  n    | log₂ n | n    | n log₂ n | n²   | n³   | 2ⁿ
  10   | 3.3    | 10   | 3.3×10   | 10²  | 10³  | 10³
  10²  | 6.6    | 10²  | 6.6×10²  | 10⁴  | 10⁶  | 1.3×10³⁰
  10³  | 10     | 10³  | 10⁴      | 10⁶  | 10⁹  | 10³⁰¹
  10⁴  | 13     | 10⁴  | 1.3×10⁵  | 10⁸  | 10¹² | ···
  10⁵  | 17     | 10⁵  | 1.7×10⁶  | 10¹⁰ | 10¹⁵ | ···

For comparison, there are about 10⁸⁰ atoms in the observable universe, and the universe is approximately 4.354×10²⁰ milliseconds old.

For a given algorithm A, we denote:
- Worst-case efficiency: T(n) is the maximum running time (in terms of the number of basic operations) over all inputs of size n.
- Average case: Tavg(n) is the average running time (in terms of the number of basic operations executed on typical inputs of size n). Note: this is not the average of the best and worst cases.
- Best case: Tbest(n) is the minimum running time (in terms of the number of basic operations) over all inputs of size n.
In this module we mostly focus on the worst-case efficiency of algorithms, given by T(n). When we say efficiency, we mean worst-case time efficiency unless otherwise stated.

Which algorithm is the fastest?
Consider a problem that can be solved by 5 algorithms A1, A2, A3, A4, A5 using different numbers of operations (time complexities):
- f1(n) = 50n + 20
- f2(n) = 10n·log₂ n + 100
- f3(n) = n² − 3n + 6
- f4(n) = 2n²
- f5(n) = 2ⁿ/8 − n/4 + 2
Algorithm A1 requires f1(n) operations on an input of size n, A2 requires f2(n) operations, etc. Which algorithm will be the fastest? As we will now see, this depends upon the size of the input n.

  n                        | 1   | 2   | 4   | 8   | 16   | 32    | 64    | 128   | 256   | 512
  f1(n) = 50n + 20         | 70  | 120 | 220 | 420 | 820  | 1620  | 3220  | 6420  | 12820 | 25620
  f2(n) = 10n log₂ n + 100 | 100 | 120 | 180 | 340 | 740  | 1700  | 3940  | 9060  | 20580 | 46180
  f3(n) = n² − 3n + 6      | 4   | 4   | 10  | 46  | 214  | 934   | 3910  | 16006 | 64774 | 260614
  f4(n) = 2n²              | 2   | 8   | 32  | 128 | 512  | 2048  | 8192  | 32768 | 1E+05 | 5E+05
  f5(n) = 2ⁿ/8 − n/4 + 2   | 2   | 2   | 3   | 32  | 8190 | 5E+08 | 2E+18 | ...   | ...   | ...

The table shows the maximum number of operations each algorithm (A1, ..., A5) requires for various values of n. On the slide, the fastest algorithm for each particular value of n is highlighted in red shaded boxes.
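The table can be recomputed with a few lines of Python (a sketch, not part of the slides), which also reports which cost function is smallest for each n:

```python
import math

fs = {
    "f1": lambda n: 50 * n + 20,
    "f2": lambda n: 10 * n * math.log2(n) + 100,
    "f3": lambda n: n**2 - 3 * n + 6,
    "f4": lambda n: 2 * n**2,
    "f5": lambda n: 2**n / 8 - n / 4 + 2,
}

for n in [1, 2, 4, 8, 16, 32, 64, 128, 256, 512]:
    values = {name: f(n) for name, f in fs.items()}
    fastest = min(values, key=values.get)
    print(n, fastest, {k: round(v) for k, v in values.items()})
```

Running this shows the lead changing hands as n grows: the quadratic and exponential algorithms win for tiny inputs, but the linear algorithm A1 is fastest by n = 512.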
What we are usually interested in is: which algorithm is fastest for larger values of n?

[Figure: a log-log graph of the running times f1, ..., f5]

Polynomial versus exponential functions
- There is a huge difference between functions involving powers of n (e.g., n, n², n³, called polynomial functions) and functions involving powering by n (e.g., 2ⁿ, 3ⁿ, called exponential functions).
- Among polynomial functions, those with the same power are more comparable, e.g., f3(n) = n² − 3n + 6 and f4(n) = 2n².

[Figure: relative growth rates]

Order of growth of important functions
- We can define a hierarchy of functions, each having a greater order of growth than its predecessor:
  - 1 (constant)
  - logₐ n (logarithmic)
  - n, n², n³, ..., n^k (polynomial)
  - 2ⁿ (exponential)
  - n! (factorial)
- We can further refine the hierarchy by inserting n·log n between n and n², n²·log n between n² and n³, etc.
- As we move down the list, successive functions have greater order of growth than the previous ones: as n increases, the values of the later functions grow much more rapidly than those of the earlier ones.
- What about (log n)³ and n: which has the greater order of growth? Recall that n = 2^(log₂ n), so we are comparing (log n)³ with 2^(log n); since an exponential eventually dominates any fixed power of its exponent, (log n)³ is lower in the hierarchy than n. Similarly, (log n)^k is lower than n in the hierarchy for any constant k ≥ 1.
- Given a function, we can classify it into some class in the hierarchy. For example, given f(n) = 5n³ + 3n² + 7n + 12, the term with the highest power is 5n³, so the growth rate of f(n) is dominated by n³. This concept is captured by O notation (Big-O notation).

Big-O notation
- Big-O notation is a way of comparing functions that ignores multiplicative constants and small input sizes.
- f(n) ∈ O(g(n)), which we read as "f(n) is of order g(n)". Roughly speaking, this means f(n) is at most a constant times g(n) for all large n. It is often also written as f(n) = O(g(n)).
- Examples: 2n³ + 100 ∈ O(n³); 8n³ ∈ O(n³); 2n·log n ∈ O(n·log n); n³ + n² ∈ O(n³).

Big-O exercise
Determine the order of growth of the following functions:
1. 5n⁴ + 3n² + 100
2. 4n²·log n + n³ + 5n² + n
3. 2n² + 8n²·log n − 3n
4. 6n² + 2ⁿ
We should look for the term highest in the hierarchy:
O(log n) < O(log² n) < O(√n) < O(n) < O(n·log n) < O(n²) < O(2ⁿ) < O(n!)

Big-O notation: formal definition
- f(n) ∈ O(g(n)) if there exist constants c > 0 and n₀ > 0 such that f(n) ≤ c·g(n) for all n > n₀.
- Alternatively, we can write this in the form: ∃c > 0, ∃n₀ ≥ 0, ∀n ≥ n₀ : f(n) ≤ c·g(n).
- Either of the above two definitions is the formal definition. Recall that ∃ means "there exists" and ∀ means "for all".
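The formal definition can be exercised numerically (a sanity check, not a proof, and not from the slides): for the earlier example 2n³ + 100 ∈ O(n³), the witnesses c = 3 and n₀ = 4 work, since 2n³ + 100 ≤ 3n³ exactly when n³ ≥ 100, i.e. from n = 5 onwards:

```python
def is_bounded(f, g, c, n0, n_max=10_000):
    """Check f(n) <= c*g(n) for all n0 < n <= n_max (finite evidence only)."""
    return all(f(n) <= c * g(n) for n in range(n0 + 1, n_max + 1))

f = lambda n: 2 * n**3 + 100
g = lambda n: n**3

print(is_bounded(f, g, c=3, n0=4))  # True: n = 5 gives 350 <= 375
print(is_bounded(f, g, c=2, n0=4))  # False: 2n^3 + 100 > 2n^3 for every n
```

Note that such a check only inspects finitely many n, so it can refute a candidate (c, n₀) but never replace the algebraic proof.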
[Figure: visualisation of t(n) ∈ O(g(n)): for all n > n₀, t(n) stays below c·g(n)]

A note on logarithms
- For logarithmic functions, the base a > 1 of the logarithm does not matter under Big-O notation, since
  logₐ n = log_b n / log_b a,
  where log_b a is a constant; this makes it possible to switch from one base to another (with a new multiplicative constant).
- Throughout the module, if we do not specify a base for a logarithm, we will usually mean base 2.

Proof of orders of growth
Question: prove that 2n² + 4n ∈ O(n²).
Proof: since n ≤ n² for all n ≥ 1, we have
  2n² + 4n ≤ 2n² + 4n² = 6n² for all n ≥ 1.
Therefore, by definition (with c = 6 and n₀ = 1), 2n² + 4n ∈ O(n²).

Big-O exercise (2)
1. For each of the following, state whether it is true or false:
   a. n²·log n + n³ + 3n² + 3 is of order O(n²·log n)?
   b. 8n² + 5n − 2000 ∈ O(n³)?
   c. 32n² − 5n + 6n³ + 10000 ∈ O(n³)?
2. Prove that log₁₆(n) ∈ O(log₂(n)) (you can use the logarithmic identity logₐ n = log_b n / log_b a).

Big-Ω and Big-Θ notation
- We have seen that f(n) ∈ O(g(n)) means that f(n) has an order of growth no larger than that of g(n).
- How do we express that f(n) has a larger order of growth? How about an equivalent order of growth?
- For this we use Ω (pronounced Big-Omega) and Θ (pronounced Big-Theta) notation, respectively.
- Intuitively, a function f(n) is said to be in Ω(g(n)), denoted f(n) ∈ Ω(g(n)), if f(n) is bounded below by some positive constant multiple of g(n) for all large n.
- A function f(n) is in Θ(g(n)), denoted f(n) ∈ Θ(g(n)), if f(n) is bounded both above and below by positive constant multiples of g(n) for all large n.

Big-Ω notation: formal definition
- t(n) ∈ Ω(g(n)) if there exist constants c > 0 and n₀ > 0 such that t(n) ≥ c·g(n) for all n > n₀.
- For example, we can prove that n³ ∈ Ω(n²), since n³ ≥ n² for all n ≥ 0 (take c = 1).
- As we see, Big-Omega gives a lower bound for a particular function.

Big-Θ notation: formal definition
- t(n) ∈ Θ(g(n)) if there exist constants c1 > 0 and c2 > 0 and an integer n₀ > 0 such that
  c2·g(n) ≤ t(n) ≤ c1·g(n) for all n > n₀.

Big-Θ notation: proof technique
Let us prove that (1/2)n(n − 1) ∈ Θ(n²).
First we prove the right inequality (the upper bound):
  (1/2)n(n − 1) = (1/2)n² − (1/2)n ≤ (1/2)n² for all n ≥ 0,
and then the left inequality (the lower bound), using (1/2)n ≤ (1/2)n·(1/2)n for n ≥ 2:
  (1/2)n(n − 1) = (1/2)n² − (1/2)n ≥ (1/2)n² − (1/2)n·(1/2)n = (1/4)n² (for all n ≥ 2).
Hence we can select c2 = 1/4, c1 = 1/2 and n₀ = 2.
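The two-sided bound from this proof is easy to spot-check numerically (evidence, not a proof; not from the slides): with c2 = 1/4, c1 = 1/2 and n₀ = 2, the sandwich c2·n² ≤ (1/2)n(n − 1) ≤ c1·n² holds for every n ≥ 2:

```python
def theta_bound_holds(n, c1=0.5, c2=0.25):
    """Check c2*n^2 <= (1/2)n(n-1) <= c1*n^2 for a single n."""
    t = 0.5 * n * (n - 1)
    return c2 * n**2 <= t <= c1 * n**2

print(all(theta_bound_holds(n) for n in range(2, 1000)))  # True
print(theta_bound_holds(1))  # False: n0 = 2 is genuinely needed here
```

The failure at n = 1 illustrates why the definition only requires the bound for n > n₀.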
Useful facts regarding O-notation
- If f1(n) ∈ O(g1(n)) and f2(n) ∈ O(g2(n)), then f1(n) + f2(n) ∈ O(max{g1(n), g2(n)}).
  Example: if an algorithm has two separate sections, one of which requires O(n³) and the second of which requires O(n⁵), the overall complexity of the algorithm is O(n⁵).
  In other words: O(g1(n)) + O(g2(n)) → O(max{g1(n), g2(n)}).
  The analogous assertions are true for the Ω and Θ notations as well:
  - Ω(g1(n)) + Ω(g2(n)) → Ω(max{g1(n), g2(n)})
  - Θ(g1(n)) + Θ(g2(n)) → Θ(max{g1(n), g2(n)})
- Multiplying by a constant does not change the asymptotic behaviour of a function:
  - O(c·g(n)) → O(g(n))
  - Ω(c·g(n)) → Ω(g(n))
  - Θ(c·g(n)) → Θ(g(n)),
  where c is a strictly positive constant.
- However, when multiplying two functions of n, both are important:
  - O(g1(n)) × O(g2(n)) → O(g1(n)·g2(n))
  - Ω(g1(n)) × Ω(g2(n)) → Ω(g1(n)·g2(n))
  - Θ(g1(n)) × Θ(g2(n)) → Θ(g1(n)·g2(n))

A very useful theorem
The following theorem is very convenient for comparing the orders of growth of two functions.
Theorem. Let c > 0 be a constant. Then:
  lim_{n→∞} t(n)/g(n) = 0 implies t(n) has a smaller order of growth than g(n);
  lim_{n→∞} t(n)/g(n) = c implies t(n) has the same order of growth as g(n);
  lim_{n→∞} t(n)/g(n) = ∞ implies t(n) has a larger order of growth than g(n).
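The sum rule above can be illustrated concretely (a sketch, not from the slides): an algorithm whose two sections cost n³ and n⁵ steps has total cost n³ + n⁵, and this total is in O(max{n³, n⁵}) = O(n⁵), witnessed by c = 2 and n₀ = 1, since n³ ≤ n⁵ for all n ≥ 1:

```python
def total_cost(n):
    """Combined cost of an O(n^3) section followed by an O(n^5) section."""
    return n**3 + n**5

# The total never exceeds 2*n^5, so the dominant section dictates the order.
print(all(total_cost(n) <= 2 * n**5 for n in range(1, 1000)))  # True
```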
Or, more compactly:
  lim_{n→∞} t(n)/g(n) = 0 ⇒ t(n) ∈ O(g(n))
  lim_{n→∞} t(n)/g(n) = c ⇒ t(n) ∈ Θ(g(n))
  lim_{n→∞} t(n)/g(n) = ∞ ⇒ t(n) ∈ Ω(g(n))

Basic rules of limits
Recall that
  lim_{n→∞} c·f(n) = c·lim_{n→∞} f(n), if c is a constant,
and that
  lim_{n→∞} (f(n) + g(n)) = lim_{n→∞} f(n) + lim_{n→∞} g(n).

An example
Let us use the theorem above to compare the growth rates of (1/2)n(n − 1) and n² (which we did previously using a different approach):
  lim_{n→∞} [(1/2)n(n − 1)] / n² = (1/2)·lim_{n→∞} (n² − n)/n² = (1/2)·lim_{n→∞} (1 − 1/n) = 1/2,
which implies that (1/2)n(n − 1) ∈ Θ(n²).

Example 2
Let us compare the growth rates of 2ⁿ and n!. To do this we need Stirling's approximation for n! (you do not need to memorise this):
  n! ≈ √(2πn)·(n/e)ⁿ
for large values of n (equal in the limit; e is Euler's number). Thus we see that
  lim_{n→∞} n!/2ⁿ = lim_{n→∞} √(2πn)·(n/e)ⁿ / 2ⁿ = lim_{n→∞} √(2πn)·nⁿ/(2ⁿ·eⁿ) = lim_{n→∞} √(2πn)·(n/(2e))ⁿ = ∞.
Therefore n! has a higher order of growth than 2ⁿ.

Some comments on notation in the module
- Recall that when using O (equivalently Ω, Θ) notation, we can write either f(n) ∈ O(g(n)) or f(n) = O(g(n)); you may see both notations in textbooks, online, etc.
- When writing pseudocode, some authors start arrays from 0, others from 1. We may use both throughout the module, depending upon which makes the algorithm easier to understand: e.g., A[0], A[1], ..., A[n − 1], or else A[1], A[2], ..., A[n].
- We use a variety of pseudocode styles; you should become used to seeing pseudocode written in slightly different ways.
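Both limit examples can be probed numerically (an illustration only; the limits themselves require the algebra above): the ratio (1/2)n(n − 1)/n² settles towards the constant 1/2, consistent with Θ(n²), while n!/2ⁿ blows up, consistent with n! growing strictly faster than 2ⁿ:

```python
import math

def theta_ratio(n):
    """Ratio of (1/2)n(n-1) to n^2; tends to 1/2 as n grows."""
    return (0.5 * n * (n - 1)) / n**2

print(theta_ratio(10**6))          # very close to 0.5
print(math.factorial(20) / 2**20)  # already enormous at n = 20
```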
Conclusion
- Basics of algorithms.
- Introduction to algorithmic analysis.
- Asymptotic notations and examples.
- We will go through many examples during the tutorial.