Sunday, 23 November 2014

Big O-Me-Ga' Let's Go From Left to Right

Big Omega is related to Big Oh: while the latter defines an upper bound, the former defines a lower bound for a given function. This is usually done by bounding the function below by a simpler function, akin to Big Oh's expression. In mathematical form, it takes the following shape:

f(n) ∈ Ω(g(n))

The full definition would be the following:

∃ c ∈ ℝ⁺, ∃ B ∈ ℕ, ∀ n ∈ ℕ, n ≥ B ⇒ f(n) ≥ c(g(n))


This is identical to the definition of Big Oh, except that instead of f(n) ≤ c(g(n)) we have f(n) ≥ c(g(n)). This shows how this second inequality approaches the function from the other side, thus defining a lower bound. Once again, we need a proof to establish that the inequality actually holds for some choice of c and B. To go from left to right, we need to be, once again, creative.

Let's work with the following example:

8n^4 + 5n^2 - n ∈ Ω(-n^5 + 2n^3 - 4)

Then this would be equal to:

∃ c ∈ ℝ⁺, ∃ B ∈ ℕ, ∀ n ∈ ℕ, n ≥ B ⇒ 8n^4 + 5n^2 - n ≥ c(-n^5 + 2n^3 - 4)

First, we should work on the inequality:

8n^4 + 5n^2 - n ≥ 8n^4 #since 5n^2 - n ≥ 0 when n ≥ 1, we could make B = 1
                          ≥ 2n^4 #since 8 ≥ 2, this inequality would still be true
                          ≥ 2n^3 #since n^4 ≥ n^3
                          ≥ 2n^3 - n^5 #since -n^5 ≤ 0, this is still true
                          ≥ -n^5 + 2n^3 - 4 #subtracting 4, this is still true
                          ≥ 1(-n^5 + 2n^3 - 4) #we could make c = 1

We have established values for both c and B, so that c = 1 and B = 1. In order to work the proof, we would have to do the following arrangement:
8n^4 + 5n^2 - n ∈ Ω(-n^5 + 2n^3 - 4)
∃ c ∈ ℝ⁺, ∃ B ∈ ℕ, ∀ n ∈ ℕ, n ≥ B ⇒ 8n^4 + 5n^2 - n ≥ c(-n^5 + 2n^3 - 4)
Assume n ∈ ℕ and n ≥ 1 #domain assumption and antecedent
    Then 8n^4 + 5n^2 - n ≥ 8n^4 #since 5n^2 - n ≥ 0
                          ≥ 2n^4 #since 8 ≥ 2
                          ≥ 2n^3 #since n^4 ≥ n^3
                          ≥ 2n^3 - n^5 #since -n^5 ≤ 0
                          ≥ -n^5 + 2n^3 - 4 #subtract 4
    Then ∀ n ∈ ℕ, n ≥ 1 ⇒ 8n^4 + 5n^2 - n ≥ 1(-n^5 + 2n^3 - 4) #introduce ⇒ and ∀
Then 8n^4 + 5n^2 - n ∈ Ω(-n^5 + 2n^3 - 4) #introduce Big Omega with c = 1 and B = 1
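We can sketch a quick numeric spot-check of this inequality in Python. Sampling finitely many values is only a sanity check, never a substitute for the proof; here the inequality happens to hold from n = 0 onward, since g(n) is negative for every natural n.

```python
# Spot-check f(n) >= c * g(n) for the Big Omega example, with c = 1
f = lambda n: 8*n**4 + 5*n**2 - n
g = lambda n: -n**5 + 2*n**3 - 4

c = 1
assert all(f(n) >= c * g(n) for n in range(0, 1000))
```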



Done! 

Sunday, 16 November 2014

Big Oh My God Let's Go From Left to Right

Big Oh is a mathematical expression meant to describe the behaviour of a function as its argument tends to a certain value or infinity. This is usually done by bounding the function above by a simpler function. In mathematical form, it takes the following shape:

f(n) ∈ O(g(n))

The full definition would be the following:

∃ c ∈ ℝ⁺, ∃ B ∈ ℕ, ∀ n ∈ ℕ, n ≥ B ⇒ f(n) ≤ c(g(n))


Relating this to everything we have learnt before, we need a proof to establish that the inequality actually holds for some choice of c and B. To go from left to right, we need to be very, very creative.

Let's work with the following example:

n^4 - 5n^2 + 1 ∈ O(6n^5 - 5n^3 + 3n)

Then this would be equal to:

∃ c ∈ ℝ⁺, ∃ B ∈ ℕ, ∀ n ∈ ℕ, n ≥ B ⇒ n^4 - 5n^2 + 1 ≤ c(6n^5 - 5n^3 + 3n)

First, we should work on the inequality:

n^4 - 5n^2 + 1 ≤ n^4 + 1 #since -5n^2 ≤ 0, dropping it can only increase the value
                        ≤ n^4 + n^4 #since n^4 ≥ 1 for n ≥ 1, this inequality would still be true
                        ≤ 2n^4 #addition
                        ≤ n∙n^4 #since we have 2 as a coefficient, we could make B = 2 so that n ≥ 2
                        ≤ n^5 #multiplication
                        ≤ 6n^5 - 5n^5 #still n^5
                        ≤ 6n^5 - 5n^3 #since -5n^5 ≤ -5n^3, this is still true
                        ≤ 6n^5 - 5n^3 + 3n #adding 3n, this is still true
                        ≤ 1(6n^5 - 5n^3 + 3n) #we could make c = 1

We have established values for both c and B, so that c = 1 and B = 2. In order to work the proof, we would have to do the following arrangement:
n^4 - 5n^2 + 1 ∈ O(6n^5 - 5n^3 + 3n)
∃ c ∈ ℝ⁺, ∃ B ∈ ℕ, ∀ n ∈ ℕ, n ≥ B ⇒ n^4 - 5n^2 + 1 ≤ c(6n^5 - 5n^3 + 3n)
Assume n ∈ ℕ and n ≥ 2 #domain assumption and antecedent
    Then n^4 - 5n^2 + 1 ≤ n^4 + 1 #since -5n^2 ≤ 0
                                      ≤ n^4 + n^4 #since n^4 ≥ 1
                                      ≤ 2n^4 #add both together
                                      ≤ n∙n^4 #since n ≥ 2
                                      ≤ n^5 #multiply both together
                                      ≤ 6n^5 - 5n^5 #still n^5
                                      ≤ 6n^5 - 5n^3 #since -5n^5 ≤ -5n^3
                                      ≤ 6n^5 - 5n^3 + 3n #add 3n
    Then ∀ n ∈ ℕ, n ≥ 2 ⇒ n^4 - 5n^2 + 1 ≤ 1(6n^5 - 5n^3 + 3n) #introduce ⇒ and ∀
Then n^4 - 5n^2 + 1 ∈ O(6n^5 - 5n^3 + 3n) #introduce Big Oh
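As with Big Omega, we can sketch a quick numeric spot-check in Python of the c and B we picked; again, sampling finitely many values is a sanity check, not a substitute for the proof.

```python
def bounded_above(f, g, c, B, n_max=1000):
    """Check f(n) <= c * g(n) for every natural B <= n < n_max."""
    return all(f(n) <= c * g(n) for n in range(B, n_max))

f = lambda n: n**4 - 5*n**2 + 1
g = lambda n: 6*n**5 - 5*n**3 + 3*n

assert bounded_above(f, g, c=1, B=2)
```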

Done! 

Sunday, 9 November 2014

Algorithm Run Time - From a Specific Action to Steps

Computing power is a known limitation when programming. Though to the untrained eye computers seem vastly powerful, when programming you are trying to reduce both the load on the processor and the running time. The fewer resources we use, the more we can do at the same time, and in less time. That is the rationale behind analyzing the run time of algorithms and code in general. When talking about code, there are three common ways in which run time grows:

  • Linear: each extra iteration increases run time by the same amount
  • Exponential: each extra iteration multiplies run time, so many extra iterations will drastically increase it
  • Logarithmic: run time grows ever more slowly, so after one point extra iterations have a minimal impact on run time
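To get a feel for these growth rates, here is a small Python sketch comparing hypothetical step counts as the input size n grows (the functions are illustrative, not tied to any particular program):

```python
import math

# columns: n, linear steps, exponential steps, logarithmic steps
for n in (10, 20, 30):
    print(n, n, 2 ** n, round(math.log2(n), 2))
```

Note how the exponential column explodes (2^30 is over a billion) while the logarithmic one barely moves.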

To calculate run time, we need to take a close look at our code in order to establish how many steps and loops are performed. Let's look at this piece of code and its outcome:

    for i in range(l):
        print(i)

If we use l = 10, how many times will print() be used?
Print will be used 10 times. The output would be the following:

0
1
2
3
4
5
6
7
8
9


What about the number of steps? Would it be 10 steps? 
We need to consider that there are two steps involved per iteration: the loop management done by for and the call to print(). In this case, we would be talking about 20 steps in total, one for for and one for print() per iteration.
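One way to sanity-check this count is to instrument the loop with a step counter, using the same two-steps-per-iteration convention as above:

```python
l = 10
steps = 0
for i in range(l):
    steps += 1  # the for step for this iteration
    print(i)
    steps += 1  # the print() step
print(steps)  # 20
```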

Now let's change the code a little bit:


    for i in range(l):
        print(i)
        for j in range(l):
            print(j)

If we use l = 10, how many times will print() be used?
Print will be used 10 times by the outer loop, and 10 times by the inner loop for each iteration of the outer loop. This means that print will be used 10 + 10 * 10 = 110 times. The output would be the following:


0
0
1
2
3
4
5
6
7
8
9
1
0
1
2
3
4
5
6
7
8
9

...and so on, with the same pattern repeating for i = 2 through 9, giving 110 lines of output in total.

What about the number of steps? Would it just be one step per call to print()?
Similarly to the previous question, we need to consider both the inner and outer loop, and their interaction. The outer loop takes 2 steps per iteration (for and print()), contributing 20 steps. On each of those iterations the inner loop runs l = 10 times at 2 steps each, contributing 20 steps per outer iteration, or 200 steps overall. In total, 200 + 20 = 220 steps.
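The same instrumentation trick confirms the count for the nested loops:

```python
l = 10
steps = 0
for i in range(l):
    steps += 1          # outer for step
    print(i)
    steps += 1          # outer print() step
    for j in range(l):
        steps += 1      # inner for step
        print(j)
        steps += 1      # inner print() step
print(steps)  # 220
```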

As we can see, proper analysis of the code needs to take place in order to decipher how the loops interact and, finally, how many steps will be taken. 

Sunday, 2 November 2014

Solving a Mystery, Case by Case

Some of the proofs we have seen before were pretty direct. When you have P(x) ⇒ Q(x), there is little that can go wrong! But things do become a little bit more complicated when we encounter ∧. This conjunction symbol indicates that, if we are to prove something, we need to show that both sides are true. When you find a conjunction in the antecedent, such as having P(x) ⇒ Q(x) where P(x) ⇔ R(x) ∧ S(x), you can just leave it as the assumption.

For example:

∀ x ∈ ℕ, R(x) ∧ S(x) ⇒ Q(x)

Would lead us to:
Assume x ∈ ℕ
    Assume R(x) ∧ S(x)
         ....
    Then R(x) ∧ S(x) ⇒ Q(x)
Then ∀ x ∈ ℕ, R(x) ∧ S(x) ⇒ Q(x)

This is fairly easy, and makes our lives easier! Nonetheless, there will be plenty of times when we will want to prove a conjunction in the consequent. To do this, we will have to work on the two parts almost as separate cases. Let's look at the converse of the previous statement, and let's assume it is true:

∀ x ∈ ℕ, Q(x) ⇒ R(x) ∧ S(x)

This would lead us to:
Assume x ∈ ℕ
    Assume Q(x)
         Then...
             Assume... #case 1 that leads to proving R(x)
                 Then...
             Assume... #case 2 that leads to proving S(x)
                 Then...
         Then...
    Then Q(x) ⇒ R(x) ∧ S(x)
Then ∀ x ∈ ℕ, Q(x) ⇒ R(x) ∧ S(x)

As you can see, separate sub-proofs spawn while we are working on this proof. The important thing is to remember that these cases need to lead to proving R(x) and to proving S(x), and they need to stay within the context of the assumption Q(x). That sounds like a lot of work, but many times it is easy to see the light!
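As a concrete toy instance, we can machine-check a claim of the shape Q(x) ⇒ R(x) ∧ S(x) over a finite range; the predicates below are invented purely for illustration.

```python
# Hypothetical predicates: Q(x): x >= 4, R(x): x >= 2, S(x): x*x >= 8
Q = lambda x: x >= 4
R = lambda x: x >= 2
S = lambda x: x * x >= 8

# Q(x) => (R(x) and S(x)), checked over a finite slice of the naturals
assert all((not Q(x)) or (R(x) and S(x)) for x in range(10000))
```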