Thursday 4 December 2014

Slog 4: Paper Folding Exercise

We start by taking an A4 sheet of paper and holding it stretched out, one end between the left thumb and index finger and the other between the right thumb and forefinger. We then fold the sheet so that the left end lands on top of the right end. This process is repeated several times, and then the paper is unfolded entirely. Now, we notice that there are creases that either point up or down.
Here is the sequence for the first few folds:

1) D
2) U D D
3) U U D D U D D
4) U U D U U D D D U U D D U D D
5) U U D U U D D U U U D D U D D D U U D U U D D D U U D D U D D

[ U =  up, D = down]

We notice that the middle crease is always down and the part after it is exactly the same as the previous sequence. After looking into it further, we realize that the first half is the previous sequence reversed, but with the ups swapped with downs and vice versa. This can be nicely summed up by the following recursive statement: fold(n) = reverse(fold(n - 1)) + fold(1) + fold(n - 1), where reverse both reverses the sequence and swaps U with D.

For n = 2:
reverse(fold(1)) + fold(1) + fold(1) = U + D + D = U D D

We can also observe that the number of creases grows exponentially with the number of folds: after n folds there are exactly 2^n - 1 creases.

Here is the Python code:

def reverse(s):
    # Reverse the sequence and swap each U with D (and vice versa).
    swapped = ""
    for i in range(len(s)):
        ch = s[len(s) - 1 - i]
        swapped += "U" if ch == "D" else "D"
    return swapped


def fold(n):
    base = "D"
    if n < 1:
        return ""
    elif n == 1:
        return base
    else:
        return reverse(fold(n - 1)) + base + fold(n - 1)
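As a quick sanity check, here is a minimal, self-contained sketch (separate from the code above) that rebuilds the sequences iteratively from the same recurrence and checks the 2^n - 1 crease count along the way:

```python
def flip_reverse(s):
    # Reverse the sequence and swap U <-> D.
    return "".join("U" if ch == "D" else "D" for ch in reversed(s))

# Build each sequence from the previous one: flip_reverse(prev) + "D" + prev.
seq = "D"
for n in range(1, 6):
    print(n, seq)
    assert len(seq) == 2 ** n - 1  # crease count grows as 2^n - 1
    seq = flip_reverse(seq) + "D" + seq
```

Running it reproduces the five sequences listed above.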


Tuesday 4 November 2014

Slog 3: O(n) notation

We, as computer scientists, care about the efficiency of the programs we write, especially when we care about how a program scales with the size of its input. One of the ways we can define the efficiency of a program is by measuring its running time. The running time is defined as the number of steps executed, and it is defined this way so that the efficiency of the algorithm is independent of hardware. So basically, we want to know how the running time increases as the size of the input increases.

Big-O notation acts as an upper bound on the running time of an algorithm, and it is used to classify functions depending on how they scale. Some of the most common ones, from slowest to fastest growth rate:
1) f(n) = 1
2) f(n) = log n
3) f(n) = sqrt(n)
4) f(n) = n
5) f(n) = n^2
6) f(n) = 2^n
7) f(n) = n^n

As an example, a more rigorous definition of f ∈ O(n^2): $$ \exists c \in \mathbb{R}^+, \exists B \in \mathbb{N}, \forall n \in \mathbb{N}, n \geq B \Rightarrow f(n) \leq cn^2 $$
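To make the growth rates concrete, here is a small sketch (my own example, not from lecture) that counts loop iterations — our hardware-independent "steps" — for a linear search and a binary search as the input size doubles; the linear count doubles each time while the binary count barely moves:

```python
# Count loop iterations (a proxy for "steps") for two search algorithms.
def linear_steps(items, target):
    steps = 0
    for x in items:
        steps += 1
        if x == target:
            break
    return steps

def binary_steps(items, target):
    # items must be sorted; halve the search range each step.
    steps, lo, hi = 0, 0, len(items) - 1
    while lo <= hi:
        steps += 1
        mid = (lo + hi) // 2
        if items[mid] == target:
            break
        elif items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return steps

for n in (1000, 2000, 4000, 8000):
    data = list(range(n))
    # Worst case for linear search: the target is the last element.
    print(n, linear_steps(data, n - 1), binary_steps(data, n - 1))
```

This is the difference between f(n) = n and f(n) = log n in the list above.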

Tuesday 14 October 2014

Slog 2: Proofs

A proof is a structured argument from initial hypotheses to a conclusion such that each step in the proof follows the laws of logic; it is one of the ways to find the truth. It is especially useful in abstract cases, because there is no empirical evidence and we can only argue based on logic. In logic proofs, the hypotheses are called premises and must already be proven in order to be deemed usable, and the conclusion is generally the statement you are trying to prove (not true in some cases, such as proof by contradiction).

Here is an example of a proof we did in class:
For all n is in N, if n is odd, n^2 is odd
    Assume n is in N
        Assume n is odd
            Then there exists j is in N such that n = 2j + 1
            Then n^2 = (2j + 1)^2
                     = 4j^2 + 4j + 1
                     = 2(2j^2 + 2j) + 1
            Then there exists k is in N, namely k = 2j^2 + 2j, such that n^2 = 2k + 1
        Then n^2 is odd
    Then n is odd implies n^2 is odd
Then for all n is in N, if n is odd, n^2 is odd

We can also use this result to prove that n^2 is even implies n is even, since that is the contrapositive of our result. This format resembles Python code due to its emphasis on indentation, and I like it more than the usual format for mathematical proofs because it feels like point form but still conveys the entire meaning of the proof.
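The result (and its contrapositive) can also be spot-checked numerically; a quick sketch:

```python
# Spot-check "n odd => n^2 odd" and the contrapositive
# "n^2 even => n even" over a small range of naturals.
for n in range(100):
    if n % 2 == 1:
        assert (n * n) % 2 == 1
    if (n * n) % 2 == 0:
        assert n % 2 == 0
print("both directions hold for n in 0..99")
```

Of course, checking 0..99 is only evidence, not a proof — the structured argument above is what actually establishes the claim for all n.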





Monday 29 September 2014

Slog 1: Basics of logic and fun with negation

Over the past weeks, we learnt the basics of logic that I got a glimpse of in MAT137. I have liked puzzles since I was a kid and elements of logic come naturally from the exercise of solving puzzles. 
One thing that I had some trouble with in the first weeks of class was translating from English to the language of math. For example, "There is a pen, and a telephone" would translate to (∃x ∈ O, P(x)) ∧ (∃x ∈ O, T(x)), not ∃x ∈ O, P(x) ∧ T(x) as I had originally thought. Another segment of the lecture I had trouble with was "sentence vs statement", which sounds quite simple at first glance, but 2 hours into a 3 hour lecture even the easiest topics take a while to register. It also didn't help that many of the examples seemed to have the same context but were unrelated; I felt they were somewhat poorly phrased and only complicated the subject matter.

After learning the basic symbols and language of logic, we learned some quantifiers and ways to evaluate logic statements. But what interested me most was our introduction to negation. I think it may have been because we did a lot of exercises negating logic statements; the lectures felt more like puzzle-solving sessions and were much more enjoyable.
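To give a concrete example of the kind of negation exercise I mean (my own example, not one from lecture), reusing the pen-and-telephone symbols from above:

The negation of ∀x ∈ O, P(x) ⇒ T(x) is ∃x ∈ O, P(x) ∧ ¬T(x). To deny that every pen is also a telephone, you only need to exhibit one object that is a pen but not a telephone: the ∀ flips to ∃, and the implication becomes a conjunction with the conclusion negated.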