Sunday 30 March 2014

Sorting and Efficiency

This is my entry on sorting and efficiency. Sorting refers to any algorithm that orders the elements of a list or some other kind of data structure. We normally measure the efficiency of a sorting algorithm by how many steps (usually comparisons) it performs in the worst-case scenario as the input grows. The efficiency of different algorithms is categorized using "big-O" notation; for example, O(n^2) means that in the worst case the algorithm performs on the order of n^2 steps (some constant multiplied by n^2) in order to sort n elements. Below are some of the main sorting algorithms we have used, specifically on lists, and their efficiency.

Selection sort: Searches through the entire unsorted section of the list to find the smallest element, and moves it to the end of the sorted part of the list (efficiency is O(n^2))
Insertion sort: Sorts the list by moving the next element in the unsorted portion of the list to the appropriate place in the sorted part of the list (efficiency is O(n^2))
Bubble sort: Repeatedly searches through the entire list and swaps the position of adjacent elements that are not in the correct order (efficiency is O(n^2))
Merge sort: Recursively splits the list into halves, then sorts the halves and merges them back together (efficiency is O(nlog(n)))
Quicksort: Picks a pivot (for example, the first element) and partitions the rest of the list into elements smaller than the pivot and elements greater than the pivot, places the pivot between these two partitions, then repeats the same process on each partition (efficiency is O(nlog(n)) on average, but O(n^2) in the worst case, such as an already-sorted list with the first element as the pivot)
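To make one of these concrete, here is a minimal sketch of merge sort following the split/sort/merge description above (the function name and the use of new lists rather than in-place sorting are my own choices for illustration):

```python
def merge_sort(lst):
    """Return a new sorted list containing the elements of lst."""
    if len(lst) <= 1:                    # base case: 0 or 1 elements are already sorted
        return lst[:]
    mid = len(lst) // 2
    left = merge_sort(lst[:mid])         # recursively sort each half
    right = merge_sort(lst[mid:])
    # merge the two sorted halves back together
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])              # one of these is empty; the other
    merged.extend(right[j:])             # holds the leftover sorted tail
    return merged

print(merge_sort([5, 2, 4, 1, 3]))       # -> [1, 2, 3, 4, 5]
```

The merge step does O(n) work at each of the O(log n) levels of splitting, which is where the O(nlog(n)) bound comes from.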
We also talked briefly about Python's built-in sorting method list.sort(), which uses an algorithm called Timsort. It was the fastest of the sorts we compared in practice, but it is still of order O(nlog(n)) in the worst case, as this is the most efficient that any comparison-based sorting algorithm can possibly be.
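A quick sketch of the built-in interface, for reference (nothing here is course-specific):

```python
data = [3, 1, 4, 1, 5]
data.sort()                                     # sorts the list in place (Timsort)
print(data)                                     # [1, 1, 3, 4, 5]

# sorted() returns a new sorted list and leaves the original untouched
print(sorted([3, 1, 4, 1, 5], reverse=True))    # [5, 4, 3, 1, 1]
```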

Sunday 2 March 2014

Recursion

This is my week 7 SLOG entry, on the topic of recursion. In computer science, recursion essentially means writing methods/functions that complete a task by calling themselves a finite number of times. Whenever you write a recursive function it is essential to have what is called a "base case". A base case is an instance of a recursive function that does not involve calling itself; if you do not have a base case, the function will call itself endlessly when you call it, and you will get an error saying "maximum recursion depth exceeded". Recursion is useful because it can help you do complicated tasks by relating them to their simplest cases. For example, consider the Towers of Hanoi problem with three stools, where you have to move a stack of cheeses from one stool to another but you cannot place a larger cheese on top of a smaller one. Though this problem seems complicated, it can be coded very simply using recursion. Your recursive method could use the following logic:

If there is only one cheese on the stool, move it to the destination

Else: Recursively call this function to move all cheeses except the bottom one to an intermediate stool
Move the bottom cheese to the destination stool
Recursively call your function to move all the remaining cheeses from the intermediate stool to the destination stool.
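The steps above can be sketched in a few lines of Python (the function name and the stool labels are just placeholders I've chosen for illustration):

```python
def move_cheeses(n, source, intermediate, destination):
    """Print the moves needed to shift a stack of n cheeses
    from source to destination, using intermediate as a helper stool."""
    if n == 1:                                                  # base case: one cheese moves directly
        print('Move top cheese from {} to {}'.format(source, destination))
    else:
        move_cheeses(n - 1, source, destination, intermediate)  # clear all but the bottom cheese
        move_cheeses(1, source, intermediate, destination)      # move the bottom cheese
        move_cheeses(n - 1, intermediate, source, destination)  # stack the rest back on top

move_cheeses(3, 'A', 'B', 'C')   # prints the 7 moves for a stack of 3
```

Each call with n cheeses makes two calls with n - 1 cheeses, so the number of moves is 2^n - 1.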

This shows how recursion can solve more complicated problems by reducing them to their simplest cases.

Sunday 9 February 2014

This is my SLOG entry for week 5 of CSC148. This week's lab went well, and my partner and I finished the lab with enough time left over to work through some of the bonus problems. I've joined a group for assignment 1 with two other students in my residence, and working with them and sharing ideas has definitely been easier and more helpful than working alone. I found Monday's in-class demonstration of the optimal solution to the three-stool cheese problem to be helpful.

Sunday 2 February 2014

Week 4

This is the second entry of my SLOG, for week 4. The main focus this week was recursion, a topic that I have seen before but not in a great amount of detail. I had not looked at the nesting depth of lists before, but the examples shown in class made it fairly easy to understand. I am still managing to keep up to date on classwork, but I'm definitely noticing a difference between the difficulty of this course and CSC108 which I took last term. I found that working through the lab on Tuesday was particularly helpful in solidifying my understanding of exceptions, and it helped me to complete exercise two. I've also begun work on Assignment 1, and everything seems to be going well so far.

Sunday 26 January 2014

Week 3 - Object Oriented Programming

My name is Torin and this is my first SLOG entry, on the topic of object-oriented programming. I took CSC108 last term as well as two years of computer science in high school, so I already had some experience with object-oriented programming before I started this course, but I've already learned new things this term. I generally understand the logic behind composition and inheritance (for example, the has-a and is-a relationships), but my high school computer science courses were taught in Java, so the main challenge for me so far has been identifying the subtle differences in these topics between Java and Python (however, I have also noticed that there are a LOT of similarities between the two). I was quite happy with how my lab went this week, as my partner and I were able to complete it in slightly less than an hour, as opposed to the first week when it took us the full two hours.
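A minimal Python sketch of the two relationships (the class names here are made up purely for illustration):

```python
class Engine:
    """A simple part, used to show composition."""
    def start(self):
        return 'engine started'

class Vehicle:
    """A general base class."""
    def __init__(self):
        self.engine = Engine()    # has-a: a Vehicle *has an* Engine (composition)

class Car(Vehicle):               # is-a: a Car *is a* Vehicle (inheritance)
    pass

car = Car()
print(isinstance(car, Vehicle))   # True  -- the is-a relationship
print(car.engine.start())         # 'engine started'  -- the has-a relationship
```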