Part 1: Computers
We are going to start by answering a question so fundamental that you might even laugh that I ask it at all: what is a computer? Yes, I know you are using one right now, but do you really know what a computer is? What is the difference between a computer and a calculator? Other than size and shape, what makes different computers different? What makes one computer better than another?
We are going to learn the answers to all of these questions eventually. For now, try to take a stab at answering that first question.
Q: What is a computer?
A: Something that… computes?
Okay, but what does it mean to “compute”? Feel free to try to come up with a definition on your own, but this is actually a really difficult question; even computer scientists had a hard time defining it! The first decent definition was put forth in the 1930s, and even though it is the one we use today, computer scientists are still not sure whether a better one might exist.
But what is this definition?
A computation is any function that can be evaluated using an algorithm.
Whoa, terminology alert! Let’s break it down:
A function is something that can be given input and will return output. And if you give it the same input several times, it must return the same output each time.
For example, “Given two numbers, what is their sum?” is a function. Its input is two numbers and its output is a single number. “What day of the week was it on a certain date?” is also a function. With the date as the input, you can look backward or forward through a calendar to find which day it was, such as “Monday” or “Tuesday”, and that name is your output. “What is your favorite color?” is not a function! If I ask you now, your output might be green, but if I ask again later, that could change. The output needs to be the same every time the function gets the same input. You could say that the output depends only on what the input is.
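If you have ever glimpsed a programming language, the word “function” there means nearly the same thing. Here is a minimal sketch in Python of the sum example; the name is mine, just for illustration, and you can read the code as very precise English:

```python
def add(a, b):
    # Input: two numbers. Output: their sum.
    return a + b

print(add(2, 3))  # 5
print(add(2, 3))  # still 5: the same input always gives the same output
```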
Given a function and its input, the process of determining its output is called evaluating the function.
For my earlier example of finding what day of the week a certain date fell on, looking the date up in a calendar would be “evaluating” the function.
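In code, evaluating a function is just calling it with a concrete input and getting the output back. Here is a sketch of the day-of-the-week example in Python; it leans on the standard library, which has the calendar lookup built in (the function name is mine, not anything standard):

```python
from datetime import date

def day_of_week(year, month, day):
    # Input: a date. Output: the name of the weekday it falls on.
    # The standard library's date type does the calendar lookup for us.
    return date(year, month, day).strftime("%A")

print(day_of_week(1969, 7, 20))  # Sunday (the day of the Apollo 11 landing)
```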
An algorithm is a well-defined, step-by-step process.
In other words, an algorithm is just a list of instructions. However, the instructions must be completely clear; there should be no room for interpretation. For example, “multiply this number by 2” would work as a step in an algorithm, since anyone who knows how to multiply will get the exact same result. However, “draw a pretty picture” would not work in an algorithm because it is ambiguous. What do you mean by “pretty”? What tools can I draw with? What should I draw a picture of?
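This is exactly why programming languages exist: code is a notation for writing algorithms with no room for interpretation. As a sketch, here is an algorithm for summing a list of numbers, spelled out step by step in Python. Every step is precise enough that anyone, or any machine, following it must get the same result:

```python
def total(numbers):
    # An algorithm for summing a list, written as unambiguous steps.
    # Step 1: start a running total at zero.
    running_total = 0
    # Step 2: add each number in the list to the running total, in order.
    for number in numbers:
        running_total = running_total + number
    # Step 3: the running total is the answer.
    return running_total

print(total([1, 2, 3, 4]))  # 10
```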
Now read it again: a computation is any function that can be evaluated using an algorithm. A computation takes input and produces output by following a clear, step-by-step process, and the output depends only on the input. Got it?
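To tie the two terms together, here is one last illustrative sketch: a function (“given a list of numbers, which one is largest?”) along with one algorithm that evaluates it. Same input, same output, produced by a clear step-by-step process; that is a computation:

```python
def largest(numbers):
    # The function: given a non-empty list of numbers, output the largest.
    # The steps below are one algorithm that evaluates it.
    largest_so_far = numbers[0]       # assume the first number is largest
    for number in numbers[1:]:        # check every remaining number
        if number > largest_so_far:   # found a bigger one?
            largest_so_far = number   # remember it instead
    return largest_so_far             # the output

print(largest([3, 9, 4]))  # 9, every single time
```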
You might be wondering: since a computation is a function that can be evaluated using an algorithm, does that mean that there are functions that cannot be evaluated using any algorithm? Yes! They are really cool but unfortunately are beyond the scope of this book. Search the web for “uncomputable functions”.
Now get ready, because here comes the first set of exercises. This is not school, so you do not have to do them, and some problems are tricky enough that I do not expect you to figure them out right away. But the best way to learn is by doing, so at the very least try to think about each question a little before moving on.
Exercises
- Come up with three more functions. What is the input? What is the output? How is the function evaluated?
- Come up with something that takes input and produces output, but is not a function. What makes it not a function? Can you turn your non-function into a function by making it require more input?
- You probably learned how to multiply long numbers by hand in elementary school. Did you realize that the method you learned was an algorithm? Try to write out a multiplication algorithm for multiplying a three-digit number by a one-digit number as step-by-step instructions. Assume that the reader of the instructions knows only how to multiply and add single-digit numbers.