The world of computing is full of buzzwords: AI, supercomputers, machine learning, the cloud, quantum computing and more. One word in particular is used throughout computing – algorithm.
In the most general sense, an algorithm is a series of instructions telling a computer how to transform a set of facts about the world into useful information. The facts are data, and the useful information is knowledge for people, instructions for machines or input for yet another algorithm. There are many common examples of algorithms, from sorting sets of numbers to finding routes through maps to displaying information on a screen.
To get a feel for the concept of algorithms, think about getting dressed in the morning. Few people give it a second thought. But how would you write down your process or tell a 5-year-old your approach? Answering these questions in a detailed way yields an algorithm.
To a computer, input is the information needed to make decisions.
When you get dressed in the morning, what information do you need? First and foremost, you need to know what clothes are available to you in your closet. Then you might consider what the temperature is, what the weather forecast is for the day, what season it is and maybe some personal preferences.
All of this can be represented in data, which is essentially a simple collection of numbers or words. For example, temperature is a number, and a weather forecast might be “rainy” or “sunshine.”
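The inputs above can be sketched as data in a short program. This is a minimal, illustrative sketch — the variable names, clothing items, and values are all made up for this example:

```python
# The "input" step: the facts the algorithm needs, stored as simple data.
wardrobe = ["rain jacket", "long-sleeved shirt", "t-shirt", "jeans"]
temperature = 45       # a number (degrees Fahrenheit)
forecast = "rainy"     # a word, e.g. "rainy" or "sunshine"
season = "autumn"
```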
Next comes the heart of an algorithm – computation. Computations involve arithmetic, decision-making and repetition.
So, how does this apply to getting dressed? You make decisions by doing some math on those input quantities. Whether you put on a jacket might depend on the temperature, and which jacket you choose might depend on the forecast. To a computer, part of our getting-dressed algorithm would look like “if it is below 50 degrees and it is raining, then pick the rain jacket and a long-sleeved shirt to wear underneath it.”
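That quoted rule can be written almost word for word as code. The threshold and the two rain items come from the article's example; the fallback choice is an assumption added just to make the sketch complete:

```python
# The "computation" step: a decision rule on the input data.
def pick_outfit(temperature, forecast):
    outfit = []
    if temperature < 50 and forecast == "rainy":
        # The rule quoted in the text.
        outfit.append("rain jacket")
        outfit.append("long-sleeved shirt")
    else:
        outfit.append("t-shirt")  # illustrative fallback, not from the article
    return outfit

print(pick_outfit(45, "rainy"))  # → ['rain jacket', 'long-sleeved shirt']
```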
After picking your clothes, you then need to put them on. This repetition is another key part of our algorithm. To a computer, a repetition can be expressed as “for each piece of clothing, put it on.”
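The repetition step is what programmers call a loop. A minimal sketch, with the list of clothes invented for illustration:

```python
# "For each piece of clothing, put it on" as a loop.
outfit = ["long-sleeved shirt", "jeans", "rain jacket"]
worn = []
for item in outfit:
    worn.append(item)  # stand-in for actually putting the item on
```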
Finally, the last step of an algorithm is output – expressing the answer. To a computer, output is usually more data, just like input. That allows computers to string algorithms together in complex ways, the output of one becoming the input of the next. However, output can also involve presenting information, for example, putting words on a screen, producing auditory cues or some other form of communication.
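Both kinds of output can be sketched in a couple of lines. The list is data another algorithm could consume; the string is information presented to a person. The outfit here is just the one from the article's example:

```python
outfit = ["rain jacket", "long-sleeved shirt"]   # output as data

# Output as presentation: turn the data into words on a screen.
message = "Today's outfit: " + ", ".join(outfit)
print(message)  # → Today's outfit: rain jacket, long-sleeved shirt
```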
So after getting dressed you step out into the world, ready for the elements and the gazes of the people around you. Maybe you even take a selfie and put it on Instagram to strut your stuff.
Sometimes it’s too complicated to spell out a decision-making process. A special category of algorithms, machine learning algorithms, tries to “learn” from a set of past decision-making examples. Machine learning is commonplace for things like recommendations, predictions and looking up information.
For our getting-dressed example, a machine learning algorithm would be the equivalent of your remembering past decisions about what to wear, knowing how comfortable you feel wearing each item, and maybe which selfies got the most likes, and using that information to make better choices.
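The idea of learning from past examples can be illustrated with a toy sketch: score each outfit by the likes its past selfies received, then prefer the best-scoring one. The data, the averaging rule and the outfit names are all made up for illustration — real machine learning algorithms are far more sophisticated, but the principle of turning past examples into future choices is the same:

```python
# Made-up history of (outfit, likes) pairs from past selfies.
past_selfies = [
    ("rain jacket", 12),
    ("t-shirt", 30),
    ("rain jacket", 8),
    ("t-shirt", 25),
]

# Group the likes by outfit.
likes = {}
for outfit, n_likes in past_selfies:
    likes.setdefault(outfit, []).append(n_likes)

# The average likes per outfit act as a "learned" preference score.
scores = {outfit: sum(ns) / len(ns) for outfit, ns in likes.items()}
best = max(scores, key=scores.get)
print(best)  # → t-shirt
```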
So, an algorithm is the process a computer uses to transform input data into output data. A simple concept, and yet every piece of technology that you touch involves many algorithms. Maybe the next time you grab your phone, see a Hollywood movie or check your email, you can ponder what sort of complex set of algorithms is behind the scenes.
• Jory Denny is Assistant Professor of Computer Science at the University of Richmond. This article originally appeared on The Conversation.