What is “code”? Although the word applies to a variety of areas—including, but not limited to, computer code, genetic code, and ethical code—each distinct type of code shares an important feature: it contains instructions that lead to an intended end. Philip E. Auerswald, author of The Code Economy, argues that code drives human history. Humanity’s constant effort to create and improve allows concepts to actualize and processes to develop.
In the shortened excerpt below, Auerswald discusses “code in action,” explaining not only the what, but the how behind modern innovation.
You are baking chocolate chip cookies. You have arrayed before you the required ingredients: 1 cup of butter, 1 cup of white sugar, 1 cup of brown sugar, 2 eggs, 1 teaspoon of baking soda, 2 teaspoons of hot water, 1/2 teaspoon of salt, and (importantly) 2 cups of chocolate chips. These items, 30 minutes of your own labor, and access to the oven, bowls, and other durable equipment you employ constitute the inputs into your production process. The work will yield 24 servings of two cookies each, which constitute your output.
From the perspective of standard microeconomic theory, we have fully described the production of chocolate chip cookies: capital, labor, and raw material inputs combine to yield an output. However, as is obvious to even the least experienced baker, simply listing ingredients and stating the output does not constitute a complete recipe. Something important is missing: the directions—how you make the cookies.
The “how” goes by many names: recipes, processes, routines, algorithms, and programs, among others.
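Auerswald’s point maps directly onto actual computer code. The short Python sketch below is ours, not the book’s; the step list is a plausible paraphrase of a standard cookie recipe, not the author’s directions. It simply makes explicit that the ingredients are inert data until a process acts on them.

```python
# Ingredients alone do not make cookies; the ordered steps do.
def make_cookies(ingredients):
    """Apply the 'how': each step transforms the inputs toward the output."""
    steps = [
        "cream butter with white and brown sugar",
        "beat in eggs",
        "dissolve baking soda in hot water and add, along with salt",
        "stir in chocolate chips",
        "drop spoonfuls onto a pan and bake",
    ]
    for step in steps:
        print(f"Step: {step}")  # the process, executed in order
    return 24                   # the output named in the excerpt

# Inputs: the ingredient list from the excerpt, as plain data.
ingredients = {"butter": "1 cup", "white sugar": "1 cup",
               "brown sugar": "1 cup", "eggs": 2,
               "baking soda": "1 tsp", "hot water": "2 tsp",
               "salt": "1/2 tsp", "chocolate chips": "2 cups"}
print(make_cookies(ingredients), "servings of two cookies each")
```

Delete the function body and the dictionary is just a shopping list: the “what” without the “how.”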
The essential idea is that the “what” of output cannot exist without the “how” by which it’s produced. In other words, production is not possible without process. These processes evolve according to both their own logic and the path of human decision-making. This has always been true: the economy has always been at least as much about the evolution of code as about the choice of inputs and the consumption of output. Code economics is as old as the first recipe and the earliest systematically produced tool, and it is as integral to the unfolding of human history as every king, queen, general, prime minister, and president combined.
We cannot understand the dynamics of the economy—its past or its future—without understanding code.
The word “code” derives from the Latin codex, meaning “a system of laws.” Today code is used in various distinct contexts—computer code, genetic code, cryptologic code (i.e., ciphers such as Morse code), ethical code, building code, and so forth—each of which has a common feature: it contains instructions that require a process in order to reach its intended end. Computer code requires the action of a compiler, energy, and (usually) inputs in order to become a useful program. Genetic code requires expression through the selective action of enzymes to produce proteins or RNA, ultimately producing a unique phenotype. Cryptologic code requires decryption in order to be converted into a usable message.
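To make that last example concrete, here is a minimal sketch in Python (our illustration, not the author’s). The lookup table covers only the letters needed here; the point is that the stored symbols are useless until the decoding process runs.

```python
# A small illustration of cryptologic code: the encoded message is inert
# until a decoding process converts it into a usable message.
MORSE = {"-.-.": "C", "---": "O", "-..": "D", ".": "E",
         ".-": "A", "-...": "B"}

def decode(signal: str) -> str:
    """The process that turns stored code into its intended end."""
    return "".join(MORSE[symbol] for symbol in signal.split(" "))

print(decode("-.-. --- -.. ."))  # -> "CODE"
```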
To convey the intuitive meaning of the concept I intend to communicate with the word “code,” as well as its breadth, I use two specific and carefully selected words interchangeably with code: recipe and technology.
My motivation for using “recipe” is evident from the chocolate chip cookie example I gave above. However, I do not intend the culinary recipe to be only a metaphor for the how of production; the recipe is, rather, the most literal and direct example of code as I use the word. There has been code in production since the first time a human being prepared food. Indeed, if we restrict “production” to mean the preparation of food for consumption, we can start by imagining every single meal consumed by the roughly 100 billion people who have lived since we human beings cooked our first meal about 400,000 years ago: approximately four quadrillion prepared meals have been consumed throughout human history.
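The four-quadrillion figure is a back-of-the-envelope estimate. One set of assumptions that reproduces it (ours, not stated in the excerpt) is roughly three meals a day over an average lifespan of about 36 years:

```python
# Rough reconstruction of the estimate; the per-person assumptions are ours.
people = 100e9             # humans who have ever lived, per the excerpt
meals_per_day = 3          # assumed
avg_lifespan_years = 36.5  # assumed historical average lifespan
total_meals = people * meals_per_day * 365 * avg_lifespan_years
print(f"{total_meals:.1e} meals")  # ~4.0e+15, about four quadrillion
```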
Each of those meals was in fact (not in theory) associated with some method by which the meal was produced—which is to say, the code for producing that meal. For most of the first 400,000 years that humans prepared meals, we were not a numerous species, and the code we used to prepare meals was relatively rudimentary. Therefore, the early volumes of an imaginary “Global Compendium of All Recipes” dedicated to prehistory would be quite slim. However, in the past two millennia, and particularly in the past two hundred years, both the size of the human population and the complexity of our culinary preparations have taken off. As a result, the size of the volumes in our “Global Compendium” would have grown exponentially.
Let’s now go beyond the preparation of meals to consider the code involved in every good or service we humans have ever produced, for our own use or for exchange, from the earliest obsidian spear point to the most recent smartphone. When I talk about the evolution of code, I am referring to the contents of the global compendium containing all of those production recipes. They are numerous.
This brings me to the second word I use interchangeably with code: technology. If we have technological gizmos in mind, then the leap from recipes to technology seems big. However, the leap seems smaller if we consider the Greek origin of the word “technology.” The first half derives from techné (τέχνη), which signifies “art, craft, or trade.” The second half derives from the word logos (λόγος), which signifies an “ordered account” or “reasoned discourse.” Thus technology literally means “an ordered account of art, craft, or trade”—in other words, broadly speaking, a recipe.
Substantial anthropological research suggests that culinary recipes were the earliest and among the most transformative technologies employed by humans. We have understood for some time that cooking accelerated human evolution by substantially increasing the nutrients absorbed in the stomach and small intestine. However, recent research suggests that human ancestors were using recipes to prepare food to dramatic effect as early as two million years ago—even before we learned to control fire and began cooking, which occurred about 400,000 years ago.
Simply slicing meats and pounding tubers (such as yams), as was done by our earliest ancestors, turns out to yield digestive advantages that are comparable to those realized by cooking. Cooked or raw, increased nutrient intake enabled us to evolve smaller teeth and chewing muscles and even a smaller gut than our ancestors or primate cousins. These evolutionary adaptations in turn supported the development of humans’ larger, energy-hungry brain.
The first recipes—code at work—literally made humans what we are today.
Featured image credit: “Macro Photography of Pile of 3 Cookie” by Lisa Fotios. Public Domain via Pexels.