There are many proposed definitions of artificial intelligence (AI), each with its own slant, but most are roughly aligned around the concept of creating computer programs or machines capable of behavior we would regard as intelligent if exhibited by humans. John McCarthy, a founding father of the discipline, described the process in 1955 as “that of making a machine behave in ways that would be called intelligent if a human were so behaving.”
“Unplugging” from social media does not necessarily equate to quitting. As The Happiness Effect author Donna Freitas found, the decision to temporarily quit social media is common among university students. Some students quit because they feel “too obsessed” or “addicted,” while others cite online drama as their reason for taking a break.
The hype surrounding technological progress holds that it will change the world and make life better for everyone. For young technologists this may be true, but their blinkered vision fails to recognise that many people, not just the elderly, cannot cope with electronic communications or enjoy the benefits of on-line shopping, banking, and the like. In many developed nations, 25% of adults are of retirement age.
Technological advances have provided immense improvements in our lives, but often with a hidden cost. Even the historic skills of bronze and iron working were driven by a desire not only for ploughs and tools, but for better weapons of war. This is still the case for much of modern science. Technical knowledge has helped to combat disease, improve health, provide more food, offer faster travel, and ease hardship, and this is progress.
The hack of the Democratic National Committee by the Russian government and the subsequent publication of confidential emails during the 2016 US presidential election elevated cyber security in the context of international affairs to an unprecedented level in the public’s consciousness, not only in the United States but around the world.
Every country that is on the ascendant feels the need for a “coming out” party. In the last half century, that need has been met most often by hosting the Olympic Games. Japan did it in 1964, South Korea followed in 1988, and China in 2008. The Olympic itch seems to come in the wake of economic growth that takes per capita income to the vicinity of $6,000.
The ancient Greek philosophers believed that the Sun, Moon, planets, and stars were mathematically perfect orbs, made from unearthly materials. These bodies were believed to move on perfectly symmetric celestial spheres, through which a backdrop of fixed stars could be seen, rotating majestically every 24 hours. At the centre was the motionless Earth. For the Greeks, the power of reason was more important than observation.
When we walk into a restaurant, we are often confronted by the sight of people taking pictures of their food with their smartphones. Online, our Facebook feeds seem dominated by pictures of people’s hamburgers and desserts. What is going on with food porn? How is consumer desire itself transformed by contemporary technology?
The ultimate fate of the right to be forgotten remains to be seen. Although Europe has temporarily resolved this question in favor of the right by adopting its General Data Protection Regulation, many questions surrounding the issue still must be answered. It’s unclear whether other parts of the world will follow Europe’s lead. Internationally, writers are exploring some of these matters.
A chatbot, or chatterbot, is a computer program designed to engage in conversation through written or spoken text. It was one of the words on the Oxford Dictionaries Word of the Year 2016 shortlist. The idea of a chatbot originates with Alan Turing’s mid-twentieth-century aspiration to build a thinking computer. Turing proposed a test to determine what might count as success in this venture.
The media is full of stories about how this or that area of the brain has been shown to be active when people are scanned while doing some task. The images are alluring and it is tempting to use them to support this or that just-so story. However, they are limited in that the majority of the studies simply tell us where in the brain things are happening. But the aim of neuroscience is to discover how the brain works.
Unidentified aerial phenomena, commonly referred to as UFOs, have been the focus of research by sociologists, scholars of religion, anthropologists, philosophers, and astronomers. The information age now offers new and innovative ways to study the phenomena, and author Diana Walsh Pasulka sat down with astronomer and computer scientist Jacques Vallee to discuss how “big data” and information processing will influence the field of study.
Settlers in North America during the 1600s and 1700s grew and raised all their own food, with tiny exceptions, such as importing tea. In the nineteenth century, well over 80 percent of the American public either lived at one time on a farm or made their living farming. Today, just over 1 percent does that in the United States, even though there is a surge going on in small organic family farming.
The internet is arguably the most important invention in recent history. To recognize its importance, World Internet Day is celebrated each year on October 29, the date on which the first electronic message was transferred from one computer to another in 1969. At that time, a UCLA student programmer named Charley Kline, working under the supervision of his professor Leonard Kleinrock, transferred a message from a computer housed at UCLA to one at Stanford.
We have reached an age in which the advance of technologies, including mobile applications, artificial intelligence, and virtual and augmented reality, may spike rapidly at any moment, raising the risk of unforeseen consequences in the form of distraction-related injury and illness. In the not-too-distant past, logging onto the internet meant sitting in front of a computer.
Mark Twain is reputed to have quipped, “Reports of my death have been greatly exaggerated.” Such hyperbole aptly applies to predictions that digital reading will soon triumph over print.
In late 2012, Ben Horowitz, co-founder of the venture capital firm Andreessen Horowitz, declared, “Babies born today will probably never read anything in print.” Now four years on, the plausibility of his forecast has already faded.