The idea that the United States economy runs on information is so self-evident and commonly accepted today that it barely merits comment. There was an information revolution. America “stopped making stuff.” Computers changed everything. Everyone knows these things, because of an incessant stream of reinforcement from liberal intellectuals, corporate advertisers, and policymakers who take for granted that the US economy shifted toward a “knowledge-based” economy in the late twentieth century.
Pre-eminent among writers of mystery stories is, in my opinion, Dorothy L. Sayers. She is ingenious, witty, original – and scientific too, including themes like the fourth dimension, electroplating, and the acoustics of bells in some of her best stories. She is also the inventor of the voice-activated lock, which her hero Lord Peter Wimsey deploys in the 1928 short story ‘The Adventurous Exploit of the Cave of Ali Baba’.
Our lives are full of distractions: overheard conversations, the neighbor’s lawnmower, a baby crying in the row behind us, pop-up ads on our computers. Much of the time we can mentally dismiss their presence. But what about when we are reading? I have been studying how people read in print versus on digital devices.
The words digital economy conjure images of young, tech-savvy entrepreneurs breaking moulds in a world where technology is disruptive. But could the reality be much more mundane and mercantile? When Facebook released “Facebook at Work” earlier this year, the social networking goliath laid a huge challenge at the feet of LinkedIn, a powerful incumbent that had until then dominated its corner of the market.
The International Space Station was originally conceived as our base camp to the stars – the first step in a long journey of human civilisation exploring new planets, asteroids, and galaxies, and perhaps even helping us to meet other forms of life in the universe along the way. The station is an incredible feat of human engineering, politics, and bravery.
Noise barriers are not regarded with a great deal of affection. In fact, they’re not much regarded at all; perhaps not surprising, given that the goal of their installers is to ensure that those who benefit notice neither the barrier nor the noise sources it hides. The majority are basic workmanlike structures, built according to tried and trusted principles.
During the night of 3–4 September 1946, things were stirring in the basement of the internal medicine department at the university hospital of Lund, in southern Sweden. A 47-year-old man had been admitted for treatment. His main problem was uraemia (an excess of urea in the blood), but he was also suffering from silicosis (a lung disorder) complicated by pneumonia.
There has been much recent talk about a possible robot apocalypse. One person who is highly skeptical about this possibility is philosopher John Searle. In a 2014 essay, he argues that “the prospect of superintelligent computers rising up and killing us, all by themselves, is not a real danger”.
How does one preserve the ephemera of the digital world? In a movement as large as the Arab Spring, with a huge digital imprint that chronicled everything from a government overthrow to the quiet boredom of waiting between events, archivists are faced with the question of how to preserve history. The Internet may seem to provide us with the curse of perfect recall, but the truth is it’s far from perfect — and perhaps there’s value in forgetting.
The Bodleian recently launched a festival celebrating drawing. As part of this, the artist Tamarin Norwood retreated to our Printing Workshop, turned off her devices and learned how to set type. She proceeded, in her inky and delightful way, to compose a series of Print Tweets.
Oxford University Press is excited to welcome Professor Steve Furber as the new Editor-in-Chief of The Computer Journal. In an interview with Justin Richards of BCS, The Chartered Institute for IT, Steve tells us more about the SpiNNaker project, ethical issues around Artificial Intelligence (AI), and the future of the IT industry.
Can a robot be conscious? I will try to discuss this without getting bogged down in the rather thorny issue of what consciousness really is. Instead, let me first address whether robot consciousness is an important topic to think about. At first sight, it may seem unimportant: robots will affect us only through their outward behavior, which may more or less resemble the behavior we tend to associate with consciousness. But given that behavior, its consequences for us are the same whether or not it is really accompanied by consciousness.
There is a widely held conception that progress in science and technology is our salvation, and the more of it, the better. This is the default assumption not only among the general public, but also in the research community, including university administration and research funding agencies, all the way up to government ministries. I believe the assumption to be wrong, and very dangerous.
A galaxy is a gigantic system containing billions of stars and vast amounts of gas, dust, and dark matter, all held together by gravitational attraction. Typical galaxy sizes range from a few tens of thousands to a few hundreds of thousands of light-years.
In a British Council report, Martin Rose argues that the way STEM subjects are taught reinforces the development of a mind-set receptive to violent extremism. Well-taught social sciences, on the other hand, are a potentially powerful intellectual defence against it. Whilst his primary focus is the Middle East and North Africa (MENA), he draws implications for education in the West.
While myriad forces are changing the face of contemporary healthcare, one could argue that nothing will change the way medicine is practiced more than current advances in technology. Indeed, technology is changing the entire world at a remarkable rate – with mobile phones, music players, email, databases, laptop computers, and tablets transforming the way we work, play, and relax.