Author: Zoltán Cséfalvay
Title: Freedom, Innovation, Prosperity: The Secrets of Success in the Digital Era
Publisher: MCC Press
ISBN: 9789636441609
Edition: 1
Price: CHF 24.40
Category: Economics
Language: English
Pages: 460
Copy protection: DRM
Devices: PC/MAC/eReader/Tablet
Format: ePUB

Why is it that we cannot predict the technological future? Is the destruction of the old truly an inevitable companion to the birth of the new? Are our reservations about new technologies justified? How will artificial intelligence reshape economic competition among nations? Can the state have any meaningful active role in innovation at all? Why has Europe fallen behind in the innovation race? And why does an invisible iron curtain still exist, one that keeps Central and Eastern Europe from competing with new startups in the global innovation arena? What can we learn from the success of Silicon Valley, Estonia, Israel, or Singapore?
These questions continue to captivate economists, policymakers, and the wider public alike because the past 250 years have clearly shown that the countries that made it into the club of the wealthy were those at the forefront of technological innovation, while the others remained caught in the low- or middle-income trap.
In a comprehensive exploration drawing on illuminating examples and statistical analysis, Zoltán Cséfalvay examines the connections between free competition, the digital economy, technological progress, and innovation, and shows how their complex dynamics lead to success in the competition among nations.


Prof. Dr. Zoltán Cséfalvay heads the Centre for Next Technological Futures at Mathias Corvinus Collegium (Budapest), where he gives lectures and conducts research on digitalisation, robotisation, and artificial intelligence in Europe. Previously, he worked as a senior researcher at the Joint Research Centre of the European Commission in Seville (2019-2020). He served as the Hungarian ambassador to the OECD and UNESCO in Paris (2014-2018) and as Minister of State for Economic Strategy in Hungary (2010-2014). He was Professor of Economic Geography at Andrássy University Budapest (2002-2010) and Professor at Kodolányi János University in Hungary for more than two decades. As a research fellow, he worked in Budapest, Vienna, Munich, Heidelberg, and Cardiff. He is the author of 15 books and more than 80 articles in edited books and peer-reviewed journals in English, German, and Hungarian. He recently published his latest book, FREEDOM, INNOVATION, PROSPERITY: The Secrets of Success in the Digital Era, about the impact of the current wave of new technologies on business, society, and geopolitics.

1

The Future Is Here—Only It’s Different From How We Imagined

PREDICTING THINGS IS HARD, PREDICTING TECHNOLOGY IS PRACTICALLY IMPOSSIBLE

The history of technology is replete with predictions that failed to materialise, even in the short term, and most of these predictions were made by technology experts. Thomas Watson, the first president of IBM, is still legendary for his 1943 statement: “I think there is a world market for maybe five computers.”[1] Given the cupboard-sized computers produced at the time and in the subsequent decades, hardly anyone would have considered keeping a computer at home. This, however, did not prevent IBM from venturing into the personal computer industry when the demand did arrive. Three decades later, in 1977, another charismatic computer developer, Ken Olsen, founder and chief executive of Digital Equipment Corporation (DEC), made a similar prediction: “the personal computer will fall flat on its face in business […]. There is no reason for any individual to have a computer in their home.”[2] DEC, founded in 1957, was at the time the second-largest computer manufacturer in the world, with over 100,000 employees, supplying major corporations, banks, and government institutions with powerful computers that were capable of processing large databases. It’s no wonder, then, that the personal computer was conceived not at IBM or DEC but in a small garage at 2066 Crist Drive, Los Altos, California, in 1975. And it was only after their ideas had been rejected by Hewlett-Packard and Atari, the dominant Californian companies of the day, that 20-year-old Steve Jobs and Stephen Wozniak, then aged 25, set about building the first personal computer.

Around the same time, in 1975, Bill Gates and Paul Allen founded Microsoft. Microsoft’s operating system rapidly became the global standard for personal computers. Still, Gates himself can be credited with a rather pessimistic statement on technology’s limits: “640 kilobytes ought to be enough for anyone.”[3] Incidentally, this was the capacity of the portable data storage unit of the day: the floppy disk. In comparison, a simple flash drive has a capacity of 32 or 64 gigabytes, and to draw an even starker contrast, an iPhone has more storage capacity than the NASA computers had in 1969, when Neil Armstrong became the first human to set foot on the Moon. Or, to put it into perspective, Calum Chace observes that “today’s iPhone, if it could have been built in 1957, would have cost one and a half times today’s global GDP, would have filled a 100-storey building three km long and wide, and would have used 30 times the world’s current power generating capacity.”[4]

A hotly debated topic today, artificial intelligence, as a concept and science, was born in 1956 at an international conference at Dartmouth College, New Hampshire. One of the event’s participants and one of the founding fathers of AI, Marvin Minsky, made a bold promise that “in from three to eight years we will have a machine with the general intelligence of an average human being.”[5] Yet, such machines still do not exist. That is not to say, however, that artificial intelligence and machine learning are not ubiquitous in our everyday lives. Just to name a few examples: we use AI whenever we search for something on Google and the machine makes relevant suggestions for us; or when we apply a spam filter and the machine automatically decides that certain emails should be stored in the spam folder and eventually deleted. When we order a book on Amazon or spend what feels like hours deciding what to watch next on Netflix, we trust the machine to make a suggestion based on our personal taste, just as when we scroll on Facebook or Twitter (now X), the machine decides which posts are prioritised in our feed.[6]

The famous science fiction writer Arthur C. Clarke, author of the 1960s bestseller 2001: A Space Odyssey and co-screenwriter of the eponymous movie directed by Stanley Kubrick, was right when he said that “any sufficiently advanced technology is indistinguishable from magic”.[7] This magic, however, is all gone once the technology becomes common practice, mass-produced and part of our mundane, everyday life. Then the spell vanishes into thin air. Just over a hundred years ago, photography and motion pictures were still considered magic, whereas today anyone can use their smartphone to take photos or videos at will. The same goes for artificial intelligence. As John McCarthy, who came up with the idea for the conference at Dartmouth College, ver