Last week’s cyberattack raises an important question about society’s ability to cope with its increasing dependence on technology.
Last week, a massive cyberattack struck users, companies and organisations across the globe. WannaCry, a form of ransomware, spread between outdated Windows systems worldwide. The victims of this attack include the Spanish telecommunications company Telefónica, the American courier service FedEx, and the UK’s National Health Service (NHS).
This attack has served as a warning to society of the need to update software promptly in order to ensure maximum security online. In the NHS’s case, hospital managers had neglected to install a patch released in April which Microsoft claims would have immunised their computers against such an attack. The NHS has since claimed that the patch did not work. Either way, questions must be asked about the wisdom of running an obsolete computer system in such an important and sensitive organisation.
From the point of view of Millennials, however, this act of cyberterrorism raises a far bigger question: it is a reminder of technology’s inescapable and ever-growing role in society, and it asks whether the members of that society will be sufficiently prepared.
It is clear that our growing dependence on technology allowed this attack to occur. First, the far-reaching damage of the virus was facilitated by the fact that one computer system formed the basis of technological setups all around the world. Second, Bitcoin, the inchoate virtual currency tipped to become a mainstay in future society, permitted the hackers to demand a ransom from behind the safety of anonymity. In this manner, international giants and a national health service were temporarily brought to their knees in the blink of an eye.
Technology’s role as the bedrock of all areas of society is only set to become more established. A presentation on the future of Artificial Intelligence (AI) by Henderson Global Investors highlighted the exponential rise of technology through two examples. The first concerns scale: whereas the personal computer attracted one billion users, 2.5 billion people currently use the mobile cloud, and the rapidly approaching Internet of Things is expected to comprise hundreds of billions of devices. The second demonstrates the accelerating pace of adoption: it took 62 years for the telephone to reach 80% adoption, 33 years for electricity, 20 years for the TV, eight years for social media, and as few as two years are envisaged for AI.
These changes promise incredible benefits, of which we hear much every day. But they also carry less-mentioned risks, such as those exposed by last week’s cyberattack. To mitigate these risks in a world where all aspects of life are dictated by lines of code and invisible pulses, we must become more knowledgeable about the foundations of the world around us.
The ramifications of failing to improve technology education could be severe. Cyberterrorism will become more commonplace as its reach and potential rewards increase, and ignorance of the ways to avert such danger could leave us needlessly vulnerable.
An unprecedented degree of ignorance about everyday operations would also dramatically increase reliance on those who do design and therefore understand them. The upshot would be a reversion to a form of inequality not seen since the absolute monarchies of the Middle Ages, as not only wealth but also power is cordoned off to the tiny minority which possesses the knowledge and means to control the bedrock of society. It would be technocracy on steroids, in which a ubiquitous threat is kept at bay solely by a minuscule elite. Ayn Rand’s Atlas Shrugged would seem a vision of egalitarian Utopia compared to this.
This might seem far-fetched at first, but one only has to look back at recent changes caused by technological advancement to see that it is far from hyperbolic. In the past, someone with basic engineering skills and a handy grasp of automotive vehicles could repair and fine-tune their own car. Nowadays, as nearly all functions are computerised, specialists in garages must be sought for the simplest problems. Similarly, everyday items and household amenities are becoming increasingly complex and sophisticated, and thus reliant on experts for upkeep. This trend will only continue.
The result of all this is an urgent need to introduce a higher level of compulsory technology education for all members of society. This could come in the form of an expanded Information Technology (IT) curriculum at school, compulsory to a later age than it currently is. It should not merely teach students how to use Microsoft Excel or surf the Internet safely and effectively. It must foster a general understanding of just how much society now relies upon technology, and of the ways in which its members can cope and thrive in this new age.
Of course, not everybody should be expected to tackle giant cyberattacks. But greater knowledge of how to identify them, the forms in which they come, and the simple steps by which we can guard against them would prevent many incidents in the future.
For example, phishing remains the most common entry route for computer viruses, despite being incredibly easy to avert with the correct training and proficiency. That proficiency must become universal.
This is not an extension of the argument that technology and the sciences are more valued by contemporary society than the humanities. Given the existing dearth of adequate IT education, it is instead about recognising the importance of technology in the daily lives of all citizens, not just those working in the sector. Last week’s cyberattack provides a wake-up call that our dependence on technology is ever-increasing, and so too must be our ability to use and understand it.
Marcus Solarz Hendriks