From the Paleolithic age (roughly 2.5 million years ago to 10,000 BC), when archaeologists and anthropologists say the earliest humans began using tools and exhibiting complex behaviours, to the use of stone implements and the domestication of fire between 500,000 BC and 400,000 BC, technology has remained an ever-present feature of human life ― advancing, in fact, in step with the growth and development of the human race.
It started with some of the then-momentous inventions that are today overlooked and dismissed as “common” necessities. But were they really “common”? At a time when humans wandered aimlessly day and night, nothing short of present-day cattle nomads, the advent of shelter sometime before 380,000 BC was anything but common. Add that to the era when humans roamed the bush stark naked (forget the hubbub that greeted Genevieve Nnaji’s flamboyantly explorative decision to flaunt half her bosom at the recent Africa Magic Viewers Choice Awards in Lagos, or the opprobrium that swarmed Agbani Darego after one of her numerous foreign runway sessions, when some eagle-eyed onlookers claimed to have been harassed ― perhaps they were in fact pleasured ― by the glare of her straightforward nipples) and consider what a breathtaking innovation clothing must have been in that era!
On the strength of its addictiveness and compulsiveness, it was always undebatable that technology would remain a permanent feature of humans’ daily living. And so it has remained. The early human, having seen, for example, the comfort of shelter over itinerant living, thought there had to be a better travel alternative to the foot or the camel. Not in any particular order, on came the bicycle, then the motorcycle, the tricycle, the car, the plane, the boat, the ship and the train. A time even came when the thinking shifted from seeking the fastest modes of transportation for communication to fabricating effective modes of communication that defeat the need for transportation altogether. Then came telephony. In a matter of decades, the earliest forms of telephony, which employed scarcely movable “land” gadgets, became obsolete as telephone communication became digitized and mobile. The term “telephony” itself began to enjoy wider usage, stretching from conversations via the phone to faxing, voicemail, mobile communication, Internet calling and video conferencing.
In present times, it is hard to imagine any aspect of daily human activity that has yet to be digitized. In domestic circles, a chore as unavoidable as clothes-washing has been digitized and the associated drudgery removed. Now the machine does the job, and the modern-day woman enjoys the luxury of shedding one extra burden that the early-time woman shouldered like a plague. The manual mode of food-warming has faded into oblivion (save in parts of the developing world populated by technological laggards), following Percy Spencer’s 1945 invention of the microwave oven.
Some of these inventions have been somewhat weird. And one of the weirdest, unarguably, is physiologist Robert G. Edwards’ In-Vitro Fertilisation (IVF), which ensures that pregnancy can occur without intercourse: a woman’s ovum or ova are removed and fertilised with collected semen in a fluid medium in the laboratory. The zygote (fertilised egg) is then transferred to the patient’s uterus with the intention of establishing a successful pregnancy. So surprised and ecstatic was the world at the feat that the inventor was awarded the 2010 Nobel Prize in Physiology or Medicine. Louise Brown, born in 1978 as the first product of that invention after her parents had futilely tried to conceive for nine years, married nightclub doorman Wesley Mullinder in 2004.
Yet Howard Charney, Senior Vice President in the Office of the Chairman and Chief Executive Officer of Cisco, says the world just hasn’t seen enough. Delivering a keynote address, “What Drives the Future?”, at the Cisco Expo 2013, held in Sun City, South Africa, from 3 to 5 March 2013, Mr. Charney singled out the Internet as the driver of the future, noting that “99 per cent of the world” has not even been connected, despite all the talk of the pervasiveness of the Internet and the speed with which it is making inroads into the running of every profession.
In his Internet of Everything (IoE), he sees a wide-ranging Internet interconnectivity ― people with people, people with machines, people with mobiles, people with processes, machines with machines, people with data, people with things, data with things ― that will, in future, find application in industry-specific uses such as the smart grid and connected commercial vehicles, and cross-industry uses such as telecommuting and the avoidance of travel.
“The Internet of Everything brings together people, process, data and things to make networked connections more relevant and valuable than ever before — turning information into actions that create new capabilities, richer experiences and unprecedented economic opportunity for businesses, individuals and countries,” he says. In his estimation, the IoE’s limitless potential includes an opportunity to grow aggregate corporate profits by 21 per cent by 2022. And one reason this is possible: about 50 billion devices will be connected to the Internet by 2020.
In the future, too, Charney sees information delivered immediately after collection ― known in technological terms as “real-time data” ― as holding the potential to reshape the world, a mission it has already begun. Real-time data, for instance, finds application in the war against the sale and consumption of counterfeit drugs, known to be responsible for the deaths of some 700,000 people globally every year.
Already, the real-time-data-powered mPedigree Network partners with the principal telecom operators in Africa, the leading pharmaceutical industry associations on the continent and Fortune 500 technology powerhouses to empower African patients and consumers to protect themselves from the fatal effects of pharmaceutical counterfeiting, particularly in vulnerable parts of the world. “An important side effect,” according to Charney, “is the steady recovery of the more than $200m that legitimate pharmaceutical companies lose daily to this genocide trade.”
So Charney sees ― and rightly, too ― a future that will witness not only further technological advancement to make life simpler to live but also the creation of technologies that will make humans more responsible in life. Some, just some, of these technologies include the mobile sign-language translator; the iWatch, a smart watch with flexible glass that can be shaped to the human wrist, from which people can access their phones and other devices; and Google Glass, an augmented-reality wearable computer with a Head-Mounted Display (HMD) that comes closest to ubiquitous computing ― the idea that the Internet and computers will be accessible anywhere in the world without the physical use of hands ― plus many other yet-unnamed but under-construction innovations that will help to reduce road and air accidents and lessen the toll of natural disasters worldwide.
From developing technologies that ease the stress of daily human living, technology is unequivocally shifting to the stage of redefining existence itself, the very mould of life ― to amazing levels whose end is outright unpredictable. Indeed, the star attraction of the future is not the anxious wait for these technologies to emerge but the sheer impossibility of pinpointing the limits of emergent technologies. And in the absence of any ability to map the finiteness of future technologies, isn’t it just safe to dub the future “the technology of everything”? I suppose so!