
Will Human Intelligence As We Know It Disappear In The Age Of A.I.?

Who is more intelligent: a caveman, a smartphone, or your brother? And what if we add artificial intelligence to the comparison? Humanity has come a long way from the African savannah to urban skyscrapers, not only physically but also cognitively.


Stone tools, speech, writing, pen, and paper all proved to be decisive tools in putting human intelligence on a path never seen before. What could happen with the emergence of A.I., where it is no longer about using an instrument but about collaborating with a created yet responsive source of intelligence? Is humanity ready for it?

Human intelligence in perspective – Abstracting away

Imagine that on a Saturday morning you decide to make pancakes. From the ability to make that simple decision, through organizing the shopping via language, to following the baking procedure and eating the end product (not to mention agreeing on who washes the dishes!), the process draws on our intelligence more than we ever realize. “The ability to reason, plan, solve problems, think abstractly, comprehend ideas and language, and learn” not only makes numerous people pancake kings, but also makes us the one life form on planet Earth able to shape its environment to its needs. No other animal has ever done anything similar, but we have to acknowledge that our ancestors had to come a long way.

The migration of people to all areas of the Earth, along with the industrialization of modern society, has abstracted modern humans away from their direct environment. Just think about how far humanity has come from the African savannah: instead of chasing an antelope for hours and then striking it dead with a club, you walk into a store with a plastic card and ask for a pre-packaged slice of meat. No savannah, no club, no hunting, and no idea what an antelope looks like in real life. Without far more collective imagination, creativity, and cognitive ability, the latter would hardly be a viable way of gathering food. Over the centuries, greater importance has been placed on collective cognitive ability and intelligence to allow us to function in modern society.

While most animals follow their instincts and genetic codes to survive and even prosper in their respective settings, you would have zero chance of survival in an open office packed with computers, printers, and judgmental colleagues if you relied on human instincts alone. Society has changed so rapidly that evolution’s mechanism of natural selection has not had a chance to catch up with our technological progress. Thus, the part of the brain that processes abstract thought helps us navigate and cope with our “foreign” environments. That’s why intelligence, and especially collective intelligence, has become increasingly important over the past millennia.

[Image: human intelligence. Source: thoughtco.com]

From reciting the Odyssey to forgetting birthdays

Researchers argue that the invention of tools completely rewired the brain of Homo sapiens, but the appearance of writing, later printing, and finally computers had no less of an impact on cognition, memory, and intelligence in general. The brain of the hunter-gatherer scraping meat from the bones of huge mammals with stone tools, that of the agricultural human who started to grow wild rye 12,000 years ago, that of the codex-printing priest, and that of homo technologicus might differ on many levels.

Take, for example, the appearance of writing, dating back to the Sumerians, who invented it around 3,500-3,000 B.C. as a means of long-distance communication necessitated by trade. Writing enabled a fraction of the population to pin down the details of business transactions, but oral storytelling prevailed for centuries to come. Historians say that Homer was not one person but generations of storytellers who recited the book-length epics of the Odyssey and the Iliad over and over again.

With the spread of writing, the need to remember long passages declined, so the individual capacity for recalling facts, numbers, or stories also diminished. The individual human brain settled instead for making sense of texts and writing, while the collective mind grew richer and less forgetful through scrolls and books. The appearance of printing, and later computers, further accelerated this divergence, until collective human intelligence reached the level at which it could discover gravitational waves and black holes, while individuals no longer even remember their friends’ birthdays, since they have Facebook and their smartphones to remind them.

[Image: human intelligence. Source: netdirectdistribution.com]

Are washing machines smarter than us?

Alongside our capacities to process and remember information, researchers debate how human intelligence has changed in general: some argue that it increased (see the Flynn effect), while others maintain that it declined due to genetic changes. However, what is beyond doubt, and also visible from the examples above, is that humans’ reliance on ever more complex technologies has grown so much that it has impacted human cognition, memory-making, and intelligence in general.

We have eagerly traded away our agency, to an ever-increasing degree, for anticipated gains, even when we are unfamiliar with the choices and assumptions built into a particular instance of that trade-off. For example, we let the washing machine do the laundry while we watch TV, although we have no idea how it works, and most people would be in serious trouble if they had to wash their clothes by hand. In general, our risk-taking has paid off: the resulting gains have far outstripped the losses. Thus, humanity seems content to offload responsibilities onto technology in return for more comfort.

[Image: human intelligence. Source: kiswash.co.uk]

Is your laptop or your smartphone the extension of your mind?

With the appearance of personal computers and smartphones, that phenomenon has become so pronounced that some philosophers argue these devices should be treated as extensions of our personality and mind. In ‘The Extended Mind’, Andy Clark and David Chalmers argue that technology is actually part of us.

According to traditional cognitive science, ‘thinking’ is a process of symbol manipulation or neural computation carried out by the brain. Clark and Chalmers broadly accept this computational theory of mind but claim that tools can become seamlessly integrated into how we think. Objects such as smartphones or notepads are often just as functionally essential to our cognition as the thoughts wandering around in our heads. They augment and extend our minds by increasing our cognitive power and freeing up internal resources. Just think for a moment how powerless you would feel without your smartphone for a day: no communication, no calendar, no transport information, no Google, no Wikipedia, no news, no weather forecast, no photos.

The extended mind theory says that we should incorporate technologies into our concept of human intelligence, which might eventually mean treating these objects as external dimensions of ourselves. With ever more sophisticated technologies to which we delegate the most intimate data about our lives, such as health data, it makes less and less sense to draw a clear line between where the technology ends and the human begins. Consider the case in which the FBI reportedly pressed a dead gunman’s finger to the fingerprint-recognition feature on his iPhone in an attempt to unlock it. Regardless of who is affected, it is disquieting to think of the police using a corpse to break into someone’s digital afterlife. Under the extended mind theory, such intrusions are unethical and, if the theory were incorporated into our laws, also illegal.

[Image: human intelligence. Source: youtube.com]

How would A.I. change the course of events?

We are at an interesting transition point where we are moving from using our tools as passive extensions of ourselves to working with them as active partners. An axe or a hammer is a passive extension of the hand, but a drone forms a distributed intelligence together with its operator and is closer to a dog or a horse than to a device. And although artificial intelligence is still the worst joke-teller around, such tools can interact with us in ways never before possible, from scripting interviews based on voice recordings to producing cat videos for YouTube.

Entering an active partnership with ever-learning tools means a significant leveling up from previous eras – for the machines. They are taking over not only manual but also cognitive tasks. The area in which humanity can feel superior to technology is shrinking, and we have arrived at yet another crossroads of redefining what it means to be human. Is it possible that we delegate so many tasks to technologies that we practically find ourselves empty-handed and redundant? When deep learning algorithms teach other algorithms how to play chess, where does the ‘human touch’ make any difference? Israeli historian Yuval Noah Harari has warned several times that one of the biggest challenges of the 21st century will be the danger of humans becoming insignificant compared to technology, first of all A.I.

Although this rather dark vision is definitely further down the road, with the development of artificial general intelligence (AGI) not yet in sight, the fastest-growing field of A.I., artificial narrow intelligence (ANI), with computer vision and natural language processing, is already fit for completing various highly predetermined task sets. However, that does not mean that artificial intelligence can mimic human intelligence, cognition, and consciousness in any way. We are decades, maybe centuries away from AGI and any kind of singularity. Moreover, without humans, not even ANI algorithms perform well. Numerous studies, including in medicine, show that the best results come from the collaboration of A.I. and humans.

[Image: human intelligence. Source: theceomagazine.com]

Perhaps we should all familiarize ourselves with the thought that independent, cognitive human intelligence as we know it will come to an end with the emergence of A.I., and that the future rather lies in the combination of the two.


medicalfuturist.com
