It won’t be long before computers can think faster and better than humans. Artificial intelligence exists, and it is getting smarter at an incredible rate. While so many focus on how well machines can think, alarmingly few people are concentrating on how well humans are thinking (or rather, on how they aren’t thinking as well as they used to). A side effect of epic proportions has accompanied our ventures into superior technology. We have become inseparable from our computers and, as time goes on, we rely on them for more and more of our daily cognitive functions. The brain needs exercise; when you don't use it often enough, your abilities deteriorate. The resources at our disposal enable us to perform more advanced tasks faster than ever before, but when it comes to plain old thinking, is technology making us dumb? The answer is two-sided. On one hand, our ability to locate information is better than it has ever been. On the other, we are retaining far less than we ever have.
The Google Effect
If you Google "Is technology making us dumb," a slew of responses show up, mostly in the affirmative. The irony is that enough people Google this to warrant the creation of information on the topic. The fact that our first instinct when faced with such a question is to look to Google is highly indicative of the answer.
A variety of studies indicate that a vast majority of people suffer from the "Google Effect," a lack of memory created by our propensity to rely on technology for answers. Surveys have found that roughly 90 percent of people exhibit some form of digital amnesia: 70 percent of parents don't have their children's phone numbers memorized, and 49 percent do not know their partner's number. In fact, many adults are more likely to remember a childhood friend's home number from 15 or 20 years ago than their husband or wife's current cell phone number. Back then, the phone didn't do the remembering for you.
In a world where important information is just a swipe away, our brains simply do not commit data to memory. Instead, we treat our technology as an extension of our own memory, and rely on that. This makes us good at remembering where to look for information, but much worse at remembering the information itself. The effect extends to images as well. One study tested the memory of a group of students in a museum: those who took pictures remembered fewer objects, and fewer details about the objects they did recall, than those who simply observed. Many scientists speculate that this carries over into lived memories, too. If you're tweeting or posting on Facebook about an event, you're less likely to take in the smaller details and commit them to memory. Our brains have grown so accustomed to our phones doing the remembering that we pay less attention to life itself, and our memories become dimmer. Yes, you can pull up your Facebook post from that night and relive it secondhand. But is that the same as being able to play the moving picture in your head?
Nicholas Carr, author of The Shallows: How the Internet is Changing the Way We Think, Read and Remember and The Glass Cage: Where Automation is Taking Us, states that, thanks to technology, "There is a superficiality to a lot of our thinking." Carr believes that the loss of memory is contributing to a loss of thought. "It's not as if remembering and thinking are separate processes. The more things you remember, the more material you have to work on, the more interesting your thoughts are likely to be," he says. Furthermore, many important life lessons come from our day-to-day memories. We are at risk of losing our street smarts as much as our book smarts if we cannot vividly remember the events that taught us the proper ways to act and to treat others.
Changes to the Brain
When you look at what our constant use of technology does to the brain, handing five-year-olds iPads becomes a bit alarming. Between 2000 and 2016, the average human attention span fell from 12 seconds to eight seconds, a change that most scientists attribute to constant use of technology. Most teachers feel that this generation of students is more distracted than previous ones, and that technology is contributing to, if not causing, the problem.
Overuse of technology can also disrupt your sleep, which has its own impact on the brain. Studies have shown that blue-enriched light, which is emitted by most electronics with a backlit screen, can suppress the body’s release of melatonin at night. Melatonin regulates your internal clock, and smaller amounts of it in your system can damage your ability to maintain a regular sleep schedule. Aside from side effects such as decreased focus and increasingly bad moods, lack of sleep can actually lead to a loss of brain tissue.
Using GPS for all of your navigation is also damaging to your brain. A series of studies showed that people who rely on GPS to get around have less activity in the hippocampus, an area of the brain involved in both memory and navigation, than those who use maps and learn to navigate by landmarks. Your spatial memory develops far less when you are on GPS autopilot than when you must observe what is around you to determine where you are going and how to get back.
Perhaps the most alarming brain alteration caused by the overuse of technology is the addiction that many people develop. While many of us dismiss the phrase "internet addiction" as something parents and teachers use to scare kids out of playing too many violent video games, a 2012 study showed that spending too much time on the Internet can actually cause changes in the brain that mimic those caused by drug and alcohol dependence. Internet addicts have abnormal white and grey matter in their brains, which disrupts the regions responsible for processing emotion and regulating attention and decision-making. These abnormalities strongly mimic those found in alcohol and drug addicts.
This isn't to say there are no skills that being plugged in can hone. A study of the frequency and habits of teen texting revealed that students who used text abbreviations more often tended to score higher on a measure of verbal reasoning ability. Scientists believe this is likely because the condensed language of texting requires an awareness of how sounds relate to written English.
Creativity and Education
Many will argue that the lack of memorization in today's society leaves us with more space for creative thought. To some degree this is true, but we have not yet figured out how to harness that extra space. Andrew Keen, author of The Internet is Not the Answer, says: "We need to think eclectically and daringly. The big issue is how to teach creativity. We don't need to learn facts, to remember stuff is less important, so the nature of professions are shifting; teachers should bear this in mind. The question is, how do you teach children to think differently?" In schools, children are taught to memorize and regurgitate information. This conflicts intensely with the rise of a society that places more value on new ways to process information, and on the most efficient ways to find it, than on the actual retention of facts.
However, many creative arts are being hindered by technology as well. Have you ever read a vintage magazine? The copy is astounding, and little writing today compares. Copywriting is slowly becoming a lost art. Rather than writing to the magazine reader who will be drawn in by beautiful prose, copywriters must seek the attention of the Googler, who will only see their words if they contain the proper keyword ratio. We are writing to machines more than we are writing to humans, and it is forcing us to sacrifice creativity and art for the sake of post reach. Too often, the writer who can optimize a post with keywords finds more readers than the one who has mastered the art of the written word. It is becoming harder for real writers to make a living without sacrificing quality, because those of lesser talent can churn out SEO articles in quantity and collect more clicks.
On the flip side, many more people are able to explore their creative side because of technology. Just about anyone can start a site to exhibit their writing, photography, painting, drawing, or any other art form they choose to practice. This, perhaps, is the best way to embrace the added space technology leaves for creativity: take advantage of it to hone our natural creative talents, and encourage others to do the same.
One skill that technology has honed in humans is the ability to multitask. With so much information available at the touch of a button, we need to do many things at once to keep up. However, constant multitasking has adverse effects. Studies have shown that multitasking reduces gray matter density in the anterior cingulate cortex (ACC), which handles both emotional and cognitive functions, most notably reward anticipation, decision-making, empathy, impulse control, and emotion. In essence, the ACC acts as a hub that processes messages and assigns control to other areas of the brain, depending on whether those messages are cognitive or emotional. The reduction in gray matter density that comes from multitasking can therefore reduce our ability to make sound decisions, modulate our emotions, feel empathy, and connect emotionally to others. This decreased emotional connection to the world increases stress, distrust, and even aggression.
Is There a Solution?
How do we stop technology from making us dumb? The simplest answer is to unplug, but at this point, that is hardly possible. Technology has created global markets and economies whose shutdown would devastate entire nations. It has opened a world of progress and opportunity that few could have predicted. So how do we keep human intelligence intact without interrupting progress?
Many have turned to science for the answer. CEOs and business owners, in particular, are buying into research on so-called "smart drugs" that can expand the capabilities of the human brain. While this sounds like the stuff of sci-fi films such as Limitless, it is also a reality in the works. Surveys have shown that up to 20 percent of Ivy League students have used medications such as Adderall, Ritalin, and Modafinil to enhance their performance in school. The habit has begun to creep into the professional world, spawning the development of nootropics: drugs designed to make users think faster, remember longer, and concentrate harder.
These drugs give Silicon Valley tech entrepreneurs, Wall Street analysts, and others with intellectually demanding jobs a competitive edge, and counteract the effects of constant technology use. To perform well in the tech industry, you must constantly be plugged in, so many are turning to these drugs to aid their performance. They not only counteract technology's impact on the brain, but take human performance above and beyond. The purpose of the drugs was to enhance, not to bring us back to our previous status quo, but if technology is damaging the brains of our most prized intellectuals, then these drugs may become much more than the tech industry's equivalent of steroids. One must also ask whether previous generations did not seek smart drugs because they lacked the resources or the idea, or because they were not seeing deterioration in their ability to remember and concentrate.
While some are developing new drugs, others are focusing on what drugs we already have can do to improve brain function. J. J. Abrams' Fringe explored the concept of LSD as a brain-enhancing drug in the late 2000s, and extensive research has been done on this front in the real world. From the students' favorites mentioned above to acid, many scientists are trying to isolate the right dosages and formulations of pre-existing drugs that will expand the brain's capability without harmful side effects. It is hard to tell how many people are taking nootropics, but entire online communities are devoted to researching and discussing their use.
The effectiveness of smart drugs is still relatively unknown, and varies extensively from user to user. Many consider their use cheating: if steroids are cheating in sports, aren't smart drugs or nootropics cheating in the professional world? The line is blurry, and it may be years before these drugs are commonplace enough for it to become more defined. Many fear that these drugs will widen the already-gargantuan gap between the privileged and the underprivileged. Others believe they could close the gap by giving the underprivileged a leg up. That all depends on how available they are made to the public and what it costs to acquire them. The viability of widespread administration of these drugs has not been explored, and resting the future of humanity on any drug is concerning. What is certain is that this is the most-pursued solution to the damage we are seeing to the human brain from our technology use. To date, the effects are present and identified, but not detrimental. However, as we continue to develop new technologies and become more reliant on them, there is nothing to say we won't damage our intellectual capabilities beyond repair.