Stephen Hawking Is Right: A.I. Is Terrifying

Famed and absolutely brilliant physicist Stephen Hawking has been in the news recently for his musings on Artificial Intelligence (A.I.). As one story quotes the scientist:

One can imagine such technology outsmarting financial markets, out-inventing human researchers, out-manipulating human leaders, and developing weapons we cannot even understand. Whereas the short-term impact of AI depends on who controls it, the long-term impact depends on whether it can be controlled at all.

Hawking is right. The scary part is that a financial market drives the need and desire to create Artificial Intelligence. It is like an arms race, in many ways: Who can come up with the smartest, the quickest, the most human-like, the most… and on and on.

The terrifying part is this:

Hawking is right about A.I. in the same way that medical professionals and biologists were right about antibiotic resistance.

Hawking is right about A.I. in the same way that climate scientists and environmental scientists were right about the devastating effects of our overconsumption of resources, leading to a quickly shifting climate.

What’s so terrifying about that? What is horrifying about these comparisons is that very few members of the public and almost no corporations heeded these warnings. Doctors and pharmaceutical companies still push antibiotics in unnecessary situations, bringing us to the brink of a post-antibiotic age. Manufacturers, corporations, and government agencies still allow stunning amounts of pollution and carbon into our environment, opening the door to globally dire conditions.

The biggest problem is that we are at the absolute tipping point of both stories: climate change is evident, but its full effects are not yet felt. We haven’t had a pandemic of drug-resistant infection yet, but it is not a matter of “if,” it is a matter of “when.” Once these things happen and the global population feels the heat, everyone will be stunned. They will all wonder why we didn’t do anything to stop it.

Well, we did try to stop it. The informed ones tried: the people who were willing to change their lifestyles to limit consumption, the people who questioned every antibiotic prescribed to them. I’m not saying we can cut our consumption of resources to zero, nor am I saying that you shouldn’t take the medications your doctor prescribes. What I’m saying is that the overarching problems have been here all along, and just as claiming you didn’t realize you were speeding won’t get you out of a ticket, ignorance does not excuse the offense.

So when Stephen Hawking tells us that Artificial Intelligence needs to be closely monitored and limited, and that it could cause an existential breakdown of humanity, I personally believe we should listen.

Science and technology are wonderful things. I appreciate them. But I am not “of” technology. I am not defined by it. I don’t use an electric coffee pot. I own a television from the early 2000s. I do not believe that just because a technology exists, we should immediately adopt it.

We are speeding toward a cultural horizon of exorbitant consumption, absolute dependency on technology, and a general disregard for the wisdom of those who study the subjects we depend upon.

The human race, particularly those who consume said technology and resources, needs to change its ways. And fast. It is going to affect our wallets and our egos, but I would rather have an empty wallet and a complete life than a fat wallet and an empty life.
