Why Westerners Fear Robots and the Japanese Do Not

https://www.wired.com/story/ideas-joi-ito-robot-overlords


Sometime in the late Eighties, I participated in a meeting organized by the Honda Foundation in which a Japanese professor—I can’t remember his name—made the case that the Japanese had more success integrating robots into society because of their country’s indigenous Shinto religion, which remains widely practiced in Japan today.

Followers of Shinto, unlike Judeo-Christian monotheists and the Greeks before them, do not believe that humans are particularly “special.” Instead, there are spirits in everything, rather like “The Force” in Star Wars. Nature doesn’t belong to us; we belong to Nature, and spirits live in everything, including rocks, tools, homes, and even empty spaces.

The West, the professor contended, has a problem with the idea of things having spirits and feels that anthropomorphism, the attribution of human-like attributes to things or animals, is childish, primitive, or even bad. He argued that the Luddites who smashed the automated looms that were eliminating their jobs in the 19th century were an example of that, and for contrast he showed an image of a Japanese robot in a factory wearing a cap, having a name and being treated like a colleague rather than a creepy enemy.

The general idea that the Japanese accept robots far more easily than Westerners is fairly common these days. Osamu Tezuka, the Japanese cartoonist and creator of Astro Boy, noted the relationship between Buddhism and robots, saying, “Japanese don’t make a distinction between man, the superior creature, and the world about him. Everything is fused together, and we accept robots easily along with the wide world about us, the insects, the rocks—it’s all one. We have none of the doubting attitude toward robots, as pseudohumans, that you find in the West. So here you find no resistance, simply quiet acceptance.” And while the Japanese did of course become agrarian and then industrial, Shinto and Buddhist influences have caused Japan to retain many of the rituals and sensibilities of a more pre-humanist period.

In Sapiens, Yuval Noah Harari, an Israeli historian, describes the notion of “humanity” as something that evolved in our belief system as we morphed from hunter-gatherers to shepherds to farmers to capitalists. As early hunter-gatherers, nature did not belong to us—we were simply part of nature—and many indigenous people today still live with belief systems that reflect this point of view. Native Americans listen to and talk to the wind. Indigenous hunters often use elaborate rituals to communicate with their prey and the predators in the forest. Many hunter-gatherer cultures, for example, are deeply connected to the land but have no tradition of land ownership, which has been a source of misunderstandings and clashes with Western colonists that continues even today.

It wasn’t until humans began engaging in animal husbandry and farming that we began to have the notion that we own and have dominion over other things, over nature. The notion that anything—a rock, a sheep, a dog, a car, or a person—can belong to a human being or a corporation is a relatively new idea. In many ways, it’s at the core of an idea of “humanity” that makes humans a special, protected class and, in the process, dehumanizes and oppresses anything that’s not human, living or non-living. Dehumanization and the notion of ownership and economics gave birth to slavery at scale.

In Stamped from the Beginning, the historian Ibram X. Kendi describes the colonial-era debate in America about whether slaves should be exposed to Christianity. British common law stated that a Christian could not be enslaved, and many plantation owners feared that they would lose their slaves if they were Christianized. They therefore argued that Blacks were too barbaric to become Christian. Others argued that Christianity would make slaves more docile and easier to control. Fundamentally, this debate was about whether Christianity—giving slaves a spiritual existence—increased or decreased the ability to control them. (The idea of permitting spirituality is fundamentally foreign to the Japanese because everything has a spirit, and therefore spirit can’t be denied or permitted.)

This fear of being overthrown by the oppressed, or somehow becoming the oppressed, has weighed heavily on the minds of those in power since the beginning of mass slavery and the slave trade. I wonder if this fear is almost uniquely Judeo-Christian and might be feeding the Western fear of robots. (While Japan had what could be called slavery, it was never at an industrial scale.)

Lots of powerful people (in other words, mostly white men) in the West are publicly expressing their fears about the potential power of robots to rule humans, driving the public narrative. Yet many of the same people wringing their hands are also racing to build robots powerful enough to do that—and, of course, underwriting research to try to keep control of the machines they’re inventing, although this time it doesn’t involve Christianizing robots … yet.

Douglas Rushkoff, whose book, Team Human, is due out early next year, recently wrote about a meeting in which one of the attendees’ primary concerns was how rich people could control the security personnel protecting them in their armored bunkers after the money/climate/society armageddon. The financial titans at the meeting apparently brainstormed ideas like using neck control collars, securing food lockers, and replacing human security personnel with robots. Douglas suggested perhaps simply starting to be nicer to their security people now, before the revolution, but they thought it was already too late for that.

Friends express concern that, by drawing a connection between slaves and robots, I may end up dehumanizing slaves or the descendants of slaves, exacerbating an already tense and advanced war of words and symbols. Fighting the dehumanization of minorities and underprivileged people is important, and something I spend a great deal of effort on. But focusing strictly on the rights of humans, and not the rights of the environment, the animals, and even of things like robots, is one of the things that got us into this awful mess with the environment in the first place. In the long run, maybe it’s not so much about humanizing or dehumanizing, but rather a problem of creating a privileged class—humans—that we use to arbitrarily justify ignoring, oppressing, and exploiting everything else.

Technology is now at a point where we need to start thinking about what, if any, rights robots deserve and how to codify and enforce those rights. Simply imagining that our relationships with robots will be like those of the human characters in Star Wars with C-3PO, R2-D2 and BB-8 is naive.

As Kate Darling, a researcher at the MIT Media Lab, notes in a paper on extending legal rights to robots, there is a great deal of evidence that human beings are sympathetic to and respond emotionally to social robots—even non-sentient ones. I don’t think this is some gimmick; rather, it’s something we must take seriously. We have a strong negative emotional response when someone kicks or abuses a robot—in one of the many gripping examples Kate cites in her paper, a U.S. military officer called off a test using a leggy robot to detonate and clear minefields because he thought it was inhumane. This is a kind of anthropomorphization, and, conversely, we should think about what effect abusing a robot has on the abusing human.

My view is that merely replacing oppressed humans with oppressed machines will not fix the fundamentally dysfunctional order that has evolved over centuries. As a practitioner of Shinto, I’m obviously biased, but I think that taking a look at “primitive” belief systems might be a good place to start. Thinking about the development and evolution of machine-based intelligence as an integrated “Extended Intelligence,” rather than as an artificial intelligence that threatens humanity, will also help.

As we make rules for robots and their rights, we will likely need to make policy before we know what their societal impact will be. Just as the Golden Rule teaches us to treat others the way we would like to be treated, abusing and “dehumanizing” robots prepares children and structures society to continue reinforcing the hierarchical class system that has been in place since the beginning of civilization.

It’s easy to see how the shepherds and farmers of yore could easily come up with the idea that humans were special, but I think AI and robots may help us begin to imagine that perhaps humans are just one instance of consciousness and that “humanity” is a bit overrated. Rather than just being human-centric, we must develop a respect for, and emotional and spiritual dialogue with, all things.


via Wired Top Stories https://ift.tt/2uc60ci

July 30, 2018 at 06:06AM

Sorry, Elon. There’s Not Enough CO2 To Terraform Mars

http://blogs.discovermagazine.com/d-brief/?p=26059

Mars might not have the right ingredients to terraform into our planetary home away from home, even with the recent discovery of liquid water buried near its south pole.

Research published Monday in Nature Astronomy puts the kibosh on the idea of terraforming Mars. At the heart of the study is carbon dioxide. Carbon dioxide, a greenhouse gas, is abundant on Mars: its thin atmosphere is made of the stuff, and the white material we often see on the surface is dry ice, not snow. CO2 is even tra…
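The excerpt cuts off, but the study’s headline shortfall is easy to check with rough numbers. The figures below are assumptions drawn from public summaries of the paper, not from this excerpt: roughly 1 bar of atmospheric pressure is needed for Earth-like conditions, while all the CO2 accessible on Mars could supply only about 0.02 bar.

```python
# Back-of-the-envelope check of the study's central claim.
# Both figures are approximations taken from public summaries of the
# Nature Astronomy paper, not from the article text above.

EARTH_LIKE_PRESSURE_BAR = 1.0   # atmospheric pressure needed for Earth-like conditions
ACCESSIBLE_CO2_BAR = 0.02       # pressure achievable from all accessible Martian CO2

fraction = ACCESSIBLE_CO2_BAR / EARTH_LIKE_PRESSURE_BAR
print(f"Accessible CO2 supplies about {fraction:.0%} of the pressure required")
# prints: Accessible CO2 supplies about 2% of the pressure required
```

Even taken generously, the accessible inventory falls short by a factor of about fifty, which is the study’s core point.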

via Discover Main Feed https://ift.tt/1dqgCKa

July 30, 2018 at 11:08AM

Hackers find creative way to steal $7.7 million without being detected

https://arstechnica.com/?p=1350443


Hackers managed to steal $7.7 million worth of cryptocurrency from the platform known as KICKICO by using a novel technique—destroying existing coins and then creating new ones totaling the same amount and putting them in hacker-controlled addresses, KICKICO officials said.

The technique evaded KICKICO’s security measures because it didn’t change the number of KICK tokens issued on the network. Such security measures are generally designed to spot thefts and other malicious actions by detecting sudden shifts in the total cryptocurrency funds available on the market. The unknown attackers were able to destroy the existing coins and create new ones by first obtaining the secret cryptographic key controlling the KICKICO smart contract. KICKICO officials didn’t learn of the breach until they received complaints from several users reporting that $800,000 worth of digital coins were missing from their wallets.
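The attack pattern is easy to see in miniature. The toy ledger below is a sketch of the general technique (equal-amount burn and mint by whoever holds the owner key) against a naive supply-based monitor; all names, amounts, and the monitor itself are hypothetical, not KICKICO's actual contract or security system.

```python
# Toy ledger illustrating the burn-and-mint attack described above.
# Everything here is hypothetical; it shows only why a monitor that
# watches total supply alone cannot see this kind of theft.

class TokenLedger:
    def __init__(self, balances, owner_key):
        self.balances = dict(balances)
        self.owner_key = owner_key

    def total_supply(self):
        return sum(self.balances.values())

    def burn(self, key, addr, amount):
        # Destroying tokens requires the owner's private key.
        if key != self.owner_key:
            raise PermissionError("owner key required")
        self.balances[addr] = self.balances.get(addr, 0) - amount

    def mint(self, key, addr, amount):
        # Creating tokens also requires the owner's private key.
        if key != self.owner_key:
            raise PermissionError("owner key required")
        self.balances[addr] = self.balances.get(addr, 0) + amount


def supply_monitor(before, after):
    # Naive security check: alert only on sudden shifts in total supply.
    return "ALERT" if before != after else "ok"


# Forty victims, 100 tokens each.
ledger = TokenLedger({f"victim{i}": 100 for i in range(40)}, owner_key="s3cret")
before = ledger.total_supply()

# An attacker who has stolen the owner key destroys the victims' tokens...
for i in range(40):
    ledger.burn("s3cret", f"victim{i}", 100)

# ...and mints the same total amount to attacker-controlled addresses.
for i in range(40):
    ledger.mint("s3cret", f"attacker{i}", 100)

after = ledger.total_supply()
print(supply_monitor(before, after))  # prints "ok": the monitor sees nothing wrong
```

Because the burns and mints cancel exactly, total supply is unchanged and the monitor stays quiet, even though every victim's balance is now zero. A check on per-address balance changes, rather than the aggregate, would have caught it.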

KICKICO officials said they have since recovered the stolen tokens and are in the process of returning them to their original owners. In a blog post disclosing the incident, KICKICO officials wrote:

The hackers gained access to the private key of the owner of the KickCoin smart contract. In order to hide the results of their activities, they employed methods used by the KickCoin smart contract in integration with the Bancor network: hackers destroyed tokens at approximately 40 addresses and created tokens at the other 40 addresses in the corresponding amount. In result, the total number of tokens in the network has not changed. But thanks to the rapid response of our community and our coordinated team work, we were able to regain control over the tokens and prevent further possible losses by replacing the compromised private key with the private key of the cold storage.

At the moment the problem is completely eliminated, the wallets of KickCoin holders are safe.

The post didn’t say how the hackers managed to steal the private crypto key or whether the hole that made the theft possible has been closed. The incident is the latest reminder of how susceptible cryptocurrency exchanges and platforms are to malicious hacks. People who use digital coins should keep them in cold storage whenever possible, meaning in wallets that aren’t connected to the Internet. Cold storage doesn’t prevent all thefts, but it will prevent many of them.

via Ars Technica https://arstechnica.com

July 30, 2018 at 01:25PM