San Francisco: Advanced Artificial Intelligence (AI) that goes beyond mere chatbots will soon be used to manipulate social media platforms like Facebook, Twitter and Instagram, Tesla CEO Elon Musk has warned.
In a tweet, the staunch AI critic said that day is not far off.
“If advanced AI (beyond basic bots) hasn’t been applied to manipulate social media, it won’t be long before it is,” Musk tweeted on Thursday.
In his recent debate with former Alibaba Chairman Jack Ma, Musk sparred over the capabilities of emerging technologies such as AI.
Musk said that computers will one day surpass humans in “every single way”.
He has predicted that a single company that develops “God-like super intelligence” might achieve world domination.
If not regulated or controlled soon, AI could become an "immortal dictator" from which humans would have no escape, the SpaceX CEO warned recently.
In a new documentary on AI, Musk said: “At least when there’s an evil dictator, that human is going to die. But for an AI there would be no death. It would live forever, and then you’d have an immortal dictator, from which we could never escape.
“If AI has a goal and humanity just happens to be in the way, it will destroy humanity as a matter of course without even thinking about it. No hard feelings,” Musk told Chris Paine, the director of the documentary titled “Do You Trust This Computer?”
Musk has long been a critic of AI and has repeatedly called for strict regulation to rein in the technology.
In a recent tweet, Musk said that people should be more concerned with AI than the risk posed by North Korea.
“If you’re not concerned about AI safety, you should be. Vastly more risk than North Korea,” Musk tweeted.
“AI is a rare case where we need to be proactive in regulation instead of reactive because if we’re reactive in AI regulation it’s too late,” he added.