He couldn’t get over his fiancee’s death. So he brought her back as an AI chatbot



Post by Yuli Ban »

He couldn’t get over his fiancee’s death. So he brought her back as an A.I. chatbot
Designed by a Bay Area programmer, Project December was powered by one of the world’s most capable artificial intelligence systems, a piece of software known as GPT-3. It knows how to manipulate human language, generating fluent English text in response to a prompt. While digital assistants like Apple’s Siri and Amazon’s Alexa also appear to grasp and reproduce English on some level, GPT-3 is far more advanced, able to mimic pretty much any writing style at the flick of a switch.

In fact, the A.I. is so good at impersonating humans that its designer — OpenAI, the San Francisco research group co-founded by Elon Musk — has largely kept it under wraps. Citing “safety” concerns, the company initially delayed the release of a previous version, GPT-2, and access to the more advanced GPT-3 has been limited to private beta testers.

But Jason Rohrer, the Bay Area programmer, opened a channel for the masses.

Users could select from a range of built-in chatbots, each with a distinct style of texting, or they could design their own bots, giving them whatever personality they chose.

Joshua had waded into Project December by degrees, starting with the built-in chatbots. He engaged with “William,” a bot that tried to impersonate Shakespeare, and “Samantha,” a friendly female companion modeled after the A.I. assistant in the movie “Her.” Joshua found both disappointing; William rambled about a woman with “fiery hair” that was “red as a fire,” and Samantha was too clingy.

But as soon as he built his first custom bot — a simulation of Star Trek’s Spock, whom he considered a hero — a light clicked on: By feeding a few Spock quotes from an old TV episode into the site, Joshua summoned a bot that sounded exactly like Spock, yet spoke in original phrases that weren’t found in any script.

As Joshua continued to experiment, he realized there was no rule preventing him from simulating real people. What would happen, he wondered, if he tried to create a chatbot version of his dead fiancee?
Whoa.

Now this feels like "Brave New World" territory.
And remember my friend, future events such as these will affect you in the future