Apple patents a way to clone you (so your clone can lie)


Recently I wrote in Bloomberg Businessweek that someday you will have an artificially intelligent doppelganger, a clone of “you” whom you could send into work while you stay home to play golf. It would be relatively easy to build, simply by combining three current technologies:

1. Voice recognition, like that in your car’s GPS system.

2. AI simulation, similar to Apple’s Siri.

3. Social media data sets, with all the content about your persona that you upload to Facebook.

Obviously, a mechanical robot would be expensive, but with so much communication occurring over screens, it would be cheap to set up an image of your face, populate your fake mind and, like Siri, boot you up over Web communications or Skype calls to perform any service.

On June 19, Apple was awarded a patent that would make such cloning real. Obscurely titled “Techniques to Pollute Electronic Profiling,” the patent spells out how to use a device to clone a person’s identity; set up the simulated intelligence in specific areas of interest; and — most interesting — process inputs to determine whether to act as you would, or respond by answering differently.

The Apple patent would create a cloned Siri-like avatar of you that could lie.
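To make that “respond by answering differently” step concrete, here is a minimal Python sketch. Everything in it (the interest sets, the POLLUTION_RATE knob, the respond() function) is my own hypothetical framing, not language or logic from the patent itself: the clone answers the way you would some of the time, and deliberately answers off-character the rest of the time.

```python
import random

# Hypothetical sketch of the decision described above: the clone either answers
# as "you" (within your configured areas of interest) or answers divergently to
# pollute an observer's profile. All names here are invented for illustration.

GENUINE_INTERESTS = {"golf", "marketing", "jazz"}
DECOY_INTERESTS = {"falconry", "competitive baking", "17th-century poetry"}

POLLUTION_RATE = 0.3  # fraction of replies that are deliberately unlike the real you


def respond(topic: str) -> str:
    """Decide whether the clone answers authentically or with a decoy."""
    if topic in GENUINE_INTERESTS and random.random() > POLLUTION_RATE:
        return f"Authentic reply about {topic}, phrased the way you would phrase it."
    # Topics outside your genuine interests, or the polluted fraction of replies,
    # get steered toward a decoy interest instead.
    decoy = random.choice(sorted(DECOY_INTERESTS))
    return f"Decoy reply steering the conversation toward {decoy}."


if __name__ == "__main__":
    for query in ["golf", "marketing", "stock tips"]:
        print(query, "->", respond(query))
```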

Why the lying? Apple notes that “a significant concern with electronic commerce and the proliferation of electronic transactions is that of privacy.” People, organizations, businesses and governments may be monitoring you, and this freaks some people out. Apple goes on to reference George Orwell and Big Brother from his novel “1984,” the sentient observation thing that watches people, so that people act not in ways true to themselves but in the ways they know the government wants them to act.

Apple suggests its AI-type avatar would block such monitoring by mimicking your behavior so observing entities will think the avatar is you … and then doing things unlike you to throw the observers off track. The patent states, “over time the eavesdropper will begin to associate divergent areas of interest for the clone as being the norm for a particular principal that the eavesdropper believes it is profiling or performing successful dataveillance on.”
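Here is a toy simulation of that quoted claim, again with invented names and numbers rather than anything from the patent: an eavesdropper tallying the topics “you” appear to discuss ends up with a profile dominated by the decoys.

```python
import random
from collections import Counter

# Toy illustration (my own, not from the patent) of the "over time" claim: if the
# clone keeps injecting decoy topics, an observer counting what "you" talk about
# will eventually treat the divergent interests as your norm.

random.seed(0)
profile = Counter()
for _ in range(1_000):
    # Suppose the clone answers genuinely 40% of the time and with a decoy 60% of the time.
    if random.random() < 0.4:
        topic = "golf"
    else:
        topic = random.choice(["falconry", "competitive baking"])
    profile[topic] += 1

print(profile.most_common())  # the decoy topics dominate the eavesdropper's profile of you
```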

Which raises lots of ethical questions. What will the world be like when you can send an electronic clone online to act like you, and do work for you (nice PowerPoint, avatar, I really like how you wove in the Wikipedia entry on “marketing”…)?

And once we’re in that world, what happens when your avatar begins to lie?
