Your synthetic clone could deceive somebody

A recent experiment found that AI-generated faces are indistinguishable from real ones and are even judged more trustworthy than their human counterparts.

This opens the door to troubling outcomes in which a fake agent could manipulate people across a variety of fields, and do so more successfully than a human manipulator, thanks to the power of AI.

A synthetic face could be employed in advertising, for example: generating an actor directly on a PC is cheaper and easier than hiring a real one. And because an AI face can be tuned to appear maximally trustworthy, it is far more likely to achieve its persuasive goal.

There are worse scenarios, however. As the internet moves toward the metaverse, more of our lives will become entirely digital. We will encounter countless avatars, most controlled by other people, friends or strangers, but some will be fake agents engaging us in order to promote something. The more time we spend in virtual worlds, the more often we will bump into such fake avatars.

When our personal information, such as our behavior and interests, is at the disposal of an AI with a trustworthy, friendly face and a specific objective, the probability of deception rises sharply. These agents could even read our emotions in real time, analyzing our facial expressions and vocal inflections to adjust their conversational tactics and maximize their persuasive power. That is why we need a way to identify them.

But what about when those fake faces appear as familiar faces?

>>>  Evil digital twins may be dangerous

In a Microsoft blog post, Executive Vice President Charlie Bell stated that in the metaverse, fraud and phishing attacks could come from a familiar face, such as an avatar impersonating a coworker, leaving us constantly unsure whether the people we’re talking to are the people we know or fakes.

Creating a “digital twin” means accurately recreating a person’s looks and voice in the metaverse. Jensen Huang, the CEO of NVIDIA, gave a keynote talk featuring a cartoonish digital twin of himself. According to Huang, fidelity will improve dramatically in the future, as will the ability of AI engines to control your avatar autonomously, so that you can be in many places at once. We’ll be able to perform many tasks simultaneously by delegating boring, repetitive operations to our digital twins, and someone who runs into one of them may not be able to tell it isn’t really us.

Our twins, on the other hand, could become “evil twins” if they were employed to imitate our looks, voice, and mannerisms for fraudulent purposes.

Bad actors, according to Bell, could entice you into a fake virtual bank, complete with a fake teller who requests your personal information. Or, if you’re a victim of corporate espionage, you could be invited to a fake meeting in a conference room that looks exactly like the virtual conference room you’re used to. Without even realizing it, you’ll be handing over personal information to unknown third parties.

However, imposters may not even need such elaborate measures. After all, a familiar face that looks, sounds, and acts like someone you know is an effective weapon in and of itself. Metaverse platforms will therefore need equally robust authentication mechanisms to ensure that we’re engaging with a real person (or their authorized twin) rather than an evil twin set up to fool us.
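The article doesn’t specify what such authentication would look like, but a standard building block is challenge-response with public-key signatures. Below is a minimal sketch in Python, assuming each user registered an Ed25519 public key with the platform at signup; names like `issue_challenge` and `verify_avatar` are illustrative, not a real metaverse API.

```python
# Minimal sketch of challenge-response avatar authentication.
# Assumption: the platform stored each user's Ed25519 public key at signup.
import os
from cryptography.hazmat.primitives.asymmetric import ed25519
from cryptography.exceptions import InvalidSignature


def issue_challenge() -> bytes:
    # A fresh random nonce per session prevents replaying old signatures.
    return os.urandom(32)


def verify_avatar(public_key: ed25519.Ed25519PublicKey,
                  challenge: bytes, signature: bytes) -> bool:
    """Return True only if the avatar's controller holds the private key."""
    try:
        public_key.verify(signature, challenge)
        return True
    except InvalidSignature:
        # Signature doesn't match: possibly an "evil twin".
        return False


# Usage: the genuine user's client holds the private key and signs.
private_key = ed25519.Ed25519PrivateKey.generate()
challenge = issue_challenge()
signature = private_key.sign(challenge)
assert verify_avatar(private_key.public_key(), challenge, signature)
```

The point of the design: an evil twin can mimic a face and a voice, but without the genuine user’s private key it cannot produce a valid signature for a fresh challenge.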

>>>  Blurring the real and the artificial

Because virtual reality and augmented reality technologies are designed to deceive the senses, these platforms will expertly blur the line between the real and the artificial. In the hands of bad actors, such powers quickly become dangerous. That is why it is in everyone’s best interest, consumers and businesses alike, to push for stronger security. We could, for example, employ another AI to expose fake agents, or use a blockchain to verify an avatar’s original identity.
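As a concrete illustration of the first idea, here is a minimal sketch of one published detection approach: GAN-generated images often carry telltale artifacts in their high-frequency spectrum (Durall et al., 2020). The sketch extracts a radially averaged log power spectrum as a feature vector; in practice these features would feed a classifier trained on labeled real and synthetic faces, and every name here is illustrative rather than part of any real detection API.

```python
# Sketch of a spectral feature extractor for synthetic-face detection.
# GAN upsampling tends to distort the high-frequency end of an image's
# power spectrum; a classifier trained on these features can often
# separate generated faces from camera photos.
import numpy as np
from PIL import Image


def spectral_signature(path: str, n_bins: int = 64) -> np.ndarray:
    """Radially averaged log power spectrum, resampled to n_bins values."""
    gray = np.asarray(Image.open(path).convert("L"), dtype=np.float64)
    freq = np.fft.fftshift(np.fft.fft2(gray))
    power = np.abs(freq) ** 2

    # Distance of every pixel from the spectrum's center.
    h, w = power.shape
    y, x = np.indices(power.shape)
    r = np.hypot(y - h // 2, x - w // 2).astype(int)

    # Mean power in each integer-radius bin (azimuthal average).
    radial = np.bincount(r.ravel(), weights=power.ravel())
    radial /= np.maximum(np.bincount(r.ravel()), 1)
    log_radial = np.log1p(radial)

    # Resample to a fixed length so images of any size are comparable.
    xs = np.linspace(0, len(log_radial) - 1, n_bins)
    return np.interp(xs, np.arange(len(log_radial)), log_radial)
```

This is a starting point, not a reliable defense: newer generators are steadily closing the spectral gap, which is why pairing detection with cryptographic identity verification matters.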

The alternative is a metaverse rife with fraud, from which it may never recover.

Source: venturebeat.com