Scammers Can Now Clone Your Personality With AI

Your personality may not be as unique as you think. In fact, scammers can now combine your voice, image, and personality into a digital replica that acts just like you.

In a landmark study, Stanford researchers created an artificial intelligence system that can clone people’s personalities almost perfectly.

The study, which involved 1,052 human participants and their AI digital doppelgangers, revealed that the AI clones could predict participants’ responses to survey questions with up to 85% accuracy.

A Two-Hour Interview Is All It Takes To Clone A Personality

Creating an AI clone of each participant’s personality began with a two-hour conversation.

Using an AI agent interviewer they named “Isabella”, the researchers conducted in-depth interviews with each participant to capture their life stories, political views, and values.

The interviews, averaging 6,491 words per participant, laid the groundwork for creating “generative agents,” AI clones that can think and respond like the participants.

After each participant’s AI clone was created, the participants were asked to answer questions, perform tasks, and take a personality test. Each clone was then asked to simulate its participant’s responses, and it matched them 85% of the time.
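
For readers curious how such a clone works in broad strokes, here is a minimal, hypothetical sketch of the idea: the interview transcript is fed to a large language model as context, the model is asked to answer survey questions as the participant would, and the simulated answers are compared with the real ones. The API client, model name, and helper names below are illustrative assumptions, not details from the Stanford study.

```python
# Illustrative sketch only: an interview transcript becomes the "memory"
# of a generative agent, which then answers questions in character.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment


def simulate_answer(interview_transcript: str, question: str, options: list[str]) -> str:
    """Ask the model to answer a multiple-choice question as the participant would."""
    prompt = (
        "Below is an interview with a study participant.\n\n"
        f"{interview_transcript}\n\n"
        "Answer the following question exactly as this participant would, "
        "responding with one of the listed options only.\n\n"
        f"Question: {question}\nOptions: {', '.join(options)}"
    )
    response = client.chat.completions.create(
        model="gpt-4o",  # illustrative model choice
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content.strip()


def agreement_rate(real_answers: list[str], simulated_answers: list[str]) -> float:
    """Fraction of questions where the clone matched the participant's real answer."""
    matches = sum(r == s for r, s in zip(real_answers, simulated_answers))
    return matches / len(real_answers)
```

The point of the sketch is simply that nothing exotic is required: a long, personal transcript plus an off-the-shelf language model is enough to produce answers that can be scored against the real person's.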

The AI Clones Go Far Beyond Simple Mimicry

What makes the findings so fascinating isn’t just the accuracy of the clones; it’s the range of personality traits and beliefs they could replicate for each person.

The AI clones could predict participants’ responses to political questions almost as accurately as the participants could reproduce their own answers. The clones learned each person’s political beliefs and how they would express them.

On the Big Five personality test, the AI clones could almost perfectly replicate each participant’s personality traits.

In economic game scenarios presented to both the participants and their AI clones (such as the Prisoner’s Dilemma and the Trust Game), the clones made strikingly similar choices to the participants.

Imagine When Scammers Take Advantage Of This Capability

It’s only a matter of time before scammers use this capability to improve their deepfakes. The most obvious use will be perfect impersonation scams to steal millions from consumers and businesses.

The BEC (business email compromise) deepfake will likely come first. Consider a corporate infiltrator such as a deepfake CEO: scammers could scour the internet, including LinkedIn, YouTube, and blog posts, to gather data on the CEO’s speech, opinions, activities, voice, and imagery.

Using this data, the scammers could create a convincing digital clone of the CEO that looks, talks, and acts just like them. That clone could then:

  • Send emails matching the CEO’s writing style and decision-making patterns
  • Reference ongoing projects with accurate internal terminology
  • Maintain consistent personality traits across multiple interactions
  • Join realistic Zoom and Microsoft Teams meetings and fool everyone present
  • Request realistic wire transfers to accounts the scammers control

These scenarios could soon become a reality.

Read The Full Study

You can read the complete study here.