Updater
July 03, 2025, in technology

Analyst Avatars Are Here – What Next?

Easy availability of powerful avatar generators is driving their use among corporate figures from financial analysts to company CEOs.


Investor clients of international bank UBS now have an alternative way to consume its analysts' reporting: instead of the usual PDF or web page, they can hear the analysis presented by an on-screen avatar modeled on the analyst themselves.

Meanwhile, some CEOs are using avatars to appear in earnings calls and other meetings. Professional use of AI avatars is spreading and raising a number of questions around trust, credibility and business ethics generally.

Avatar creation is now mainstream

UBS created lifelike avatars of its experts to meet customer demand for video-based advice while freeing up their time for other activities. And it's worth pointing out that the tools needed to do this are already available, even to organizations without the resources of a global bank.

Not unlike a deepfake, the videos were created with the help of AI video platform Synthesia, which captures the analysts' likenesses and voices and can produce output in over 60 languages. Synthesia is one of a number of tools aimed at creating realistic video figures from text and image inputs (KingyAI has a useful overview of the technology and the platforms available).

In UBS’s case, OpenAI models analyze the reports and generate a script for the Synthesia avatar to read. Analysts can opt in, choosing whether or not they want an avatar speaking for them, and they approve any content that goes out with their likeness.
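The report-to-script step described above can be sketched in a few lines. This is a hypothetical illustration, not UBS's actual pipeline: the prompt wording, model name, and helper function are assumptions, and a real workflow would send the request to a hosted LLM and then submit the analyst-approved script to the avatar platform for rendering.

```python
# Hypothetical sketch of a report-to-script step in an avatar pipeline.
# Builds (but does not send) a chat-completion request asking a model to
# condense a research report into a short spoken script.

def build_script_request(report_text: str, analyst: str, language: str = "en") -> dict:
    """Package a research report into a chat-style request for a spoken script."""
    prompt = (
        f"Summarize the following research report as a 90-second spoken "
        f"script for analyst {analyst}, in language '{language}'. "
        f"Use only claims present in the report:\n\n{report_text}"
    )
    return {
        "model": "gpt-4o",  # assumed model choice, for illustration only
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.2,  # low temperature: stay close to the source report
    }

request = build_script_request("Q2 revenue rose 8% on FX tailwinds...", "J. Doe")
```

In practice the generated script would go back to the analyst for approval, mirroring the opt-in and sign-off step UBS describes, before any video is rendered with their likeness.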

Do avatars add value?

While UBS says the move is motivated by client demand, not everyone is convinced. One reader commenting on the Financial Times article said: “This makes no sense,” pointing out that PDFs are easier to search and that reading is faster than watching video. This was countered by another reader, who observed: "Yes, but reading takes more effort and people are lazy."

Another commenter is concerned that an avatar that sticks to a script won’t be able to provide the right kind of value: “Sell-side analyst meetings are only useful because you can ask the analysts what is not in the published reports. What is the point of a ‘meeting’ with an analyst's ‘avatar’ if it is just going to read out what's in the report?”

Avatars as CEO-substitutes

Given the ease of creating avatars, their use has been spreading to even more high-profile cases. A recent example is Zoom CEO Eric Yuan, who used an avatar in his place during the company's earnings call.

This may have had a promotional purpose, however, as Zoom has introduced its own avatar tool, allowing users to record and send brief messages through their lookalikes. TechCrunch reports that the CEO of Swedish payments company Klarna, which has been very public in its commitment to the use of AI, has also been using an avatar in meetings with analysts and investors.

Another, less controversial, application of avatar technology is in staff training. One company, Cicero, is using the tech to build a workplace roleplay platform — letting employees practice and prepare for potentially challenging conversations by simulating them with AI avatars first.

The interactive avatar

So far we have been talking about avatars as delivery channels for information in one-way conversations. But the technology now exists to create fully interactive avatars, capable of replying to queries and comments from a human interlocutor.

The HeyGen company provides tools for producing standard, one-way avatars, but a recent promotional video also offers users the possibility to 'be in more than one place at once' as living, interactive clones of themselves. The company's website offers a selection of example avatars in different roles that visitors can interact with.

The avatar responses are created by a generative AI model in a similar way to a standard chatbot and then voiced by the avatar with suitable facial expressions.
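The loop just described can be made concrete with a short sketch. Every name here is an assumption for illustration: real platforms would call a hosted generative model for the reply and dedicated services for speech synthesis and facial animation, both stubbed out below as opaque references.

```python
# Minimal sketch of an interactive-avatar turn: generate reply text
# (chatbot-style), then hand it to stubbed speech and animation stages.
from dataclasses import dataclass

@dataclass
class AvatarTurn:
    reply_text: str      # what the avatar says
    audio_ref: str       # handle to synthesized speech (placeholder)
    animation_ref: str   # handle to the lip-sync/expression track (placeholder)

def generate_reply(user_utterance: str) -> str:
    """Stand-in for a generative model call; a real system would query an LLM."""
    return f"Thanks for your question about: {user_utterance}"

def animate(reply_text: str) -> AvatarTurn:
    # In a real platform, TTS and facial animation are rendered services;
    # here they just return deterministic placeholder references.
    return AvatarTurn(
        reply_text=reply_text,
        audio_ref=f"tts://{len(reply_text)}",
        animation_ref=f"anim://{len(reply_text)}",
    )

turn = animate(generate_reply("Q2 guidance"))
```

The design point is the separation of stages: the language model is responsible only for the words, which keeps the accountability question discussed below focused on who approved the text, not on the rendering layer.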

Obviously, using such a system for 'self-cloning' raises serious questions of accountability: few people would be comfortable allowing an AI model to appear to speak for them in such a personal way.

How to trust, how to verify

As is often the case with AI technology, the technical possibilities are far ahead of the social and legal adjustments needed to assimilate the technology into our working and living environments.

What checks and safeguards will we need to be sure that an apparently realistic video interlocutor is who they claim to be? How accountable is a person for the statements and actions of their AI-generated clone?

The jury is still out on these and many other questions. One thing we can be sure of, however: the ability of technology to simulate our most human qualities is destined only to increase.

Interested?

Find out more about Eidosmedia products and technology.

GET IN TOUCH