What’s new in Immersive Training Platform? Introducing 3D Analytics

UneeQ's Immersive Training Platform now analyzes much more than just the words learners say. Here's what you need to know.

Published April 8, 2026 by Paul Haggo

If you've used our Immersive Training Platform, you already know the post-session analytics are a big deal. After every AI roleplay, learners get targeted, specific feedback on their performance — not generic tips, but coaching based on exactly what they said and how they handled the conversation.

One recent AI sales training user put it well: "The suggestions in the UneeQ platform are much more targeted, specific, and positive. It quotes the seller and literally tells them what they said that was good and what wasn't — the feedback felt so individual."

That kind of feedback is powerful. But what you say is only half the conversation.

What is 3D Analytics?

3D Analytics is a new capability in UneeQ's Immersive Training Platform that measures much more than what learners say during AI roleplay sessions.

Using video analysis, our 3D Analytics evaluates body language, tone of voice, eye contact, facial expressions, and posture. We then deliver coaching feedback alongside the existing conversation analytics. 

We designed 3D Analytics to help organizations improve soft skills training by giving learners a complete picture of how they show up in face-to-face conversations. Not just the words they use, but how they're delivered.

Diving deeper into 3D Analytics

The launch of 3D Analytics fills what we believe is a large gap in the AI roleplay market.

Non-verbal cues make up a huge part of how we come across in face-to-face interactions. If learners can nail the script but present themselves as unsure, disengaged, or lacking conviction, the message doesn't land. That's true in sales training, customer service training, and leadership development alike.

This is what sets immersive learning apart from voice-only AI roleplay tools or text-based training. You can't practice body language in a chat window. You can't build executive presence by reading a script. You need to see yourself in action and get feedback on the full picture.

With 3D Analytics, your people get coached on the many aspects of building emotional intelligence:

  • Body language: Are they open and engaged, or closed off?
  • Tone of voice: Does delivery match the message?
  • Eye contact: Are they connecting or drifting?
  • Facial expressions: Do they look confident, empathetic, uncertain?
  • Posture and positioning: What's their presence communicating before they even speak?

Available now. Optional by design.

3D Analytics is now live for all Immersive Training Platform users. And because we know video analysis isn't right for every organization's policies, we've made it completely optional. Turn it on when you want it. Turn it off when you don't.

Pull your data, build your dashboards

We’ve also shipped another frequently requested feature: API-based session data export.

You can now pull user session data — names, emails, scores, duration, dates, and more — directly via API. This means enterprises can integrate AI training performance data into their own BI platforms, analytics tools, or custom dashboards outside the UneeQ platform.
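As a sketch of what consuming that export might look like on your side: the snippet below flattens a session payload into rows a BI tool or dashboard could ingest. The field names here (`sessions`, `durationSeconds`, and so on) are illustrative assumptions, not UneeQ's actual schema — check the API docs for the real field names.

```python
import json

# Hypothetical example of what an exported session record might look like.
# Real field names will differ -- this is only to illustrate the pattern.
SAMPLE_EXPORT = """
{
  "sessions": [
    {
      "user": {"name": "Ada Example", "email": "ada@example.com"},
      "score": 82,
      "durationSeconds": 540,
      "startedAt": "2026-04-01T09:30:00Z"
    }
  ]
}
"""

def flatten_sessions(raw: str) -> list[dict]:
    """Flatten nested session records into flat rows for a BI tool."""
    data = json.loads(raw)
    rows = []
    for s in data["sessions"]:
        rows.append({
            "name": s["user"]["name"],
            "email": s["user"]["email"],
            "score": s["score"],
            "duration_min": s["durationSeconds"] / 60,
            "date": s["startedAt"][:10],
        })
    return rows

rows = flatten_sessions(SAMPLE_EXPORT)
print(rows[0])
```

The same flattening step works whether the payload comes from a one-off API call or a scheduled pull feeding a data warehouse.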

Output formats include JSON and xAPI, so it works with how your L&D and sales enablement teams already operate. And for organizations that need training data to flow into an existing LMS or enterprise system, this is a big step toward making that seamless.
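On the xAPI side, a statement is structured JSON describing who did what, with what result. The actor/verb/object/result shape below follows the xAPI specification that most LMSs and Learning Record Stores accept; the activity ID and score values are purely illustrative, not UneeQ's actual output.

```python
import json

def to_xapi_statement(email: str, score: int, max_score: int = 100) -> dict:
    """Build a minimal xAPI statement for a completed roleplay session.

    The activity ID below is a placeholder; the verb ID is a standard
    ADL-registered verb.
    """
    return {
        "actor": {"objectType": "Agent", "mbox": f"mailto:{email}"},
        "verb": {
            "id": "http://adlnet.gov/expapi/verbs/completed",
            "display": {"en-US": "completed"},
        },
        "object": {
            "objectType": "Activity",
            "id": "https://example.com/activities/sales-roleplay",  # illustrative
        },
        "result": {
            "score": {
                "raw": score,
                "min": 0,
                "max": max_score,
                "scaled": score / max_score,
            },
            "completion": True,
        },
    }

stmt = to_xapi_statement("ada@example.com", 82)
print(json.dumps(stmt, indent=2))
```

Because the statement shape is standardized, any xAPI-conformant LMS or Learning Record Store can store and report on it without custom mapping.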

Check out our docs to get started.

Give it a try with a personal demo

Keen to try these new features?

Contact our team to book a demo of Immersive Training Platform. We’d be happy to show you how 3D Analytics works, talk integrations, and create a custom scenario that allows your teams to practice their toughest conversations.

Or check out the FAQ below if you still have questions.


Frequently asked questions

  • What is 3D Analytics?

    3D Analytics is a feature in UneeQ's Immersive Training Platform that analyzes non-verbal communication during AI roleplay sessions — including body language, tone of voice, eye contact, facial expressions, and posture. It delivers coaching feedback alongside conversation analytics, giving learners a complete picture of how they perform in simulated face-to-face interactions.
  • What is AI roleplay?

    AI roleplay uses digital humans — lifelike, emotionally intelligent avatars — to simulate realistic conversations. Learners practice scenarios like sales calls, customer complaints, or leadership discussions in a psychologically safe environment. The AI responds dynamically based on what the learner says and how they say it, then provides performance feedback.

    UneeQ's Immersive Training Platform adds visual presence and non-verbal behavior that voice-only tools can't replicate — making practice feel closer to the real thing.
  • Can I export session data from the platform?

    Yes. UneeQ's API-based data export lets you pull session data — including scores, duration, user info, and more — in JSON or xAPI format. This allows enterprises to integrate training analytics into existing BI tools, LMS platforms, or custom dashboards.
  • What skills does the platform measure?

    UneeQ's Immersive Training Platform measures both verbal and non-verbal communication skills. This includes conversation quality, objection handling, empathy, active listening, tone of voice, body language, eye contact, and overall presence.

    These are the soft skills that have historically been difficult to measure at scale — despite this, they drive high performance in sales, customer service, leadership, adult learning, and many other use cases.
  • How do digital humans improve corporate training?

    Digital humans create face-to-face practice environments that feel realistic — without the awkwardness of peer roleplay or the limitations of voice-only AI. Learners practice their most common or most difficult conversations repeatedly in a judgment-free space, building confidence as they go.

    UneeQ's digital humans respond with emotional intelligence — facial expressions, gestures, and adaptive behavior driven by Synanim™, our proprietary AI animation technology. This makes practice more immersive, more realistic, and more effective.