With workplaces heading towards the metaverse, can behaviour AI help maintain the human touch?
Humanising Autonomy Team
March 1, 2023

The world has seen a huge shift in work environments in the past few years. At the height of Covid-19, more than half of the working population in the US, Europe and other countries was working remotely. The social, human element of in-person work gave way to a one-way relationship between the employee and their technology, as the dynamic, warm relationships between colleagues were filtered through screens. Working relationships were hindered by blind spots in video-based technology: cropped screen placements, latency and employees opting to turn their cameras off all resulted in a loss of behavioural understanding.

Although the initial awkwardness of virtual meetings gradually eased, and remote working found its niche in encouraging “shared experiences” such as screen sharing, group breakouts and emoji responses, a gap remained in understanding how engaged the person on the screen really was. Employers have always grappled with employee burnout, attracting and retaining talent, and spotting potentially risky situations in their workforce. In the era of remote working, however, it is even more difficult to decipher a colleague’s energy and enthusiasm levels.

It looks like this behavioural blind spot will not be resolved by a wholesale return to the office anytime soon. According to the Accenture Future of Work Study 2021, “83% of workers around the globe prefer a hybrid work model,” despite businesses still investing in onsite workplaces. Further, hybrid working may be just a stepping stone towards a future workplace environment set in extended realities. In its Global Workforce of the Future 2022 survey, Adecco Group reported that a third of respondents “said they would work in the metaverse (32%), with Gen Z (46%), agency workers (58%) and those with caregiving responsibilities (47%) especially keen”.

If these predictions prove correct, companies must rely on more than the maturing of technologies such as AR, VR and the blockchain to encourage people to work, play, shop, game and socialise virtually. They must imbue the metaverse with greater human understanding.

Behaviour-based AI can play an integral role in reintroducing human context into the virtual workplace. By connecting to a laptop’s camera, Humanising Autonomy’s Behaviour AI platform can accurately read, interpret and infer a person’s emotional state from their body placement and facial landmarks, combining those indicators with an analysis of the person’s engagement, valence and arousal scores. This context helps employers take the sentiment pulse of their workforce at scale and make better-informed decisions. If a team is showing signs of fatigue, the team leader can change the pace and level of work. Alternatively, if the team is enthused by content shared at a meeting or by a presentation style, a manager can create similar content or encourage more dynamic styles of content sharing.
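To make the idea concrete, the aggregation step might look like the sketch below. Humanising Autonomy’s platform API is not public, so every name and score range here is an assumption for illustration only: per-person engagement, valence and arousal scores are averaged into a team-level “sentiment pulse” that a manager could act on.

```python
from dataclasses import dataclass
from statistics import mean

# Illustrative sketch only: the real platform's API and score ranges are
# not public, so these names and scales are assumptions.

@dataclass
class FrameReading:
    engagement: float  # assumed 0.0 (disengaged) to 1.0 (fully engaged)
    valence: float     # assumed -1.0 (negative) to 1.0 (positive)
    arousal: float     # assumed 0.0 (calm) to 1.0 (energised)

def team_sentiment_pulse(readings: list[FrameReading]) -> dict[str, float]:
    """Average per-person readings into a team-level sentiment pulse."""
    return {
        "engagement": mean(r.engagement for r in readings),
        "valence": mean(r.valence for r in readings),
        "arousal": mean(r.arousal for r in readings),
    }

# Example: a meeting where energy is visibly dropping.
meeting = [
    FrameReading(engagement=0.4, valence=-0.2, arousal=0.3),
    FrameReading(engagement=0.5, valence=0.1, arousal=0.4),
    FrameReading(engagement=0.3, valence=-0.1, arousal=0.2),
]
pulse = team_sentiment_pulse(meeting)
if pulse["engagement"] < 0.5:
    print("Team engagement is low -- consider changing the pace.")
```

The point is not the arithmetic but the shift in granularity: instead of asking individuals how they feel, a single aggregate reading summarises the room.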

Workplaces have been exploring ways to keep employees engaged for years, with companies gamifying backend HR processes such as payroll, employee profiles and their share options, as well as using technology like Spill Therapy, which enables regular mental-health checks to spot anyone at risk. Employees choose how much they individually share with their employer and receive an automated suggestion for a one-off health session if there’s an indication of unhappiness.

But what if employees didn’t need to be asked how they feel? What if their camera could understand an employee’s nuanced behavioural responses and signal declining engagement to the employer weeks or even months before a potential burnout? What could this enable in our current hybrid work model and any potential future in the metaverse?

Metaverse pioneer Avi Bar-Zeev argues in “Finding meaning in the Metaverse” that AR has the potential to “improve the signal-to-noise ratio of our daily life by filtering out what we don’t need to see and then enhancing what’s most relevant and impactful to us, individually, contextually, based on what matters most”. If AR can monitor our emotions and let us know when we’re becoming less present, Bar-Zeev believes we can look forward to a future where we are “more grounded, and better connected with others around us”.

Humanising Autonomy’s Behaviour AI is the missing link to achieving this goal in any tech-enabled interaction. 

AI-enabled metaverse applications can transform education, healthcare, the arts and the workplace, enriching our lives immeasurably in the process. The pieces are in place, and businesses are seizing the opportunity. In banking, NVIDIA’s metaverse-building tools are helping Deutsche Bank improve fraud detection. Meanwhile, in the “death tech” sector, Somnium Space CEO Artur Sychov has revealed how ChatGPT will accelerate its service that lets people communicate with deceased loved ones reincarnated as avatars. Computer-vision-based software will extend machines’ sense of sight even further.

The success or failure of every metaverse venture, however, rests on enhancing the human-machine experience. Harness human-centric behaviour AI to strip away the noise and accurately get to the heart of a viewer’s intent and desires, every time, and the opportunities are endless.

Sign up to our Behaviour AI platform