# Avatar management and customization
## Overview

The avatar management and customization system provides users with a visual representation within the iR Engine's 3D environments. It handles the selection, customization, loading, and animation of avatars, allowing users to establish their identity and presence in virtual spaces. By integrating with external avatar creation services and implementing real-time facial expression mapping, the system creates more engaging and personalized experiences. This chapter explores the implementation, workflow, and components involved in managing avatars within the iR Engine client.

## Core concepts

### Digital representation

Avatars serve as the user's embodiment in virtual environments:

- **Visual identity**: provides a recognizable form for each user
- **Spatial presence**: gives users a physical location in the 3D world
- **Social interaction**: enables non-verbal communication through positioning and animations
- **Personalization**: allows users to express their identity or preferences

In the iR Engine, avatars are implemented as entities with specialized components that handle their appearance, animation, and associated UI elements.

### Avatar lifecycle

The avatar system manages several stages in an avatar's lifecycle:

1. **Selection**: users choose from available avatar models or create custom ones
2. **Customization**: users modify avatar appearance through integrated tools
3. **Loading**: the system loads the 3D model and prepares it for rendering
4. **Spawning**: the avatar is placed in the 3D world at an appropriate location
5. **Animation**: the avatar is animated based on user input and system events
6. **Enhancement**: additional features like nameplates and expressions are applied
7. **Destruction**: the avatar is removed when the user leaves the environment

This lifecycle ensures a consistent experience from avatar selection to in-world interaction.

### Expression and animation

Avatars support various forms of expression and animation:

- **Locomotion**: walking, running, and other movement animations
- **Gestures**: predefined animations like waving or pointing
- **Facial expressions**: emotional displays through facial morphing
- **Lip synchronization**: mouth movements that match speech audio
- **Webcam-driven expressions**: real-time mapping of user expressions to the avatar

These capabilities make avatars more lifelike and expressive, enhancing communication in virtual environments.
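The lifecycle stages listed above can be sketched as a small state machine. The stage names mirror the list, but the transition table and function below are purely illustrative assumptions, not part of the engine's API:

```typescript
// Hypothetical sketch of the avatar lifecycle as a state machine.
// The transition table is an assumption for illustration only.
type AvatarStage =
  | 'selection'
  | 'customization'
  | 'loading'
  | 'spawning'
  | 'animation'
  | 'enhancement'
  | 'destroyed';

// Allowed transitions between stages (customization is optional).
const transitions: Record<AvatarStage, AvatarStage[]> = {
  selection: ['customization', 'loading'],
  customization: ['loading'],
  loading: ['spawning'],
  spawning: ['animation'],
  animation: ['enhancement', 'destroyed'],
  enhancement: ['animation', 'destroyed'],
  destroyed: [] // Terminal: the avatar has been removed.
};

function canTransition(from: AvatarStage, to: AvatarStage): boolean {
  return transitions[from].includes(to);
}
```

A model of this shape makes it easy to reject out-of-order operations, such as animating an avatar whose 3D model has not finished loading.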
## Implementation

### Avatar spawning

When a user joins a 3D environment, the system spawns their avatar:

```typescript
// Simplified from src/networking/AvatarSpawnSystem.tsx
import { spawnLocalAvatarInWorld } from '@ir-engine/engine/src/avatar/functions/spawnLocalAvatarInWorld';
import { getRandomSpawnPoint } from '@ir-engine/engine/src/avatar/functions/getSpawnPoint';
import { getState } from '@ir-engine/hyperflux';
import { AuthState } from '../user/services/AuthService';

async function setupUserAvatar(userID, avatarURL, sceneUUID) {
  // Get a random spawn position
  const avatarSpawnPose = getRandomSpawnPoint(userID);

  // Get user information from the authentication state
  const user = getState(AuthState).user;

  // Spawn the avatar in the world
  const avatarEntity = spawnLocalAvatarInWorld({
    parentUUID: sceneUUID,
    avatarSpawnPose,
    avatarURL: avatarURL, // 3D model URL
    name: user.name.value // Username for the nameplate
  });

  console.log(`Avatar spawned for ${user.name.value} using ${avatarURL}`);
  return avatarEntity;
}
```

This function:

1. Determines a spawn position for the avatar
2. Retrieves user information from the authentication state
3. Calls `spawnLocalAvatarInWorld` to create the avatar entity
4. Configures the avatar with the appropriate model and username
5. Returns the created entity for further reference

### Avatar component

The avatar entity is managed through the `AvatarComponent`:

```typescript
// Simplified from @ir-engine/engine/src/avatar/components/AvatarComponent.ts
import { defineComponent } from '@ir-engine/ecs';

export const AvatarComponent = defineComponent({
  name: 'AvatarComponent',

  schema: {
    modelURL: { type: 'string', default: '' },
    name: { type: 'string', default: 'User' },
    height: { type: 'number', default: 1.8 },
    isLocal: { type: 'boolean', default: false },
    headBone: { type: 'ref', default: null },
    leftHandBone: { type: 'ref', default: null },
    rightHandBone: { type: 'ref', default: null },
    loaded: { type: 'boolean', default: false }
  },

  // Static methods for avatar management
  getSelfAvatarEntity: () => {
    // Return the entity ID of the local user's avatar
    return AvatarComponent.selfAvatarEntity;
  },

  // Other utility methods
});
```

This component:

- Stores essential avatar properties such as the model URL and username
- Tracks the avatar's loading state
- Maintains references to important bones for animation
- Provides utility methods for accessing avatar entities
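To make the role of the schema concrete, here is a hypothetical sketch of how a schema with defaults can be turned into initial component data. This is not the engine's actual `defineComponent` implementation; the helper and its behavior are assumptions for illustration:

```typescript
// Hypothetical sketch: resolving a component schema (as in AvatarComponent)
// into initial data, applying defaults for any field the caller omits.
type SchemaField = { type: string; default: unknown };
type Schema = Record<string, SchemaField>;

function initFromSchema(schema: Schema, overrides: Record<string, unknown> = {}) {
  const data: Record<string, unknown> = {};
  for (const [key, field] of Object.entries(schema)) {
    // Use the caller's value when provided, otherwise the schema default.
    data[key] = key in overrides ? overrides[key] : field.default;
  }
  return data;
}

// Mirrors a subset of the AvatarComponent schema shown above.
const avatarSchema: Schema = {
  modelURL: { type: 'string', default: '' },
  name: { type: 'string', default: 'User' },
  height: { type: 'number', default: 1.8 },
  loaded: { type: 'boolean', default: false }
};
```

Setting the component with only a `name`, for example, would still yield a complete data object with the default `height` of 1.8 and `loaded` set to `false`.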
### Avatar selection UI

Users select their avatar through a dedicated interface:

```typescript
// Simplified from src/user/menus/avatar/AvatarSelectMenu.tsx
import React, { useState, useEffect } from 'react';
import { useMutation, useFind } from '@ir-engine/common';
import { avatarPath, userAvatarPath } from '@ir-engine/common/src/schema.type.module';
import { Engine } from '@ir-engine/ecs';

function AvatarSelectionUI() {
  // Get available avatars from the server
  const avatarsQuery = useFind(avatarPath);

  // Get the current user's avatar selection
  const userAvatarQuery = useFind(userAvatarPath, {
    query: { userId: Engine.instance.userID }
  });

  // State for the selected avatar
  const [selectedAvatarId, setSelectedAvatarId] = useState('');

  // Mutation for updating the user's avatar choice
  const userAvatarMutation = useMutation(userAvatarPath);

  // Set the initial selection based on the current avatar
  useEffect(() => {
    if (userAvatarQuery.data?.length > 0) {
      setSelectedAvatarId(userAvatarQuery.data[0].avatarId);
    }
  }, [userAvatarQuery.data]);

  // Handle avatar selection confirmation
  const handleConfirm = async () => {
    if (!selectedAvatarId) return;
    try {
      // Update the user's avatar preference on the server
      await userAvatarMutation.patch(
        null,
        { avatarId: selectedAvatarId },
        { query: { userId: Engine.instance.userID } }
      );
      console.log(`Avatar selection updated to ${selectedAvatarId}`);
    } catch (error) {
      console.error('Failed to update avatar selection:', error);
    }
  };

  // Render the avatar selection UI
  return (
    <div className="avatar-selection">
      <h2>Select your avatar</h2>
      <div className="avatar-grid">
        {avatarsQuery.data?.map(avatar => (
          <div
            key={avatar.id}
            className={`avatar-option ${selectedAvatarId === avatar.id ? 'selected' : ''}`}
            onClick={() => setSelectedAvatarId(avatar.id)}
          >
            <img src={avatar.thumbnailUrl} alt={avatar.name} />
            <p>{avatar.name}</p>
          </div>
        ))}
      </div>
      <button
        onClick={handleConfirm}
        disabled={!selectedAvatarId}
      >
        Confirm selection
      </button>
    </div>
  );
}
```

This component:

- Fetches available avatars from the server
- Retrieves the user's current avatar selection
- Allows the user to select a new avatar
- Saves the selection to the server when confirmed
- Provides visual feedback on the current selection
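The initial-selection logic inside the `useEffect` above can be factored into a pure function, which makes the edge cases (no data yet, user has no stored avatar) easy to reason about. This refactoring is a sketch, not code from the engine:

```typescript
// Hypothetical pure version of the selection-initialization logic:
// given the userAvatar query result, pick the initial avatar id.
type UserAvatarRow = { avatarId: string };

function initialSelection(rows: UserAvatarRow[] | undefined): string {
  // While the query is loading, or when the user has no stored avatar,
  // fall back to an empty selection (no avatar highlighted).
  if (!rows || rows.length === 0) return '';
  return rows[0].avatarId;
}
```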
### Avatar service

The `AvatarService` manages avatar-related operations:

```typescript
// Simplified from src/user/services/AvatarService.ts
import { API } from '@ir-engine/common';
import { avatarPath, userAvatarPath } from '@ir-engine/common/src/schema.type.module';
import { Engine } from '@ir-engine/ecs';

export const AvatarService = {
  // Get all available avatars
  async getAvatars() {
    try {
      return await API.instance.service(avatarPath).find({
        query: { $limit: 100 }
      });
    } catch (error) {
      console.error('Failed to fetch avatars:', error);
      return { data: [] };
    }
  },

  // Get the current user's avatar selection
  async getUserAvatar() {
    try {
      const result = await API.instance.service(userAvatarPath).find({
        query: { userId: Engine.instance.userID }
      });
      return result.data[0] || null;
    } catch (error) {
      console.error('Failed to fetch user avatar:', error);
      return null;
    }
  },

  // Update the user's avatar selection
  async updateUserAvatarId(avatarId) {
    try {
      await API.instance.service(userAvatarPath).patch(
        null,
        { avatarId },
        { query: { userId: Engine.instance.userID } }
      );
      return true;
    } catch (error) {
      console.error('Failed to update user avatar:', error);
      return false;
    }
  },

  // Create a new custom avatar
  async createCustomAvatar(avatarData) {
    try {
      const newAvatar = await API.instance.service(avatarPath).create(avatarData);
      // Automatically select the new avatar
      await this.updateUserAvatarId(newAvatar.id);
      return newAvatar;
    } catch (error) {
      console.error('Failed to create custom avatar:', error);
      return null;
    }
  }
};
```

This service:

- Provides methods for retrieving available avatars
- Handles fetching and updating the user's avatar selection
- Manages the creation of custom avatars
- Encapsulates API calls to the avatar-related endpoints
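Callers typically combine two of these queries: the full avatar list (for display) and the user's stored selection (a row holding only an `avatarId`). A hypothetical helper for joining the two might look like this; the record shapes are assumptions based on the fields used above:

```typescript
// Hypothetical helper composing the results of getAvatars() and
// getUserAvatar(): resolve the user's full avatar record, or null.
type Avatar = { id: string; name: string; modelURL: string };
type UserAvatar = { avatarId: string } | null;

function resolveSelectedAvatar(avatars: Avatar[], userAvatar: UserAvatar): Avatar | null {
  // No stored selection: nothing to resolve.
  if (!userAvatar) return null;
  // A stale avatarId (e.g. a deleted avatar) also resolves to null.
  return avatars.find((a) => a.id === userAvatar.avatarId) ?? null;
}
```

Handling the "stale id" case explicitly matters because the avatar catalog and the user's preference live in separate services and can drift apart.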
### Nameplate integration

Avatars are enhanced with nameplates using the XRUI system:

```typescript
// Simplified from src/systems/AvatarUISystem.tsx
import { setComponent, getComponent, hasComponent } from '@ir-engine/ecs';
import { XruiNameplateComponent } from '../social/components/XruiNameplateComponent';
import { AvatarComponent } from '@ir-engine/engine/src/avatar/components/AvatarComponent';

function addNameplateToAvatar(avatarEntity) {
  // Skip if the avatar already has a nameplate
  if (hasComponent(avatarEntity, XruiNameplateComponent)) {
    return;
  }

  // Get the avatar component to access the name
  const avatar = getComponent(avatarEntity, AvatarComponent);

  // Add the nameplate component with the avatar's name
  setComponent(avatarEntity, XruiNameplateComponent, {
    name: avatar.name
  });

  console.log(`Nameplate added to avatar: ${avatar.name}`);
}

// In the AvatarUISystem
function executeAvatarUISystem() {
  // Query for avatars without nameplates
  const avatarsWithoutNameplates = queryEntities([
    withComponent(AvatarComponent),
    withoutComponent(XruiNameplateComponent)
  ]);

  // Add nameplates to avatars that need them
  for (const entity of avatarsWithoutNameplates) {
    // Skip the local avatar if configured to do so
    const avatar = getComponent(entity, AvatarComponent);
    if (avatar.isLocal && !config.showNameplateOnSelf) {
      continue;
    }
    addNameplateToAvatar(entity);
  }
}
```

This system:

- Identifies avatars that need nameplates
- Retrieves the avatar's name from the `AvatarComponent`
- Adds an `XruiNameplateComponent` to display the name
- Configures the nameplate based on system settings
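The "has component A but not component B" query pattern used above can be illustrated with a toy component store. This sketch is not the engine's ECS; the maps and helpers are assumptions that only demonstrate the query-and-enhance loop:

```typescript
// Hypothetical toy version of the "avatars without nameplates" query.
type Entity = number;

const avatarNames = new Map<Entity, string>(); // entities with an avatar component
const nameplates = new Set<Entity>();          // entities with a nameplate component

function avatarsWithoutNameplates(): Entity[] {
  // Avatars that do not yet appear in the nameplate set.
  return [...avatarNames.keys()].filter((e) => !nameplates.has(e));
}

function addNameplate(entity: Entity): void {
  // Idempotent, matching the hasComponent guard in the system above.
  if (!nameplates.has(entity)) nameplates.add(entity);
}
```

Running the query each frame and enhancing only the entities it returns is what keeps the system cheap: once every avatar has a nameplate, the loop body does no work.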
### Webcam-driven expressions

The system can use webcam input to drive avatar facial expressions:

```typescript
// Simplified from src/media/webcam/WebcamInput.ts
import { getMutableState } from '@ir-engine/hyperflux';
import { MediaStreamState } from '@ir-engine/network/src/media/MediaStreamState';
import { AvatarComponent } from '@ir-engine/engine/src/avatar/components/AvatarComponent';
import { WebcamInputComponent } from './WebcamInputComponent';

// Toggle face tracking on/off
export async function toggleFaceTracking() {
  const mediaStreamState = getMutableState(MediaStreamState);
  const currentState = mediaStreamState.faceTracking.value;

  if (currentState) {
    // Stop face tracking
    stopFaceTracking();
    mediaStreamState.faceTracking.set(false);
  } else {
    // Start face tracking
    const success = await startFaceTracking();
    if (success) {
      mediaStreamState.faceTracking.set(true);
    }
  }
}

// Start the face tracking process
async function startFaceTracking() {
  try {
    // Initialize the webcam stream
    const stream = await navigator.mediaDevices.getUserMedia({
      video: { width: 640, height: 480 }
    });

    // Initialize the face detection worker
    const faceWorker = new Worker('/workers/face-detection-worker.js');

    // Start the processing loop
    processFaceTracking(stream, faceWorker);
    return true;
  } catch (error) {
    console.error('Failed to start face tracking:', error);
    return false;
  }
}

// Process webcam frames for face tracking
function processFaceTracking(stream, faceWorker) {
  const video = document.createElement('video');
  video.srcObject = stream;
  video.play();

  // Process frames at regular intervals
  const processInterval = setInterval(async () => {
    // Capture the current video frame
    const canvas = document.createElement('canvas');
    canvas.width = video.videoWidth;
    canvas.height = video.videoHeight;
    const ctx = canvas.getContext('2d');
    ctx.drawImage(video, 0, 0);

    // Send the frame to the worker for processing
    const imageData = ctx.getImageData(0, 0, canvas.width, canvas.height);
    faceWorker.postMessage({ imageData });
    // The worker will return the detected expressions
  }, 100);

  // Handle worker responses
  faceWorker.onmessage = (event) => {
    const { expressions } = event.data;
    if (!expressions) return;

    // Get the local avatar entity
    const avatarEntity = AvatarComponent.getSelfAvatarEntity();
    if (!avatarEntity) return;

    // Update expression values on the avatar
    for (const [expression, value] of Object.entries(expressions)) {
      if (value > 0.5) { // Threshold for expression detection
        WebcamInputComponent.setExpression(avatarEntity, expression, value);
      }
    }
  };
}
```

This implementation:

- Provides a function to toggle face tracking on and off
- Initializes the webcam stream and face detection worker
- Processes video frames at regular intervals
- Updates avatar expressions based on detected facial features
- Applies the expressions to the avatar's morph targets
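Raw per-frame detections tend to jitter, so a post-processing step often combines the 0.5 detection threshold shown above with smoothing before the weights reach the morph targets. The sketch below is an assumed refinement, not engine code; the smoothing factor is an arbitrary illustrative value:

```typescript
// Hypothetical post-processing of detected expression weights: apply the
// detection threshold from the code above plus exponential smoothing.
function smoothExpressions(
  previous: Record<string, number>,
  detected: Record<string, number>,
  alpha = 0.3,     // Smoothing factor (assumed value)
  threshold = 0.5  // Same detection threshold as the system above
): Record<string, number> {
  const next: Record<string, number> = { ...previous };
  for (const [name, value] of Object.entries(detected)) {
    if (value <= threshold) continue; // Below threshold: ignore the sample.
    const prev = previous[name] ?? 0;
    next[name] = prev + alpha * (value - prev); // Move toward the new value.
  }
  return next;
}
```

Calling this once per detection frame yields morph-target weights that follow the user's face without flickering between frames.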
## Avatar workflow

The complete avatar workflow follows this sequence:

```mermaid
sequenceDiagram
    participant User
    participant UI as Avatar selection UI
    participant Service as Avatar service
    participant Server as Backend server
    participant Spawn as Avatar spawn system
    participant Entity as Avatar entity
    participant Webcam as Webcam input (optional)

    User->>UI: Opens avatar selection
    UI->>Service: getAvatars()
    Service->>Server: api.service(avatarPath).find()
    Server-->>Service: Available avatars
    Service-->>UI: Avatar options
    User->>UI: Selects avatar
    UI->>Service: updateUserAvatarId(selectedId)
    Service->>Server: api.service(userAvatarPath).patch()
    Server-->>Service: Confirmation

    Note over User, Entity: Later, when joining a world

    User->>Spawn: Joins 3D environment
    Spawn->>Service: getUserAvatar()
    Service->>Server: api.service(userAvatarPath).find()
    Server-->>Service: User's avatar selection
    Service-->>Spawn: Avatar model URL
    Spawn->>Entity: spawnLocalAvatarInWorld()
    Entity->>Entity: Load 3D model
    Entity->>Entity: Add nameplate

    opt Webcam tracking enabled
        User->>Webcam: Enables face tracking
        Webcam->>Webcam: Start webcam and detection
        Webcam->>Entity: Update facial expressions
    end

    Entity-->>User: Avatar visible in world
```

This diagram illustrates:

1. The user selects an avatar through the UI
2. The selection is saved to the server
3. When joining a 3D environment, the system retrieves the user's avatar preference
4. The avatar is spawned with the appropriate model
5. A nameplate is added to display the username
6. Optionally, webcam-driven expressions are enabled
7. The avatar becomes visible to the user and others in the environment
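The join-world half of this sequence can be sketched as a small orchestration function with its dependencies injected. None of the names below are real engine APIs; the real service and spawn calls are asynchronous network and ECS operations, but synchronous stubs keep the sketch self-contained:

```typescript
// Hypothetical sketch of the join-world sequence from the diagram above,
// with the avatar service and spawner injected as plain functions.
interface AvatarWorkflowDeps {
  getUserAvatar(): { avatarId: string; modelURL: string } | null;
  spawn(modelURL: string): number; // returns the new entity id
  addNameplate(entity: number): void;
}

function joinWorld(deps: AvatarWorkflowDeps): number | null {
  // 1. Retrieve the stored avatar preference.
  const selection = deps.getUserAvatar();
  if (!selection) return null; // No selection: nothing to spawn.

  // 2. Spawn the avatar with the selected model.
  const entity = deps.spawn(selection.modelURL);

  // 3. Enhance the avatar with a nameplate.
  deps.addNameplate(entity);
  return entity;
}
```

Injecting the dependencies this way also makes the ordering of the sequence (preference lookup, then spawn, then enhancement) straightforward to verify in tests.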
## Integration with other components

The avatar system integrates with several other components.

### XRUI system

Avatars use XRUI for nameplates and other UI elements:

```typescript
// Example of XRUI integration with avatars
import { createXRUI } from '@ir-engine/engine/src/xrui/createXRUI';
import { setComponent, getComponent } from '@ir-engine/ecs';
import { TransformComponent } from '@ir-engine/spatial';
import { EntityTreeComponent } from '@ir-engine/ecs';

// Implementation of XruiNameplateComponent
function createNameplateForAvatar(avatarEntity, username) {
  // Create the nameplate UI component
  const NameplateUI = ({ username }) => (
    <div className="nameplate">
      <span>{username}</span>
    </div>
  );

  // Create the XRUI instance
  const ui = createXRUI(NameplateUI, { username });

  // Position it above the avatar
  const transform = getComponent(ui.entity, TransformComponent);
  transform.position.set(0, 1.8, 0); // Above the avatar's head

  // Make it a child of the avatar entity
  setComponent(ui.entity, EntityTreeComponent, { parentEntity: avatarEntity });

  return ui.entity;
}
```

This integration:

- Creates UI elements that are positioned relative to avatars
- Ensures UI elements like nameplates move with the avatar
- Provides contextual information about users in the 3D space
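The example above hard-codes the nameplate at y = 1.8, which matches the default avatar height. A hypothetical refinement derives the offset from the avatar's `height` field so taller or shorter models keep the nameplate readable; the margin below is an assumed value, not an engine constant:

```typescript
// Hypothetical helper: place the nameplate slightly above the top of
// the avatar, using the height field from AvatarComponent.
function nameplateOffsetY(avatarHeight: number, margin = 0.2): number {
  return avatarHeight + margin;
}
```

With this helper, `transform.position.set(0, nameplateOffsetY(avatar.height), 0)` would replace the fixed constant.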
### Authentication system

Avatars are associated with authenticated users:

```typescript
// Example of authentication integration with avatars
import { getState } from '@ir-engine/hyperflux';
import { AuthState } from '../user/services/AuthService';

function getAvatarInfoFromAuth() {
  const authState = getState(AuthState);

  // Only proceed if the user is authenticated
  if (!authState.isAuthenticated.value) {
    return null;
  }

  // Get user information from the auth state
  const user = authState.user;

  return {
    userId: user.id.value,
    username: user.name.value
    // Other relevant user information
  };
}
```

This integration:

- Retrieves user information from the authentication state
- Associates avatars with specific user accounts
- Ensures avatars display the correct username

### FeathersJS API services

Avatar data is managed through FeathersJS services:

```typescript
// Example of FeathersJS integration with avatars
import { API } from '@ir-engine/common';
import { avatarPath, userAvatarPath } from '@ir-engine/common/src/schema.type.module';

// Fetch avatar data from the server
async function fetchAvatarData(userId) {
  try {
    // Get the user's avatar preference
    const userAvatarResult = await API.instance.service(userAvatarPath).find({
      query: { userId }
    });

    if (userAvatarResult.data.length === 0) {
      return null;
    }

    const userAvatar = userAvatarResult.data[0];

    // Get the detailed avatar information
    const avatar = await API.instance.service(avatarPath).get(userAvatar.avatarId);

    return {
      id: avatar.id,
      name: avatar.name,
      modelUrl: avatar.modelResource.url,
      thumbnailUrl: avatar.thumbnailResource?.url
    };
  } catch (error) {
    console.error('Failed to fetch avatar data:', error);
    return null;
  }
}
```

This integration:

- Retrieves avatar preferences and data from the server
- Manages the relationship between users and their selected avatars
- Provides the necessary information for avatar spawning and display
## Benefits of avatar management

The avatar management and customization system provides several key advantages:

- **User identity**: gives users a recognizable presence in virtual environments
- **Personalization**: allows users to express themselves through avatar appearance
- **Social interaction**: facilitates non-verbal communication through avatars
- **Immersion**: enhances the sense of presence in virtual spaces
- **Expression**: enables emotional communication through animations and expressions
- **Integration**: works seamlessly with other systems like XRUI and authentication
- **Extensibility**: supports integration with external avatar creation services

These benefits make avatar management an essential component for creating engaging and personalized experiences in the iR Engine.

## Next steps

With an understanding of how users are represented as avatars, the next chapter explores how multiple users connect to the same virtual environment.

Next: Instance provisioning and networking (docid: vvjmpnve2ubdtea4utin)