# Media stream recording
## Overview

The media stream recording component is a specialized feature of the iR Engine's multiplayer infrastructure that enables the capture and storage of audio and video streams from players during gameplay sessions. It leverages the WebRTC communication backbone to access media streams and processes them into standard video formats for later viewing or analysis. By providing robust recording capabilities, this system enhances the multiplayer experience with features like session replay, content creation, and performance analysis.

This chapter explores the implementation, workflow, and applications of media stream recording within the iR Engine.

## Core concepts

### Media capture

The system captures media streams directly from the multiplayer session:

- **Stream interception**: tapping into the existing WebRTC media flows
- **Audio recording**: capturing player voice communications
- **Video recording**: capturing webcam feeds or screen shares
- **Selective recording**: ability to record specific players or all participants
- **Minimal performance impact**: designed to avoid affecting the gameplay experience

This approach ensures high-quality recordings without disrupting the game session.

### Media processing

Captured streams are processed into standard media formats:

- **Format conversion**: transforming raw WebRTC streams into standard video files
- **Codec handling**: supporting various audio and video codecs (VP8, H.264, Opus)
- **Synchronization**: ensuring audio and video remain properly aligned
- **Quality control**: balancing file size and recording quality
- **Real-time processing**: converting streams as they arrive rather than after completion

This processing creates usable media files from the raw stream data.

### Storage and access

Processed recordings are stored for later access:

- **Cloud storage**: uploading recordings to cloud storage services
- **Local storage**: option for storing recordings on local infrastructure
- **Metadata tracking**: maintaining information about recordings (participants, duration, etc.)
- **Access control**: managing who can access recorded content
- **Retrieval mechanisms**: APIs for finding and accessing recordings

This storage system ensures recordings are preserved and accessible when needed.

## Implementation

### Recording system initialization

The recording system is initialized as part of the server setup:

```typescript
// Simplified from src/MediasoupRecordingSystem.ts
import { defineSystem } from '@ir-engine/ecs/src/SystemFunctions';
import { getMutableState, none } from '@ir-engine/hyperflux';
import { RecordingAPIState } from '@ir-engine/common/src/recording/ECSRecordingSystem';
import { useEffect } from 'react';

// Define the recording system
export const MediasoupRecordingSystem = defineSystem({
  uuid: 'ee.instanceserver.MediasoupRecordingSystem',

  // Reactor component that exposes recording functionality
  reactor: () => {
    useEffect(() => {
      // Make recording functions available through Hyperflux state
      getMutableState(RecordingAPIState).merge({
        createMediaChannelRecorder: startMediaRecording,
        stopMediaChannelRecorder: stopMediaRecording
      })

      // Clean up when the system is destroyed
      return () => {
        getMutableState(RecordingAPIState).merge({
          createMediaChannelRecorder: none,
          stopMediaChannelRecorder: none
        })
      }
    }, [])

    return null
  },

  // Regular execution function
  execute: () => {
    // Check for recording requests
    // Monitor active recordings
    // Handle recording errors
  }
})
```

This code:

- Defines a system for managing media recordings
- Exposes recording functions through Hyperflux state
- Sets up cleanup when the system is destroyed
- Implements an execution function for regular processing
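Once the reactor has populated `RecordingAPIState`, any server-side module can start a capture without importing the recording system directly. A minimal usage sketch, assuming placeholder peer IDs and the option shape described in the next section:

```typescript
// Hedged example: invoking the recorder through RecordingAPIState.
// The peer IDs and option values are placeholders, not from the source.
import { getState } from '@ir-engine/hyperflux'
import { RecordingAPIState } from '@ir-engine/common/src/recording/ECSRecordingSystem'

async function recordPeersForDemo() {
  const recordingAPI = getState(RecordingAPIState)
  if (!recordingAPI.createMediaChannelRecorder) {
    throw new Error('Recording system not available')
  }

  // Start recording two hypothetical peers as medium-quality WebM
  const recording = await recordingAPI.createMediaChannelRecorder(['peer-a', 'peer-b'], {
    outputFormat: 'webm',
    quality: 'medium'
  })

  // ...when the session ends, stop and tear down the recording
  await recording.stop()
}
```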
### Starting a recording

The system provides functions to initiate recordings:

```typescript
// Simplified from src/MediasoupRecordingSystem.ts
import { getState } from '@ir-engine/hyperflux';
import { MediasoupMediaProducerState } from './MediasoupMediaProducerState';

/**
 * Starts recording media for specified peers
 * @param peerIds Array of peer IDs to record
 * @param options Recording options
 * @returns Recording ID and control functions
 */
async function startMediaRecording(
  peerIds: string[],
  options: {
    audioEnabled?: boolean
    videoEnabled?: boolean
    outputFormat?: 'webm' | 'mp4'
    quality?: 'high' | 'medium' | 'low'
  } = {}
) {
  // Generate a unique recording ID
  const recordingId = generateUniqueId()

  // Default options
  const {
    audioEnabled = true,
    videoEnabled = true,
    outputFormat = 'webm',
    quality = 'medium'
  } = options

  // Track recording state
  const recordingState = {
    id: recordingId,
    peerIds,
    startTime: Date.now(),
    status: 'initializing',
    tracks: {} as Record<string, any>
  }

  // Store recording state
  activeRecordings.set(recordingId, recordingState)

  try {
    // Process each peer
    for (const peerId of peerIds) {
      // Get the peer's producers from state
      const mediaProducerState = getState(MediasoupMediaProducerState)
      const peerProducers = Object.values(mediaProducerState.producers)
        .flatMap((networkProducers) => Object.values(networkProducers))
        .filter((producer) => producer.peerId === peerId)

      // Find audio and video producers
      const audioProducer = peerProducers.find((p) => p.kind === 'audio')
      const videoProducer = peerProducers.find((p) => p.kind === 'video')

      // Start recording for this peer if they have the required producers
      if ((audioEnabled && audioProducer) || (videoEnabled && videoProducer)) {
        await startMediaRecordingPair(
          recordingId,
          peerId,
          audioEnabled ? audioProducer : null,
          videoEnabled ? videoProducer : null,
          outputFormat,
          quality
        )
      }
    }

    // Update recording status
    recordingState.status = 'recording'

    // Return recording control interface
    return {
      recordingId,
      stop: () => stopMediaRecording(recordingId),
      getStatus: () => getRecordingStatus(recordingId)
    }
  } catch (error) {
    // Handle initialization errors
    console.error(`Failed to start recording ${recordingId}:`, error)

    // Clean up any partially initialized resources
    await stopMediaRecording(recordingId)

    // Remove recording state
    activeRecordings.delete(recordingId)

    throw error
  }
}
```

This function:

- Generates a unique ID for the recording
- Processes recording options with sensible defaults
- Initializes recording state tracking
- Retrieves the media producers for each peer
- Starts recording for each peer with the required producers
- Returns a control interface for managing the recording
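`startMediaRecording` leans on a few helpers that the excerpt does not define: `activeRecordings`, `generateUniqueId`, and `getRecordingStatus`. A minimal sketch of plausible implementations, assuming an in-memory registry keyed by recording ID:

```typescript
// Hypothetical helpers assumed by startMediaRecording above; not from the source.
import { randomUUID } from 'crypto'

// In-memory registry of recordings currently owned by this instance server
const activeRecordings = new Map<
  string,
  {
    id: string
    peerIds: string[]
    startTime: number
    status: string
    tracks: Record<string, any>
  }
>()

// Generate a unique identifier for a new recording
function generateUniqueId(): string {
  return randomUUID()
}

// Look up the current status of a recording, if it is still tracked
function getRecordingStatus(recordingId: string): string {
  return activeRecordings.get(recordingId)?.status ?? 'unknown'
}
```

In practice the registry would also need to survive instance-server restarts or be mirrored into shared state, but an in-memory map is enough to follow the control flow in this chapter.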
### Setting up media consumers

The system creates special consumers to access media streams:

```typescript
// Simplified from src/MediasoupRecordingSystem.ts
import { getRouterForPeer } from './MediasoupUtils';

/**
 * Creates mediasoup consumers for recording
 * @param peerId Peer ID to record
 * @param producer Producer to consume
 * @returns Consumer and transport information
 */
async function createRecordingConsumer(peerId: string, producer: any) {
  // Get the router for this peer
  const router = getRouterForPeer(peerId)

  // Create a plain transport for server-side media consumption
  // This transport doesn't use DTLS/ICE like normal WebRTC
  const transport = await router.createPlainTransport({
    listenIp: { ip: '127.0.0.1', announcedIp: null },
    rtcpMux: true,
    comedia: false
  })

  // Create a consumer to receive the producer's media
  const consumer = await transport.consume({
    producerId: producer.id,
    rtpCapabilities: router.rtpCapabilities,
    paused: false
  })

  // Return the consumer and transport
  return {
    consumer,
    transport,
    kind: producer.kind,
    rtpParameters: consumer.rtpParameters,
    localRtpPort: transport.tuple.localPort
  }
}
```

This function:

- Gets the mediasoup router for the specified peer
- Creates a plain transport for server-side media consumption
- Creates a consumer to receive the producer's media
- Returns the consumer, transport, and related information

### Processing media with FFmpeg

The system uses FFmpeg to process and encode the media streams:

```typescript
// Simplified from src/FFMPEG.ts
import { spawn } from 'child_process';
import ffmpegStatic from 'ffmpeg-static';

/**
 * Starts an FFmpeg process to record media
 * @param audioTrack Audio track information
 * @param videoTrack Video track information
 * @param outputPath Output file path
 * @param options Encoding options
 * @returns FFmpeg process and control functions
 */
async function startFFMPEG(
  audioTrack: any | null,
  videoTrack: any | null,
  outputPath: string,
  options: {
    format: 'webm' | 'mp4'
    quality: 'high' | 'medium' | 'low'
  }
) {
  // Create SDP (Session Description Protocol) content
  // This tells FFmpeg what kind of media to expect and where
  const sdpContent = createSdpContent(audioTrack, videoTrack)

  // Determine codec settings based on quality
  const videoCodecSettings = getVideoCodecSettings(options.quality)
  const audioCodecSettings = getAudioCodecSettings(options.quality)

  // Build FFmpeg arguments
  const ffmpegArgs = [
    // Input options
    '-protocol_whitelist', 'pipe,file,rtp,udp',
    '-i', 'pipe:0', // Read SDP from stdin

    // Video codec settings (if a video track exists)
    ...(videoTrack ? videoCodecSettings : []),

    // Audio codec settings (if an audio track exists)
    ...(audioTrack ? audioCodecSettings : []),

    // Output format
    '-f', options.format,

    // Output options
    '-y', outputPath // Output file path
  ]

  // Spawn the FFmpeg process
  const ffmpegProcess = spawn(ffmpegStatic, ffmpegArgs)

  // Write the SDP content to FFmpeg's stdin
  ffmpegProcess.stdin.write(sdpContent)
  ffmpegProcess.stdin.end()

  // Handle process events
  ffmpegProcess.on('error', (error) => {
    console.error('FFmpeg process error:', error)
  })

  // Return the process and control functions
  return {
    process: ffmpegProcess,
    stop: () => {
      ffmpegProcess.kill('SIGINT')
    }
  }
}

/**
 * Creates SDP content for FFmpeg
 * @param audioTrack Audio track information
 * @param videoTrack Video track information
 * @returns SDP content string
 */
function createSdpContent(audioTrack: any | null, videoTrack: any | null) {
  // SDP header
  let sdp = 'v=0\n'
  sdp += 'o=- 0 0 IN IP4 127.0.0.1\n'
  sdp += 's=iR Engine Media Recording\n'
  sdp += 'c=IN IP4 127.0.0.1\n'
  sdp += 't=0 0\n'

  // Add an audio media section if an audio track exists
  if (audioTrack) {
    sdp += 'm=audio ' + audioTrack.localRtpPort + ' RTP/AVP 111\n'
    sdp +=
      'a=rtpmap:111 ' +
      audioTrack.rtpParameters.codecs[0].mimeType.split('/')[1] +
      '/' +
      audioTrack.rtpParameters.codecs[0].clockRate +
      '\n'
    // Add other audio parameters
  }

  // Add a video media section if a video track exists
  if (videoTrack) {
    sdp += 'm=video ' + videoTrack.localRtpPort + ' RTP/AVP 96\n'
    sdp +=
      'a=rtpmap:96 ' +
      videoTrack.rtpParameters.codecs[0].mimeType.split('/')[1] +
      '/' +
      videoTrack.rtpParameters.codecs[0].clockRate +
      '\n'
    // Add other video parameters
  }

  return sdp
}
```

This code:

- Creates SDP content that describes the media streams for FFmpeg
- Determines codec settings based on the requested quality
- Builds the FFmpeg command-line arguments
- Spawns an FFmpeg process with the appropriate configuration
- Writes the SDP content to FFmpeg's standard input
- Returns the process and control functions
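The `getVideoCodecSettings` and `getAudioCodecSettings` helpers are referenced above but not included in the source excerpt. A plausible sketch, assuming VP8 and Opus encoding for WebM output, with quality tiers that only change the target bitrates:

```typescript
// Hypothetical codec-settings helpers; the flags and bitrates are assumptions,
// not taken from the iR Engine source.
type Quality = 'high' | 'medium' | 'low'

const VIDEO_BITRATES: Record<Quality, string> = {
  high: '4M',
  medium: '2M',
  low: '1M'
}

const AUDIO_BITRATES: Record<Quality, string> = {
  high: '192k',
  medium: '128k',
  low: '64k'
}

// FFmpeg arguments for encoding the incoming video RTP stream with VP8
function getVideoCodecSettings(quality: Quality): string[] {
  return ['-map', '0:v:0', '-c:v', 'libvpx', '-b:v', VIDEO_BITRATES[quality]]
}

// FFmpeg arguments for encoding the incoming audio RTP stream with Opus
function getAudioCodecSettings(quality: Quality): string[] {
  return ['-map', '0:a:0', '-c:a', 'libopus', '-b:a', AUDIO_BITRATES[quality]]
}
```

A production configuration would likely also pin the frame rate, keyframe interval, and thread count, but bitrate tiers are enough to illustrate how the quality switch feeds into the FFmpeg arguments.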
### Uploading recordings

The system uploads completed recordings to storage:

```typescript
// Simplified from src/MediasoupRecordingSystem.ts
import path from 'path';
import { createReadStream } from 'fs';
import { uploadToStorage } from './StorageService';

/**
 * Uploads a recording to storage
 * @param recordingId Recording ID
 * @param filePath Local file path
 * @returns Storage URL
 */
async function uploadRecording(recordingId: string, filePath: string) {
  try {
    // Create a read stream from the file
    const fileStream = createReadStream(filePath)

    // Generate the storage path
    const storagePath = `recordings/${recordingId}/${path.basename(filePath)}`

    // Upload to storage
    const url = await uploadToStorage(fileStream, storagePath, {
      contentType: getContentType(filePath),
      metadata: {
        recordingId,
        timestamp: new Date().toISOString()
      }
    })

    console.log(`Recording ${recordingId} uploaded to ${url}`)
    return url
  } catch (error) {
    console.error(`Failed to upload recording ${recordingId}:`, error)
    throw error
  }
}

/**
 * Gets the content type based on the file extension
 * @param filePath File path
 * @returns Content type
 */
function getContentType(filePath: string) {
  const extension = path.extname(filePath).toLowerCase()

  switch (extension) {
    case '.webm':
      return 'video/webm'
    case '.mp4':
      return 'video/mp4'
    case '.ogg':
      return 'audio/ogg'
    default:
      return 'application/octet-stream'
  }
}
```

This code:

- Creates a read stream from the local recording file
- Generates a storage path for the recording
- Uploads the file to storage with appropriate metadata
- Returns the URL for accessing the uploaded recording

## Recording workflow

The complete recording workflow follows this sequence:

```mermaid
sequenceDiagram
    participant Admin as Admin/System
    participant Server as iR Engine server
    participant Mediasoup
    participant FFmpeg
    participant Storage

    Admin->>Server: Request recording (peerIds, options)
    Server->>Server: Generate recordingId
    loop For each peer
        Server->>Mediasoup: Create plain transport
        Mediasoup->>Server: Transport created
        Server->>Mediasoup: Create consumer for peer's producer
        Mediasoup->>Server: Consumer created
        Server->>Server: Create SDP content
        Server->>FFmpeg: Start process with SDP
        Mediasoup->>FFmpeg: Send RTP packets
        FFmpeg->>FFmpeg: Process and encode media
        FFmpeg->>Server: Write to output file
    end
    Admin->>Server: Stop recording
    Server->>FFmpeg: Stop process
    FFmpeg->>Server: Process terminated
    Server->>Storage: Upload recording file
    Storage->>Server: Upload complete
    Server->>Admin: Recording URL
```

This diagram illustrates:

1. An admin or system requests a recording with specific peers and options
2. The server generates a unique recording ID
3. For each peer, the server creates transports and consumers in mediasoup
4. The server creates SDP content and starts an FFmpeg process
5. Mediasoup sends RTP packets to FFmpeg, which processes and encodes the media
6. When the recording is stopped, the FFmpeg process is terminated
7. The server uploads the recording file to storage
8. The recording URL is returned to the admin or system
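The stop path in this workflow corresponds to `stopMediaRecording`, which the other snippets call but the source excerpts do not show. A hedged sketch of what it plausibly does, reusing the `activeRecordings` registry sketched earlier and assuming each tracked peer keeps handles to its FFmpeg process, mediasoup resources, and output file:

```typescript
// Hypothetical sketch of stopMediaRecording; the structure and field names are
// assumptions based on the surrounding snippets, not the actual source.
async function stopMediaRecording(recordingId: string) {
  const recording = activeRecordings.get(recordingId)
  if (!recording) throw new Error(`Recording ${recordingId} not found`)

  recording.status = 'stopping'
  const outputUrls: string[] = []

  // Each tracked peer is assumed to hold its FFmpeg handle, mediasoup
  // resources, and the local path of its output file
  for (const track of Object.values(recording.tracks)) {
    track.ffmpeg?.stop() // Terminate the encoder (SIGINT)
    track.consumer?.close() // Release the mediasoup consumer
    track.transport?.close() // Release the plain transport

    // Upload the finished file and remember its URL
    if (track.outputPath) {
      outputUrls.push(await uploadRecording(recordingId, track.outputPath))
    }
  }

  recording.status = 'completed'
  activeRecordings.delete(recordingId)

  return { recordingId, outputUrls }
}
```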
## Integration with other components

The media stream recording system integrates with several other components of the multiplayer infrastructure.

### WebRTC communication backbone

The recording system leverages the WebRTC infrastructure:

```typescript
// Example of WebRTC integration
import { getState } from '@ir-engine/hyperflux';
import { MediasoupMediaProducerState } from './MediasoupMediaProducerState';

// Function to find producers for a peer
function findProducersForPeer(peerId: string) {
  // Get the current producer state
  const producerState = getState(MediasoupMediaProducerState)

  // Find all producers for this peer across all networks
  const peerProducers = Object.values(producerState.producers)
    .flatMap((networkProducers) => Object.values(networkProducers))
    .filter((producer) => producer.peerId === peerId)

  // Separate audio and video producers
  const audioProducer = peerProducers.find((p) => p.kind === 'audio')
  const videoProducer = peerProducers.find((p) => p.kind === 'video')

  return { audioProducer, videoProducer }
}
```

This integration:

- Uses the WebRTC state to find media producers for specific peers
- Accesses the mediasoup infrastructure to create consumers
- Leverages the existing media flow for recording purposes
- Minimizes additional overhead by tapping into existing streams

### Hyperflux state management

The recording system uses Hyperflux for state management:

```typescript
// Example of Hyperflux integration
import { defineState, getMutableState } from '@ir-engine/hyperflux';

// Define recording state
export const RecordingState = defineState({
  name: 'ee.recording.RecordingState',
  initial: {
    activeRecordings: {} as Record<
      string,
      {
        id: string
        peerIds: string[]
        startTime: number
        status: 'initializing' | 'recording' | 'stopping' | 'completed' | 'error'
        error?: string
        outputUrls?: string[]
      }
    >
  }
})

// Function to update recording status
function updateRecordingStatus(recordingId: string, status: string, error?: string) {
  const recordingState = getMutableState(RecordingState)

  if (recordingState.activeRecordings[recordingId]) {
    recordingState.activeRecordings[recordingId].status.set(status)

    if (error) {
      recordingState.activeRecordings[recordingId].error.set(error)
    }
  }
}
```

This integration:

- Defines a state container for tracking recording status
- Updates state as recordings progress through their lifecycle
- Makes recording status available to other components
- Follows Hyperflux patterns for state management

### FeathersJS services

The recording system integrates with FeathersJS services:

```typescript
// Example of FeathersJS integration
import { Application } from '@feathersjs/feathers';
import { getState } from '@ir-engine/hyperflux';
import { RecordingAPIState } from '@ir-engine/common/src/recording/ECSRecordingSystem';
// RecordingState is the status container defined in the Hyperflux example above

// Recording service
class RecordingService {
  app: Application

  constructor(app: Application) {
    this.app = app
  }

  // Start a recording
  async create(data: {
    peerIds: string[]
    options?: {
      audioEnabled?: boolean
      videoEnabled?: boolean
      outputFormat?: 'webm' | 'mp4'
      quality?: 'high' | 'medium' | 'low'
    }
  }) {
    // Get the recording function from Hyperflux state
    const recordingAPI = getState(RecordingAPIState)

    if (!recordingAPI.createMediaChannelRecorder) {
      throw new Error('Recording system not available')
    }

    // Start the recording
    const recording = await recordingAPI.createMediaChannelRecorder(data.peerIds, data.options)

    // Return recording information
    return {
      recordingId: recording.recordingId,
      status: 'recording',
      peerIds: data.peerIds
    }
  }

  // Stop a recording
  async patch(recordingId: string) {
    // Get the stop function from Hyperflux state
    const recordingAPI = getState(RecordingAPIState)

    if (!recordingAPI.stopMediaChannelRecorder) {
      throw new Error('Recording system not available')
    }

    // Stop the recording
    const result = await recordingAPI.stopMediaChannelRecorder(recordingId)

    // Return the result
    return {
      recordingId,
      status: 'completed',
      outputUrls: result.outputUrls
    }
  }

  // Get recording status
  async get(recordingId: string) {
    const recordingState = getState(RecordingState)
    const recording = recordingState.activeRecordings[recordingId]

    if (!recording) {
      throw new Error(`Recording ${recordingId} not found`)
    }

    return {
      recordingId,
      status: recording.status,
      peerIds: recording.peerIds,
      startTime: recording.startTime,
      error: recording.error,
      outputUrls: recording.outputUrls
    }
  }
}

// Register the service
export default function (app: Application) {
  app.use('/recordings', new RecordingService(app))
}
```

This integration:

- Creates a FeathersJS service for managing recordings
- Exposes recording functionality through a standard API
- Leverages Hyperflux state for accessing recording functions
- Provides methods for starting, stopping, and checking recordings
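With the service registered, a recording can be driven end to end over the standard Feathers API. A hedged client-side sketch, assuming an authenticated Feathers client named `app` and placeholder peer IDs:

```typescript
// Hypothetical client-side usage of the recordings service registered above.
// The Feathers client setup and peer IDs are placeholders, not from the source.
import { Application } from '@feathersjs/feathers'

async function recordSession(app: Application, peerIds: string[]) {
  // Start a recording for the given peers
  const { recordingId } = await app.service('recordings').create({
    peerIds,
    options: { audioEnabled: true, videoEnabled: true, outputFormat: 'webm' }
  })

  // Check how the recording is progressing
  const status = await app.service('recordings').get(recordingId)
  console.log(`Recording ${recordingId} is ${status.status}`)

  // Stop the recording and collect the uploaded output URLs
  const result = await app.service('recordings').patch(recordingId, {})
  console.log('Recording available at:', result.outputUrls)
}
```

Because the service only wraps the functions exposed through `RecordingAPIState`, the same calls work whether they come from an admin dashboard, a moderation tool, or another server-side service.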
## Benefits of media stream recording

The media stream recording system provides several key advantages:

- **Session replay**: enables players to review past gameplay or meetings
- **Content creation**: provides high-quality footage for streaming or promotional content
- **Performance analysis**: allows players and coaches to analyze gameplay
- **Training materials**: creates instructional content for new players
- **Compliance and moderation**: maintains records for moderation or compliance purposes
- **Event archiving**: preserves important events or tournaments
- **Debugging**: helps developers identify and fix issues by reviewing sessions

These benefits make media stream recording an essential feature for enhancing the multiplayer experience.

## Conclusion

The media stream recording system represents a powerful extension of the iR Engine's multiplayer infrastructure. By leveraging the WebRTC communication backbone, it enables the capture and preservation of gameplay sessions for various purposes. This capability enhances the overall multiplayer experience and provides valuable tools for players, content creators, and developers.

Throughout this documentation series, we've explored the complete multiplayer infrastructure of the iR Engine: from the WebRTC communication backbone to instance lifecycle management, user connection and authorization, the FeathersJS application structure, the Hyperflux state management system, and finally, media stream recording. Together, these components create a robust foundation for building engaging multiplayer experiences.