Test project implementing a WebSocket / WebRTC SFU (Selective Forwarding Unit) network architecture in Unity. This project was created for the Oculus Quest 2, but it also supports other platforms (PC, Android mobile) so that connections from multiple clients can be tested.
Each protocol's client (`WebRTC` or `WebSocket`) inherits the `SfuClient` class and can call the `Connect`, `Close`, `Send`, and `OnMessage` functions through the same API.
- Broadcast
- Unicast (by user id)
- Multicast
- `DataChannel`
- Audio
- Video (not tested yet)
- Trickle-ICE
- Vanilla-ICE (no plans at the moment)
- Binary
- Text (no plans at the moment)
- `OnOpen`
- `OnClose`
- `OnJoin`
- `OnExit`
`NetworkId`s are defined as a combination of a public id and a private id. The public id is assigned additionally when a network-manageable gameobject (such as a `Prefab`) is spawned.
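The public/private id combination described above can be sketched as a small value type; the field names, types, and formatting below are assumptions for illustration, not the project's actual layout.

```csharp
using System;

// Hypothetical sketch: a NetworkId combines a public id (assigned when a
// network-manageable gameobject, e.g. a Prefab, is spawned) with a private
// id identifying the concrete instance.
public readonly struct NetworkId : IEquatable<NetworkId>
{
    public readonly int PublicId;
    public readonly int PrivateId;

    public NetworkId(int publicId, int privateId)
    {
        PublicId = publicId;
        PrivateId = privateId;
    }

    // Two ids refer to the same networked object only if both parts match.
    public bool Equals(NetworkId other) =>
        PublicId == other.PublicId && PrivateId == other.PrivateId;

    public override int GetHashCode() => HashCode.Combine(PublicId, PrivateId);

    public override string ToString() => $"{PublicId}:{PrivateId}";
}
```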
- The default protocol used for synchronisation can be selected (`WebRTC` or `WebSocket`)
All network-manageable gameobjects (`NetworkObject`s) are listed and managed in their respective `NetworkObjectGroup`. Thus, pre-existing `NetworkObject`s and any spawnable gameobjects are managed in their own `NetworkObjectGroup` and initialized through it.
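The grouping described above can be sketched as a simple registry that initializes its members in one pass; the class members below (`Register`, `InitAllObjects`, the `Initialized` flag) are assumptions for illustration, and the real classes are Unity components rather than plain C# classes.

```csharp
using System.Collections.Generic;

// Hypothetical sketch of a network-manageable object; in the project this
// would be a Unity component, simplified here to a plain class.
public class NetworkObject
{
    public bool Initialized { get; private set; }

    public void Init() => Initialized = true;
}

// Hypothetical sketch: every NetworkObject (pre-existing or spawned) is
// registered in a group, and the group drives initialization.
public class NetworkObjectGroup
{
    private readonly List<NetworkObject> m_objects = new List<NetworkObject>();

    public void Register(NetworkObject obj) => m_objects.Add(obj);

    // Initialization always goes through the group, so members are set up
    // in a single, predictable pass.
    public void InitAllObjects()
    {
        foreach (var obj in m_objects)
            obj.Init();
    }
}
```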
- Transform (threshold based)
  - 16-bit float
  - 32-bit float
  - With `Rigidbody`
    - With interpolation
    - Without interpolation
  - Nesting synchronisable gameobjects
    - Updating based on the world-space transform
    - Updating based on the relative transform
- `Animator`
  - With interpolation
  - Without interpolation (poorly tested ...)
- Avatar
  - Headset
  - Hand/Finger
  - Elbows/Knees (IK may need to be added first to support this ...)
- Grab
- Ray
- Poke
Demo videos: `demo.0.mp4`, `demo.1.mp4`, `demo.2.mp4`
- Clone this repository and its submodules with the following commands:

```
git clone https://github.com/TLabAltoh/Unity-SFU-Integration
cd Unity-SFU-Integration
git submodule update --init
```
- Select `Multi Pass` in `Project Settings/XR Plug-in Management/Oculus/Stereo Rendering Mode` for UI canvas and hand-tracking rendering.
- Search for the asset named `Config` in the Project view and set your server's IP/port.
- Open `Assets/Samples/VRProject/Scenes/MAIN.unity`
Oculus Quest or PC

You may not need to change anything here, as the following is the default setting.

- Replace `StandaloneInputModule` with `CanvasModule`
- Please confirm the following setting

Android Mobile (not Oculus Quest)

- Replace `CanvasModule` with `StandaloneInputModule`
- Please confirm the following settings
- Play `Assets/Samples/VRProject/Scenes/MAIN.unity` in the Unity Editor, or build and run the app
Note

The first participant to join is treated as the host, but World Space UI operation is only supported in Oculus / PC mode, so the Android mobile client must join after the host has joined.
- `PC`: same as the Unity Editor's Scene view.
- `Oculus Quest`: headset position tracking.
- `Android Mobile`: joystick at the bottom left of the screen.
- Clone and run the server repository with the following commands:

```
git clone https://github.com/TLabAltoh/rust-server-for-multiplayer.git
cd rust-server-for-multiplayer
build-*.bat
run-*.bat
```
This project has only been tested on a local network, not on a dedicated server; the server was hosted on an ordinary Windows PC.
All of the features are implemented in the sample scene, and the project is still undergoing continuous breaking updates, so documentation will be written once the project architecture is stable.
- Oculus quest 2 by Nosakhae is licensed under Creative Commons Attribution.
- Realistic Human Lungs by neshallads is licensed under Creative Commons Attribution.