Unreal SDK  v1.2.0
Unreal SDK

Welcome to the Weart Unreal SDK documentation.

Supports Unreal Engine 4.27, 5.1, 5.2 and 5.3.

The SDK allows you to connect to the Weart middleware and perform various actions with the TouchDIVER devices:

  • Start and Stop the middleware operations
  • Calibrate the device
  • Receive tracking data from the devices
  • Send haptic effects to the devices

Architecture

Unreal SDK Architecture

Set-up project and import plugin

The minimum setup to use the Weart SDK consists of:

  • A PC with the Middleware installed
  • A TouchDIVER device
  • An Unreal project using the SDK plugin

Create a new project by selecting the Virtual Reality template.

Copy the whole plugin folder either in $UE_LOCATION$/Engine/Plugins for global use or in $PROJECT_FOLDER$/Plugins to use in a specific project. Enable it inside the UE editor in Edit/Plugins.

Enable the plugin if it was placed globally

Note
In order to use the WeArt Plugin, you must convert the project to a C++ project. In Unreal 4 this can be done via File -> New C++ Class, and in Unreal 5 via Tools -> New C++ Class. After the file is created, restart the project.

In the World Settings change the Default GameMode to WeArtGamemode.

If the VR Preview is inactive, make sure to install the plugin that corresponds to your headset, and make sure that only the OpenXR plugin is enabled among the available VR APIs.


Note
We recommend using the OpenXR plugin for better precision of the position and rotation of the virtual hands inside the experience.
Note
The hand system is heavily affected by low FPS (Frames Per Second). For the best experience we recommend that the experience runs at a minimum of 60 FPS. This can be achieved by optimizing the experience (lower resolution textures, models with fewer polygons, not using ray tracing if the device cannot run it properly).

Guide Components

Components

WeArtController

The controller is created automatically as soon as you enable the plugin. If you need access to its data members, create a Blueprint Class (Add->Blueprints->Blueprint Class) that inherits from WeArtController.

WeArtTouchableObject

Component responsible for the description of the haptic effect to be applied in the event of a collision with the HapticObject actor.

Properties:

  • Temperature – Temperature value applied on the target thimble(s) (from 0.0 to 1.0) [0.5 is environment temp – 0.0 really cold – 1.0 really hot]
  • Force – Force value applied on the target thimble(s) (from 0.0 to 1.0) [0.0 no force – 1.0 max force]
  • Texture – Type of texture rendered on the target thimble(s) (from 0 to N)
  • Volume Texture – Intensity of the texture rendering (from 0 to 100)
  • Graspable – Enables the ability to grasp the object with the virtual hands

The WeArtTouchableObject component also has a field called ForcedVelocity. If it is enabled, when the hand enters the touchable object the texture effect plays at maximum speed, regardless of the hand's movement.

WeArtHapticObject

Component responsible for the haptic actuation of the individual thimbles belonging to the TouchDIVER device.

Properties:

  • Hand Side Flag – Which device it belongs to (RIGHT | LEFT)
  • Actuation Point Flag – Target thimble(s) on which to apply the actuation (THUMB | MIDDLE | INDEX) [Multi selection]
  • Control – (Set by the TouchableObject during interaction, or manually by the developer)
    • Temperature – Temperature value applied on the target thimble(s) (from 0.0 to 1.0) [0.5 is environment temp – 0.0 really cold – 1.0 really hot]
    • Force – Force value applied on the target thimble(s) (from 0.0 to 1.0) [0.0 no force – 1.0 max force]
    • Texture – Type of texture rendered on the target thimble(s) (from 0 to N)

How to create Haptic/Touchable objects:

Haptic/Touchable objects are created by adding actor components to existing actors. Either place any actor in the map and equip it with a Haptic/Touchable component and a collision, or create a Blueprint actor class that you can reuse at any time; a C++ sketch of the same setup is shown below. In the Details panel you will have access to all the modifiable variables.
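The following is a minimal C++ sketch of a reusable touchable actor, equivalent to the Blueprint route described above. It uses only standard Unreal component construction; the UWeArtTouchableObject class name, its header name and the file name TouchableCube.h are assumptions inferred from the component names in this guide, so verify them against the plugin sources.

// Sketch only: the WeArt class and header names below are assumptions.
#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "Components/StaticMeshComponent.h"
#include "Components/SphereComponent.h"
#include "WeArtTouchableObject.h" // assumed header name; adjust to the plugin
#include "TouchableCube.generated.h"

UCLASS()
class ATouchableCube : public AActor
{
    GENERATED_BODY()

public:
    ATouchableCube()
    {
        // Static mesh as root (required for grasping, see the collider best practices below).
        Mesh = CreateDefaultSubobject<UStaticMeshComponent>(TEXT("Mesh"));
        SetRootComponent(Mesh);

        // Collision volume used to detect contact with the haptic objects on the hands.
        Collision = CreateDefaultSubobject<USphereComponent>(TEXT("Collision"));
        Collision->SetupAttachment(Mesh);

        // Touchable component describing the haptic effect; configure Temperature, Force,
        // Texture and Graspable in the Details panel, as with the Blueprint workflow above.
        Touchable = CreateDefaultSubobject<UWeArtTouchableObject>(TEXT("Touchable"));
    }

    UPROPERTY(VisibleAnywhere)
    UStaticMeshComponent* Mesh;

    UPROPERTY(VisibleAnywhere)
    USphereComponent* Collision;

    UPROPERTY(VisibleAnywhere)
    UWeArtTouchableObject* Touchable;
};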

WeArtThimbleTrackingObject

Component responsible for tracking a thimble's movements, quantifying its closure state and animating the virtual hands. Properties:

  • Hand Side Flag – Which device it belongs to (RIGHT | LEFT)
  • Actuation Point Flag – Target thimble(s) to track (THUMB | MIDDLE | INDEX) [Multi selection]

Hands Blueprints

These are the two blueprints for the hands:

Hand Grasp Events

In order to enable the events to be called, drag this blueprint into the level.

The blueprint contains the following events inside it. Each event returns the hand side that grabbed the object and the object that was grabbed as a WeArtTouchableObject.

Hand Offset

The two hands blueprints contain the component called WeArtDeviceTrackingObject.

This component has a flag called UseOffsetPreset. If it is enabled, the hand will use the offset present in the field OffsetPreset.

These are the presets available:

WeArtDeviceTracking

Component responsible for the tracking of the wrist on which the TouchDIVER device is placed. Properties:

  • Update Method – Mode in which the position of the object is updated based on the tracking source
  • Tracking Source – Transform reference of the motion controller source (e.g. VR controller, Vive Tracker, etc.)
  • Position Offset – Position offset with respect to the tracking source
  • Rotation Offset – Rotation offset with respect to the tracking source
  • Offset Presets – Preset offset parameters for a specific platform controller (such as Oculus or Vive), or a custom configuration

Physical Grasping System

Grabbing and physical interaction are handled by the UWeArtPhysicHandler class, which is responsible for managing the physics of a skeletal mesh representing a hand in a virtual reality environment. Physical interaction between touchable objects and hands is handled using Physics Assets. The physics asset contains collision data for the fingers and is used during runtime. During grasping, UWeArtPhysicHandler rebuilds the collision data from the physics asset to create the right offsets for each finger part; this allows precise hit events and solid grabbing.

Grasping works both with physics bodies and with bodies using the Block collision profile.

Virtual Hand

To adjust finger positioning after a grab, UVirtualHand calculates maximum and minimum closure values. After interpolating between these values, it adjusts the current closure of the master and slave finger bodies. Master bodies have higher priority on the grab condition than slave ones. You can access the VirtualHand implementation in code and in the animation blueprint. Each finger group is handled separately by a sequence.

The speed of the interpolation can be tweaked, but it also takes FPS and delta time into account to make it smoother on different machines.

Phantom Hand

For debugging purposes you can use the Phantom Hand, an in-game representation of the current hand in the world. It can be useful to check the force with which the fingers push into a collision object. To turn it on, use the ShowDebugHand parameter in HapticHand_BP.

Middleware information

The "StatusTrackerDisplay" blueprint (shown below) is a 2D canvas containing all the status information received by the middleware.

In particular, the object will display:

  • The current middleware version
  • The current status of the middleware (as shown in the middleware UI)
  • Whether actuations are enabled or not
  • The last status code received from the middleware and, if not ok, a description of the error
  • The status of the connected TouchDIVERs
    • Mac Address
    • Assigned HandSide
    • Battery level (and will show a bolt symbol if the device is charging)
    • Calibration status during the session
    • Status of each thimble (shown on the hand icon by the three colored dots)

In order to have the middleware status display working in the scene, add the following components:

Status Codes

The current status codes (along with their description) are:

  • 0 OK – Ok
  • 100 START_GENERIC_ERROR – Can't start generic error: Stopping
  • 101 CONNECT_THIMBLE – Unable to start, connect at least one thimble and retry
  • 102 WRONG_THIMBLES – Unable to start, connect the right thimbles matched to the bracelet and retry
  • 103 BATTERY_TOO_LOW – Battery is too low, cannot start
  • 104 FIRMWARE_COMPATIBILITY – Can't start while the devices are connected to the power supply
  • 105 SET_IMU_SAMPLE_RATE_ERROR – Error while setting IMU Sample Rate! Device Disconnected!
  • 106 RUNNING_SENSOR_ON_MASK – Inconsistency on Analog Sensors raw data! Please try again or Restart your device/s!
  • 107 RUNNING_DEVICE_CHARGING – Can't start while the devices are connected to the power supply
  • 200 CONSECUTIVE_TRACKING_ERRORS – Too many consecutive running sensor errors, stopping session
  • 201 DONGLE_DISCONNECT_RUNNING – BLE Dongle disconnected while running, stopping session
  • 202 TD_DISCONNECT_RUNNING – TouchDIVER disconnected while running, stopping session
  • 203 DONGLE_CONNECTION_ERROR – Error on Dongle during connection phase!
  • 300 STOP_GENERIC_ERROR – Generic error occurred while stopping session
Note
The description of each status code might change between Middleware versions; check against the status code rather than the description.

StatusTrackingBP

If this blueprint is placed in the scene, it will fire an event for every middleware status message received. There are two events that can be fired:

OnMiddlewareStatus

OnDevicesStatus

These two events simply signal when one of the two data types has been received. To access the data, read the local variables in EventTick; they are updated automatically.

Then split the struct to access the values.

The same can be done for ConnectedDeviceStatusLeft and ConnectedDeviceStatusRight.

WeArtThimbleSensorObject

If you want to use this feature, you have to enable RawDataAutoStart in Project Settings -> WeArt.

Note
If inside the Project Settings, the RawDataAutoStart is checked, the calibration will be forced. If you want to use CalibrationUXBP or TrackingCalibrationBP, disable RawDataAutoStart.

This component can be added to any actor and by accessing it in EventTick, you can get the sensor data of the specified thimble. The component updates the values automatically.

Make sure to set the Hand Side and Actuation Point.

Calibration UX Blueprint

The CalibrationUXBP offers an easy and precise way of calibrating the TouchDIVER at the start of the experience. You need to put your hands in the specified position and it will signal to the middleware that the calibration process is starting.

Note
If inside the Project Settings, the RawDataAutoStart is checked, the calibration will be forced. If you want to use CalibrationUXBP, disable RawDataAutoStart.

You can drag and drop the following blueprint in the scene:

This is how it should look in the scene:

The blueprint has two fields exposed in the editor:

  • HandsMode – if this is set to "Both", the calibration will require the position of both hands; if it is set to "Single", it will require only one hand
  • HandSide – if HandsMode is set to "Single", this field decides which hand's position is required: "Right" requires only the right hand position, "Left" requires only the left hand position

Here you can change the fields.

API

Start/Stop Client

Once connected to the middleware, it is not yet possible to receive tracking data or send haptic commands to the devices. To do so, you need to start the middleware with the proper command.

Get the WeArtController subsystem from the GameInstance:

// UWeArtController: used to connect to the Weart middleware, perform operations and receive messages.
UGameInstance* gameInstance = GetOuter()->GetWorld()->GetGameInstance();
UWeArtController* weArtController = gameInstance->GetSubsystem<UWeArtController>();

weArtController->UnpauseController(); // Start Middleware communication
weArtController->PauseController();   // Stop Middleware communication

Start/Stop Calibration and events

Get the WeArtController subsystem from the GameInstance:

UGameInstance* gameInstance = GetOuter()->GetWorld()->GetGameInstance();
UWeArtController* weArtController = gameInstance->GetSubsystem<UWeArtController>();
weArtController->StartCalibration(); //Start Middleware calibration
weArtController->StopCalibration(); //Stop Middleware calibration

Blueprint:

In order to enable the events to be called, drag this blueprint into the level.

Inside this blueprint there are four events that are fired when the calibration starts, finishes, succeeds or fails. They all return the side of the hand that triggered the event.

There is another event that fires when the calibration stops and it does not contain a hand side.

WeArtTouchEffect

The SDK contains a basic WeArtTouchEffect class to apply effects to the haptic device. The TouchEffect class contains the effects without any processing. For different use cases (e.g. values not directly set, but computed from other parameters), create a different effect class by implementing the WeArtEffect interface.

Create Custom effect

Instantiate new effect:

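For example, a minimal sketch of the instantiation, assuming the effect class uses the usual Unreal U prefix and standard UObject construction:

UWeArtTouchEffect* effect = NewObject<UWeArtTouchEffect>(); // effect to be applied to the thimble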

Update effect

Create and activate actuations

force.active = true; // activate the actuation
force.value = valueForce; // set the force value (0.0 to 1.0)

Set actuation to effect:

effect->Set(temperature, force, texture);

Add effect

Apply to your HapticObject (finger/thimble) effect:

hapticObject->AddEffect(effect);

Remove effect

To remove the effect and restore the actuation, take the same effect instance and call RemoveEffect on the same HapticObject:

hapticObject->RemoveEffect(effect);
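Putting the previous snippets together, here is a minimal sketch of the full flow. Only Set, AddEffect and RemoveEffect appear in the snippets above; the WeArtTemperature and WeArtTexture type names (mirroring WeArtForce), the U-prefixed class names and the NewObject construction are assumptions to be checked against the plugin headers.

// Sketch only: names not shown in the snippets above are assumptions.
void ApplyForceEffect(UWeArtHapticObject* hapticObject, float valueForce)
{
    UWeArtTouchEffect* effect = NewObject<UWeArtTouchEffect>(); // create the effect instance

    WeArtTemperature temperature; // left inactive in this example (type name assumed)
    WeArtTexture texture;         // left inactive in this example (type name assumed)

    WeArtForce force;             // force actuation (see WeArtForce.h)
    force.active = true;          // activate the actuation
    force.value = valueForce;     // force intensity, 0.0 to 1.0

    effect->Set(temperature, force, texture); // store the actuations in the effect
    hapticObject->AddEffect(effect);          // apply the effect to the haptic object

    // Later, call hapticObject->RemoveEffect(effect) with the same instance to restore the actuation.
}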

Tracking

After starting the middleware and performing the device calibration, it's possible to receive tracking data related to the TouchDIVER thimbles.

The player pawn contains two UWeArtHandController components, one for each hand. Every UWeArtHandController contains three UWeArtThimbleTrackingObject components, one each for the thumb, index and middle finger. From each UWeArtThimbleTrackingObject we can get the closure and abduction values. The values range from 0 to 1, where 0 represents no closure or abduction and 1 represents the maximum value.

Getting Closure and Abduction:

UWeArtThimbleTrackingObject* thumbTrackingObject; // must reference a valid tracking component
float closure = thumbTrackingObject->GetClosure();
float abduction = thumbTrackingObject->GetAbduction();
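For example, the values can be read every frame from an actor's Tick. This is only a sketch: AMyActor and its ThumbTracking property are placeholders (the class declaration is omitted), and how you obtain the UWeArtThimbleTrackingObject pointer, e.g. from the hand controller in the player pawn, depends on your setup.

void AMyActor::Tick(float DeltaSeconds)
{
    Super::Tick(DeltaSeconds);

    if (ThumbTracking != nullptr) // ThumbTracking: UWeArtThimbleTrackingObject* set elsewhere
    {
        const float Closure = ThumbTracking->GetClosure();     // 0 = open, 1 = fully closed
        const float Abduction = ThumbTracking->GetAbduction(); // 0 = no abduction, 1 = maximum
        UE_LOG(LogTemp, Log, TEXT("Thumb closure: %f, abduction: %f"), Closure, Abduction);
    }
}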

Here is a blueprint representation of getting the values; in this image the values are used for the animation of the hands.

Adding new custom texture type

WeArtCommon.h contains an enum called TextureType. You can add a new member here and assign its number. Then, inside the code and in the editor, you can set this member on WeArtTouchableObjects and inside the Texture classes.
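For example (a sketch only: the declaration style and the existing members must match what is already in WeArtCommon.h, and the new member's name and number here are placeholders):

// In WeArtCommon.h
enum class TextureType : uint8
{
    // ... keep the existing texture types shipped with the SDK ...
    MyCustomTexture = 100, // hypothetical new texture type with its assigned number
};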

After compiling the code, the member will be available in the editor:

Plugin configuration

Go to Project Settings; under the Plugins category you will find the WeArt settings. Here you can choose whether to start the middleware and the calibration at the beginning of the experience.

Level Content

Blueprints

Commonly used Blueprint classes are stored in the Blueprints folder; in Components you can find Blueprints related to the calibration and grasping functionality.

For debugging purposes a DebugActor is provided, which displays the grab conditions on screen. It can be helpful to check whether you have correctly set up a grabbable object.

The hands are stored in the HandBlueprints folder. Left and Right inherit from HapticHand_BP; if you want to make changes for both hands, you can implement them in HapticHand_BP.

Calibration

If you did not set calibration on start, you can do it manually by using CalibrationUXBP, located in the Blueprints\Components directory.

  • StatusTrackerDisplay - the widget that shows the device status and properties.
  • CalibrationUXBP - handles calibration process and shows the status of the device.
  • StatusTrackingBP - actor that communicates between the middleware and StatusTrackerDisplay.

Setup Example

For an example scene ready to use, open "WEARTSampleLevel", located in the DemoResources folder.

Make sure you have the WeArtGamemode blueprint set in the World Settings tab; after that, the level is ready to test the SDK functionality. If you would like to use your own controller classes, you can change them in the gamemode blueprint or in the controller class itself.

To use the controller with the test hand system, make sure you have WeArtGamemode set in World Settings.

This gamemode contains the controller and a pawn with ready-to-test hands.

Haptic Surfaces

Around the player start, 4 plates with different touchable surface settings are placed; you can test haptic effects with them.

Interaction

To test temperature you can use the Bunsen burners on the table: the red one produces hot particles, the blue one cold particles. To enable them, press the button; it can be clicked by hand or with a grabbed object.

Several objects are placed on the table, on which you can test the grab system.

Setup in new map

We recommend creating a new project starting from the VR Template, in order to have the project ready for VR. If you have an already existing map, you can follow the same steps.

Move the Player Start actor close to the working area. Then go to World Settings and set WeArtGamemode as the GameMode Override.

If you play the map in VR Preview, you should be able to see your hands.

Add "CalibrationUXBP", "StatusTrackingBP" and "StatusTrackerDisplay" to the map.

If the VR Preview is not active, refer to "Set-up project and import plugin" at the top of the page.

Making an object graspable

The grabbing system works on actors that have a WeArtTouchableObject component attached and a proper collision preset.

Collider design best practice

Component hierarchy

Be sure to put a static mesh component as the root when you create your touchable object. It is better to create a blank actor and drag the added static mesh component to become the root. If you don't, grabbing will not work and you will see a log error.

Warning
Do not use Touchable_BP directly - it is just a parent that can be used for creating new touchable objects, like the ones placed in the demo level.

A physical object with a grab component must have a convex or primitive collision (sphere, cube, etc.); the physical hand uses the convex collision data to attach it to the physics constraint. Mesh collision can lead to physics issues. The constraint is attached to the tracking object inside HapticHand_BP.

Migration and Update from previous SDK

To update your application to the latest SDK, download and extract the Unreal SDK archive, then copy the source/header files in the same place as the older SDK version.

The new version includes additional files, so it's necessary to add them to the project in order to avoid linking errors.

This can be done on Visual Studio by right-clicking on the solution, then clicking on Add -> Existing Item and selecting all the SDK files. On other systems (e.g. cmake) the procedure might be different.

The SDK is retro-compatible with older versions, so there's no need to update the application code. To see the new features and fixes added in each version, refer to the Changelog section.

Changelog

  • New physic hand and grasping system
  • New BP and UI panel for Middleware and device status
  • Added Tracking sensor raw data
  • Expose public thimble closure property
  • Add calibration procedure start/stop and listener
  • Add hand grasping events
  • New sample scene