# PSMoveService Road to V1.0, Draft 0
This outlines at a high level the set of work to do to get PSMoveService to a 1.0 release. Almost all of this remaining work is additive to what we have now, with the exception of moving the filtering and prediction code to the client library. My reasoning for moving the filtering and prediction code to the client is two-fold:
- It cuts down on the amount of network back and forth we have to do between the config tool and the service, because more of the calibration state can live purely in client-side config:
  - Magnetometer calibration
  - Camera intrinsic matrix
  - Camera extrinsic matrix
- It makes it easier in the future to support a client connecting to multiple services:
  - controller + trackers on one service
  - trackers on another service
  - fusion of results takes place on the client
  - The VRTracker Library basically does this
NOTE: I'm not married to moving the fusion work to the client. If you think this is a terrible idea, this plan can be refactored back into doing the fusion work on the service.
By moving the filtering and prediction functionality to the client, the major remaining work on the service reduces to tracking blobs and networking their state to the client.
### TrackerView
- Network Request: Set tracker camera control parameters (exposure, brightness, etc.)
- Network Request: Turn on/off tracking for a controller
  - Flags a controller as tracked
- For each tracker: iterate through all tracked controllers and find the blob
- Publish tracked blobs via a UDP DeviceDataFrame (see the sketch at the end of this section)
  - x/y pixel position
  - pixel radius
  - source tracker id
  - controller id
- Assign tracking color to a controller
- Remove filter and prediction code from controller view (moved to client)
- Remove magnetometer calibration state (moved to client)
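As a rough illustration, the per-blob payload could be quite small. Below is a sketch of one possible wire layout; every field name and type here is an assumption rather than the actual DeviceDataFrame definition:

```cpp
#include <cstdint>

// Hypothetical layout for one tracked-blob record in the UDP stream.
// This is a sketch only; the real DeviceDataFrame message is defined
// by the service's protocol, not by this struct.
struct TrackedBlobFrame
{
    uint8_t  controller_id;  // which controller this blob belongs to
    uint8_t  tracker_id;     // which tracker (camera) saw the blob
    uint16_t sequence_num;   // detects dropped/reordered UDP packets
    float    pixel_x;        // blob centroid x in tracker pixels
    float    pixel_y;        // blob centroid y in tracker pixels
    float    pixel_radius;   // blob radius in pixels (feeds Z estimation)
};
```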
Despite the fusion work happening on the client, the public-facing client API should remain largely the same. The one major change I want to make is to get rid of the callback functions for the async requests and instead switch to a result-polling model similar to OpenVR's.
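For a sense of the intended usage, the client-facing loop might end up looking like the sketch below. Both poll_next_event() and the event struct are assumptions about the eventual API, modeled on the OpenVR pattern shown in the list that follows:

```cpp
// Hypothetical polling loop against the reworked client API.
// update() pumps the network socket; poll_next_event() only drains
// a queue of already-received results, so it never blocks.
ClientPSMoveAPI::update();

ClientPSMoveAPI::Event event;
while (ClientPSMoveAPI::poll_next_event(&event))
{
    switch (event.type)
    {
    case ClientPSMoveAPI::eventControllerConnected:    /* ... */ break;
    case ClientPSMoveAPI::eventControllerDisconnected: /* ... */ break;
    default: break;
    }
}
```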
- Remove callbacks from client API, Issue #20
  - Switch all async event callback functions over to a poll_next_event() style
  - This is more thread-safe for clients and is easier to marshal into C#
```cpp
// Process OpenVR events
vr::VREvent_t event;
while (m_pVRSystem->PollNextEvent(&event, sizeof(event)))
{
    processVREvent(event);
}
```
- Position Filter (currently exists as a pass-thru filter only)
  - Move over to the PSMoveClient DLL (not publicly exposed)
  - Used by the ClientControllerView
  - Implement missing filter types (LowPass, Kalman)
- Orientation Filter
  - Move over to the PSMoveClient DLL (not publicly exposed)
  - Used by the ClientControllerView
- ClientControllerView
  - Gather latest received blob positions
  - Compute 3D position
    - Port over multicam position triangulation from my fork of psmoveapi. See implementation discussion here.
    - Fall back to the blob radius -> Z-distance method for one camera (see the sketch after this list)
  - Apply position and orientation filtering
  - Apply prediction
    - Compute prediction using the historical data stored in the position and orientation filters
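For the single-camera fallback, the distance estimate follows from the pinhole model: a sphere of known physical radius projects to a pixel radius roughly inversely proportional to its distance. A minimal sketch, assuming the bulb radius and a focal length recovered from intrinsic calibration (the function and parameter names are illustrative):

```cpp
// Estimate the Z distance of the PS Move bulb from a single camera
// using the pinhole approximation: pixel_radius ~= f_px * R / Z.
// Names are illustrative; the focal length comes from the tracker's
// intrinsic matrix and the bulb radius is a physical constant.
float estimate_z_distance_cm(
    float blob_radius_pixels,  // radius of the tracked blob in the image
    float focal_length_pixels, // fx from the camera intrinsic matrix
    float bulb_radius_cm)      // physical radius of the glowing bulb
{
    if (blob_radius_pixels <= 0.f)
        return -1.f; // no valid blob; caller should discard this sample

    // Invert the projection: Z = f_px * R / r_px
    return focal_length_pixels * bulb_radius_cm / blob_radius_pixels;
}
```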
The config tool needs several calibration and setup tools ported over from psmoveapi. The most complicated ones are the old and new camera pose calibration tools that determine the location of the PS3EYE tracking cameras. All of the existing parts of the config tool will need some love too.
- Intrinsic Matrix
  - Port over the tracker_camera_calibration tool from psmoveapi (see the sketch after this list)
  - Save the intrinsic matrix to a config file
- Extrinsic Matrix (HMD coregistration)
  - Port over the visual_coregistration_tool from psmoveapi
  - Save the extrinsic matrix to a config file
- Extrinsic Matrix (Calibration Mat)
  - Port over the visual_tracker_setup from psmoveapi (see https://youtu.be/33cWRaCC9hU)
  - Save the extrinsic matrix to a config file
- Controller Color Selection
  - Add a config panel in the controller settings to select a tracking color
    - Requests a list of available colors from the server
    - Updates the selected color of the controller on the server
    - Server saves the tracked color to the controller config
- Exposure Calibration
  - Port over the tracking blink calibration
  - Add as a config tool for a selected tracker in the config tool
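For reference, the core of an intrinsic calibration tool is a handful of OpenCV calls: collect chessboard corner detections over several frames, then solve for the camera matrix and distortion coefficients. A minimal sketch; the chessboard dimensions, square size, and frame source are assumptions:

```cpp
#include <opencv2/calib3d.hpp>
#include <vector>

// Sketch of intrinsic calibration for one PS3EYE tracker.
// Assumes `frames` holds grayscale captures of a 9x6 chessboard
// shown at varying angles; sizes/counts here are illustrative.
bool calibrate_tracker_intrinsics(
    const std::vector<cv::Mat>& frames,
    cv::Mat& camera_matrix,    // out: 3x3 intrinsic matrix
    cv::Mat& dist_coeffs)      // out: lens distortion coefficients
{
    const cv::Size pattern_size(9, 6); // interior chessboard corners
    const float square_size_cm = 2.5f; // physical square size (assumed)

    // The chessboard's corner layout in its own (planar) coordinate frame
    std::vector<cv::Point3f> board_corners;
    for (int row = 0; row < pattern_size.height; ++row)
        for (int col = 0; col < pattern_size.width; ++col)
            board_corners.emplace_back(col * square_size_cm, row * square_size_cm, 0.f);

    std::vector<std::vector<cv::Point3f>> object_points;
    std::vector<std::vector<cv::Point2f>> image_points;
    for (const cv::Mat& frame : frames)
    {
        std::vector<cv::Point2f> corners;
        if (cv::findChessboardCorners(frame, pattern_size, corners))
        {
            image_points.push_back(corners);
            object_points.push_back(board_corners);
        }
    }

    if (image_points.size() < 10)
        return false; // not enough good views for a stable solution

    std::vector<cv::Mat> rvecs, tvecs; // per-view board poses (unused here)
    cv::calibrateCamera(
        object_points, image_points, frames[0].size(),
        camera_matrix, dist_coeffs, rvecs, tvecs);
    return true;
}
```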
In order to make the PSMoveService broadly usable, I think it makes the most sense to create an OpenVR plugin. This will expose the psmove controllers and the ps3eye trackers to SteamVR. Fortunately Valve has already provided a reference driver for the Razer Hydra, so it should be pretty straightforward to adapt a plugin that connects to PSMoveService instead.
- Implement vr::IServerTrackedDeviceProvider
  - Owner of the ClientPSMoveAPI instance
- Implement vr::IClientTrackedDeviceProvider
  - Mostly no-ops on stuff related to HMD operation
  - See vr::ITrackedDeviceServerDriver-Overview
- Implement vr::ITrackedDeviceServerDriver
  - Does the bulk of the work updating and posting data about the controllers
  - See vr::ITrackedDeviceServerDriver-Overview
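A skeleton of the server provider might look like the following. This is a sketch against one revision of openvr_driver.h; the exact virtual method set has changed between OpenVR SDK releases, so treat the signatures as approximate, and the class name as a placeholder:

```cpp
#include <openvr_driver.h>

// Sketch of the SteamVR server-side entry point for PSMoveService.
// Method signatures follow one revision of openvr_driver.h and may
// differ in other SDK versions.
class CPSMoveTrackedDeviceProvider : public vr::IServerTrackedDeviceProvider
{
public:
    vr::EVRInitError Init(vr::IVRDriverContext *pDriverContext) override
    {
        VR_INIT_SERVER_DRIVER_CONTEXT(pDriverContext);
        // Connect to the locally running PSMoveService, enumerate
        // controllers, and register one ITrackedDeviceServerDriver each.
        return vr::VRInitError_None;
    }

    void Cleanup() override
    {
        // Disconnect from PSMoveService; free the per-controller drivers.
    }

    const char * const *GetInterfaceVersions() override
    {
        return vr::k_InterfaceVersions;
    }

    void RunFrame() override
    {
        // Pump the PSMove client API and push fresh controller poses
        // to SteamVR via the per-device drivers.
    }

    bool ShouldBlockStandbyMode() override { return false; }
    void EnterStandby() override {}
    void LeaveStandby() override {}
};
```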
I do prefer that the fusion and prediction live in the service.
- My use case will have multiple clients (UE4, data logger) and I want their position estimation to come from the exact same model, yet the clients will not know about each other.
- Updating the state-space model is non-trivial (in terms of cycles) and shouldn't be in the main client thread. We don't really want an extra thread in the client, so putting it in the service process is easiest.
- Passing the calibration matrices from the config tool to the service isn't that big of a burden.
- Magnetometer calibration, and other calibrations, shouldn't have to be done separately for every client.
- The main reason for fusing the tracker and IMU estimation on the client is the case of having many trackers (cameras). What happens when the number of cameras increases a lot? How reliable will it be for the client to get the multi-blob data on every frame, especially when the client is on WiFi or BT and mobile? Will bandwidth be a problem? What about network overhead when communicating with many services?
- The x,y position and pixel radius are maybe not enough for certain types of positional estimation. Oliver Kreylos' method requires the full blob (i.e., all pixels).
- I think a better solution than having multiple services running (e.g., with different trackers) and fusing them in the client is for the trackers to be remotely accessible by the one service. This probably means writing a minimal application for the trackers that exposes them to network communication. This should be non-exclusive so other services could make use of the same trackers. This isn't necessary for v1.0.
I wasn't planning on doing prediction by maintaining a buffer of data and then extrapolating, as that would be too slow for the client (waiting on a curve fit on every poll). Rather, I was planning on having the service update the state-space model whenever new camera frames arrive. When the client makes a request, it's trivial to predict into the future when you already have a fitted model.
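To make the "trivial to predict" point concrete: with a constant-velocity state-space model, predicting dt seconds ahead is a single matrix multiply against the already-fitted state. A sketch using Eigen, where the state layout and function name are assumptions for illustration:

```cpp
#include <Eigen/Dense>

// Predict a future position from an already-fitted constant-velocity
// state-space model. State x = [px py pz vx vy vz]^T; this layout and
// the function name are illustrative, not the service's actual filter.
Eigen::Vector3f predict_position(
    const Eigen::Matrix<float, 6, 1>& state, // latest filtered state
    float dt_seconds)                        // how far ahead to predict
{
    // State transition for constant velocity: p' = p + v*dt, v' = v
    Eigen::Matrix<float, 6, 6> F = Eigen::Matrix<float, 6, 6>::Identity();
    F.block<3, 3>(0, 3) = Eigen::Matrix3f::Identity() * dt_seconds;

    const Eigen::Matrix<float, 6, 1> predicted = F * state;
    return predicted.head<3>(); // predicted position only
}
```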
Blinking calibration isn't worth porting over; it rarely works correctly. Let's just do manual exposure settings for v1.0. If it's done in such a way that it doesn't interfere with other running clients, then we can do in-game tests to verify that the current exposure setting works. Then, in later versions, we can add a GUI camera view to the config tool's exposure settings window that shows all colour-coded blobs (even small noisy ones), and maybe even have some exposure presets (day, night, etc.) quickly accessible from a taskbar icon.
About going from callbacks to poll_next_event: will clients be waiting during `while (m_pVRSystem->PollNextEvent(&event, sizeof(event)))`? Is there any chance this won't return immediately?
# Notes from HipsterSloth
These are all great counterpoints. I now agree we should keep the fusion work on the service.
I hadn't considered the case of multiple clients connecting to the service. I'm not worried about the CPU cost of the state-space model, but if each client did the fusion work independently they would get different results, which is a deal breaker. I'd also love to hear more about Oliver's blob tracking method. At the end of the day, doesn't his method just compute an ellipse fit of the blob, and thus couldn't we just network that blob state?
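If that ellipse-fit reading is right, the per-tracker payload could stay small: OpenCV's fitEllipse reduces a blob contour to five parameters (center x/y, axis lengths, rotation). A sketch of that reduction, where the thresholded-mask input and the function name are assumptions:

```cpp
#include <opencv2/imgproc.hpp>
#include <vector>

// Reduce a blob's contour to a five-parameter ellipse that could be
// networked instead of the full pixel set. The binary-mask input is
// an assumption about the tracker's color-segmentation stage.
bool fit_blob_ellipse(const cv::Mat& blob_mask, cv::RotatedRect& out_ellipse)
{
    std::vector<std::vector<cv::Point>> contours;
    cv::findContours(blob_mask, contours, cv::RETR_EXTERNAL, cv::CHAIN_APPROX_NONE);
    if (contours.empty())
        return false;

    // Pick the largest contour as the tracked blob
    size_t best = 0;
    for (size_t i = 1; i < contours.size(); ++i)
        if (cv::contourArea(contours[i]) > cv::contourArea(contours[best]))
            best = i;

    if (contours[best].size() < 5)
        return false; // fitEllipse needs at least five points

    out_ellipse = cv::fitEllipse(contours[best]);
    return true;
}
```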
As for multiple connections over wireless, I don't think that would be a big issue. A given connection would only send UDP tracking data when the tracker had view of a controller, so most of the connections would be idle in a many-computer case. However, coordination between the many tracking computers definitely becomes a lot more complicated. Mostly I was thinking about cutting down tracking latency by not having to network blob tracking results to a central server and then network the fusion results to a client. But like you said, this isn't a 1.0 feature, so we shouldn't worry about it.
That's a good point about the state-space model already having prediction built in. The Kalman filter technically supports this, but I never did the work to add it in (feeding in accelerometer data). The LowPass filter will need to take velocity into account. Actually, if you get a chance, you should take a peek at the PositionFilter and OrientationFilter stubs I have in the HMD branch and see whether they are heading in the direction you had in mind for the state-space filtering or not.
I like the idea of exposure presets a lot. The taskbar icon thing reminds me: back when we were first starting the service work, I made the stub of a Windows taskbar interface. I'll send you a video of what it looks like. Unfortunately, I don't know of a good cross-platform way to do the taskbar thing. But it's not that much code, so I wouldn't be opposed to just having per-platform taskbar wrapper applications.
The poll event call should return immediately because it will just be reading off a queue of async query results that were read off the network during the PSMoveClientAPI::Update() call. That said, I should check to make sure that the OpenVR PollNextEvent call is non-blocking, because otherwise we'd have a blocking call in the config tool (OpenVRContext::update).
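To illustrate why the client-side call can't block: all network I/O happens in the update call, and the poll call only pops an in-memory queue. A sketch of that split, with placeholder names standing in for the eventual client API:

```cpp
#include <deque>

// Sketch of why poll_next_event() always returns immediately:
// update() does the (possibly slow) socket reads, while the poll
// call only drains a queue. All names here are placeholders.
struct ClientEvent { int type; };

class ClientEventQueue
{
public:
    void update()
    {
        // Read everything currently available on the socket
        // (non-blocking) and translate each message into a ClientEvent:
        // m_events.push_back(...);
    }

    bool poll_next_event(ClientEvent* out_event)
    {
        if (m_events.empty())
            return false; // nothing pending -- returns immediately

        *out_event = m_events.front();
        m_events.pop_front();
        return true;
    }

private:
    std::deque<ClientEvent> m_events;
};
```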