OptiTrack Motion Tracking System

From REALab Wiki

The NaturalPoint OptiTrack motion tracking system consists of infrared-emitting and -sensing cameras positioned to triangulate and extract information about the location, movement, and orientation of rigid bodies. We have four 6-camera OptiTrack VR:100 systems (now called the Flex 3), which collect motion data at 100 Hz and 640x480 resolution, and one 12-camera Flex 13 system, which collects motion data at 120 Hz and 1280x1024 resolution.


There are two programs that can be used to calibrate the OptiTrack cameras, monitor marker motion in your area of interest, and stream out data. Tracking Tools is useful for tracking a small set of markers, which can be configured into trackables: sets of markers with fixed distances from one another. For example, you may place 3 markers on the hand to create a trackable for monitoring hand movements, or 3 markers on a hat for a subject to wear so that you can track gross position. One benefit of creating trackables over using single markers separately is that only trackables yield orientation data (yaw, pitch, and roll) in addition to the basic position information produced by single markers.
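Streamed trackable orientation is commonly delivered as a quaternion rather than as yaw/pitch/roll directly. The sketch below (in Python for illustration; the function name and the axis convention are our own assumptions, not part of Tracking Tools) shows one standard quaternion-to-Euler conversion:

```python
import math

def quaternion_to_euler(qx, qy, qz, qw):
    """Convert a unit quaternion to (roll, pitch, yaw) in degrees.

    The axis convention here is an assumption for illustration; check
    the streaming settings for the convention your setup actually uses.
    """
    # roll: rotation about the x-axis
    roll = math.atan2(2 * (qw * qx + qy * qz), 1 - 2 * (qx * qx + qy * qy))
    # pitch: rotation about the y-axis (argument clamped to avoid domain errors)
    s = max(-1.0, min(1.0, 2 * (qw * qy - qz * qx)))
    pitch = math.asin(s)
    # yaw: rotation about the z-axis
    yaw = math.atan2(2 * (qw * qz + qx * qy), 1 - 2 * (qy * qy + qz * qz))
    return tuple(math.degrees(a) for a in (roll, pitch, yaw))

# identity quaternion -> no rotation
print(quaternion_to_euler(0, 0, 0, 1))  # (0.0, 0.0, 0.0)
```

Single markers cannot provide this: with only one point there is no rigid geometry to define an orientation, which is why trackables are needed for yaw/pitch/roll.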

ARENA is the software to use for full-body motion tracking. It requires the person being tracked to wear the body suit and hat, which together have 34 markers positioned in particular locations. The calibration steps differ between ARENA and Tracking Tools; pick the appropriate guide below.

If you are interested in using the OptiTrack with a touch table setup, please refer to this wiki page.

Calibration with Tracking Tools

Note: if you are using the laptop setup in room 3108, make sure that the dongle for Tracking Tools is not plugged in when starting up the computer.

To calibrate the OptiTrack for use with the touch table, make sure that the Tracking Tools dongle is plugged in. Then start Tracking Tools from the desktop.

In general, you will follow these steps in order to calibrate:

  1. Use the wand to register the area of interest
  2. Use the plane to set the ground plane
  3. If using trackables, create trackable objects
  4. Set the data to stream out (for reading into MATLAB)

Step-by-step calibration guide

1. Open Tracking Tools

Open Tracking Tools and select "Perform Camera Calibration".

If you are sure the setup has not changed, you can load a previous calibration that you have saved. To do so, select "Open Camera Calibration" and choose a recent file, then proceed to Step 6.

Start trackingtools.png

2. Change Illumination

Change Illumination Type from "Strobe IR" to "Continuous"

Continuousillum trackingtools.png

3. Set Blocking Masks (if necessary)

Clear the table of the reflective equipment (e.g., finger marker, wand, ground plane) and make sure that no bright lights are registered in the software. If there are any, find their source on or around the table and remove them so that the cameras' images are all dark. Once this is the case, move on to Step 4.

As a last resort, if any lights visible in the cameras' videos cannot be removed, you can block out those areas in the corresponding cameras with a mask. To do so, click the "Block Visible Markers" button; it will automatically mask any markers currently registered by the cameras (indicated by a red mark corresponding to the blocked area in the camera's field). Use the "Clear Blocking" button to remove masks if you need to adjust or start again. Once all lights are masked, move on to Step 4.

Blockingmasks1 trackingtools.png Blockingmasks2 trackingtools.png

4. Wanding

*Important: make sure the wand size selected corresponds to that indicated on the actual wand before proceeding. Click on "Start Wanding" [(1) in the picture below] and sweep the wand over the table's surface. Your goals are to a) wand the 3D area you want to track in, most likely the area close to the table's surface; and b) "paint" all the regions as full of wand data as possible. For example, in the picture below, camera 4 is more fully painted than camera 5.

When the indicated quality becomes "Very High" (2), you can click on "Calculate" (3).

Wandingcalibration trackingtools2.png

5. Apply Wanding Results

The Tracking Tools software will now calculate the quality of your calibration. If you wanded well, you should ultimately not see the distorted square grid shown in Cameras 2 and 3 below. Bear in mind that the grid becomes less distorted as the results of the calibration are calculated; however, if the distortion persists when the cameras read "Exceptional", you should calibrate again by clicking the "Reset" button at the top of the 3-Marker Calibration panel, then going back to Step 4. When all cameras achieve "Exceptional" results, the overall result is "Exceptional", and there is no distortion in the grids for any of the cameras, click on "Apply Result" (see picture below).

Click on "Apply" again when the dialogue pops up, and save the results of your calibration (by default the file name will include the date and time; you can load the same calibration later in the same session if you are sure that the setup has not changed).

Applycal trackingtools.png

6. Set Ground Plane

Place the ground plane on the lower left corner of the table top so that the Z arrow points towards the top of the projected image. Make sure that the markers of the ground plane are parallel with both the left edge and the bottom edge of the projected screen image. Click on "Set Ground Plane", and save the results.

Setgroundplane1 trackingtools.JPGSetgroundplane2 trackingtools.png

7. Stream Out Data

Now stream out the data so that it can be read into MATLAB. To do so, click on the "Streaming Pane" icon. Check the "Broadcast Frame Data" box, and change Type from "Multicast" to "Unicast". You can now minimize Tracking Tools; it will output data continuously in the background.

Streamingicon trackingtools.png
Broadcastunicast trackingtools.png

Calibration with ARENA

  1. Open the calibration wizard and select calibration. The camera list sometimes reads "camera 0-x"; make sure cameras 1-6 (or whatever is appropriate for your setup) are listed.
  2. At the camera preview step, remove any reflective surfaces you don't want the cameras to pick up. If you see any points you can't remove physically, block all visible points (but this is a last-resort solution). Right-click for the option to change to grayscale to see what the cameras are actually recording (useful if changing camera positions, i.e., not in the established room setup).
  3. Start wanding. It is important that the wand is visible to all cameras simultaneously while wanding.
  4. Set the ground plane.
  5. Preview the capture volume.
  6. Save the calibration results.

For full-body tracking:

  - Refer to the ARENA website for videos demonstrating how to position the markers on the black velcro suit.
  - Record in T-stance until all 34 markers are registered at the same time (you may have to dance around a bit).



Matlab Scripts

Distance travelled

% fake data
optidata = zeros(40, 14) + 1;
%optidata(20:end, :) = optidata(20:end, :) + 1;

% filter parameters in # of samples (e.g., a sigma of 6 samples at 30Hz
% corresponds to (6 samples / 30Hz) = 200ms)
% windowsize should be big enough to fit the Gaussian distribution
windowsize = 15;
sigma = 1;

% first, filter each dimension (x, y, z) with a low-pass (Gaussian) filter
% You can probably filter the orientation (columns 4-7, 11-14) too
filtered_optidata = optidata;
for col=[1 2 3 8 9 10]
    filtered_optidata(:,col) = conv2(optidata(:,col),normpdf(-windowsize:windowsize,0,sigma)','same');
end

% path length (only in x/z to ignore any 'head bobbing' movement):
% columns 1-3 hold the head position; columns 8-10 hold the back position
headDistance = 0;
backDistance = 0;
for i=2:size(filtered_optidata,1)
    headDistance = headDistance + sqrt( (filtered_optidata(i,1)-filtered_optidata(i-1,1))^2 + (filtered_optidata(i,3)-filtered_optidata(i-1,3))^2 );
    backDistance = backDistance + sqrt( (filtered_optidata(i,8)-filtered_optidata(i-1,8))^2 + (filtered_optidata(i,10)-filtered_optidata(i-1,10))^2 );
end