What do Rotation (6DOF) and Rotation (9DOF) mean?

edited 08/07/2019 - 20:51 in Bose AR SDK for Unity
godatplay Posts: 3 Member
In the SDK, there is mention of Rotation (6DOF) and Rotation (9DOF) when creating an App Intent Profile. Is this referring to the number of sensors used to calculate an orientation, or to the number of devices in aggregate?

The reason I ask is that, if so, I think it'd make more sense to rename these to Rotation (6-axis input), or even better Orientation (2 sensors), or something like that. You can't re-integrate readings from the sensors to extrapolate extra degrees of freedom, so it doesn't make sense to refer to them as degrees of freedom. One sensor gives you a bad 3DOF guess, two give you a mediocre 3DOF guess, and three give you a pretty reliable 3DOF. It's always three degrees.

The other possibility is that this is referring to 3DOF aggregated across multiple devices. For example 3 from a headset plus 3 from a phone would be 6.

Comments

  • godatplay Posts: 3 Member
    Ah, found the explanation in https://developer.bose.com/guides/bose-ar/managing-sensors-and-their-data. Yeah, that's definitely confusing, because in this case the number of sensors does not add more freedom.
  • Filip@Bose Posts: 37 Moderator
    @godatplay thank you for that observation! It's just a matter of naming convention, but to be clear:
    6DOF - Accelerometer + Gyroscope data
    9DOF - Accelerometer + Gyroscope + Magnetometer data

    Keep in mind our SDK is still in Beta, and naming conventions may change with subsequent updates  :)
  • prehensile Posts: 11 Member
    We've been looking at this recently. Would I be right in understanding that when 9DOF is enabled, 0° yaw should be more or less pointing towards magnetic north?

    This doesn't always seem to be the case, but I can't work out if we're using the rotation quaternion wrong or if we're not always getting particularly accurate data.

    In the calibration routine in the Advanced Demo (in AdvancedDemoController.cs), it looks like the checks it's running are (1) that the Frames are stationary and (2) that enough time has passed for the IMU to self-calibrate (and [3] that it's not just timed out waiting for [1] and [2] to happen). It's then setting the Rotation Matcher's relative reference to whatever direction the Frames happen to be pointing in when those criteria are met.

    We lifted that calibration code, and after reading the docs about IMU Calibration, we added a check to see if LastSensorFrame.rotationNineDof.measurementUncertainty drops to 1.91, but that didn't seem to make a lot of difference.

    The short question then: if 9DOF is enabled and measurementUncertainty == 1.91, should 0° yaw point to magnetic north?
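    For reference, the gating logic described above might be sketched like this (Python for brevity; the real routine is C# in AdvancedDemoController.cs, and all names and thresholds here are hypothetical, not the SDK's actual API):

```python
# Illustrative sketch of the calibration gating described above: wait until the
# Frames are (1) stationary AND (2) have been stationary long enough for the
# IMU to self-calibrate, or (3) give up after a timeout.

def wait_for_calibration(samples, settle_time=2.0, timeout=10.0,
                         stationary_threshold=0.5):
    """samples: iterable of (t_seconds, angular_speed_deg_per_s) readings.

    Returns the time at which calibration criteria were met, or None on timeout.
    """
    stationary_since = None
    for t, angular_speed in samples:
        if t >= timeout:
            return None  # (3) timed out waiting for (1) and (2)
        if angular_speed < stationary_threshold:      # (1) stationary?
            if stationary_since is None:
                stationary_since = t
            if t - stationary_since >= settle_time:   # (2) settled long enough?
                return t  # capture the current heading as the relative reference
        else:
            stationary_since = None  # moved again; restart the settle clock
    return None
```

    At the point this function returns a time, the real routine sets the Rotation Matcher's relative reference to whatever direction the Frames happen to be pointing, which is why 0° yaw is not guaranteed to be magnetic north under 6DOF.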
  • Filip@Bose Posts: 37 Moderator

    A bit of background: a degree of freedom (DOF) is the number of values in a calculation that may vary. https://en.wikipedia.org/wiki/Degrees_of_freedom_(statistics)

    In the case of the Bose AR devices, we currently offer two different virtual rotation sensors: one called 6DOF (gameRotation in the Native SDKs) and one called 9DOF (rotation in the Native SDKs).

    The 6DOF sensor takes two Vector3s:
    * Instantaneous acceleration due to gravity from the Accelerometer sensor
    * Angular velocity from the Gyroscope sensor

    The IMU combines these two inputs through a sensor fusion algorithm and outputs a single rotation that incorporates 6 degrees of freedom, or 6DOF.
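    This kind of accelerometer + gyroscope fusion is often illustrated with a complementary filter. A minimal single-axis sketch in Python, for illustration only (this is not Bose's actual fusion algorithm, which runs on the IMU):

```python
import math

def complementary_filter(pitch, gyro_rate, accel_y, accel_z, dt, alpha=0.98):
    """One step of a 1-axis complementary filter (illustrative only).

    pitch:      previous pitch estimate, degrees
    gyro_rate:  angular velocity about the pitch axis, deg/s (gyroscope)
    accel_y/z:  gravity components sensed on two body axes, in g (accelerometer)
    dt:         time step, seconds
    """
    gyro_pitch = pitch + gyro_rate * dt  # integrating the gyro is smooth but drifts
    accel_pitch = math.degrees(math.atan2(accel_y, accel_z))  # absolute but noisy
    # Blend: trust the gyro short-term, pull toward gravity long-term.
    return alpha * gyro_pitch + (1 - alpha) * accel_pitch
```

    The accelerometer anchors the estimate to gravity (so pitch and roll don't drift), while the gyroscope provides smooth short-term motion; yaw has no gravity reference, which is what the magnetometer adds in the 9DOF case below.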

    The 9DOF sensor additionally takes in a Vector3 from the Magnetometer sensor to provide a rotation that also incorporates the magnetic field surrounding the user, when the system is confident enough in the magnetometer reading to incorporate it into the output value.

    As for reading the 9DOF data, our understanding is that once measurementUncertainty is down to 1.91 degrees, 0° yaw should point toward magnetic north (within an error of 1.91 degrees). The more stable the uncertainty is over time (i.e. minimal changes in the value), the more you should be able to trust that reading.
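    In practice, a consumer of the 9DOF data might gate on that reported uncertainty before trusting yaw as a magnetic-north heading. An illustrative Python sketch (the field names and the quaternion/axis convention here are assumptions, not the SDK's actual API):

```python
import math

# Uncertainty value the docs associate with a calibrated magnetometer (degrees).
CALIBRATED_UNCERTAINTY_DEG = 1.91

def heading_if_calibrated(quaternion, uncertainty_deg):
    """Return yaw in degrees (0 roughly = magnetic north) if the 9DOF rotation
    is trustworthy, else None. quaternion = (w, x, y, z); Z-up axis convention
    assumed for illustration."""
    if uncertainty_deg > CALIBRATED_UNCERTAINTY_DEG:
        return None  # magnetometer not (yet) trusted by the fusion
    w, x, y, z = quaternion
    # Standard quaternion-to-yaw extraction for the assumed convention.
    yaw = math.atan2(2.0 * (w * z + x * y), 1.0 - 2.0 * (y * y + z * z))
    return math.degrees(yaw) % 360.0
```

    The gate on uncertainty mirrors the advice above: only treat 0° yaw as magnetic north once the reported uncertainty has dropped to (and remains stable around) 1.91 degrees.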

    Hopefully this answer is helpful; let me know otherwise! 🙂

  • godatplay Posts: 3 Member
    Thank you for the info, especially the detail about magnetic north. I'm still a little confused about the language, but hopefully I can elaborate well enough:

    I realize that some sensor manufacturers use it differently, especially ones that don't do sensor fusion at or below the firmware level, but in VR and AR, the figurative "local space" of the relative term 'degrees of freedom' refers to the freedom of the end user.

    Up until recently, the standard 3-sensor IMU has been thought of as 3DOF because only 3-axis orientation could be specified completely from an end-user point of view. Though I guess there are newer, more accurate IMUs that can be used in conjunction with machine learning models to extrapolate position, but I'm not familiar enough with the current state of the art to say whether that can be re-integrated long enough to declare it "complete".

    The convention for 6DOF refers to the 6 coordinates that completely specify a system, applied to a device's orientation and position (relative to an arbitrarily set origin). So if someone says "There's a new AR device that's 6DOF!", that means it tracks orientation and position. Or "There's a VR headset that's 6DOF but with 3DOF hands": orientation and position for the head, but just orientation for the hands.

    There are already AR glasses that track both orientation and position and that are referred to as 6DOF, and more are on the way. So my point is simply that this is confusing because it seems like 6DOF means you'll get a position reading in addition to an orientation one. And this confusion will only grow over time as more AR devices come out that also track position and follow the industry standard naming convention. I'm concerned you're setting yourself up for repeated disappointment.

    Without further information, to me 9DOF would suggest 6 + 3, meaning one device outputs position and orientation while a second device outputs only orientation, like a Lenovo Mirage Solo. For this reason, product designers of the Mirage Solo refer to it verbally as "six dot three".

    Sorry that I didn't explain this very well the first time, hopefully it's more clear now.

    And I hope the following doesn't come off as offensive; I'm just trying to take the time to help as a developer offering "behind the scenes" feedback. The non-standard usage of the term here comes off as out of touch: people in the XR space don't use it in reference to the number of sensor input values, they use it in reference to the human body, i.e. the people who would use a given device.

    The same applies to Bose's usage of the term "spatialized audio". People in the XR space don't use that term to refer to "1.5D" sound (left vs right, and intensity), they use it to refer to 3D sound (simulated XYZ via HRTF or similar).