Question about BoseWearable.shared.startConnection in SDK 4.0.8

edited 08/14/2019 - 13:22 in Bose AR SDK for iOS
ericdolecki Member Posts: 15
I am performing my search like so:

```swift
func startSearch() {
    let sensorIntent = SensorIntent(sensors: [.gameRotation, .rotation], samplePeriods: [._20ms, ._20ms])
    let gestureIntent = GestureIntent(gestures: [.doubleTap, .headNod, .headShake, .singleTap])

    BoseWearable.shared.startConnection(mode: .alwaysShow,
                                        sensorIntent: sensorIntent,
                                        gestureIntent: gestureIntent) { result in
        switch result { ...
```

Which is great, but what exactly does supplying SensorIntent and GestureIntent to the startConnection method do? I don't get callbacks for those gestures or sensors unless I set them up manually like so:

```swift
session.device?.configureSensors { config in
    config.enable(sensor: .gameRotation, at: ._20ms)
    config.enable(sensor: .rotation, at: ._20ms)
}

session.device?.configureGestures { config in
    config.set(gesture: .headNod, enabled: true)
    config.set(gesture: .headShake, enabled: true)
    config.set(gesture: .doubleTap, enabled: true)
    config.set(gesture: .singleTap, enabled: true)
}
```

I was hoping that supplying the intents in the startConnection method would be a shortcut that sets those configurations up for me. If not, I don't know what supplying them actually does under the hood.


  • Nadine@Bose Member Posts: 111

    Hi @ericdolecki ,

    1) You don't need to use single tap; we suggest just using double tap for the Frames or QC35 IIs.

    2) Great question on the sensor intent. I hope I explain this well; if I don't, please let me know where I can clarify:

    An app intent (i.e. a gestureIntent or sensorIntent) is meant to specify to the users of your app the features and functionality that your application requires (think of it as a pre-screen). App intents inform the SDK of what you expect to use.

    In short, there is no shortcut to configuring and enabling your sensors; you will still need to code that part. The sensors are not actually activated until they are configured.
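    To make that concrete, here is a sketch that combines the two snippets from the question (same SDK 4.x calls as above; untested). The intents passed to startConnection only describe what the app will use; the success handler still has to enable everything explicitly:

    ```swift
    BoseWearable.shared.startConnection(mode: .alwaysShow,
                                        sensorIntent: sensorIntent,
                                        gestureIntent: gestureIntent) { result in
        switch result {
        case .success(let session):
            // The intents above were only used for validation during connection.
            // The sensors and gestures still have to be enabled explicitly:
            session.device?.configureSensors { config in
                config.enable(sensor: .gameRotation, at: ._20ms)
                config.enable(sensor: .rotation, at: ._20ms)
            }
            session.device?.configureGestures { config in
                config.set(gesture: .doubleTap, enabled: true)
                config.set(gesture: .headNod, enabled: true)
                config.set(gesture: .headShake, enabled: true)
            }
        default:
            break // failure/cancellation: no session to configure
        }
    }
    ```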

  • ericdolecki Member Posts: 15
    Noted in regards to single tap. 

    If one supplies no intents to startConnection, is anything under the hood any different? What actually happens when they are supplied?

    I understand that unless one gets a result, you don't have a session.

    Perhaps in the startConnection method as part of the closure, in a future release developers could do something like this:

    ```swift
    let sensorIntent = SensorIntent(sensors: [.gameRotation, .rotation], samplePeriods: [._20ms, ._20ms])
    let gestureIntent = GestureIntent(gestures: [.affirmative, .doubleTap, .headNod, .headShake, .negative, .singleTap])

    BoseWearable.shared.startConnection(mode: .alwaysShow, sensorIntent: sensorIntent, gestureIntent: gestureIntent) { result in
        switch result {
        case .success(let session):
            self.session = session

            if session.hasSuppliedIntents { // <- make sure intents were passed into the startConnection method
                self.session.configureIntents() // <- does all the supplied sensor/gesture configuration here in a single call
            }

            self.session.delegate = self
            self.sensorDispatch.handler = self
        ...
    ```


  • daniel@bose Member Posts: 41
    Thanks for your question! Let's keep it really simple.

    With v4 and later, you should always create intents based on the set of devices you intend (hence the name) the app to support.

    Starting with v4, we added intents to allow the connection UI to be more specific in the product flow when connecting to different devices. Intents help determine which devices the developer is targeting and/or ignoring.

    For example, if you only want to support a double-tap gesture on the QC35 and Frames, you'd use .doubleTap and not .input. If the user then connects with an NC700, the connection UI would display text telling them that the app doesn't support their product.

    Having said that, while it's possible to support only a subset of the Bose AR-enabled products, we suggest you don't lose the opportunity to support all of them. That's where .input, .affirmative, and .negative come into play. If you set your intents for these gestures, then they will work on the NC700 and future devices.

    I hope that clears up the confusion as to why you should use intents. 
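    Put another way, a "support everything" gesture intent based on the advice above might look like this (a sketch; the gesture names come from this thread, and the per-device mapping in the comments is my reading of it):

    ```swift
    // Device-agnostic intent: validates on Frames / QC35 II as well as
    // NC700 and future Bose AR products.
    let gestureIntent = GestureIntent(gestures: [
        .doubleTap,   // Frames, QC35 II
        .input,       // generic input gesture on NC700 and later
        .affirmative, // e.g. head nod, on devices that report it this way
        .negative     // e.g. head shake, on devices that report it this way
    ])
    ```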

    @[email protected]

  • Nadine@Bose Member Posts: 111

    Hi @ericdolecki

    The intents are used at connection time to validate that the device supports what the app is planning to do. During this validation, the SDK may find that the device needs a firmware update to support what the app wants to do, and it will return the corresponding error. Or it may determine that the firmware is up to date but still doesn't support what the app wants to do, and it will likewise return an exact error.

    If the app doesn't supply any intents on connection, it simply skips this validation. After the connection succeeds, the app can still try to set the sensor configuration it wants, but if the device doesn't support it, the app might not get as exact an error.

    When you change the configuration, if the device supports and accepts the change, you will get a `WearableDeviceEvent.didWriteSensorConfiguration` event; if not, you will get a `WearableDeviceEvent.didFailToWriteSensorConfiguration(Error)` event.

    Providing the intent ensures that you get an error (if there is one) at connection time and not much later.
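    For example, given an open `session` from startConnection, observing those two events could look something like this (a sketch; the `addEventListener` / `ListenerToken` names are my reading of the SDK reference):

    ```swift
    // Watch for the device's response after writing a sensor configuration.
    var token: ListenerToken?
    token = session.device?.addEventListener(queue: .main) { event in
        switch event {
        case .didWriteSensorConfiguration:
            print("Device accepted the new sensor configuration")
        case .didFailToWriteSensorConfiguration(let error):
            // Without intents at connection time, this is the first place
            // you'd learn that the device can't do what you asked for.
            print("Configuration rejected: \(error)")
        default:
            break
        }
    }
    _ = token // retain the token for as long as you want to receive events
    ```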

    Does this help with your question?


