Execute Gesture Event through code

ekalipi1 Member Posts: 26
I am working on BOSE AR iOS SDK 4.0.8.
I am able to detect gesture events like tap, doubleTap, etc.
Is there a way to trigger those events programmatically?
For example, headNod should trigger the button-tap event and headShake should trigger the touchAndHold event.

Thanks in advance,
- Ashitosh


  • Nadine@Bose Member Posts: 111
    Hi! Gestures can only be incorporated programmatically; the idea is to replace a button tap with, say, a head nod. headShake and touchAndHold are two separate gesture events in the iOS SDK, and touchAndHold is only available on the NC 700s, which were released a few weeks ago. It doesn't really make sense to have one gesture trigger another: the point is for the user to perform a gesture that your app then maps to some event in your app. If you can explain more about what your app is and what you want to do, I can definitely help out!

    To clarify gestures:

    QC 35 IIs and Frames: doubleTap, headNod, headShake
    NC 700s: touchAndHold, headNod, headShake
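
    The gesture support listed above can be captured in a small lookup. This is an illustrative sketch only; the enum cases mirror the names in this thread but are not the Bose Wearable SDK's actual types.

```swift
// Illustrative model of which gestures each product reports.
// These names follow the thread above, NOT the SDK's real identifiers.
enum Gesture: String, CaseIterable {
    case doubleTap, touchAndHold, headNod, headShake
}

enum Product {
    case qc35II, frames, nc700
}

// Returns the set of gestures a given product can report.
func supportedGestures(for product: Product) -> Set<Gesture> {
    switch product {
    case .qc35II, .frames:
        return [.doubleTap, .headNod, .headShake]
    case .nc700:
        return [.touchAndHold, .headNod, .headShake]
    }
}
```

    A check like `supportedGestures(for: .frames).contains(.touchAndHold)` then makes it explicit at runtime that Frames do not report touchAndHold.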


  • ekalipi1 Member Posts: 26

    Thanks for your reply.
    We are right now focusing on Bose frames.

    The idea behind our app is to use headNod to answer and headShake to reject an incoming call through Bose Frames.
    Note: this behavior is similar to the actions executed by a single tap and touchAndHold on Bose Frames, except that we do not want to touch the button at all.

    We are able to perform these actions in our Android code but are unable to do so in iOS (using Swift). Of course, ideally we would like to accomplish these actions through the Bose AR SDK.

    Incidentally, could you please let us know as soon as touchAndHold detection starts working with Bose Frames?


  • Nadine@Bose Member Posts: 111
    Hi @ekalipi1,

    This functionality will not work with Bose Frames. The NC 700s have a capacitive touch surface that enables touchAndHold, which the Frames do not have; the current hardware design doesn't permit touchAndHold on the Frames. I recommend replacing it with a doubleTap. We also don't recommend using singleTap.

    Can you point out the trouble you are having with Swift? 
  • ekalipi1 Member Posts: 26
    The main thrust of our question is as follows.
    When the user receives a call:
    - When we detect a head nod, what command do we issue to accept the call?
      Note: if the user presses the button, this action occurs. We are trying to accomplish the same effect without the user actually pressing the button.
    - When we detect a head shake, what command do we issue to reject the call?
      Note: if the user does a touchAndHold (more like a press-and-hold) on the button, this does happen on Bose Frames. Again, we would like to reject the call without the user actually touching the button.

    Ideally, we would like to execute these actions using the Bose SDKs.
    We are able to accomplish this in Android with custom coding.
    We have done some research in Swift, and it does not appear to support a similar capability.
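
    The desired mapping can be sketched as follows. The `CallActions` protocol here is hypothetical, standing in for whatever telephony hooks the platform might expose; as the replies below explain, iOS does not actually provide such hooks for the native phone app, which is the crux of the question.

```swift
// Hypothetical call-control interface. iOS does NOT expose these
// operations for native phone calls; this only illustrates the
// gesture-to-action mapping being asked about.
protocol CallActions {
    func acceptIncomingCall()
    func rejectIncomingCall()
}

enum HeadGesture { case headNod, headShake }

struct GestureCallBridge {
    let actions: any CallActions
    var callIsRinging = false

    func handle(_ gesture: HeadGesture) {
        guard callIsRinging else { return }  // ignore gestures outside a ringing call
        switch gesture {
        case .headNod:   actions.acceptIncomingCall()
        case .headShake: actions.rejectIncomingCall()
        }
    }
}
```

    With a real backend behind `CallActions` (for example a custom VoIP stack), a headNod while a call is ringing would accept it and a headShake would reject it.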
  • Michael@Bose Member Posts: 66
    Hi @ekalipi1 - can you speak a little more about the specific reason you'd like to use the Bose AR SDK to initiate and end telephony functions that are currently natively supported in firmware on our wearable devices?

    Example of what I mean: on the NC 700s you double tap on the right side to accept a phone call, and double tap again to end it. This functionality automatically works out of the box. On the Frames it's a press of the physical button on the right side, and on the QC 35 IIs it's similar.

    If you're looking to replace that functionality with gestures like "Affirmative" (head nod), the problem you will run up against is an architectural and OS-level one. 

    For one, Bose AR functionality is application specific and initiated via an application which needs to be open and running for the functionality to continue to work.

    So, you'd need to develop an application that intercepts system-level telephony APIs, recognizes a gesture such as "Affirmative" (head nod), and then calls the relevant telephony API to accept a call, for example. A user would then need to open that application, initiate a Bose AR BLE connection through it, and leave it running in order for the scenario to function the way you've described.

    It does seem possible, but the parts you need to explore lie squarely on the iOS side of things, related to OS/system-level APIs and how applications are allowed to interact with them.

    Please let us know if there's any additional detail about your potential use case you could share with us to help inform if we can help further!


    Michael @ Bose
  • ekalipi1 Member Posts: 26
    Hi Michael,

    Can you please send me your email address? Is it [email protected]
    We have prepared a presentation that describes our planned app and how it works (in Android).
    My email address is [email protected]

    Kishor Bapat
  • daniel@bose Member Posts: 41
    Thanks for your question; we'll reach back directly with the information you requested.
    @[email protected]

  • ekalipi1 Member Posts: 26
    Hi Daniel,

    I have not received your email address, so I am unable to send you the presentation I mentioned earlier.

    In a nutshell here is what we are doing on Android.
    - You turn on the Bose Frames and start our app.
    - When you receive a call you hear an announcement telling you who is calling. 
    - To answer the call you nod your head.
    - To reject the call, you shake your head.
    Essentially you are able to use your phone for calls without ever taking it out of your pocket. Totally hands-free and ear-free. You do not need to use the gold button on the frames at all.

    We are planning to submit the Android app to Bose this week.
    We would like to create a similar iOS app as soon as possible. The head shake and head nod work on iOS; however, we cannot make iOS accept or reject the call at this point. So what we would like to do is send the equivalent command of a button press when a head nod is detected, and of a press-and-hold when a head shake is detected.

    Hope this clarifies my earlier question.

  • daniel@bose Member Posts: 41 (edited 08/19/2019 - 17:23)
    @ekalipi1 Thanks for your questions. Your question is really specific to the iOS programming platform rather than to Bose AR, so bear that in mind.

    Out of the box, Frames and many other Bluetooth products already have the functionality to answer calls by pushing the Bluetooth button. Again, this is by design, to protect privacy and to make the experience consistent across products. Under the covers, it's part of how Apple has created APIs for handling button pushes on Bluetooth devices.

    To answer a call:
    briefly press the Bluetooth function button; you should hear a short beep in the headset before you hear the incoming call.

    It's similar on most devices.

    While you can use a callback or delegate to intercept Bose AR gestures and run a method when a gesture happens, you can't simulate a button push on the Bluetooth device or call the corresponding API on the Apple system. Apple doesn't allow developers access to the call-handling system APIs. The concern is centered on protecting users' privacy, and that's why calls can only be answered by an actual button push on a connected Bluetooth device.
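
    In other words, a gesture callback can run any code inside your own app, but there is no way to forward it as a Bluetooth button press. A minimal sketch of the permitted half, using a plain closure registry in place of the SDK's actual delegate types (which are assumptions here, not the real API):

```swift
// Sketch of gesture interception: handlers run arbitrary app code,
// but nothing here can synthesize a Bluetooth button event or touch
// the system call APIs -- that is the limitation described above.
final class GestureDispatcher {
    private var handlers: [String: () -> Void] = [:]

    // Register a handler for a named gesture (e.g. "headNod").
    func on(_ gesture: String, _ handler: @escaping () -> Void) {
        handlers[gesture] = handler
    }

    // Called by whatever code is decoding the sensor stream.
    func gestureDetected(_ gesture: String) {
        handlers[gesture]?()
    }
}
```

    A gesture with no registered handler is simply ignored, which matches the model described above: the app reacts only to gestures it has explicitly mapped.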


    Alternatively, if you build your own VoIP system, then you can control the calls. Developers may build a VoIP call system and have some limited interaction with the native system UI, but I don't think your use case is going to be supported in that scenario.

    CallKit is a framework that aims to improve the VoIP experience by allowing apps to integrate with the native phone UI.


    Good luck with your project.

    @[email protected]
