Bluetooth LE is not so scary, or How to improve user experience effortlessly



Our team recently designed and implemented a feature for transferring money over the air using Bluetooth LE. I want to tell you how we did it and what tools Apple gives us. Many developers think Bluetooth is difficult, because it is a rather low-level protocol and there are few specialists in it. But it is not so scary, and in fact using this technology is very simple! And the features you can build on Bluetooth LE are genuinely interesting and will let your application stand out from the competition.



Let's first understand what this technology is all about and how it differs from classic Bluetooth.

What is Bluetooth LE?


Why did the Bluetooth developers call this technology Low Energy? After all, power consumption had already dropped many times with each new version of Bluetooth. The answer lies in a small coin-cell battery.


Its diameter is only 2 cm, and its capacity is about 220 mAh. When the engineers developed Bluetooth LE, they wanted a device to run on such a battery for several years. And they succeeded: Bluetooth LE devices can run on such a battery for a year or more. Who among you still turns Bluetooth off on the phone to save energy, the way we did in the 2000s? It's pointless: the savings amount to less than 10 seconds of battery life per day, while you lose a lot of functionality, such as Handoff, AirDrop and others.

What did the engineers achieve when developing Bluetooth LE? Did they improve the classic protocol? Make it more energy efficient? Just optimize all the processes? No. They completely redesigned the architecture of the Bluetooth stack, so that a device now needs far less air time to be visible to other devices and to occupy the channel. That, in turn, made it possible to save a great deal of energy. The new architecture also makes it possible to standardize any new device, so developers from all over the world can talk to it and easily write new applications to control it. In addition, the architecture is built on the principle of self-discovery: when connecting to a device you do not need to enter any PIN codes, and if your application can talk to the device, the connection takes mere milliseconds.

  • Less air time.
  • Less power consumption.
  • New architecture.
  • Reduced connection time.

How did the engineers manage to make such a huge leap in energy efficiency?

The frequency remains the same: 2.4 GHz, unlicensed and free to use in most countries. But the connection latency is lower: 15-30 ms instead of 100 ms for classic Bluetooth. The working range is unchanged at 100 m. The transmission interval changed slightly: from 0.625 ms to 3 ms.

But that alone could not cut energy consumption tenfold; of course, something had to give. That something is speed: instead of 24 Mbit/s we get 0.27 Mbit/s. You will probably say that is a ridiculous speed for 2018.

Where is Bluetooth LE used?




This technology is not young: it first appeared in the iPhone 4s, and it has already conquered many areas. Bluetooth LE is used in all smart home devices and in wearable electronics. There are now even chips the size of a coffee bean.



And how is this technology used in software?

Since Apple was among the first to embed Bluetooth LE in its devices and start using it, by now the technology is deeply integrated into its ecosystem. You can find it in services such as AirDrop, quick device setup, password sharing and Handoff. Even notifications on the Apple Watch are delivered via Bluetooth LE. In addition, Apple has published open documentation on how to make notifications from any application reach your own devices. So what roles can devices play in Bluetooth LE?



Broadcaster. Sends messages to anyone nearby; you cannot connect to this device. iBeacons and indoor navigation work on this principle.

Observer. Listens to what is happening around it and only receives data from broadcast messages. It does not create connections.

Central and Peripheral are more interesting. Why weren't they simply called Server and Client? Judging by the names, that would seem logical. But no.

Because the Peripheral actually acts as the server. It is a peripheral device that consumes less power, and the more powerful Central connects to it. A Peripheral can announce that it is nearby and what services it has; it holds the data, and only one device can connect to it at a time. A Central scans the air in search of devices, sends connection requests, can connect to any number of devices, and can read, write and subscribe to data on a Peripheral.

What is available to us, as developers, in the Apple ecosystem?


iOS/Mac OS:

  • Peripheral and Central.
  • Background mode.
  • Restore state.
  • Connection interval 15 ms.

watchOS/tvOS:

  • watchOS 4+/tvOS 9+.
  • Only Central.
  • Maximum two connections.
  • Apple Watch Series 2+/Apple TV 4+.
  • Disabled when the app goes to the background.
  • Connection interval 30 ms.

The most important difference is the connection interval. What does it affect? To answer that question, we first need to understand how the Bluetooth LE protocol works and why such a small difference in absolute values matters so much.

How the protocol works


How does the search-and-connect process work?

A Peripheral announces its presence at the frequency of the advertisement interval. Its packet is very small and contains only a few identifiers of the services the device provides, plus the device name. The interval can be quite large and can vary depending on the device's current status, power-saving mode and other settings. Apple advises developers of external devices to tie the interval length to the accelerometer: increase the interval when the device is idle, and shorten it when the device is active so that it can be found quickly. The advertisement interval does not correlate with the connection interval in any way; the device itself chooses it based on power consumption and its settings. In the Apple ecosystem it is not exposed to us and cannot be read; the system controls it entirely.



Once we have found the device, we send a connection request, and here the connection interval comes into play: it is the time within which the other device can respond to a request. That covers connecting, but what happens during reads and writes?



The connection interval also matters when reading data: halving it doubles the data transfer rate. But keep in mind that if the two devices do not support the same interval, the larger one is selected.
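As a rough sanity check of that claim, the throughput ceiling over plain GATT scales inversely with the connection interval, since data moves once per connection event. The sketch below uses illustrative assumptions (a 20-byte payload and exactly one packet per event; real stacks can send several packets per event):

```swift
// Rough GATT throughput ceiling, assuming one payload-sized packet
// per connection event (real stacks may pack several per event).
func throughputBytesPerSecond(payloadBytes: Int, connectionIntervalMs: Double) -> Double {
    Double(payloadBytes) * (1000.0 / connectionIntervalMs)
}

let slow = throughputBytesPerSecond(payloadBytes: 20, connectionIntervalMs: 30) // watchOS/tvOS interval
let fast = throughputBytesPerSecond(payloadBytes: 20, connectionIntervalMs: 15) // iOS/macOS interval
// Halving the interval doubles the ceiling: fast == 2 * slow
```

The absolute numbers are hypothetical; the point is the proportional relationship between interval and throughput.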

Let's take a look at what the packet of data a Peripheral transmits consists of.

The MTU (maximum transmission unit) of such a packet is negotiated during the connection process and varies from device to device and from operating system to operating system. In protocol version 4.0 the MTU was about 30 bytes, and the payload did not exceed 20 bytes. In version 4.2 everything changed: now about 520 bytes can be transferred. Unfortunately, that protocol version is supported only by devices newer than the iPhone 5s. The overhead, regardless of the MTU size, is 7 bytes: the ATT and L2CAP headers. The situation with writes is similar.



There are only two write modes: with response and without. The no-response mode significantly speeds up data transfer, since there is no timeout before the next write. But this mode is not always available: not on all devices and not on all systems. The system itself may restrict access to it, because it is considered less energy efficient. In iOS there is a method that lets you check, before writing, whether this mode is available.

Now let's consider what the protocol consists of.



The protocol consists of 5 layers. The Application Layer is your logic written on top of CoreBluetooth. GATT (Generic Attribute Profile) is used to share the services and characteristics found on devices. ATT (Attribute Protocol) is used to manage your characteristics and transfer your data. L2CAP is the low-level transport protocol. The Controller is the Bluetooth chip itself.

You are probably asking what GATT is and how we can work with it.

GATT consists of services and characteristics. A characteristic is the object in which your data is stored, like a variable. A service is a group that contains your characteristics, like a namespace. A service has a name, a UUID, which you choose yourself. A service may also contain child services.



A characteristic also has its own UUID, which is effectively its name. The characteristic's Value is NSData; this is where you write and store data. Descriptors describe your characteristic: you can state what data you expect in it or what the data means. The Bluetooth protocol defines many descriptors, but only two are currently available on Apple systems: the human-readable description and the data format. A characteristic also has access levels (Permissions):



Let's try it ourselves


We had the idea of making it possible to transfer money over the air without requiring anything from the recipient. Imagine: you are puzzling over a very interesting task, writing perfect code, when a colleague suggests going out for coffee. You are so absorbed in the task that you cannot leave, so you ask him to buy you a cup of delicious cappuccino. He brings you the coffee, and you need to pay him back. You could transfer by phone number, which works fine, but here is the awkward part: you do not know his number. Yes, you have been working together for three years and never exchanged numbers :)

So we decided to make it possible to transfer money to whoever is nearby, without entering any user data, just like AirDrop: simply select a user and send the amount. Let's see what we need for this.



PUSH mapping


We need the sender to:

  1. Find all nearby devices that support our service.
  2. Read the payment details.
  3. Send the recipient a message that money was sent successfully.

The recipient, in turn, must be able to tell senders that it has a service with the necessary data, and be able to receive messages from the sender. I don't think it's worth describing how the actual transfer by payment details works inside our bank. So let's implement the rest.

First we need to come up with names for our service and characteristics. As I said, these are UUIDs. We simply generate them and hard-code them on both the Peripheral and the Central, so that both devices use the same values.



You are free to use any UUID except those ending like this: XXXXXXXX-0000-1000-8000-00805F9B34FB; those are reserved for companies. You can buy such a number yourself, and then no one else may use it. It costs $2,500.
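The original code screenshot is lost, so here is a minimal sketch of what the shared identifiers might look like. The UUID strings are hypothetical placeholders (generate your own once, e.g. with `uuidgen`) and must be identical on both sides:

```swift
import CoreBluetooth

// Hypothetical UUIDs: generate your own once and hard-code the same
// values on both the Peripheral and the Central.
enum MoneyDropUUID {
    static let service      = CBUUID(string: "8E7B3F43-5C0A-4F1D-9D6E-2B1C8A0D4E55")
    static let details      = CBUUID(string: "3A1F2C77-9B44-4E0A-8C55-D1E6F0A2B388") // read: payment details
    static let confirmation = CBUUID(string: "6C2D4B19-7E88-4A33-B0F1-5A9C3D2E8F00") // write: "money sent"
}
```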

Next, we need to create the managers: one for sending funds, the other for receiving. You just need to set their delegates. We send as Central and receive as Peripheral. We create both, because the same person can be sender and recipient at different times.
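A minimal sketch of this setup (the class name and structure are assumptions, since the original code screenshot is lost):

```swift
import CoreBluetooth

// One object playing both roles; in a real app they could live in
// separate send/receive managers.
final class MoneyDropManager: NSObject {
    private(set) lazy var central = CBCentralManager(delegate: self, queue: nil)       // sender side
    private(set) lazy var peripheral = CBPeripheralManager(delegate: self, queue: nil) // recipient side
}

extension MoneyDropManager: CBCentralManagerDelegate, CBPeripheralManagerDelegate {
    func centralManagerDidUpdateState(_ central: CBCentralManager) {
        // Wait for .poweredOn before scanning.
    }
    func peripheralManagerDidUpdateState(_ peripheral: CBPeripheralManager) {
        // Wait for .poweredOn before advertising.
    }
}
```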



Now we need to make the recipient discoverable and write the recipient's payment details into our characteristic.



First, create a service. We set the UUID and mark the service as primary, meaning it is central to this device. A good example is a heart-rate monitor: its main service is the current pulse, while the battery level is background information.

Then we create two characteristics: one for reading the recipient's payment details, the second for writing, so that the recipient can learn that money was sent. We register them in our service, add the service to the manager, and start advertising, specifying the service UUID so that all nearby devices can learn about our service before connecting to it. This data is placed in the packet that the Peripheral broadcasts.
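The recipient-side setup might look like this (a sketch; the UUID strings and function shape are hypothetical):

```swift
import CoreBluetooth

// Recipient side: build the service, register it, and start advertising.
func startAdvertising(on manager: CBPeripheralManager, details: Data) {
    let serviceUUID      = CBUUID(string: "8E7B3F43-5C0A-4F1D-9D6E-2B1C8A0D4E55") // hypothetical
    let detailsUUID      = CBUUID(string: "3A1F2C77-9B44-4E0A-8C55-D1E6F0A2B388") // hypothetical
    let confirmationUUID = CBUUID(string: "6C2D4B19-7E88-4A33-B0F1-5A9C3D2E8F00") // hypothetical

    // primary: true marks this as the device's main service.
    let service = CBMutableService(type: serviceUUID, primary: true)

    // Read-only characteristic with a static value: the payment details.
    let detailsCharacteristic = CBMutableCharacteristic(
        type: detailsUUID, properties: .read,
        value: details, permissions: .readable)

    // Writable characteristic the sender uses to report a successful transfer.
    let confirmationCharacteristic = CBMutableCharacteristic(
        type: confirmationUUID, properties: .write,
        value: nil, permissions: .writeable)

    service.characteristics = [detailsCharacteristic, confirmationCharacteristic]
    manager.add(service)

    // The service UUID goes into the advertising packet, so scanners
    // can discover us before connecting.
    manager.startAdvertising([CBAdvertisementDataServiceUUIDsKey: [serviceUUID]])
}
```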

The recipient is ready; on to the sender. Start scanning and connect.



When the manager powers on, we start scanning for devices with our service. When we find one, we receive it in a delegate method and immediately connect. Important: you must keep a strong reference to every Peripheral you work with, otherwise it will be deallocated.
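A sketch of the sender side (class name and UUID string are hypothetical):

```swift
import CoreBluetooth

final class Sender: NSObject, CBCentralManagerDelegate {
    let serviceUUID = CBUUID(string: "8E7B3F43-5C0A-4F1D-9D6E-2B1C8A0D4E55") // hypothetical
    var discovered: [CBPeripheral] = []   // strong references, or the peripherals are deallocated

    func centralManagerDidUpdateState(_ central: CBCentralManager) {
        guard central.state == .poweredOn else { return }
        // Only devices advertising our service will be reported.
        central.scanForPeripherals(withServices: [serviceUUID], options: nil)
    }

    func centralManager(_ central: CBCentralManager,
                        didDiscover peripheral: CBPeripheral,
                        advertisementData: [String: Any],
                        rssi RSSI: NSNumber) {
        discovered.append(peripheral)     // keep the strong link!
        central.connect(peripheral, options: nil)
    }
}
```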



After a successful connection, we set the delegate that will work with this device and request the service we need from it.
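That step might be sketched like this (the surrounding class is an assumption; it must also conform to CBPeripheralDelegate to receive the device's callbacks):

```swift
import CoreBluetooth

final class ConnectionHandler: NSObject, CBCentralManagerDelegate, CBPeripheralDelegate {
    let serviceUUID = CBUUID(string: "8E7B3F43-5C0A-4F1D-9D6E-2B1C8A0D4E55") // hypothetical

    func centralManagerDidUpdateState(_ central: CBCentralManager) {}

    // Called by the system once the connection is established.
    func centralManager(_ central: CBCentralManager, didConnect peripheral: CBPeripheral) {
        peripheral.delegate = self                   // we will receive this device's callbacks
        peripheral.discoverServices([serviceUUID])   // results arrive in didDiscoverServices
    }
}
```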



We have successfully connected to the recipient; now we need to read his payment details.

During connection we already requested all services from the device. Once they arrive, a delegate method is called listing all the services available on this device. We find the one we need and request its characteristics. In another delegate method we locate, by UUID, the characteristic that stores the payment details. We try to read it, and again receive the result in a delegate method. All services, characteristics and their values are cached by the system, so there is no need to request them again every time.
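The discovery-and-read chain could be sketched as follows (UUID strings and the class are hypothetical):

```swift
import CoreBluetooth

// CBPeripheralDelegate chain: services -> characteristics -> value.
final class DetailsReader: NSObject, CBPeripheralDelegate {
    let serviceUUID = CBUUID(string: "8E7B3F43-5C0A-4F1D-9D6E-2B1C8A0D4E55") // hypothetical
    let detailsUUID = CBUUID(string: "3A1F2C77-9B44-4E0A-8C55-D1E6F0A2B388") // hypothetical

    func peripheral(_ peripheral: CBPeripheral, didDiscoverServices error: Error?) {
        guard let service = peripheral.services?.first(where: { $0.uuid == serviceUUID }) else { return }
        peripheral.discoverCharacteristics([detailsUUID], for: service)
    }

    func peripheral(_ peripheral: CBPeripheral,
                    didDiscoverCharacteristicsFor service: CBService, error: Error?) {
        guard let details = service.characteristics?.first(where: { $0.uuid == detailsUUID }) else { return }
        peripheral.readValue(for: details)   // result arrives in didUpdateValueFor
    }

    func peripheral(_ peripheral: CBPeripheral,
                    didUpdateValueFor characteristic: CBCharacteristic, error: Error?) {
        guard characteristic.uuid == detailsUUID, let data = characteristic.value else { return }
        // `data` now holds the recipient's payment details.
        _ = data
    }
}
```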



That's it, we have sent the money for the coffee. Now it's time to show the recipient a nice notification that rubles have landed in his account. For this we need to implement message sending.

On the sender's side we take the characteristic we need; in this case we use the previously saved value (it must first be obtained from the device, as we did before). Then we simply write the data into the desired characteristic.

After that, on the other device, we receive a write request in a delegate method. Here you can read the data sent to you and respond with an error if needed, for example "no access" or "this characteristic does not exist". Everything works, but only while both devices are on and the applications are active. And we need to work in the background!
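Both sides of this exchange might be sketched as follows (the payload format, UUID string and class are hypothetical):

```swift
import CoreBluetooth

// Sender side: write the "money sent" message into the recipient's
// confirmation characteristic (discovered and saved earlier).
func notifyRecipient(_ peripheral: CBPeripheral, confirmation: CBCharacteristic) {
    let message = Data("payment.sent".utf8)   // hypothetical payload format
    peripheral.writeValue(message, for: confirmation, type: .withResponse)
}

// Recipient side: the write arrives in the peripheral manager delegate.
final class WriteHandler: NSObject, CBPeripheralManagerDelegate {
    let confirmationUUID = CBUUID(string: "6C2D4B19-7E88-4A33-B0F1-5A9C3D2E8F00") // hypothetical

    func peripheralManagerDidUpdateState(_ peripheral: CBPeripheralManager) {}

    func peripheralManager(_ manager: CBPeripheralManager, didReceiveWrite requests: [CBATTRequest]) {
        for request in requests {
            guard request.characteristic.uuid == confirmationUUID else {
                manager.respond(to: request, withResult: .attributeNotFound)
                continue
            }
            // request.value holds the sender's message; show the notification here.
            manager.respond(to: request, withResult: .success)
        }
    }
}
```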



Apple allows you to use Bluetooth in the background. To do this, you need to add a key to Info.plist specifying which mode we want to use: Peripheral, Central, or both.
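In Info.plist that is the `UIBackgroundModes` key; both values are shown here, but include only the roles your app actually uses:

```xml
<key>UIBackgroundModes</key>
<array>
    <string>bluetooth-central</string>     <!-- act as Central in the background -->
    <string>bluetooth-peripheral</string>  <!-- act as Peripheral in the background -->
</array>
```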



Next, you need to pass a restoration identifier to the manager and implement a delegate method. Now we have background mode. If the application is suspended or unloaded from memory, then when the desired Peripheral is found, or when a Central connects, the app is woken up and the manager is restored with your key.
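On the Central side this might look like the following sketch (the restore-identifier string and class are hypothetical):

```swift
import CoreBluetooth

final class RestorableSender: NSObject, CBCentralManagerDelegate {
    var discovered: [CBPeripheral] = []

    // The restore identifier lets the system re-create this manager
    // after the app has been unloaded from memory.
    lazy var central = CBCentralManager(
        delegate: self, queue: nil,
        options: [CBCentralManagerOptionRestoreIdentifierKey: "moneydrop.central"]) // hypothetical key

    func centralManagerDidUpdateState(_ central: CBCentralManager) {}

    // Called when the system relaunches the app and restores the manager.
    func centralManager(_ central: CBCentralManager, willRestoreState dict: [String: Any]) {
        if let peripherals = dict[CBCentralManagerRestoredStatePeripheralsKey] as? [CBPeripheral] {
            discovered = peripherals   // re-take the strong references
        }
    }
}
```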



Everything is fine, we are ready to release. But then the designers come running and say: "We want to show users' photos, so it's easier for them to find each other." What to do? We can write about 500 bytes into a characteristic, and some devices only allow 20 :(



Going deeper


To solve this problem, we had to go deeper.



Until now we talked to devices at the GATT/ATT level. But starting with iOS 11 we have access to the L2CAP protocol. In this case, however, you have to take care of the data transfer yourself. Packets are sent with an MTU of 2 KB, nothing needs to be re-encoded, and the ordinary NSStream is used. Data transfer speed reaches up to 394 kbit/s, according to Apple.

Suppose you are transferring your service's data from Peripheral to Central as ordinary characteristics, and now you need to open a channel. You open it on the Peripheral and receive a PSM in response: the number of the channel that can be connected to. You then need to hand it to Central through those same characteristics. The number is dynamic; the system chooses which PSM to open at the moment. Once it has been handed over, Central can connect to the Peripheral and exchange data in whatever format is convenient. Let's see how to do this.

To start, open the port on the Peripheral with encryption. You can do without encryption, which speeds up the transfer a bit.
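A sketch of that call (iOS 11+; the function wrapper is an assumption):

```swift
import CoreBluetooth

// Peripheral side: publish an L2CAP channel (iOS 11+).
// Passing true requires an encrypted (paired) link; false is slightly faster.
func publishChannel(on manager: CBPeripheralManager) {
    if #available(iOS 11.0, *) {
        manager.publishL2CAPChannel(withEncryption: true)
        // The assigned PSM arrives in peripheralManager(_:didPublishL2CAPChannel:error:).
    }
}
```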



Next, in a delegate method, we receive the PSM and send it to the other device.



After the other device connects, a method is called in which we can get the NSStreams we need from the channel.
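Both Peripheral-side callbacks could be sketched like this (how the PSM is exposed through a characteristic is left out; the class and encoding are assumptions):

```swift
import CoreBluetooth

// Peripheral-side delegate callbacks for the L2CAP flow (iOS 11+).
@available(iOS 11.0, *)
final class ChannelHost: NSObject, CBPeripheralManagerDelegate {
    var psmData: Data?             // handed to Central through an ordinary characteristic
    var channel: CBL2CAPChannel?   // keep a strong reference

    func peripheralManagerDidUpdateState(_ peripheral: CBPeripheralManager) {}

    func peripheralManager(_ manager: CBPeripheralManager,
                           didPublishL2CAPChannel PSM: CBL2CAPPSM, error: Error?) {
        // The PSM is dynamic; serialize it (little-endian UInt16 here)
        // and expose it via a characteristic for Central to read.
        psmData = withUnsafeBytes(of: PSM.littleEndian) { Data($0) }
    }

    func peripheralManager(_ manager: CBPeripheralManager,
                           didOpen channel: CBL2CAPChannel?, error: Error?) {
        guard let channel = channel else { return }
        self.channel = channel
        channel.inputStream.open()
        channel.outputStream.open()   // plain (NS)Streams from here on
    }
}
```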



On the Central it's even easier: we just connect to the channel with the desired number...



... and after that we get the streams we need. Through them you can transfer absolutely any data of any size and build your own protocol on top of L2CAP. That is how we implemented the transfer of the recipient's photo.
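The Central side might be sketched as follows (the PSM decoding mirrors the hypothetical little-endian encoding on the Peripheral; the class is an assumption):

```swift
import CoreBluetooth

// Central side: open the channel using the PSM read from the
// characteristic, then exchange arbitrary data over the streams.
@available(iOS 11.0, *)
final class ChannelClient: NSObject, CBPeripheralDelegate {
    var channel: CBL2CAPChannel?   // keep a strong reference

    func openChannel(on peripheral: CBPeripheral, psmData: Data) {
        let bytes = [UInt8](psmData)
        guard bytes.count >= 2 else { return }
        // Decode the little-endian UInt16 PSM.
        let psm = CBL2CAPPSM(bytes[0]) | (CBL2CAPPSM(bytes[1]) << 8)
        peripheral.openL2CAPChannel(psm)
    }

    func peripheral(_ peripheral: CBPeripheral,
                    didOpen channel: CBL2CAPChannel?, error: Error?) {
        guard let channel = channel else { return }
        self.channel = channel
        channel.inputStream.open()
        channel.outputStream.open()
        // e.g. stream a photo via channel.outputStream.write(_:maxLength:)
    }
}
```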



But there are pitfalls; where would we be without them?

Pitfalls


Let's look at the pitfalls of working in the background. Since both the Peripheral and Central roles are available to you, you might think that in the background you can tell which nearby devices are in the background and which are active. In theory that should be the case, but Apple imposed a restriction: phones that are in the background, whether Central or Peripheral, are not visible to other phones that are also in the background. Also, phones in the background are not visible to non-iOS devices. Let's look at why this happens.

When your device is active, it sends a normal broadcast packet, which can contain the device name and the list of services this device provides, plus the overflow data: everything that did not fit.



When the device goes into the background, it stops transmitting the name and moves the list of supported services into the overflow data. An active application on an iOS device reads this data while scanning, but ignores it once it goes into the background. Therefore, when you are in the background, you cannot see applications that are also in the background. The other Apple operating systems always ignore overflow data, so if you search for devices supporting your service, you will get an empty array. But if you connect to every nearby device and request its supported services, the list may contain your service, and then you can work with it.



Next, we were preparing to hand the feature over to testing, fixing minor flaws and optimizing. And suddenly, at some point, we started getting this error in the console:

  CoreBluetooth [WARNING] Unknown error: 124  

The worst thing was that no delegate method was called; we could not even surface the error to the user. Just a message in the log, then silence: everything froze. We had made no major changes, so we started rolling back commits. And we found that at some point we had optimized the code and reworked the way we write data. The problem was that not all clients had updated, which is why the error occurred.

  .write != .writeWithoutResponse  

Happy to have fixed everything, we hurried back to testing, and they returned almost immediately: "Your fancy photos don't work. They all arrive incomplete." We started experimenting, and indeed, sometimes, on different devices and at different times, broken photos arrived. We began looking for the cause.

And here we saw the same kind of error again. At first we thought it was a version mismatch once more, but even after completely removing the old version from all test devices, the error still reproduced. We felt sad...

  CoreBluetooth [WARNING] Unknown error: 722
 CoreBluetooth [WARNING] Unknown error: 249
 CoreBluetooth [WARNING] Unknown error: 312  

We began looking for a debugging tool. The first thing we found was Apple's Bluetooth Explorer, a powerful program that can do a great deal, but for debugging the Bluetooth LE protocol it has just one small tab with device search and characteristic reading. And we needed to analyze L2CAP.

Then we found LightBlue Explorer. It turned out to be quite a decent program, albeit with an iOS 7-era design. It does everything Bluetooth Explorer does, can additionally subscribe to characteristics, and works more stably. All good, but again no L2CAP.

And then we remembered the well-known sniffer Wireshark.

It turned out Wireshark knows about Bluetooth LE: it can read L2CAP, but only under Windows. That we could live with; finding a Windows machine is not a problem. The bigger disadvantage is that it works only with a specific capture device, which we would have had to source from an official store. And you understand that in a large company no one is likely to approve buying a mysterious device from a flea market. We even started browsing overseas online stores.

But then we found the PacketLogger program in the Additional Tools for Xcode. It lets you watch the Bluetooth traffic on your OS X machine. So why not port our MoneyDrop to OS X? We already had a separate library; we simply replaced UIImage with NSImage, and everything ran within 10 minutes.



Finally we could read the packets the devices exchanged. It immediately became clear that at the moment of the L2CAP transfer, a write to one of the characteristics was taking place. And because the channel was fully occupied with transferring the photo, iOS ignored the write, and after being ignored the sender closed the channel. After we fixed that, there were no more problems with photo transfer.



That's all, thanks for reading :)

Useful Links


WWDC/CoreBluetooth:


Bluetooth


YouTube

  • Arrow Electronics → Bluetooth Low Energy Series
