Apple AirTag anti-stalking protection bypassed by researchers

When the Apple AirTag hit the market in 2021, it immediately attracted the attention of hackers and reverse engineers.

Could AirTags be jailbroken? Could AirTags be simulated? Could the AirTag ecosystem be used for purposes beyond Apple’s own imagination (or at least beyond its intentions)?

We soon found ourselves writing up the answer to the “jailbreak” question, given that a researcher with the intriguing handle of LimitedResults figured out a way to subvert the chip used in the AirTag (an nRF52832 microcontroller, if you want to look it up) into booting up with debugging enabled:

Using this trick, another researcher going by @ghidraninja was able not only to dump the AirTag firmware, but also to modify and reupload the firmware data, thus creating an unofficially modified but otherwise functional AirTag.

To prove the point, @ghidraninja altered just one text string inside an AirTag, modifying the URL so it pointed not at Apple’s lost device reporting portal, but at a YouTube video (you know what’s coming) of Rick Astley singing Never Gonna Give You Up.

Anyone finding @ghidraninja’s AirTag and trying to report it lost…

…would get rickrolled instead.

Covert message delivery

A few days after the rickroll business, we were writing up another AirTag hack that documented how to create Bluetooth messages that could hitch a ride on Apple’s AirTag network.

Researcher Fabian Bräunlein, from Berlin, Germany, figured out a way to use almost any low-power Bluetooth system-on-a-chip, such as the well-known and inexpensive ESP32, as a message generator to send free (but very low bandwidth) messages via iPhones that just happen to be nearby.

ESP32 development system showing diminutive size.
The code on this board simply blinks the blue light. (Red denotes power.)

Every two seconds, regular AirTags broadcast an identifier via Bluetooth Low Energy; any AirTag-enabled iPhones in the vicinity that happen to pick up these broadcasts co-operatively relay them back to Apple’s AirTag backend, where they’re saved for later lookup.

To protect your privacy, the identifier sequence is keyed, or “seeded”, using a shared secret that is known only to the AirTag and the owner who originally paired their Apple device with it, and the identifier that’s broadcast isn’t the actual data generated in the sequence, but a hash of it.

This means that only the AirTag’s owner can check whether their AirTag called home to Apple, because only the owner knows what magic identification code would have been generated, and therefore only the owner can calculate the hash to look up in Apple’s database, which is essentially an anonymous, crowd-sourced record of AirTag broadcasts.

Additionally, the identifier used by any AirTag is updated every 15 minutes, following a pseudorandom sequence that only the AirTag and its owner can construct (or reconstruct later), so that AirTag sightings can’t be correlated in Apple’s database simply by looking for repeated broadcast hashes.
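In Python, the rotate-and-hash process described above might look something like this. (This is a minimal sketch, assuming an HMAC-based derivation; the function name, field widths and derivation details are illustrative, not Apple’s actual Find My cryptography.)

```python
import hashlib
import hmac

ROTATION_PERIOD = 15 * 60  # identifiers rotate every 15 minutes

def broadcast_id(shared_secret: bytes, timestamp: int) -> str:
    """Hypothetical sketch: derive the identifier broadcast at a given time."""
    slot = timestamp // ROTATION_PERIOD            # which 15-minute slot we're in
    raw = hmac.new(shared_secret, slot.to_bytes(8, "big"),
                   hashlib.sha256).digest()        # pseudorandom value in the sequence
    return hashlib.sha256(raw).hexdigest()         # only a hash of it goes on the air
```

Only someone who knows the shared secret can recompute these values for each time slot, which is why only the owner can work out which hashes to look up in Apple’s database.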

Bräunlein figured out how to use an ESP32 device to create correctly-anonymised broadcast messages that Apple’s network would relay and store.

Each of his “not-actually-an-AirTag” messages was encoded so that it included:

  • A unique value (repeated in each transmission in a batch) to denote a related series of data packets, which we’ll collectively refer to as D;
  • A sequence number (incremented by one every time) to denote a specific bit position in the current hidden message, which we’ll call X; and
  • A single bit (either zero or one) added at the end to make each transmitted identifier even or odd, denoting the value of bit X in packet series D, which we’ll call bitval(D,X).
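As a rough illustration, each packet in the scheme above could be laid out like this. (The field widths and helper names are ours, invented for clarity; the real Send My encoding differs in detail.)

```python
def encode_packet(d: int, x: int, bit: int) -> bytes:
    """Pack batch ID D, bit index X, and the bit's value into one payload.

    The final byte makes the resulting identifier "even" or "odd",
    signalling bitval(D, X). Hypothetical layout, not the real format.
    """
    assert bit in (0, 1)
    return d.to_bytes(4, "big") + x.to_bytes(2, "big") + bytes([bit])

def packets_for_message(d: int, message_bits: list) -> list:
    """One packet per bit of the hidden message, in sequence order."""
    return [encode_packet(d, x, bit) for x, bit in enumerate(message_bits)]
```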

Because he could precompute the hashes of every possible packet in any sequence, he could check which identifiers actually turned up for each one (he sent each message several times to increase the chance of it getting picked up and anonymously relayed to Apple).

If he only ever spotted identifiers where bitval(D,X) came out as zero, he’d know that his special-purpose ESP32 device was signalling that the Xth bit of D was zero; if he saw only identifiers with bitval(D,X) == 1, he’d know that the Xth bit of D was one.

If neither sort of message showed up, that would mean the Xth bit of the hidden message had been lost, but thanks to the sequence numbers, the rest of the message could nevertheless be recovered. If any evens turned up, then there could never be any corresponding odds; if any odds arrived, there could never be any corresponding evens, so the presence of one value and the absence of the other reliably signalled the intended setting for each bit in the hidden message.
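The receiving side of this trick can be sketched as follows: precompute the hash of every possible packet, then check which hashes actually showed up in the collected reports. (The packet layout and helper names here are illustrative assumptions, not the original Send My tooling.)

```python
import hashlib

def packet_hash(d: int, x: int, bit: int) -> str:
    """Hash of a hypothetical (D, X, bit) packet, as it would appear in reports."""
    payload = d.to_bytes(4, "big") + x.to_bytes(2, "big") + bytes([bit])
    return hashlib.sha256(payload).hexdigest()

def decode_message(d: int, length: int, seen_hashes: set) -> list:
    """Recover the hidden message bits from the set of hashes that turned up."""
    bits = []
    for x in range(length):
        if packet_hash(d, x, 0) in seen_hashes:
            bits.append(0)      # only the "even" variant arrived: bit X was zero
        elif packet_hash(d, x, 1) in seen_hashes:
            bits.append(1)      # only the "odd" variant arrived: bit X was one
        else:
            bits.append(None)   # neither arrived: bit X was lost
    return bits
```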

As you can imagine, the bandwidth of this “network”, which he humorously dubbed Send My, was poor: about 20 bits per second in throughput, with a waiting time of up to an hour for collected messages to wend their way to Apple’s servers.

Nevertheless, it did represent an essentially undetectable covert channel for tiny devices with tiny batteries to piggy-back onto Apple’s Find My network in an entirely innocent-looking way – no “giveaway” connections to Wi-Fi or the mobile phone network were needed.

Once more unto the breach

Well, Bräunlein is back in the AirTag news with a similar sort of “bogus but apparently innocent AirTag message” trick, this time designed not to sneak arbitrary data back via Apple’s network, but instead to deliver covert location information while preventing Apple’s network from generating its expected privacy warnings.

This one is cheekily dubbed Find You, and its primary purpose is to demonstrate the limits of Apple’s own “anti-find-you” programming, known as Tracker Detect, that’s now built into the AirTag network.

Apple’s system aims to provide basic protection against other people’s AirTags being hidden on your person, in your luggage, or on your car, and then used to keep tabs on you.

That’s because the anonymous, privacy-preserving system that’s supposed to ensure that only you can track your own AirTags if they’re lost or stolen…

…can be turned against you when it’s someone else privately tracking their tags that were neither lost nor stolen, but instead deliberately placed where their location data would denote your whereabouts.

Two main protections exist:

  • AirTags that haven’t been in direct connection with their owners’ devices recently start emitting an irritating noise through their small internal speaker. This not only helps genuinely lost AirTags get noticed, but also draws attention to AirTags in your vicinity (in your handbag/purse or rucksack/backpack, say) that shouldn’t be there.
  • AirTags that remain in your vicinity for some time but don’t belong to you pop up a warning on your device. While you’re walking around you’d expect to come across a random bunch of AirTags, but if a single tag that isn’t yours sticks with you when there aren’t lots of other tags coming and going, you’ll be warned.

Clearly, the first mitigation is far from perfect: you’re unlikely to hear an AirTag that’s attached to your car, for example; and, sadly, there’s a creepy online market for second-hand AirTags in which the speaker is broken. (By “broken”, we mean “deliberately and deviously disconnected or damaged to allow silent operation”.)

The second mitigation, of course, not only relies on you regularly checking for stalker alerts, but also on Apple’s software reliably deciding that a suspicious device is “standing out from the crowd” to a degree that’s worth alerting you about in the first place.

Indeed, the entire crowd-sourced nature of the Find My network relies on participants listening out for, detecting, and anonymously reposting AirTags that pass by, so that the genuine owner really can try to track them down (without knowing who submitted the report, of course) via Apple’s Find My portal.

In other words, turning every nearby appearance of every unknown AirTag into a “suspicious event” instead of simply quietly and anonymously calling it home would not only drive you nuts with false alarms, but also stop the Find My system from working as intended.

Instead, as Apple puts it:

If any AirTag, AirPods, or other Find My network accessory separated from its owner is seen moving with you over time, you’ll be notified.

Weakness in numbers

You can probably guess where this is going.

Bräunlein already knows how to create non-Apple-generated Find My messages that Apple’s network nevertheless accepts, in order to relay data of his choice.

This time, he created a plentiful supply of non-Apple-generated Find My messages and broadcast them to trick Apple’s “moving with you over time” detector into ignoring devices that were, in fact, right there with you all the time, essentially shrouding them in “crowd noise” that disguised the systematic presence of potential stalker devices.

Over a five-day period, Bräunlein had a volunteer carry one of his ESP32 “bogus message” generators, seeded with identification sequences for 2000 different pseudo-AirTags; every 30 seconds, the device broadcast as one of those 2000 tags, chosen at random.

Because Bräunlein could identify any or all of these pseudo-AirTags in the system, given that he knew all their identification seeds, he could reliably track his volunteer – without having to buy 2000 different AirTags and try to hide them all where they would neither get spotted nor be heard.
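On the tracking side, matching an anonymously reported hash back to one of the 2000 known seeds is just a brute-force recomputation over the current rotation slot. (A sketch under an assumed HMAC-based identifier derivation; the function names are ours and this is not Apple’s real key schedule.)

```python
import hashlib
import hmac

ROTATION_PERIOD = 15 * 60  # pseudo-tag identifiers rotate every 15 minutes

def id_for(seed: bytes, slot: int) -> str:
    """Hypothetical derivation of the hash a pseudo-tag would broadcast in a slot."""
    raw = hmac.new(seed, slot.to_bytes(8, "big"), hashlib.sha256).digest()
    return hashlib.sha256(raw).hexdigest()

def match_report(reported: str, seeds: list, slot: int):
    """Return the index of the pseudo-tag that produced a reported hash, if any."""
    for i, seed in enumerate(seeds):
        if id_for(seed, slot) == reported:
            return i
    return None
```

With 2000 seeds and one 15-minute slot, that is only a few thousand hash computations per report – cheap for the tracker, but invisible to anyone else, since each identifier still looks like random noise.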

But at no time during those five days did Apple’s Tracker Detect system warn the volunteer of suspicious repeated appearances of any of these pseudo-tags.

With each pseudo-tag broadcasting only every 30 seconds (not every 2 seconds as a regular one would); with 2000 pseudo-tags to choose from; and with tag identifiers changing by design every 15 minutes, we’re guessing that there wasn’t enough repetition or any obvious pattern for Apple’s stalker detector to latch onto.

What to do?

Unfortunately, there’s not much you can do to detect this sort of trickery at the moment, though we don’t doubt that Apple will revise its threat modelling and detection code in the light of Bräunlein’s report.

Bräunlein does mention a free app from the Technical University of Darmstadt, called AirGuard, that can give you a bit more insight into fake trackers of this sort, by revealing a full list of AirTags or pseudo-AirTags seen near you – even though they all look different, the noise generated by Bräunlein’s multi-tag reporting device does reveal unusual tracker patterns in your vicinity.

However, the AirGuard app is only available for Android, so if you’re using Apple AirTags with Apple phones and laptops, this won’t work for you.
