Last Updated: 01/24/2025
Navigating the crazy, confusing world of Bluetooth Low Energy
Matthew Piercey
Part of a Series:
Matthew's Machinations: A series chronicling my hijinks across a bunch of random side-projects
Ah, Bluetooth. It’s not the fanciest, and it’s definitely not the easiest protocol to work with, but it’s certainly ubiquitous. If it’s not using Wi-Fi, any given “smart” device is probably using some form of Bluetooth for wireless communication. And these days, it’s probably going to be BLE, or “Bluetooth Low Energy”.
Going into this project, I had dabbled with Bluetooth for some projects before. Like using a Bluetooth adapter to control Arduino projects from a phone, or transferring files over Bluetooth from one laptop to another (though Wi-Fi direct is a far better choice whenever it’s available). But I never had success working with BLE. Thankfully (I guess?) that’s no longer the case, as this project has opened my eyes to the convoluted process of reverse-engineering a BLE device.
As frustrating as parts of this project were, I think the end result speaks for itself. As seems to be a bit of a tradition at this point, this blog post will outline my process of repurposing an otherwise decent product, held back by crummy software support. So, without further ado, let me begin.
My goal: Getting this thing to light up!
So, how did I find myself here in the first place, you ask? I’ve learned it’s generally better not to ask when a project lands in your lap, and to just take it on and see where it leads. As frustrating and ultimately rather frivolous as this project was (like many I take on), it was a learning opportunity like no other. It’s all just theory until you get out there and start experimenting, after all.
But every story has to start somewhere, so let’s do just that. Long story short, I found myself in the possession of a BLE-enabled 96×20 LED matrix. It’s basically a flexible PCB covered in a thin layer of resin. Pretty elegant design, really.
The one I got had the brand name “ATOTOZONE CI-VIALD19”, and it was the 19.4" x 4.3" model. For what it’s worth, I found similar products floating around on sites like AliExpress. For some reason, they are often marketed as signs for the back of a car window (I can’t speak to the legality of using it for that, but it seems dubious and unsafe for other drivers) or the front of a store/restaurant (a more reasonable use case). The marketing often depicts one or two blinking, menacing yellow eyes, or custom text. And the flexible nature of the matrix is often at the forefront.
The specifics don’t really matter, because I don’t know if my reverse-engineering work will be applicable to anybody else, but I figured I’d provide some context. If you have something that looks a bit like what I have, you might be able to use my code with it.
Anyway, I wanted to display some custom pixel art and GIF animations on it. Unfortunately, as I’ve encountered all-too-frequently, the compatible mobile app only allowed for choosing from a set of pre-baked animations. Some were OK, but I wanted to upload my own. Granted, the app also had a “custom text” function, but it was bare bones and not very customizable. And there was also a “draw” mode for making a custom pixel pattern, but it was very difficult to create anything but crude scribbles from the app’s interface.
I mean the "scribble" mode worked, I guess, but I wouldn't exactly call this "custom pixel art". Although you can't really blame me, since I had to draw this with my thumb.
Clearly the options offered by the app weren’t going to satisfy me. I wanted something truly custom. For better or worse, I became obsessed with turning this LED matrix into my own personal pixel art publisher.
I wasn’t sure where to start with something like this. I figured it would be relatively straightforward, just requiring a lot of trial and error. The first thing I tried was enabling the “HCI Snoop Log” on my Android phone.
There are some decent tutorials for that sort of thing online, but my basic process was as follows. I had enabled HCI Snoop Log in the Developer Options of Android settings. From there:
1. I used the app to connect to the LED matrix, and performed some basic functions, making a note of what I did
2. I connected my phone to my computer, and ran adb bugreport
3. After a while, that generated a .zip file, although the file I was looking for wasn’t where I was told it would be
4. I had to download this script to extract the Bluetooth log: https://github.com/nokia/rcm-bluez/blob/master/android_8_1/bt_mobile_android8_1/system/bt/tools/scripts/btsnooz.py
5. I then ran python btsnooz.py bugreport-XXXXX.zip > BTSNOOP.log (where bugreport-XXXXX.zip was the unique name of the bug report ZIP)
6. The BTSNOOP.log file was openable with Wireshark
7. I used the btatt filter in Wireshark, and that got me… something
Just figuring out this process was far more difficult than it should have been. And I’m used to dealing with deceptively difficult processes like this. Either way, this did give me something resembling a Bluetooth log.
You know you're in deep when this is your starting point.
At this point, I was (understandably) overwhelmed by all that was going on. BLE is a very messy protocol, as I had expected. But in the screenshot above, I was able to isolate what looked like a bunch of messages back and forth. My phone seemed to be sending data, then the screen would reply that it had received it. Given the actions that generated this log, I was pretty sure I had found where the actual animation upload process was occurring. Now, if I could only figure out how to do it without the app.
As it turned out, for reasons I don’t fully understand, the Bluetooth snoop log from my phone only painted a part of the picture. It showed me what was generally going on - which addresses were being used for sending and receiving data. But after looking deeper into the individual packet data, it didn’t make sense to me. I couldn’t tell where all of the data of the animation was actually being sent. It just looked like a (mostly) empty packet. It’s probably a limitation of Android’s Bluetooth logging capabilities (by design, thanks Google…) or maybe it’s my fault somehow. Either way, I was stumped, because I just couldn’t see where the data was being sent.
That’s when I stumbled upon this awesome three-part video series by Stuart Patterson on YouTube. It’s super underrated, as is unfortunately the case with his YouTube channel in general, but it’s a really practical way to get started with this stuff. As usual, the journey into this foreign land becomes easier when you have a good guide.
So, armed with practical knowledge of where to start (thanks Stuart!) I set out. I couldn’t find the exact model of USB dongle from the video, but I bought the closest thing I could find. Turns out it was fine; it’s a different chip that still supports BLE sniffing.
Yes, that’s right. The plan of attack was now to eavesdrop on the communication between my phone and the LED matrix. Powered by an nRF52833 chip from Nordic Semiconductor, this was:
My weapon of choice
Once it arrived (no thanks to Canada Post, who were on strike at the time in late 2024, so it was delivered by some no-name carrier) it took a bit of work, but I was able to set it up with Wireshark, the software tool of choice for electronic snoopers… I mean, reverse engineers.
Let me take a moment to shout out https://aur.archlinux.org/packages/nrf-sniffer-ble, a package on Arch Linux’s “AUR” (Arch User Repository) that made the setup almost trivial. Thank you to everyone who maintains that script. Nordic Semiconductor’s setup documentation isn’t terrible, but it’s also pretty confusing, so this was greatly appreciated.
But now that I was finally set up, it was time to take the sniffer for a spin.
The dongle showed up in Wireshark just fine
At first I didn’t really know what I was looking at. It just seemed to be Bluetooth LE devices broadcasting that they existed and were ready to be connected to. I worried for a moment that that was all I was going to get. I had been promised the ability to sniff individual packets sent between devices, and this didn’t look like it.
Thankfully, by enabling the sniffer toolbar from View → nRF Sniffer for Bluetooth LE, I was able to select the device I wanted to follow. I guess the sniffer needs to know which device it should be snooping on, because otherwise it would get overwhelmed if a lot of BLE devices were talking to each other at the same time. That makes sense.
But that also meant I had to capture a connection between the LED matrix and my phone before I could start seeing the raw communication between them. Thankfully, that was almost trivial after all of this setup. Then, by using the btatt filter in Wireshark, I was able to take a look.
What my Wireshark toolbar was looking like, following the device called YS5247172084 and filtering for BT ATT packets
After getting my bearings in Wireshark, which is a highly capable albeit intimidating program, I was able to capture a full animation’s worth of data.
And maybe because Android snooping is just limited, but definitely because the nRF dongle I was using was awesome, this time I actually got the full binary data of the animation. Neat!
Here's an excerpt of a CSV export of the animation's data. Note the Value column specifically, which keeps going a long way to the right
Alright! Now we’re getting somewhere. Did I have any clue what format this data was in? Nope! But was I sure the animation was in here somewhere? You bet!
I had purposely chosen a simpler animation from the app to start my analysis, because some longer/more complicated animations had several dozen packets whereas this one had just… one dozen.
After a lot of trial and error, using bluetoothctl on my computer and the nRF Connect app on my phone, I was able to piece together a code sample for sending the binary data back to the screen from my computer. It was particularly tricky finding the right “handle” to send the data to (basically a data address, but it’s way more complicated than it should be), but it worked.
I was able to send a copy of what I had captured back to the screen, by connecting directly to the screen from my computer via Bluetooth. Thus, the app had been bypassed completely. Except, well, this wasn’t all that interesting. This was basically no better than using the app, since I was limited to the “known good” animations captured from the app itself. No, I had to go further. But if I was going to do that, I had to make sense of the format of the data.
Going into this project, I knew this part was going to be tricky. Honestly, though, I wasn’t 100% sure if I could get this far from the start. The nRF sniffer dongle came in clutch, especially because it gave me exactly the data I was looking for. Though even if my phone’s Bluetooth log had worked for me, the sniffer was a much better option, since it let me quickly test things and “fail fast”. Instead of having to wait multiple minutes for the Bluetooth log to be generated and parsed, I could see what was happening in more or less real time. I could press a button on the app and watch my Wireshark window fill up with the intercepted commands and responses.
Still, if I wanted to do this project justice, I had to do some genuine software-side reverse engineering work. Something I’ve never claimed to be good at, and something I knew I wasn’t going to get much help with.
So, let me lay out the main challenge. Basically, I had to try to make sense of packets that looked like this (translated to a hexadecimal string, because I’m not crazy enough to try to read binary):
'''
aa55ffffed000000c1020901010c01000d01000e0100140301090a11040001000a12070c000000c400001381c447494638396160001400e66b001ad1ffe641ff23ff76ffa531ff57aa63faff0c85a351d0d523656707672b9326a312a349962f6214aed58755150553671bd5613da0a3a3681c67400d200324d58927a3356b3f0f26f553a3ff57a2241402f59f2f6b1978dd3ef54b0f543f25055c146757f9ff240614541735671e3e011c241b5254d43beb0e90b017c0eb1e6467671111b531c93aacb0a5478dbd7922541732195254872a54033f1854330956dbe0e04b9538093fc035050d873b2f9296055422a32122d853
'''
There’s gotta be an animation in there somewhere, I told myself. Thankfully, through the magic of capturing a bunch of packets, I had a pretty good set of samples to draw from. And that’s when I started noticing a sort of pattern.
'''
aa55ffff
ed
000000
c1020901010c01000d01000e0100140301090a11040001000a1207
0c
000000
c4000013
81c4
47494638396160001400e66b001ad1ffe641ff23ff76ffa531ff57aa63faff0c85a351d0d523656707672b9326a312a349962f6214aed58755150553671bd5613da0a3a3681c67400d200324d58927a3356b3f0f26f553a3ff57a2241402f59f2f6b1978dd3ef54b0f543f25055c146757f9ff240614541735671e3e011c241b5254d43beb0e90b017c0eb1e6467671111b531c93aacb0a5478dbd7922541732195254872a54033f1854330956dbe0e04b9538093fc035050d873b2f9296055422a32122
d8
53
'''
It started to look a lot more manageable, because clear divisions started to emerge. For the sake of clarity, and because I’d rather not have to relive all of the slog, the example above is separated far more cleanly than I had initially separated it. Because while some parts were obviously constant or clear patterns, others didn’t make as much sense at a glance. So instead of walking through the specific process (because I very much did not have one) let me cover some of the highlights.
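To make those divisions concrete, here's a small standalone sketch that slices a captured packet into the fields above. The offsets are just my own reading of the captures (there's no official spec to check against), so treat the field names as working labels rather than gospel.

```python
# Slice one captured upload packet (as a hex string) into its apparent fields.
# All offsets are in hex characters (2 per byte), based on my own captures.
def parse_packet(packet_hex):
    return {
        "magic": packet_hex[0:8],            # always aa55ffff
        "length": packet_hex[8:10],          # derived from the payload length
        "index": packet_hex[10:16],          # packet index
        "const1": packet_hex[16:70],         # 27-byte constant block
        "total_packets": packet_hex[70:72],  # total number of packets being sent
        "index2": packet_hex[72:78],         # packet index, again
        "const2": packet_hex[78:86],         # c4000013, purpose unknown
        "payload_len": packet_hex[86:90],    # 81c4 for a full 196-byte payload
        "payload": packet_hex[90:-4],        # the GIF data itself
        "checksum": packet_hex[-4:-2],       # CheckSum8 Modulo 256
        "last_byte": packet_hex[-2:],        # high byte of the total sum
    }

# The first packet of the capture shown above, reassembled from its pieces
packet = (
    "aa55ffff" + "ed" + "000000"
    + "c1020901010c01000d01000e0100140301090a11040001000a1207"
    + "0c" + "000000" + "c4000013" + "81c4"
    + "47494638396160001400e66b001ad1ffe641ff23ff76ffa531ff57aa63faff0c85a351d0d523656707672b9326a312a349962f6214aed58755150553671bd5613da0a3a3681c67400d200324d58927a3356b3f0f26f553a3ff57a2241402f59f2f6b1978dd3ef54b0f543f25055c146757f9ff240614541735671e3e011c241b5254d43beb0e90b017c0eb1e6467671111b531c93aacb0a5478dbd7922541732195254872a54033f1854330956dbe0e04b9538093fc035050d873b2f9296055422a32122"
    + "d8" + "53"
)

fields = parse_packet(packet)
print(fields["payload"][:12])  # -> 474946383961, i.e. "GIF89a" in ASCII
```

The payload starting with the ASCII bytes for "GIF89a" was the giveaway described below.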
One of my guiding principles when doing this sort of work is to remember that a person designed this. Someone, somewhere (or potentially multiple someones) sat down and decided that this was the way that this LED matrix would be interfaced with. With that in mind, and hoping that the only “security” at play here was security through obscurity, I started trying to make sense of the pattern.
It was at this point that I enlisted the “help” of ChatGPT to try to brute-force (or I suppose blunt-force) the rest of the protocol. Say what you want about LLM tools, but this is the kind of thing they’re best at, in my opinion. Menial grunt work that involves basic data manipulation. If you keep your expectations low, and give it a very specific prompt, you can sometimes get what you want. Solving complex problems is a no-go, so you have to break it up into byte-sized (hehe, see what I did there) chunks for it to try to digest. The longer you stay in a given chat, the more “confused” it’s going to get, but if you think of it like the data manipulation lottery, and don’t take anything it spits out too seriously, it’s sometimes better than nothing. Still, there was a time when I would’ve been completely stumped and would have given up, so I’m glad this option is available. LLM technology is a crummy panacea, but it makes a pretty great last resort.
For starters, it showed me that the animation was actually a GIF! Yes, the app was sending a GIF directly to the screen! I probably would’ve figured that out eventually using a tool like ImHex, but ChatGPT was able to tell me which parts of each packet the GIF payload was hiding in. It even gave me a GIF file with the data it had recovered, and it was playable!
So that made my job considerably easier, since I didn’t have to try to reverse-engineer the actual format of the binary data itself. I just had to figure out what was going on around it. Through trial and error, and the very helpful Checksum Calculator from Scadacore (https://www.scadacore.com/tools/programming-calculators/online-checksum-calculator/) I found out that the second-to-last byte in the sequence was the “CheckSum8 Modulo 256” of the preceding bytes.
This was another relief, and a lucky break, because it meant no weird encryption or complicated checksum calculation was at play. Here was proof that a clear, straightforward mathematical formula had been applied to this data. Great! The last byte was still a mystery, though. In hindsight, I probably could’ve figured it out if I was better with hex arithmetic, but I eventually figured it out in a round-about way.
The main breakthrough was realizing that I could send a packet to the screen, wait a set amount of time, and determine whether it had accepted it or not based on whether it had sent back a “value received” notification. So, I took the last byte off a known good payload and whipped up a script to add 00 to the end and see if it worked. If the screen rejected 00 as a final byte, I would try 01, 02, 03, … all the way to FF, and record which one worked. I was also able to construct an arbitrary payload with all 0’s, all 1’s, all 2’s, etc. for a more conclusive test.
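The search loop itself is easy to sketch. Below, send_and_wait stands in for the real BLE write followed by the notification wait (which I eventually did with bleak); fake_screen is a made-up device that only accepts one final byte, purely to show the mechanic.

```python
# Brute-force the unknown final byte: append each candidate 0x00..0xff to a
# known-good packet (minus its last byte) until the screen accepts one.
# send_and_wait is a stand-in for "write the packet, then wait briefly for a
# 'value received' notification"; returning False means the packet was rejected.
def find_last_byte(packet_without_last_byte, send_and_wait):
    for candidate in range(256):
        attempt = packet_without_last_byte + f"{candidate:02x}"
        if send_and_wait(attempt):
            return f"{candidate:02x}"
    return None  # the screen rejected all 256 candidates

# A made-up "screen" that only accepts packets ending in 0x53, mimicking how the
# real device behaved for the sample packet from my captures
def fake_screen(packet_hex):
    return packet_hex.endswith("53")

print(find_last_byte("aa55ffffed", fake_screen))  # -> 53
```

In practice each attempt costs a write plus a timeout, so the full sweep takes a few minutes per payload, but it only has to be done enough times to spot the pattern.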
More trial, error, captures, tests, and LLM abuse later, a reliable formula for calculating the final byte was found. It definitely helped to take a few weeks off of this project, though, because I was getting too frustrated with it and missing the forest for the trees. Finding the last byte was a huge breakthrough, and everything else pretty much fell into place after that.
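The way my final script computes them, the two trailer bytes boil down to one running total: sum every byte of the header plus payload, then take that sum modulo 256 for the second-to-last byte, and the same sum divided by 256 (its high byte) for the last one. A minimal sketch:

```python
# Compute the two trailer bytes from the header + payload hex string:
# the first is the sum of all bytes modulo 256 (CheckSum8 Modulo 256),
# the second is the same sum floor-divided by 256 (its "high byte")
def trailer_bytes(hex_string):
    total = sum(int(hex_string[i:i + 2], 16) for i in range(0, len(hex_string), 2))
    return f"{total % 256:02x}", f"{total // 256:02x}"

# Toy example: bytes ff + ff + 03 sum to 0x201, so the trailer is 01 then 02
print(trailer_bytes("ffff03"))  # -> ('01', '02')
```

In other words, the pair is just the 16-bit total split into low and high bytes, which is about as simple as a checksum gets.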
Although, there are still two bytes in the protocol that I don’t fully understand. They definitely relate to the length of the binary payload, and they’re 81C4 when the payload is a full 196 bytes of GIF data, but I couldn’t figure out the formula. Not to get too in-depth, but the GIF data is separated into chunks of up to 196 bytes each. Each chunk has an index, etc., and the “full chunks” have 81C4 as part of the packet header. But the last packet generally doesn’t have an exactly 196-byte payload, hence its “payload length bytes” aren’t 81C4.
The naive-but-effective solution was to just pad the last chunk with 0’s so it’s guaranteed to be 196 bytes, and 81C4 will work. From my testing, it doesn’t seem to affect the way the animation looks in any meaningful way, so I’d call it a win. KISS, as they say.
With that out of the way, this is my breakdown of what I believe the protocol’s data format to be. There are still a couple loose ends, but it’s good enough to be useful in practice. This is reverse engineering, after all.
'''
aa55ffff - Constant
ed - Derived from packet Length
000000 - Packet Index (Increments by 0x100)
c1020901010c01000d01000e0100140301090a11040001000a1207 - Constant
0c - Total Number of Packets Being Sent
000000 - Packet Index (Again)
c4000013 - Constant?
81c4 - Payload Length; Unknown Formula but 81C4 for 196-byte payload
47494638396160001400e66b001ad1ffe641ff23ff76ffa531ff57aa63faff0c85a351d0d523656707672b9326a312a349962f6214aed58755150553671bd5613da0a3a3681c67400d200324d58927a3356b3f0f26f553a3ff57a2241402f59f2f6b1978dd3ef54b0f543f25055c146757f9ff240614541735671e3e011c241b5254d43beb0e90b017c0eb1e6467671111b531c93aacb0a5478dbd7922541732195254872a54033f1854330956dbe0e04b9538093fc035050d873b2f9296055422a32122 - GIF Payload
d8 - CheckSum8 Modulo 256 of Preceding Bytes
53 - High Byte of Total Sum of all Preceding Bytes
'''
So yeah. That’s just about it. Here’s the final GIF Upload Python script I came up with. You can give it the name of a GIF under 49,980 bytes, and it will upload it to the LED matrix. It’s pretty quick, especially for static images or simple animations, although it’s not instant. And it doesn’t have all the “features” of the app. But it does everything I wanted it to do, so I’m happy with it (at least for now).
You can find the full project here on GitHub:
https://github.com/mtpiercey/ble-led-matrix-controller
#!/usr/bin/env python3
'''
* GIF Upload Script for Flexible 96x20 LED Matrix
* Copyright (c) 2025 Matthew Piercey
*
* https://github.com/mtpiercey/ble-led-matrix-controller
*
* MIT LICENSE:
* Permission is hereby granted, free of charge, to any person obtaining a copy of
* this software and associated documentation files (the "Software"), to deal in
* the Software without restriction, including without limitation the rights to
* use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of
* the Software, and to permit persons to whom the Software is furnished to do so,
* subject to the following conditions:
*
* The above copyright notice and this permission notice shall be included in all
* copies or substantial portions of the Software.
*
* THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
* IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS
* FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR
* COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER
* IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN
* CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
*
* Instructions:
* python gif_uploader.py somefile.gif
* - Where somefile.gif is the name of the GIF animation you wish to upload to the LED matrix
* - somefile.gif must be under 49,980 bytes (~48 KiB)
* - The screen will attempt to render it regardless, but it works best with animations that are 96x20 pixels
* - Ensure your computer is capable of connecting to BLE devices
* - Also ensure you have installed the bleak and tqdm modules as dependencies
'''

from bleak import BleakClient
from tqdm import tqdm
import argparse
import asyncio
import sys

# The screen will likely have this BLE address. Ensure it is not paired to any other device before running this script.
# Using an app like nRF Connect can help you find the addresses of all BLE devices in your vicinity.
DEVICE_ADDRESS = "ff:24:06:18:41:5f"

# For whatever reason, this maps to the right BLE handle for binary data sending, although Wireshark sees it as 0x000e
HANDLE = 0x000d

# The second-to-last byte is a CheckSum8 Modulo 256 of the preceding bytes (see https://www.scadacore.com/tools/programming-calculators/online-checksum-calculator/)
def checksum_mod256(hex_string):
    return hex(sum(int(hex_string[i:i+2], 16) for i in range(0, len(hex_string), 2)) % 256).upper()[2:].zfill(2)

# The last byte requires a specific checksum calculation, taking into account all previous bytes
def calculate_last_byte(hex_string):
    # Convert the preceding bytes to an array of integers and take their sum
    data_bytes = [int(hex_string[i:i+2], 16) for i in range(0, len(hex_string) - 2, 2)]
    total_sum = sum(data_bytes)

    # The last byte is the high byte of the total sum
    # TODO: Can this and checksum_mod256 be combined somehow?
    last_byte = total_sum // 256
    return f"{last_byte:02X}"

# This sequence ensures the screen is ready for a new animation to be uploaded
# (it uses the global `client` that main() sets up)
async def reset_screen():
    # Sending this byte array tells the screen to delete whatever animation(s) it was currently storing, so they can be overwritten
    # Technically the screen has the capability to store multiple animations and swap between them, but that went beyond the scope of this proof-of-concept
    await client.write_gatt_char(HANDLE, bytearray.fromhex("aa55ffff0a000900c102080200ffdc04"), response=False)
    await asyncio.sleep(0.5)

    # Sending this gets the screen ready to receive a new animation
    await client.write_gatt_char(HANDLE, bytearray.fromhex("aa55ffff0a000900c10208020000dd03"), response=False)
    await asyncio.sleep(0.5)

def generate_header(payload, index, animation_length):
    # The header always starts with this
    header = "aa55ffff"

    # This is the byte length, in hex, from the first page number below, until the second-to-last byte
    # (including the first checksum byte but not the last)
    # Hex byte length of packet plus 41 (40 bytes preceding, 1 byte trailing included in length calculation)
    header += hex(int(len(payload) / 2) + 41)[2:]

    # Page/packet number
    # 000000, 000100, 000200, 000300, 000400, 000500, 000600, 000700, 000800, 000900, 000a00, 000b00, etc.
    header += f"{index:04x}00"

    # Always constant
    header += "c1020901010c01000d01000e0100140301090a11040001000a1207"

    # The length of the animation, in number of packets, in hex - 0c means 12
    header += f"{animation_length:02X}"

    # Page number again
    # 000000, 000100, 000200, 000300, 000400, 000500, 000600, 000700, 000800, 000900, 000a00, 000b00, etc.
    header += f"{index:04x}00"

    # Seems to always be constant, not sure what it represents
    header += "c4000013"

    # TODO: Figure out how to calculate this byte sequence
    # Has something to do with the length of the payload (81c4 for a full payload, but lower if the payload isn't a full 196 bytes)
    # For now, file_to_hex_chunks is just padding the last payload with 0's, so these bytes will work
    header += "81c4"

    return header

# Given a binary payload, the payload index, and the length of the animation (in number of packets)
# Generate a packet (including a header and two-byte checksum trailer)
def generate_packet(payload, index, animation_length):
    # Header
    header = generate_header(payload, index, animation_length)
    full_value = header + payload

    # First byte of the checksum trailer (the CheckSum8 Modulo 256)
    checksum = checksum_mod256(full_value)
    full_value = full_value + checksum

    # Last byte of the checksum trailer (the high byte of the total sum)
    last_byte = calculate_last_byte(full_value)
    full_value = full_value + last_byte

    return full_value

# Split a GIF file into chunks
def file_to_hex_chunks(filename, chunk_size=392):
    try:
        with open(filename, "rb") as file:
            hex_string = file.read().hex()
    except OSError:
        print("Unable to open GIF file")
        sys.exit(1)

    # Naive GIF file validation
    if not (hex_string.startswith("GIF87a".encode("ascii").hex()) or hex_string.startswith("GIF89a".encode("ascii").hex())):
        print(f"{filename} is not a valid GIF file.\n")
        sys.exit(1)

    hex_chunks = [
        # TODO: The padding with 0's is currently necessary because the pre-packet byte sequence pattern isn't clear
        # So at least we can add some 0's, fill up the last packet so it's 196 bytes, and use the default "81c4" value in generate_header
        hex_string[i:i + chunk_size].ljust(chunk_size, '0')
        for i in range(0, len(hex_string), chunk_size)
    ]

    # TODO: Not sure if this is a hard limit, but it appears to be given how the length in number of packets seems to be a two-digit hex value
    if len(hex_chunks) > 255:
        print("Please select a smaller GIF file (under 49,980 bytes or ~48KiB)\n")
        sys.exit(1)

    return hex_chunks

# Naive BLE notification handling logic
# Basically just wait for any notification to come in from the device, to trigger the asyncio event
notification_event = asyncio.Event()

def notification_handler(sender, data):
    notification_event.set()

async def main():
    global client

    # Get the name of the GIF file to process from the CLI arguments (should be after the name of the command)
    parser = argparse.ArgumentParser(description="Script to upload a GIF to a flexible 96x20 LED matrix")
    parser.add_argument("gif", type=str, help="The name of the GIF file you wish to upload")
    args = parser.parse_args()
    GIF_FILE_NAME = args.gif

    hex_chunks = file_to_hex_chunks(GIF_FILE_NAME)

    async with BleakClient(DEVICE_ADDRESS) as client:
        if client.is_connected:
            try:
                # Start receiving indications
                INDICATIONS_UUID = "00002a05-0000-1000-8000-00805f9b34fb"
                await client.start_notify(INDICATIONS_UUID, notification_handler)
            except Exception as e:
                print(f"Failed to enable indications: {e}")
                sys.exit(1)

            try:
                # Start receiving notifications
                NOTIFICATIONS_UUID = "0000fff1-0000-1000-8000-00805f9b34fb"
                await client.start_notify(NOTIFICATIONS_UUID, notification_handler)
            except Exception as e:
                print(f"Failed to enable notifications: {e}")
                sys.exit(1)

            await reset_screen()

            print(f"Connected to {DEVICE_ADDRESS}")
            print(f"Uploading {GIF_FILE_NAME} (~{len(hex_chunks) * 196} bytes)...\n")

            progress_bar = tqdm(total=len(hex_chunks), desc="Progress", unit=" Packets")

            packet_index = 0
            for hex_chunk in hex_chunks:
                packet = ""
                notification_event.clear()

                try:
                    # Generate the binary packet to upload
                    packet = generate_packet(hex_chunk, packet_index, len(hex_chunks))

                    # Upload the packet to the screen
                    await client.write_gatt_char(HANDLE, bytearray.fromhex(packet), response=False)

                    # Naively wait for any notification, but it's likely that the notification will be because the current packet was received
                    # TODO: It may be possible to upload more than one packet at a time, and check the latest "value received" notifications so as to not overflow the screen's input buffer
                    # For now, this system works even if it may not be as fast as possible
                    await asyncio.wait_for(notification_event.wait(), timeout=0.75)
                    progress_bar.update(1)
                except Exception as e:
                    print(e)
                    print("An upload error occurred!")
                    sys.exit(1)

                packet_index += 1

            progress_bar.close()

            # Not really sure what this does (or why it's sent twice), but it seems to indicate to the screen that the upload has finished
            await client.write_gatt_char(HANDLE, bytearray.fromhex("aa55ffff0b000f00c10236030100001404"), response=False)
            await client.write_gatt_char(HANDLE, bytearray.fromhex("aa55ffff0b000f00c10236030100001404"), response=False)

            print("\nUpload successful!")
        else:
            print("Failed to connect to the device")
            sys.exit(1)

asyncio.run(main())
Here are some examples of custom GIFs being displayed:
An example GIF from https://learn.adafruit.com/animated-gif-player-for-matrix-portal/example-gifs
As with any project like this, it’s never super clear where and when to draw the line. You almost have to go into these sorts of things with a promise to yourself not to go further than a certain point, because you have to call it somewhere. But that’s hard, because you don’t always know what’s possible from the start, let alone what’s easy. So you also have to stay flexible, and learn to enjoy the process (at least a bit).
Still, I don’t think I would’ve embarked on a project like this if I hadn’t seen it as a learning opportunity like no other. This was genuine hardware and software reverse-engineering, and I’ll admit I do feel a little bit proud of the end result. It lets this beautiful LED matrix show its true colours (heh) instead of being locked behind a limited app. Which was also one of those infuriating apps that require you to create an account with your email, even for offline use! Joke’s on them; I used a temporary email for it.
If I were to make a roadmap for what I’d like to see in this project going forward, some kind of text animation would be nice. Probably doable by creating a GIF from text, then uploading that. Other commands like changing the brightness would likely be pretty easy to reverse-engineer as well. And technically this screen seems to be able to hold multiple animations at once, so being able to swap between pre-loaded animations would be neat. I could maybe make a custom desktop app for that, or use an ESP32 as a controller. Other than that, there’s not much more I’d add to this. Even in its current form, it’s a lot of fun, and the hardest part is definitely behind me. And hey, maybe someday someone will find my code on GitHub, realize they have one of these lying around, and give it a try. That’d be neat, especially if it’d give another one of these screens a new lease on life.
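For the text idea specifically, Pillow could get most of the way there. Here's a rough sketch of rendering a string into a scrolling 96x20 animated GIF ready for the upload script; the scroll step, colour, and default font are arbitrary choices of mine, not anything the screen requires:

```python
from io import BytesIO

from PIL import Image, ImageDraw  # Pillow

# Render a string as a scrolling 96x20 animated GIF, in memory.
# Frame step, timing, and colour are illustrative guesses, not requirements.
def text_to_gif_bytes(text, width=96, height=20):
    frames = []
    for offset in range(0, width, 8):  # scroll 8 pixels to the left per frame
        frame = Image.new("RGB", (width, height), "black")
        draw = ImageDraw.Draw(frame)
        draw.text((width - offset, 4), text, fill=(255, 160, 0))
        frames.append(frame)
    buffer = BytesIO()
    frames[0].save(buffer, format="GIF", save_all=True,
                   append_images=frames[1:], duration=100, loop=0)
    return buffer.getvalue()

gif_bytes = text_to_gif_bytes("HELLO")
print(gif_bytes[:6])  # the GIF header bytes
```

Write gif_bytes out to a file and the upload script should take it from there, as long as the result stays under the 49,980-byte limit.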
At any rate, this was another project for the books, and the skills I picked up along the way will probably lead to another BLE project or two. I’ve had some ideas in mind for a long time, but just never had what it took to make them a reality. Hopefully you’ve enjoyed following along through my recounting of this journey. Maybe even learned a thing or two, although I don’t claim to be a good teacher. If nothing else, I hope I was able to give you a bit of encouragement to just get started and see what happens!