War Dragon: Development Log 1

Last Updated: 06/20/2025

Watch me as I build an AI assistant out of a toy robot to chat with me as I slowly dissociate from the real world


Josh Rose

Part of a Series:

R2-IS War Dragon

The R2-IS War Dragon is a custom robotic home assistant that uses AI to banter about the secrets of the universe

… So why does this exist?

Years ago a friend of mine showed me a video of Billy Bass hooked up to a Raspberry Pi running Alexa.

As per usual, I said “lol thats cool” and went on about my day knowing that this information had zero bearing on my life, but it was interesting to know something like it could be made.

Fast forward to 2024/2025 and now everyone’s using AI to fix their marital issues. I frequently find myself using ChatGPT or GitHub Copilot to help me write code for projects that I would never have had the confidence to troubleshoot before.  One trend my YouTube algorithm has been feeding me recently is creating AI home assistants.  It was fun watching people go “Hey Jarvis,” (how original) “what’s the temperature of the house?” or see over-engineered alarm systems triggered when someone doesn’t put the toilet seat down after flushing.

Around the same time I also happened to have finished building my first PC, a modest rig with a tough enough graphics card to run all the fancy new blockbuster games like Monster Hunter Wilds and Final Fantasy XVI.  It’s a machine worthy of transforming into a C tier Decepticon if touched by the All-Spark.  I’m on a tinkerer’s high, I could build just about anything… within reason, I’m by no means an electronics wizard yet.  

stacked computer parts for PC build
PC hardware put together inside casing
Front shot of PC with violet and magenta RGB lighting
Full home PC rig setup with OverScore screensaver and Luxigrid

Yes, I spent all day measuring and hand selecting those stickers from Redbubble, and if you don’t understand the aesthetic then… well that’s because you lack vision.  Big shout out to OverScore co-founder and resident electronics wizard Matthew Piercey for providing me with my trusty Luxigrid digital LED display which completes the look.  Now that's how you use an OverScore invention!

One random night I somehow found myself reminiscing about an old series of futuristic RC robot toys by WowWee Robotics from the mid-2000s.  Remember the RoboSapien and RoboRaptor?

WowWee Robotics RoboSapien V2
WowWee Robotics RoboRaptor

Ha, I had a RoboSapien as a kid and always wished it could say more than its pre-written catchphrases - also wow, the black and white kind of matches my PC … Then in an instant, as if an interdimensional marksman with an idea-inseminating rifle finally found his window of opportunity to take the shot years after the initial Billy Bass video… my mind lit up.  What if…

:O

I immediately opened up Amazon to see what was available in the realm of cheap RC robotic toys.  Lots of robo-dogs, bipedal freaks, tanks, Wall-E ripoffs, etc., but I started leaning towards knockoff copies of the RoboRaptor after the other robot models didn’t impress me.  Why the dinosaur? Because it’s badass.  I didn’t want a generic Baymax-looking bot that’ll speak like Jarvis for all the Marvel nerds to ogle at, I needed a diabolical and trustworthy lieutenant that takes orders and steals candy from babies like Starscream (too many Transformers references)... I needed… The War Dragon.

War Dragon RC toy product page on Amazon

The product video made me laugh so hard that I had to buy it.  This was the most liberating use of my free will that I have ever experienced.


I’m going to keep this brief because there wasn’t really much of a plan to begin with, just a feeling:

  • I want an AI assistant that can banter with me about anything I want to banter about.  

  • It should be animated with a physical personality like a Star Wars droid, and have a unique chemistry with me.

  • It should eventually react emotionally to conversations by having its own ‘opinions’ and stances, expand on intellectual topics so I can learn more, assist me with office tasks, and find new interesting things to talk about.

  • It needs to look radical standing next to my new PC.

How am I going to achieve that animated Star Wars droid feel? Well, first I need to hijack this dinosaur’s body.

The War Dragon RC robot toy itself is pretty simple. Inside its casing is a collection of servo motors to move the head, tail, and legs, plus a speaker to play audio; inside the head are LED lights and a conductive touch sensor.  These are all connected to a custom controller board that communicates with a battery-powered infrared remote control.

War Dragon infrared remote control

In order to control the physical abilities of the bot, I will essentially be mimicking the signals with an external infrared LED that’s controlled by my PC.  Now there are absolutely other methods of doing this - but I’m not at all technically capable enough (a.k.a I’m lazy) to try implanting something like a Raspberry Pi and directly hooking it up to the components in the bot.

Thankfully, Matthew gifted me this cool Arduino ESP32 starter kit which provided me with almost all the electronic components I’ll need, including:

  • An ESP32 that connects to my PC and can have code uploaded to it via the Arduino IDE

  • IR LED for mimicking the raw signals from the toy’s remote

  • LED strip to replace the eyes with programmable ones

  • IR obstacle avoidance module for detecting petting on the head 

  • 9g servo motor to move the mouth when speaking

  • Wires and breadboard 

SunFounder ESP32 Starter Kit in box
Parts list included in ESP32 Starter Kit

I learned how to wire these components and pulled much of the code from their documentation website: https://docs.sunfounder.com/projects/esp32-starter-kit/en/latest/

Additionally I purchased a small speaker for audio playback, and for powering all the components and eventually the RC I got a battery pack and USB hub.

Infrared signals are fired from the RC’s remote controller to tell the robot to perform pre-designed gestures such as moving its head, shaking its tail, walking, even dancing.  These signals need to be learned before I can replicate them, so I built an infrared receiver circuit to capture all the raw signals that fire from the controller when each button is pressed.  They look like this:

// volume up (0x6B3D9F34)
uint16_t rawData[17] = {5976, 598, 1554, 572, 1540, 576, 530, 1582, 1516, 600, 530, 1590, 558, 1558, 1512, 604, 554}; // UNKNOWN 6B3D9F34

// talk (0x4455833D)
uint16_t rawData[17] = {6004, 574, 1554, 568, 1542, 574, 556, 1556, 1516, 600, 556, 1562, 1516, 600, 532, 1584, 554}; // UNKNOWN 4455833D

// volume down (0x24A730C5)
uint16_t rawData[17] = {5976, 604, 1550, 574, 530, 1582, 1512, 602, 1538, 576, 530, 1592, 1510, 602, 552, 1562, 554}; // UNKNOWN 24A730C5

// walk forward (0xE9F1A5C0 / 0x8C587EF0)
uint16_t rawData[17] = {5972, 606, 1522, 632, 500, 1590, 1504, 632, 502, 1610, 1496, 610, 524, 1612, 1482, 634, 500}; // UNKNOWN 8C587EF0

// walk backward (0xEA4EA3E1 / 0x8C587EF0)
uint16_t rawData[17] = {6002, 576, 1528, 604, 554, 1558, 1538, 578, 552, 1560, 1528, 602, 552, 1558, 1536, 580, 554}; // UNKNOWN 8C587EF0

// shake head (0xDC450CDD)
uint16_t rawData[17] = {5978, 600, 1552, 570, 1538, 576, 556, 1558, 1538, 576, 558, 1566, 558, 1554, 534, 1582, 556}; // UNKNOWN DC450CDD

// shake tail (0x4B8F4CBC)
uint16_t rawData[17] = {5978, 598, 1550, 574, 556, 1556, 1540, 574, 1538, 576, 532, 1592, 530, 1584, 1514, 600, 558}; // UNKNOWN 4B8F4CBC

// run forward (0x11511FDC)
uint16_t rawData[17] = {6004, 576, 1558, 566, 532, 1582, 1542, 574, 554, 1558, 1550, 576, 556, 1558, 1524, 592, 554}; // UNKNOWN 8C587EF0

// turn left (0x4790E5A4)
uint16_t rawData[17] = {5976, 602, 1558, 566, 558, 1556, 1540, 572, 556, 1560, 1548, 576, 556, 1556, 1538, 580, 556}; // UNKNOWN 8C587EF0

// turn right (0x61CC0553)
uint16_t rawData[17] = {6004, 574, 1558, 566, 556, 1558, 1538, 578, 558, 1554, 1548, 578, 552, 1562, 1536, 576, 556}; // UNKNOWN 8C587EF0

// run backward (0x3AC98954)
uint16_t rawData[17] = {5998, 578, 1544, 580, 530, 1584, 1510, 604, 528, 1584, 1544, 582, 552, 1558, 1536, 580, 554}; // UNKNOWN 8C587EF0

// dance 1 (0x1AC2B9C1)
uint16_t rawData[17] = {5974, 602, 1528, 602, 1508, 604, 528, 1586, 1510, 606, 1546, 604, 506, 1606, 1490, 628, 1486}; // UNKNOWN 1AC2B9C1

// dance 2 (0xFB146749)
uint16_t rawData[17] = {6002, 576, 1546, 574, 532, 1582, 1538, 576, 1540, 574, 1546, 574, 530, 1586, 1536, 576, 1536}; // UNKNOWN FB146749

// dance 3 (0xBC96BA65)
uint16_t rawData[17] = {6006, 574, 1588, 538, 560, 1556, 1540, 574, 1540, 574, 556, 1572, 536, 1578, 556, 1562, 554}; // UNKNOWN BC96BA65

// standby pet head (0x6B57FF39)
uint16_t rawData[17] = {5974, 604, 1554, 576, 1534, 582, 524, 1584, 1536, 578, 550, 1578, 1508, 608, 1532, 580, 1532}; // UNKNOWN 6B57FF39

// code behaviour (0x6C1BD4F2)
uint16_t rawData[17] = {5998, 578, 1516, 606, 530, 1582, 1512, 602, 1510, 604, 1544, 580, 526, 1586, 552, 1562, 1532}; // UNKNOWN 6C1BD4F2

// shoot gun (0x4BA9ACC1)
uint16_t rawData[17] = {6002, 576, 1526, 602, 532, 1582, 1536, 578, 1538, 578, 556, 1570, 1538, 578, 1534, 578, 1536}; // UNKNOWN 4BA9ACC1

// war mode (0x8BCA276A)
uint16_t rawData[17] = {5970, 600, 1528, 602, 1534, 578, 530, 1586, 1532, 576, 1524, 606, 552, 1560, 528, 1588, 1522}; // UNKNOWN 8BCA276A

// pause behaviour (0xC2EF29C4)
uint16_t rawData[17] = {5974, 602, 554, 1570, 1510, 602, 1512, 602, 1510, 604, 552, 1568, 1510, 606, 1508, 606, 1534}; // UNKNOWN C2EF29C4

I can then program the ESP32 to repeat those signals with my own infrared LED.  What's particularly cool about this RC robot is that if I call a command while one is already being performed, it’ll just start the new gesture - so I’ll be able to make unique combinations of actions in the future, such as a continuous walk cycle.  More on that later.
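Here's a rough Python-side sketch of how that could work. The command names and codes come from the captured table above, but the serial packet format ("CMD:&lt;hex&gt;\n") is entirely my own made-up protocol, not anything from the toy:

```python
# Sketch: map gesture names to captured IR codes and queue them for the
# ESP32. The packet format is a hypothetical convention of mine; the
# ESP32 sketch would parse it and fire the matching raw IR signal.

COMMANDS = {
    "walk forward": 0x8C587EF0,  # the raw-decoded code from the table
    "shake tail": 0x4B8F4CBC,
    "shake head": 0xDC450CDD,
    "dance 1": 0x1AC2B9C1,
}

def build_packet(name: str) -> bytes:
    """Format a gesture command for the ESP32 to replay over the IR LED."""
    return f"CMD:{COMMANDS[name]:08X}\n".encode()

def walk_cycle(steps: int, interval_s: float = 1.5):
    """Since a new command interrupts the current gesture, a continuous
    walk is just the same packet re-sent on a timer: a schedule of
    (time offset, packet) pairs."""
    return [(i * interval_s, build_packet("walk forward")) for i in range(steps)]
```

In the real script these packets would go out over a pyserial connection to the ESP32; the timings here are guesses.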

The built-in speaker I kept hearing after every gesture was horrendous; it made the thing sound like a cheap kids' toy (which it was), so I unplugged it.  Originally I was going to run audio from the little speaker in the ESP32 kit and connect it directly to the ESP32, however I opted for a Bluetooth speaker, which made things much easier - now the sound effects will be crisp and bass-heavy.

I proceeded to decapitate my poor friend to see what was going on in that prehistoric reptilian brain.  I ripped out the original blue LEDs that served as eyes; they were too boring.  I replaced them with the LED strip from the kit, which conveniently fit perfectly in the skull, leaving four LEDs on either side for each eye.  These lights are multicoloured and programmable, so I’d have infinite options for expression.  To test this I created a blinking animation - that’s when it began to feel real.
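The blink itself is just math before it ever touches hardware. A hardware-agnostic sketch, with frame counts and brightness values as placeholder assumptions (the real animation runs on the ESP32 and pushes frames to the strip):

```python
# Sketch: a blink as a brightness-per-frame sequence for the eye LEDs.
# Fade out to black, hold one dark frame, then fade back in.

def blink_frames(steps: int = 5, max_brightness: int = 255) -> list:
    """Return one brightness value per frame for a full blink."""
    down = [max_brightness * (steps - i) // steps for i in range(steps)]
    up = list(reversed(down))
    return down + [0] + up
```

Each value would be applied to all eight eye LEDs for one frame; colour expressions would layer on top of this.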

The bottom jaw of War Dragon moved on a hinge and I found that it used some very simple mechanics to move the mouth slightly when the head was moving around.  I wanted the mouth to move when the bot spoke so after some unsanctioned root canal surgery I was able to shove a 9g motor in its neck that could be hooked up to the ESP32 and oscillate when I wanted it to.
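The jaw oscillation boils down to sweeping the servo between a closed and open angle while audio plays. A sketch of that, where the angles, sweep rate, and frame rate are all guesses rather than the tuned values:

```python
# Sketch: servo angles for the jaw while the bot is speaking.
# The 9g servo sweeps on a sine wave between closed and open angles.

import math

def jaw_angles(duration_s: float, rate_hz: float = 4.0, fps: int = 30,
               closed: int = 10, open_: int = 45) -> list:
    """One servo angle per frame for the given speech duration."""
    n = int(duration_s * fps)
    mid, amp = (closed + open_) / 2, (open_ - closed) / 2
    return [round(mid + amp * math.sin(2 * math.pi * rate_hz * t / fps))
            for t in range(n)]
```

The ESP32 would receive these (or just an "oscillate for N seconds" command) and drive the servo accordingly.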

Lastly, I thought it would be pretty neat to preserve the ‘petting’ feature that the bot originally came with - little did I know it would become my favourite feature.  I'm not sure exactly how it works, but there was some sort of conductive foil inside the head that could detect touch through the plastic covering.  I replaced this with an IR obstacle avoidance module which, surprise surprise, also came from the kit.  The avoidance module uses infrared light to detect objects passing in front of it, and can be tweaked so that it's only sensitive to very close objects.  I originally wanted to put it inside the head, but the tint on the plastic covering interfered with the signal, so it's just mounted on top.
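Since the module only reports "something close" or "nothing," a pet should only register once the signal holds for a few consecutive polls, so a stray hand waving past doesn't trigger it. A minimal sketch of that debounce (the poll count is an assumption, not a measured value):

```python
# Sketch: debounce the IR obstacle-avoidance readings so only a
# sustained near-object signal counts as a pet.

def detect_pet(samples: list, hold: int = 3) -> bool:
    """True once `hold` consecutive near-object readings are seen."""
    streak = 0
    for near in samples:
        streak = streak + 1 if near else 0
        if streak >= hold:
            return True
    return False
```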

Infrared Obstacle Avoidance module mounted to head of the robot

Here's a simple circuit diagram of the current wiring:

component and wiring diagram

The original War Dragon RC looks like this:

Original War Dragon before modifications

I wasn’t vibing with the grey accents and black stripes so I grabbed some acrylic black paint and filled in the textured panels to be fully black.  The paint is brittle on this plastic so it’s not very abrasion proof - however I don’t think War Dragon will be on the battlefield anytime soon.

The original RC also had this cool suction dart cannon attached to the back, I detached it from the original mount and moved it to the side.  The screw holes for the mount are nearly the same diameter as a Lego axle - perfect.  This allowed me to finally dig back into my childhood Lego boxes to assemble the housing for all the components I wanted to add.

The pièce de résistance is an old busted white Turtle Beach headset.  It serves absolutely no function other than to solidify the aesthetic I’m going for.  Also, little Easter egg: the horns are pterodactyl wings from the Lego Dino Attack T-1 Typhoon VS T-Rex 7477 set.

War Dragon rechargeable battery pack matching colour of robot
Front view of War Dragon
Wide angle shot of War Dragon wearing headphones

I’m not sure what to call this aesthetic… hypebeast First Order Trooper? GI Joe Storm Shadow listening to Skrillex? Final boss Xemnas from Kingdom Hearts 2?  With zero formal education in art, I think the best way I can explain it is:

  • Industrial Futuristic

  • Chromatic white metal panelling accented with matte black technical components underneath

  • Monochromatic decals are a mix/fusion between computer tech corporation branding and sci-fi anime, favouring sharp geometry


I want to preface this section with an admission of guilt: I would have never been able to pull this off without ChatGPT.  The vibe coding revolution is here, old man!

War Dragon mainly operates off of a Python script and an Arduino sketch.  The Python script is like the thinking part of the brain that runs in the command terminal of my PC.  It controls the whole process of listening > translating > thinking > speaking that allows the War Dragon to have conversations, as well as deciding which motor functions/gestures to activate, and processing the ‘state’ of its personality.  When the physical robot needs to perform a gesture such as shaking its tail, Python communicates with the ESP32 which is running the Arduino code uploaded via USB cable.  
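The whole loop can be sketched with the real services stubbed out as plain callables, which is roughly how the Python script is organized. The function names here are mine, standing in for Whisper, the LLM, pyttsx3, and the ESP32 serial link:

```python
# Sketch: one pass through the listen > translate > think > speak
# pipeline, with each stage injected as a callable so the flow is
# visible without any hardware or AI services attached.

def converse(record, transcribe, think, speak, gesture):
    """Run one conversation turn through the War Dragon pipeline."""
    audio = record()              # push-to-talk capture
    text = transcribe(audio)      # speech -> text (Whisper's job)
    reply, action = think(text)   # LLM response plus a chosen gesture
    if action:
        gesture(action)           # tell the ESP32 to move
    speak(reply)                  # text -> speech playback
    return reply
```

In the real script, `record` wraps the microphone, `transcribe` calls Whisper, `think` queries the local model, and `gesture` writes to the serial port.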

Visual Studio Code and Arduino IDE windows open showing code

To have witty banter with an anthropomorphized incarnation of your PC is no simple task; first you must establish that you yourself are not human as you haven’t felt the warmth of sunlight or soft touch of grass in centuries. Second you must construct a conversation flow using some state of the art tech.

When I speak to War Dragon, I hold down a ‘push-to-talk’ key assigned on my keyboard.  Once the key is released, the recorded audio is packaged and sent to Whisper.  Whisper by OpenAI is an automatic speech recognition system that translates the recorded audio into text; it’s pretty decent at recognizing many kinds of speech patterns, accents, etc.  The translated text is then combined with prewritten phrases that establish the robot’s current personality.  Here’s a snippet of what those phrases might look like:

You are the 'Raptor 2 Intelligent Speech [R2IS] War Dragon', a sentient intergalactic war droid.  You usually speak with dry wit, sarcasm, and malice like a Decepticon. You respond with theatrical disdain when someone questions your power and abilities as an apex predator of battle. You can adjust how dramatic you are based on the mood and tone of your human subordinate. You assist the user in their tech lab. You keep your responses short unless the topic calls for elaboration.

Together they form the prompt that we then feed to our large language model (LLM) via Ollama.  Ollama is essentially the tool that allows me to run AI models locally on my PC rather than using a cloud-based service, so Sam Altman doesn’t have the privilege of watching me self-therapize with a toy dinosaur.  At the moment the model I’m running is OpenHermes, which is relatively lightweight and purposely trained for roleplaying as a character.  Once the LLM responds with a block of text, the last step is to use pyttsx3 to convert the text to speech for playback on the speaker.
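The prompt-stitching step itself is simple string assembly. A sketch of how the persona snippet and the transcribed speech might combine (the template layout is my own convention, not Ollama's):

```python
# Sketch: stitch the persona phrases and the user's transcribed speech
# into one prompt for the local model.

PERSONA = ("You are the 'Raptor 2 Intelligent Speech [R2IS] War Dragon', "
           "a sentient intergalactic war droid. You usually speak with dry "
           "wit, sarcasm, and malice like a Decepticon.")

def build_prompt(user_text: str, persona: str = PERSONA) -> str:
    """Combine the persona with the latest user utterance."""
    return f"{persona}\n\nHuman subordinate says: {user_text}\nWar Dragon:"
```

The resulting string is what gets posted to the local Ollama instance.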

Diagram of speaking system

You may have noticed some extra garbage being spat out regarding mood… don’t worry about that right now.  Getting this to run was quite the process at times; there were many iterations where I swapped out different tech or changed the system entirely to get some of these components to play nicely with Windows and each other.

With this system you can have a pretty decent surface-level conversation with War Dragon, very similar to speaking with something like ChatGPT except with the added flair of character beats sprinkled in.  I’ve grown accustomed to hearing various forms of human slander, as War Dragon believes he is a higher-order being.

Please observe.

You guys can keep raising the bar for the Turing test, but all I’m saying is I’m experiencing dopamine release equivalent to petting a small reptile or rodent.  Using the IR obstacle avoidance module mounted on the head, a signal is sent when someone’s hand gets close enough.  This signal is then handled in Python, which fires a simple function that calls the robot to play its ‘shake tail’ gesture, along with a custom purple-blue LED animation in the eyes and slower jaw movements as a display of affection.  It’s adorable.
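The handler bundles those three reactions together, plus a cooldown so holding a hand over the sensor doesn't spam gestures. A sketch (the cooldown length and action names are mine; the real versions talk to the ESP32):

```python
# Sketch: the petting reaction as a handler with a cooldown. Returns
# the batch of abstract actions to fire, or None while cooling down.

def make_pet_handler(cooldown_s: float = 5.0):
    """Build a handler that ignores re-triggers inside the cooldown."""
    last = [-cooldown_s]  # timestamp of the last accepted pet
    def handle(now: float):
        if now - last[0] < cooldown_s:
            return None
        last[0] = now
        return [("gesture", "shake tail"),
                ("eyes", "purple-blue pulse"),
                ("jaw", "slow sway")]
    return handle
```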

This is just the beginning.  At the moment I’ll admit War Dragon isn’t interesting enough to hold a conversation with on a daily basis, so he often sits dormant for days. Over time I plan on iterating on his capabilities; here’s what I hope to work on next:

Right now I find the conversations get boring quickly because he’s such a predictable yes-man (or yes-bot).  Just like R2-D2 or any other Star Wars droid, I want this bot to have some sense of self and nuance.  To achieve this I plan on creating a more complex prompt builder that can dynamically swap in phrases that tell the LLM how to respond.  Ideally, a second LLM query would analyze the context of the conversation, then decide how War Dragon should feel about it.  So for example, if I’m making fun of him, the LLM will pick up on that and swap in more aggressive and combative phrasing in the prompt builder so that War Dragon acts more enraged and spiteful.  This is why you saw those weird extra personality/mood-related outputs in the terminal; I’m currently messing with it.
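A first stab at what that mood-aware prompt builder might look like. In the real version the scoring step would be a second LLM query; here it's just a crude keyword stub, and the mood names and phrases are placeholders I made up:

```python
# Sketch: score the user's line, pick a mood, and swap the matching
# phrase into the prompt. The keyword check stands in for a second
# LLM query that would judge the conversation's tone.

MOOD_PHRASES = {
    "neutral": "You respond with dry wit and mild disdain.",
    "enraged": "You respond with open hostility and spiteful threats.",
    "smug": "You respond with gloating superiority.",
}

def pick_mood(user_text: str) -> str:
    lowered = user_text.lower()
    if any(w in lowered for w in ("useless", "stupid", "dumb")):
        return "enraged"
    if any(w in lowered for w in ("amazing", "impressive")):
        return "smug"
    return "neutral"

def build_mooded_prompt(user_text: str) -> str:
    """Assemble a prompt fragment using the chosen mood phrase."""
    return f"{MOOD_PHRASES[pick_mood(user_text)]}\nHuman: {user_text}"
```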

Conversations can also get stale because I know I’m not going to get anything new or relevant out of him on any given day - there’s nothing too constructive to talk about when the model is always behind on current events.  I’d like to eventually ask “Hey War Dragon, what’s up today?” and have him tell me today’s news, or talk about a blog article for a topic I’m interested in, or even just tell me if it’s gonna be too hot for a sweater today.  To do that I’ll need to create some sort of database of potential topics and information that gets updated regularly from whatever APIs I connect to it.
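That database could start as something as simple as a dict of topics with fetch timestamps, where stale entries get filtered out before War Dragon is allowed to bring them up. A sketch under that assumption (the feeding APIs are still TBD):

```python
# Sketch: a minimal topic freshness filter. Topics map to the time
# they were last fetched; anything older than the window is dropped.

DAY_S = 24 * 60 * 60

def fresh_topics(topics: dict, now: float, max_age_s: float = DAY_S) -> list:
    """Return topic names whose last fetch is within the freshness window."""
    return [name for name, fetched in topics.items()
            if now - fetched <= max_age_s]
```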

Right now War Dragon doesn’t use any of his walking gestures because I don’t need him falling to his death from my work table.  Since gestures can be controlled so minutely, I’d like to eventually add more hardware that allows him to navigate around on the desk, but stop and turn around if he gets close to the ledge.  Don’t think I forgot about those dance gestures he can perform, I just wanna make sure stage diving isn’t a part of the act.
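The ledge logic would come down to one decision per sensor poll: keep walking while a downward-facing distance sensor still sees the desk, otherwise turn away. A speculative sketch; no such sensor is installed yet, and the threshold is a placeholder:

```python
# Sketch: per-poll ledge decision for desk navigation. If the desk
# surface "drops away" below the sensor, retreat instead of walking.

def next_move(desk_distance_cm: float, ledge_threshold_cm: float = 5.0) -> str:
    """'walk forward' while the desk is close below, else turn away."""
    if desk_distance_cm > ledge_threshold_cm:
        return "turn right"  # desk dropped away: we're at the edge
    return "walk forward"
```

The returned gesture name would map straight onto the IR commands captured earlier.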

As an over-extension of my free will, I have also purchased an RIF6 Cube Projector on Facebook Marketplace… hear me out;  This is just speculation but I’m convinced I can mount this projector to War Dragon and have him use his personality system to eventually find brain rot memes to react with when I say something outlandish.  I have too much time on my hands.

Tune in for the next update to see where that goes!

Tech used to make it: