This project began with fanciful dreams of an RX controller+receiver set (see: reactive programming) that parses an audio stream for data, transforms the dataset, runs tests against the transformed data to determine callback parameters, and then emits a message to the WS281x API.
This sort of RX design would allow humans (or machines) to drop in rules about musical data patterns, like using a pre-defined RGB palette in a certain key or modifying a hue transition scaler when the data stream is above a threshold in the time domain (e.g. BPM > 90).
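To make that concrete, here is a minimal sketch of two such rules in Python. Every name here (AudioFeatures, pick_palette, hue_speed) is hypothetical, purely illustrative of the rule shapes the RX layer might accept, not part of any library:

```python
# Hypothetical rule sketch: map analyzed audio features to LED callback
# parameters. All names and palette values here are illustrative.
from dataclasses import dataclass

@dataclass
class AudioFeatures:
    key: str    # detected musical key, e.g. "C minor"
    bpm: float  # beats per minute from the time-domain analysis

# Pre-defined RGB palettes keyed by musical key (made-up values).
PALETTES = {
    "C major": [(255, 200, 0), (255, 120, 0)],
    "C minor": [(0, 80, 255), (80, 0, 160)],
}
DEFAULT_PALETTE = [(255, 255, 255)]

def pick_palette(features: AudioFeatures):
    """Rule 1: use a pre-defined RGB palette when the key is recognized."""
    return PALETTES.get(features.key, DEFAULT_PALETTE)

def hue_speed(features: AudioFeatures, base: float = 1.0) -> float:
    """Rule 2: double the hue-transition scaler above a BPM threshold."""
    return base * 2.0 if features.bpm > 90 else base
```

In the full design, functions like these would be the "tests against the transformed data" stage, and their return values would become the parameters handed to the emit step.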
First, I’ll need to build a working hardware environment and a controller+receiver pattern.
Base Case (high-level summary)
- A strip of LEDs mapped to Raspberry Pi GPIO
- A controller application that emits web socket messages
- A high-level wrapper around the WS281x spec + Raspberry Pi PWM
- A receiver application that subscribes to web socket messages and whose callbacks invoke the WS281x API
A very basic example might allow the client/controller to set RGB values via the CLI, pushing an event to a WSGI wrapper around the WS281x API.
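The controller/receiver halves of that base case meet at a message contract. A minimal sketch of that contract follows; the "set_rgb" event name and the payload shape are assumptions for illustration, not a fixed protocol. In practice the controller would emit this over Socket.io and the receiver's handler would pass the parsed values on to the WS281x wrapper:

```python
# Sketch of the controller -> receiver message contract (assumed shape).
import json

def build_set_rgb_event(r: int, g: int, b: int) -> str:
    """Controller side: serialize CLI-supplied RGB values into a payload."""
    for name, value in (("r", r), ("g", g), ("b", b)):
        if not 0 <= value <= 255:
            raise ValueError(f"{name} must be 0-255, got {value}")
    return json.dumps({"event": "set_rgb", "data": {"r": r, "g": g, "b": b}})

def parse_set_rgb_event(raw: str) -> tuple:
    """Receiver side: unpack the payload into the (r, g, b) tuple
    a WS281x callback needs."""
    data = json.loads(raw)["data"]
    return (data["r"], data["g"], data["b"])
```

Validating the 0-255 range on the controller side keeps bad values out of the pipe before they ever reach the hardware.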
Hardware
- WS2812 LED Strip, like NeoPixel RGB White 30 via Adafruit.
- 3V to 5V level shifter 74HCT125 via Adafruit.
- Female/Male jumper wires
- Male/Male jumper wires
- Breadboard, like half board via Adafruit.
- Power supply. I’m using 5V 4A (4000mA) via Adafruit.
- Power Jack via Adafruit.
- Raspberry Pi 2
- Raspberry Leaf reference sheet. This is going to save me so many silly mistakes over the course of this project!
Libraries & APIs
- Socket.io will handle pub/sub patterns between the controller and receiver.
- rpi_ws281x will drive the LEDs; it provides both a SWIG-generated interface and a high-level Python API.
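As a rough sketch of what the receiver callbacks will eventually wrap, here is the shape of the high-level rpi_ws281x API. The `pack_color` helper mirrors the 24-bit packing the library's `Color()` helper performs; the strip parameters (30 LEDs on GPIO 18) match the hardware list above, but treat the whole `main()` body as an assumption to be verified against the library on a real Pi:

```python
# Sketch of high-level rpi_ws281x usage (hardware required to actually run).
def pack_color(red: int, green: int, blue: int) -> int:
    """Pack 8-bit RGB channels into the single 24-bit int WS281x expects,
    matching rpi_ws281x's Color() helper."""
    return (red << 16) | (green << 8) | blue

def main():
    # Only runs on a Pi with the library installed and the strip wired up.
    from rpi_ws281x import PixelStrip
    strip = PixelStrip(30, 18)  # 30 LEDs on GPIO 18 (PWM)
    strip.begin()
    for i in range(strip.numPixels()):
        strip.setPixelColor(i, pack_color(255, 0, 0))  # everything red
    strip.show()

# Call main() on the Pi itself; off-hardware, only pack_color is usable.
```

Keeping the color packing separate from the hardware calls means the message-handling logic can be tested on any machine, with only `main()` needing the actual Pi.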