Wooden Macropad

What is it?

It’s an open source electronics kit, the Adafruit Macropad, embedded in a solid block of quarter-sawn tigerwood.

What can it do?

It’s a programmable HID keyboard with an OLED display and rotary encoder, running CircuitPython, a hardware-specific, lightweight port of Python for microcontrollers.
The keys have RGB LEDs and can be programmed to send single or multiple keystrokes to the computer.

It shows up as a mountable drive, so you can live-edit the code.py file; when you save, the new code is automatically loaded. No compiling.
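For a sense of how simple the programming is, here’s a minimal sketch of the macro-table idea in plain Python. The names (MACROS, keys_for) are mine, not the Adafruit demo’s; on the device, code.py would loop over pressed keys and send each sequence using the adafruit_macropad library.

```python
# Hypothetical macro table: key index -> keystroke sequence.
MACROS = {
    0: ["CTRL", "C"],   # copy
    1: ["CTRL", "V"],   # paste
    2: ["GUI", "TAB"],  # switch application
}

def keys_for(index):
    """Return the keystroke sequence for a pressed key, or [] if unmapped."""
    return MACROS.get(index, [])
```

On the Macropad, the main loop would call something like `macropad.keyboard.send(...)` for each entry; because the board mounts as a drive, changing the table is just editing this dict and saving.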

Like other mice, keyboards, and peripherals, it’s powered over USB, so it does not run standalone.

The woodworking began with a solid block of South American tigerwood. Nice pattern and hardness, but I can’t say I enjoy the smell of this wood; it’s got a gluey stank that is not particularly enjoyable.

I used a bandsaw to take a 1/4 inch veneer off the top. This will be the cover.

The bottom base was hollowed out using a plunge router, a device you must respect. It can clear out a lot of material quickly, but at 10,000 to 30,000 RPM it can easily get away from you in a hurry. Safety first, always.

The two pieces were then carved and fitted to accept the Macropad. It’s a bit of a shame to seal up the beautiful silkscreen art of this particular PCB.

The fitted Macropad had one other addition: a cut-down, credit-card-sized plastic magnifying glass (lenticular magnification, I believe). The offset from the OLED gives the display a slight floating feeling.

Macrophotography of the Macropad

The project was finished with a couple of coats of Minwax Tung Oil Finish. Not a “true” tung oil, but it makes the grain pop while not filling the wood pores.

The seams are a little more visible than I would like, though from a distance it’s not that noticeable. I’ve learned that after separating the pieces with the “bandsaw box” technique, minimal handling is required to prevent “dings”.

As for programming, I’m mainly using the application hotkeys demo found in the Adafruit Learning Guide, but the sky is the limit: it can be programmed to do anything a keyboard or mouse can do.

Motivation to write this up comes from the recent Hackaday.io Odd Inputs and Peculiar Peripherals contest.

“Big Flood” lightbox

Blue LED animation in a steel laser-cut Coast Salish artwork

Using the Adafruit AdaBox #017, I added LEDs and an e-ink display to a laser-cut steel lightbox by Coast Salish artist Xwalacktun as a gift this year.

A miniature version of He-yay meymuy (Big Flood), it’s 30 cm tall with an 11 cm diameter. The original piece is an impressive 487.8 cm tall by 167.6 cm wide, made of aluminum. Located at the entrance to the Audain Art Museum, it’s a powerful piece inside a beautiful building.

I used a metre of RGB NeoPixels wrapped around a cardboard tube, diffused with bubble wrap, and plugged into the Adafruit MagTag to animate rain. In this mode the lightbox eventually fills to a full blue colour.

There are three other modes with different colours and animations using the Adafruit_led_animation library. Each mode animates differently and updates the e-ink display with a section of art from the lightbox.
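The mode cycling itself is simple. Here’s a rough sketch in Python; the class and the mode names other than “rain” are my own placeholders, and on the MagTag each mode would map to an adafruit_led_animation object plus an e-ink bitmap.

```python
# Sketch of a four-mode cycle; "rain" matches the mode described above,
# the other names are placeholders.
MODES = ["rain", "pulse", "comet", "sparkle"]

class ModeSwitcher:
    """Tracks the current animation mode and wraps around at the end."""

    def __init__(self, modes):
        self.modes = list(modes)
        self.index = 0

    @property
    def current(self):
        return self.modes[self.index]

    def next(self):
        # Advance to the next mode; this is where the e-ink display
        # would be refreshed with that mode's section of artwork.
        self.index = (self.index + 1) % len(self.modes)
        return self.current
```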

Adafruit MagTag eink display
Top View: With e-ink the last image will remain, even after removing power.

I even upcycled the spool from the NeoPixel strip to mount the MagTag, as it fit snugly. There are plenty of other features I’ve yet to take advantage of, like the built-in Wi-Fi and light sensor.

Art by Coast Salish artist Xwalacktun
296 x 128 px Indexed Colour .BMP used to display on e-ink

The base is made with glued pieces of western red cedar to mimic the architecture of the museum and carved to receive the artwork.

Grow Conference 2014

grow
I just attended the Grow Conference here in Whistler and wanted to share a couple of thoughts.

Billed as “an experiential playground exploring the future of innovation, growth and entrepreneurship”, the conference tagline is LIVING IN A CONNECTED WORLD. These themes are near and dear to me, so I was thrilled when they announced they were coming to Whistler this year. With the Lean pass on offer, I was able to secure a fairly priced ticket without needing to pay for transportation, etc.

The conference, which took place at the Fairmont Chateau Whistler, was well organized, and aside from a couple of audio and scheduling issues, everything ran smoothly. Alongside the conference was a two-day hackathon. The challenge: to create a “connected” resort town! I wish it had preceded the conference so that I could have participated in both.

The mix of technical and non-technical attendees kept the discussions mostly high-level. The people I tended to connect with were lower-level developer types. With my poor entrepreneurship skills, I was unable to secure millions of dollars in venture capital. I wasn’t actually trying to sell anything. Except maybe Whistler. I would offer local advice, help every lost attendee, and generally just say “how awesome is this place, eh?”. I probably wouldn’t have turned down 100k for my idea to start a hackerspace here in Whistler. In fact, I am trying to start a hackerspace right here in Whistler. Sadly, this town isn’t filled with geeks. There are only a few local tech companies, like Guestfolio and Ridebooker, and they represent a small percentage of the employers. No reason that can’t increase.

A few highlights and common themes:

Wearables and Internet of Things – These are definitely the hot topics getting all the attention. The Internet of Things (IoT), which everyone mutually agrees is a poor term, seems to be what you call any connected device that’s not a computer. Wearables, obviously, are worn. Think Google Glass, Recon Instruments, Fitbit, Nike+, etc.

Privacy and Security – These topics always come up immediately afterwards. As soon as you think “Cool, I can open my door from the internet”, you realize that theoretically anyone else can too. That sleep monitor, check-in data, and all the things tracking you for convenience can be mined, interpreted, and used against you. My opinion is that those who take this seriously will win out.
I was very impressed with SmartThings founder Jeff Hagins’ opinion on the subject and glad they are staying separate from Samsung after recently being acquired. I can’t say the same for the lax attitude of Life360, whose founder repeatedly made broad statements like “I don’t think your average user cares”. Their product GPS-tracks family members, by the way.

Data – Data, data, everywhere data. It’s not enough to just collect it; you need to use it. Inform.
Scale, infrastructure, APIs, and the other things that were once hard have since been figured out. Some companies’ whole business is collecting customer data. Once you have the data… (see Privacy and Security)

The best talk, and what resonated with me most, was Scott Jenson’s How to Make Everything Discoverable with the Physical Web.
In it, he smartly discussed how the physical web will need a mechanism unlike the current app model. The idea that every future smart thing would require its own smart app obviously has flaws.
Just Google “app fatigue” and you’ll see that for the past year or so the world isn’t very appy anymore. Yay web! And that’s the idea: broadcast URIs, no passive tracking. I’m looking forward to experimenting with this. You can too. Find out more, with examples, on GitHub.

Overall it was a great conference. I hope to be back again next year.

PABLO

Pablo is a physical chatbot. An open source social robot.

Pablo got his name from PyAIML Arduino Bluetooth Low-Energy Object, or something like that.

1. A Python program running on a host computer accepts input from a web form.
2. Input is interpreted using Artificial Intelligence Markup Language (AIML)
3. Response is sent via Bluetooth and spoken by Pablo.

Pablo is open source software and hardware. Code can be found on GitHub.

He can be found on Twitter here https://twitter.com/pablo_robot

Using the basic PyAIML example plus a simple web.py form, we are able to talk to PABLO. I still need to get the text-to-speech JavaScript API working. I was using the speech-to-text input feature in Chrome (x-webkit-speech), but it has since been deprecated.
AIML responses are constructed from a set of reduced answers to planned questions, e.g. “What’s your name?” “Who are you?” “What are you called?” = PABLO.
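A toy stand-in for that reduction step in plain Python (PyAIML does the real pattern matching; the table and function names here are purely illustrative):

```python
import re

# Several phrasings reduce to one canonical question, which maps
# to a single planned answer, mimicking AIML <srai> reductions.
REDUCTIONS = {
    "what is your name": "WHO ARE YOU",
    "what's your name": "WHO ARE YOU",
    "what are you called": "WHO ARE YOU",
    "who are you": "WHO ARE YOU",
}

ANSWERS = {"WHO ARE YOU": "I am PABLO."}

def respond(text):
    # Strip punctuation and case before looking up the reduction.
    key = re.sub(r"[^a-z' ]", "", text.lower()).strip()
    canonical = REDUCTIONS.get(key)
    return ANSWERS.get(canonical, "I don't know that one.")
```

An unknown question falls through to a fallback; in PABLO’s case that’s the moment the head does its confused-dog twist.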

NOTE: The USB cable in the eye is a temporary 5v power source only.
Not quite Natural Language Processing, but with random responses and recollection it can make for a convincing conversation.

On the physical side of things, PABLO is made up of:
Pablo in pieces

(Clockwise from top right)
Arduino Duemilanove microcontroller (Or any compatible)
Adafruit BlueFruit EZ-Link Bluetooth Shield
Emic2 text-to-speech Module
1000 mAh LiPo battery
2 Adafruit NeoPixel rings (I’m using one 16 and one 12 pixel)
Adafruit 3v Trinket
Adafruit 4-channel I2C-safe Bi-directional Logic Level Converter
2 Hobby servo motors with Pan-tilt brackets
Cardboard head with wire-spool LED diffusing eyes
8 Ohm speaker

An Arduino with the Adafruit EZ-Link Bluetooth Shield receives the response from the host computer. The response is interpreted, then commands are issued to the eyes, servos, and speech module. I used the proto area of the shield to connect headers so I can temporarily plug in the text-to-speech module, the two servo motors, and the level-converter connection to the eyes.

The eyes are controlled by a small microcontroller from Adafruit called the Trinket and are powered by the LiPo battery. They are self-supported and can easily be repurposed for other projects. I used a 16-pixel ring and a 12-pixel ring, which made some of the eye functions a little specific to this build. The logic level converter is used to receive commands over I2C from the 5v Arduino microcontroller using the TinyWire library.
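The actual I2C command set isn’t shown here, but the idea can be sketched as packing a command id plus a parameter into a couple of bytes. This is a hypothetical encoding, written in Python for clarity (the real link is Arduino Wire on one side and TinyWire on the Trinket):

```python
import struct

# Hypothetical command ids; PABLO's real ones are not documented here.
CMD_SET_COLOR = 0x01
CMD_BLINK = 0x02

def encode(cmd, param):
    """Pack a command and one parameter byte to send over I2C."""
    if not (0 <= cmd <= 255 and 0 <= param <= 255):
        raise ValueError("command and parameter must each fit in one byte")
    return struct.pack("BB", cmd, param)

def decode(payload):
    """Unpack a two-byte payload on the receiving (Trinket) side."""
    return struct.unpack("BB", payload)
```

Keeping the payload to a fixed two bytes makes the receive handler on an ATtiny-class part trivial, which matters when the eyes are meant to be reusable in other projects.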

Pablo

Everything is currently crammed into a cardboard box with a speaker pointed down into the mouth. A talking function randomly moves the jaw servo while Pablo is talking, opening and closing the mouth. This, combined with the advanced settings of the Emic2 voice module, makes for endless hilarity. A second servo twists the head briefly, as one might picture a confused dog, when an answer is not known. The datasheet from Parallax (PDF) shows you how to change the basic settings and take advantage of the more powerful DECtalk processor.

Lots of things to build on, still tons to do, not least of which is his “personality”.

I plan to document more of the details and code as I go because today I’m hoping Pablo can help me win a trip to space! If not, he’s about the size of a CubeSat and I’ll send him into space.

UPDATE: Pablo was honoured to make an appearance on Adafruit’s July 23rd Show and Tell!

GPS Glove with RGB LEDs

gps_glove

Thrilled to have been mentioned on Adafruit’s Ask an Engineer live webcast last week, I decided to write up more about my glove project. Here is a quick 6 second video.

Using Adafruit’s open-source Arduino compatible board, the Flora, a GPS module, and four RGB LED “pixels”, I adapted a North Face Hyvent glove to passively respond to my location on earth and to relay data.

flora_gps

Mostly for fun, but I can see where this could be useful to someone in certain situations. I’m discovering some limitations along the way as well.

The project is basically a fork of Adafruit’s own excellent Flora GPS Jacket tutorial. You can find their code on GitHub.

I’ve added coordinates for all the lifts on Whistler Blackcomb in an array that gets checked against my current location. If I’m within a specified range (10 m), the LEDs blink a calming red pulse (or warning).

GPS Coordinates of Whistler Blackcomb Lifts

//Creekside
#define GEO_LAT1 50.093744
#define GEO_LON1 -122.988872

//RED
#define GEO_LAT2 50.085781
#define GEO_LON2 -122.963975

//Peak
#define GEO_LAT3 50.066972
#define GEO_LON3 -122.951978

//Harmony
#define GEO_LAT4 50.067606
#define GEO_LON4 -122.931094

//Symphony
#define GEO_LAT5 50.058631
#define GEO_LON5 -122.918017

//Green
#define GEO_LAT6 50.084258
#define GEO_LON6 -122.941625

//Fitz
#define GEO_LAT7 50.112939
#define GEO_LON7 -122.953297

//W Gondola
#define GEO_LAT8 50.112906
#define GEO_LON8 -122.954214

//B Gondola
#define GEO_LAT9 50.113458
#define GEO_LON9 -122.953381

//Wizard
#define GEO_LAT10 50.115503
#define GEO_LON10 -122.947719

//Solar
#define GEO_LAT11 50.106319
#define GEO_LON11 -122.920436

//Catskinner
#define GEO_LAT12 50.099919
#define GEO_LON12 -122.915525

//7th
#define GEO_LAT13 50.078456
#define GEO_LON13 -122.895761

//Jersey
#define GEO_LAT14 50.106478
#define GEO_LON14 -122.900831

//Glacier
#define GEO_LAT15 50.106447
#define GEO_LON15 -122.899889

//Crystal
#define GEO_LAT16 50.109542
#define GEO_LON16 -122.90565

//Excelerator
#define GEO_LAT17 50.111767
#define GEO_LON17 -122.922981
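The proximity check against that table can be sketched like this, in Python for clarity (the glove itself runs the Arduino code from the Flora GPS Jacket tutorial; the haversine helper and names here are my own):

```python
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_M = 6371000  # mean Earth radius

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * EARTH_RADIUS_M * asin(sqrt(a))

# Lift coordinates from the table above (Creekside shown as an example).
LIFTS = {"Creekside": (50.093744, -122.988872)}

def near_a_lift(lat, lon, range_m=10):
    """True if the current GPS fix is within range_m of any lift."""
    return any(haversine_m(lat, lon, la, lo) <= range_m
               for (la, lo) in LIFTS.values())
```

At these distances a flat-earth approximation would do fine too, which is roughly what the Adafruit example uses; either way the loop is just “distance to each waypoint, compare to 10 m”.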

Altitude, time, location, and speed are displayed by a unique animation indicating the data type; the numbers are then blinked in base-10 sequence across the fingers. Digits on digits.

I’ve got it set up to also indicate when I’m above a mile high (1609 m) and to do other functions at precise times or locations. Get moving over twenty knots (37 km/h) and the lights flash alternating blue and red for a police-chase effect. I purposely let the glove do its thing and be passive rather than introduce controls to make it interactive.
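The digit-blinking and threshold logic can be sketched in Python as well (function names are mine; the glove runs the Arduino equivalent):

```python
def blink_counts(value):
    """Per-digit blink counts for a base-10 reading, e.g. altitude in metres."""
    return [int(d) for d in str(int(value))]

MILE_IN_M = 1609          # the mile-high threshold
TWENTY_KNOTS_KPH = 37     # ~20 knots, the police-chase threshold

def mode_flags(altitude_m, speed_kph):
    """Which special effects should fire for the current GPS reading."""
    return {
        "mile_high": altitude_m >= MILE_IN_M,
        "police_chase": speed_kph > TWENTY_KNOTS_KPH,
    }
```

So an altitude of 1609 m blinks 1, 6, 0, then 9 flashes across the four fingers, one digit per pixel.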

glove_snaps_pixels

Building the glove was made easy by its design, or so I thought. The mesh inside pocket on top of the glove easily houses everything, and the mesh makes weaving conductive thread easier. Working with conductive thread was *by far* the hardest part. Thread with a mind of its own. Nail polish as a knot sealer is the key here. Snaps sewn inside allow the Flora and GPS boards to be removed and used in other projects.

It’s holding up after a few days of wear-and-tear skiing and snowshoeing, except around the LEDs where the fabric is lifting. I’m still experimenting and fixing little things. Next steps are to add logging. Once stable, I’ll share it on GitHub.