For most of us, subtitles are what we use to understand what's going on in Kung Fu movies and Aaron Sorkin shows -- but a new invention from developer Will Powell is bringing them to new venues.
Powell's device, which he says was inspired by Google Glass, links a Raspberry Pi microcomputer to a set of augmented reality glasses with a microphone and built-in display to provide (almost) real-time subtitles for conversations being held in foreign languages.
MORE ADVANCED HEADWEAR: 9 high-tech glasses you might be seeing soon
Powell gained recognition back in April, when he developed his own home-brewed version of Google Glass, using webcams, Adobe AIR, and other tools to replicate the advertised functionality of Google's product.
MORE GOOGLE GLASS ALTERNATIVES: Can't wait for Google's Project Glass? Try these video glasses on for size
A smartphone or tablet is also required to relay the audio. Speech captured by a Bluetooth microphone passes to the mobile device, which sends it to a Microsoft translation API to generate the captions.
"Passing through this API service is the biggest delay in the subtitles," Powell wrote in a blog post.
Once the translation is made, the text is fed to the Raspberry Pi running the display in the glasses, and projected for the user to read.
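The pipeline described above can be sketched in a few lines. This is a minimal, illustrative mock-up, not Powell's actual code: `translate_text()` is a local stand-in for the remote Microsoft translation API call (the step Powell identifies as the biggest source of delay), and the phrasebook, function names, and display formatting are all hypothetical.

```python
import time

# Toy stand-in data; the real system translates arbitrary speech.
PHRASEBOOK = {"hola": "hello", "adios": "goodbye"}

def capture_audio_phrase(raw: str) -> str:
    """Stand-in for Bluetooth-mic capture; really a streamed audio signal."""
    return raw.strip().lower()

def translate_text(phrase: str) -> str:
    """Stand-in for the remote translation API call, the slowest stage."""
    time.sleep(0.01)  # simulate the network round-trip delay
    return PHRASEBOOK.get(phrase, phrase)

def render_subtitle(text: str) -> str:
    """Stand-in for the Raspberry Pi formatting text for the glasses' display."""
    return f"[subtitle] {text}"

def caption_pipeline(spoken: str) -> str:
    # Mic -> mobile device -> translation API -> Raspberry Pi -> display.
    return render_subtitle(translate_text(capture_audio_phrase(spoken)))

print(caption_pipeline("Hola"))  # -> [subtitle] hello
```

Even in this toy form, the structure makes the latency problem visible: every phrase blocks on the translation round-trip before anything reaches the display.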
While there's obviously a significant translation delay in the system, it's an impressive demonstration nevertheless. However, the need for all that associated hardware -- in the demo, a nearby TV appears to be displaying the translated text as closed captions alongside the glasses -- likely makes Powell's invention no more than a proof-of-concept device at the moment.
Email Jon Gold at firstname.lastname@example.org and follow him on Twitter at @NWWJonGold.
Read more about data center in Network World's Data Center section.