Mycroft.ai, which is building a home AI platform based on the Raspberry Pi, Arduino and an extensive in-house software stack, opened an important part of that stack to developers everywhere on Wednesday.
The project’s Adapt intent parser library, now available under the LGPL 3.0, is designed to convert natural-language requests into structured, machine-readable data – smoothly translating a spoken command into a usable set of instructions for a program.
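To give a flavor of what an intent parser does, here is a deliberately simplified, hypothetical sketch of the general technique – registering vocabulary keywords per intent and matching an utterance against them. This is an illustration of the concept only, not Adapt’s actual API; all names below are invented for the example.

```python
# Hypothetical, simplified illustration of keyword-based intent parsing.
# This is NOT Adapt's API -- just a sketch of the general idea:
# register vocabulary for each intent, then match an utterance against it.

def parse_intent(utterance, intents):
    """Return the best-matching intent and matched keywords for an utterance.

    `intents` maps an intent name to the set of keywords that trigger it.
    Returns None when no intent's keywords appear in the utterance.
    """
    words = set(utterance.lower().split())
    best = None
    for name, keywords in intents.items():
        matched = words & {k.lower() for k in keywords}
        if matched and (best is None or len(matched) > len(best[1])):
            best = (name, matched)
    if best is None:
        return None
    return {"intent": best[0], "matched": sorted(best[1])}

# Example vocabulary, loosely inspired by the media/smart-home use cases above.
intents = {
    "PlayMusicIntent": {"play", "pandora", "music"},
    "LightsIntent": {"lights", "hue", "dim"},
}

result = parse_intent("please play some music on pandora", intents)
# result -> {"intent": "PlayMusicIntent", "matched": ["music", "pandora", "play"]}
```

A real parser like Adapt goes well beyond this sketch – handling entity extraction, optional keywords and confidence scoring – but the core contract is the same: unstructured text in, a structured intent a program can act on out.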
Adapt has something of a home AI pedigree – its lead designer is Sean Fitzgerald, who previously worked on both Siri and Amazon Echo.
The first device to make use of Adapt will likely be Mycroft’s own reference units, which are designed to be home entertainment and IoT hubs for early adopters. Simple voice commands, processed by Adapt, will put media services like YouTube, Netflix, Pandora and many others at a user’s fingertips, along with smart home technology like SmartThings or Philips Hue.
The project launched on Kickstarter in August 2015 and reached its $99,000 funding goal within the original campaign period. Mycroft is still accepting additional contributions and pre-orders on IndieGoGo, and delivery of the devices is expected in April of this year. The team has pledged to make all of the code it develops for the device available under open-source licenses by the time the hardware is ready to ship.
Open-source AI has made headlines lately – Google released its TensorFlow machine learning framework in November 2015, and Microsoft followed suit soon afterward, opening up development on its own Distributed Machine Learning Toolkit within days.