Apple Photos now labels photos using facial recognition and content analysis of landscapes and objects, so users can search and sort their libraries. In doing so, Apple copied what Google announced last year. This is good news for Apple users; no one wants to deny iPhone owners a better way to organize their photos. But anyone with a lick of sense about scientific research should also be offended.
Apple stood on the shoulders of giants to produce Photos. Photo labeling isn’t new: it was one of the first problems machine learning researchers tackled after character recognition. Almost all of the software and concepts needed to do it are available as open source.
No one wants to deny Apple the use of open source software. But Apple should give back; it should pay its taxes to support the community in which it thrives. It doesn’t, except perhaps through Apple-controlled open source projects like Swift. Who says so? Nicholas Negroponte, co-founder of the MIT Media Lab and One Laptop Per Child, does. Mashable’s Jenni Ryall wrote that Negroponte told the crowd at the World Business Forum in Sydney:
“[Apple] has not written one research paper or attended any external research meetings, such as working groups, government-funded workshops, or held their own onsite research meetings with external scientists in the way Google, Microsoft and Facebook often do.”
Apple does send its ants out from One Infinite Loop to research conferences to bring back innovation-sustaining ideas. But they never speak.
Apple’s most egregious offense is against the collegial, hyper-open artificial intelligence (AI) and machine learning research community. That community is so open that the research paper by Ronan Collobert, cited by Facebook as the inspiration for its DeepText project, included Jason Weston of Google as a co-author. So collegial that Google’s Geoffrey Hinton complimented Facebook’s AI research chief, Yann LeCun, when Fast Company published a story calling Facebook the Athens of AI.
Apple Photos could be built by a computer science graduate student using the open source Torch machine learning library. The choice of libraries isn’t limited to Torch, though. Facebook, Google, IBM, Microsoft and others have all made the fruits of their research available as open source software projects. They’re motivated to contribute to and grow the machine learning R&D community with the goal of inspiring more research and innovation.
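To see how low the barrier is, here is a minimal sketch in PyTorch (Torch's Python successor, used here instead of the original Lua Torch) of the kind of labeling model a graduate student might start from. The `TinyLabeler` class and its label set are hypothetical illustrations, not Apple's or anyone's production code; a real system would use a large pretrained network rather than this toy.

```python
# A minimal sketch, assuming PyTorch is installed. The model, layer sizes,
# and label names are invented for illustration only.
import torch
import torch.nn as nn

class TinyLabeler(nn.Module):
    """Toy CNN mapping an RGB photo to scores over a hypothetical label set."""
    def __init__(self, num_labels: int):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 8, kernel_size=3, padding=1),  # 3 input channels (RGB)
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),  # pool each feature map to one value
        )
        self.classifier = nn.Linear(8, num_labels)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h = self.features(x).flatten(1)  # shape: (batch, 8)
        return self.classifier(h)        # shape: (batch, num_labels)

labels = ["landscape", "person", "pet", "food"]  # hypothetical labels
model = TinyLabeler(num_labels=len(labels))
photo = torch.randn(1, 3, 64, 64)  # stand-in for a real photo tensor
scores = model(photo)
print(labels[scores.argmax(dim=1).item()])
```

An untrained toy like this would of course produce arbitrary labels; the point is that the scaffolding — model definition, forward pass, argmax over label scores — is a few dozen lines on top of an open source library.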
Apple may not be using these libraries; it could have built its own. Rewriting software under clean-room conditions to create a non-infringing, identically functioning copy dates back to Phoenix Technologies’ reimplementation of the IBM PC BIOS, which enabled the PC clones of the mid-1980s.
Apple would have been careful to make its machine learning code non-infringing, though, because if a court found it in violation of these projects’ open source licenses, it might be ordered to publish its code.
Apple’s secrecy hinders innovation
Apple’s clandestine approach to R&D impairs its ability to innovate. It is impossible to imagine Apple creating a project like Google’s Tango, which lets computers see 3D space the way humans do. After two less-than-perfect prototypes, Tango was improved and reduced in size through open collaboration, and Lenovo designed it into one of its latest phablets. Openness in R&D attracts developers and researchers who contribute their software and their experience. Open innovation also reveals the best job candidates through their contributions.
Apple boosts its R&D with free research, saving millions of dollars. But it should pay for it by contributing its research back to the community, not just for the good of the community but for the good of Apple.
Recruiting is just one of many examples of why Apple should open up. Consider a top newly minted AI PhD weighing machine learning offers from Apple, Google, and Facebook. He or she might rate the Google and Facebook offers equally, but Apple would be a distant third, because joining Apple would mean limiting his or her relationship with the worldwide AI research community.