
Years ago, Cloudera co-founder Mike Olson declared, “No dominant platform-level software infrastructure has emerged in the last ten years in closed-source, proprietary form.” He was right, and Adobe’s Aaron Hardy has explained why: “It’s difficult to truly reach platform status without building and leveraging open source technology.”

That said, platform status is less a matter of source code than of open access and real value for the developers who will adopt it. For that matter, it’s not even just about developers.

Opening up data science

Take, for example, Google’s recent decision to release Kubeflow Pipelines and AI Hub. Google was already spreading the gospel of AI by open sourcing TensorFlow to help engineers get deep into ML code and then releasing AutoML, enabling less AI-savvy enterprises to build custom ML models.

SEE: Artificial intelligence: Trends, obstacles, and potential wins (Tech Pro Research)

Somewhere in the middle sit Kubeflow Pipelines and AI Hub, which enable data scientists to build and share ML resources. As described by Google’s Hussein Mehanna:

Kubeflow Pipelines are a new component of Kubeflow, a popular open source project started by Google, that packages ML code just like building an app so that it’s reusable to other users across an organization. Kubeflow Pipelines provides a workbench to compose, deploy and manage reusable end-to-end machine learning workflows, making it a no lock-in hybrid solution from prototyping to production. It also enables rapid and reliable experimentation, so users can try many ML techniques to identify what works best for their application.
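
For a concrete sense of what “packaging ML code just like building an app” looks like in practice, here is a minimal sketch of a two-step pipeline written with the open source kfp SDK (assuming its v1-style API); the step names, container images, and paths are hypothetical placeholders rather than anything Google ships.

# A minimal sketch of a Kubeflow Pipelines workflow using the kfp SDK (v1-style API).
# All names, images, and paths below are hypothetical placeholders.
import kfp
from kfp import dsl


def preprocess_op():
    # Each step is its own container, so it can be reused in other pipelines.
    return dsl.ContainerOp(
        name='preprocess',
        image='gcr.io/example-project/preprocess:latest',  # hypothetical image
        arguments=['--output-path', '/tmp/clean.csv'],
        file_outputs={'clean_data': '/tmp/clean.csv'},
    )


def train_op(clean_data):
    return dsl.ContainerOp(
        name='train',
        image='gcr.io/example-project/train:latest',  # hypothetical image
        arguments=['--data', clean_data],
    )


@dsl.pipeline(
    name='example-training-pipeline',
    description='Preprocess data, then train a model.',
)
def training_pipeline():
    preprocess = preprocess_op()
    # Wiring one step's output into the next defines the workflow graph.
    train_op(preprocess.output)


if __name__ == '__main__':
    # Compile to a package that can be uploaded to a Kubeflow Pipelines cluster.
    kfp.compiler.Compiler().compile(training_pipeline, 'training_pipeline.yaml')

Because every step runs in its own container image, the same components can be recombined into different workflows, which is what makes them reusable across an organization in the way Mehanna describes.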

The point of these two additions to Google’s AI arsenal isn’t open source, though the underlying Kubeflow (which Google released) is, of course, open source. Rather, the openness that matters most here is the ease of sharing models and other ML resources on AI Hub, and of experimenting with and building them in the first place as containerized ML components that data scientists can assemble in different configurations.

For this to work, a platform can’t simply be open in some vague sense; it also has to promise real utility to the data scientists or developers it is courting. Here Google thrives, but not everyone is doing as well.

Open but why?

Take, for example, Samsung’s decision to open up its voice assistant technology, Bixby. If you’ve not heard of Bixby, you’re not alone. In the land of Apple’s Siri and Amazon’s Alexa, Bixby is an also-ran. Opening it up won’t necessarily change this, though it’s arguably a necessary step on the way to Samsung’s vision of making Bixby the heart of an “AI-enabled ecosystem.”

SEE: The impact of machine learning on IT and your career (free PDF) (TechRepublic)

Amazon, too, opened up the tooling necessary to build Alexa Skills, and tens of thousands have been built. Yet few, if any, developers have figured out how to make those voice-activated apps useful or usable for mainstream users. Samsung has given developers tools without helping them answer the question, “Why would I bother to build anything in the first place?” Amazon got away with launching its platform before developers needed an answer, because voice was still a novelty. Now Amazon must answer that question, and Samsung doubly so, as a late entrant with far less reach.

Again, as Hardy has highlighted, open source is critical to platform success because it facilitates development and implementation, providing visibility into how the underlying code works. To truly build an ecosystem, however, any aspiring platform must answer why developers (or, in the case of Kubeflow Pipelines, data scientists) should care. Google has done that with aplomb in its recent announcements. Samsung? Not so much.
