
Apple and IBM Introduce Machine Learning Partnership

Apr 9, 2018 9:05:00 AM

Ask the average person on the street what IBM does, and you might get a blank stare, or perhaps “Didn’t they do that 'Jeopardy!’ thing a few years ago?” Once a household name whose mainframes, PCs, and typewriters could be found in nearly every large company around the world, IBM has mostly fallen off the pop culture radar in the last few years.

Well, here are a couple of things IBM has been up to lately: machine learning and cloud computing. And it recently announced a partnership with Apple to bring those technologies to iOS apps.

The Apple-IBM AI Partnership

Apple and IBM are two names not often mentioned in the same breath. IBM has long focused on the enterprise market, Apple on the consumer market. But this partnership makes sense for both companies:

  • Apple recognizes that the consumer smartphone market is pretty much saturated, especially the segment that can afford to plunk down $800 to $1,000 on the latest and greatest iPhone. It needs to make inroads in the enterprise market, and it has been adding enterprise-friendly features to its iOS platform.
  • IBM needs a way to sell its machine learning and cloud computing services, and leveraging an existing, mature mobile platform is a good way to do it.

The partnership’s offering, officially called Watson Services for Core ML, enables iOS app developers to combine Apple’s Core ML framework with machine learning models on IBM’s Watson Services platform and cloud services for model training. The stated goal is to bring more artificial intelligence to enterprise mobile apps.

How It Works

Suppose, to take a completely hypothetical example, excavation companies want an app to help job site supervisors quickly identify whether objects being dug up might be important anthropological or paleontological artifacts, so they can make on-the-spot decisions to stop work if something of probable scientific or historical value is found. An app developer, using Watson Services for Core ML, could develop a machine learning-enabled app using this process:

  1. Build a machine learning model using Watson and train it with a large number of images of artifacts, such as fossils, human and animal bones, and ancient pottery, tools, and other objects. The training images would be tagged not only with the name of the object but also with its age and the geographical areas where it might be found.
  2. Once the model is built and trained on Watson’s high-powered cloud servers, it can be incorporated into an iOS app using the Core ML framework. As part of the partnership, IBM is providing what it’s calling the Cloud Developer Console to simplify this process for developers.
  3. In the field, the end user snaps a photo of a found object, and the app tries to identify it. By incorporating geolocation data, it can filter out irrelevant potential matches (a Grecian urn, for example, is unlikely to be found at a building site in New Jersey).
  4. As the app is used by various end users around the world—not just excavation supervisors, but anthropologists and paleontologists—the app can upload photos of found objects to add to the database, thereby enabling developers to tweak the model. The updated model can then be incorporated into the next update of the app. It’s a continuous cycle in which the app gets better at identifying objects over time.
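
The geolocation filtering in step 3 can be sketched in a few lines of Python. Everything here is hypothetical (the class, function, region names, and data are invented for illustration; the actual Watson and Core ML output formats differ), but it shows the idea: after the model scores candidate matches, drop any candidate whose known geographic range doesn't include the place where the photo was taken.

```python
# Illustrative post-processing of a classifier's candidate matches (step 3).
# All names and region data below are hypothetical.

from dataclasses import dataclass


@dataclass
class Candidate:
    label: str            # object type the model proposed
    confidence: float     # model's confidence score, 0.0-1.0
    regions: frozenset    # areas where this artifact type is plausibly found


def filter_by_location(candidates, photo_region, min_confidence=0.2):
    """Keep candidates above a confidence floor whose plausible regions
    include the region where the photo was taken, best match first."""
    return sorted(
        (c for c in candidates
         if c.confidence >= min_confidence and photo_region in c.regions),
        key=lambda c: c.confidence,
        reverse=True,
    )


# A Grecian urn shouldn't survive filtering for a New Jersey job site.
candidates = [
    Candidate("grecian_urn", 0.61, frozenset({"mediterranean"})),
    Candidate("lenape_pottery", 0.48, frozenset({"northeast_us"})),
    Candidate("mastodon_bone", 0.33, frozenset({"northeast_us", "midwest_us"})),
]
matches = filter_by_location(candidates, photo_region="northeast_us")
```

In a real app the region lists would come from the tagged training data described in step 1, and the photo's region from the device's GPS fix; the filtering itself is cheap enough to run on the device.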

Among the advantages for developers is that they don’t need to be AI experts—they bring the business problem and the proposed solution, and the Watson service does the AI heavy lifting. Watson already has pre-trained image recognition models for many familiar categories of objects, so it’s possible that for many apps, the developer doesn’t even need to provide a training database. (In the example above, the customer probably would need to provide such a database, because it’s such a specialized area of knowledge.)

It turns out that IBM and Apple have been in cahoots for some years, developing hundreds of enterprise applications for the iOS platform. This new partnership takes that collaboration to the next level by incorporating machine learning and enabling third-party app developers to build the apps. As machine learning technology matures, more enterprise customers will see its potential benefits, and IBM and Apple are well positioned to meet that demand. This partnership is the only one of its kind so far, but it’s a safe bet that other initiatives are in the works. If IBM and Apple are doing it, can Google be far behind?

Written by Abdul Dremali

Abdul Dremali is a key content author at AndPlus and a driving force in AndPlus marketing. He was also instrumental in creating the AndPlus Innovation Lab which paved the way for the company’s leadership in Artificial Intelligence, Machine Learning, and Augmented Reality application development.