Apple teamed with IBM to bring Watson machine learning capabilities to the device vendor's Core ML application framework.

On its developer website, Apple explained developers will now be able to build apps which can access Watson’s computing muscle directly from an iPhone or iPad, even when the devices are offline. The company noted this will enable apps to quickly analyse images; classify visual content such as scenes, faces, colours, food and other objects; and train models using machine learning.
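As a rough illustration of what such on-device classification looks like with Core ML and Apple's Vision framework, the Swift sketch below runs a classification request against a model bundled with the app. The model name WatsonVisualModel is a placeholder for whatever .mlmodel file a developer ships; it is not an actual artifact described by Apple or IBM.

```swift
import CoreML
import Vision
import UIKit

// Illustrative sketch only: "WatsonVisualModel" is a hypothetical compiled
// Core ML model bundled with the app, not a real file from the partnership.
func classify(image: UIImage) throws {
    guard let cgImage = image.cgImage,
          let modelURL = Bundle.main.url(forResource: "WatsonVisualModel",
                                         withExtension: "mlmodelc") else { return }

    // Load the compiled Core ML model and wrap it for use with Vision.
    let mlModel = try MLModel(contentsOf: modelURL)
    let visionModel = try VNCoreMLModel(for: mlModel)

    // The request returns ranked labels with confidence scores.
    let request = VNCoreMLRequest(model: visionModel) { request, _ in
        guard let results = request.results as? [VNClassificationObservation] else { return }
        results.prefix(3).forEach { print("\($0.identifier): \($0.confidence)") }
    }

    // Inference runs entirely on the device, so no network connection is required.
    try VNImageRequestHandler(cgImage: cgImage, options: [:]).perform([request])
}
```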

In an example of a corporate application, the technology could be used to help a technician identify a broken piece of equipment and run a search for parts necessary for repair.

While no connection is needed for this on-device processing, IBM's general manager for the Apple partnership, Mahmoud Naghshineh, told TechCrunch that, when connectivity is available, data will be sent to Watson in the cloud to continuously improve the algorithms processing content at the edge.
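Neither company has published details of that feedback channel, but the pattern Naghshineh describes could look something like the hypothetical Swift sketch below, in which corrected labels are posted to a cloud endpoint whenever the network allows. The URL, payload and field names are all placeholders.

```swift
import Foundation

// Hypothetical sketch: the article does not describe a specific feedback API,
// so the endpoint and payload shape here are placeholders.
struct ClassificationFeedback: Codable {
    let imageID: String
    let predictedLabel: String
    let correctedLabel: String
}

func sendFeedbackIfOnline(_ feedback: ClassificationFeedback) {
    // Placeholder URL; a real integration would target whatever endpoint
    // IBM exposes for improving Watson models in the cloud.
    guard let url = URL(string: "https://example.com/watson/feedback") else { return }

    var request = URLRequest(url: url)
    request.httpMethod = "POST"
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    request.httpBody = try? JSONEncoder().encode(feedback)

    // If the device is offline the upload simply fails; on-device inference
    // continues to work regardless.
    URLSession.shared.dataTask(with: request) { _, _, error in
        if let error = error {
            print("Feedback upload deferred: \(error.localizedDescription)")
        }
    }.resume()
}
```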

Apple said developers will be able to start with pre-trained Watson models or customise and train their own artificial intelligence models to identify images based on their specific needs.
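Swapping a pre-trained model for a custom-trained one can be done at runtime with standard Core ML calls, as in the hedged sketch below. The remote URL is purely illustrative; the point is that a freshly trained .mlmodel file can be downloaded, compiled on the device and loaded in place of the bundled model.

```swift
import CoreML
import Foundation

// Hedged sketch: shows how a custom-trained model could replace a pre-trained
// one at runtime. The download URL is supplied by the caller and is illustrative.
func loadCustomModel(from remoteURL: URL, completion: @escaping (MLModel?) -> Void) {
    URLSession.shared.downloadTask(with: remoteURL) { localURL, _, _ in
        guard let localURL = localURL,
              // Compile the raw .mlmodel file into the runnable .mlmodelc format.
              let compiledURL = try? MLModel.compileModel(at: localURL),
              let model = try? MLModel(contentsOf: compiledURL) else {
            completion(nil)
            return
        }
        completion(model)
    }.resume()
}
```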

The addition of machine learning capabilities marks an extension of Apple and IBM’s existing partnership, which first offered developer access to Watson capabilities such as natural language processing in 2016.