Report: Apple plans to make its large language models available to developers


Apple Inc. plans to let third-party developers build applications using its large language models, Bloomberg reported today.

The company is expected to announce the initiative at its upcoming WWDC conference.

Last year, Apple rolled out a suite of artificial intelligence features called Apple Intelligence to iPhones and Macs. The capabilities are powered by a series of internally developed models known as Apple Intelligence Foundation Models. According to Bloomberg, the iPhone maker plans to make some of those models available to developers via a “software development kit and related frameworks.”

Currently, Apple Intelligence supports a fairly limited set of tasks. The technology can rewrite text on the user’s display, proofread it and generate images from natural language prompts. There’s also a tool for summarizing notifications.

It’s unclear why developers might use Apple’s LLMs instead of the numerous alternatives on the market. Microsoft Corp. offers a family of open-source language models, the Phi series, that can run on hardware-constrained devices such as iPhones. Some of the algorithms in the model lineup are optimized for reasoning, which Apple Intelligence currently doesn’t support.

One way the iPhone maker could differentiate its models is by making them easier to use. Apple might, for example, offer prebuilt guardrails that filter inaccurate AI output.

The company could also integrate the LLMs with its Core ML framework, a tool that enables developers to incorporate third-party neural networks into iOS and Mac applications. Many software teams are already familiar with Core ML, so exposing Apple Intelligence’s LLMs through it could make them easier for developers to adopt.
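
For context, here is a minimal Swift sketch of what the Core ML workflow looks like today: loading a compiled model bundled with an app and running a prediction through the generic MLModel API. The model name “TextClassifier” and the “text”/“label” feature names are hypothetical placeholders, not an actual Apple model; an Apple Intelligence SDK would presumably slot into a similarly familiar pattern.

```swift
import CoreML

// Minimal sketch of today's Core ML workflow: load a compiled model
// bundled with the app and run a prediction through the generic
// MLModel API. "TextClassifier" and the "text"/"label" feature names
// are hypothetical placeholders for whatever model an app ships.
func classify(_ text: String) throws -> String? {
    guard let url = Bundle.main.url(forResource: "TextClassifier",
                                    withExtension: "mlmodelc") else {
        return nil  // model not bundled with the app
    }
    let model = try MLModel(contentsOf: url)

    // Core ML exchanges inputs and outputs as feature providers keyed
    // by the feature names declared in the model's schema.
    let input = try MLDictionaryFeatureProvider(dictionary: ["text": text])
    let output = try model.prediction(from: input)
    return output.featureValue(for: "label")?.stringValue
}
```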

Some of the models that power Apple Intelligence run on users’ devices, while others are hosted in the cloud. According to Bloomberg, the iPhone maker will initially make only the on-device algorithms available to developers. It’s unclear whether the more capable cloud-hosted LLMs will follow.

Apple detailed one of Apple Intelligence’s on-device language models at last year’s WWDC. According to the company, the model includes about three billion parameters. In an internal Apple test, it outperformed an open-source language model with more than twice as many parameters across most tasks.

According to the company, Apple Intelligence’s LLMs can learn new skills using so-called adapter layers. Those are small sets of neural network weights that can be trained separately from an LLM’s core components. Adapter layers make it possible to optimize an AI model for a new task at a fraction of the cost of retraining it from scratch or building a new neural network.
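
To illustrate the general idea (this is a conceptual sketch, not Apple’s actual architecture), a low-rank adapter adds a small trainable bottleneck alongside a frozen pretrained layer:

```swift
// Conceptual sketch of a low-rank adapter, not Apple's implementation.
// The large pretrained weight matrix stays frozen; only the two small
// projection matrices are trained for a new task, so the trainable
// parameter count drops from d*d to 2*d*r (with r much smaller than d).

// Multiply an (m x n) matrix by an n-vector, yielding an m-vector.
func matVec(_ m: [[Double]], _ v: [Double]) -> [Double] {
    m.map { row in zip(row, v).reduce(0) { $0 + $1.0 * $1.1 } }
}

struct AdapterLayer {
    let baseWeights: [[Double]]  // frozen pretrained weights, d x d
    var down: [[Double]]         // trainable down-projection, r x d
    var up: [[Double]]           // trainable up-projection, d x r

    // Output = frozen base transform + small trainable correction
    // routed through the rank-r bottleneck.
    func forward(_ x: [Double]) -> [Double] {
        let base = matVec(baseWeights, x)
        let correction = matVec(up, matVec(down, x))
        return zip(base, correction).map { $0 + $1 }
    }
}

// With d = 4 and r = 1, the frozen layer holds 16 weights while the
// adapter trains only 8.
let layer = AdapterLayer(
    baseWeights: Array(repeating: [0.25, 0.25, 0.25, 0.25], count: 4),
    down: [[0.1, 0.1, 0.1, 0.1]],
    up: [[0.2], [0.2], [0.2], [0.2]]
)
print(layer.forward([1, 2, 3, 4]))
```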

This year’s WWDC event could also see Apple debut other AI tools for developers. Earlier this month, Bloomberg reported that the iPhone maker has teamed up with Anthropic PBC to build an LLM-powered programming assistant. The tool is set to ship with Xcode, Apple’s integrated development environment for building iOS and Mac applications.
