The complicated relationship between Apple and OpenAI.
Altman, who has become the face of generative AI in the 18 months since the launch of ChatGPT, did not appear in Apple's official presentation, either in person or on the livestream. He was also absent from the private press event where Apple CEO Tim Cook and other executives discussed privacy and security, as well as the partnership with OpenAI.
"I was not surprised that Sam Altman did not appear on stage," said Ben Wood, an analyst at market research firm CCS Insight in an interview with CNN. "Apple had to carefully manage the messaging. OpenAI is simply a tool used to address broader AI-related inquiries that are not central to Apple's user experience. Including him in the livestream would have merely added confusion."
Earlier this week, Apple showcased several AI-powered features coming to iPhones, iPads, and Macs this fall - most of them driven by the company's homegrown technology, Apple Intelligence.
The firm will make OpenAI's viral ChatGPT tool available, but only in a limited capacity. ChatGPT will typically only be used when Siri needs more help answering a query.
Apple's decision to invite Altman and then keep him out of public view signals the cautious stance the company is taking with the partnership. OpenAI, along with other AI companies, has faced concerns from researchers, industry experts, and government officials over misinformation, bias, copyright, privacy, and security, among other issues. The deal also comes at a time when the industry is moving swiftly, and regulators, companies, and consumers are still figuring out how to navigate the technology responsibly.
Apple is hoping that a major push into AI can help boost sales of the iPhone, which has not seen a significant upgrade in recent years, prompting users to hold onto their devices longer. An uncertain economic climate, particularly in China, is also weighing on consumers.
The swift move also comes as Apple faces regulatory scrutiny in Washington. The company was recently surpassed in market cap by chipmaker Nvidia, but Apple's stock (AAPL) soared by as much as 10% in the 60 hours following the event, adding more than $300 billion to the company's market cap, overtaking Nvidia, and putting Apple back in contention with Microsoft for the largest market value.
The timing is significant: Apple doesn't always adopt and integrate emerging technologies quickly - it typically researches, develops, and then works to perfect them before incorporating them into new products - but the industry's rapid embrace of generative AI may be forcing the company to accelerate its timeline.
"Apple needed to present an AI narrative, and Apple Intelligence should help ease investor concerns and demonstrate that the company is keeping pace with its competitors," said Wood. "The OpenAI partnership is a significant development that enhances Apple's AI capabilities, and features like a significantly improved Siri are expected to be well-received by users.
Still, the partnership could expose Apple to potential vulnerabilities, since the company has no control over how OpenAI's models work or how OpenAI handles user inputs. The association with a company and a technology that have yet to earn the public's trust could also pose challenges down the line.
A strategic partnership
Although Apple has been working on its own AI program for several years, the partnership with OpenAI gives the company a way to fill gaps in its AI offerings.
When a user's question falls outside Siri's domain, ChatGPT can step in. In the demo following the keynote, Apple showed how someone could upload a picture of vegetables at a farmer's market and ask what they could make for dinner. Siri might suggest asking ChatGPT, since that question is better suited to it.
Apple's use of ChatGPT as a supplementary feature could lessen the associated risks. Furthermore, it's quite possible that Apple could collaborate with other AI providers in the future, such as Google's Gemini or niche providers that specialize in certain areas like healthcare.
"I believe Apple will take a practical approach to the OpenAI partnership," said Wood. "If the collaboration with OpenAI starts to impact the overall user experience or raise questions about security and data integrity, Apple may look to implement additional safeguards or explore different ways to deliver AI-powered services."
An emphasis on privacy and security
Apple has been adamant about prioritizing user privacy and security in its own AI technology, stating that most AI functions will be performed on the device itself and that user inputs will be kept off external servers.
"As we set out to build these extraordinary new capabilities, we want to ensure that the outcome reflects the principles that underpin our products," Cook explained during the presentation. "It must be strong enough to help you with the things that matter most in your life. It must be intuitive and easy to use. It must be deeply integrated into your product experiences. And, of course, it must be built with privacy, from the ground up."
Apple has stated that it will not share any personal user details with OpenAI, which means queries made through ChatGPT won't be linked to an Apple user's account. The repeated consent requests that require users to opt into using ChatGPT with Siri are also significant: each time Siri wants to redirect a question to ChatGPT, it will seek authorization first.
Wood considers these consent prompts and safety measures to be indicative of Apple's uneasiness.
According to Reece Hayden, a senior analyst at ABI Research, Apple's approach is wise because it gives users a choice in how their data is handled.
Hayden explained, "By providing a mixed approach that seamlessly combines ChatGPT and native capabilities, users will be less alarmed about the partnership. Apple can also remain focused on promoting their AI capabilities and mitigate some of the peril associated with partnering with OpenAI, which continues to undergo turmoil."
Industry worries
Companies like OpenAI have acknowledged the serious threats that AI presents, such as manipulation and a potential loss of control that could even lead to human extinction. However, several experts, researchers, and AI employees believe these organizations should be doing more to inform the public about potential risks and safety measures. Last week, a group of OpenAI insiders urged artificial intelligence companies to be more open about the concerns surrounding the technologies they're developing.
It was therefore no surprise that Apple's collaboration with OpenAI drew concern.
Elon Musk, for instance, expressed his disapproval, saying he would ban Apple devices at his businesses - Tesla, SpaceX, and X, among others - if the company proceeded with its AI plan. Musk called the integration of OpenAI at the operating system level a "dangerous security violation."
Although concern over how employees use AI models is widespread across industries around the world, Gartner analyst Annette Zimmermann believes Musk's reaction is somewhat misplaced. She noted, "Any employee with a smartphone should follow company protocols and refrain from inputting private information into ChatGPT's public domain. This isn't just a concern for iPhones... or Tesla."
Andrew Cornwall, a senior analyst at Forrester, believes it's unlikely that Apple users will become devoted to ChatGPT, since many won't use the service unless Apple cannot supply an appropriate response.
Cornwall said, "When users do query ChatGPT, Apple will track the prompts and gather metrics to improve its own models. In the future, Apple may change providers or use multiple third parties to enhance its capabilities. Once Apple has perfected its own AI model, the company will shut down the collaboration."