Apple Invests Millions in Siri Improvements

Apple has ramped up its spending on artificial intelligence development, focusing on conversational chatbot features for Siri and reportedly pouring millions of dollars a day into research and development.

Reports in May revealed that Apple was hiring more engineers to work on generative AI projects. While the company made no official announcements about its plans, CEO Tim Cook said he finds generative AI “very interesting.”

However, the story of Apple’s push into generative AI begins well before May. Apple’s AI chief, John Giannandrea, assembled a team four years ago to work on large language models (LLMs), the technology underpinning generative AI chatbots like ChatGPT.

Ruoming Pang, who previously spent 15 years at Google, leads Apple’s conversational AI group, the Foundational Models team. The team has a substantial budget and reportedly spends millions of dollars a day training advanced LLMs. Despite having only around 16 members, its progress rivals that of OpenAI, which reportedly spent more than $100 million to train a comparable model.

According to The Information, Apple has at least two other teams working on language and image models. One group focuses on visual intelligence, generating images, video, and 3D scenes, while another works on multimodal AI that can handle text, images, and video.

Apple is currently planning to incorporate LLMs into Siri, its voice assistant. This would let users automate complex tasks with natural-language commands, similar to Google’s efforts to upgrade its own assistant. Apple engineers reportedly believe Ajax GPT, the company’s most advanced language model, outperforms OpenAI’s GPT-3.5.

Still, integrating LLMs into Apple products is fraught with difficulties. Unlike some competitors, Apple prefers to run software on-device for better privacy and efficiency. However, Apple’s LLMs, including Ajax GPT, are so large and complex that they cannot currently run on an iPhone.

There are precedents for shrinking large models: Google’s PaLM 2, for example, comes in several sizes, including one small enough for mobile devices and offline use, as the sketch below illustrates. While Apple’s plans are unknown, the company may opt for smaller LLMs to keep processing on-device for privacy reasons.
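To put the size problem in perspective, here is a minimal back-of-the-envelope sketch in Python. The parameter counts and precisions are illustrative assumptions, not figures reported for Ajax GPT or PaLM 2; the point is simply that a frontier-scale model’s weights dwarf a phone’s memory, while a smaller or quantized model can fit.

```python
# Rough memory estimate for storing LLM weights at different precisions.
# All parameter counts below are illustrative assumptions, not Apple's or
# Google's published figures.

def model_memory_gb(num_params: float, bytes_per_param: float) -> float:
    """Approximate weight-storage footprint in gigabytes."""
    return num_params * bytes_per_param / 1e9

illustrative_models = {
    "Frontier-scale model (~175B params, assumed)": 175e9,
    "Mid-size model (~7B params, assumed)": 7e9,
    "Compact on-device model (~1B params, assumed)": 1e9,
}

for name, params in illustrative_models.items():
    fp16 = model_memory_gb(params, 2.0)   # 16-bit weights
    int4 = model_memory_gb(params, 0.5)   # 4-bit quantized weights
    print(f"{name}: ~{fp16:.1f} GB at fp16, ~{int4:.1f} GB at 4-bit")

# A phone with roughly 6-8 GB of RAM can only plausibly host the smaller,
# heavily quantized configurations, which is the trade-off the article
# describes Apple weighing for on-device Siri features.
```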

In May, internal documents and unnamed sources revealed that Apple had restricted employee use of ChatGPT-like tools and was planning its own LLM.
