OpenAI
ChatGPT
ChatGPT can now read its responses aloud to users in the iOS and Android apps.
Lawsuits
Elon Musk sued OpenAI, claiming that it has deviated from its founding mission of developing artificial general intelligence for the benefit of humanity. In response, OpenAI published Musk’s messages to the company.
Anthropic
Claude 3
Anthropic has released a new family of models named Claude 3, available in Haiku, Sonnet and Opus variants. Anthropic claims improvements in speed and cost, performance on par with GPT-4, and strong vision capabilities. The models have a 200K-token context window, larger than most models but smaller than the 1M-token window offered by Gemini 1.5 Pro. The mid-tier Sonnet is available through claude.ai, whereas Opus requires a Claude Pro subscription. Note also that claude.ai is currently available only in certain regions.
Amazon investment
Amazon plans to invest up to $4 billion in Anthropic.
Mistral
Mistral released a new model called Mistral Large, with a 32K context window and multilingual support.
Microsoft
Mistral Partnership
Microsoft partnered with Mistral, diversifying its AI partnerships and taking a small stake in the company.
Inflection.AI
Microsoft has hired the founders of the AI startup Inflection.ai, Mustafa Suleyman and Karén Simonyan. They will lead Microsoft’s new consumer AI division.
xAI
Elon Musk’s xAI has open-sourced its LLM, Grok-1. xAI has also announced a more advanced model, Grok-1.5, though it is not yet clear whether this new version will also be open source.
Alani
Alani has released version 2.0 of its chatbot environment with an improved user experience.
Leonardo.ai
Leonardo.ai can now generate images with transparent backgrounds.
Apple
LLM
Apple has published a paper describing a multimodal LLM.
Partnerships
Apple is reportedly in talks with Google and OpenAI about potentially using their models.
Databricks
Databricks has released a new open-source LLM named DBRX, reportedly outperforming Claude 3 and GPT-4 on certain tasks while requiring less compute. It uses a Mixture-of-Experts (MoE) architecture with 132B total parameters and a 32K-token context window. It performs particularly well on mathematics and programming benchmarks.