Brave Enhances Leo AI with Mixtral Integration in Desktop Web Browser

TL;DR:

  • Brave has integrated the Mixtral 8x7B large language model (LLM) into its Leo AI assistant within the desktop web browser.
  • Mixtral 8x7B, an open-source LLM by Mistral AI, now serves as the default LLM in Brave Leo.
  • Brave aims to offer novel use cases and groundbreaking user interactions during browsing sessions.
  • Leo AI can summarize web pages, provide answers, generate content, translate languages, transcribe audio/video, and engage in conversations.
  • A free desktop version of Leo is available, with a Premium version offering enhanced features.
  • Mixtral 8x7B in Brave Leo can handle larger contexts, communicate in multiple languages, and generate software code.
  • It shows fewer hallucinations and less bias than competing models, as measured by the BBQ benchmark.
  • Users have the option to choose other LLMs or revert to the previous default, all within Brave’s private and secure ecosystem.
  • Leo AI will soon be available on mobile versions of Brave, aiming for feature parity with desktop versions.

Main AI News:

In a strategic move, Brave has unveiled its latest enhancement to the Leo AI assistant within its desktop web browser: the incorporation of the Mixtral 8x7B large language model (LLM). Launched just a month ago by Mistral AI, Mixtral 8x7B has quickly become the default LLM for Brave Leo. Since its inception, Brave Leo has garnered substantial traction among both free-tier users and paid subscribers, with expectations of even broader adoption now that Mixtral’s capabilities are integrated.

Brian Bondy, the co-founder and CTO of Brave, expressed his enthusiasm for this development, stating, “Our aim is to create novel and convenient use cases in the context of users’ browsing sessions, and to help our users interact with the Web in groundbreaking ways.” This integration marks another milestone in Brave’s journey to provide innovative solutions to enhance the browsing experience.

The launch of Leo in August 2023 marked a significant step forward, and since the native AI assistant arrived in Brave 1.60 in November, users have had access to a multifaceted tool. Leo’s functionality extends beyond web page summarization to content generation, language translation, audio and video transcription, and open-ended conversation. While a free version is available for desktop users, a Premium variant, priced at $15 per month, offers extended usage limits, access to superior LLMs, and higher-quality conversations.

As Mixtral 8x7B takes its place as the default LLM in Brave Leo, all users, whether free or Premium, can harness its capabilities. Compared to the previous default LLM, Mixtral boasts the ability to handle larger contexts, communicate proficiently in English, French, German, Italian, and Spanish, and generate software code. Its capabilities have recently powered the launch of the CodeLLM developer query service, further expanding its utility.
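Because Mixtral 8x7B is an open-source model, its code-generation and multilingual abilities can also be explored outside the browser. The snippet below is a minimal illustrative sketch, not Brave’s implementation of Leo: it assumes a self-hosted, OpenAI-compatible endpoint (for example, one started with vLLM at its default local address) serving the publicly released mistralai/Mixtral-8x7B-Instruct-v0.1 weights.

```python
# Illustrative sketch only; this is not how Brave Leo talks to Mixtral.
# Assumes a self-hosted, OpenAI-compatible Mixtral server, e.g. started with:
#   python -m vllm.entrypoints.openai.api_server --model mistralai/Mixtral-8x7B-Instruct-v0.1
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",  # assumed local vLLM endpoint
    api_key="unused",                     # local servers typically ignore the key
)

response = client.chat.completions.create(
    model="mistralai/Mixtral-8x7B-Instruct-v0.1",  # public Hugging Face model ID
    messages=[
        {
            "role": "user",
            # A French prompt asking for Python code exercises two of the
            # advertised strengths at once: multilingual chat and code generation.
            # ("Write a Python function that summarizes a text in three sentences.")
            "content": "Écris une fonction Python qui résume un texte en trois phrases.",
        }
    ],
    temperature=0.2,
    max_tokens=512,
)

print(response.choices[0].message.content)
```

Inside Brave Leo, none of this setup is required: Mixtral is simply the default model, and requests stay within the browser’s private ecosystem described below.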

Remarkably, Mixtral 8x7B surpasses competitors such as ChatGPT 3.5, Claude Instant, Llama 2, and numerous other LLMs, with fewer instances of hallucinations and biases, as evidenced by the BBQ benchmark. Yet Brave remains committed to offering choices to its users: Leo’s open architecture allows users to opt in to other LLMs, such as Claude Instant from Anthropic, or to revert to the previous default, Meta’s Llama 2 13B.

Crucially, in line with Brave’s core principles, Leo maintains privacy, anonymity, and security. It neither records conversations for model training nor requires user accounts or sign-ins. To access Leo, users need only type their queries into the Brave address bar and click “Ask Leo” at the bottom of the search suggestions dropdown.

In an exciting development, Brave intends to extend Leo’s reach to mobile users, with plans to introduce Leo to Brave’s Android, iPhone, and iPad versions. As detailed in the Leo roadmap, the objective is to achieve feature parity with the desktop versions of the browser, ensuring that users can seamlessly transition across their devices while enjoying the benefits of Leo’s AI capabilities. Stay tuned for further updates on Brave’s ongoing innovations.

Conclusion:

Brave’s integration of Mixtral 8x7B into its Leo AI represents a significant leap in enhancing the browsing experience. This move strengthens Brave’s position in the competitive browser market by offering users advanced AI capabilities, privacy, and a commitment to user choice in selecting language models. It sets the stage for further market growth as Brave expands its AI offerings to mobile platforms, catering to a wider user base.

Source