NEC unveils a new AI strategy focused on expanding its lightweight large language model 

TL;DR:

  • NEC unveils a strategic plan centered around enhancing its lightweight large language model (LLM) for generative AI.
  • The company aims to customize AI solutions for specific business needs across industries like healthcare, finance, and manufacturing.
  • NEC is developing specialist models to stimulate business transformation and encourage widespread generative AI adoption.
  • NEC has doubled its high-quality training data, and its LLM now outperforms leading models in Japanese dialogue evaluations.
  • NEC is working on an adaptable architecture for innovative AI models.
  • The company is developing a 100-billion-parameter model, a significant leap from its existing 13 billion parameters.
  • NEC targets approximately 50 billion yen in generative AI-related revenue over the next three years.
  • Challenges remain in instructing AI accurately and addressing security concerns.
  • NEC utilizes the NEC Inzai Data Centre for low-latency, secure LLM environments.
  • The company seeks to provide tailored generative AI solutions for various industries, emphasizing scalability and value enhancement.

Main AI News:

NEC Corporation has unveiled an ambitious plan to reshape a range of industries through generative artificial intelligence (AI). Central to the strategy is the expansion of its lightweight large language model (LLM), which NEC intends to use as the basis for generative AI solutions tailored to each client’s distinct business requirements, drawing on the company’s industry knowledge and business expertise.

These services are expected to drive operational transformation across key domains such as healthcare, finance, local governance, and manufacturing. NEC will focus on building specialized AI models that both accelerate business transformation and encourage the widespread adoption of generative AI across entire industries, delivered through managed application programming interface (API) services.

NEC has doubled the volume of high-quality training data used for its LLM, and the model now outperforms a number of leading domestic and international LLMs. In the Rakuda benchmark, a comparative evaluation of Japanese dialogue capability, NEC’s LLM has emerged as a frontrunner. It can also process inputs of up to 300,000 Japanese characters, roughly 150 times the capacity of third-party LLMs, making it well suited to tasks that involve extensive documents such as internal and external business manuals.

At the same time, NEC is engineering a “new architecture” that creates new AI models by flexibly combining multiple models according to the input data and the complexity of the task. The goal is a scalable foundational model whose parameters and functionality can be expanded and which can be combined with other AI models, including specialized legal and medical AI and models offered by partner businesses. Its compact size and low power consumption also make it suitable for integration into edge devices. In addition, NEC’s globally recognized expertise in image recognition, audio processing, and sensing technologies enables its LLMs to process a wide range of real-world events accurately and autonomously.
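NEC has not published details of how this combination of models works. Purely as an illustrative sketch of the general pattern the announcement describes, the snippet below routes a request to a general-purpose model, a domain specialist (for example legal or medical), or a small edge-sized model based on task type and input size. Every name here (route_request, ModelSpec, the model registry) is hypothetical and not part of any NEC API.

```python
# Hypothetical illustration only: a simple router that picks a model based on
# task type and input size, in the spirit of "flexibly combining various
# models based on input data and task complexities".
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class ModelSpec:
    name: str
    max_chars: int                       # largest input the model is meant to handle
    generate: Callable[[str], str]       # stand-in for the actual model call

# Placeholder registry; a real deployment would wrap LLM endpoints here.
GENERAL = ModelSpec("general-llm", 300_000, lambda p: f"[general-llm] {p[:40]}...")
SPECIALISTS = {
    "legal":   ModelSpec("legal-llm",   300_000, lambda p: f"[legal-llm] {p[:40]}..."),
    "medical": ModelSpec("medical-llm", 300_000, lambda p: f"[medical-llm] {p[:40]}..."),
}
EDGE = ModelSpec("edge-llm", 4_000, lambda p: f"[edge-llm] {p[:40]}...")

def route_request(prompt: str, task: Optional[str] = None, on_device: bool = False) -> str:
    """Select a model for the request and return its output."""
    if on_device and len(prompt) <= EDGE.max_chars:
        model = EDGE                      # compact, low-power model on the device
    elif task in SPECIALISTS:
        model = SPECIALISTS[task]         # domain-specific model (legal, medical, ...)
    else:
        model = GENERAL                   # fall back to the general-purpose model
    return model.generate(prompt)

print(route_request("Summarize this contract clause ...", task="legal"))
```

In practice such a router might also weigh latency, cost, and data-residency constraints, but the announcement does not say how NEC's architecture makes these decisions.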

In parallel, NEC is developing a large-scale model with 100 billion parameters, a significant leap from its existing 13 billion parameters. Together, these initiatives underpin NEC’s goal of generating approximately 50 billion yen in revenue from generative AI-related business over the next three years.

Despite the rapid advancement and adoption of generative AI, several challenges remain: the precise engineering required to instruct AI effectively, security concerns around data integrity and vulnerability, and the management of business data throughout implementation and operation. NEC has been addressing these challenges since July 2023 through the NEC Inzai Data Centre, which provides a low-latency, secure environment for LLMs and supports the creation of customer-specific “individual company models” and “business-specific models” built on NEC’s proprietary LLMs.
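The announcement does not describe how these company-specific models are built. One common way to derive such a model from a general base LLM is parameter-efficient fine-tuning on the customer’s own documents; the sketch below shows that generic pattern using Hugging Face Transformers and PEFT. The model name, data path, and hyperparameters are placeholders, not NEC’s actual pipeline or configuration.

```python
# Illustrative sketch only: adapting a base LLM to one company's documents
# with LoRA (parameter-efficient fine-tuning). All names are placeholders.
from datasets import load_dataset
from peft import LoraConfig, get_peft_model
from transformers import (AutoModelForCausalLM, AutoTokenizer, Trainer,
                          TrainingArguments, DataCollatorForLanguageModeling)

BASE_MODEL = "base-llm-13b"                      # placeholder base checkpoint
tokenizer = AutoTokenizer.from_pretrained(BASE_MODEL)
if tokenizer.pad_token is None:
    tokenizer.pad_token = tokenizer.eos_token    # ensure padding works
model = AutoModelForCausalLM.from_pretrained(BASE_MODEL)

# Attach low-rank adapters so only a small set of parameters is trained
# on the customer's data, keeping the base model intact.
model = get_peft_model(model, LoraConfig(r=8, lora_alpha=16, task_type="CAUSAL_LM"))

# Customer documents (manuals, internal wikis, ...) as plain-text records.
data = load_dataset("json", data_files="company_docs.jsonl")["train"]
data = data.map(lambda ex: tokenizer(ex["text"], truncation=True, max_length=1024),
                remove_columns=data.column_names)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="individual-company-model",
                           per_device_train_batch_size=2,
                           num_train_epochs=1),
    train_dataset=data,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
model.save_pretrained("individual-company-model")
```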

Looking ahead, NEC aims to deliver solutions across a range of industries, built on an LLM that comprises a scalable foundational model and a generative AI environment optimized for each client’s business. In line with this vision, the company plans to keep expanding its services and functionality while providing safe, secure generative AI services and solutions that address the varied challenges its customers face.

Conclusion:

NEC’s strategic push into generative AI signifies a pivotal moment in the market. By combining advanced LLM capabilities, scalable models, and a commitment to addressing industry challenges, NEC is poised to lead the charge in transforming various sectors through AI-driven solutions. This development is likely to stimulate increased competition and innovation within the AI market, ultimately benefiting businesses and industries seeking to harness the power of generative AI.
