TL;DR:
- NEC Corporation is set to launch an upgraded Large Language Model (LLM) in 2024, focusing on tailored generative AI solutions for businesses.
- The enhanced LLM aims to revolutionize various industries, including healthcare, finance, local governments, and manufacturing.
- NEC’s LLM outperforms competitors, handling up to 300,000 Japanese characters, making it ideal for processing large document volumes.
- A flexible “new architecture” allows NEC to create scalable AI models, seamlessly integrating with other specialized AI models.
- NEC is developing a large-scale model with 100 billion parameters to drive generative AI-related business growth.
- The company acknowledges challenges in AI adoption, such as the need for refined engineering, security concerns, and data coordination.
- NEC leverages its Inzai Data Center to offer low-latency, secure LLM environments and has a robust portfolio of tailored AI models.
- The generative AI business strategy unfolds in three phases: individual company models, integration into business packages, and collaboration with partners.
- NEC strengthens its technology and sales infrastructure, establishing the Generative AI Centre and prioritizing safety and reliability.
- Collaboration with Robust Intelligence ensures secure LLM provision, adhering to global standards.
Main AI News:
In a strategic move poised to reshape the landscape of generative artificial intelligence (AI) applications, NEC Corporation is set to unveil its revamped Large Language Model (LLM) in the spring of 2024. This initiative represents a pivotal step towards providing tailored AI solutions, built on NEC’s industry and business expertise, to cater to the unique needs of individual clients. The enhanced LLM promises to open new horizons for businesses across various sectors, including healthcare, finance, local governments, and manufacturing.
NEC’s commitment extends beyond mere innovation. The company is primed to drive a comprehensive transformation, focusing on delivering specialized models that have the potential to revolutionize entire industries. These models will be seamlessly integrated through managed application programming interfaces (APIs), fostering a collective evolution of generative AI across diverse sectors.
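At its core, the managed-API pattern described above amounts to a hosted model reached over HTTPS, with an industry- or business-specific model selected per request. The sketch below is a hypothetical illustration of that pattern only; NEC has not published this interface, and the endpoint URL, model name ("finance-ja"), request fields, and response schema are assumptions made for the example.

```python
import requests

# Hypothetical endpoint and credentials for illustration only; NEC has not
# published a public specification for its managed LLM APIs.
API_URL = "https://api.example-nec-llm.jp/v1/generate"
API_KEY = "YOUR_API_KEY"

def query_business_model(prompt: str, model: str = "finance-ja") -> str:
    """Send a prompt to an industry-specific model behind a managed API."""
    response = requests.post(
        API_URL,
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={"model": model, "prompt": prompt, "max_tokens": 512},
        timeout=30,
    )
    response.raise_for_status()
    # Assumed response shape: {"text": "..."}
    return response.json()["text"]

if __name__ == "__main__":
    print(query_business_model("Summarize the attached loan policy update."))
```

The appeal of this pattern for clients is that the specialized model stays behind the provider's API boundary: swapping in a different industry model is a change of request parameter rather than a redeployment.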
The cornerstone of this transformation lies in NEC’s LLM, which has undergone substantial improvements. With its volume of high-quality training data roughly doubled, it has outperformed peers both in Japan and internationally in a rigorous evaluation of Japanese dialogue skills. Remarkably, NEC’s LLM can now process up to 300,000 Japanese characters, roughly 150 times more than third-party LLMs. This expanded capacity makes it a powerful tool for handling large volumes of documents, including internal and external business manuals, with far greater ease and efficiency.
Yet NEC’s ambitions do not stop there. The company is actively crafting a “new architecture” designed to create bespoke AI models by dynamically combining various models in response to specific input data and tasks. This approach aims to establish a scalable foundation model whose parameter count and functionality can be extended as needed. Flexibility is a key feature, enabling seamless integration with specialized AI models, including those tailored for legal and medical applications, while maintaining efficiency and adaptability. That efficiency extends to energy consumption: NEC’s LLMs can be installed on edge devices, keeping their power and hardware footprint small.
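One way to read “dynamically combining various models in response to specific input data and tasks” is as a routing layer that dispatches each request to the most suitable specialized model and falls back to a general foundation model otherwise. The toy sketch below illustrates that idea under our own assumptions; the model functions and keyword rules are placeholders, not NEC’s disclosed design.

```python
from typing import Callable, Dict

# Placeholder stand-ins for a general foundation model and two specialized
# models; in a real system these would be API calls or loaded models.
def general_model(text: str) -> str:
    return f"[general model] {text[:50]}..."

def legal_model(text: str) -> str:
    return f"[legal model] {text[:50]}..."

def medical_model(text: str) -> str:
    return f"[medical model] {text[:50]}..."

# Illustrative keyword-based routing table (an assumption for this sketch).
SPECIALISTS: Dict[str, Callable[[str], str]] = {
    "contract": legal_model,
    "statute": legal_model,
    "diagnosis": medical_model,
    "prescription": medical_model,
}

def route(text: str) -> str:
    """Send the input to a specialized model when it matches a domain
    keyword, otherwise fall back to the general foundation model."""
    lowered = text.lower()
    for keyword, model in SPECIALISTS.items():
        if keyword in lowered:
            return model(text)
    return general_model(text)

if __name__ == "__main__":
    print(route("Please review this contract clause for liability risks."))
```

A production system would likely replace the keyword table with a learned classifier or the foundation model itself acting as the router, but the structural idea is the same: one entry point, many interchangeable specialists.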
In a parallel endeavor, NEC is advancing its capabilities with the development of a large-scale model boasting 100 billion parameters, a substantial step up from its current 13-billion-parameter model. These efforts are geared towards generating approximately 50 billion yen in sales from generative AI-related business within the next three years, reaffirming NEC’s commitment to pioneering advancements in the AI domain.
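To put the jump from 13 billion to 100 billion parameters in perspective, a back-of-the-envelope estimate of the memory needed just to hold the weights (ignoring activations, KV cache, and optimizer state) shows why a model of this scale is typically served from a data center rather than a single commodity GPU. The figures below assume a dense model stored at 16-bit precision, which is an assumption for illustration, not a published detail of NEC’s model.

```python
# Rough weight-memory estimate for dense models at 16-bit precision
# (2 bytes per parameter), weights only.
def weight_memory_gb(n_params: float, bytes_per_param: int = 2) -> float:
    return n_params * bytes_per_param / 1e9

for label, n in [("13B model", 13e9), ("100B model", 100e9)]:
    print(f"{label}: ~{weight_memory_gb(n):.0f} GB of weights at FP16")
# 13B model: ~26 GB of weights at FP16
# 100B model: ~200 GB of weights at FP16
```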
As the adoption of generative AI surges, the importance of business reforms cannot be overstated. Organizations, both public and private, are actively exploring transformative opportunities with various LLMs. Nevertheless, significant challenges loom on the horizon, including the need for refined engineering to effectively guide AI, security considerations to mitigate information exposure and vulnerabilities, and the harmonization of business data during implementation and operation.
Since the launch of the NEC Generative AI Service in July 2023, NEC has harnessed the power of the NEC Inzai Data Center, offering a low-latency and secure LLM environment. NEC’s expertise in constructing and providing “individual company models” and “business-specific models” has positioned it at the forefront of the industry. This wealth of knowledge serves as the foundation for NEC’s mission to deliver tailored solutions to clients across diverse industries, combining scalable foundation models with optimal generative AI environments that align with each customer’s unique business requirements.
Generative AI Business Strategy in Three Phases
- Phase 1: Tailored Models for Individual Companies. NEC’s Managed API service, leveraging advanced LLMs, will build industry- and business-specific models tailored to individual companies.
- Phase 2: Integration into Business Packages and Solutions. Industry- and business-specific models will be seamlessly integrated into comprehensive business packages and solutions.
- Phase 3: Collaboration and Development with Partners. NEC will collaborate with strategic partners to develop business packages and solutions, fostering a collaborative ecosystem for generative AI innovation.
To fortify its position in the market, NEC is bolstering its technological and sales infrastructure. The NEC Generative AI Hub, along with the Digital Trust Promotion Management Department, will play instrumental roles in driving generative AI solutions to customers across diverse sectors. A dedicated “Generative AI Centre” will bring together over 100 leading researchers in generative AI from global research centers to expedite the commercialization of cutting-edge research.
Furthermore, NEC’s commitment to safety and reliability remains unwavering. The company is actively engaged in the LLM Risk Assessment Project in collaboration with Robust Intelligence, ensuring the provision of secure and dependable LLMs. Clients can expect industry- and business-specific models to undergo rigorous risk assessment procedures adhering to global standards.
Conclusion:
NEC’s ambitious generative AI strategy signifies a significant leap in the AI market. By offering tailored solutions for diverse industries and addressing the challenges of AI adoption, NEC is poised to drive innovation and reshape the business landscape. With a clear focus on safety, reliability, and scalability, NEC is well-positioned to capture substantial market share in the evolving world of generative AI.