NEC Expands Its AI Service Portfolio with Cutting-Edge Generative Models

  • NEC unveils “NEC cotomi Pro” and “NEC cotomi Light,” two new high-speed generative AI models.
  • “NEC cotomi Pro” rivals global models such as “GPT-4” and “Claude 2” in output quality while delivering response times roughly 87% faster than GPT-4.
  • In real-world applications, NEC’s models achieve response times up to 93% faster than established models.
  • Both “NEC cotomi Pro” and “NEC cotomi Light” handle tasks ranging from document summarization to logical reasoning, with results validated on industry benchmarks.
  • Architectural improvements and a Japanese-optimized tokenization dictionary enable substantial speed gains and reduced latency in AI inference.

Main AI News:

NEC Corporation announces the launch of two additions to its “NEC cotomi” generative AI services: “NEC cotomi Pro” and “NEC cotomi Light.” These high-speed generative large language models (LLMs) feature enhanced training data and architectures, catering to the evolving needs of businesses worldwide.

In recent years, the proliferation of generative AI has sparked interest across diverse industries, with organizations exploring its potential for business transformation. As unique application scenarios surface, the demand intensifies for models and formats that align with specific requirements such as response time, data integration, and security.

The newly unveiled “NEC cotomi Pro” and “NEC cotomi Light” deliver high speed without compromising output quality. Rather than accepting the common trade-off of slower responses in exchange for larger model sizes, NEC achieves this efficiency through improved training methods and architectural enhancements.

“NEC cotomi Pro” sets a new standard for speed and accuracy, rivaling leading global models such as “GPT-4” and “Claude 2.” Running on just two graphics processing units (GPUs), it delivers response times roughly 87% faster than GPT-4. Meanwhile, “NEC cotomi Light” offers lightning-fast processing comparable to “GPT-3.5-Turbo,” handling large volumes of requests with minimal hardware.

In real-world applications such as in-house document retrieval systems built on retrieval-augmented generation (RAG), NEC’s models outperform established models like GPT-3.5 and GPT-4 even without fine-tuning. With response times up to 93% faster, the efficiency gains carry over to tasks such as document summarization and logical reasoning.
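
For readers unfamiliar with the approach, the sketch below illustrates the general RAG pattern in broad strokes: documents relevant to a question are retrieved and supplied to a generative model as context, so the model can answer over in-house data without fine-tuning. This is a minimal, model-agnostic illustration using toy bag-of-words retrieval; the documents, the prompt format, and the placeholder model call are invented for the example and are not NEC’s implementation.

```python
# Minimal RAG (retrieval-augmented generation) sketch with toy data.
# Retrieval here is simple bag-of-words cosine similarity; a real system
# would use an embedding model and a vector store, and would send the
# final prompt to a generative model (e.g., via an LLM API).

from collections import Counter
import math

DOCUMENTS = [
    "Expense reports must be submitted within 30 days of purchase.",
    "The VPN client is required for all remote access to internal systems.",
    "Annual security training is mandatory for every employee.",
]

def bag_of_words(text: str) -> Counter:
    """Tokenize very crudely into lowercase word counts."""
    return Counter(text.lower().split())

def cosine_similarity(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def retrieve(question: str, k: int = 2) -> list[str]:
    """Return the k documents most similar to the question."""
    q = bag_of_words(question)
    ranked = sorted(DOCUMENTS, key=lambda d: cosine_similarity(q, bag_of_words(d)), reverse=True)
    return ranked[:k]

def build_prompt(question: str) -> str:
    """Assemble retrieved documents into a grounded prompt for an LLM."""
    context = "\n".join(f"- {doc}" for doc in retrieve(question))
    return f"Answer using only the context below.\nContext:\n{context}\nQuestion: {question}"

# In production the prompt would be sent to a generative model; here we just print it.
print(build_prompt("When do I need to submit an expense report?"))
```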

Both “NEC cotomi Pro” and “NEC cotomi Light” rank among the fastest models available worldwide for tasks ranging from question answering to document summarization. Japanese-language benchmarks such as “ELYZA Tasks 100” and “Japanese MT-Bench” confirm top-tier output quality alongside that speed.

Beyond strong inference performance, the reduced latency between request and response underscores the practical utility of NEC’s models. Architectural improvements, combined with a tokenization dictionary rich in Japanese vocabulary, allow Japanese text to be represented in fewer tokens, which shortens response times and streamlines training.
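
The mechanism behind that speed gain can be shown with simple arithmetic: when a tokenizer’s dictionary covers more whole Japanese words, each request is encoded and generated in fewer tokens, and autoregressive generation time scales roughly with token count. The sketch below is a back-of-the-envelope model with invented numbers; it is not NEC’s tokenizer and the figures are not measured results.

```python
# Illustrative latency model (all numbers hypothetical): fewer tokens per
# character means fewer decoding steps per response, hence lower latency.

GENERIC_TOKENS_PER_CHAR = 1.0    # assumed: a generic vocabulary splitting Japanese near character level
OPTIMIZED_TOKENS_PER_CHAR = 0.5  # assumed: a Japanese-optimized dictionary producing fewer tokens
MS_PER_GENERATED_TOKEN = 30.0    # assumed per-token decoding time

def estimated_latency_ms(num_output_chars: float, tokens_per_char: float) -> float:
    """Rough model: decoding time grows linearly with generated token count."""
    return num_output_chars * tokens_per_char * MS_PER_GENERATED_TOKEN

output_chars = 400  # e.g., a short Japanese summary
generic = estimated_latency_ms(output_chars, GENERIC_TOKENS_PER_CHAR)
optimized = estimated_latency_ms(output_chars, OPTIMIZED_TOKENS_PER_CHAR)

print(f"generic tokenizer:   ~{generic:.0f} ms")
print(f"optimized tokenizer: ~{optimized:.0f} ms "
      f"({100 * (1 - optimized / generic):.0f}% faster)")
```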

Moving forward, NEC remains committed to empowering businesses with AI solutions that combine high processing power and efficiency. By collaborating closely with partners and leveraging the expanded NEC cotomi lineup, NEC aims to address complex challenges effectively, delivering safe, secure, and reliable AI services that drive meaningful outcomes for customers.

Conclusion:

NEC’s advancements in generative AI mark a notable shift in the market landscape. By prioritizing speed without compromising performance, NEC has positioned itself as a formidable contender, offering solutions that address the evolving needs of businesses across diverse sectors. These innovations not only improve operational efficiency but also open the door to transformative AI applications that deliver tangible outcomes for organizations worldwide.
