Zenity becomes the first company to offer comprehensive security for custom-built Enterprise AI Copilots and embedded Generative AI Copilots

TL;DR:

  • Zenity is the first company to offer comprehensive security for custom-built Enterprise AI Copilots and embedded Generative AI Copilots.
  • Microsoft Copilot Studio enables professional and citizen developers to create AI Copilots, raising concerns about data security.
  • Data leakage is identified as the primary threat, as AI Copilots can inadvertently expose sensitive corporate data.
  • Zenity’s security support includes identifying custom Copilots, detecting data leaks, assessing risks, and automated remediation.
  • Ben Kliger, Zenity’s CEO, emphasizes responsible development with proper security guardrails.

Main AI News:

In a groundbreaking move, Zenity has taken the lead in securing custom-built Enterprise AI Copilots and embedded Generative AI Copilots, making it the first company to offer such comprehensive protection.

With the recent introduction of Microsoft Copilot Studio, professional and citizen developers alike can now build their own AI Copilots using a no-code approach, and developers can also create plugins directly for Microsoft’s Generative AI Copilot. This opens the door for people with widely varying technical backgrounds to create powerful enterprise AI Copilots with access to a broad range of corporate data and systems. While this development promises increased productivity and efficiency, it also raises significant concerns for security leaders. Several prominent organizations, including JPMorgan Chase, Samsung, and the US Space Force, have recently restricted employee use of Generative AI due to cybersecurity risks.

Michael Bargury, CTO and Co-Founder of Zenity, points to data leakage as the primary threat posed by Enterprise AI Copilots. As employees, third-party vendors, and other business users increasingly build their own Copilots, the potential for inadvertent data leaks escalates: a Copilot can unintentionally connect business data to the open web, resulting in a serious breach. Without stringent development safeguards, user impersonation, secrets exposure, and over-sharing of data become all too common, making ongoing visibility and robust security controls indispensable.

Zenity’s newly implemented security measures encompass a range of capabilities:

  • Identifying all custom Copilots created in Microsoft Copilot Studio and tracking the applications that subsequently embed them.
  • Detecting Copilots that leak corporate data, such as those exposing sensitive SharePoint sites to the Internet (illustrated in the sketch after this list).
  • Automatically assessing risks and generating corresponding SBOM (Software Bill of Materials) files for the components integrated into each Copilot and its associated application.
  • Preventing data leakage by deploying automated remediation playbooks for risky and non-compliant Copilots.
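For readers who want a concrete picture of what this kind of detection and remediation can look like, below is a minimal, hypothetical sketch in Python. The Copilot data model, the field names (classification, sharing, published_channels), and the remediate hook are illustrative assumptions made for this article only; they are not Zenity’s product APIs or Microsoft Copilot Studio’s schema.

    # Hypothetical sketch of a "leaky Copilot" check. The Copilot inventory,
    # field names, and remediation hook are illustrative assumptions, not
    # Zenity's or Microsoft's actual APIs.
    from dataclasses import dataclass, field


    @dataclass
    class Copilot:
        name: str
        owner: str
        # Each knowledge source records how it is classified and whether it is
        # shared beyond the organization (e.g. an anonymous SharePoint link).
        knowledge_sources: list[dict] = field(default_factory=list)
        published_channels: list[str] = field(default_factory=list)


    def find_risky_copilots(copilots: list[Copilot]) -> list[tuple[Copilot, str]]:
        """Flag Copilots that pair internal data with externally reachable channels."""
        findings = []
        for bot in copilots:
            has_internal_data = any(
                src.get("classification") == "internal" for src in bot.knowledge_sources
            )
            externally_reachable = "public_website" in bot.published_channels or any(
                src.get("sharing") == "anyone_with_link" for src in bot.knowledge_sources
            )
            if has_internal_data and externally_reachable:
                findings.append((bot, "internal data reachable from an external channel"))
        return findings


    def remediate(finding: tuple[Copilot, str]) -> None:
        # Placeholder for an automated playbook: in practice this might unpublish
        # the external channel, tighten the sharing setting, and notify the owner.
        bot, reason = finding
        print(f"[remediation queued] {bot.name} ({bot.owner}): {reason}")


    if __name__ == "__main__":
        inventory = [
            Copilot(
                name="HR Benefits Bot",
                owner="citizen.dev@example.com",
                knowledge_sources=[
                    {"classification": "internal", "sharing": "anyone_with_link"}
                ],
                published_channels=["public_website"],
            )
        ]
        for finding in find_risky_copilots(inventory):
            remediate(finding)

The sketch deliberately flags only the combination of internal data and an externally reachable channel, since either condition on its own can be perfectly legitimate; a production platform would apply far richer risk signals and policy logic.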

Ben Kliger, CEO and Co-Founder of Zenity, expresses his pride in the team’s innovative efforts. The company has continually pushed the boundaries of research and remained attuned to the needs of its customers and partners. The convergence of no-code development and Generative AI is an exciting prospect for everyone involved, but within enterprises, IT and security leaders are keen to ensure that these developments are executed responsibly, with the proper visibility and security safeguards in place. Zenity delivers an agentless, enterprise-ready solution that meets these critical requirements.

Conclusion:

Zenity’s pioneering efforts to secure Enterprise AI Copilots address a growing need in the market. As AI development becomes more accessible, the potential for data breaches also rises. Zenity’s comprehensive security measures are poised to provide enterprises with the safeguards they require, ensuring responsible and secure AI integration into their operations. This move positions Zenity as a leader in enhancing data protection within the evolving landscape of AI technology.
