Hello, this is Hiro Kobashi and Kosaku Kimura from the Artificial Intelligence Laboratory.
At Microsoft Ignite 2024, held in Chicago from November 19 to 22, we presented a collaborative demonstration of Semantic Kernel, MS's AI orchestration tool, together with the Fujitsu Kozuchi AI Agent, much as we did at the previous Microsoft Build 2024. We'd like to introduce it to you here.
MS Ignite 2024
MS Ignite is a customer-focused event hosted by Microsoft. Since the end of the COVID-19 pandemic, it has been held annually around November. This year's event saw approximately 14,000 attendees onsite (roughly triple last year's number) and more than 200,000 online participants, making it a huge event. Over 80 product updates were announced across more than 800 sessions, and although snow even accumulated in chilly Chicago on some days, every session room was brimming with enthusiasm.
The "Agentic World" theme at MS Ignite 2024 left a strong impression. Many sessions were abuzz with product updates and use cases aimed at realizing AI agents and multi-agent systems. In the keynote, CEO Satya Nadella declared that "Copilot is the UI for AI": Copilot stands closest to the user experience. Behind it, MS timed the unveiling of its supporting product stack with this event, including Copilot Studio, an environment for developing autonomous agents, and Azure AI Foundry, an execution environment. Their ability to unveil all of these at once highlights Microsoft's strength in this area.
Summarizing our impressions about agents:
- Everything is being called an "agent": Personally, I take the view that if a specialized generative AI is sufficiently usable on its own, calling it an agent is fine. However, since many specialized generative AIs are not fully usable as-is, they likely need to be integrated with tools and planning, as described by the concept of Agentic AI.
- Multi-agent systems are becoming a reality: For example, Toyota's use case describing their practice of building multiple agents in Azure, each possessing specialized knowledge of particular powertrain technologies, was fascinating. Similarly, Cognizant's use case showcased multiple agents connected via a Directed Acyclic Graph (DAG) interacting in real scenarios, which I found highly intriguing. On the other hand, complex, multi-directional interactions among agents still seem relatively rare.
- "Autonomy" here often means "acting on pre-defined triggers," not "acting of one's own accord": Many cases gave the impression that what is called autonomy is closer to a stored procedure fired by preset conditions than to self-motivated action.
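The "stored procedure" analogy from the last point can be sketched in a few lines. The following is a minimal, purely illustrative example (all names are hypothetical, not from any product shown at Ignite): the agent only reacts to events that match rules registered in advance, and never initiates anything on its own.

```python
from typing import Callable

class TriggeredAgent:
    """Illustrative 'autonomy as pre-defined triggers': react-only, never self-initiating."""

    def __init__(self) -> None:
        # Each rule pairs a trigger condition with the action to run.
        self._rules: list[tuple[Callable[[str], bool], Callable[[str], str]]] = []

    def on(self, condition: Callable[[str], bool], action: Callable[[str], str]) -> None:
        # Register a trigger: "when condition holds, run action."
        self._rules.append((condition, action))

    def observe(self, event: str) -> list[str]:
        # The agent acts only when an event matches a pre-registered rule,
        # like a stored procedure fired by preset conditions.
        return [action(event) for condition, action in self._rules if condition(event)]

agent = TriggeredAgent()
agent.on(lambda e: "deadline" in e, lambda e: f"Scheduled reminder for: {e}")

print(agent.observe("the report deadline is Friday"))
print(agent.observe("nice weather today"))  # no rule matches, so no action
```

A genuinely self-motivated agent would instead decide for itself which events are worth acting on; most systems shown at the event looked closer to the sketch above.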
Breakout Session: Productive AI with Semantic Kernel
As with MS Build, this time we also held a breakout session with the Semantic Kernel team.
This time, instead of focusing on Fujitsu Composite AI as before, we showcased its evolution into Fujitsu Kozuchi AI Agent.
Since our session was on the last day of the event, we didn't have a completely full house, but we were happy to reach a tech-savvy, highly focused audience.
Hereâs a brief introduction of our presentation:
We believe that in the near future, the way humans interact with AI will change. Specifically, rather than today's passive "ask and answer" model, generative AI included, we are moving toward a model where AI "thinks and acts on its own," blending naturally into human-to-human interactions and proactively providing new insights and flashes of inspiration. As a result, AI enters conversations, acts cooperatively, and suggests new knowledge and ideas, enabling creative collaboration between people and AI. We call this shift "Beyond Chat."
Based on this concept, we developed the Fujitsu Kozuchi AI Agent. Hereâs a YouTube video showing an example of how it functions as a meeting agent.
The selling point of the Fujitsu Kozuchi AI Agent is that it has already realized the concept of "thinking and acting on its own." During meetings, the agent listens to the conversation continuously, identifies tasks that seem to need solving, and automatically executes them to present the results.
This is the system flow diagram. In simple terms, it consists of three parts:
- ①–② listens to the conversation in the meeting,
- ③–⑦ is where the Agentic AI creates tasks and executes them to obtain results (the part previously handled by Fujitsu Composite AI),
- ⑧–⑨ returns the results.
The key point is that ①–② and ③–⑨ are fully asynchronous. This means it's not a simple "ask and answer" system like before. Of course, if you mention the agent directly in the chat, there's also a path to get answers in the traditional Q&A manner.
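The asynchronous split between listening and task execution can be sketched roughly as follows. This is a minimal toy model of the three-part flow, not the actual Fujitsu implementation (all names and the simulated utterances are invented for illustration): a listener streams utterances into a queue while an agent loop consumes them concurrently, spots tasks, and produces results without ever blocking the conversation.

```python
import asyncio

async def listener(queue: asyncio.Queue) -> None:
    # Parts 1-2: push meeting utterances into the queue as they arrive.
    for utterance in [
        "We need last quarter's sales summary.",
        "Also, can someone compare vendor A and vendor B?",
    ]:
        await queue.put(utterance)
        await asyncio.sleep(0.01)  # simulate real-time speech
    await queue.put(None)  # sentinel: the meeting has ended

async def agent(queue: asyncio.Queue, results: list[str]) -> None:
    # Parts 3-7: identify a task in each utterance and execute it.
    while (utterance := await queue.get()) is not None:
        # A real Agentic AI would plan here and call tools to solve the task.
        results.append(f"[agent] Resolve: {utterance} -> done")
    # Parts 8-9 would post these results back into the meeting.

async def main() -> list[str]:
    queue: asyncio.Queue = asyncio.Queue()
    results: list[str] = []
    # Listening and task execution run concurrently, i.e. fully asynchronously.
    await asyncio.gather(listener(queue), agent(queue, results))
    return results

if __name__ == "__main__":
    for line in asyncio.run(main()):
        print(line)
```

The point of the sketch is the `gather`: the listener never waits for the agent, which is what distinguishes this flow from a plain request-response chatbot.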
Check out the Microsoft Tech Blog's case study for more details if you're interested!
Customer Case Study: Fujitsu Kozuchi AI Agent Powered by Semantic Kernel @ MS Dev Blog
Also, during Ignite, we had the chance to meet with Matthew, who was previously the PM for Semantic Kernel. Matthew is now the PM for Azure AI Agent (we share a similar interest!). He praised our Agentic AI approach: "Listening to conversations, using tools, and providing responses is amazing!" He was impressed once again, just as at MS Build, by how we developed this in only a few months.
At the end, we took a commemorative photo with all the presenters :)
Fujitsu Will Continue to Focus on AI Agents
As a side note, in Matthew's presentation about Azure AI Agent Service, Fujitsu was featured prominently as a customer. This is not only because we continue to collaborate with the Semantic Kernel team, but also because another Fujitsu team, separate from ours, participated in the Azure AI Agent Service Co-Building Program.
Fujitsu will continue to invest heavily in AI agents, so please stay tuned!