
Reporting: One of the Most Important Copilot Use Cases

We dig deeply into Reporting, an important emerging LLM design pattern, using Zoom AI Companion and Microsoft Security Copilot as examples notable both for their usefulness and their fearless embrace of AI-First features.

One of the most powerful LLM use cases is Reporting, yet it is often overlooked in favor of its sexier cousins, like the ability of LLMs to effortlessly compose haikus about space pirates and crocodiles. In this article, we continue our earlier review of emerging trends and best practices in copilot design (see “UX Best Practices for Copilot Design” on UXforAI.com: https://www.uxforai.com/p/ux-best-practices-copilot-design). Today we dig deeply into Reporting, an important emerging LLM design pattern, using Zoom AI Companion and Microsoft Security Copilot as examples notable both for their usefulness and their fearless embrace of AI-First features.

Zoom AI Companion

One of the shining examples of usefulness is the Zoom AI Companion (ZAC for short). The Companion provides several envy-inducing features not found in competing products.

Meeting Summary

ZAC provides an easy way to summarize the meeting using the transcript. It offers users minimal controls for fine-tuning, but it is so useful precisely because it takes no time at all to set up. Users get the immediate benefit of a documented meeting summary and next steps without doing much of anything at all – the AI takes care of everything out of the box:
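Zoom has not published how the Companion works under the hood, but the underlying reporting pattern is simple to sketch. Here is a minimal, hypothetical illustration in Python: a fixed summarization prompt wrapped around the raw transcript, with a complete() placeholder standing in for whatever LLM API you actually call (the names and prompt wording are ours, not Zoom’s):

```python
# Hypothetical sketch of the "zero-setup summary" pattern (not Zoom's code).
SUMMARY_PROMPT = """You are a meeting assistant.
Summarize the transcript below in 3-5 bullet points,
then list next steps with owners and due dates where mentioned.

Transcript:
{transcript}
"""

def complete(prompt: str) -> str:
    """Placeholder for your LLM call (OpenAI, Azure OpenAI, etc.)."""
    raise NotImplementedError

def summarize_meeting(transcript: str) -> str:
    # The user configures nothing: the prompt is fixed and the summary
    # is generated automatically when the meeting ends.
    return complete(SUMMARY_PROMPT.format(transcript=transcript))
```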

Answer Questions about the Meeting

Not only does ZAC automatically summarize the meeting, but it can also answer specific questions about it, such as action items, due dates, and so forth:
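The question-answering side can be sketched the same way, reusing the complete() placeholder from the sketch above: the answer is grounded in the transcript itself rather than in the already-generated summary (again, a hypothetical illustration, not Zoom’s implementation):

```python
# Hypothetical sketch: ad-hoc Q&A grounded in the same transcript.
QA_PROMPT = """Answer the question using only the meeting transcript below.
If the answer is not in the transcript, say you don't know.

Transcript:
{transcript}

Question: {question}
"""

def answer_meeting_question(transcript: str, question: str) -> str:
    # complete() is the same LLM-call placeholder as in the previous sketch.
    return complete(QA_PROMPT.format(transcript=transcript, question=question))
```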

Set it and Forget it

In the unlikely event that users feel the need to configure the Companion, here’s the screen. It’s very basic:

As the source article says, the design of the Companion follows the “Set it and forget it” approach, with the option to kick in automatically for every meeting. And why not? This “effortless enhancement” approach is a shining example of how AI should be utilized in a product.

UI Modality Switch 

But what’s really cool and very creative, in our opinion, is that the Zoom AI Companion takes the idea of copilot reporting even further. ZAC can help list ideas from the meeting on a digital whiteboard, organize content into categories, and even come up with relevant topics and ideas to help jump-start a brainstorming session:

The application starts with words and summaries and transforms the canvas into a brainstorming space whenever it’s needed. This smooth, rapid, on-demand change of modality (while also carrying over the context of the meeting) is especially impressive. Our advice is to look for more opportunities to change modalities in your own copilot design.
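One plausible way to implement such a modality switch (our sketch, not Zoom’s) is to reuse the same meeting context but ask the LLM for structured output that the whiteboard UI can render, for example categories of sticky notes returned as JSON:

```python
import json

# Hypothetical sketch: same meeting context, different output modality.
WHITEBOARD_PROMPT = """From the meeting transcript below, propose brainstorming
topics. Return only JSON: a list of objects, each with a "title" and a list of
short "stickies".

Transcript:
{transcript}
"""

def transcript_to_whiteboard(transcript: str) -> list:
    # complete() is the LLM-call placeholder from the earlier sketches.
    # Structured output lets the UI switch from prose to a whiteboard
    # while carrying the meeting context along.
    return json.loads(complete(WHITEBOARD_PROMPT.format(transcript=transcript)))
```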

Microsoft Security Copilot

The Microsoft Security Copilot (MSC for short) provides similar functionality but adds a few very interesting controls. It offers two reports: 1) an executive summary and 2) a pinboard. Let’s examine the UX design of these features more closely.

1) Executive Summary: a General Report

The first and primary feature of the MSC is that it provides effortless, automated executive summary documentation of the security incident with a prompt. This is a key feature because summaries such as this are mandated by leadership, government regulations, and industry best practices: a security incident is kind of a big deal; think of it as a police report on steroids. That is why a general executive summary report is such an incredibly useful feature, and one for which LLMs like ChatGPT are ideally suited.

Microsoft is not pulling any punches here: it is going all in. In fact, one of the nicest features is that the executive report is written in plainly understandable, jargon-free English, suitable for consumption by leadership, auditors, regulators, and the security team itself. Importantly, the MSC is not JUST creating a report for leadership but generating the complete required internal documentation for the incident. This is the added benefit of having the fine-tuned ChatGPT model instance we discussed in our previous article, UX Best Practices for Copilot Design: https://www.uxforai.com/p/ux-best-practices-copilot-design.

2) Pinboard: a Specialized Report Focused Only on Selected Key Details

In addition to the general summary report for a security incident, the MSC offers another excellent feature: the ability to create a custom pinboard from manually selected pertinent data, containing only the lines of investigation that actually yielded fruitful insights.

As with any investigation, the security team will often look at the most common suspects and find them “innocent of the crime,” that is, find nothing suspicious, or only data unrelated to the security incident at hand. These investigative tangents are often lengthy, multi-step affairs, and there are usually several of them before the security team figures out the root cause. Adding every possible aspect of the investigation to the report makes it a very long, boring, and rambling narrative and increases the chances that the LLM will hallucinate, call out the wrong root cause, or get confused by unrelated information. Additionally, such a “complete” report would be a challenge to consume. Instead, the MSC does something clever: it allows the user to “pin” only the most relevant data points to a temporary “pinboard” space and then use the copilot LLM to construct the report only from those limited data points:

The pinboard is then available as a separate report so that anyone joining the investigation at a later time can jump straight to the critical information:

This pinboard provides a complete report using human-selected information:
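Microsoft has not published how the pinboard report is assembled, but the pin-then-summarize pattern itself is easy to sketch (hypothetical names and prompt wording, ours rather than Microsoft’s): only the human-selected evidence ever reaches the LLM.

```python
from dataclasses import dataclass

def complete(prompt: str) -> str:
    """Placeholder for your LLM call, as in the earlier sketches."""
    raise NotImplementedError

@dataclass
class PinnedItem:
    title: str     # e.g. "Suspicious sign-in from an unfamiliar network"
    evidence: str  # the prompt/response pair the analyst chose to keep

PINBOARD_REPORT_PROMPT = """Write an incident-report section for each pinned
finding below. Use only the pinned evidence; do not speculate beyond it.

Pinned findings:
{findings}
"""

def pinboard_report(pins: list) -> str:
    # Only human-selected data points are sent to the model, which keeps the
    # report focused and shrinks the surface area for hallucination.
    findings = "\n\n".join(f"- {p.title}\n{p.evidence}" for p in pins)
    return complete(PINBOARD_REPORT_PROMPT.format(findings=findings))
```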

Info for Report: Ignore Automatically vs. Pick Manually?

Interestingly, where the Zoom AI Companion (ZAC) and the Microsoft Security Copilot (MSC) diverge significantly in their designs is in how the information for the report is chosen. The Companion automatically filters out non-relevant information; for example, if the meeting started with one participant’s lengthy account of their recent trip to Baja and all the amazing bluefin tuna they caught on the boat there, ZAC will likely omit this account (however fascinating) from the meeting summary. (We can assume that the AI will still be able to answer questions about the size of the tuna caught by the gentleman in question, however.)

In contrast, the Microsoft Security Copilot’s pinboard feature relies heavily on the human security consultant to help it pick the right data to showcase in the report. This is likely for two reasons:

  1. It’s much harder to tell a highly relevant yet futile avenue of security investigation apart from a fruitful one than it is to spot a story about bluefin tuna told in a meeting, and most AIs, no matter how well trained, will have a hard time assigning different meanings to similar-sounding information.

  2. The security report in the pinboard is a required legal document, governed in large part by industry regulations and legally admissible, should the need arise. Thus it needs to be as complete, accurate, and understandable as possible, and to include only pertinent data and nothing more. The security report pinboard generated by the MSC is therefore far more consequential than a mere meeting summary produced by ZAC.

Recall that, depending on the use case, false responses (false positives and negatives) often carry a different “penalty” or “weight,” and those must be weighed carefully against the benefits the AI-driven process provides. We discussed this topic at some length in “AI Accuracy is Bullsh*t. Here's what UX must do about it,” Part 1: https://www.uxforai.com/p/ai-accuracy-is-bullshit-75b3 and Part 2: https://www.uxforai.com/p/ai-accuracy-is-bullshit-part2

The divergent design approaches utilized by MSC vs. ZAC are a perfect example of the principle of understanding trade-offs. Whereas ZAC filters out unnecessary data automatically, MSC chooses to employ additional human work in an effort to avoid possible hallucinations and thus decrease the possibility of creating a false positive or a false negative in the security incident report. As in all cases involving AI products, your design decisions must be based on a solid and thorough understanding of the costs and benefits of various trade-offs, including the costs and benefits of true/false positives and true/false negatives.
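One way to make this trade-off concrete is a back-of-the-envelope expected-cost calculation. The probabilities and costs below are purely illustrative (ours, not from Zoom or Microsoft), but they show why MSC is willing to spend extra human effort to push the false-negative rate down:

```python
# Toy expected-cost comparison. The numbers are purely illustrative; the point
# is that the "penalty" for an error differs by use case.

def expected_cost(p_fp: float, p_fn: float, cost_fp: float, cost_fn: float) -> float:
    return p_fp * cost_fp + p_fn * cost_fn

# Meeting summary: errors are cheap, so full automation wins outright.
meeting = expected_cost(p_fp=0.10, p_fn=0.10, cost_fp=1, cost_fn=2)

# Incident report: a missed root cause (false negative) is very expensive,
# so MSC spends extra human effort (manual pinning) to push p_fn down.
incident_auto   = expected_cost(p_fp=0.05, p_fn=0.05, cost_fp=20, cost_fn=500)
incident_pinned = expected_cost(p_fp=0.05, p_fn=0.01, cost_fp=20, cost_fn=500)

print(f"{meeting:.2f} {incident_auto:.2f} {incident_pinned:.2f}")  # 0.30 26.00 6.00
```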

This decision-making is key to your success as an AI product designer, as is your ability to understand and advocate for the security and privacy of your customers’ data.

Security and Privacy

Is either of these AI tools trained on your data? Zoom and Microsoft both assure us that this is not the case. And that is the key for high-end, paid AI copilot services like these.

Imagine if your key product strategy decisions were somehow leaked to a competitor. Or if the details of your systems’ security vulnerabilities somehow ended up being public knowledge? No bueno.

One thing the Zoom AI copilot could improve would be to answer users' questions about how private their conversations are -- right in the copilot instead of in the documentation, which is a wall of text no one wants to read. It’s great that the copilot calls your attention to itself and lets you know it’s working right when it launches, but some of our colleagues were uncomfortable with it because they distrust AI in general, and they wanted to know whether the transcripts are viewable by their IT admins or company leadership even when the meeting isn’t being recorded or when the copilot features are disabled.

When developing your own copilot application, keep in mind that while it may be useful to learn from your customers' data, it is often not at all ideal as a long-term business plan. A clear stance on Security and Privacy might be the most demanded (and most often overlooked) feature in copilot design.

So that’s it for now. Let us know how your own designs for the next copilot have turned out.

Greg & Daria

P.S. Enjoyed this article? Please remember to take a moment to visit our generous sponsor:
