Microsoft 365 Copilot – Office 365 for IT Pros
https://office365itpros.com
The Ultimate Guide to Mastering Microsoft 365
Wed, 15 Oct 2025 22:44:41 +0000

Outlook Gets AI Drafting of Meeting Agendas
https://office365itpros.com/2025/10/17/agenda-auto-draft/
Fri, 17 Oct 2025 07:00:00 +0000

Agenda Auto-Draft Available for OWA and the New Outlook

Microsoft is doing its level best to convince Microsoft 365 tenants to invest in Copilot. Given the massive capital investment in datacenters to power AI experiences, it’s unsurprising that engineering groups are busy infusing Copilot features into as many applications as possible. Features like Copilot memory add value and help dissuade tenants from investigating other options, such as the ChatGPT Connector for SharePoint Online and OneDrive for Business.

Of course, a SharePoint connector is limited when compared to the breadth of integration of Copilot across the Microsoft 365 apps. Because Copilot works well for some and not for others, work continues apace to find new ways to integrate AI in daily tasks. This brings me to message center notification MC1171854 (13 Oct 2025), which describes “Intelligent agenda suggestions for calendar events.” The feature is available now, but only to users with Microsoft 365 Copilot licenses.

Agenda Auto-Draft Uses AI to Generate Some Bullet Points

At first glance, I didn’t see much to get excited about. The description says that AI is used “to automatically generate a proposed agenda when users create or edit a calendar event, making it easier to align meeting goals, participants, and discussion topics.” I’ve never had any problems coming up with a few salient points for a draft meeting agenda, and agendas have a nasty habit of changing as soon as meetings start. However, I can see the value of being able to create some bullet points to frame an agenda.

What happens is that Microsoft has updated the calendar scheduling form to add an “auto draft an agenda” option to the set of prompts available when the Draft with Copilot button is used. When the auto-draft option is selected, Copilot uses the meeting subject to generate an agenda composed of some introductory text and some bullet points. Copilot has always been good at generating bullet points in document and message summaries!

In Figure 1, the meeting subject is Review Chapter Updates for Office 365 for IT Pros. Copilot’s suggested agenda items seem reasonable, and it looks as if Copilot discovered that Office 365 for IT Pros is an eBook from information found internally or on the web (Bing search).

Figure 1: Intelligent agenda suggestions for a calendar event

If the meeting organizer doesn’t like the draft agenda, they can simply instruct Copilot to retry or adjust the text by making the agenda longer or shorter. The changes proposed in further versions are not dramatic, likely due to using the meeting subject as the core input to the AI processing.

Eventually, the suggested text is accepted or rejected. If accepted, it can be further edited before the meeting notice is sent.

Now Available Worldwide

Auto-draft of meeting agendas is now a default feature that is enabled in OWA and the new Outlook. According to Microsoft, the feature was enabled worldwide from October 9, 2025.

There’s no administrative control to enable or disable auto-draft for meeting agendas. Given the dramatic difference between the scheduling interface of Outlook classic and that of the newer clients, it’s unlikely that auto-draft of agendas will find its way into that client.

New Feature that Won’t Move the Needle

Agenda auto-draft won’t move the needle at all when the time comes for Microsoft 365 tenants to decide whether to embrace Microsoft 365 Copilot. It’s a feature that will please some people (those who schedule meetings and discover how to use agenda auto-draft). For most, I suspect that this is one of the Copilot features that will pass them by because they never need to create an agenda. But that’s always true for new software features.


So much change, all the time. It’s a challenge to stay abreast of all the updates Microsoft makes across the Microsoft 365 ecosystem. Subscribe to the Office 365 for IT Pros eBook to receive insights updated monthly into what happens within Microsoft 365, why it happens, and what new features and capabilities mean for your tenant.

Microsoft 365 Copilot Usage Report API General Availability
https://office365itpros.com/2025/10/10/copilot-usage-report-api-ga/
Fri, 10 Oct 2025 07:00:00 +0000

It’s Nice to be GA, but What Can You Do with the Copilot Usage Report API?

MC877369 first appeared in August 2024 to announce the availability of Microsoft 365 Copilot usage data through the Graph usage reports API (Microsoft 365 roadmap item 396562). The most recent update (6 Oct 2025) sets out a new timeline for general availability of the APIs, which is now expected to roll out in late October 2025 for worldwide completion in late November 2025. Microsoft doesn’t say why the latest delay occurred or why it’s taken so long to move the API from preview to GA.

Still at the Beta Endpoint

Although the Copilot usage report API is heading for general availability, it’s still only accessible through the beta endpoint. There’s nothing wrong with that, providing the API works. Normally, Microsoft Graph APIs accessible through the beta endpoint are under active development to solve performance or reliability problems, or to complete the features necessary to move to production (V1.0) status.

Using the Copilot Usage Report API

I first looked at the API in September 2024 and concluded that most value can be derived from the Copilot user activity detail API. Knowing what apps people use Copilot in is valuable information if you want to do things like:

  • Discover which departments Copilot is being used in and which need a little help to get going. By combining user data from Entra ID with Copilot usage data, we can slice and dice the usage data to generate additional insights (Figure 1).

Figure 1: Combining Entra ID user account data with Copilot usage report data
  • Look for user accounts with expensive ($360/year) Microsoft 365 Copilot licenses and automatically remove underused licenses so that the licenses can be reallocated to people who might use them more. The folks who lose the Microsoft 365 Copilot licenses might be happy with the no-charge Microsoft Copilot chat capability. Or they might be the folks in the company who are using ChatGPT and other AI tools instead of Copilot.
  • A variation on the theme is to integrate Microsoft 365 audit data with Copilot usage report data to drill down into what people are doing with Copilot. The intention once again is to weed out underused Microsoft 365 Copilot licenses so that others might be assigned those licenses.
  • I have a script to create a composite picture of user activity across multiple workloads. It would be easy to add the Copilot usage data to the mix.

Example PowerShell scripts are available to demonstrate the principles explored in each scenario. The point is that usage data is interesting in its own right, but it becomes more powerful when combined with other easily-accessible Microsoft 365 data sources about user activity.
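To illustrate the principle, here is a minimal sketch of the first scenario: joining Copilot usage report rows with Entra ID department data to count active Copilot users per department. The field names mirror the usage report CSV output (“User Principal Name”, “Last Activity Date”) but the data and the helper are illustrative, not part of the Graph API.

```python
# Sketch: combine Copilot usage rows with Entra ID user data to get
# per-department counts of active Copilot users. A blank last activity
# date means the user has never used Copilot in the reporting period.
from collections import Counter

def usage_by_department(usage_rows, entra_users):
    """entra_users maps UPN -> department; returns active-user count per department."""
    dept_of = {upn.lower(): dept for upn, dept in entra_users.items()}
    counts = Counter()
    for row in usage_rows:
        if row.get("Last Activity Date"):            # skip users with no activity
            dept = dept_of.get(row["User Principal Name"].lower(), "Unknown")
            counts[dept] += 1
    return dict(counts)

usage = [
    {"User Principal Name": "Kim.Akers@contoso.com", "Last Activity Date": "2025-10-01"},
    {"User Principal Name": "Tony.Redmond@contoso.com", "Last Activity Date": ""},
    {"User Principal Name": "Ben.Owens@contoso.com", "Last Activity Date": "2025-09-30"},
]
entra = {"Kim.Akers@contoso.com": "Sales", "Ben.Owens@contoso.com": "Sales",
         "Tony.Redmond@contoso.com": "IT"}
by_dept = usage_by_department(usage, entra)
```

In a real script, the usage rows come from the Graph usage reports API and the department data from Entra ID user accounts; the join logic is the same.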

Remember to allow full display of usernames and other information for the report data. If you don’t, the usage data is obfuscated (concealed) and can’t be matched with data from other Microsoft 365 sources.

Other Usage Report APIs

Microsoft 365 supports a bunch of other usage reports APIs for different workloads. Not all workloads featured in the Microsoft 365 admin center are available through a Graph API (like Forms, Project, Visio, and Viva Learning). The same is true for some sub-reports (like Copilot agents). However, there’s enough data available to be able to build a good picture of how people use Microsoft 365 across the board.

The issue with reporting SharePoint URLs (first reported in September 2023) persists. Some security issue is apparently cramping Microsoft’s ability to include site URLs in the site activity report (powered by the getSharePointSiteUsageDetail API), which means that the usage data returned for a site looks like this:

Report Refresh Date      : 2025-10-07
Site Id                  : 66bbf297-2f09-43ec-ab94-9333deacf769
Site URL                 :
Owner Display Name       : Project Haycock Owners
Is Deleted               : False
Last Activity Date       : 2025-05-23
File Count               : 375
Active File Count        : 131
Page View Count          : 0
Visited Page Count       : 0
Storage Used (Byte)      : 110786012
Storage Allocated (Byte) : 27487790694400
Root Web Template        : Group
Owner Principal Name     : projecthaycock@office365itpros.com
Report Period            : 180

The Site Id can be used to find the website URL:

(Get-MgSite -SiteId '66bbf297-2f09-43ec-ab94-9333deacf769').WebUrl
https://office365itpros.sharepoint.com/sites/projecthaycock
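Where many records are affected, the same lookup can be applied in bulk. Here’s a hedged Python sketch of the backfill logic, assuming an id-to-URL map has already been built (for example, by calling Get-MgSite for each site identifier); the record keys mirror the report output shown above.

```python
# Sketch: backfill blank "Site URL" values in site usage report records
# from a site-id -> URL map built beforehand. Records with a URL already
# present are left untouched.
def backfill_urls(records, url_by_site_id):
    for rec in records:
        if not rec.get("Site URL"):
            rec["Site URL"] = url_by_site_id.get(rec["Site Id"], "")
    return records

records = [{"Site Id": "66bbf297-2f09-43ec-ab94-9333deacf769", "Site URL": ""}]
lookup = {"66bbf297-2f09-43ec-ab94-9333deacf769":
          "https://office365itpros.sharepoint.com/sites/projecthaycock"}
backfill_urls(records, lookup)
```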

It’s a mystery why Microsoft won’t or can’t fix this irritating issue. Just one of those cloud mysteries…


Using the Enterprise Website Microsoft 365 Copilot Connector
https://office365itpros.com/2025/09/29/microsoft-365-copilot-connector/
Mon, 29 Sep 2025 07:00:00 +0000

Integrate External Information into Microsoft 365 Copilot Search Results

In July 2025, Microsoft released Microsoft 365 Copilot Search. On its own, Copilot Search is a great way to search SharePoint Online, OneDrive for Business, Teams, and Exchange for information available to the Graph. The more interesting aspect is the ability for tenants with Microsoft 365 Copilot licenses to integrate external sources of information into search. As Microsoft said at the time:

“Copilot Search leverages Microsoft Graph and Microsoft 365 Copilot connectors to index content across Microsoft 365 and third-party apps. It interprets user context, natural language, behavioral signals, and organizational relationships to deliver highly personalized, context-aware answers to complex queries.”

A Gallery of Microsoft 365 Copilot Connectors

The Copilot section of the Microsoft 365 admin center includes a gallery of off-the-shelf Microsoft 365 Copilot Connectors (previously called Graph connectors). These connectors can be plugged into Copilot Search by configuring them to point to external sources of the relevant type, like Jira Cloud or ServiceNow (Figure 1). These are called prebuilt connectors (see the online documentation).

Figure 1: Microsoft 365 Copilot Connector gallery

If a prebuilt connector isn’t available for your preferred source, you can develop a custom connector. For example, the People platform that Microsoft is building envisages information about people being ingested into the Graph through Microsoft 365 Copilot connectors.

Sometimes it’s hard to get your mind around whether a new feature will be valuable in your environment. If someone in your organization asked, “how can we exploit Microsoft 365 Copilot Connectors?”, my guess is that quite a few tenant administrators would struggle to come up with a cogent answer. I’d be in the same position, so I decided to take a look at configuring a connector to see what happens.

Configuring the Enterprise Websites Connector

I chose to test the enterprise websites connector, which is designed to ingest material from company or public websites available on the internet. Configuring the enterprise websites connector is simple. Essentially, all you need is the https URL for a website. When active, the connector crawls the target website to fetch and index content for inclusion in Copilot searches. Given the superb information found in Office365itpros.com and Practical365.com, I decided to configure connectors for both sites. Figure 2 shows the configuration I used to fetch information from the Practical365.com website.

Figure 2: Configuring the enterprise websites connector
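Conceptually, what the connector does boils down to fetch, extract text, and index. Here’s a toy Python sketch of the extract-and-index step over a hardcoded HTML page; real connectors fetch live pages, follow links, and run on a crawl schedule, so this is only an illustration of the idea, not the connector’s implementation.

```python
# Toy illustration of the crawl -> extract -> index cycle a website
# connector performs. A hardcoded HTML string stands in for a fetched
# page, and "indexing" is just a title -> body-text map.
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.title = ""
        self.text = []
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data
        elif data.strip():
            self.text.append(data.strip())

page = ("<html><head><title>Licensing Report Script</title></head>"
        "<body><p>Updated script released.</p></body></html>")
parser = TextExtractor()
parser.feed(page)
index = {parser.title: " ".join(parser.text)}
```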

It takes some time for a connector to perform the initial crawl and bring information into Microsoft search. Eventually, the connector status will turn to Ready (Figure 3), and at that point you’ll know that the information retrieved from the website is available for searching.

Figure 3: Microsoft 365 Copilot connector status in the Microsoft 365 admin center

Searching External Data with Microsoft 365 Copilot Search

Microsoft 365 Copilot Search responds to search requests with information from all the sources available to it. I searched for information about the Microsoft 365 licensing report script that I knew appears in articles published on Office365itpros.com (here’s one) and was rewarded with an instant hit (Figure 4), with the source from which Copilot Search retrieved the information clearly indicated in the results. Even comments on posts are indexed and available.

Figure 4: Microsoft 365 Copilot Search features results from Office365ITPros.com

Clicking the link brings the user directly to the source page on the website. It’s a very seamless experience.

Even better, if asked to summarize the results, Microsoft 365 Copilot integrates the information from the external websites along with the other Graph-based information available to it. In the summary shown in Figure 5, Copilot cites a source from the Practical365.com site and tags it as “ThirdParty,” meaning that the information comes from a non-standard Graph source.

Figure 5: Copilot summary includes information from an external source

Easy, Quick, Seamless

Being able to integrate external website content into Microsoft 365 Copilot Search is one of the best features I’ve seen Microsoft launch in the blizzard of AI-related functionality introduced since the Copilot launch in March 2023. Configuring the connector is quick and easy; crawling happens automatically (and a schedule for crawling can be set up); and the results are presented seamlessly alongside items from other Graph sources. It’s a great example of the power of bringing external data into the Graph and can certainly help answer the question of what to do with Microsoft 365 Graph connectors.

Best of all, this is an easy way to integrate information from trusted external web sites into Graph searches. Who wouldn’t want to integrate content from Office365ITPros.com and Practical365.com?


Insight like this doesn’t come easily. You’ve got to know the technology and understand how to look behind the scenes. Benefit from the knowledge and experience of the Office 365 for IT Pros team by subscribing to the best eBook covering Office 365 and the wider Microsoft 365 ecosystem.

Microsoft Use of Anthropic AI Models Creates Concerns for Tenants
https://office365itpros.com/2025/09/26/anthrophic-copilot/
Fri, 26 Sep 2025 07:00:00 +0000

European Union Microsoft 365 Tenants Might Have Highest Level of Concern

On September 24, Microsoft announced an expansion of the AI models used by Microsoft 365 Copilot to include models created and maintained by Anthropic. Initially, Microsoft says that the Anthropic Claude Opus 4.1 model can be used with the Copilot Researcher agent, and the Claude Sonnet 4 and Claude Opus 4.1 models can be used to create agents in Copilot Studio. The Anthropic models are not available in Copilot Chat or other Copilot integrations with Microsoft 365 apps, which continue to use the OpenAI models.

The interesting thing about this change is that it gives Microsoft 365 tenants located in the European Union and elsewhere some issues to consider. Today, Microsoft safeguards the data used in its versions of the OpenAI models to make sure that data does not leave the tenant or compromise the EU data boundary for the Microsoft cloud, and that processing meets the same standards of privacy and security set down for Copilot.

Although Anthropic seeks to reassure people about processing information in various locations, no guarantee exists that using the Anthropic models with Copilot Researcher or to create Copilot agents maintains the same level of enterprise data protection and integrity that Microsoft has emphasized to date for its AI systems and highlights for the Copilot Researcher agent (Figure 1).

Figure 1: Copilot Researcher chat (without Anthropic) has enterprise data protection

Microsoft Documents the Compliance and Processing Risk

Microsoft’s documentation covering how to connect to the Anthropic models emphasizes that:

When your organization chooses to use an Anthropic model, your organization is choosing to share your data with Anthropic to power the features. This data is processed outside all Microsoft‑managed environments and audit controls, therefore Microsoft’s customer agreements, including the Product Terms and Data Processing Addendum do not apply. In addition, Microsoft’s data‑residency commitments, audit and compliance requirements, service level agreements, and Customer Copyright Commitment do not apply to your use of Anthropic services. Instead, use of Anthropic’s services is governed by Anthropic’s Commercial Terms of Service and Anthropic’s Data Processing Addendum.

In other words, all bets are off once a tenant goes off the beaten path to explore the glorious uplands of the Anthropic models. The flat warning given by Microsoft that data will be processed outside the Microsoft environment without any audit controls is stark. It makes other administrative challenges, like stopping individual users from uploading files to ChatGPT for processing, seem very simple (because uploads are easily blocked).

The loss of all the Microsoft Purview compliance functionality built up around Copilot, like the interaction audit records captured by the Microsoft 365 substrate, is a big problem. It’s certainly enough to stop all but a few tenants that have a clearly defined and well understood need to use the Anthropic models from venturing down the path to choose Anthropic in the Copilot section of the Microsoft 365 admin center (Figure 2).

Figure 2: The Microsoft 365 admin center option to select Anthropic models

Where Next?

Choice is always good, and Microsoft’s move to expand the pool of available Large Language Models is a worthwhile initiative. Although the targets for the Anthropic models are currently limited, the guardrails around the use of those models are also limited. Considering the effort expended by Microsoft to develop security and compliance controls for Copilot since the March 2023 launch, it seems that there’s a huge amount of work to be done before third-party models become a serious option for many Microsoft 365 tenants, even in limited circumstances.

The work to bring model choice to Copilot is obviously evolving, and no doubt Microsoft will deliver some of the controls desired by customers over time. Quite how many of the Purview controls can be applied to third-party AI processing remains an open question.


Support the work of the Office 365 for IT Pros team by subscribing to the Office 365 for IT Pros eBook. Your support pays for the time we need to track, analyze, and document the changing world of Microsoft 365 and Office 365. Only humans contribute to our work!

Copilot Chat Arrives in Microsoft 365 Apps
https://office365itpros.com/2025/09/19/copilot-chat-microsoft-365-apps/
Fri, 19 Sep 2025 07:00:00 +0000

A Logical Progression for Copilot in the Microsoft 365 Apps

The news in message center notification MC1096218 (last updated 17 September 2025) about the rollout of Copilot Chat confirms the worst fears of some that Microsoft is on a one-way path to stuffing Copilot into as many places as it can. Well, that feeling is backed by some truth, but in this case, I think the change is a natural progression of Copilot’s existing presence in apps like Word, where it’s been producing document summaries since last year.

Once Copilot appeared in the Office apps, there was only one way forward, and that wasn’t to see Copilot disappear from Office. Now Copilot Chat is available in Word, Excel, and PowerPoint, just like it has been available in Outlook (new and classic) for a while. Microsoft says that the rollout is expected to complete in the coming weeks, which basically means that it will turn up when the stars align in terms of desktop client Office build and server infrastructure.

Copilot Chat for All

Copilot Chat is available for any user of the Microsoft 365 apps, with or without a Microsoft 365 Copilot license. The difference is that those with Microsoft 365 Copilot licenses can access tenant resources like documents stored in SharePoint Online and OneDrive for Business while those without are restricted to web queries (via Bing search).

Working in Copilot and an Office App

The idea behind the side-by-side implementation is that users can work on a file in the main pane while being able to interact with Copilot in a side pane (Figure 1). It’s a useful feature that makes it easy to take questions from the main file, research them in Copilot, and take the results back into the file.

Figure 1: Editing a Word document with a side-by-side Copilot chat

Apart from anything else, integrating Copilot so tightly into the Office apps makes it less likely that users will seek AI assistance elsewhere and potentially end up uploading documents from SharePoint and OneDrive to services like ChatGPT. It also encourages people to consider upgrading from the free Microsoft Copilot to the full-feature and more expensive Microsoft 365 Copilot.

Word Action Button for Microsoft 365 Copilot Chat

After Outlook, Word is easily the Office app where I spend most time. The announcement in message center notification MC1143298 (last updated 17 September 2025) that an Open in Word action button will soon be available to move text from Copilot to Word is therefore very interesting.

It’s possible to move content from Copilot to Word now using Copilot pages as an interim step. Copilot pages are built from Loop, so the intention is that the content is worked on in Loop after coming from Copilot rather than being exported to a new app. At this point, Word is a more sophisticated word processing tool than Loop is. Given the use cases for the two apps, this is the natural state of affairs. I seldom need to collaborate with others to write articles or book text. Being able to move content from Copilot to Word is an action I shall check out once it becomes available later this month.

Teams Move to the Unified Microsoft 365 Apps Domain

Before closing for the weekend, a little bird tells me that Teams might soon move from its teams.microsoft.com domain to teams.cloud.microsoft as part of the initiative launched by Microsoft to create a unified domain for Microsoft 365 apps.

In March 2024, Microsoft posted a note for developers to tell them that Teams apps needed to be able to use teams.cloud.microsoft. By this point, I’m sure that most ISVs will have updated their apps, but if your tenant has some custom home-grown Teams apps, it’s worthwhile checking with the developers that the apps are ready to accommodate the domain switch. Who wants to be surprised when the switch happens?


Support the work of the Office 365 for IT Pros team by subscribing to the Office 365 for IT Pros eBook. Your support pays for the time we need to track, analyze, and document the changing world of Microsoft 365 and Office 365. Only humans contribute to our work!

Copilot Administrative Skills Don’t Do Much for SharePoint Management
https://office365itpros.com/2025/09/16/sharepoint-skills-copilot/
Tue, 16 Sep 2025 07:00:00 +0000

SharePoint Skills in Copilot Won’t Impress SharePoint Administrators

Message Center notification MC1147976 (4 September 2025, Microsoft 365 roadmap item 501427) apparently heralds a new era of AI-enhanced administrative assistance for Microsoft 365 workloads. The post describes two skills to assist administrators to perform tasks in the SharePoint Admin Center:

  • Step-by-step task guidance: Copilot provides clear instructions to help administrators complete common tasks.
  • Multi-variable site search: Copilot enables administrators to search for sites using multiple conditions, such as inactivity, external sharing, and size, and suggests recommended actions.

The change will roll out in general availability worldwide from October 6, 2025. The capability showed up in my targeted release tenant, so I thought that I’d ask Copilot to help me to manage SharePoint Online, especially because of the promise that Copilot will help “both new and experienced admins complete tasks faster.” Alas, the skills exhibited by Copilot didn’t live up to expectations.

SharePoint Skills and the Promise of AI

Largely because of Teams, SharePoint Online administrators have many more sites to manage than in the past. It therefore makes perfect sense to apply artificial intelligence to help administrators detect potential problems that might be lurking or to find sites that need attention.

I started by asking Copilot to find which sites hold the most files. That seems like a pretty simple question for AI to answer, but it’s not, and Copilot couldn’t respond, saying that it was unable to search by that criterion (Figure 1).

Figure 1: Copilot can’t respond to a SharePoint administrator question

Hmmm… Such a response seems at odds with Microsoft’s promise that Copilot will strengthen governance at scale by allowing administrators to “ask complex questions and receive actionable results, making it easier to detect risks and enforce lifecycle policies across large environments.” Knowing which sites store the most files seems like a fundamental piece of information from a data lifecycle perspective.

SharePoint Skills Need Data

The root of the problem is likely to be the data available for Copilot to reason over. All the Microsoft 365 admin centers present sets of data relevant to a workload through their UX. The Exchange admin center deals with mailboxes and other mail-enabled objects; the Entra admin center deals with directory objects; the Teams admin center deals with Teams policies and other team-related information, and so on. The information in these data sets is whatever’s accessible through and presented by the admin centers.

In the case of my question, the SharePoint Online admin center doesn’t have the data to respond because there’s nowhere in its UX that surfaces the file count for sites. In fact, although the SharePoint admin center reports the total number of files in the tenant, finding the file count for a site takes some effort unless you use the slightly outdated information that’s available through the site usage Graph API.

On the other hand, when I asked Copilot to “Find sites without a sensitivity label that have more than 1GB of storage,” the AI could respond because the storage used by each site is available in the SharePoint admin center (Figure 2).

Figure 2: Copilot finds an answer
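The query that worked is a straightforward filter over site properties the admin center already exposes. A minimal Python sketch of that logic (the property names here are illustrative, not an admin center schema):

```python
# Sketch of "find sites without a sensitivity label that have more than
# 1 GB of storage" as a filter over site properties.
ONE_GB = 1024 ** 3

def unlabeled_large_sites(sites, min_bytes=ONE_GB):
    return [s["url"] for s in sites
            if not s.get("sensitivity_label") and s["storage_bytes"] > min_bytes]

sites = [
    {"url": "https://contoso.sharepoint.com/sites/projects",
     "sensitivity_label": None, "storage_bytes": 5 * ONE_GB},
    {"url": "https://contoso.sharepoint.com/sites/hr",
     "sensitivity_label": "Confidential", "storage_bytes": 8 * ONE_GB},
    {"url": "https://contoso.sharepoint.com/sites/archive",
     "sensitivity_label": None, "storage_bytes": 100_000},
]
flagged = unlabeled_large_sites(sites)
```

The point is that this kind of query needs no AI at all, which is why Copilot can answer it: the data is already sitting in the admin center’s site list.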

Delivering the Promise

Tenant administrators have a lot to do, so any tool that can help is welcome. This is a first-run implementation, so it’s bound to have flaws. Copilot can offer limited help that novice administrators might welcome while not offering much to anyone with some experience. Microsoft is likely to iterate its Copilot assistance for SharePoint administrators to improve, deepen, and enhance what Copilot can offer, but I fear it will take several attempts before the promise of AI is delivered.

What SharePoint Skills Would Help Administrators?

This raises the question of what kind of assistance Microsoft 365 administrators might want from AI tools incorporated into the admin centers. To me, the answer lies in bringing information together from available sources to answer questions faster than a human can.

For example, SharePoint Advanced Management includes a change history report. It would be nice if an administrator could ask Copilot to review all changes made to SharePoint over the last month and report changes, like sensitivity label updates for a site, that generate label mismatches for documents. The information is available in audit logs and SharePoint document libraries, but it takes effort to bring everything together in a concise and understandable format. AI should be capable of answering questions like this instead of simple queries against site properties, which is all that Copilot can do today, and that is hardly a great example of AI in action.
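To make the correlation concrete, here is a hypothetical Python sketch of the idea: derive each site’s current container label from label-update audit events, then flag documents whose own label no longer matches. Every field name and the “mismatch” rule are assumptions for illustration only; neither the audit schema nor the matching logic comes from Purview.

```python
# Hypothetical sketch: correlate sensitivity-label audit events with
# document labels to surface mismatches. Field names are invented for
# illustration, not a Purview or audit log schema.
def label_mismatches(audit_records, documents):
    # Latest container label per site, taken from label-update events
    # processed in time order.
    site_label = {}
    for rec in sorted(audit_records, key=lambda r: r["time"]):
        if rec["operation"] == "SensitivityLabelUpdated":
            site_label[rec["site"]] = rec["new_label"]
    return [d["name"] for d in documents
            if d["site"] in site_label and d["label"] != site_label[d["site"]]]

audit = [{"time": "2025-09-01T10:00", "operation": "SensitivityLabelUpdated",
          "site": "/sites/finance", "new_label": "Confidential"}]
docs = [{"name": "budget.xlsx", "site": "/sites/finance", "label": "General"},
        {"name": "plan.docx", "site": "/sites/finance", "label": "Confidential"}]
mismatched = label_mismatches(audit, docs)
```

The join itself is trivial; the hard part that AI could genuinely help with is gathering and normalizing the inputs from audit logs and document libraries.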


Insight like this doesn’t come easily. You’ve got to know the technology and understand how to look behind the scenes. Benefit from the knowledge and experience of the Office 365 for IT Pros team by subscribing to the best eBook covering Office 365 and the wider Microsoft 365 ecosystem.

Running the SharePoint Site Content and Policy Comparison Report
https://office365itpros.com/2025/09/11/sharepoint-site-content-comparison/
Thu, 11 Sep 2025 06:00:00 +0000

Use the SharePoint Site Content Report to Highlight Issues – If You Can Find Any

As quickly as they can and in as many ways as possible, Microsoft is attempting to address the problem of digital rot in SharePoint Online. Having old, obsolete, inaccurate, and just plain wrong information in SharePoint Online and OneDrive for Business doesn’t matter so much until the eagle eye of AI tools is deployed. At that point, all manner of misleading responses can appear because the AI is grounded on misleading or incorrect information.

To some extent, customers cannot be blamed for the digital debris that they accrue in SharePoint Online. For years, Microsoft has encouraged cloud storage (the latest tactic is a policy setting to block users from saving to non-cloud locations), so it’s natural that some rot accumulates along with valuable material. As I’ve said before, many of the problems customers have encountered with Microsoft 365 Copilot deployments are the legacy of previous Microsoft collaboration strategies. Not everyone goes through the process of auditing the files stored in SharePoint Online to remove dross (here’s how to create a report of files in a document library, and here’s how to do the same for a OneDrive for Business account).

The Site Content and Policy Comparison Report

All of which brings me to the “SharePoint AI-powered site content and policy comparison report” announced in MC1143287 (27 August 2025). The report is available to tenants with Microsoft 365 Copilot or SharePoint Advanced Management (SAM) licenses. It’s one of the SAM features made available to Copilot customers without the need for SAM licenses. The site content and policy comparison report is rolling out in general availability now with completion worldwide due in late December 2025.

According to Microsoft, the idea is that a tenant selects a reference site that they regard as a good example of a site with “up-to-date and relevant policies” to use as a benchmark to compare up to 10,000 other sites against (Figure 1). In addition to policies that apply to the site (like data lifecycle policies), the reference site should contain more than 10 files of the same kind of material that’s found in the target sites. This is because the comparison uses the 10 most recently used files from each site.

Selecting a reference site for the content and policy comparison report.
Figure 1: Selecting a reference site for the content and policy comparison report

The report uses AI to examine the target sites. The target sites can be chosen by uploading a CSV containing their URLs or selected using several site properties, ranging from the simplest (examine everything as shown in Figure 2) to identifying sites based on the container management sensitivity label assigned to sites, the site type, creation date (for example, sites created within the last 30 days), sites with no owners, and sites where external sharing is disabled.

Selecting target sites for the policy comparison report.
Figure 2: Selecting target sites for the policy comparison report

Nothing in My Reports

Behind the scenes, AI compares the target sites against the reference site to highlight inconsistent application of policies based on the similarity between the reference site and target sites (here’s the documentation). After pondering on any anomalies that it finds (a process that Microsoft warns could take up to 48 hours), the AI generates a report for administrators to consider and potentially act upon.

And that’s where my story ends because despite multiple attempts to select a good reference site to compare against the other sites in my tenant, the AI always came up with an empty report. I even purposely populated a site with content that I knew was similar to other sites and edited ten of the files added to the site to make sure that fresh material was available for the comparison. The site had the same sensitivity label and settings as the reference site, but the report still ignored it.

Maybe My SharePoint Deployment Has No Problems

I could take a positive view and conclude that the AI discovered no irregularities. For instance, all my team sites have container management labels and have assigned retention policies. It could also be the case that the selected reference sites are very dissimilar to the other sites in the organization, so much so that none of the other sites came close enough to be of interest.

However, I suspect that the AI comparison just doesn’t work well for tenants where many similar sites exist but only a few of those sites are actively used. I also wonder why Microsoft insists on comparing the ten most recently used files. If the intention is to help organizations prepare for Copilot, perhaps sites that hold many files without sufficient recently modified files should be highlighted instead? After all, in the eyes of AI tools, a file is a file, and the information contained in a file that hasn’t been modified in years could end up being cited in a Copilot response. Using old material often leads to poor responses.

Don’t assume that the site content and policy comparison report is rubbish just because it didn’t work for me. It might work well in your tenant and highlight many areas that you should investigate to improve your tenant’s readiness for AI.


Learn about managing SharePoint Online and the rest of Microsoft 365 by subscribing to the Office 365 for IT Pros eBook. Use our experience to understand what’s important and how best to protect your tenant.

]]>
https://office365itpros.com/2025/09/11/sharepoint-site-content-comparison/feed/ 2 70687
Microsoft Bolts on Copilot License Check onto ExRCA https://office365itpros.com/2025/09/08/copilot-license-check-exrca/?utm_source=rss&utm_medium=rss&utm_campaign=copilot-license-check-exrca https://office365itpros.com/2025/09/08/copilot-license-check-exrca/#comments Mon, 08 Sep 2025 07:00:00 +0000 https://office365itpros.com/?p=70545

Hard to Find Logic for Copilot License Check in Exchange Connectivity Tool

Unless you’re a keen reader of the Microsoft blogs posted to the Microsoft Technical community, you might have missed the August 25 article about a new diagnostic tool for a “Copilot License Details Check.” According to the text, it’s “a powerful tool designed to streamline and validate your license readiness by confirming whether Copilot for Microsoft 365 licenses are properly assigned to users.” In reality, it’s a set of Graph API requests, cobbled together to report details of the Microsoft 365 Copilot license assigned to a user account, that has been bolted onto the side of the Exchange Remote Connectivity Analyzer (ExRCA).

As I explain in another article, ExRCA started as a troubleshooting tool to help Exchange on-premises administrators debug connectivity problems with protocols like Autodiscover and ActiveSync (check out the YouTube video from that time). Later, Microsoft upgraded ExRCA to support Exchange Online and Teams. At this point, it’s fair to say that ExRCA is an invaluable tool for Microsoft 365 tenant administrators.

However, having a valuable support tool is no reason to bolt on a license checker. I’m sure Microsoft will point to the inclusion of the message header analyzer tool in ExRCA as evidence that ExRCA has become a toolbox, but that’s missing the point that the message header tool is at least associated with a messaging protocol (SMTP) whereas the Copilot license check is a barefaced attempt to help people use more Copilot features.

Running a Copilot License Check

Running a Copilot license check is very simple. Input the user principal name or primary SMTP address of a user account, sign in as an administrator with permissions to access user account details, and the code runs to verify that the account has a Microsoft 365 Copilot license with all its service plans intact (Figure 1).

Running a Copilot license check for a user account in ExRCA.
Figure 1: Running a Copilot license check for a user account

A Simple Check

Stripping everything away, the license check is very simple and the results that it generates are equally simple (no expensive CPU cycles for AI analysis are burned here). Figure 2 shows that the user being checked is good to go. I’m sure that this is deemed to be a successful test.

Results of a Copilot license check.
Figure 2: Results of a Copilot license check

But some issues exist. First, the test doesn’t distinguish between direct-assigned licenses and group-assigned licenses, which is valuable information for an administrator to know if they want to address a problem highlighted by the test. Second, the test only considers a “full” Microsoft 365 Copilot license to be valid. Trial licenses are not considered valid. Third, disabling some Copilot features is a perfectly reasonable thing to do. Not everyone needs to create new agents through Copilot Studio, for example.

PowerShell Code for the Copilot License Check

To show what the ExRCA Copilot check does, I recreated the check using the Microsoft Graph PowerShell SDK. The code is simple and took about an hour to write (including testing):

  • Sign into the Graph with Connect-MgGraph using an administrator account.
  • Prompt for a user account and validate that the account exists.
  • Check that the account has a valid Microsoft 365 Copilot license (including trial licenses). License and service plan information is available online.
  • Run the Get-MgUserLicenseDetail cmdlet to retrieve service plan information for the Copilot SKU.
  • Check each of the service plans defined in the license to report if it is enabled or disabled.
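The steps above map to only a few cmdlet calls. Here’s a rough sketch of the core logic (not the downloadable script mentioned below; the SKU part number and service plan property names are assumptions based on typical Get-MgUserLicenseDetail output, and the check is split into a function so the Graph calls stay separate):

```powershell
# Sketch of an ExRCA-style Copilot license check using objects shaped like
# Get-MgUserLicenseDetail output. Property names are assumptions.
function Get-CopilotLicenseStatus {
    param([Parameter(Mandatory)] [object[]]$LicenseDetails)
    # Find a Copilot SKU among the licenses assigned to the account
    $Copilot = $LicenseDetails | Where-Object { $_.SkuPartNumber -like '*Copilot*' }
    if (-not $Copilot) { return [PSCustomObject]@{ Licensed = $false; Plans = @() } }
    # Report each service plan in the Copilot SKU as enabled or disabled
    $Plans = foreach ($Plan in $Copilot.ServicePlans) {
        [PSCustomObject]@{
            Plan    = $Plan.ServicePlanName
            Enabled = ($Plan.ProvisioningStatus -eq 'Success')
        }
    }
    [PSCustomObject]@{ Licensed = $true; Plans = $Plans }
}

# Real usage needs an interactive Graph sign-in, so it is commented out here:
# Connect-MgGraph -Scopes 'User.Read.All'
# $Details = Get-MgUserLicenseDetail -UserId 'John.Doe@contoso.com'
# Get-CopilotLicenseStatus -LicenseDetails $Details
```

Keeping the evaluation in a standalone function makes the logic easy to test with mock license objects before pointing it at live accounts.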

Figure 3 shows some sample output.

Checking Microsoft 365 Copilot license details with PowerShell.
Figure 3: Checking Microsoft 365 Copilot license details with PowerShell

You can download the script from the Office 365 for IT Pros GitHub repository.

No Reason to Use ExRCA to Check License Details

I don’t know how many Microsoft 365 tenant administrators will seek out ExRCA to answer questions like “I wonder if the Copilot license assigned to John Doe is good to go?” It seems like an unnatural reaction to consider ExRCA in that light when it’s straightforward to build your own tools to measure user readiness for Copilot or analyze and report licenses assigned to user accounts (including disabled service plans).

The idea might be a good one, but I fear it’s implemented in the wrong place and is too limited to deliver much value.


Need some assistance to write and manage PowerShell scripts for Microsoft 365, including Azure Automation runbooks? Get a copy of the Automating Microsoft 365 with PowerShell eBook, available standalone or as part of the Office 365 for IT Pros eBook bundle.

]]>
https://office365itpros.com/2025/09/08/copilot-license-check-exrca/feed/ 2 70545
Microsoft Explains the Differences Between Copilot Memories https://office365itpros.com/2025/09/03/copilot-memory-types/?utm_source=rss&utm_medium=rss&utm_campaign=copilot-memory-types https://office365itpros.com/2025/09/03/copilot-memory-types/#comments Wed, 03 Sep 2025 07:00:00 +0000 https://office365itpros.com/?p=70579

Copilot Memory and Copilot Communication Memory or Live Memory

On August 6, I reported about Copilot Memory, a way for users to personalize how Microsoft 365 Copilot responds to prompts. The article mentions message center notification MC1127234 (1 August 2025). In the text I noted some conflicts between the reported dates in the Technical community post about Copilot Memory and MC1127234.

As it turns out, some confusion existed about what Microsoft described in the message center notification because it doesn’t refer to the Copilot Memory feature. Instead, MC1127234 discusses Copilot “communication memory,” or information derived from a user’s personal communication used by Microsoft 365 Copilot to deliver better results to user prompts.

On August 29, 2025, Microsoft reissued MC1127234 (Microsoft 365 roadmap item 499153) to clarify what they meant, saying that the restatement was due to customer feedback, and because the change requires new processing of user data.

Learning to be More Context-Aware from User Communications

The updated text describes an “extended capability” for how Microsoft 365 Copilot gleans information about what’s important to a user from their chats, emails, and meetings. In other words, Copilot learns about what people do from the information contained in the Microsoft Graph and uses that information (referred to as “memories”) to deliver more personalized and “context-aware” responses to user prompts. “Live memories” (the same as communication memory) for individual users are formed by using AI to analyze and summarize the information held in items accessible to an individual user, like transcripts from meetings they attend.

Easy as it is to become confused with so many references to different types of memory, the serious point is that communication memory is “a unified view across communication channels.” In other words, where Copilot integration in Microsoft 365 apps used to focus exclusively on information from a specific app, now Copilot draws in information from multiple sources to create a more rounded picture. For instance, the Facilitator agent in Teams operates on messages posted to a chat. With communication memory, the agent can refer to other sources available through the Graph (like previous emails between the chat participants) to deliver better advice.

People-Related Questions First

Microsoft says that Copilot communication memories will initially be used to improve the quality of Copilot responses to people-related questions. I guess this means that if you include someone’s name in a Copilot prompt, it will automatically query the Graph to find out if any information associated with that name is available to help frame its response. The new mode of processing started on September 1, 2025. However, worldwide deployment is not expected to be complete until late October 2025.

Making Microsoft 365 Copilot A Bit Smarter

Anything that helps AI to do a better job is welcome. In this case, it seems like Microsoft is expanding the horizon of what Copilot looks for when it assembles information to answer a question from the limit of a single app to a more-encompassing view of what’s available to a user in the Graph. That’s a good idea.

Copilot citations for sources.
Figure 1: Copilot citations for sources

Of course, the normal caveats over anything to do with AI apply. Copilot communication memory depends on what Copilot can find, and if the information in the Graph is inaccurate, obsolete, or just plain wrong, then the advice informed by memory could be affected. Microsoft notes that citations (source links) are provided for each source mentioned in a response (Figure 1). How often people check every citation for a Copilot response is quite another matter. Sometimes, any answer will do.


Learn more about how the Microsoft 365 applications really work on an ongoing basis by subscribing to the Office 365 for IT Pros eBook. Our monthly updates keep subscribers informed about what’s important across the Office 365 ecosystem.

]]>
https://office365itpros.com/2025/09/03/copilot-memory-types/feed/ 1 70579
Summarize Email Thread Feature Coming to Outlook https://office365itpros.com/2025/08/26/summarize-email-thread/?utm_source=rss&utm_medium=rss&utm_campaign=summarize-email-thread https://office365itpros.com/2025/08/26/summarize-email-thread/#comments Tue, 26 Aug 2025 07:00:00 +0000 https://office365itpros.com/?p=70479

Releasing Features like Summarize Email Thread without Microsoft 365 Copilot Licenses is Just Business

Those who are surprised by Microsoft making Copilot features in Office available to users without a Microsoft 365 Copilot license don’t understand that it’s simply a matter of business. If Microsoft doesn’t make basic AI features available within Office, ISVs will fill the vacuum by selling add-ons to integrate ChatGPT or other AI with Outlook. If customers buy ChatGPT integrations, it removes the opportunity for Microsoft to sell Microsoft 365 Copilot licenses.

Message center notification MC1124564 (updated 12 August 2025, Microsoft 365 Roadmap item 498320) is a good example. This post announces that the option to summarize email threads will be available in Outlook even to users without a Microsoft 365 Copilot license, provided Copilot Chat is pinned to the navigation bar. The feature is available in Outlook Classic (subscription version), the new Outlook for Windows, and OWA. The pinning option is controlled by a setting in the Copilot section of the Microsoft 365 admin center (Figure 1).

The setting to pin Copilot Chat.
Figure 1: The setting to pin Copilot Chat

Targeted release users should see the feature between late August 2025 and mid-September 2025, with general availability following between mid-September and mid-November 2025.

Summarizing Email Threads

Generative AI always creates the best results when it has a well-defined set of data to process. Just like users who have Microsoft 365 Copilot licenses, Outlook users without a Copilot license will see a Summarize button in the reading pane. Choosing the option calls Copilot to process the email thread to create a summary by extracting the most important points from the thread. Even in a single-item thread, summarization can be valuable by confirming critical issues raised in a message.

Summarizing an email thread doesn’t include other Copilot features like summarizing attachments for a message.

The Business Question

If Microsoft didn’t offer thread summarization in Outlook, customers could find the same functionality in ISV offerings such as AI MailMaestro (ChatGPT for Outlook), available in the Microsoft app store, which includes the ability to summarize “any email for immediate thread analysis and key points” at a price point where “Copilot is 2.5x more expensive than MailMaestro.”

This is not the only example of an Outlook add-in for ChatGPT (here’s another picked at random). OpenAI has their own connector for Outlook email (and others for Outlook calendar, Teams, and SharePoint Online). Using add-ins and connectors creates security, app management, and compliance questions for Microsoft 365 tenants, but some organizations are happy with the trade-off to gain AI features at reduced cost.

No doubt Microsoft will emphasize to customers that their version of the OpenAI software is specially tailored to the demands of Microsoft 365 in a way that a general-purpose LLM cannot be. However, price is a powerful influence and ChatGPT is a very popular solution.

From a Microsoft perspective, if customers embrace OpenAI-based third-party solutions and deploy add-ins or connectors to extend the Office apps, Microsoft loses some degree of account control and their potential to sell Microsoft 365 Copilot licenses is reduced. Neither outcome is an attractive prospect, especially in large enterprise accounts.

In the context of wanting to protect the Office franchise, it’s understandable why Microsoft should make a limited subset of AI-driven features available to users of the Microsoft 365 enterprise apps (subscription version of Office). Apart from making third-party offerings less attractive, getting Copilot’s proverbial foot in the door is likely to encourage investigation of other Copilot functionality like email prioritization that might lead to future purchases.

Raising the Ante

I’ve nothing against Microsoft adding features to Outlook where it makes sense. Summarizing email threads is an example of where everyone can gain from AI, so it seems sensible to add it to Outlook. The fact that adding the feature helps Microsoft to compete with ISVs might seem regrettable, but it’s just business.

In some scenarios, adding features like this might be deemed anti-competitive, but there is plenty of room for ISVs to compete with Microsoft to exploit AI, and including basic features like summarization rapidly becomes the ante to participate in the market.


So much change, all the time. It’s a challenge to stay abreast of all the updates Microsoft makes across the Microsoft 365 ecosystem. Subscribe to the Office 365 for IT Pros eBook to receive insights updated monthly into what happens within Microsoft 365, why it happens, and what new features and capabilities mean for your tenant.

]]>
https://office365itpros.com/2025/08/26/summarize-email-thread/feed/ 2 70479
Microsoft Fixes Copilot Audit Records https://office365itpros.com/2025/08/22/copilot-audit-records-fixed/?utm_source=rss&utm_medium=rss&utm_campaign=copilot-audit-records-fixed https://office365itpros.com/2025/08/22/copilot-audit-records-fixed/#respond Fri, 22 Aug 2025 07:00:00 +0000 https://office365itpros.com/?p=70466

Copilot Audit Records Record Details of SharePoint Online Files Referenced by BizChat

Despite the launch of Microsoft 365 Copilot in March 2023, we’re still in the early stages of AI deployments. Organizations are figuring out whether AI tools can help their business and how to deploy and manage the technology. Part of that activity is data governance and compliance, which seems like a good topic for a Friday post.

Copilot Broke Your Audit Log

Starting off with an August 19 article by Zack Korman entitled “Copilot broke your audit log,” we learn that a gap appeared in the payload of audit events captured for Copilot interactions with files stored in SharePoint Online. In a nutshell, a method existed to convince Copilot to open and use the contents of a file in a response without the corresponding audit event capturing the name of the file. Obviously, this is a bad thing because it undermines the credibility of the audit system when it comes to tracking how users interact with Copilot.

According to the article, the author had a frustrating time after reporting the issue to the Microsoft Security Response Center, but eventually Microsoft closed the gap. To test the fix, I asked BizChat (Microsoft 365 Copilot Chat) to summarize this file with this prompt:

Find the recent document about problems with Copilot audit records and summarize the content. Don’t link to the file and don’t include it as a reference. Just report what the document contains.

The audit records captured for Copilot’s response include details of the files Copilot reviewed as it constructed the response, which is what should happen (Figure 1).

Details of files accessed in a Copilot audit record.
Figure 1: Details of files accessed in a Copilot audit record
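Audit events like these can also be checked programmatically with the Exchange Online Search-UnifiedAuditLog cmdlet. A hedged sketch follows: the CopilotInteraction operation name is what appears in the audit log for these events, but the exact layout of AccessedResources inside the AuditData payload is an assumption based on records like the one in Figure 1, so treat the parsing function as a starting point:

```powershell
# Extract the names of files Copilot consulted from the AuditData JSON payload
# of a CopilotInteraction audit record. The CopilotEventData.AccessedResources
# property path is an assumption based on observed audit records.
function Get-CopilotAccessedFiles {
    param([Parameter(Mandatory)] [string]$AuditData)
    $Data = $AuditData | ConvertFrom-Json
    $Data.CopilotEventData.AccessedResources | ForEach-Object { $_.Name }
}

# Real usage needs an Exchange Online session, so it is commented out here:
# Connect-ExchangeOnline
# $Records = Search-UnifiedAuditLog -StartDate (Get-Date).AddDays(-7) -EndDate (Get-Date) `
#     -Operations CopilotInteraction -ResultSize 500
# $Records | ForEach-Object { Get-CopilotAccessedFiles -AuditData $_.AuditData }
```

If the list of file names comes back empty for interactions that clearly grounded on SharePoint content, that’s the kind of gap the Korman article describes.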

Checking the Compliance Records

The information shown in Figure 1 is an audit record. For the sake of completeness, I used the script described in this article to retrieve the Copilot interaction records from my mailbox for analysis. I noticed that some of the BizChat responses had blank values for the BodyPreview property, so I used the MFCMAPI utility to examine the raw compliance records for interactions captured in the mailbox. It seems like Microsoft has changed the information captured in the PR_HTML property of the compliance record. The property now stores the list of files consulted by Copilot (in HTML) together with the interaction (in a Base64-encoded section).

Reviewing details of a Copilot interaction compliance record with MFCMAPI.
Figure 2: Reviewing details of a Copilot interaction compliance record with MFCMAPI

I’ve seen Base64-encoded content in compliance records previously when investigating the compliance data captured for Microsoft Copilot interactions. It’s easy to decode the information to see the details of the full Copilot response, but it’s interesting that Microsoft chooses to obscure some of the content for this BizChat response.
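Decoding the Base64 section is a couple of lines of PowerShell once the encoded text has been copied out of MFCMAPI. The $Encoded value below is a stand-in created in place for illustration; in practice, paste the Base64 block from the PR_HTML property:

```powershell
# Round-trip demo: $Encoded stands in for the Base64 text copied from the
# PR_HTML property of a compliance record in MFCMAPI.
$Encoded = [Convert]::ToBase64String([Text.Encoding]::UTF8.GetBytes('Copilot response text'))

# Decode the Base64 section back to readable text
$Decoded = [Text.Encoding]::UTF8.GetString([Convert]::FromBase64String($Encoded))
$Decoded
```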

Many Ways to Get Copilot Details

So far we’ve discussed using Copilot audit records and Copilot interaction compliance records to retrieve details of Copilot interactions. Another way is to use the aiInteractionHistory Graph API. Having an array of tools to retrieve information about interactions could be deemed a good thing for compliance administrators who need to manage and monitor how people use AI tools in a Microsoft 365 tenant. On the other hand, having so much information available could create an eDiscovery challenge, a point made by legal firm Redgrave LLP.

The issue is simple. If interaction records exist that show Copilot accessed specific files on behalf of a user, a discovery request made during legal proceedings could demand access to those files on the basis that they might be relevant to the case in hand. Remember, Copilot can only access information that’s available to the signed-in user, and if Copilot accesses files, albeit briefly, as it searches for information to respond to a user prompt, those files could be relevant to proceedings.

Redgrave focuses on Copilot Chat and says that its use in tenants “underscore[s] the importance of maintaining close alignment between IT, Legal, and Information Governance teams as AI tools become increasingly embedded in daily workflows. Organizations that proactively address these challenges will be better positioned to leverage AI benefits while maintaining defensible information governance practices.” Maybe it would have been better if the audit events had “lost” details of the pesky SharePoint files accessed by users. Only joking!


So much change, all the time. It’s a challenge to stay abreast of all the updates Microsoft makes across the Microsoft 365 ecosystem. Subscribe to the Office 365 for IT Pros eBook to receive insights updated monthly into what happens within Microsoft 365, why it happens, and what new features and capabilities mean for your tenant.

]]>
https://office365itpros.com/2025/08/22/copilot-audit-records-fixed/feed/ 0 70466
Microsoft Introduces Copilot Memory https://office365itpros.com/2025/08/06/microsoft-introduces-copilot-memory/?utm_source=rss&utm_medium=rss&utm_campaign=microsoft-introduces-copilot-memory https://office365itpros.com/2025/08/06/microsoft-introduces-copilot-memory/#comments Wed, 06 Aug 2025 07:00:00 +0000 https://office365itpros.com/?p=70246

Personalize Copilot to Make it Better for You

A July 14 Microsoft Technical Community post introduces “Copilot Memory,” a new feature that Microsoft claims makes Copilot smarter and “more in tune with how you work” by remembering key facts about how a user likes to work. In other words, the feature allows people to personalize Copilot so that they don’t need to repeat instructions about how Copilot should respond to their prompts.

Copilot memory is also covered in message center notification MC1127234 (1 August 2025), which says that general availability should be in September 2025. That conflicts with the original post, which says that “Memory in Copilot will be generally available in July 2025” and notes that the feature is enabled by default.

Adding to Copilot Memory

Fortune favors the brave, so I fired up Copilot Chat (BizChat) and used the suggested method to capture a memory. Copilot responded with “memory updated” (Figure 1).

Adding a Copilot memory
Figure 1: Adding a Copilot memory

Selecting the settings option in Copilot’s […] menu duly revealed a personalization section featuring Copilot memory. I added a couple more observations to build out some instructions for Copilot (Figure 2). Interestingly, Copilot combined some of my instructions to make them more concise.

A set of instructions for Copilot memory.
Figure 2: A set of instructions for Copilot memory

This screen is also where a user can disable Copilot memory or delete individual or all memories.

The work profile tab showed that Copilot had taken note of my job title and location, both sourced from my Entra ID account properties.

Using Copilot Memory

With some memories in place, I created a very simple prompt for Copilot to respond to by asking it to create a list of Entra ID users with PowerShell. In Figure 3, you can see that Copilot included the Microsoft Graph PowerShell SDK in its response because it’s covered by one of the memories that I captured.

Using a Copilot memory in a response.
Figure 3: Using a memory in a response

Graph Option to Manage Copilot Memory for a Tenant

In their July 14 post, Microsoft promised user and tenant options to manage Copilot memory. Individual user control is mentioned above. Tenant-level control is enabled using the Enhanced Personalization Graph resource type. Interacting with the resource requires the PeopleSettings.ReadWrite.All permission and the signed-in user must hold at least the People administrator role. The default setting is revealed as follows after running the Connect-MgGraph cmdlet to connect to the Graph:

$Uri = "https://graph.microsoft.com/beta/copilot/settings/people/enhancedpersonalization"
Invoke-MgGraphRequest -Method Get -Uri $Uri

Name                           Value
----                           -----
isEnabledInOrganization        True
disabledForGroup

These settings tell us that Copilot memory is enabled for the organization, and no restrictions apply. To update the settings to disable Copilot memory for everyone, you patch the resource with a payload containing the new settings:

$Settings = @"
{
  "isEnabledInOrganization": false
}
"@
Invoke-MgGraphRequest -Method Patch -Uri $Uri -Body $Settings

Name                           Value
----                           -----
isEnabledInOrganization        False
disabledForGroup

After a short period, Copilot memory is disabled for user accounts (Figure 4).

The effects of disabling Copilot memory.
Figure 4: The effects of disabling Copilot memory

But let’s assume that you want some people to be able to use Copilot memory and not others. In this case, create an Entra ID (security) group and pass its details in the payload. This example reenables Copilot memory for the tenant and adds a group of restricted users:

$GroupId = (Get-MgGroup -Filter "displayName eq 'Users disabled for Copilot Memory'").Id
$Settings = @"
{
  "isEnabledInOrganization": true,
  "disabledForGroup": "$GroupId"
}
"@

Invoke-MgGraphRequest -Method Patch -Uri $Uri -Body $Settings

More Personalization Makes Copilot Better

Microsoft obviously hopes that personalization will make Copilot more attractive to users. Another example of this strategy in action is the Prioritize My Inbox option to give Copilot instructions about how to filter email. I’m a big fan of removing the need to repeat myself, which is essentially what these options do, so I think they add value. Whether others agree and choose to use Copilot instead of ChatGPT or another competitor remains to be seen.


Support the work of the Office 365 for IT Pros team by subscribing to the Office 365 for IT Pros eBook. Your support pays for the time we need to track, analyze, and document the changing world of Microsoft 365 and Office 365. Only humans contribute to our work! Not even Copilot can help us figure out some of the intricacies of Microsoft 365.

]]>
https://office365itpros.com/2025/08/06/microsoft-introduces-copilot-memory/feed/ 7 70246
Copilot Studio Agent Vulnerability to Prompt Injection https://office365itpros.com/2025/07/15/copilot-studio-vulnerability/?utm_source=rss&utm_medium=rss&utm_campaign=copilot-studio-vulnerability https://office365itpros.com/2025/07/15/copilot-studio-vulnerability/#respond Tue, 15 Jul 2025 07:00:00 +0000 https://office365itpros.com/?p=70027

Copilot Studio Agent Sends Salesforce Customer Data to Attacker

The July 7 report (“A Copilot Studio Story 2: When AIjacking Leads to Full Data Exfiltration“) from Zenity Labs is sobering reading for anyone considering how to introduce Copilot agents within a Microsoft 365 tenant. In a nutshell, Zenity created a replica of a “flagship example” agent that McKinsey & Co built using Copilot Studio and proved that an email containing a prompt injection sent to the agent could result in an emailed response containing customer data being sent back to an attacker.

I have an instinctive suspicion of reports issued by security researchers because there are too many examples of overhyped text designed purely to enhance the credentials of the company. In this instance, the Microsoft Security Response Center took the issue seriously, so we should too.

The Problem is Still There

We’ve been down this path before with Copilot because researchers reported how they had compromised BizChat at sessions at the Black Hat USA conference in 2024. Copilot agents didn’t exist at the time, but the same method was used: sending a message for Copilot to process that convinced Copilot to do bad things.

Zenity reported the exploit described in the article to Microsoft, who fixed the problem in late April 2025, most likely through the deployment of a “prompt shielding mechanism.” The net result is that attackers cannot use the same avenue to exfiltrate large quantities of data. However, the kicker is that the fix works for the attack as described, but as Zenity says “Unfortunately because of the natural language nature of prompt injections blocking them using classifiers or any kind of blacklisting isn’t enough.” In other words, attackers can find new ways to use natural language prompts to convince agents to do silly things.

The Stupidity of Agents

The problem is, despite all the hype around artificial intelligence, Copilot agents are essentially stupid. They cannot distinguish between good and bad users, nor can they decide that an action demanded of them is wrong or inappropriate. As we head into an era where agents can talk to agents, the need for increased oversight about what agents do and how they do it is all too apparent.

Managing agent objects in Entra ID is a good way to incorporate agents within the infrastructure, but that doesn’t do anything to reveal what agents do in response to different user prompts, including prompts deliberately intended to do harm. You could pore over the details of Copilot interactions captured by the aiInteractionHistory API or the compliance records captured in user mailboxes by the Microsoft 365 substrate, searching for evidence of attacker intervention, but what would you look for? Searching for the one API record where 500 Salesforce customer records are sent to an address in Russia might be the equivalent of seeking the proverbial needle in a haystack.
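To make the needle-in-a-haystack point concrete, here’s a minimal Python sketch of the kind of triage an ISV tool might perform over exported interaction records. The record shape, the internal domain (contoso.com), and the size threshold are all assumptions for illustration, not part of any official schema:

```python
import re

# Addresses whose domain is not the (assumed) internal contoso.com domain
EXTERNAL_ADDRESS = re.compile(r"[\w.+-]+@(?!contoso\.com)[\w-]+\.[\w.]+")

def flag_suspicious(records, max_body_chars=20000):
    """Flag interaction records that mention an external email address
    or carry an unusually large body (a possible bulk exfiltration)."""
    flagged = []
    for rec in records:
        body = rec.get("body", {}).get("content", "") or ""
        if EXTERNAL_ADDRESS.search(body) or len(body) > max_body_chars:
            flagged.append(rec)
    return flagged
```

Even a crude filter like this illustrates the problem: a determined attacker can phrase an exfiltration prompt so that no static pattern matches.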

Although ISVs can work on the problem of agent governance, it’s obvious that ISVs can only work with agents using the APIs and data made available by Microsoft. Dealing with prompt injections is something that will remain a Microsoft competence.

As AI tools become more embedded in our work, attackers will become more interested in seeking gaps. The battle between Microsoft and the bad guys to protect Copilot (apps and agents) is likely to be a ping-pong contest of exploit followed by remediation.

The Goodness of Copilot Studio Agents

Don’t get me wrong. I like the ease of use that Copilot Studio brings to the agent creation process. Even an old duffer like me can create and publish an agent from Copilot Studio (Figure 1). We’re simply at the point in the evolution of AI tooling where security, governance, and management struggle to keep up with the pace of innovation and overhyped expectations.

Testing an agent in Copilot Studio.
Figure 1: Testing an agent in Copilot Studio

It would be an overreaction to block users from being able to develop agents with Copilot Studio. Some controls are necessary and restricting those who develop agents to a limited group with oversight before publication seems like a reasonable step. I’m sure that more comprehensive development methodologies and structures will emerge over time and will be discussed on the web and at conferences. I’m looking forward to hearing what the experts say at the TEC event in Minneapolis at the end of September. Come along and join the debate!


So much change, all the time. It’s a challenge to stay abreast of all the updates Microsoft makes across the Microsoft 365 ecosystem. Subscribe to the Office 365 for IT Pros eBook to receive monthly insights into what happens, why it happens, and what new features and capabilities mean for your tenant.

Microsoft 365 Copilot Search Rolling Out https://office365itpros.com/2025/07/14/copilot-search-mark2/ Mon, 14 Jul 2025 07:00:00 +0000

Copilot Search Delivers Even More Intelligence?

Prior to Microsoft’s Copilot launch in March 2023, search was simple. Google dominated and had educated people into performing keyword-based searches to find information. Today, the situation isn’t quite so straightforward. AI-generated executive summaries are the norm for Google and other search engines. Keyword-based searching is very different now.

Copilot came with the promise of radically better search. I think Microsoft 365 Copilot has lived up to this promise, but only for Graph-based searches for documents, messages, and email. Web-based searches depend on Bing, and that dependency makes web results less than spectacularly wonderful. Some thought that semantic search would make a big difference, but given that Copilot functions without semantic search, its influence is marginal at best.

Copilot Search Mark 2

Microsoft has a reputation for getting things right on the second or third attempt, which brings us to the launch of Microsoft 365 Copilot search, “a new AI-powered, enterprise-grade search experience” for tenants with Microsoft 365 Copilot licenses. The technology is described in message center notification MC1108844 (3 July 2025, Microsoft 365 roadmap item 490778). The new search is available in targeted release tenants now and is scheduled for general availability starting in mid-July 2025.

According to Microsoft, “Copilot Search leverages Microsoft Graph and Microsoft 365 Copilot connectors to index content across Microsoft 365 and third-party apps. It interprets user context, natural language, behavioral signals, and organizational relationships to deliver highly personalized, context-aware answers to complex queries.” That’s quite a mouthful. After using the new search for several days, it reminds me of the Microsoft Search in Bing feature (retired in March 2025) with a hint of Delve, all wrapped up with BizChat. Microsoft says that the integration with BizChat enables users to move seamlessly from search to task execution.

Copilot Search in Action

After reading the documentation, I headed to the Microsoft 365 Copilot app and selected the Search option, making sure that the option to use the new search was selected. I’ve written extensively about Entra ID license management, so proceeded to see what Copilot Search could find. Figure 1 shows the results. Instead of a simple list of results with the ability to filter by type (files, sites, people, messages, and so on), Copilot Search presents what Microsoft considers to be a more intelligent view, including an extract from the Copilot chat response to the question posed in the search. The full chat response is available by clicking the Continue reading button. In essence, you then participate in a full-blown conversation with Copilot.

Copilot Search in action.
Figure 1: Copilot Search in action

The Modified drop-down offers filters for the past 24 hours, past week, past month, and past year. Under Results, you can refine the results based on items found in SharePoint Online, the web (other sites), Outlook mail, and Teams messages (not shown in Figure 1). Loop workspaces and pages are also scanned by search and are included in SharePoint results rather than having their own type.

The web results are generated from Bing, so the normal caveat applies to the accuracy and usefulness of Bing. Microsoft documentation is heavily favored by Bing, so it’s no surprise that the top two results listed come from that source. Better results are generated if you include a target website address in the search. For example, “What Practical365.com articles cover the topic of Entra ID license management?”

Like BizChat, the DLP policy for Copilot stops Office documents and PDF files assigned specific sensitivity labels from turning up in search results.

Pity About Bing, but Copilot Search Excels with Graph Searches

Like any search, the new Copilot search takes some getting used to before it becomes an effective and efficient tool in user hands. The dependency on Bing weakens the ability of Copilot search to beat Google or other search engines, but the ability to find items in Graph-grounded searches is unmatched. That, and the smooth integration with BizChat and the ability to save output in Copilot pages will likely please people enough to drive usage.


Insight like this doesn’t come easily. You’ve got to know the technology and understand how to look behind the scenes. Benefit from the knowledge and experience of the Office 365 for IT Pros team by subscribing to the best eBook covering Office 365 and the wider Microsoft 365 ecosystem.

Copilot Audio Overviews for OneDrive Documents https://office365itpros.com/2025/07/07/audio-overview-copilot/ Mon, 07 Jul 2025 07:00:00 +0000

Create Audio Overviews for Word and PDF Files and Teams Transcripts

Message center notifications MC1061100 (updated 2 July 2025) and MC1060872 (updated 3 July 2025) both focus on audio overviews generated from documents (Word and PDFs) and Teams meetings (transcripts) stored in OneDrive for Business and Copilot Notebooks. This is yet another example of Microsoft applying AI to Microsoft 365 information. The question is whether having an audio review of a file is of real value or a demonstration of technology that might be used once and then forgotten.

This feature requires a Microsoft 365 Copilot license.

Generating an Audio Overview

The implementation is simple. The Copilot menu for a supported file type in the OneDrive for Business browser interface includes the Create an audio overview option (Figure 1).

The Create an audio overview option in the Copilot menu for a supported document type.
Figure 1: The Create an audio overview option in the Copilot menu for a supported document type

Selecting the option causes Copilot to process the file. Logically, it seems like Copilot summarizes the file into a format similar to a Teams transcript and uploads the output to the Azure Audio Stack for transformation into an audio stream (users can save the summary as an .MP3 file in the Recordings folder of their OneDrive for Business account). For now, only English language audio overviews are available, and only files in English can be processed. Copilot politely refused to process documents that contained non-English text, even when the majority of the text was in English. On the other hand, Copilot had no problem processing files containing computer code, such as the PowerShell examples.

Given that Copilot can generate document summaries in different languages and the support for many languages in the Azure Audio Stack, it seems likely that support for other languages will come soon. I also expect to see UX provided to allow users to select other settings, such as the voices used for output (see below).

MC1060872 says that the OneDrive mobile app can generate audio overviews. I haven’t seen the mobile option appear yet.

Audio Overview Styles

The default style summarizes the key points in a document; essentially, the summary is a report of the document read by a single person. If you prefer, you can switch the overview to a podcast style using the option in the […] menu. The podcast style usually generates a shorter audio stream that’s delivered by two “hosts” (a male voice and a female voice, both with neutral American accents). Figure 2 shows an overview being played with the transcript visible together with the option to switch style.

Playing an audio overview of the Office 365 for IT Pros eBook.
Figure 2: Playing an audio overview of the Office 365 for IT Pros eBook

The audio overview option advises that generation could take a few minutes. I discovered that this is accurate and that overviews for even very large files were available in a couple of minutes. For example, I asked Copilot to generate an audio overview of the Word document for the latest Office 365 for IT Pros eBook. This is a large and complex file (28 MB, 1,250 pages, 22 chapters, and many figures and tables), so I thought it would be a good test. The audio overview was available in less than two minutes. You can download and listen to the summary and podcast versions using the links below to get an idea about the quality and type of output generated for an audio overview.

The DLP Block for Microsoft 365 Copilot

Interestingly, the DLP policy for Microsoft 365 Copilot blocks Copilot from generating audio overviews. I shouldn’t be surprised at this because the idea behind the policy is to stop Copilot from processing confidential files assigned specific sensitivity labels. As noted above, Copilot generates an audio overview using a transcript summary produced from a file. To create the summary, Copilot must be able to extract the file content but is blocked by the DLP policy.

When asked to create an audio overview from a protected file that comes within the scope of the DLP policy, Copilot chews on the problem for a few minutes before concluding that it can’t do anything and errors out (Figure 3). OneDrive must be refreshed before further files can be processed.

Something goes wrong with an audio overview.
Figure 3: Something goes wrong with an audio overview

Although it’s good that the DLP policy for Microsoft 365 Copilot does its job, the poor user experience in the OneDrive for Business browser interface is evidence that the folks who created the audio overview option never considered that a policy might block Copilot access to a file. It would be much better if the UX displayed an immediate error message to say that Copilot cannot process a file instead of making the user wait for a few minutes before Copilot times out.

Are Audio Overviews Valuable?

I might not be the right target market for audio overviews. I suspect that this feature is directed towards people who can’t use regular Copilot document summaries. In this context, I think audio overviews will be very useful. Another scenario where the feature might shine is the ability to save audio overviews of files to OneDrive for listening to during commutes or other journeys. Like all the AI-driven features, the value comes down to the individual. I’m not sure I will ever use a Copilot-generated audio overview again, but I know how to create one if I need it.



Outlook’s New Summarize Option for Email Attachments https://office365itpros.com/2025/06/23/summarize-attachment-outlook/ Mon, 23 Jun 2025 07:00:00 +0000

Summarize Attachment Feature is an Example of New Features Needed to Maintain Customer Interest

Introducing a new technology is hard. The great expectations created at the initial launch soon meet the hard reality of deployment, and things don’t get better until the technology has had time to bake. This is as true for Microsoft 365 Copilot as for any other major technology. I see people questioning whether $30/user/month really delivers any benefit, with real concern over whether the time purportedly saved through Copilot interventions is used for anything more valuable than drinking more coffee.

News that the U.S. Better Business Bureau forced Microsoft to change some of the claims it makes about how Microsoft 365 Copilot affects user productivity doesn’t help the case for AI-based assistance. And lukewarm or mildly enthusiastic (but independent) reports about Copilot usage in organizations, like the recent UK Government report based on a 3-month trial for 20,000 employees, don’t bolster the case much either.

All Microsoft can do is continue to push out updates and new AI-based features to keep customer interest while Copilot matures to become more useful in day-to-day activities. The result is a flood of new Copilot-related features, not all of which seem valuable except in specific cases. I don’t know whether AI-informed People Skills will become popular (some HR professionals that I know like People Skills a lot). Those in the Power Platform world (now with 56 million monthly active users according to data made available at Microsoft’s FY25 Q3 results) see lots of changes to make Copilot agents more productive. I do like the ability to upload documents to agents for the agents to reason over.

Summarizing Attachments

All of which brings me to the update described in message center notification MC1073094 (13 May 2025, Microsoft 365 Roadmap item 475249). It’s an example of a recent Copilot enhancement to help users process “classic” email attachments faster. Even though cloudy attachments are preferable in many respects, many people still send files instead of links.

Copilot has been able to summarize cloudy attachments for quite a while. Now, when a message with one or more classic file attachments arrives, users with a Microsoft 365 Copilot license see a new summarize option for Office and PDF attachments. The feature is available in the new Outlook for Windows, OWA, Outlook mobile, and Outlook for Mac. Microsoft is rolling out the update now with estimated completion by late August 2025. According to MC1112451, Outlook classic will receive the update in mid-July 2025.

Figure 1 shows the general idea. A Word file is attached to a message. Clicking the summarize option from the drop-down menu beside the attachment causes Copilot to create and display the summary for the file inside the Summary by Copilot panel (or card). If a message has multiple file attachments, the summarize option must be invoked separately for each one.

The summarize option for a file attachment for a message opened in OWA.
Figure 1: The summarize option for a file attachment for a message opened in OWA

Copilot cannot process encrypted attachments (using sensitivity labels or another encryption mechanism).

No Archived Messages

My archive mailbox is full of attachments from long-forgotten projects, including files related to some legal cases that I was involved with. I was curious to see what sense Copilot might extract from some of the PDFs and Word documents from those cases. Oddly, Outlook won’t summarize any of the attachments for messages stored in an archive mailbox. To generate a summary for these files, you must download and open Office files in a desktop or web app and use the Copilot options available in the app.

Thinking about why this might be so, I guess the logic is that attachments for archived messages probably aren’t of very high interest, and if someone goes to the trouble of finding an archived message, they have a purpose for doing so and won’t mind opening attachments to view content. On the other hand, I could be overthinking things and Microsoft simply designed the feature to work only with messages from the primary mailbox.

The Value of Small Changes

Over my many years of work, I cannot say how many emails I have received with file attachments. Being able to see a quick summary of an attachment is a good example of how AI can be effective. The feature works well because the AI has just one file to process, so it’s unlikely that hallucinations or other issues will occur. You might disagree with points made in the summary, but having the summary is a timesaver and a great starting point for understanding whether a file contains anything important.

Another example of a small but interesting change is the ability to create a meeting from an Outlook email thread (MC1090693, 9 June 2025, Microsoft 365 roadmap item 494154). The idea is that Copilot scans an email thread to determine the topic for a meeting and its participants and creates a meeting invitation ready to go. This kind of thing doesn’t need AI because existing Graph APIs can do the work, but Copilot integrates the work into a new Schedule with Copilot option (only for email threads with sufficient data to base a meeting upon). According to the roadmap item, this feature is for the mobile clients, but I bet it will be available in the new Outlook and OWA too.
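As a rough illustration of the non-AI part of that work, the following Python sketch derives a calendar-event payload from thread messages. The payload fields follow the Microsoft Graph event resource, but the helper itself and the input message shape are hypothetical:

```python
# Hedged sketch: build a Graph-style event payload from an email thread.
# The from/toRecipients structures mirror the Graph message resource.
def event_from_thread(subject, messages, organizer):
    """Collect thread participants (minus the organizer) as attendees."""
    attendees = {}
    for msg in messages:
        for person in [msg["from"]] + msg.get("toRecipients", []):
            addr = person["emailAddress"]["address"].lower()
            if addr != organizer.lower():
                attendees[addr] = {
                    "emailAddress": person["emailAddress"],
                    "type": "required",
                }
    return {
        "subject": f"Discuss: {subject}",
        "body": {"contentType": "text",
                 "content": "Follow-up meeting for the email thread."},
        "attendees": list(attendees.values()),
    }
```

The resulting payload could be POSTed to the Graph /me/events endpoint; what Copilot adds is working out the topic and picking the right participants from a messy thread.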

In the overall scheme of Copilot, delivering Outlook features to make small tasks easier is not important. However, changes that reduce friction for users are important and collectively a bunch of changes like this might just be enough to convince an organization that they really can’t live without Copilot.



Using a Copilot Agent in SharePoint to Interact with Office 365 for IT Pros https://office365itpros.com/2025/06/16/copilot-studio-agent-knowledge/ Mon, 16 Jun 2025 07:00:00 +0000

Use Office 365 for IT Pros PDF Files as Knowledge Sources for Copilot

The announcement in message center notification MC1078671 (20 May 2025) that Copilot Studio can deploy agents to SharePoint Online sites (in Copilot Studio terms, SharePoint Online is a channel) gave me an idea. SharePoint has supported agents since October 2024, but those agents are limited to reasoning over the information contained in a site. Copilot Studio can create more flexible and powerful agents that can consume different forms of knowledge, including external web sites and files. Uploaded files are stored in the Dataverse, or the mysterious SharePoint Embedded containers that appeared in tenants recently.

My idea is to use the Office 365 for IT Pros eBook as a source for a Copilot agent. Our subscribers can download updated book files every month in PDF and EPUB format. Copilot can consume text files, including PDFs, as knowledge sources (message center notification MC1058260, last updated 9 June 2025, Microsoft 365 roadmap item 489214). If you have Microsoft 365 Copilot licenses, it seems logical to create an agent that uses the PDFs for the Office 365 for IT Pros and Automating Microsoft 365 with PowerShell eBooks as knowledge sources.

You could even expand the set of knowledge sources to https://office365itpros.com and https://practical365.com to include articles written by our author team. Once the agent is configured, it can be published to a SharePoint Online site for users to interrogate. Sounds good? Let’s explore what you need to do to make the idea come alive.

Adding Files to a Copilot Agent

During an investigation of the various ways to create Copilot agents, I created an agent in Copilot Studio called the Microsoft 365 Knowledge Agent. The agent already reasoned over office365itpros.com and practical365.com. I uploaded the PDF files for the two books to the agent so that the agent now reasons over the two websites and two PDF files (Figure 1). You might notice that I have disabled the options for the AI to use its LLMs and to search public websites when composing answers. That’s because I want the agent to limit its responses to the set of defined knowledge sources.

Adding files as knowledge sources for the Copilot agent.
Figure 1: Adding files as knowledge sources for the agent

The upload dialog says that files cannot be “labeled Confidential or Highly Confidential or contain passwords.” This might reflect old information, as Microsoft has support for files protected by sensitivity labels in preview. The implementation seems very like the support for sensitivity labels in BizChat in that a user cannot access a file protected by a label if the label doesn’t grant them access to the content. I also assume that Copilot Studio will eventually support the DLP policy for Microsoft 365 Copilot to stop confidential files with specific labels being used as knowledge sources.

It can take some time for Copilot Studio to process uploaded files to prepare their content for reasoning, depending on their size. Office 365 for IT Pros is a 1,280-page 27 MB eBook, so it took several hours before Copilot Studio declared the file to be ready. You can upload a maximum of 500 files as knowledge sources for an agent.

Updating the Copilot Agent Instructions

Next, I adjusted the instructions for the agent. Here’s what I used:

  • Respond to requests using information from specific curated websites and the files uploaded as knowledge sources.
  • Ensure the information is accurate and relevant to the topic.
  • Provide well-structured and engaging content.
  • Avoid using information from unverified sources.
  • Maintain a professional and informative tone.
  • Be responsive and prompt in handling requests.
  • Focus on topics related to Microsoft 365 and Entra ID technology.
  • Write in a professional, clear, and concise manner.
  • Output PowerShell code formatted for easy copying and use by readers.
  • Ensure the PowerShell code is accurate and functional.
  • Do not guess when answering and create new PowerShell cmdlets that don’t exist. Always check that a cmdlet exists before using it in an answer.

Coming up with good instructions for an agent is an art form. I’m sure that these can be improved, but they work.

Publish the Copilot Agent to SharePoint Online

The next task is to publish the agent. To publish the agent to a SharePoint Online site, I selected SharePoint as the target channel (Figure 2) and then selected the site that I wanted to host the agent. I suspect that Copilot Studio caches site information because it wasn’t possible for search to find a new site for several hours after the site’s creation. Publishing to a site creates an .agent file in the default document library in the site.

Selecting SharePoint as the publication channel for the Copilot agent.
Figure 2: Selecting SharePoint as the publication channel for the Copilot agent

An agent can only be deployed to a single site. If you make a mistake and deploy the agent to the wrong site, you’ll need to undeploy and remove the site from the agent configuration and then deploy the agent to the correct site.

Out of the box, the only person who can use the agent at this point is the publisher. To make the agent available to all site members, a site administrator needs to mark the agent as approved. The agent then shows up in the list of agents accessed through the Copilot button in the menu bar. Any user with a Microsoft 365 Copilot license can use the agent as part of their license. Access for other users must be paid for on a pay-as-you-go basis.

Using the Copilot Agent in SharePoint

Interacting with the agent to ask questions from the knowledge contained in Office 365 for IT Pros is just like any other Copilot interaction. Compose a prompt and submit it to the agent, which contemplates the request and responds based on the knowledge available to it (Figure 3).

Using the agent in a SharePoint site.
Figure 3: Using the agent in a SharePoint site

SharePoint Online is not the only publication channel available to an agent. I also connected the agent to Microsoft 365 and Teams. Figure 4 shows how to chat with the agent in Teams.

Copilot agent working in Teams chat
Figure 4: Copilot agent interacting in Teams chat

The Only Downside is Our Monthly Updates

We know that Office 365 for IT Pros is a big eBook. Sometimes it’s hard to find the precise information that you’re looking for using the standard search facilities. Being able to have an agent reason over the books (and optionally, associated web sites) is an excellent way to have AI do the heavy lifting of finding and extracting knowledge in a very accessible way. The only downside is that you need to update the agent with the new files and republish to the target channels after we release our monthly updates. But that’s not a difficult task – and I am sure that a way will be found to automate the step in the future.



Microsoft Launches the Copilot Interaction Export API https://office365itpros.com/2025/05/30/aiinteractionhistory-api/ Fri, 30 May 2025 07:00:00 +0000

aiInteractionHistory Graph API Available in June 2025

Microsoft 365 message center notification MC1080682 (22 May 2025, Microsoft 365 Roadmap item 491631) announces that the new Microsoft 365 Copilot Interaction Export API (aka, the aiInteractionHistory API) will roll out in June 2025. This is the same API that I covered in a Practical365.com article last December and the documentation still says that the API is available through the Graph beta endpoint. Perhaps the intention is to move the API to the V1.0 (production) endpoint when it’s officially released.

I don’t see much change in how the API works or the retrieved data since I last looked at it. A welcome change is that it is now possible to fetch a maximum of 100 records per request rather than ten. Fetching ten interaction records at a time made the API very slow. Although faster than before, the API is still slow, especially for an API designed to allow third-party apps and ISVs “to export Copilot user interaction data for processing in their security and compliance (S+C) applications.”

Other audit APIs support fetching up to a thousand records at a time. Maybe a V1.0 version of the API will support a higher value. Details of how the API works and an example script can be found in the original article.
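The paging pattern itself is simple. Here’s a hedged Python sketch that follows @odata.nextLink until the records are exhausted; fetch_page stands in for an authenticated Graph request (adding the bearer token, handling throttling) and is an assumption, not part of the API:

```python
# Illustrative pager for a Graph collection fetched 100 records at a time
# (e.g. with $top=100 on the request URL). fetch_page(url) is expected to
# return the parsed JSON body of the response.
def get_all_interactions(fetch_page, first_url):
    """Accumulate all records by walking the @odata.nextLink chain."""
    records, url = [], first_url
    while url:
        page = fetch_page(url)
        records.extend(page.get("value", []))
        url = page.get("@odata.nextLink")  # absent on the final page
    return records
```

Because fetch_page is injectable, the paging logic can be exercised without a live tenant, which also makes it easy to see why a 100-record page size matters: ten times fewer round trips than the original ten-record limit.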

Licenses and Permissions

The AiEnterpriseInteraction.Read.All Graph permission needed to access interaction data is not available as a delegated permission, meaning that the only way to access the data is through an app (including app-only interactive Microsoft Graph PowerShell SDK sessions). Weirdly, accounts used to run apps using the API to fetch interaction records must have a Microsoft 365 Copilot license.

What the aiInteractionHistory API Captures

According to Microsoft, the API “captures the user prompts and Copilot responses in Copilot private interactions chat and provides insights into the resources Copilot has accessed to generate the response.” This statement does not mean that the data lays bare the details of Copilot interactions. Some of the information needs to be mined and interpreted to make sense. For instance, here are the details of an interaction record:

Name                           Value
----                           -----
locale                         en-us
body                           {[content, [AutoGenerated]undefined<attachment id="fd3a9044-309c-4ec9-a568-676f1d521f24"></attachment><attachment id="01TAGX3U2ESA5P3HBQZFHKK2DHN…
from                           {[@odata.type, #microsoft.graph.chatMessageFromIdentitySet], [user, System.Collections.Hashtable], [application, ], [device, ]}
appClass                       IPM.SkypeTeams.Message.Copilot.Word
attachments                    {02 Managing Identities.docx, unknown-file-name}
contexts                       {02 Managing Identities.docx, unknown-file-name}
createdDateTime                25/04/2025 09:27:05
conversationType               appchat
interactionType                userPrompt
mentions                       {}
links                          {}
sessionId                      19:t67NyrXsxDyC8qGGCtSQZYjC3TV1lYvq3IkjzpXquUc1@thread.v2
id                             1745573225046
requestId                      GTbr3lBouCMpcP7L1qVv8Q.20.1.1.1.4
etag                           1745573225046

The appClass property tells us what Copilot app the interaction is for. In this case, it’s Copilot for Word. The attachments property tells us if any reference files are used. One is mentioned here, and given that the body property mentions AutoGenerated, we can conclude that this interaction occurred when Copilot for Word generated an automatic summary for a document.

The interactionType tells us that this record is for a user prompt. Responses from Copilot have aiResponse in the interactionType property. User prompts that aren’t for automatic summaries have the text of the prompt in the body property. For example:

Name                           Value
----                           -----
content                        What functionality isn't available with a Microsoft 365 retention policy
contentType                    text

aiInteractionHistory API requests require the identifier for a user account and the response is the records for that user. Details of the user are in the from property, but you’ll have to navigate to from.user.id to see the identifier for the user. A DisplayName property is available in the from structure but doesn’t hold the display name of the user.
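Putting those details together, here’s a sketch of how a script might normalize a set of fetched records. Property names follow the record examples shown above; $Records is assumed to hold the raw output of the API.

```powershell
# Sketch: separate prompts from responses and surface the user identifier
# buried in the from property. $Records is assumed to hold raw API output.
$Prompts = $Records | Where-Object {$_.interactionType -eq 'userPrompt'}
$Report = ForEach ($Prompt in $Prompts) {
    [PSCustomObject]@{
        User      = $Prompt.from.user.id     # account identifier, not a display name
        App       = $Prompt.appClass         # e.g., IPM.SkypeTeams.Message.Copilot.Word
        Timestamp = $Prompt.createdDateTime
        Prompt    = $Prompt.body.content     # prompt text (or [AutoGenerated] marker)
    }
}
$Report | Format-Table -AutoSize
```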

Assuming that a third-party application wanted to retrieve the AI interaction history records and process them for its own purposes, it’s obvious from this brief discussion that the application has some work to do to interpret the raw data and make it useful for compliance investigations or other purposes. The script published with the December article referenced above shows how to approach the task, which is similar to parsing audit records to extract useful content. Figure 1 shows the kind of data that can be extracted from the aiInteractionHistory API records.

Figure 1: Data extracted using the aiInteractionHistory API

The Many Sources of Information About Copilot Interactions

It’s hard to know how useful the aiInteractionHistory API will turn out to be. Other sources of information can be mined to discover how people use Copilot, including usage data, audit records, and the compliance records held in user mailboxes. I guess it all depends on what you’re looking for.


The Case of the Mysterious SharePoint Embedded Containers https://office365itpros.com/2025/05/28/sharepoint-embedded-containers-km/?utm_source=rss&utm_medium=rss&utm_campaign=sharepoint-embedded-containers-km https://office365itpros.com/2025/05/28/sharepoint-embedded-containers-km/#comments Wed, 28 May 2025 07:00:00 +0000 https://office365itpros.com/?p=69322

Oddly Named SharePoint Embedded Containers Show Up for Copilot Studio

Microsoft 365 tenant administrators can be swamped with message center notifications, reports about service health issues, and automated email generated by Entra ID and other workloads. Other more important things usually get in the way and often no great harm is done. Right now, there are 830 notifications in the message center for my tenant, and probably only 20% of the notifications are what I consider important. For instance, knowing that a new channel update is available for the Office apps isn’t usually a critical event.

In any case, some gems do appear, and it’s important that tenant administrators keep an eye on what’s happening. Let’s discuss an example involving SharePoint Embedded and Copilot Studio to illustrate the point.

The Set of SharePoint Embedded Containers with GUID Names

At first glance, message center notification MC1058260 (last updated 12 May 2025, Microsoft 365 roadmap item 489214), titled “Microsoft 365 Copilot: Admin controls and user file uploads for agent knowledge sources,” didn’t seem too worrying. Given Microsoft’s current preoccupation with AI, it’s unsurprising that a flood of notifications describing various Copilot enhancements appears weekly. As I don’t use Copilot Studio much, it was easy to assume that the development wouldn’t impact my tenant.

When investigating how Loop workspaces connected to Teams standard channels, I noticed a bunch of strange containers for the Declarative Agent app had appeared in SharePoint Embedded (Figure 1). Some process had created these containers in three batches on April 27 (3:25am), 8 May (1:53am), and 15 May (2:21pm). All the containers appeared to be empty. The only clue was the application name, indicating that the containers are related to some form of agents.

Figure 1: Some of the mysterious SharePoint Embedded Containers created for Copilot agents

Agents process information from knowledge sources like SharePoint Online sites. MC1058260 explains that users will soon be able to upload up to 20 documents for agents to use as knowledge sources, and when this happens, the uploaded files are stored in “tenant-owned Microsoft SharePoint Embedded (SPE) containers.” MC1058260 goes on to note that “As part of this rollout, we will pre-provision a limited set of SPE containers in your tenant.” The mystery is solved because these containers are the pre-provisioned containers mentioned by MC1058260. I assume that Microsoft creates the containers to make it faster for users to upload documents (because they don’t have to wait for an agent to create a container).

Adding Files as Knowledge Sources for Agents

My tenant ended up with 80 pre-provisioned containers (so far – I have no idea if more provisioning cycles will happen in the future). As far as I can tell, the provisioning operation didn’t generate any audit records. At least, audit log searches for the creation times for the containers turn up nothing of interest.

My tenant doesn’t have 80 agents in use (the number is more like 8), so I assume that the pre-provisioned containers are a pool that agents can use. To test the theory, I edited an agent that I created with Copilot Studio a couple of months ago and added the source Word document for the Automating Microsoft 365 with PowerShell eBook as a knowledge source (Figure 2).

Figure 2: Adding a file as a knowledge source for a Copilot agent

What I expected was that one of the pre-provisioned containers would be allocated to the agent and renamed from the GUID assigned by the pre-provisioning routine to the name of the agent. Updates don’t happen quickly in the SharePoint admin center, and site and container data are usually at least two days behind real time, so I was prepared to wait. However, no change showed up over the next few days.

The Mysterious SharePoint Embedded Containers Disappear

And then, Microsoft hid the pre-provisioned containers. I had chatted to some Microsoft contacts and complained about the mysterious containers, so I guess they acted. In any case, there’s now no trace of the containers and I can’t find out if the updated agent took over a container. And as I don’t know the application identifier for the Declarative Agent app, I can’t use the Get-SPOContainer cmdlet to retrieve any details like the storage consumption (or name) to check if anything had changed in the set of containers.
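For the record, if the owning application identifier were known, the check would be simple. The identifier below is a hypothetical placeholder, not the real app id, and the tenant admin URL is illustrative.

```powershell
# Sketch: fetch SharePoint Embedded containers owned by an application to
# inspect names and storage. The OwningApplicationId is a hypothetical
# placeholder because the real identifier for the Declarative Agent app
# isn't known, as discussed above.
Connect-SPOService -Url 'https://office365itpros-admin.sharepoint.com'
Get-SPOContainer -OwningApplicationId '00000000-0000-0000-0000-000000000000' |
    Format-Table -AutoSize
```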

It’s probably best that Microsoft hides these containers when they are newly created and empty. However, once a container is used by an agent, I think it should show up in the set of active containers displayed in the SharePoint admin center, if only because the storage consumed by the container is charged against the tenant SharePoint Online storage quota. It’s the kind of detail that Microsoft needs to deliver for tenant-wide agent management.

The mystery is solved, and I learned how to add a file as a knowledge source for an agent. Keep an eye on the notifications posted to the message center. You might even learn something too!


Insight like this doesn’t come easily. You’ve got to know the technology and understand how to look behind the scenes. Benefit from the knowledge and experience of the Office 365 for IT Pros team by subscribing to the best eBook covering Office 365 and the wider Microsoft 365 ecosystem.

Why Copilot Access to “Restricted” Passwords Isn’t as Big an Issue as Uploading Files to ChatGPT https://office365itpros.com/2025/05/20/microsoft-365-copilot-pen-test2/?utm_source=rss&utm_medium=rss&utm_campaign=microsoft-365-copilot-pen-test2 https://office365itpros.com/2025/05/20/microsoft-365-copilot-pen-test2/#comments Tue, 20 May 2025 07:00:00 +0000 https://office365itpros.com/?p=69290

Unless You Consider Excel Passwords to be Real Passwords

I see that some web sites have picked up the penetration test story about using Microsoft 365 Copilot to extract sensitive information from SharePoint. The May 14 Forbes.com story is an example. The headline of “New Warning — Microsoft Copilot AI Can Access Restricted Passwords” is highly misleading.


Unfortunately, tech journalists and others can rush to comment without thinking an issue through, and that’s what I fear has happened in many of the remarks I see in places like LinkedIn discussions. People assume that a much greater problem exists than is actually the case; if they thought things through, they’d see the holes in the argument being presented.

Understanding the Assumptions made by the Penetration Test

As I pointed out in a May 12 article, the penetration test was interesting (and did demonstrate just how weak Excel passwords are). However, the story depends on three major assumptions:

  • Compromise: The attacker has control of an Entra ID account with a Microsoft 365 Copilot license. In other words, the target tenant is compromised. In terms of closing off holes for attackers to exploit, preventing account compromise is the biggest problem in the scenario. All user accounts should be protected with strong multifactor authentication, such as the Microsoft Authenticator app, passkeys, or FIDO2 keys. SMS is not sufficient, and basic authentication (passwords alone) is just madness.
  • Poor tenant management: Once inside a tenant and using a compromised account, Microsoft 365 Copilot will do what the attacker asks it to do, including finding sensitive information like a file containing passwords. However, Copilot cannot find information that is unavailable to the signed-in user. If the tenant’s SharePoint Online deployment is badly managed without well-planned and well-managed access controls, then Copilot will happily find anything that the user’s access allows it to uncover. This is not a problem for Copilot: it is a failure of tenant management that builds on the first failure to protect user accounts appropriately.
  • Failure to deploy available tools: Even in the best-managed SharePoint Online deployment, users can make mistakes when configuring access. Users can also follow poor practices, such as storing important files in OneDrive for Business rather than SharePoint Online. But tenants with Microsoft 365 Copilot licenses can mitigate user error with tools available to them, such as Restricted Content Discovery (RCD) and the DLP policy for Microsoft 365 Copilot. The latter requires the tenant to deploy sensitivity labels too, but that’s part of the effort required to protect confidential and sensitive information.

I’m sure any attacker would love to find an easily-compromised tenant where they can gain control over accounts that have access to both badly managed SharePoint Online sites that hold sensitive information and Microsoft 365 Copilot to help the attackers find that information. Badly-managed and easily-compromised Microsoft 365 tenants do exist, but it is my earnest hope that companies who invest in Microsoft 365 Copilot have the common sense to manage their tenants properly.

Uploading SharePoint and OneDrive Files to ChatGPT

Personally speaking, I’m much more concerned about users uploading sensitive or confidential information to OpenAI for ChatGPT to process. The latest guidance from OpenAI explains how the process works for their Deep Research product. Users might like this feature because they can have their documents processed by AI. However, tenant administrators and anyone concerned with security or compliance might have a different perspective.

I covered the topic of uploading SharePoint and OneDrive files to ChatGPT on March 26 and explained that the process depends on an enterprise Entra ID app (with app id e0476654-c1d5-430b-ab80-70cbd947616a) to gain access to user files. Deep Research is different and its connector for SharePoint and OneDrive is in preview, but the basic principle is the same: a Graph-based app uploads files for ChatGPT to process. If that app is blocked (see my article to find out how) or denied access to the Graph permission needed to access files, the upload process doesn’t work.
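Using the app identifier cited above, blocking the connector takes only a few lines of Microsoft Graph PowerShell SDK code. This is a sketch: creating the service principal first (if it doesn’t already exist in the tenant) ensures there is something to disable.

```powershell
# Sketch: disable the enterprise app that uploads SharePoint and OneDrive
# files to ChatGPT. The app id comes from the article text; if no service
# principal exists in the tenant yet, create one so that it can be disabled.
$AppId = 'e0476654-c1d5-430b-ab80-70cbd947616a'
$SP = Get-MgServicePrincipal -Filter "appId eq '$AppId'"
If ($null -eq $SP) {
    $SP = New-MgServicePrincipal -AppId $AppId
}
Update-MgServicePrincipal -ServicePrincipalId $SP.Id -AccountEnabled:$false
```

With the service principal disabled, sign-ins and token issuance for the app fail, so the upload process described above stops working.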

Set Your Priorities

I suggest that it’s more important to block uploading of files from a tenant to a third-party AI service where you don’t know how the files are managed or retained. It certainly seems like a more pressing need than worrying about the potential of an attacker using Microsoft 365 Copilot to run riot over SharePoint, even if a penetration test company says that this can happen (purely as a public service, and not at all to publicize their company).

At least, that’s assuming user accounts are protected with strong multifactor authentication…


Microsoft 365 Copilot Gets Viva Insights Service Plans https://office365itpros.com/2025/05/19/microsoft-365-copilot-license-sp/?utm_source=rss&utm_medium=rss&utm_campaign=microsoft-365-copilot-license-sp https://office365itpros.com/2025/05/19/microsoft-365-copilot-license-sp/#comments Mon, 19 May 2025 07:00:00 +0000 https://office365itpros.com/?p=69239

Two Workplace Analytics Service Plans to Enable Viva Insights

Microsoft message center notification MC1009917 (last updated 25 April 2025, Microsoft 365 roadmap item 471002) announced the inclusion of Viva Insights in the Microsoft 365 Copilot license. The mechanism used is the addition of two “Workplace Analytics” service plans to join the existing eight service plans (Table 1) that make up the Copilot license. More information is available in the documentation for the Copilot features made available by these service plans.

Service Plan                                                Service Plan SKU                      Service Plan Part Number
Microsoft Copilot with Graph-grounded chat (Biz Chat)       3f30311c-6b1e-48a4-ab79-725b469da960  M365_COPILOT_BUSINESS_CHAT
Microsoft 365 Copilot in Productivity Apps                  a62f8878-de10-42f3-b68f-6149a25ceb97  M365_COPILOT_APPS
Microsoft 365 Copilot in Microsoft Teams                    b95945de-b3bd-46db-8437-f2beb6ea2347  M365_COPILOT_TEAMS
Power Platform Connectors in Microsoft 365 Copilot          89f1c4c8-0878-40f7-804d-869c9128ab5d  M365_COPILOT_CONNECTORS
Graph Connectors in Microsoft 365 Copilot                   82d30987-df9b-4486-b146-198b21d164c7  GRAPH_CONNECTORS_COPILOT
Copilot Studio in Copilot for Microsoft 365                 fe6c28b3-d468-44ea-bbd0-a10a5167435c  COPILOT_STUDIO_IN_COPILOT_FOR_M365
Intelligent Search (Semantic search and dataverse search)   931e4a88-a67f-48b5-814f-16a5f1e6028d  M365_COPILOT_INTELLIGENT_SEARCH
Microsoft 365 Copilot for SharePoint                        0aedf20c-091d-420b-aadf-30c042609612  M365_COPILOT_SHAREPOINT
Workplace Analytics (backend)                               ff7b261f-d98b-415b-827c-42a3fdf015af  WORKPLACE_ANALYTICS_INSIGHTS_BACKEND
Workplace Analytics (user)                                  b622badb-1b45-48d5-920f-4b27a2c0996c  WORKPLACE_ANALYTICS_INSIGHTS_USER

Table 1: Microsoft 365 Copilot Service Plans

The last update from Microsoft said that updates to add the Viva Insights service plans completed in mid-April 2025.

Viva Insights and Microsoft 365 Copilot

According to Microsoft, with access to Workplace Analytics, “IT admins and analysts can tailor advanced prebuilt Copilot reports with their business data or create custom reports with organizational attributes, expanded Microsoft 365 Copilot usage metrics, and more granular controls.” The data is exposed in Viva Insights (web), the Viva Insights Teams app (Figure 1), and the Viva Insights mobile apps.

Figure 1: Copilot Dashboard in the Viva Insights Teams app

Everyone running a Copilot deployment is intimately aware of the need to track and understand how people use AI in different apps. The API behind the Copilot usage report in the Microsoft 365 admin center delivers sparse information. It’s possible to enhance the usage report data with audit data and use the result to track down people who don’t make use of expensive licenses, but that requires custom code. Hence the insights reported in the Copilot Dashboard in Viva Insights.

A note in the announcement says that access to the Copilot Dashboard now requires a minimum of 50 Viva Insights (Copilot) licenses. As obvious from Figure 1, my tenant has fewer than 50 licenses, but can still use Viva Insights because it’s not a new tenant.

What Service Plans Do

As you’re probably aware, a license (product, or SKU) is something that Microsoft sells to customers. A service plan enables or disables specific functionality within a license. For example, the Copilot license includes the Copilot Studio in Copilot for Microsoft 365 service plan, which in turn allows users to create agents in Copilot Studio. If you don’t want people to be able to access Copilot Studio, you can disable the service plan.

Disabling a service plan can be done by updating a user’s licenses through the Microsoft 365 admin center. Options are available to do this through User Accounts or License Details (Figure 2).

Figure 2: Amending service plans for a user’s Microsoft 365 Copilot license

If you use group-based licensing, you can amend the options for the Copilot license to remove service plans. However, this affects every user in the group, so you might end up with one group to assign “full” Copilot licenses and another to assign “restricted” licenses.

Be Careful When Disabling Copilot Service Plans

One potential issue with some Copilot service plans is that you’re never quite sure what removing a service plan will do. Removing the Microsoft 365 Copilot in Productivity Apps service plan seems straightforward because it disables the Copilot options in the Office desktop apps (all platforms). But disabling the Intelligent Search service plan will mess up any app that uses Copilot to search.

Blocking Copilot Studio is problematic. Removing the service plan only removes the ability of a user to sign in to use Copilot Studio. They can still sign in for a 60-day trial, just like anyone else with an email address who doesn’t have a Copilot Studio license.

Disabling Copilot Service Plans with PowerShell

Disabling service plans through a GUI can rapidly become tiresome. I wrote a PowerShell script (downloadable from GitHub) to demonstrate how to use the Set-MgUserLicense cmdlet from the Microsoft Graph PowerShell SDK to disable a Copilot service plan. Another variation on removing service plans is explained here.

The script checks for group-based license assignment for Copilot licenses and if found, creates an array of excluded accounts that it won’t process. It then scans for accounts with a Microsoft 365 Copilot license and if the account isn’t excluded, runs Set-MgUserLicense to disable the Copilot Studio service plan. It’s just an example of using PowerShell to automate a license management operation and is easily amended to process any of the Copilot service plans. Enjoy!!
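The core of the operation is a call to Set-MgUserLicense. Here’s a simplified sketch for a single account, using the Copilot Studio service plan identifier from Table 1. Note that the submitted DisabledPlans array replaces any plans previously disabled for the SKU, so a real script should merge in existing disabled plans first.

```powershell
# Sketch: disable the Copilot Studio service plan for one account while
# keeping the rest of the Microsoft 365 Copilot license. The SKU is found by
# locating the subscription that contains the plan (identifier from Table 1).
$CopilotStudioPlan = 'fe6c28b3-d468-44ea-bbd0-a10a5167435c'
$Sku = Get-MgSubscribedSku | Where-Object {
    $_.ServicePlans.ServicePlanId -contains $CopilotStudioPlan }
Set-MgUserLicense -UserId 'Lotte.Vetler@office365itpros.com' -RemoveLicenses @() `
    -AddLicenses @(@{SkuId = $Sku.SkuId; DisabledPlans = @($CopilotStudioPlan)})
```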


Stay updated with developments across the Microsoft 365 ecosystem by subscribing to the Office 365 for IT Pros eBook. We do the research to make sure that our readers understand the technology. The Office 365 book package includes the Automating Microsoft 365 with PowerShell eBook.

Penetration Test Asks Questions About Copilot Access to SharePoint Online https://office365itpros.com/2025/05/12/copilot-for-microsoft-365-pentest/?utm_source=rss&utm_medium=rss&utm_campaign=copilot-for-microsoft-365-pentest https://office365itpros.com/2025/05/12/copilot-for-microsoft-365-pentest/#comments Mon, 12 May 2025 07:00:00 +0000 https://office365itpros.com/?p=69177

Can Attackers Use Copilot for Microsoft 365 to Help Find Information?

An article by a UK-based security penetration test company titled “Exploiting Copilot AI for SharePoint” drew my attention, so I looked to see what weaknesses the testing had found. I was disappointed. Although the article makes some good points, it doesn’t reveal anything new about the potential issues that can arise from poor protection of information stored in SharePoint Online sites. Let’s discuss the points raised in the article.

A Compromised Account

Copilot for Microsoft 365 always works as a signed-in user. Before an attacker can use Copilot for Microsoft 365, they must be able to sign into a licensed user’s account. In other words, that account is compromised. That’s bad for a tenant because any compromise can lead to data loss or other damage, and it’s probably indicative of other problems that attackers can exploit without going near Copilot.

Organizations should protect themselves with strong multifactor authentication (MFA). That message seems to be slowly getting through, and you’d imagine that any tenant willing to invest in Copilot is also willing to protect themselves by insisting that all accounts are protected by MFA.

Seeking Sensitive Information

The authors make a good point that people often store sensitive information in SharePoint Online. Attackers like to search for information about passwords, private keys, and sensitive documents. Copilot undoubtedly makes it much easier for attackers to search, but I don’t think that the default site agents create any vulnerability because these agents are constrained to searching within the sites they belong to.

Custom agents might be more problematic, but that depends on the information accessed by the agents. It also depends on the compromised user being able to run the custom agents. The big thing to remember here is that Copilot can only access data available to the account being used. Custom agents in the hands of an attacker can’t automagically get to some hidden data. Anyway, organizations should monitor the creation of agents and have some method to approve the use of those agents.

Accessing Password Data

The penetration team reported that they had found an interesting file (an encrypted spreadsheet) that appeared to contain passwords that SharePoint blocked access to because “all methods of opening the file in the browser had been restricted.” This sounds like SharePoint’s block download policy was in operation for the site. However, Copilot was able to fetch and display the passwords stored in the file.

It’s likely that the spreadsheet was “encrypted” using the default Excel protection applied when a user adds a password to a spreadsheet. However, the encryption is no match for Microsoft Search, which can index the information in the file, and that’s what Copilot for Microsoft 365 Chat was able to display (Figure 1).

Figure 1: Copilot for Microsoft 365 reveals some passwords stored in a password-protected Excel worksheet

Excel’s encryption is very poor protection in the era of AI. Sensitivity labels should be used to secure access to sensitive information, specifically labels that do not allow Copilot to extract and display information from files found through Microsoft Search. Even better, use the DLP policy for Microsoft 365 Copilot to completely hide sensitive files from Copilot so that not even the file metadata is indexed.

Alternatively, use Restricted Content Discovery (RCD) to hide complete sites from casual browsing by attackers (or anyone else looking for “interesting” information). Apart from RCD, Microsoft makes other SharePoint Advanced Management (SAM) features available to Microsoft 365 Copilot tenants. There’s no excuse for failing to use the access control and reporting features to secure sensitive sites.

Copilot for Microsoft 365 is a Superb Seeker

Copilot for Microsoft 365 is superb at finding information stored in SharePoint Online and OneDrive for Business. With good prompting, an attacker with access to a compromised account can retrieve data faster than ever before, and unlike previous methods of trawling through SharePoint files, Copilot access doesn’t leave breadcrumbs like entries in the last files accessed list.

Copilot access can be constrained by making sure that suitable permissions are in place for documents, deploying the DLP policy for Microsoft 365 Copilot, and limiting access to confidential sites through Restricted Content Discovery. The DLP policy and RCD are recent Copilot control mechanisms that I don’t think the authors of the penetration test report considered (even though they refer to blocking agents with RCD). But available mechanisms are worthless unless implemented, and the real value of reports like this is to prompt administrators to use available tools, including MFA to reduce the likelihood of a compromised account.



How to Enhance Copilot Usage Data https://office365itpros.com/2025/05/09/copilot-usage-data-accounts/?utm_source=rss&utm_medium=rss&utm_campaign=copilot-usage-data-accounts https://office365itpros.com/2025/05/09/copilot-usage-data-accounts/#comments Fri, 09 May 2025 07:00:00 +0000 https://office365itpros.com/?p=69127

Combine Copilot Usage Data with User Account Details to Gain Better Insight for Deployments

Discussing the usage data that’s available for Microsoft 365 Copilot (in the Microsoft 365 admin center and via a Graph API), a colleague remarked that it would be much easier to leverage the usage data if it contained the department and job title for each user. The usage data available for any workload is sparse and needs to be enhanced to be more useful.

Knowing what data sources exist within Microsoft 365 and how to combine sources with PowerShell or whatever other method you choose is becoming a valuable skill for tenant administrators. I’ve been down this path before to discuss combining usage data with audit data to figure out user accounts who aren’t using expensive Copilot licenses. Another example is combining Entra ID account information with MFA registration methods to generate a comprehensive view of user authentication settings.

Scripting a Solution

In this instance, the solution is very straightforward. Use a Graph API call (complete with pagination) to download the latest Copilot usage data. Find the set of user accounts with a Microsoft 365 Copilot license and loop through the set to match each user account with its usage data. Report what’s found (Figure 1).

Figure 1: Copilot usage data combined with user account details
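The matching step looks something like this sketch. Here $UsageData is assumed to hold the downloaded usage report rows, and the SKU part number and report property names are assumptions based on current values rather than guaranteed contracts.

```powershell
# Sketch: match Copilot-licensed accounts against downloaded usage data held
# in $UsageData. The SKU part number and report property names are assumptions.
$CopilotSkuId = (Get-MgSubscribedSku |
    Where-Object SkuPartNumber -eq 'Microsoft_365_Copilot').SkuId
[array]$Users = Get-MgUser -All `
    -Property Id, UserPrincipalName, DisplayName, Department, JobTitle `
    -Filter "assignedLicenses/any(x:x/skuId eq $CopilotSkuId)"
$Report = ForEach ($User in $Users) {
    $Usage = $UsageData | Where-Object {$_.userPrincipalName -eq $User.UserPrincipalName}
    [PSCustomObject]@{
        User             = $User.DisplayName
        Department       = $User.Department
        JobTitle         = $User.JobTitle
        LastActivityDate = $Usage.lastActivityDate   # blank means no recorded activity
    }
}
```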

Obfuscated Data and Graph Reports

The thing that most people trip over is matching usage data with user accounts. This is impossible if your tenant obfuscates (anonymizes) usage data. This facility has been available since late 2020 and if the obfuscation setting is on in the Microsoft 365 admin center, all usage data, including the data used by the admin center and Graph API requests is “de-identified” by replacing information like user principal names and display names with a system-generated string.

It’s therefore important to check the setting and reverse it if necessary for the duration of the script to make sure that you can download “real” user information. If you don’t, there’s no way of matching a value like FE7CC8C15246EDCCA289C9A4022762F7 with a user principal name like Lotte.Vetler@office365itpros.com.
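With the Microsoft Graph PowerShell SDK, the check is short. This is a sketch; a full script should capture the original value and restore it when processing finishes.

```powershell
# Sketch: if usage data is concealed (obfuscated), turn the setting off
# before downloading the report, and restore it afterwards.
$Settings = Get-MgAdminReportSetting
If ($Settings.DisplayConcealedNames -eq $true) {
    Update-MgAdminReportSetting -BodyParameter @{displayConcealedNames = $false}
}
```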

Fortunately, I had a lot of code to repurpose, so the script wasn’t difficult to write. You can download the complete script from the Office 365 for IT Pros GitHub repository.

Finding Areas for Focus

Getting back to the original question, I assume the idea of including job titles and departments with Copilot usage data is to figure out where to deploy assistance to help people understand how to use Copilot in different apps. You could do something like this to find the departments with Copilot users who have no activity in the report period (90 days).

# $Report holds the combined usage and account records created by the script;
# the LastActivityDate property name is illustrative (blank = no activity)
$GroupedReport = $Report | Where-Object {$null -eq $_.LastActivityDate} |
    Group-Object -Property Department | ForEach-Object {
        [PSCustomObject]@{
            Department = $_.Name
            UserCount  = $_.Group.Count
        }
    }

$GroupedReport | Sort-Object -Property Department | Format-Table -AutoSize

Department               UserCount
----------               ---------
Analysis and Strategy            3
Business Development             1
Core Operations                 57
Editorial                        1
Group HQ                         1
Information Technology           3
Marketing                       22
Planning & Action                1
Project Management               1
Research and Development         1

With this kind of output, the team driving Copilot adoption and use for the organization would be wise to spend some time with the Core Operations and Marketing departments to ask why so many of their users don’t appear to be using Copilot.

As noted above, understanding how to use PowerShell to mix and match data sources to answer questions is a valuable skill. There’s lots of data available in a Microsoft 365 tenant. That data is there to be used!


Need some assistance to write and manage PowerShell scripts for Microsoft 365? Get a copy of the Automating Microsoft 365 with PowerShell eBook, available standalone or as part of the Office 365 for IT Pros eBook bundle.

How Microsoft 365 Copilot Tenants Benefit from SharePoint Advanced Management https://office365itpros.com/2025/05/06/sharepoint-advanced-management-2/?utm_source=rss&utm_medium=rss&utm_campaign=sharepoint-advanced-management-2 https://office365itpros.com/2025/05/06/sharepoint-advanced-management-2/#respond Tue, 06 May 2025 07:00:00 +0000 https://office365itpros.com/?p=69011

Ignite Announcement About SAM for Copilot Customers Misinterpreted by Many

At the Ignite 2024 conference, Microsoft announced that “Microsoft 365 Copilot will now include built-in content governance controls and insights provided by SharePoint Advanced Management.” At the time, and still broadly believed, the assumption was that Microsoft would provide customers with Microsoft 365 Copilot licenses with SharePoint Advanced Management (SAM) licenses. Maybe even a single SAM license would be sufficient to license SAM technology alongside Copilot. That’s not the case.

If you’ve been waiting for a SAM license to appear in your tenant, you’ll be disappointed: SAM won’t show up in the set of tenant subscriptions. Don’t be swayed by the banner in the SharePoint Online admin center announcing that your SharePoint Advanced Management subscription is enabled (Figure 1). It’s not. Access to SAM features is granted through a code check for the presence of Copilot. The necessary update is now broadly available to customers.

Figure 1: SharePoint Advanced Management options in the SharePoint admin center

SAM Features for Microsoft 365 Copilot Customers

The facts are laid out in the SAM documentation. Customers with eligible Copilot licenses can use some, but not all, SAM functionality without a SAM license. Here’s the list:

  • Site Lifecycle Policy
    • Inactive SharePoint sites policy
    • Site Ownership Policy
  • Data Access Governance (DAG) Insights
    • “Everyone Except External Users” (EEEU) insights
    • Sharing Links and Sensitivity Labels
    • PowerShell: Permission state report for SharePoint and OneDrive Sites, and Files
    • Sharing links report
  • Site Access Review
  • Restricted Content Discovery (RCD – enabled via PowerShell)
  • Restricted Access Control (RAC) for SharePoint and OneDrive for Business.
  • Recent Admin Actions and Change History
  • Block Download Policy
    • SharePoint and OneDrive sites
    • Teams recordings

There’s some good stuff here, particularly Restricted Content Discovery (RCD), the Site Lifecycle Policy to manage inactive sites, and the Block download policy. Every tenant with Microsoft 365 Copilot should consider enabling RCD to block Copilot access to sites containing sensitive Office and PDF files and sites containing old and obsolete material (the digital rot or debris that clutters up so many tenants).

The problem with Copilot reusing sensitive material in its responses is obvious. The issue with Copilot reusing old, obsolete, and potentially misleading content in its responses is equally problematic, especially if human checks don’t catch errors in responses. Copilot doesn’t know when a Word document written ten years ago is outdated and inaccurate. All Copilot sees is words that can be processed and reused.

When SAM is Needed

All of which brings me to a point where a SAM license is required. In my case, I wanted to test the “extend SharePoint protections with a default sensitivity label” feature. The idea is to make sure that unlabeled files receive protection when downloaded by applying a sensitivity label with rights equivalent to those enjoyed by site users. Defining a default sensitivity label for a document library already requires an Office 365 E5 license or equivalent. Why this slight extension wanders into needing SAM is another example of bizarre Microsoft licensing.

The documentation notes that Copilot can’t currently open files with sensitivity labels applied in this manner. This means that Copilot cannot extract the protected content to use in its responses because it doesn’t have the right to do so. However, Copilot can search the metadata of labeled files and show that metadata to those who perform searches. Restricted Content Discovery is the right way to block Copilot access to files.

Anyway, without a SAM license, I can’t test. Do I want to pay Microsoft for a license for the privilege of testing their software? I don’t think so.

Copilot in Word for iOS

In closing, I attempted to use a new feature in Word for iOS (and Android) to dictate some notes for this article for Copilot to reason over and produce a draft. The feature is covered in MC1060866 (23 April 2025) and deployment has begun, which is why I guess I could use it. The dictation part worked, even if some of my words were misunderstood (Figure 2). But any attempt to have Copilot do some magic failed utterly. I guess that AI can’t help me…

Figure 2: Dictating text in Word for iOS for Copilot to process


Support the work of the Office 365 for IT Pros team by subscribing to the Office 365 for IT Pros eBook. Your support pays for the time we need to track, analyze, and document the changing world of Microsoft 365 and Office 365.

Microsoft Extends DLP Policy for Copilot to Office Apps
https://office365itpros.com/2025/05/05/dlp-policy-for-copilot2/ (Mon, 05 May 2025)

Same DLP Policy for Copilot Used to Block BizChat

On May 1, Microsoft announced that the public preview of the DLP policy for Microsoft 365 Copilot is effective for the Office apps (MC1059677, 21 April 2025, Microsoft 365 roadmap item 423483). The new functionality is an extension of the DLP policy introduced in March 2025. At that time, the policy only covered Microsoft 365 Copilot Chat (BizChat). Its extension to cover the Office apps (desktop and web) is logical, even if the implementation is different. We’ll get to what those differences are shortly.

How the DLP Policy for Copilot Works

As a quick refresher, the DLP policy for Copilot works by checking if a file is assigned a specific sensitivity label. If true, the Copilot functionality built into the app is limited and the content of the file cannot be used in Copilot responses, such as creating a document summary.

Apps are responsible for checking if a DLP policy is active within the tenant and what sensitivity labels are associated with the policy, so the announcement marks the inclusion of the necessary code in the Office apps to check for the DLP policy. I tested with Microsoft 365 Enterprise Apps version 2504 (build 18730.20122).

Like any other DLP policy, the policy can have multiple rules. In this case, rules for the DLP policy for Copilot block access for a sensitivity label, so if you want to block access for multiple sensitivity labels, the DLP policy has a separate rule for each label. If you created the DLP policy for Copilot to use with BizChat, you don’t need to do anything to extend the policy to cover the Office apps.

Using the DLP Policy for Copilot in Word

As an example, I created a Word document and tested that all the Copilot functionality worked as expected. I saved the document and reopened it to force Copilot to generate the automatic summary.

I then applied one of the sensitivity labels covered by a rule in the DLP policy for Copilot and tried out some of the Copilot features. As you can see from Figure 1, the automatic summary was not removed (but the summary cannot be updated), and asking Copilot to explicitly summarize the document fails because “your organization’s policy doesn’t allow it.” However, it looks like Copilot can query the content of the document to answer questions in chat.

Figure 1: Copilot in Word with DLP block

In their announcement, Microsoft says that “Copilot actions like summarizing or auto-generating content directly in the canvas are blocked.” They also say that chatting with Copilot is also blocked, but as you can see in Figure 1, Copilot answered a predefined question (“What is the purpose of DLP for M365 Copilot”) quite happily. On the other hand, if you go to the Message Copilot section and input the same question, Copilot refuses to answer. The block on chat worked in the web app but not always in the desktop version of Word (but this is preview software, so some bugs are expected).

Finally, Copilot cannot reference a file protected by one of the sensitivity labels covered by the DLP policy (an action that forces Copilot to extract the content of the referenced document).

Maybe Just Turn Copilot Off

I’ve used Copilot for nearly two years, and I was initially confused by the effect the DLP policy for Copilot has on the Office apps. To me, it would be simpler and more understandable to disable Copilot completely for documents within the scope of the DLP policy. I would remove the Copilot button from the menu bar and make sure that no UI elements that expose a Copilot feature, like the automatic summary, appear. Right now, the UI is a confusing mishmash of things that work and things that don’t, and it needs to be cleaned up.


Learn how to exploit the data available to Microsoft 365 tenant administrators through the Office 365 for IT Pros eBook. We love figuring out how things work.

Copilot’s Solution to Fix Grammar and Spellings
https://office365itpros.com/2025/04/25/fix-grammar-and-spelling-errors/ (Fri, 25 Apr 2025)

Fix Grammar and Spelling Problems with a Single Click

I am indebted to Paul Robichaux for pointing out a notable example of where Copilot for Word could help authors rewrite text to be clearer and more concise. The offending article is in the Microsoft Graph documentation for a topic called “enhanced personalization control overview.” It’s part of a set of controls for Microsoft 365 features such as name pronunciation and user pronouns in an area that’s been under development for several years. The text is turgid, unclear, and difficult to understand (Figure 1).

Figure 1: This text is perfectly clear, concise, and highly readable

The Copilot Rewrite

I asked Copilot for Word to make the original text more concise and understandable. Copilot produced this text:

Microsoft 365 Copilot customizes itself for each user, assisting with daily tasks based on their work knowledge. Both tenant administrators and users can manage this personalization using advanced control features.

Discover how Microsoft 365 Copilot enhances personalization, ensures privacy, and provides control through improved settings.

The corrected text is still not good. Human intervention is necessary to explain how Copilot “can become personalized to each user” (original text) or how Copilot “customizes itself for each user” (revision), and so on. It is a stark warning of the danger of accepting AI-generated text without question. Not only can customer communications end up with absolute rubbish, but the danger also exists that AI-generated incorrect and misleading text ends up being stored in a file and reused ad nauseam by Copilot when it generates responses to future user prompts.

You might wonder why the Microsoft writers did not ask Copilot to refine their text. Well, I do not know if they did or not, but it could be that because the text is about a new feature that does not exist, Copilot could not find anything better to say in the Graph or in its LLMs. Remember, generative text depends on what has gone before. Copilot can rehash material it knows about, but it cannot write material about a new topic from scratch.

The Copilot Promise to Fix Grammar and Spelling Errors

Which brings me neatly to message center notification MC1060868 (23 April 2025, Microsoft 365 roadmap item 483954), which promises a new Copilot “fix grammar and spellings” feature that will address all grammar and spelling problems found in text with a single click. General availability of the feature is due in late April 2025 with deployment scheduled to complete worldwide by mid-June 2025.

Microsoft doesn’t say what languages are supported, but I assume that the feature will appear in all the languages supported by Copilot. MC1060868 contains no detail about which Copilot apps will benefit. Copilot for Word is an obvious target, and I assume that Copilot for Outlook will also receive help to tidy up email communications. As to the other apps, I guess we will see after the feature arrives.

It is a logical progression to have a single-pass process to find and remedy common errors in documents. Word has options to check for spelling and grammar errors as users type text into documents. The difference here is that Word suggests and nudges people when it detects potential errors, whereas Copilot will go ahead and rewrite text to remove errors. It is then up to the user to decide whether to keep or discard the Copilot rewrite. Overall, Copilot’s one-click solution is a more proactive approach to helping people generate better text.

But is it Possible to Fix Grammar and Spelling with One Click?

That is, if everything works. The history of software designed to help people write better text is littered with dead ends. Does anyone pay much attention to the recommendations of Microsoft Editor? Why do people continue to subscribe to services like Grammarly when Microsoft offers spelling and grammar checking in its products? Perhaps we are heading to a new golden age of beautiful text created by humans and enhanced by AI. Maybe, and I am sure the prospect will be welcomed by those who write the Graph documentation. But I am not holding my breath.


Make sure that you’re not surprised about changes that appear inside Microsoft 365 applications by subscribing to the Office 365 for IT Pros eBook. Our monthly updates make sure that our subscribers stay informed.

How SharePoint Online Restricted Content Discovery Works
https://office365itpros.com/2025/04/02/restricted-content-discovery-works/ (Wed, 02 Apr 2025)

Restricted Content Discovery Hides SharePoint Content from Copilot and Agents

The problem of poor permission management has surfaced from time to time in the history of SharePoint. The Office Delve app caused the last big upheaval within Microsoft 365 when it demonstrated an uncanny ability to surface sensitive documents to user view. Of course, Delve was never the problem. The issue is due to careless permission assignment, usually at site level.

When Microsoft launched Copilot in March 2023, it soon became apparent that Copilot is even better than Delve at finding and reusing documents, including files that an organization would prefer to remain restricted. Microsoft’s short-term answer was Restricted SharePoint Search, a horrible but expedient solution that works on the basis of an allow list for enterprise search which restricts users to only being able to search approved sites. Copilot always works as the signed in user, so the limits applied to users apply to Copilot to stop the AI using material stored in unapproved sites in its responses.

Restricted Content Discovery (RCD) is the latest solution to control unfettered access to confidential information stored in SharePoint Online sites. RCD is part of the SharePoint Advanced Management (SAM) suite. Microsoft is making SAM available to tenants with Microsoft 365 Copilot licenses via a code update that’s slowly deploying.

How Restricted Content Discovery Works

Restricted Content Discovery works by adding a flag to files stored in designated SharePoint Online sites. An administrator marks a site for RCD through the SharePoint admin center or PowerShell; Figure 1 shows the “restrict content from Microsoft 365 Copilot” option in the admin center. When a site is selected for RCD, SharePoint sets a site-level property that causes index updates for every file in the site. Although RCD is applied on a per-site basis, SharePoint indexing happens at the file level, so a fan-out process must find and reindex every file in a site before RCD becomes effective for that site.

The time required to update the index for a site is highly dependent on the number of items in the site. Microsoft says that “for sites with more than 500,000 items, the Restricted Content Discovery update could take more than a week to fully process and reflect in search and Copilot.”

Figure 1: Setting the Restricted Content Discovery flag for a SharePoint Online site

The indexing update does not remove items from the tenant index. If it did, items would be unavailable for eDiscovery searches, auto-label policies for retention and sensitivity labels, and other solutions. Instead, the flag set on files instructs Copilot to ignore those files when it consults the Graph to find matching content to help ground user prompts. The same approach is used by the Data Loss Prevention (DLP) policy to block Copilot access to files assigned specific sensitivity labels.

The block applies anywhere Copilot for Microsoft 365 can use SharePoint Online files, including Copilot agents. It doesn’t affect how site-level search works, nor does it interfere with other Purview solutions like eDiscovery, content searches, or DLP. However, content from sites enabled for RCD doesn’t appear in enterprise-level searches.

RCD Management with PowerShell

PowerShell can be used to manage RCD for sites. Make sure that you use a recent version of the SharePoint Online management module (I used Microsoft.Online.SharePoint.PowerShell version 16.0.25715.12000). For example, to enable RCD for a site, run the Set-SPOSite cmdlet to set the RestrictContentOrgWideSearch property to $true.

Set-SPOSite -Identity https://office365itpros.sharepoint.com/sites/rabilling -RestrictContentOrgWideSearch $true

To remove RCD from a site, set the value for RestrictContentOrgWideSearch to $false:

Set-SPOSite -Identity https://office365itpros.sharepoint.com/sites/rabilling -RestrictContentOrgWideSearch $false

Much the same reindexing process must occur when RCD is disabled for a site after being enabled: every file must be reindexed before the site’s content becomes available to Copilot again.

To generate a list of sites with RCD enabled, run the Start-SPORestrictedContentDiscoverabilityReport command to create a job on a queue for processing. The Get-SPORestrictedContentDiscoverabilityReport cmdlet reports the status for the job, which eventually reports “Completed.”

Start-SPORestrictedContentDiscoverabilityReport

Generating the report will take some time. Are you sure you want to proceed?
Continue with this operation?
[Y] Yes  [N] No  [?] Help (default is "Y"): y

RunspaceId           : 1d839c7e-c0bf-4c11-be94-20179f2335e2
Id                   : 02aa91ea-5e12-43de-91a1-a58275d3b201
CreatedDateTimeInUtc : 03/31/2025 16:09:52
Status               : NotStarted

Get-SPORestrictedContentDiscoverabilityReport

RunspaceId           : 1d839c7e-c0bf-4c11-be94-20179f2335e2
Id                   : 02aa91ea-5e12-43de-91a1-a58275d3b201
CreatedDateTimeInUtc : 03/31/2025 17:03:52
Status               : Completed

To download the RCD insights report, run the Get-SPORestrictedContentDiscoverabilityReport cmdlet and pass the GUID (id) for the report. This value is shown in the Get-SPORestrictedContentDiscoverabilityReport output:

Get-SPORestrictedContentDiscoverabilityReport -Action Download -ReportId 02aa91ea-5e12-43de-91a1-a58275d3b201
Report RestrictedContentDiscoverabilityReport_1743437651407.csv downloaded successfully

Microsoft documentation says that “the downloaded report is located on the path where the command was run.” This is incorrect. The file ends up in whatever folder the PowerShell session starts up in. In my case, I ran the job when positioned in c:\temp and the file ended up in c:\windows\system32. The easy fix here is to use a PowerShell profile to define the folder where PowerShell starts up.
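A minimal sketch of that profile fix, assuming you want every new session to start in C:\temp (the path is an example, not a requirement):

```powershell
# Create the profile script if it doesn't already exist
if (-not (Test-Path -Path $PROFILE)) {
    New-Item -ItemType File -Path $PROFILE -Force | Out-Null
}
# Start every new PowerShell session in a predictable folder so that
# downloaded reports land somewhere you can find them
Add-Content -Path $PROFILE -Value "Set-Location -Path 'C:\temp'"
```

After editing the profile, any new session (and therefore any report download) uses the chosen folder as its starting location.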

The contents of the “insights” report aren’t too exciting (Figure 2) and could be easily generated by looping through sites with PowerShell to find those with the flag set.

Figure 2: Restricted Content Discovery is enabled for these sites
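Such a loop might look like this sketch. It assumes a session already connected with Connect-SPOService; because the bulk output of Get-SPOSite may not populate every site property, each site is re-read individually, which is slow in large tenants.

```powershell
# Report sites where Restricted Content Discovery is enabled by checking
# the RestrictContentOrgWideSearch property on each site
Get-SPOSite -Limit All | ForEach-Object {
    $site = Get-SPOSite -Identity $_.Url   # fetch the full property set
    if ($site.RestrictContentOrgWideSearch) {
        [pscustomobject]@{ Url = $site.Url; RCDEnabled = $true }
    }
}
```

The output is equivalent to the downloaded insights report: one row per RCD-enabled site.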

Restricted Content Discovery for All

It’s a reasonable guess that any Microsoft 365 tenant that’s interested in Copilot has some sensitive information stored in SharePoint Online sites. If you’re in this situation, you should consider RCD as the front-line method to prevent that information leaking out through Copilot. I’d also deploy the DLP policy to restrict Copilot access as a backup. Between the two lines of defence, it’s unlikely that inadvertent disclosure of confidential data will happen, and that’s a good thing.


Insight like this doesn’t come easily. You’ve got to know the technology and understand how to look behind the scenes. Benefit from the knowledge and experience of the Office 365 for IT Pros team by subscribing to the best eBook covering Office 365 and the wider Microsoft 365 ecosystem.

Copilot in Outlook Gets a Revamp
https://office365itpros.com/2025/03/21/copilot-for-outlook-ui/ (Fri, 21 Mar 2025)

Tweaks to Copilot for Outlook Make the Functionality More Accessible

On Tuesday, I reported that I thought the new Facilitator agent in Teams chat is a good example of AI performing a task well. It’s evidence of how the initial rush of deploying AI everywhere to anything that could have a Copilot label applied is moderating into better implementations.

Message center notification MC892651 (last updated 18 March 2025, Microsoft 365 roadmap item 397092) could be regarded as being in the same category. In this case, the UI for Copilot interactions in Outlook has received what Microsoft terms “major design improvements” for the new Outlook on Windows and Mac desktops, OWA, and Outlook mobile clients. Outlook classic remains unaltered.

Perhaps because it involves major improvements or a wide range of clients, the deployment of the update has been delayed. Microsoft originally intended to have full deployment done by late February 2025. That date is now late April 2025. When this happens, it normally means that Microsoft had to halt the deployment to fix some problems.

No New Functionality in Revamped UI

According to Microsoft, the revamped UI doesn’t include any new functionality. I never saw the ‘rewrite like a poem’ option before (which might have improved some of my email enormously), so the fact that the new layout and navigation makes this option accessible (Figure 1) is proof that the overhaul works.

Figure 1: The revamped Copilot for Outlook UI in the new Outlook for Windows

Of course, things work differently on mobile devices, but the changes seem to make things better there too (Figure 2).

Figure 2: Copilot for Outlook mobile

By comparison, the Copilot options in Outlook classic are a tad austere (Figure 3), just like the options in the other clients before the change. The changes made in the other clients prove once again that good design is important when it comes to making technology accessible to users.

Figure 3: Copilot options in Outlook classic

UI Great, Text Awful

I like the UI changes and think they improve how Copilot for Outlook works. However, the changes do nothing to improve the quality of the written text generated by Copilot, which remains bland and overly effusive to my taste. I guess that’s my personal approach to email shining through because I favor brief to-the-point messages over lengthy missives.

The late Mark Hurd (CEO of HP at the time) once advised me to always put the most important information in a message into the first paragraph so that recipients could quickly review items in their inbox without needing to read long messages on mobile devices (Blackberries and iPAQs then). Technology has moved on, but the advice is still true, especially as so many different forms of mobile devices are now in use. Maybe Copilot for Outlook needs a rewrite in one brief paragraph option.

More Change to Come

Although it sometimes seems much longer, we’re still only two years into the Copilot era. We’ll see more changes like this as Microsoft refines and enhances how Copilot is integrated into apps. Now that they’ve given Outlook a nice new UI, perhaps they’ll do the same for Excel and PowerPoint to make it easier to use Copilot in those apps. Or maybe that’s just me moaning because I’m not as proficient as I should be with those apps.


So much change, all the time. It’s a challenge to stay abreast of all the updates Microsoft makes across the Microsoft 365 ecosystem. Subscribe to the Office 365 for IT Pros eBook to receive monthly insights into what happens, why it happens, and what new features and capabilities mean for your tenant.

Use Data Loss Prevention to Stop Microsoft 365 Copilot Chat from Processing Documents in Its Responses
https://office365itpros.com/2025/03/20/dlp-policy-for-microsoft-365-copilot/ (Thu, 20 Mar 2025)

DLP Policy for Microsoft 365 Copilot to Restrict Access to Sensitive Documents

Ever since the introduction of Microsoft 365 Copilot in March 2023, organizations have struggled to stop the AI consuming confidential or sensitive documents in its responses. Some of the early tools, like Restricted SharePoint Search, were blunt instruments hurried out as responses to customer requests. Microsoft’s current best answer is SharePoint Restricted Content Discovery (RCD), a feature normally licensed through SharePoint Advanced Management (SAM). All tenants with Microsoft 365 Copilot licenses are due to receive access to RCD and the deployment process is ongoing.

Microsoft says that the key use case for RCD is to “prevent accidental discovery of [files stored in] high-risk sites.” RCD works by limiting the ability of end users to search selected sites. By excluding sites from search, RCD prevents Copilot Chat (and agents based on Copilot Chat) from using the files stored in those sites in its responses. It’s still possible for Copilot to use information from a sensitive document if the user has the file opened in an app like Word. At this point, the sensitive content is open in memory and available for Copilot to process.

Blocking files from user access doesn’t stop system functions like eDiscovery working.

Update April 21: MC1059677 announces the extension of DLP protection to Copilot in Office apps (Word, PowerPoint, Outlook, and Excel).

Update June 30: The DLP policy for Microsoft 365 Copilot is now generally available.

Blocking Access to Individual Files

RCD is a good way to cast a protective net across multiple sites. But what about protecting individual files that might be in sites that aren’t covered by RCD? Until now, the answer has been to use sensitivity labels to stop Copilot Chat using sensitive files to generate its responses. Although sensitivity labels can stop Copilot using the content of protected files, they cannot prevent Copilot from finding references to protected files through a metadata search.

Creating a DLP Policy for Microsoft 365 Copilot

A solution to that problem might be coming in the form of a new type of Data Loss Prevention (DLP) policy. The feature is described in message center notification MC937930 (last updated 6 February 2025, Microsoft 365 Roadmap ID 423483). DLP policies are usually used to block external sharing of confidential information, like Teams meeting recordings. Blocking files for internal consumption is a new step.

Essentially, tenants can create a DLP policy to check for specific sensitivity labels and block Copilot Chat (and agent) access to files with those labels. The functionality is now in preview and is scheduled for general availability in June 2025 (complete worldwide by the end of July 2025). Some gaps are always expected in preview code, and the gaps right now include alerts, incident reports, policy simulation, and audit records. In other words, it’s very hard to know when a DLP policy match happens to block access. But testing indicates that the DLP policy works.

The DLP policy for Microsoft 365 Copilot is a special form of policy in that the policy only covers Copilot and no other type of data (Figure 1).

Figure 1: Creating a DLP policy for Microsoft 365 Copilot

The rules used in a DLP policy for Microsoft 365 Copilot are simple. The policy checks if a file has a specific sensitivity label, and if the sensitivity label is found, DLP executes the action to “prevent Copilot from processing content” (Figure 2). A rule can check for the presence of one or more sensitivity labels. In some respects, it might be easier to create a separate rule for each label.

Figure 2: Creating a DLP rule for Microsoft 365 Copilot

Testing the DLP Policy for Microsoft 365 Copilot

To test the new DLP policy, I created several documents referring to regulations governing cryptocurrency in Iceland (a topic selected at random because I knew that my tenant was unlikely to store any files relating to the topic). I used Copilot for Word to generate the text for each file and added a reference to a mythical regulation to the text of each document to give Copilot an easy target to find. The first check asked Copilot Chat to find documents relating to cryptocurrency in Iceland with special relevance to the regulation. The sensitivity labels assigned to the documents were not covered by a DLP policy for Microsoft 365 Copilot, and Copilot found all the documents (Figure 3).

Figure 3: Copilot finds confidential documents without sensitivity labels monitored by a DLP policy

After applying sensitivity labels covered by the DLP policy for Microsoft 365 Copilot to two of the three documents, the search was rerun and Copilot found only one document (Figure 4).

Figure 4: The DLP policy for Microsoft 365 Copilot blocks files protected by specific sensitivity labels

I don’t claim this is a full test. However, it’s the only way to check preview software that doesn’t generate audit records or other traces to show when DLP policy matches occur to force DLP to execute the defined actions.

New DLP Policy Shows Promise

I’ll look forward to retesting the DLP Policy for Microsoft 365 Copilot after the software reaches GA and the full array of auditing and reporting options are available. Auto-label policies can only apply sensitivity labels to Office files and PDFs, and I suspect that this limitation won’t be lifted. That’s a pity because it stops the DLP policy being able to control access to items like the .MP4 files used for Teams Meeting Recordings (transcripts).

The nice thing is that users see no trace of a sensitive document show up in Microsoft 365 Copilot Chat. Unlike basic sensitivity label protection, which allows Copilot Chat to show metadata found in its searches, the DLP policy is silent. And that’s just the way you’d want it to be when dealing with sensitive data.


So much change, all the time. It’s a challenge to stay abreast of all the updates Microsoft makes across the Microsoft 365 ecosystem. Subscribe to the Office 365 for IT Pros eBook to receive monthly insights into what happens, why it happens, and what new features and capabilities mean for your tenant.

Why Microsoft 365 Copilot Works for Some and Not for Others
https://office365itpros.com/2025/02/20/make-copilot-useful/
Thu, 20 Feb 2025 07:00:00 +0000

I Can’t Wait for Agentic Experiences to Make Copilot Useful

We’re all on a journey to understand how to use artificial intelligence effectively to improve systems, lives, and human existence. If you pay for the necessary licenses, Copilot is everywhere within the Microsoft 365 ecosystem, both as helpers deployed in desktop apps like Word, Teams, and PowerPoint, and as custom agents for tenants to develop and deploy, albeit without the necessary tools to manage the potentially thousands of agents created by citizen developers.

According to Microsoft CEO Satya Nadella, Microsoft wants to make it as simple for people to create agents as it is to create an Excel worksheet, which might mean the creation of the “highly customized agentic experiences” referred to in message center notification MC985480 (January 22). I don’t quite know what that phrase means, and the clarifying text that said it “means you can design unique prompts, connect to any LLM, and integrate these custom agents with Microsoft 365 Copilot” wasn’t much help either. When I asked Copilot, it struggled with the concept too (Figure 1). In any case, I’m sure that we’ll all be happy in our highly customized agentic world when it arrives.

Figure 1: Copilot attempts to define highly customized agentic experiences

Why Today’s AI Falls Short of its Hype

All of which brings me to a thoughtful article in the Tomorrow’s Blueprint blog entitled “Why Others Think AI Is a Miracle But You Think It’s Useless.” The author is Microsoft product manager Abram Jackson, now deeply involved in the development of Microsoft 365 Copilot. The core of the article is an assertion that:

“Today’s AI falls short of its hype for many due to three big reasons:

  • It often doesn’t have the data it needs to work with
  • Defining tasks precisely is very difficult
  • There’s little AI can do other than give you text or images.”

Abram knows much more about AI than I do. I reckon that he has captured the problems faced by many organizations as they consider how to extract value from a potentially massive investment in Copilot licenses.

Without access to data, Copilot can do nothing. The magic of Microsoft 365 Copilot, if some exists, is the Microsoft Graph, or access to the documents, emails, and Teams messages stored within Microsoft 365. Yet the legacy of some older Microsoft decisions around collaboration strategy forced organizations to restrict SharePoint Search to stop Copilot revealing information to anyone who asked. As it turns out, it is hard to stop Copilot using data because even document metadata can reveal secrets.

I like the way Abram discusses the issue of defining tasks. Math works because the answer is either right or wrong. Copilot works very well when given well-defined tasks to do, like summarizing a meeting transcript or extracting tasks for people to consider. The same goes for scanning an email thread or summarizing a Word document. Generating text is less satisfactory unless the user is very precise in their prompt and grounds Copilot with some suitable input, like documents to work from. The promise of early demos where Copilot generated project reports and other material in the blink of an eye is never attained where loose prompting gives the AI free rein to indulge itself.

How People Need to Use AI

The summary is that to extract value from AI (and Microsoft 365 Copilot in particular), users must:

Understand if a task is valuable and not prone to hallucinations. Asking Copilot for Word to scan a document and decide if it is well-structured and how to make improvements is valuable for many people who aren’t natural writers. Asking Copilot for Word to generate the initial document introduces the possibility of hallucinations.

Work to define the task precisely: Asking Copilot to do something very precisely with clear boundaries and guidelines will generate much better results than dashing off a quick prompt. Grounding a prompt with some relevant information, like several pertinent documents, will always help Copilot to generate better information.

Translate the result generated by the AI into the form you need it to be. For chat, the introduction of Copilot pages has proven useful because it allows users to easily capture the output generated by Copilot for reuse. But will the slides generated by Copilot for PowerPoint be the type you need? Or can Copilot for Excel really perform the computations you want? Of course, they can, but only with practice and perseverance on the part of the human.

As Abram says, this approach “isn’t natural and it is time-consuming.” It comes about because Copilot is essentially an eager assistant that wants to work but will do stupid things unless you tell it precisely what to do and how to do it. Expanding on the example shown in Figure 1, adding context and direction to the prompt gives Copilot the chance to deliver a much better answer. Prompts can now be up to 128,000 characters, so there’s lots of room for comprehensive instructions.

Figure 2: Make Copilot useful by giving the AI better and more detailed instructions
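That 128,000-character ceiling is worth checking programmatically when a prompt is assembled from instructions plus grounding documents. A minimal sketch in Python, assuming the limit quoted above; the function name is illustrative and not part of any Microsoft API:

```python
# Guard for the 128,000-character Copilot prompt limit mentioned above.
# The limit value comes from the article; the helper is hypothetical.
MAX_PROMPT_CHARS = 128_000

def fits_copilot_prompt(prompt: str, limit: int = MAX_PROMPT_CHARS) -> bool:
    """Return True if the assembled prompt (instructions plus grounding) fits the limit."""
    return len(prompt) <= limit

# Combine detailed instructions with grounding text and check the budget first.
instructions = "Summarize the attached contract and list open questions."
grounding = "\n".join(["...document text..."] * 10)
print(fits_copilot_prompt(instructions + "\n" + grounding))
```

Truncating or splitting the grounding material when the check fails is better than letting a submission fail silently.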

The Bing Conundrum

One last point about data being available for Copilot to work with. I’m not sure about Abram’s statement that “hallucination is largely a solved problem for Microsoft Copilot.” I see odd stuff generated all the time. Abram justifies his claim by saying that “Copilot is trained to only respond with information it has been able to find through search.”

Copilot depends on Bing and Bing isn’t very good at searching. Take this website. Despite the ease with which Google has indexed and searched all my articles for years, Bing stubbornly refused to touch the site. I only discovered this fact when creating some declarative agents that used office365itpros.com as a source. Since then, the best efforts of WordPress support and my own attempts to navigate the online Bing webmaster advice have only just persuaded Bing to start indexing some pages. Some of the blocks are quite silly. One problem that caused Bing to refuse to index pages was the lack of an alt tag for a graphic in a sidebar.

If Copilot had better search facilities, it could generate better answers because it has better data to work with.



Microsoft Launches Copilot for All Initiative
https://office365itpros.com/2025/01/16/microsoft-365-copilot-chat-jan25/
Thu, 16 Jan 2025 07:00:00 +0000

New Agent Capabilities for the Free Microsoft 365 Copilot Chat App

Infused with the zealotry of true believers, Microsoft announced Copilot for All on January 15, 2025 to reveal the details of the complicated Copilot renaming they previewed in December. And the new logo, of course.

In a nutshell, Microsoft is creating an “on-ramp” to give Microsoft 365 tenants that haven’t invested in expensive Microsoft 365 Copilot licenses the chance to use agent technology “grounded in Microsoft Graph data.” The idea is to encourage commercial customers to run a mix of Copilot, with some users having the full-blown licensed version while others experiment with the free-to-use version. Figure 1 shows the relative capabilities of the two Copilot options.

Figure 1: Functionality available in the two Microsoft 365 Copilot products (source: Microsoft)

Lots of Functionality in Microsoft 365 Copilot Chat

The free-to-use Microsoft 365 Copilot Chat app includes a lot of functionality in terms of its ability to process user prompts against information available on websites (providing those sites are indexed by Bing). Recently, Microsoft added features like Copilot pages and the image generator (Figure 2). Microsoft says that limitations exist on the number of images that can be generated daily. I guess I don’t create many images as I haven’t experienced any problems.

Figure 2: Generating an image in Microsoft 365 Copilot Chat

The Chat client has enterprise data protection, so data is secure, protected, and actions are audited and captured in compliance records.

Pay-as-you-go Agents

The big news is that customers will be able to create and run custom agents grounded against “work data” on a pay-as-you-go (PAYG) metered basis. PAYG means the tenant must sign up for an Azure subscription with a valid credit card before the agent will run. Agent activity is charged against the subscription using “messages” as the metering unit (an action performed by an agent can consume up to 25 messages). Grounding against work data means that the agents can interrogate information available in the Microsoft Graph. Technically speaking, Graph data includes Exchange, Teams, SharePoint, and OneDrive plus anything imported into the Graph through a third-party connector. However, the capabilities of today’s agents are limited to SharePoint and OneDrive sites plus Graph connectors. In any case, there is some magic here to exploit because if an organization can import its data into the Graph, agents can reason over that data to create responses to user prompts, providing PAYG is set up for the tenant.
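Before enabling PAYG, a back-of-the-envelope model helps gauge exposure. A sketch in Python using the 25-messages-per-action ceiling described above; the one-cent-per-message rate is an assumption for illustration only, so check current Microsoft pricing before relying on it:

```python
# Back-of-the-envelope cost model for pay-as-you-go Copilot agents.
# The article says usage is metered in "messages" and that a single agent
# action can consume up to 25 messages. The rate below is an assumption.
ASSUMED_RATE_CENTS_PER_MESSAGE = 1  # assumed, not from the article

def monthly_agent_cost_usd(actions_per_day: int, messages_per_action: int = 25,
                           days: int = 30,
                           rate_cents: int = ASSUMED_RATE_CENTS_PER_MESSAGE) -> float:
    """Estimate a worst-case monthly charge (USD) for one agent."""
    return actions_per_day * messages_per_action * days * rate_cents / 100

# 100 grounded actions per day, each hitting the 25-message ceiling:
print(monthly_agent_cost_usd(100))  # → 750.0
```

Even a rough model like this makes the point that grounded agents left running unattended can accumulate meaningful charges against the Azure subscription.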

The custom agents are developed with Copilot Studio. I have spent some time working with Copilot Studio to build simple agents over the last few weeks. It’s not a terribly difficult task, but organizations do need to take the time to chart out how they plan to develop, deploy, and manage agents rather than rushing headlong into the brand-new world. Like any software, agents work best when some structure is in place.

The Big Differences between Microsoft 365 Copilot Chat and Microsoft 365 Copilot

Paying for agents to use Graph data does not deliver the full suite of capabilities enjoyed by those who invest in Microsoft 365 Copilot licenses. Figure 1 shows that Microsoft 365 Copilot includes a bunch of personal assistants where Copilot is built into Microsoft 365 apps like Teams, Word, Outlook, PowerPoint, and Excel. Sometimes, as in the case of the automatic document summary generated by Copilot in Word, the help is unwanted, but the personal assistants are very good at helping with other tasks, like summarizing long email threads or recapping Teams meetings.

Microsoft 365 Copilot also includes SharePoint Advanced Management (SAM). However, although Microsoft announced at Ignite 2024 that tenants with Microsoft 365 Copilot licenses would get SAM in early 2025, there’s no trace of these licenses turning up in any tenant that I have access to. License management can be complex and I’m sure that SAM will turn up soon.

Finally, PAYG access to Graph data does not include the semantic index. The index is generated automatically from Graph data in tenants with Microsoft 365 Copilot licenses to create a vector-based index of the relationships of items in the Graph. It’s an untrue urban legend that Microsoft 365 Copilot needs the semantic index to function. The semantic index enhances search results, but it’s not required for the chat app or agents to work.

In Simple Terms, Two Copilot Products

It’s easy to become confused by the naming of different elements within the Microsoft 365 Copilot ecosystem. It boils down to Microsoft offering free (with PAYG capabilities) and expensive Copilot products to Microsoft 365 customers. Microsoft obviously hopes that the free version will act as the on-ramp to full-fledged Copilot. It’s a reasonable tactic. Time will tell if it’s successful.


Insight like this doesn’t come easily. You’ve got to know the technology and understand how to look behind the scenes. Benefit from the knowledge and experience of the Office 365 for IT Pros team by subscribing to the best eBook covering the Microsoft 365 ecosystem.

The Confusing Renaming of Microsoft 365 Copilot
https://office365itpros.com/2024/12/20/microsoft-365-copilot-rename/
Fri, 20 Dec 2024 07:00:00 +0000

Microsoft 365 Copilot Rename Means Exactly What?

By now, I’m sure that people understand that Microsoft has two chat apps available for Microsoft 365 users:

  • Microsoft Copilot, which is limited to making queries against the Microsoft LLMs. The app is available without a license to anyone who signs in with an Entra ID account, which is why it’s sometimes referred to as Microsoft Copilot (for users with Entra accounts). This app started as Bing Chat Enterprise before the Copilot branding team applied their magic. To be fair, the addition of enterprise data protection to Microsoft Copilot in September 2024 improved the app greatly.
  • Microsoft 365 Copilot, which can include Graph content (data stored in SharePoint Online, OneDrive for Business, Teams, and Exchange Online) in its queries against the Microsoft LLMs (the Graph content “grounds” the queries). This app is also called BizChat, and I use that name for the remainder of this article. User accounts must hold a $360/year Microsoft 365 Copilot license before they can use BizChat.

The naming used for these apps and the Microsoft 365 Copilot suite (a Copilot for every app, like Copilot in Word, Copilot in Teams, Copilot in Outlook, etc.) has evolved since the original launch in March 2023. In that time, probably far too many brain cells have been sacrificed to keep up with Microsoft’s marketing machinations as they seek to drive Copilot deep into the consciousness of Microsoft employees and customers alike.

The January 2025 Change

Message center notification MC958903 (16 December 2024) marks yet another turn in the naming game. In mid-January 2025, Microsoft will introduce changes “to simplify the user experience.”

  • Microsoft Copilot becomes Microsoft 365 Copilot Chat. The app will be able to use Copilot agents for the first time. Agents that access web content are free, but using agents that access work data (Graph data) must be paid for on a pay-as-you-go (metered consumption) basis.
  • The current Microsoft 365 app, which includes a Copilot icon to access Copilot in its navigation bar, becomes Microsoft 365 Copilot, complete with a new M365Copilot.com URL to “make it easier to discover.” Depending on their licensing status, the Copilot icon brings people to either Microsoft 365 Copilot Chat or BizChat. The app will receive a UI makeover to “support future AI-first experiences” like exposing Copilot Pages. The changes are detailed in MC958905 and include a new icon that I thoroughly dislike (see Figure 1).
Figure 1: The January 2025 changes for the Microsoft 365 Copilot rename (source: Microsoft)

All of this was discussed at the Ignite 2024 conference in Chicago last month. I paid little attention at the time because I ignored most of the marketing fluff from the conference, preferring to wait to see the details emerge. If you’re interested, the keynote is still online, complete with a very brief mention of a rename (Figure 2).

Figure 2: Microsoft EVP Rajesh Jha describes the wonders of Microsoft 365 Copilot

The Confusion Between Product and App

I dislike renaming Microsoft Copilot to be Microsoft 365 Copilot Chat because it complicates what should be a simple differentiation between users who have Microsoft 365 Copilot licenses and those who do not. Once you apply the Microsoft 365 brand to an app, a certain implication exists that the app has something to do with Microsoft 365 and enjoys some access to Microsoft 365 content (which it doesn’t have).

I guess the chat app that can’t access Microsoft 365 content has some relationship with Microsoft 365 because it’s available through the Microsoft 365 Copilot app, but the connection is tenuous at best and difficult for people who don’t track the minutiae of changes within the service. It took me several readings of MC958903 before the details sunk in. I suspect that I am not alone.

I’m sure that Microsoft will point to its fabled telemetry to justify the decision. They always do. However, I think this is more of the “let’s brand everything with the <insert latest product du jour here> name” tactic seen in the past with Windows, Office, and .Net. The problem is that telemetry seldom highlights the potential for future confusion of the sort that’s likely when this change emerges.

Tiring Pace of Branding Changes

Everyone understands that Microsoft is making a big bet to be the leader in AI. Microsoft is spending a ton of money to build that leadership, including a reported $19 billion spend in their Q4 FY24 results. But the constant mantra of Copilot everywhere is starting to wear. It will be a relief when the tsunami subsides and we can all get back to productive work, with or without Copilot’s assistance.


Make sure that you’re not surprised about changes that appear inside Microsoft 365 applications by subscribing to the Office 365 for IT Pros eBook. Our monthly updates make sure that our subscribers stay informed.

How Microsoft Copilot Generates Compliance Records
https://office365itpros.com/2024/11/07/microsoft-copilot-interactions/
Thu, 07 Nov 2024 07:00:00 +0000

Microsoft 365 Substrate Captures Interaction Details for Microsoft Copilot

After writing about how to use the Microsoft Graph PowerShell SDK to analyze the interactions between users and Microsoft 365 Copilot in various apps, I was asked if the code reports interaction records for Microsoft Copilot. This is the free version of Copilot that appears in the Microsoft 365 app when a signed-in Entra ID user account doesn’t have a Microsoft 365 Copilot license.

The big difference between the free and paid-for versions is that Microsoft 365 Copilot can use Graph queries to find email, Teams messages, and documents to ground its queries, while Microsoft Copilot is limited to Microsoft’s LLMs and Bing web searches. In addition, Microsoft 365 Copilot comes with extra features, such as custom Copilot agents for SharePoint Online.

Both versions support enterprise data protection (EDP). Microsoft added support for EDP to Microsoft Copilot in August 2024 and the announcement specifically says that information about prompts and responses is retained for eDiscovery purposes.

Asking Microsoft Copilot

My first step to gather information was to ask Microsoft Copilot if it generates interaction compliance records. Figure 1 shows the negative response.

Figure 1: Microsoft Copilot responds to a query about interaction compliance records

Looking Behind the Scenes

As Microsoft Copilot couldn’t answer the question, it was time to look behind the scenes. I figured that the Microsoft 365 substrate would store anything it captured for Microsoft Copilot interactions in the same hidden TeamsMessagesData folder in the user’s mailbox.

Some are curious why Microsoft selected TeamsMessagesData as the storage location for these records. It doesn’t really matter what folder is used if it’s hidden and indexed for eDiscovery, but I think Microsoft chose TeamsMessagesData because the Copilot chats are very much like regular Teams one-on-one chats. The substrate captures Teams compliance records for one-on-one chats in the same folder.

MFCMAPI is the best tool to investigate mailbox contents. After using Microsoft Copilot several times, I opened the TeamsMessagesData folder with MFCMAPI and discovered that the substrate had captured compliance records for the Copilot interactions. Figure 2 shows the record captured for the prompt shown in Figure 1.

Figure 2: A prompt captured for a Microsoft Copilot interaction

Once I located the compliance records, it was easy to update the PowerShell script to extract and report the Microsoft Copilot interactions. The updated code is available from GitHub.

No Data Shown for Responses

I noticed that compliance records captured for Microsoft Copilot responses do not include the response in the Body and BodyPreview properties. The same is true for responses generated by Microsoft 365 Chat (BizChat). Looking back through records for Microsoft 365 Chat interactions, it appears that the only output is any documents located by Copilot to form its response. In Figure 3, we see a reference to a document in a Microsoft 365 Chat response followed by some base64-encoded text.

Figure 3: A Microsoft 365 Chat response including some referenced documents

Inputting the encoded text into an online decoder reveals the text (Figure 4). It looks like whatever routine Microsoft uses to generate the compliance record doesn’t decode the text before it’s written into the mail item used to store the record in TeamsMessagesData.

Figure 4: A base64 decoder reveals the full text for a Microsoft 365 Chat response
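The same decoding can be done locally rather than pasting record contents into an online tool. A short Python sketch; the sample string is invented for illustration and isn’t taken from a real compliance record:

```python
import base64

def decode_record_fragment(b64_text: str) -> str:
    """Decode a base64 fragment copied from a compliance record body into readable text."""
    return base64.b64decode(b64_text).decode("utf-8", errors="replace")

# Round-trip an invented sample to show the decode step.
sample = base64.b64encode("Referenced document: Copilot Pages.docx".encode("utf-8")).decode("ascii")
print(decode_record_fragment(sample))  # → Referenced document: Copilot Pages.docx
```

Decoding locally also avoids sending potentially sensitive record contents to a third-party website.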

The encoded state of the information also explains why the Activity Explorer in the AI Hub in the Purview portal can’t display Copilot’s response to a prompt (Figure 5).

Figure 5: The AI Hub’s Activity Explorer fails to display a Microsoft 365 Chat response

Summarizing Microsoft Copilot and Compliance Records

The answer to the question is that compliance records are generated for Microsoft Copilot interactions. However, the information logged in the compliance records isn’t as easy to access as it should be. The flaw shared by Microsoft Copilot and Microsoft 365 Chat suggests that some buggy code is shared by the two apps. It should be easy for Microsoft to decode responses from base64 before including clear text in compliance records.

The issue is reported, but quite when a fix will appear is anyone’s guess. Hopefully, because the problem means that compliance records aren’t as useful as they should be, the fix should appear soon.


Insight like this doesn’t come easily. You’ve got to know the technology and understand how to look behind the scenes. Benefit from the knowledge and experience of the Office 365 for IT Pros team by subscribing to the best eBook covering Office 365 and the wider Microsoft 365 ecosystem.

Create a Custom Copilot Agent for SharePoint Online
https://office365itpros.com/2024/10/31/copilot-agents-sharepoint/
Thu, 31 Oct 2024 01:00:00 +0000

Copilot Agents Rolling Out to Targeted Release Tenants

On October 23, 2024, Microsoft published message center notification MC916296 (Microsoft 365 roadmap item 416297) to announce the rollout of Copilot agents in SharePoint Online to targeted release tenants. Worldwide deployment to targeted release tenants is due to finish in early November 2024 with general availability following to all tenants (with Microsoft 365 Copilot licenses) completing in late December 2024.

Microsoft included Copilot agents in SharePoint Online as part of their Wave 2 announcement on September 16, 2024. At the time, I thought that Copilot agents were the most interesting part of Wave 2. Copilot pages, another major part of the announcement, are a nice way to capture the output from Copilot queries, but having an agent automatically created for SharePoint sites to query just the content from that site seemed like more useful functionality. I was therefore very happy to see Copilot agents show up in my tenant.

Default Copilot Agent for Sites

When users have a Microsoft 365 Copilot license, they see a Copilot option in the navigation bar when a document library is open. Selecting Copilot opens the default agent for the site, which responds to user prompts by reasoning over Office documents and PDF files stored in the site. Limiting Copilot to a predefined set of files from a site stops Copilot using a wider search to find information in any file it can access across the tenant or through a web search if permitted by the tenant. It’s a way of getting a precise response from information held in a site.

Creating a Custom Copilot Agent

Site members with create and edit permissions (for a site owned by a Microsoft 365 group, any group member) can create a Copilot agent to create an even more precise search. For instance, I store the source Word documents for every article that I write (including this one) in a document library in a SharePoint Online site. Using the Create a Copilot agent option, I created a custom Copilot agent to reason over the articles. The entire operation took less than a minute, which is kind of startling.

The Sources tab of the wizard selects the folders or files for Copilot to process (Figure 1). You can select the entire site or any of the folders or individual files from the site, including from any document library if the site includes more than the default document library. The name of the agent can be between 4 and 42 characters.

Figure 1: Defining the source content for a custom Copilot agent

The Behavior tab allows you to tailor the sample prompts shown to users and how Copilot will respond. In Figure 2, I’ve changed the tone for the responses from professional to formal and modified one of the starter prompts.

Figure 2: Modifying the behavior of a custom Copilot agent

After saving the agent, Copilot creates a file in the document library for the agent and adds the agent to the recently used list of agents (Figure 3). If you make a mistake with an agent, simply delete the file. The file is also used to share agents. For instance, you can create a sharing link for the agent and include it in email or a Teams chat. If the people who see the link have access to the documents processed by the agent, they can use the sharing link to access the agent.

Figure 3: Creating a custom Copilot agent creates a file in the document library

The list of recently used agents includes agents from other sites. You don’t need to navigate to a specific site to use its agents because they can be invoked from elsewhere in SharePoint.

Using the agent is like any other Copilot interaction. You compose a prompt (question) and submit it to Copilot for processing. Copilot restricts its search to the set of files defined for the agent. Figure 4 shows a sample interaction where I asked Copilot to search for anything that I have written about Copilot Pages, and it duly found the Word document source for the published article.

Figure 4: Interacting with a custom Copilot agent

Custom agents work very well for sites storing small to medium documents. Copilot doesn’t do so well with large documents. For example, I created a custom agent based on the folder holding the Word source documents for the chapters in the Office 365 for IT Pros eBook. Many of these files are over 50 pages long, and the agent couldn’t use the chapter files in its responses.

Missing Features

Microsoft says that the current release does not include the ability to interact with a Copilot agent in a Teams chat, nor does it include the extension to Copilot Studio to customize agents. Another missing feature is the ability for site owners to approve agents or to define a default agent for a site. Microsoft says that these features will be available later in 2024. However, they haven’t said if administrators will be able to control Copilot agents across the tenant, such as having PowerShell cmdlets to enable or disable the feature for selected sites.

The Advantage of Precise Searches

Since its debut, Microsoft 365 Copilot has been plagued by oversharing issues caused when Copilot responses include unexpected information. The source for the information is available to the signed-in user, which is why Copilot can access the content, and is usually a result of flawed site permissions or overly-generous sharing. Flawed information generated in documents can creep into other documents and end up polluting the Graph.

Copilot agents offer more precise responses. I anticipate these agents being very useful in sites that hold specific information like product documentation when you really don’t want results to be polluted by some random document found in a site that no one remembers.


Learn about using SharePoint Online and the rest of Office 365 by subscribing to the Office 365 for IT Pros eBook. Use our experience to understand what’s important and how best to protect your tenant.

Microsoft Says SMEs Can Benefit from Microsoft 365 Copilot
https://office365itpros.com/2024/10/25/microsoft-365-copilot-sme/
Fri, 25 Oct 2024 07:00:00 +0000

Take the Results Presented with a Pinch of Salt

What are we to make of Microsoft’s release of a new study into the effect of Microsoft 365 Copilot for Small to Medium businesses? The blog post on the topic appeared on October 17 and highlights some results reported by Forrester Consulting, who Microsoft commissioned “to study the potential return on investment (ROI) of Microsoft 365 Copilot for SMBs.”

Microsoft 365 Copilot for SMEs

As the post says, the “results of the study are eye-opening” with big claims of a projected ROI of up to 353% and $1.2 million of projected benefits. Projected is the important word here because it means that the ROI and benefits are potential and not achieved, even if they make good headlines. The reported 6% increase in net revenue and 20% reduction in operating costs seem more attainable.

Doubts About Any Technology Report

Doubts surface every time that I read a report about the gains that companies can make if they would only deploy some new technology. I ask myself if the authors of the report understand the technology they’re writing about as deeply as they should. I ask if the companies covered in the report are hand-picked to make the technology look as good as it can be. I ask what direction Microsoft gave Forrester Consulting when they commissioned the report and how independent Forrester can be in what they write about. And I ask if the results gathered from the over 200 companies surveyed for the report are massaged in any way. All nagging doubts honed from years of experience as a consultant.

I’ve no doubt that Microsoft 365 Copilot can do a good job for some SMEs, especially companies backed by a partner who knows the Copilot technology and understands where the potholes are. For instance, the assertion that legal firms can save 50% of the time spent on contract reviews is believable because many contracts cover the same ground, and a Copilot agent built for the purpose can reason over a corpus of contracts when reviewing text for problems.

It’s also true that Copilot’s ability to summarize text in email, Teams chats, and documents is of great help to people returning to work after a vacation. Catching up by wading through a full inbox or hundreds of Teams chats is never fun, and Copilot absolutely can help by summarizing information and presenting what happened while people were away in a very digestible format.

No Mention of Microsoft 365 Copilot Flaws

But I worry that the report ignores the flaws we know to exist in Microsoft 365 Copilot. Some SMEs are great at organizing their information; others are not, and they succumb to the same kind of group/teams sprawl and accumulation of digital debris that happens in enterprise tenants. SMEs might not have the same training capabilities as exist in larger organizations, which can lead to bad habits like oversharing through sloppy site permissions.

As you might imagine, none of this is covered by the Forrester report. There’s no mention of why an SME might need to deploy Restricted SharePoint Search (or the newer but not yet available Restricted Content Discoverability capability), or apply sensitivity labels to protect their most confidential documents from being reused by Copilot. There’s no comment about the way that errors can creep into user documents from Copilot responses and end up corroding the reliability of stored documents. These are real issues surrounding the introduction of generative AI to Microsoft 365.

Just a Marketing Tool to Sell Microsoft 365 Copilot Licenses to SMEs

Then I remember that the Forrester report is no more than a marketing tool designed to encourage SMEs with Microsoft 365 Business Basic, Microsoft 365 Business Standard, or Microsoft 365 Business Premium subscriptions to pay for $360/user/year Microsoft 365 Copilot licenses. The companies covered in the report had up to 300 employees. At list price, Copilot licenses cost $108,000 annually for 300 employees. That’s a big investment for any SME.

But someone’s got to pay for the billions of dollars Microsoft is currently investing in AI, and a large percentage of the 400-million plus Office 365 installed base comes from the SME sector. If you work for an SME and are interested in Microsoft 365 Copilot, take the time to read the report, but do so with a large pinch of salt close at hand. Investing a large chunk of change in expensive software licenses without knowing exactly how you’ll achieve an ROI has never been a good business tactic.


Support the work of the Office 365 for IT Pros team by subscribing to the Office 365 for IT Pros eBook. Your support pays for the time we need to track, analyze, and document the changing world of Microsoft 365 and Office 365.

]]>
https://office365itpros.com/2024/10/25/microsoft-365-copilot-sme/feed/ 0 66799
Copilot’s Automatic Summary for Word Documents https://office365itpros.com/2024/09/05/automatic-document-summary-word/?utm_source=rss&utm_medium=rss&utm_campaign=automatic-document-summary-word https://office365itpros.com/2024/09/05/automatic-document-summary-word/#comments Thu, 05 Sep 2024 07:00:00 +0000 https://office365itpros.com/?p=66234

Automatic Document Summary in a Bulleted List

Updated 4-Dec-2024

Last week, I referenced the update for Word where Copilot for Microsoft 365 generates an automatic summary for documents. This is covered in message center notification MC871010 (Microsoft 365 roadmap item 399921). Automatic summaries are included in Copilot for Microsoft 365 and Microsoft Copilot Pro (the version that doesn’t ground prompts using Graph data).

As soon as I published the article where I referred to the feature, it turned up in the latest channel update for Word. Figure 1 shows the automatic summary generated for a document (in this case, the source of an article).

 Copilot generates an automatic document summary
Figure 1: Copilot generates an automatic document summary

The summary is the same output as the bulleted list Copilot will generate if you open the Copilot pane and ask Copilot to summarize this doc. Clicking the Ask a question button opens the Copilot pane with the summary prepopulated ready for the user to delve deeper into the summary.

The summary is only available after a document is saved and closed. The next time someone opens the document, the summary pane appears at the top of the document and Copilot generates the summary. The pane remains at the top of the document and doesn’t appear on every page. If Copilot thinks it necessary (for instance, if more text is added to a document), it displays a Check for new summary button to prompt the user to ask Copilot to regenerate the summary.

Apart from removing the Copilot license from an account (in which case the summaries don’t appear), there doesn’t seem to be a way to disable the feature. You can collapse the summary, but it’s still there and can be expanded at any time.

Summarizing Large Word Documents

When Microsoft launched Copilot support for Word, several restrictions existed. For instance, Word couldn’t ground user prompts against internet content. More importantly, summarization could only handle relatively small documents. The guidance was that Word could handle documents with up to 15,000 words but would struggle thereafter.

This sounds like a lot, and it’s probably enough to handle a large percentage of the documents generated within office environments. However, summaries really come into their own when they extract information from large documents, like contracts and plans. The restriction, resulting from the size of the prompt that could be sent to the LLM, proved to be a big issue.

Microsoft responded in August 2024 with an announcement that Word could now summarize documents of up to 80,000 words. In their text, Microsoft says that the new limit is four times greater than the previous limit. The new limit is rolling out for desktop, mobile, and browser versions of Word. For Windows, the increased limit is available in Version 2310 (Build 16919.20000) or later.

Processing Even Larger Word Documents

Eighty thousand words sounds like a lot. At an average of 650 words per page, that’s 123 pages filled with text. I wanted to see how Copilot summaries coped with larger documents.

According to this source, the maximum size of a text-only Word document is 32 MB. With other elements included, the theoretical size extends to 512 MB. I don’t have documents quite that big, but I do have the source document for the Office 365 for IT Pros eBook. At 1,242 pages and 679,800 characters, including many figures, tables, cross-references, and so on, the file size is 29.4 MB.

Copilot attempted to generate a summary for Office 365 for IT Pros but failed. This wasn’t surprising because the file is so much larger than the maximum supported.

The current size of the Automating Microsoft 365 with PowerShell eBook file is 1.72 MB and spans 113,600 words in 255 pages. That’s much closer to the documented limit, and Copilot was able to generate a summary (Figure 2).

Automatic document summary generated for the Automating Microsoft 365 with PowerShell eBook.
Figure 2: Automatic document summary generated for the Automating Microsoft 365 with PowerShell eBook

Although the bulleted list contains information extracted from the file, it doesn’t reflect the true content of the document because Copilot was unable to send the entire file to the LLM for processing. The bulleted list comes from the first two of four chapters and completely ignores the chapters dealing with the Graph API and Microsoft Graph PowerShell SDK.

Summaries For Standard Documents

In early December 2024, Microsoft published documentation for Copilot in Word’s automatic document summary feature. Regrettably, the documentation didn’t include instructions about how to disable the feature on a per-user or tenant-wide basis. It looks like we’ll just have to cope with automatic summaries. At least the summaries work for regular Word documents of less than 80,000 words.


So much change, all the time. It’s a challenge to stay abreast of all the updates Microsoft makes across the Microsoft 365 ecosystem. Subscribe to the Office 365 for IT Pros eBook to receive monthly insights into what happens, why it happens, and what new features and capabilities mean for your tenant.

]]>
https://office365itpros.com/2024/09/05/automatic-document-summary-word/feed/ 7 66234
Using Company-wide Sharing Links with Copilot for Microsoft 365 https://office365itpros.com/2024/07/02/company-wide-link-copilot/?utm_source=rss&utm_medium=rss&utm_campaign=company-wide-link-copilot https://office365itpros.com/2024/07/02/company-wide-link-copilot/#comments Tue, 02 Jul 2024 08:00:00 +0000 https://office365itpros.com/?p=65424

Why Some People Can’t Use Shared Files with Copilot for Microsoft 365

After reading the article about the new sensitivity label advanced setting to block access for Microsoft content services to confidential Office documents, a reader asked why some users can use some documents shared using company-wide links with Copilot for Microsoft 365 while others cannot. The situation seemed a little strange because it happened for documents shared with everyone in the organization. The problem couldn’t be due to a sensitivity label because the capability only just rolled out and is limited to the Office applications.

The answer is in Microsoft’s documentation for secure file sharing, which says: “Creating a People in your organization link will not make the associated file or folder appear in search results, be accessible via Copilot, or grant access to everyone within the organization. Simply creating this link does not provide organizational-wide access to the content. For individuals to access the file or folder, they must possess the link and it needs to be activated through redemption.”

In other words, sharing a file with everyone in your organization is only the first step in the process of making information available to Copilot for Microsoft 365. A company-wide sharing link that arrives in your inbox or is shared through a Teams chat is dormant until you redeem it by using the link. At that time, SharePoint Online checks that your account belongs to the organization to confirm your access to the file. If confirmed, the file joins the set of “shared with you” information, which makes it available to Copilot for Microsoft 365.

Testing Company-wide Sharing Links with Copilot

A simple test proves the point. Create a file that contains some information that’s unlikely to exist elsewhere within the company. In my case, I created a Word document about a fictional digital SLR camera called the Bunsen BX7. Now share the file with a company-wide link (Figure 1).

A company-wide sharing link.
Figure 1: A company-wide sharing link

After signing into another account, open Copilot for Microsoft 365 chat and attempt to find some information about the topic in the file. Copilot should return nothing because a Bing search of the internet and a Microsoft search of company resources available to the account turn up no mention of the topic. But if you now go and use the link to open the file, Copilot can find the information and use it in its responses.

Figure 2 shows a Copilot for Microsoft 365 chat session. The first prompt about the Bunsen BX7 turns up nothing and Copilot responds with some generic text about digital cameras. The second prompt is after redemption of the company-wide sharing link. Copilot is able to find the document and use the information in its response. You can see that the shared document is listed as a source for the response.

Copilot for Microsoft 365 chat uses a company-wide link.
Figure 2: Copilot for Microsoft 365 chat uses a company-wide link

The Desirability of Company-wide Links

The mystery of why some people can use shared documents with Copilot for Microsoft 365 is solved, but thoughts now turn to whether organizations should restrict the use of company-wide links for sensitive documents. The value of these links is that they allow anyone in the organization to access content. The downside is that company-wide links are too easy to create and use, which creates the temptation to share confidential files more widely than the organization intends.

To guide users away from company-wide links and toward sharing links for specific people, you can modify the SharePoint tenant configuration to make direct links the default option. Even better, you can update individual site settings to disable company-wide links (anyone links are also disabled). For example, the first command sets direct links as the tenant default; the second disables company-wide links for a specific site.

Set-SPOTenant -DefaultSharingLinkType Direct

$Site = "https://office365itpros.sharepoint.com/sites/BlogsAndProjects"
Set-SPOSite -Identity $Site -DisableCompanyWideSharingLinks Disabled
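
To verify the outcome, here’s a minimal sketch (assuming the SharePoint Online Management Shell is installed and an admin session is established with Connect-SPOService; the site URL is the example used above):

```powershell
# Check the tenant-wide default sharing link type
Get-SPOTenant | Select-Object DefaultSharingLinkType

# Check the company-wide link setting for the example site
$Site = "https://office365itpros.sharepoint.com/sites/BlogsAndProjects"
Get-SPOSite -Identity $Site | Select-Object Url, DisableCompanyWideSharingLinks
```

If the commands above took effect, the site should report its DisableCompanyWideSharingLinks property as Disabled.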

If your organization uses sensitivity labels, you could also consider applying a label that restricts access to a small group of users. That way, even if someone sends a document outside the organization as an email attachment, external recipients won’t be able to open it.

The Challenge of Managing Information in an AI World

The advent of AI assistants creates new information governance challenges for Microsoft 365 tenants. Slowly but surely mechanisms are being developed to help organizations cope and manage the potential for information leakage and misuse. Some Microsoft solutions are no more than sticking plasters to allow customers to progress their Copilot deployments, but overall, the situation seems to be improving. Let’s hope that the trend continues and the current AI hype lives up to its promise.


]]>
https://office365itpros.com/2024/07/02/company-wide-link-copilot/feed/ 1 65424
Better Copilot Audit Records and Copilot Chat Appears in Classic Outlook https://office365itpros.com/2024/05/31/copilot-audit-records-resources/?utm_source=rss&utm_medium=rss&utm_campaign=copilot-audit-records-resources https://office365itpros.com/2024/05/31/copilot-audit-records-resources/#comments Fri, 31 May 2024 07:00:00 +0000 https://office365itpros.com/?p=64983

Copilot Audit Records Now Include Resources Used in Responses

In April 2024, I wrote about the appearance of audit events to capture details when Microsoft 365 applications call Copilot to process a user request (prompt). These events have an operation type of CopilotInteraction.

Since then, Microsoft announced progress in capturing records when people use Copilot in the Stream player to query video transcripts (MC720180, last updated 22 May 2024). It’s like MC720180 (also updated on 22 May 2024), which describes using Copilot to interact with meetings. In both cases, the important point is that the audit events generated for Copilot interactions capture details of resources accessed by Copilot when responding to user prompts (previously the AccessedResources property in the AuditData payload was empty).

Linked to the Change in Transcript Storage Location

Because Copilot depends on meeting transcripts to answer queries, meeting interactions are only possible when meetings are recorded with a transcript. As discussed last week, Teams is standardizing on OneDrive for Business storage for the MP4 files generated for meeting recordings and transcripts. Like many situations in Microsoft 365, developments reported in one message center notification are linked to what’s described in another, seemingly unconnected, update.

The change should be effective in most places now as Microsoft aims to complete worldwide deployment in early June 2024.

Updated Script to Handle Copilot Audit Records

To test the effectiveness of the change, I updated the script I wrote for the previous article (downloadable from GitHub) to support audit records generated by the Stream player and to pay more attention to the data recorded in the associated resources property. Figure 1 shows the output of the script as viewed through the Out-GridView cmdlet.

Copilot audit records capture the resources Copilot accesses
Figure 1: Copilot audit records capture the resources Copilot accesses

Please check out the updated script and let me know if it’s helpful or could be improved.
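
For anyone who prefers to start from scratch rather than use the script, here’s a hedged sketch of the kind of query involved. It assumes a connected Exchange Online management session (Connect-ExchangeOnline); the CopilotEventData and AccessedResources property names reflect what these audit records contain today and may evolve:

```powershell
# Fetch Copilot interaction audit events from the last seven days
$Records = Search-UnifiedAuditLog -StartDate (Get-Date).AddDays(-7) -EndDate (Get-Date) `
    -Operations CopilotInteraction -ResultSize 500 -Formatted

ForEach ($Record in $Records) {
    # AuditData is a JSON payload; AccessedResources lists the files,
    # messages, and transcripts Copilot used to generate its response
    $AuditData = $Record.AuditData | ConvertFrom-Json
    $AuditData.CopilotEventData.AccessedResources
}
```

The important check is whether AccessedResources is populated, because that property was empty before this change rolled out.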

Copilot in Outlook Classic

Speaking of Copilot, for a long time Microsoft communicated the message that Copilot experiences would only be available in the new Outlook client (aka Monarch). This was no more than a thinly-disguised ploy to drive adoption for Monarch, which still isn’t close to ready for consumption by corporate users.

In any case, message center notification MC794816 (21 May 2024, Microsoft 365 roadmap item 388753) reports the availability of the Copilot for Microsoft 365 chat experience for Outlook classic (Win32). This feature joins “Summarize,” the Copilot option that extracts the major points from an email thread (my second favorite Copilot feature after meeting summarization), and the option to have Copilot draft or revise message drafts. Microsoft will roll out Copilot for Microsoft 365 chat to Outlook classic in the current channel in June 2024.

Before anyone gets too excited, let me say that Copilot for Microsoft 365 chat in Outlook is the same application that’s available on the web and in Teams. The only difference is that Copilot has an icon in the Outlook application bar and runs in the Outlook window (Figure 2). In other words, if you’re used to Copilot chat elsewhere, you’ll have no difficulty using it in Outlook, provided you have the necessary Copilot for Microsoft 365 license.

Outlook classic gets Copilot for Microsoft 365 chat
Figure 2: Outlook classic gets Copilot for Microsoft 365 chat

As you can see from Figure 2, chats generated in other instances of the client are available in Outlook.

Change, Change, and More Change

Change is ongoing within Microsoft 365. Some changes are dependent on other changes, such as Copilot audit records capturing associated resources for the Stream player. Others are the delivery of incremental functionality within an application. The trick is to keep an eye on what’s happening and to recognize what kind of change each message center notification represents. That’s sometimes hard to do based on the way Microsoft describes a change. Oh well, into every life a little rain must fall…


So much change, all the time. It’s a challenge to stay abreast of all the updates Microsoft makes across the Microsoft 365 ecosystem. Subscribe to the Office 365 for IT Pros eBook to receive monthly insights into what happens, why it happens, and what new features and capabilities mean for your tenant.

]]>
https://office365itpros.com/2024/05/31/copilot-audit-records-resources/feed/ 1 64983
Disabling Bits of Copilot for Microsoft 365 https://office365itpros.com/2024/04/30/copilot-for-microsoft-365-service-plans/?utm_source=rss&utm_medium=rss&utm_campaign=copilot-for-microsoft-365-service-plans https://office365itpros.com/2024/04/30/copilot-for-microsoft-365-service-plans/#comments Tue, 30 Apr 2024 07:00:00 +0000 https://office365itpros.com/?p=64575

Exerting Control Over Individual Copilot for Microsoft 365 Components

No doubt inspired by the article explaining how to remove individual features (service plans) from Microsoft 365 licenses, a reader asked if it is possible to control where Copilot for Microsoft 365 functionality surfaces in different apps. There’s no GUI in the Microsoft 365 admin center to disable bits of Copilot for a tenant. You can disable apps belonging to the Copilot license for a user account (Figure 1), but the question is what apps are bundled with Copilot for Microsoft 365 and what happens if administrators disable the apps for users.

Copilot for Microsoft 365 apps for a user account.
Figure 1: Copilot for Microsoft 365 apps for a user account

The Copilot for Microsoft 365 Service Plans

Looking into the details of the Copilot for Microsoft 365 license with the Microsoft Graph PowerShell SDK, we discover that the product (SKU) identifier is 639dec6b-bb19-468b-871c-c5c441c4b0cb and that the license covers eight service plans. As you’ll recall, a service plan governs functionality within a license that can be enabled or disabled. The Microsoft 365 admin center refers to service plans as apps when displaying the license information for a user.

Here’s how to find the license detail with PowerShell:

Connect-MgGraph -Scopes Directory.Read.All -NoWelcome
# Find the Copilot SKU and list the service plans it contains
$CopilotSKU = Get-MgSubscribedSku | Where-Object SkuPartNumber -match "Microsoft_365_Copilot"
$CopilotSKU.ServicePlans | Format-Table ServicePlanName, ServicePlanId

ServicePlanName                    ServicePlanId
---------------                    -------------
COPILOT_STUDIO_IN_COPILOT_FOR_M365 fe6c28b3-d468-44ea-bbd0-a10a5167435c
M365_COPILOT_SHAREPOINT            0aedf20c-091d-420b-aadf-30c042609612
GRAPH_CONNECTORS_COPILOT           82d30987-df9b-4486-b146-198b21d164c7
M365_COPILOT_CONNECTORS            89f1c4c8-0878-40f7-804d-869c9128ab5d
M365_COPILOT_APPS                  a62f8878-de10-42f3-b68f-6149a25ceb97
M365_COPILOT_TEAMS                 b95945de-b3bd-46db-8437-f2beb6ea2347
M365_COPILOT_BUSINESS_CHAT         3f30311c-6b1e-48a4-ab79-725b469da960
M365_COPILOT_INTELLIGENT_SEARCH    931e4a88-a67f-48b5-814f-16a5f1e6028d

Table 1 summarizes the service plans included in the Copilot for Microsoft 365 license.

Service Plan Name                  | User-Friendly Feature Name                          | Service Plan Id
GRAPH_CONNECTORS_COPILOT           | Graph Connectors in Microsoft 365 Copilot           | 82d30987-df9b-4486-b146-198b21d164c7
M365_COPILOT_INTELLIGENT_SEARCH    | Intelligent Search (Semantic Index)                 | 931e4a88-a67f-48b5-814f-16a5f1e6028d
M365_COPILOT_BUSINESS_CHAT         | Microsoft Copilot with Graph-grounded chat          | 3f30311c-6b1e-48a4-ab79-725b469da960
M365_COPILOT_TEAMS                 | Microsoft 365 Copilot in Microsoft Teams            | b95945de-b3bd-46db-8437-f2beb6ea2347
M365_COPILOT_APPS                  | Microsoft 365 Copilot in Productivity Apps (Office) | a62f8878-de10-42f3-b68f-6149a25ceb97
M365_COPILOT_CONNECTORS            | Power Platform Connectors in Microsoft 365 Copilot  | 89f1c4c8-0878-40f7-804d-869c9128ab5d
M365_COPILOT_SHAREPOINT            | Microsoft 365 Copilot in SharePoint                 | 0aedf20c-091d-420b-aadf-30c042609612
COPILOT_STUDIO_IN_COPILOT_FOR_M365 | Copilot Studio                                      | fe6c28b3-d468-44ea-bbd0-a10a5167435c
Table 1: Copilot for Microsoft 365 Service Plans

What the Copilot for Microsoft 365 Service Plans Do

The Copilot service plans split into those governing user-facing features and background or administrative functionality.

User functionality:

  • Microsoft Copilot with Graph-grounded chat
  • Microsoft 365 Copilot in Microsoft Teams (app, summarization of chats and meeting discussions, ability to rewrite/adjust messages before posting to chats or channel conversations)
  • Microsoft 365 Copilot in Productivity Apps (Word, Excel, PowerPoint, Outlook (Win32 and Monarch), Loop, OneNote)

Teams and the productivity apps support Copilot in the desktop, browser, and mobile platforms.

Background and administrative functionality (the remaining service plans from Table 1):

  • Graph Connectors in Microsoft 365 Copilot
  • Power Platform Connectors in Microsoft 365 Copilot
  • Intelligent Search (Semantic Index)
  • Microsoft 365 Copilot in SharePoint
  • Copilot Studio

Copilot Studio.
Figure 2: Copilot Studio

Turning Off Bits of Copilot

Getting back to the original question, control is available over the chat app, Copilot in Teams, and the generalized bucket of productivity apps. However, you cannot turn off Copilot for Word and Excel and keep it available in PowerPoint and Outlook. The productivity apps are either enabled or disabled for Copilot as a group. Granular control isn’t available.

Copilot for Office depends on the Microsoft 365 enterprise apps (the subscription version of Office). Using another version, like Office 2024 (preview available now), isn’t possible because these apps don’t include the necessary UI and code to communicate with Copilot.

The answer to the question is that you can turn bits of Copilot for Microsoft 365 off. For instance, not everyone needs access to Copilot Studio. I’m not sure that I would disable any of the other service plans for background and administrative activity because you don’t know if the action might affect how the user-facing apps work. Disabling a user app certainly works and the license change will be effective within fifteen minutes for browser-based apps (Figure 3) and a few hours for desktop apps, depending on when the app refreshes its license information.
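
As an illustration, here’s a sketch of how a service plan might be disabled with the Microsoft Graph PowerShell SDK, using the SKU and Copilot Studio identifiers from Table 1 (the user principal name is a placeholder):

```powershell
Connect-MgGraph -Scopes User.ReadWrite.All -NoWelcome

# SKU and service plan identifiers from the license details above
$CopilotSkuId = '639dec6b-bb19-468b-871c-c5c441c4b0cb'
$CopilotStudioPlan = 'fe6c28b3-d468-44ea-bbd0-a10a5167435c'  # COPILOT_STUDIO_IN_COPILOT_FOR_M365

# Reassign the Copilot license with the Copilot Studio service plan disabled
Set-MgUserLicense -UserId 'user@office365itpros.com' -AddLicenses @(
    @{ SkuId = $CopilotSkuId; DisabledPlans = @($CopilotStudioPlan) }
) -RemoveLicenses @()
```

Passing the SKU in AddLicenses with a DisabledPlans array updates the existing assignment in place; an empty RemoveLicenses parameter is still required by the cmdlet.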

Microsoft Copilot chat discovers that it doesn't have a license.
Figure 3: Microsoft Copilot chat discovers that it doesn’t have a license

But if an organization is paying $360/year for Copilot for Microsoft 365 licenses, surely the imperative is to extract maximum value from the investment instead of restricting what people can use? Still, if you do decide to disable service plans from the Copilot for Microsoft 365 license, the script will happily do the job for you.


So much change, all the time. It’s a challenge to stay abreast of all the updates Microsoft makes across the Microsoft 365 ecosystem. Subscribe to the Office 365 for IT Pros eBook to receive monthly insights into what happens, why it happens, and what new features and capabilities mean for your tenant.

]]>
https://office365itpros.com/2024/04/30/copilot-for-microsoft-365-service-plans/feed/ 1 64575
Microsoft Grounds Copilot Apps with Graph and Web Content https://office365itpros.com/2024/03/25/copilot-for-microsoft-365-grounding/?utm_source=rss&utm_medium=rss&utm_campaign=copilot-for-microsoft-365-grounding https://office365itpros.com/2024/03/25/copilot-for-microsoft-365-grounding/#comments Mon, 25 Mar 2024 08:00:00 +0000 https://office365itpros.com/?p=64268

Office Apps Get Better Grounding in Copilot for Microsoft 365

Message center notification MC734281 (12 March 2024) might have passed by without too much attention unless you’re particularly interested in Copilot for Microsoft 365. The notification informs tenants that Word, Excel, PowerPoint, and OneNote will ground user prompts by reference to enterprise data and the web. As Microsoft notes, this is like what happens when users interact with Copilot for Microsoft 365 chat.

Grounding against enterprise data means that when Copilot responds to user prompts, it will seek additional context by attempting to find relevant information in Microsoft 365 repositories using Graph requests. Web grounding means that Copilot will use Bing search to find relevant information from sites within and outside the enterprise. The fact that major apps will start to use grounded requests from April 2024 might come as a surprise. After all, Microsoft has long cited Copilot’s ability to use the “abundance of data” stored in Microsoft 365 as a major advantage of Copilot for Microsoft 365 over other AI tools that don’t have access to Microsoft 365 repositories.

The roll out starts with Word (Windows and Online) and progresses to PowerPoint, Excel, and OneNote. Microsoft expects to complete the deployment by September 2024.

The Importance of Grounding

Microsoft explains that grounding is “the process of using large language models (LLMs) with information that is use-case specific, relevant, and not available as part of the LLM’s trained knowledge.” In other words, if you ask Copilot for Microsoft 365 to do something and grounding doesn’t happen, it relies on the user prompt to query the LLM.

Until now, users have been able to ground prompts in apps like Word by including up to three reference documents in the prompt. Let me illustrate the importance of grounding by showing an example of two briefing notes generated by Copilot in Word about the Midnight Blizzard attack against Microsoft in January 2024. Copilot generated the first briefing note without any reference documents. Because it couldn’t search the Graph or web for relevant information, the grounding of the prompt was poor, and Copilot could only use whatever information is in the LLM.

As shown in Figure 1, the generated text included several inaccurate statements (hallucinations), including the remarkable assertion that the attack led to a drop of $400 billion in Microsoft’s market value, together with a declaration that the attack had deprived millions of Microsoft cloud users of access to services.

Briefing note about Midnight Blizzard generated by Copilot for Microsoft 365 (without reference documents).
Figure 1: Briefing note about Midnight Blizzard generated by Copilot for Microsoft 365 (without reference documents)

If some relevant reference documents are included in the prompt, Copilot’s generated text becomes more accurate and balanced (Figure 2).

Briefing note about Midnight Blizzard generated by Copilot for Word with reference material.
Figure 2: Briefing note about Midnight Blizzard generated by Copilot for Word with reference material

The important point here is that after Microsoft updates Copilot to allow the Office apps to ground prompts using Graph and web material, the chances of Copilot generating absolute rubbish lessen considerably. That is, if Copilot can find relevant information through its searches. Adding reference documents to prompts in Copilot for Word will generate even better results because the reference documents should give Copilot a more precise context to work with.

Microsoft says that Graph grounding is enabled for all user prompts and that Copilot requests will use “the file context” (whatever file is open at the time) plus web searches as well. Copilot for Microsoft 365 chat uses Graph and web lookups today.

The Quality of AI-Generated Text

In some respects, I was shocked that it has taken so long for Microsoft to ground Copilot requests in these important apps. Copilot for Microsoft 365 is evolving rapidly, but the ability to generate high-quality text at general availability seems like an essential rather than a nice-to-have feature. I’ve always been suspicious about the quality of the text generated by Copilot in Word, and this revelation certainly explains a lot.

Take Your Time

The advice of Directions on Microsoft analyst Wes Miller that organizations should pace themselves and understand exactly what they are buying before they invest in expensive Copilot licenses is accurate. Things are changing, and the hyperbole around Copilot is like a dust storm that obscures detail. Why rush in where angels fear to tread?

Before making your mind up about Copilot, take the time to read the article posted by MVP Joe Stocker where he reports a drop-off of Copilot activity after the novelty effect of asking the AI to perform tasks wears off. Although the sample size was small, this emphasizes the need to support users on their Copilot journey, especially as important new functionality like Graph and web grounding appears.

And if you attend the Microsoft 365 Conference in Orlando at the end of April, make sure that you come to my session about not letting Copilot for Microsoft 365 become a vanity project. You might even enjoy what I have to say!


So much change, all the time. It’s a challenge to stay abreast of all the updates Microsoft makes across the Microsoft 365 ecosystem, including in Copilot. Subscribe to the Office 365 for IT Pros eBook to receive monthly insights into what happens, why it happens, and what new features and capabilities mean for your tenant.

]]>
https://office365itpros.com/2024/03/25/copilot-for-microsoft-365-grounding/feed/ 1 64268
Can Copilot for Microsoft 365 Save Users 14 Hours a Month? https://office365itpros.com/2024/03/12/copilot-for-microsoft-365-14hrs/?utm_source=rss&utm_medium=rss&utm_campaign=copilot-for-microsoft-365-14hrs https://office365itpros.com/2024/03/12/copilot-for-microsoft-365-14hrs/#respond Tue, 12 Mar 2024 01:00:00 +0000 https://office365itpros.com/?p=64051

It All Depends on the Person and How They Use Office

Personal perspectives of using technology are often valuable guides to how useful products will be in production. Given the current hype around Copilot for Microsoft 365, I was interested to read a LinkedIn post by Microsoft employee Luka Perne. Based on his use of Copilot over several months logged on a per-task basis, Perne believes he saves 14 hours per month. That’s quite an impressive number that more than justifies the $30/month Copilot license.
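As a rough sanity check on the economics (my arithmetic, not Perne’s, and the $50/hour labor cost is an assumed placeholder), the break-even point is easy to compute:

```python
# Rough break-even check for a $30/user/month Copilot license
# against a claimed saving of 14 hours/month (illustrative only).

LICENSE_COST = 30.0    # USD per user per month
HOURS_SAVED = 14.0     # hours per month, per Perne's report

# Hourly value of time at which the license pays for itself
break_even_rate = LICENSE_COST / HOURS_SAVED
print(f"Break-even: ${break_even_rate:.2f} per hour")

# Monthly value of the saved time at a sample fully loaded labor cost
hourly_cost = 50.0     # assumed figure; varies by organization
value_of_savings = HOURS_SAVED * hourly_cost
print(f"Value of 14 saved hours at $50/h: ${value_of_savings:.0f}")
```

In other words, if 14 hours really are saved, the license pays for itself whenever an hour of a user’s time is worth more than about $2.14, which is why the claimed number matters so much more than the license price.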

It’s always important to put personal observations in context and ask yourself if a product would work as well for you, especially when reading a report written by someone who works for the vendor. I’m sure that some gain enormously from Copilot for Microsoft 365, just as I’m equally convinced that success with Copilot depends on many individual factors.

Not a Marketing Document

What I liked about this report is that it is not trying to sell Copilot. If you look at Microsoft’s marketing material, Copilot works wonderfully because what you see are carefully selected scenes that show Copilot working with data selected to demonstrate its strengths. This coverage is more practical and informative.

For instance, Perne makes the point that people go through a learning curve as they interact with Copilot. Some progress faster and discover how to extract value quickly. Others struggle with prompts or are unsure how Copilot can help. That’s why it’s important to educate and support users during a Copilot deployment project.

Where Success is Found for Copilot for Microsoft 365

Microsoft employees working in engineering and services roles tend to be more comfortable with new technology than the average Microsoft 365 user. Copilot support for users (informal and formal) is likely better and more comprehensive than elsewhere, and users are motivated to explore the capabilities of the technology, including mastering the technique of constructing effective prompts. Overall, I suspect that a technology like Copilot is adopted more easily inside Microsoft than in customer environments.

Perne says that he’s been working with Copilot for four months. Some will gain the same increase in productivity he reports, but I suspect it will take others many months before they do the same.

As Perne notes, he values specific Copilot features. This matches my own experience, where the summaries generated by Copilot for Teams meetings, Outlook email threads, and documents (Figure 1) are easily the most valuable in terms of time savings. Anyone who has ever worked with Microsoft (especially the corporate teams) can attest to the number of meetings people attend, and the ability to generate a quality summary from the meeting transcript is much appreciated, especially when multiple meetings occur at the same time.

Figure 1: Working with Copilot for Microsoft 365 in a Word document

Copilot’s ability to create and rewrite text can help people unsure of their writing skills. In my case, I think I do just as well by reviewing the suggestions made by Editor or Grammarly. Copilot is good at generating the outline of a document. However, the accuracy of the material Copilot uses to flesh out the outline depends on being able to find relevant information in SharePoint Online or OneDrive for Business. Without something to use, Copilot often strays into made-up text that reads well without being accurate.

Perne generated the graphics in his article with Excel, but notes the limitations Copilot currently has in Excel, like only working for tables with fewer than 10,000 rows. I’m sure this is an area that Microsoft will improve in the future. For now, I agree with the observation that I’ve picked up enough Excel over the years to survive without Copilot for the kind of worksheets I deal with.

The assertion that Copilot always delivered improved results for a non-native English speaker when it came to generating or rewriting text was insightful, and I think fair. Many large organizations have a corporate language in which most communication takes place. For Microsoft, that language is English, and I can see how useful Copilot is when asked to rewrite or correct text. The output will be bland, but it will be precise and readable, and that’s important in email and documents.

Can You Track Your Copilot Results?

The net result is that many factors influence the ability of Copilot for Microsoft 365 to save time. If you’re technically literate, skilled in using Word, PowerPoint, Outlook, and Excel, attend a lot of meetings, and store the material you work with in SharePoint Online and OneDrive for Business, the probability is that you will achieve good results. Whether you save 14 hours per month is another matter. Tracking savings using the same methodology as Perne is certainly one way to assess the outcome, if you’re as diligent as he was at noting results.


Keep up to date with developments like Copilot for Microsoft 365 by subscribing to the Office 365 for IT Pros eBook. Our monthly updates make sure that our subscribers understand the most important changes happening across Office 365.

]]>
Microsoft Kills Viva Topics to Focus on Copilot https://office365itpros.com/2024/02/23/viva-topics-retirement/ Fri, 23 Feb 2024

Viva Topics Retirement Propelled by More Lucrative Copilot Opportunity

In a surprise announcement posted in Microsoft 365 message center notification MC718486, Microsoft said that it will retire Viva Topics on February 22, 2025, and will stop new feature development as of February 22, 2024. Originating as part of Project Cortex, Microsoft launched Viva Topics as one of the four modules in its new Viva employee experience platform in February 2021. Support documentation covering the retirement is available online, as is a FAQ.

The idea behind Viva Topics was that organizations could leverage their investment in SharePoint Online by creating a curated knowledge network about topics important to the business. Knowledge editors would maintain the topics and link them to sources. Users could consume the information in the knowledge network by inserting topics into the natural flow of communications created in Outlook messages, Teams chats and channel conversations (Figure 1), or SharePoint documents. The latest development was to expose topics in the Microsoft 365 user profile card.

Figure 1: Viva Topics in a Teams channel conversation

There’s some great technology in Viva Topics. Alas, great technology doesn’t always survive the acid test of the market. Some Microsoft 365 tenants use Topics, but I don’t see any evidence of a major groundswell of projects. The level of discussion about Topics is low in online forums, and it’s not a subject for sessions submitted to major Microsoft 365 conferences. Although hardly a rigorous test, it is undeniable that potential speakers submit sessions for technology that interests them or that they work on. I cannot recall seeing a submission for a Viva Topics session in the last year.

Knowledge Management is Hard

Knowledge management is hard. Anyone who set up and managed a knowledge network for Viva Topics will appreciate that the AI-powered harvesting of topics from content stored in SharePoint Online can generate hundreds or thousands of topics to curate, refine, and publish, all of which takes time. The work of the knowledge managers might not be appreciated by end users, or even recognized if end users don’t receive education about how to use Topics.

Even though Microsoft announced lightweight management for Topics through Viva Engage in July 2023 and Copilot in Viva Topics in April 2023, hindsight shows that the company’s heart had been snatched by Copilot and the clarion call to development groups to create Copilot-branded experiences.

Copilot Wins the Game and Forces the Viva Topics Retirement

Apart from being swept along by the Copilot wave, I think hard business logic is a major driving factor behind Microsoft’s decision to retire Viva Topics. Copilot for Microsoft 365 brings in $30/user/month plus the opportunity to upsell customers to more expensive Office 365 or Microsoft 365 licenses. Microsoft’s pricing for Viva Topics varied over the years. According to Copilot, a Viva Topics license brings in $4/user/month (Figure 2).

Figure 2: Copilot figures out the cost of Viva Topics licenses

Even when included in the Viva Communications and Community license, Topics cannot contribute anywhere close to the revenue that Copilot will likely deliver over the next five years. In addition, Viva Topics is usually a much harder project to sell, and its implementation lacks the excitement and glamor currently associated with Copilot. I mean, topic refinement compared to AI-generated email and documents?
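The revenue gap is stark even on the back of an envelope. Using the list prices mentioned above ($30 versus $4 per user per month) and ignoring discounts, bundling, and churn, the five-year picture per user looks like this:

```python
# Illustrative five-year, per-user revenue comparison at list prices.
# Ignores discounts, bundles, and churn; purely back-of-envelope.

copilot_monthly = 30   # USD/user/month, Copilot for Microsoft 365
topics_monthly = 4     # USD/user/month, Viva Topics (per Figure 2)
months = 5 * 12        # five years

copilot_revenue = copilot_monthly * months
topics_revenue = topics_monthly * months

print(f"Copilot: ${copilot_revenue}, Topics: ${topics_revenue}, "
      f"ratio: {copilot_revenue / topics_revenue:.1f}x")
```

Seven and a half times the revenue per seat goes a long way toward explaining where the engineering investment flows.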

Looking at the situation through the business lens, it makes absolute sense for Microsoft to retire Viva Topics and realign the engineering resources from that program to work on other AI-related projects, such as the “new AI-powered knowledge management experiences” promised in the announcement.

Third Time Lucky

Microsoft’s record in knowledge management is not stellar. The next-generation knowledge portals promised at Ignite 2015 vanished as soon as the attendees left Chicago and its infamous baloney conference lunches behind. Now Viva Topics is being retired. Microsoft has put all its knowledge management eggs in the Copilot basket. Let’s hope that the next round of knowledge applications powered by Copilot demonstrates once again Microsoft’s habit of getting things right the third time around.


Insight like this doesn’t come easily. You’ve got to know the technology and understand how to look behind the scenes to understand why the Viva Topics retirement happened. Benefit from the knowledge and experience of the Office 365 for IT Pros team by subscribing to the best eBook covering the Microsoft 365 ecosystem.

]]>
Stopping Copilot Access to SharePoint Online Sites and Document Libraries https://office365itpros.com/2024/02/21/exclude-sharepoint-site-from-copilot/ Wed, 21 Feb 2024

Exclude SharePoint Site from Copilot by Blocking Search Indexing

One of the fundamental concepts underpinning Copilot for Microsoft 365 is the use of Graph queries to find information stored in Microsoft 365 to help ground user prompts. Grounding is the process of providing additional context to make it easier for Copilot to return high-quality responses to user prompts. For instance, if someone asks Copilot to write a briefing note about Office 365, Copilot first queries Microsoft 365 repositories like SharePoint Online to discover what information the user already has about the topic. Optionally, if allowed by the tenant, Copilot can query the web to find additional information.

After gathering information, Copilot refines the prompt and sends it to the Large Language Model (LLM) for processing. Eventually, possibly after further refinement, Copilot returns the response to the user.
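The flow described above can be sketched in a few lines. To be clear, none of the function names below is a real Copilot or Graph API; they are illustrative stand-ins showing the order of operations, with canned data in place of real query results:

```python
# Illustrative sketch of the grounding flow: query Microsoft 365
# content the user can access, optionally query the web, then refine
# the prompt before it goes to the LLM. All functions are stand-ins.

def graph_search(user, topic):
    """Stand-in for a Graph query over SharePoint Online/OneDrive
    content that the signed-in user can access."""
    return ["Office 365 launched in 2011",
            "Office 365 was rebranded as Microsoft 365 for most plans"]

def web_search(topic, allowed):
    """Optional web grounding, only if permitted by the tenant."""
    return [f"Public background on {topic}"] if allowed else []

def ground_prompt(user, prompt, topic, allow_web=False):
    """Refine the raw prompt with grounding context before sending
    it to the LLM for processing."""
    context = graph_search(user, topic) + web_search(topic, allow_web)
    return prompt + "\n\nContext:\n" + "\n".join(context)

refined = ground_prompt("user@contoso.com",
                        "Write a briefing note about Office 365",
                        "Office 365", allow_web=True)
print(refined)
```

The point of the sketch is the ordering: grounding happens before the LLM ever sees the prompt, which is why what Copilot can find determines the quality of what it writes.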

Copilot Access to Content Stored in Microsoft 365 Repositories

One of the things you quickly learn about Copilot for Microsoft 365 is that the quality and reliability of generated text is highly dependent on the availability of information. For instance, Copilot is very good at summarizing Teams meetings because it has the meeting transcript to process. However, if you ask Copilot to draft text about a topic where it cannot find anything in Microsoft 365 to ground the prompt, Copilot will certainly generate a response, but the text might not be as useful as you expect. The output will follow the requested format (a report, for instance), but the content may surprise because it comes from a web search that might or might not retrieve useful information.

Users can guide Copilot for Word by providing up to three reference documents. In effect, the user instructs Copilot that it should use the reference documents to ground the prompt. This works well, unless the documents you want to use are large (I am told that Microsoft is increasing the maximum supported size for reference documents).

All of this means that anyone contemplating a deployment of Copilot for Microsoft 365 should store information within Microsoft 365 to create what Microsoft calls an “abundance of data” for Copilot to consume. SharePoint Online and OneDrive for Business are prime repositories, but it’s possible that some SharePoint Online sites contain confidential or other information that the organization doesn’t want Copilot to consume.

Remember, Copilot can only use information that the signed-in account using Copilot can access. An account that has access to a site holding confidential information could find that Copilot retrieves and uses that information in its responses. The user is responsible for checking the text generated by Copilot, but accidents do happen, especially when time is short to get a document out.

Preventing Copilot Access to Sensitive Information

Two methods help to avoid accidental disclosure of confidential information. First, you can protect files with sensitivity labels. If Copilot consumes protected documents, it applies the same sensitivity label to the output.

However, not every organization uses sensitivity labels. In this situation, an organization can decide to exclude selected SharePoint Sites from indexing (Figure 1) by both Microsoft Search and the semantic index. If content is not indexed, it can’t be found by queries and therefore cannot be consumed by Copilot.

Figure 1: Exclude SharePoint Site from Copilot Access by Stopping it Appearing in Search Results

But what happens if you have a SharePoint site with several document libraries and want to make the content available from some libraries and not others? The answer is the same, except that the exclusion from search results is applied through the document library’s advanced settings (Figure 2).

Figure 2: Settings for a document library

The downside of excluding sites or libraries from search results is that people can’t use SharePoint search to find documents.
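For administrators who prefer scripting the library-level change over clicking through settings, the checkbox maps to the NoCrawl property of the underlying SP.List object, which SharePoint’s REST API can update. The sketch below only builds the request (the site URL and library name are placeholders, and authentication via an OAuth bearer token is omitted), so treat it as a starting point rather than a finished script:

```python
import json

# Sketch: build the SharePoint REST call that excludes a document
# library from search indexing by setting NoCrawl = true on the list.
# Site URL and library title are placeholders; auth is omitted.

def build_nocrawl_request(site_url, library_title):
    """Return the URL, headers, and body for a MERGE update that
    flips the library's NoCrawl flag."""
    return {
        "url": f"{site_url}/_api/web/lists/getbytitle('{library_title}')",
        "method": "POST",
        "headers": {
            "X-HTTP-Method": "MERGE",   # update-in-place semantics
            "IF-MATCH": "*",            # skip the concurrency check
            "Content-Type": "application/json;odata=verbose",
        },
        "body": json.dumps({
            "__metadata": {"type": "SP.List"},
            "NoCrawl": True,
        }),
    }

req = build_nocrawl_request(
    "https://contoso.sharepoint.com/sites/Projects",  # placeholder
    "Confidential Documents")                          # placeholder
print(req["url"])
```

Sending the request with an authenticated HTTP client applies the same exclusion as the manual setting, which is handy when many libraries need the treatment.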

Testing Excluded Sites and Document Libraries

How do you know site and document library exclusions work? The easiest way is to create a document with an unusual phrase in the excluded site or library and then attempt to use it with Copilot for Word. I created a document about ‘Project Derrigimlagh’ and included the phrase ‘wicked worms’ several times in the content. I then created a new Word document and added the document from the excluded library as a reference (Figure 3).

Figure 3: Selecting a reference file for Copilot for Word

You might ask why the document can be added as a reference. The dialog shows recent documents, and the document is in this category, so it shows up. However, when Copilot attempts to consume the document, it cannot access the content. The result is that the prompt cannot be grounded and Copilot flags this as a failure to generate high-quality content (Figure 4). This is a general-purpose error that Copilot issues anytime it believes that it cannot respond to a prompt.

Figure 4: Copilot for Word can’t generate high-quality content

Interestingly, when I removed the reference document and reran the prompt, Copilot generated text explaining the potential use of wicked worms as a biofuel source. This is emphatically not the content stored in the excluded document library. The information about Derrigimlagh came from the internet, and making wicked worms into a biofuel source is probably due to published material about using worms in a biorefinery. In any case, it’s a good example of how AI-based text generation needs to be treated with caution.

Use Sensitivity Labels If Possible

If an organization has implemented sensitivity labels, I think this is a better method to protect confidential material, if only because labels persist to generated documents. You can also define a default sensitivity label for a document library to make sure that everything stored in the library is protected, and use auto-label policies to find and protect confidential material stored across all sites.

In a nutshell, sensitivity labels are more flexible and powerful, but it’s nice to have the backup of being able to exclude complete sites and individual document libraries. Just another thing to consider in a Copilot deployment!


So much change, all the time. It’s a challenge to stay abreast of all the updates Microsoft makes across Office 365. Subscribe to the Office 365 for IT Pros eBook to receive monthly insights into what happens, why it happens, and what new features and capabilities mean for your tenant.

]]>