Generative AI, Copilot, and the Future of Security Vigilance
Taking a Look at Microsoft’s Venture Into Generative AI and How That Impacts ChatGPT
- Microsoft just released Copilot, a product line that applies generative AI to long-existing IT admin solutions, hoping to improve the quality of those tools.
- Copilot is designed to compete with and beat free tools like ChatGPT by reaching into existing work data to better generate the content users need.
- Copilot has its pros and cons (as outlined below), but either way, it is exciting to see the progress made in generative AI, as well as its inevitable application to the tools average users rely on every day.
Generative AI is the tech theme for 2023. It’s being adopted and pulled into development in every organization, and Microsoft is no exception. Microsoft just released its new Copilot product line, which applies generative AI to solutions IT admins have used for decades, with the hope of improving the quality of work, the quantity, or both.
What is Copilot?
Copilot is AI built into Windows, Bing, the Office suite, and more. It comes at a cost, but the difference between the free ChatGPT version and Copilot is the ability to reach into existing work data to help generate the content the end user needs. Microsoft defines Copilot as a combination of “large language models (LLMs) with your data in the Microsoft Graph,” which includes your calendar, email, chats, documents, meetings and more.
Note: If you want to dive a bit deeper into LLMs, check out the article on Microsoft Tech Community.
Graph APIs pull source material out of your emails and documents, and based on the prompts you provide, Copilot responds to your requests. Having immediate access to that data certainly makes Copilot more interesting to consider than the public options. But there are some negatives, too.
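To make the grounding step concrete, here is a minimal sketch of how a Copilot-style tool might fetch relevant mail from Microsoft Graph before handing it to an LLM as context. The `/me/messages` route and the `$search`/`$select`/`$top` query options are real Graph v1.0 conventions, but the helper function and the surrounding workflow are illustrative assumptions, not Copilot's actual implementation.

```python
# Hypothetical sketch: build a Microsoft Graph query that pulls recent
# mail related to a user's prompt. The actual call would also need an
# OAuth bearer token, which is omitted here.
import urllib.parse

GRAPH_BASE = "https://graph.microsoft.com/v1.0"

def build_message_query(search_terms: str, top: int = 5) -> str:
    """Return a Graph URL selecting a few fields from matching messages."""
    params = urllib.parse.urlencode({
        "$search": f'"{search_terms}"',          # full-text search over mail
        "$select": "subject,bodyPreview,from",   # only what the LLM needs
        "$top": top,                             # limit the context size
    })
    return f"{GRAPH_BASE}/me/messages?{params}"

url = build_message_query("quarterly budget")
# The subjects and previews returned by this query would then be folded
# into the LLM prompt, which is what distinguishes Copilot from a
# generic, ungrounded chatbot.
```

The point of the sketch is the pattern: retrieve a small, targeted slice of the user's own data first, then generate from it.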
Copilot price point = Selective usage
At The Experts Conference (TEC) recently, Microsoft MVP Tony Redmond discussed Copilot’s pros and cons. One major con he pointed out is the expense. At roughly $30 per user, per month (with the right plan in place to start), does it really make sense to give Copilot to all users? At that price, universal deployment isn’t a given. Rather, there needs to be a conversation about which users would increase their productivity enough to warrant the expense.
We recently gave a few tips on how to use ChatGPT (a free solution) to accomplish admin-oriented work. End users can make use of it, but it is public-facing and has no access to their personal M365 data repository, which may or may not matter. Free AI tools can already do some or all of what Copilot will do, but that extra Graph API connection may give Copilot the edge.
Organizations have some decisions to make. Unfortunately, the Copilot price point makes it somewhat prohibitive to deploy universally. We must also consider that the supposed power drain of each AI-enabled user may be something Microsoft is happy to keep to a minimum in the short term as it gets into the AI game.
Microsoft Security Copilot
What we’ve been seeing from Microsoft is that Copilot is not a single product, but an umbrella for all of its solutions that have integrated AI. A good example is GitHub Copilot, which uses generative AI to help coders work faster and with greater accuracy (i.e., fewer bugs). Microsoft also has a related new security piece called Security Copilot, which apparently allows users to ask questions in natural language and receive actionable responses for faster incident response. It taps into other Microsoft solutions (Sentinel, Defender, Intune) to generate guidance specific to the organization. This mirrors the added value of Copilot for M365, which uses the M365 Graph to generate content specific to the user (as opposed to generic ChatGPT responses). Security Copilot has access to an organization’s security settings (good and bad) and can see the daily success or failure of those settings, providing focused, actionable insights and direction.
Do You Trust Your Copilot?
Whether called HAL, Skynet, V.I.K.I., Otto, ChatGPT or Copilot, we’re still talking about generative AI that is far from perfect. ChatGPT, for example, has been known to produce a class of errors called “hallucinations.” A quick search for ChatGPT hallucinations will show you how wild and varied these can be. Examples range from factual inaccuracies such as “the capital of France is London” to completely fabricated books, laws, and more.
It’s exciting, however, to see the progress being made in generative AI, and it’s obvious Microsoft is putting a lot of effort into development. At the recent Ignite Conference, 250 of the 500 sessions revolved around generative AI and Copilot. If that’s any indication of what 2024 (and beyond) will be about, I think we’re in for an interesting ride.