Context is king - the race for integration
This week we are seeing a tonne of developments in the front-end experience of the leading AI models, much of it focused on context.
Let me summarise the latest updates and explain why this is important to you and your teams.
We are the weakest link
As the models get better with every release, for most employees’ requirements, the limiting factor is now the human.
I always think of fighter jets, where it is the pilot’s body that is the limiting factor. The plane could turn faster, but the pilot would black out, even with the G-suit they are provided with.
Right now, for the vast majority of your employees, the AI model is the jet, and it is your employees knowing what to ask - and, importantly, how to ask it - that is the limiting factor.
For AI to do its best work it needs context
If I ask about developing a product launch strategy, the AI needs to know about our company, the new product, the pricing model, the competitive landscape, and the target customer if it is to provide a relevant and valuable answer.
You likely have all of this information somewhere else - Google Docs, a shared folder, or a project management tool.
If you can share all of this context with the AI when you ask about your product launch, then the model will be able to use that in its response.
But humans are lazy. Downloading the relevant files, knowing which files will be useful to the chat as it progresses, keeping the files fresh as they are updated - well, that is a challenge.
Even if you do it once, you probably won’t the second time around.
The AI vendors recognise this and are working hard to bring your context into your chats without you having to think.
Model Context Protocol (MCP)
If you haven’t heard of MCP, it is a protocol designed and defined by Anthropic at the end of last year, and it is now gathering pace, picked up by OpenAI and other AI vendors as the standard for how models interact with other tools to gather context.
Other protocols you may be familiar with include the Hypertext Transfer Protocol (HTTP), which powers the web, or the Simple Mail Transfer Protocol (SMTP), which defines how email is routed.
Using MCP, vendors are creating MCP servers that connect your on-premises and cloud-based applications to your AI models to provide the context they so desperately need.
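To make that a little more concrete, here is a minimal sketch of what an MCP server can look like, written with the official MCP Python SDK. Everything specific in it - the server name, the pretend knowledge base, the get_document tool - is an illustrative assumption rather than any vendor's real integration.

```python
# Minimal MCP server sketch using the official Python SDK ("pip install mcp").
# The server name, tools, and hard-coded data below are illustrative assumptions only.
from mcp.server.fastmcp import FastMCP

# Name the server so MCP clients (e.g. desktop AI apps) can identify it.
mcp = FastMCP("company-context")

# Hypothetical in-memory stand-in for your real documents.
DOCS = {
    "pricing": "Pricing model notes would live here.",
    "launch-plan": "Product launch strategy notes would live here.",
}

@mcp.tool()
def get_document(name: str) -> str:
    """Return the contents of a named internal document."""
    return DOCS.get(name, f"No document found called '{name}'.")

@mcp.resource("docs://{name}")
def read_document(name: str) -> str:
    """Expose the same documents as resources a client can attach as context."""
    return DOCS.get(name, "")

if __name__ == "__main__":
    # Runs over stdio by default, which is how desktop AI clients
    # typically launch and talk to local MCP servers.
    mcp.run()
```

You won't need to build one of these yourself - the vendors and your IT team will do that - but it is useful to see that the moving parts are small: a named server, plus tools and resources the model can call on to pull in your context.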
Google Workspace integration
Among the most advanced and widely adopted of these are the Google Workspace integrations being rolled out by the AI vendors.
ChatGPT
ChatGPT allows you to integrate with Google Workspace and then use Google’s native document picker to add documents to your chat.
Note that you cannot yet add Google Docs to ChatGPT Projects, which is the top use case for me.
Grok
Grok this week announced their own integrations with Google Workspace. In the same way as with ChatGPT, you can add the integration and use the native Google picker to select a document and add it either directly to a chat or to a Grok Workspace (their equivalent of a project).
Gemini
With Google, you would expect deep integrations into their wider stack - and they are getting there.
You can directly add Google Docs, Sheets and Slides to an individual chat, or to a Gem (their version of projects).
Claude
And finally to Claude, from Anthropic, the original developer of MCP. Instead of using the standard Google pickers, they have built a deeper integration that surfaces your Google experience within the Claude UI.
For a while you have been able to add Docs natively to a chat or a project, and this week Anthropic launched a big update.
Now you can integrate with your entire Google Drive (as opposed to picking individual docs), your Gmail, and your Google Calendar.
In this example, I ask Claude to summarise the last three weeks of the Kowalah weekly newsletter.
Now consider how, using these integrations:
a manager might plan for a 1:1 with their direct report,
an HR leader might prioritise employee experience FAQs,
a product leader might summarise customer feedback.
No Microsoft?
There is an interesting side note here - all of the vendors are busy building Google Workspace integrations, not Microsoft 365 integrations.
This speaks to the collaborative nature of Google Docs, and we’ll see a second-order effect here: companies that use Google will get more value from the AI models than Microsoft customers, who will still need to remember to manually download and upload copies of their files to their chats.
Context is king
Whichever model you use, I’d encourage you to set up the Google integrations and test them out.
The best way to learn how valuable the context from your calendar, documents or emails can be is to chat away as if the model were your mentor or coach, and see how that context improves your interactions.
Get Started
Whenever you are ready, there are three ways I can be helpful:
Model 101 Playlist: 20-minute guides to using ChatGPT, Claude and NotebookLM for work
AI Inspiration Briefing: Show your people the path forward in this 90-minute live session
Kowalah: Buying platform to help you pick the right AI tools for your business