Microsoft has published a new interview with the head of Windows, Pavan Davuluri, in which he shared some interesting details about how Windows is going to change in the future. Unsurprisingly, he wants Windows to integrate more closely with AI technology.
Davuluri said that computing will become “more ambient, more pervasive, continue to span form factors, and certainly become more multi-modal in the arc of time”. This means the operating system is going to change fundamentally and begin to prioritize new ways of interacting with your computer, like using your voice and having the OS understand what’s on your screen at all times. Davuluri revealed that Microsoft is making a huge investment in making the OS “agentic” and multi-modal, and he teased that the Windows of the future won’t look like it does today.
The next version of Windows will likely put voice at the center of how you interact with your computer. Davuluri says you’ll be able to “speak to your computer while you’re writing, inking, or interacting with another person. You should be able to have a computer semantically understand your intent to interact with it,” which sounds like an AI that happens to also be a computer.
This isn’t the first time Microsoft has hinted at this kind of change; another Microsoft executive teased a “Windows 2030 Vision” video that also treats voice as a first-class input method. It seems clear that Microsoft wants us to be able to talk to our computers in addition to using a keyboard and mouse. The OS will infer your intent from what’s on your screen and take that context into account, which could make your workflow a lot easier if you choose to use your voice. You’ll be able to tell the computer what to do in natural language, and it will perform tasks for you.
This sounds a lot like how Gemini watches your screen when you share it. However, a lot of people are going to be worried about privacy. It will take a lot of personal data to make these features truly useful, but Davuluri says Microsoft is building these models with privacy in mind. He says the company is being thoughtful about privacy and security requirements and making sure that “data stays local” and that “customers are in control of the choices associated with the decisions themselves.”
Davuluri said that AI models on the device bring “a bunch of new capabilities and agencies… to the platform and the device itself”. He gave a few examples, like an improved version of Windows Search. Traditional Windows Search relies on a lexical indexer that matches keywords; in the new vision, it would be augmented by a semantic indexer that understands the content being searched, giving you a much better chance of finding what you’re looking for.
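To make the distinction concrete, here is a minimal sketch of lexical versus semantic file search. This is not Microsoft’s implementation; it assumes the open-source sentence-transformers library as a stand-in for whatever on-device embedding model Windows would actually use, and the file names and query are invented for illustration.

```python
from sentence_transformers import SentenceTransformer, util

# Hypothetical file names standing in for a user's local index.
files = [
    "Q3 budget forecast.xlsx",
    "Photo of the dog at the beach.jpg",
    "Meeting notes - project kickoff.docx",
]

query = "puppy ocean picture"

# Lexical search: only literal keyword overlap counts, so synonyms like
# "puppy"/"dog" or "ocean"/"beach" never match.
lexical_hits = [f for f in files if any(word in f.lower() for word in query.lower().split())]
print("Lexical hits:", lexical_hits)  # [] -- nothing found

# Semantic search: embed the query and the file names, then rank by cosine
# similarity, so the beach photo still surfaces for "puppy ocean picture".
model = SentenceTransformer("all-MiniLM-L6-v2")
scores = util.cos_sim(model.encode(query), model.encode(files))[0]
ranked = sorted(zip(files, scores.tolist()), key=lambda pair: pair[1], reverse=True)
print("Semantic best match:", ranked[0][0])
```

In an on-device version of this idea, the embeddings would presumably come from the local models Microsoft ships with Windows rather than a downloaded library model, but the underlying ranking concept is the same.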
It feels like Microsoft has been building up to an AI-focused future for a while, given how it keeps adding Copilot to its apps. So while this isn’t an unexpected turn, it’s still striking to hear the company spell it out so plainly.