Microsoft's core platform isn't software, it's trust

For the first time in a half-decade, I watched a Microsoft Build keynote this morning. Time gives fresh perspective, looking at where the company was compared to where it is today. Listening to CEO Satya Nadella and other Softies, I repeatedly found myself reminded of Isaac Asimov's Three Laws of Robotics and how they might realistically be applied in the 21st century. The rules, whether wise or not, were set down to ensure that humans could safely interact with complex, thinking machines. In Asimov's science fiction stories, the laws were core components of the automaton's brain—baked in, so to speak, and thus inviolable. They were there by design, foundationally.

Behind all product design, there are principles. During the Steve Jobs era, simplicity was among Apple's main design ethics. As today's developer conference keynote reminded, Microsoft embraces something broader—design ethics that hark back to the company's founding objectives and others that serve a purpose similar to the robotic laws. On the latter point, Nadella repeatedly spoke about "trust" and "collective responsibility". These are fundamental principles of design, particularly as Artificial Intelligence usage expands and more corporate developers depend on cloud computing platforms like Azure.

"To us, really thinking about the trust in everything that we build, in the technology we build, is so core—and as engineers, we need to truly incorporate this in the core design process, in the tooling around how we build things", Nadella said today.

Microsoft's core platform isn't software, such as Azure or Office 365; it is trust. It is collective responsibility. And going back to one of the founding precepts—put "a computer on every desk and in every home"—the more expansive extension is "empowerment". Typically I regard such keywords as marketing speak. But here, the company communicates core ethical principles embedded into its products, much as Asimov envisioned the three laws being part of the robotic brain.

Trust is the cornerstone. Whether for developers creating new products on Microsoft's software platforms or for corporations and other institutions relying more on AI and the cloud, the T word means everything. The Apollo 11 moon landing simulation demo that should have opened the keynote failed to launch. But many of the other demos, and their supporting announcements, showed a present and future computing landscape where humans can trust that technology will make their lives better—and developers are platform partners achieving the objective. I was particularly impressed, for example, by the Azure Speech Service, which accurately transcribed a meeting conversation while correctly identifying each of the people speaking. Trust is a foundational design principle, particularly for the users who will depend on the product.

Collective responsibility could be interpreted several ways. Microsoft released ElectionGuard, an open-source dev kit for enabling secure, verifiable voting results that can be trusted. The SDK is part of the company's Defending Democracy Program, which includes Office 365 for Campaigns. All three design ethics apply: Collective responsibility, empowerment, and trust.

The same can be said of Microsoft's approach to Artificial Intelligence and Intelligent Agents—advocating development interoperability and availability anytime, anywhere. Relatedly, Nadella explained how the Open Data Initiative "allows you to break free of any one silo. An AI-first company is one where you can take data from one system and make the outcomes of the other system better. It's not about just optimizing that one system and its data, it's about being able to relate the insights, the reasoning from one to improve the outcomes in the other. And that's what the architecture of ODI enables".

For AI-driven digital assistants, Nadella sees openness as crucial to the category's success—and Microsoft plans to champion that future: "We need a multi-agent world. The idea that you're always going to start with one wake word and one assistant is just not like how we start on the web, for example. Just imagine, what does an open assistant future look like? Similar to an open web, that's what we want to really ensure happens when it comes to the personal assistant".

In an amazing demo, a woman used an intelligent agent on her iPhone, where natural language interaction and anticipatory responses were (frighteningly) human-like and completely conversational. Empowerment and trust.

Build 2019 will continue for two more days, and the three principles that I highlight are not the only ones incorporated into Microsoft's design ethic. Tomorrow, Google's developer conference commences, representing another company pushing AI. There are fundamental differences between Google's approach and Microsoft's, but that is a topic for another day.

----------------------------------------------------

From my colleagues, regarding Build 2019 Day One:

"Microsoft announces Internet Explorer mode for Edge to aid enterprise compatibility", Ian Barker

"Windows Terminal is a new Linux-inspired command line app for Windows 10", Wayne Williams

"Open source champion Microsoft announces Windows Subsystem for Linux 2 (WSL 2) at Build 2019", Brian Fagioli


© 1998-2024 BetaNews, Inc. All Rights Reserved. Privacy Policy - Cookie Policy.