Microsoft Corporation (NASDAQ:MSFT) Jefferies Software Conference May 29, 2024 1:00 PM ET
Company Participants
Jessica Hawk - Corporate Vice President, Data, AI & Digital Apps
Conference Call Participants
Joe Gallo - Jefferies
Joe Gallo
All right. We're going to get started. I'm Joe Gallo. And today, we're delighted to host Jessica Hawk, Corporate Vice President of Data, AI and Digital Apps. She's been at Microsoft for three years, I believe. She previously co-founded Capax Global, a national systems integrator and one of the few super partners with Microsoft. But her role in all things data and AI makes her probably the most important or most interesting person at this conference, as well as at Microsoft. So thank you very much. And thank you as well to Kendra and James from Investor Relations for joining us.
Maybe to kick things off, you've been at Microsoft three years, which I think is interesting because most people at Microsoft have been there decades. And so you have a fresh perspective and you also co-founded Capax. Can you just give us a sense of where most of your time is being spent and what your biggest focus areas are now?
Jessica Hawk
Sure. Good morning, everybody. I spend most of my -- so product marketing is done differently at every organization around the world, for sure. At Microsoft, it is very much a product strategy role. So we spend a ton of time assessing market conditions, working with our engineering teams to find product-market fit, and doing a lot with sales enablement. So it's less of what I think people picture when they hear marketing -- the traditional campaign development and asset creation, imagery.
We have a sister org that does that. My role is to run the product strategy for most of Azure. And Azure is organized in three distinct areas: infra; data and AI; and digital applications and innovation. And so I have the latter two, and I partner with others in the company to focus on infra, which is where you would think about things like Windows VMs spinning up in Azure.
Joe Gallo
Awesome. Maybe before we dig into data and AI specifically, can you just remind investors how we should think about Microsoft and their AI strategy as a whole?
Jessica Hawk
Well, the timing for this event was spectacular because if you're watching us on LinkedIn at all, you know we had our major conference last week in Seattle called Microsoft Build. And I think Satya did a really nice job of laying out the three ways to think about Microsoft's plans with regard to our AI strategy. So we talked about Copilot+ PCs, Copilot and the Copilot stack. And so I focus on the Copilot stack. So it's everything from the AI infrastructure that is making all of this possible to the data and AI tools that our customers are taking advantage of.
So think less Microsoft Copilot, the Copilot experience you see in Microsoft Word, for example. Think more what customers are doing with the AI infrastructure, their data, their apps layer, and the developer tools that we make available to the market -- things like GitHub, Visual Studio, VS Code. That's the Copilot stack.
So it's truly where a customer is taking advantage of the same platform that is serving Microsoft Copilot -- they can go build their own Copilot experiences. And so that's really where we spent most of our time at the event. And when you think about the strategy, we are in a fantastic position because we are able to orchestrate across all three of those circles, if you will, and create connected experiences that can run from a PC or mobile experience, to what's happening in the apps that we deliver around the world in our productivity suite, to what our customers can then go either extend from the productivity suite or build new for their own customer and employee experiences.
Joe Gallo
That's a good segue -- your Build was last week. So for us investors, what were the most exciting announcements and what should we be focused on?
Jessica Hawk
We had a lot, over 50, and I'm telling you that was the short list of all of the things that we shipped across the business. But let's go with 50. So my role is to really work with Satya and his team on his keynote and then Scott Guthrie and his team on his keynote. And so Satya's keynote is a great way to go look at what we would say are the top items. I didn't count, but I would estimate maybe 50% or more of them were related to the Copilot stack. And then on day two, Scott took the world through the entire Copilot stack.
And so there are a couple of things that I would say sort of broke through. First of all, the concept of the Copilot stack, I would say, is one. We've been describing this concept since last year's Build. And at the time, going back to last year, what we were really trying to do was help software developers -- because that is our Build conference audience, it's software developers primarily -- just think through: okay, you already know how to build applications, you already know how to build software.
What is different or new about building applications that have gen AI built into them? So if you're at all technical, you know that software developers work off of a solution architecture diagram, and the Copilot stack was meant to at least help them understand what the boxes are that I need to now go think about when I'm building an app that's going to have gen AI in it. And then what we did on day two was Scott took the audience and our online audience through: okay, you understand the conceptual framework, you understand the three circles of the overall AI strategy. Now let's talk about the products that we're making available within each layer of the Copilot stack.
So customers can move as quickly as they want. Everybody is on an AI transformation mission right now. And so just understanding, with crystal clarity, what is Microsoft offering me, what are the tools and products and services I can use to go build these apps -- that's what we did on the Copilot stack.
And so among the top things that popped through, beyond just the concept of what the stack is and how companies can think about using it: Azure AI Studio went GA. And think of that as the single destination for developers and customers and partners to manage the gen AI-specific part of the app layer. It's not meant to replace GitHub Copilot or Visual Studio or any of the other developer spaces; those are all still very real and critical.
Azure AI Studio allows our customers to take a foundational model, tune it for their own internal use, and apply safety controls to it. And so it's a set of work very specific to gen AI that developers have to do, and we're excited to bring it to GA to give developers not only a reliable place to go do that work, but also a place to inspect the application of content safety -- our Azure AI Content Safety service, which is where the responsible AI tooling shows up. Not only do developers need that, but their decision makers and the leaders within their organizations need that. And we understand every customer is thinking about how to do this safely and responsibly. And it gives them that single place to go and inspect the choices being made, to make sure that the apps are being built responsibly. So that's one.
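To make that concrete, here is a minimal sketch of the kind of content safety check a developer might wire around a gen AI app's output, assuming the azure-ai-contentsafety Python SDK; the endpoint and key names are placeholders, and the exact response fields can vary by SDK version.

```python
# Minimal sketch: screen a model's output with Azure AI Content Safety
# before showing it to a user. Endpoint/key environment variables are placeholders.
import os

from azure.ai.contentsafety import ContentSafetyClient
from azure.ai.contentsafety.models import AnalyzeTextOptions
from azure.core.credentials import AzureKeyCredential

client = ContentSafetyClient(
    endpoint=os.environ["CONTENT_SAFETY_ENDPOINT"],
    credential=AzureKeyCredential(os.environ["CONTENT_SAFETY_KEY"]),
)

def is_safe(model_output: str, max_severity: int = 2) -> bool:
    """Return True if no harm category exceeds the allowed severity."""
    result = client.analyze_text(AnalyzeTextOptions(text=model_output))
    # categories_analysis holds one entry per harm category (hate, violence, ...)
    # with a numeric severity; field names follow the 1.x SDK as I recall them.
    return all(
        (item.severity or 0) <= max_severity
        for item in result.categories_analysis
    )

if __name__ == "__main__":
    print(is_safe("Here is a summary of your policy options."))
```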
We also announced GitHub Copilot for Azure. Now GitHub runs across all the major software cloud vendors, right? What we did with GitHub Copilot for Azure is we took the GitHub Copilot experience, which has been very successful in market. It is one of the world's oldest at-scale gen AI applications, period. I don't know that people remember that very often, but it's been out there for a while.
And what we did was we took the learnings that the GitHub development team had from building the Copilot experience, and we made it smarter about Azure specifically. Because as you've seen in all of the earnings reports, we are bringing in a lot of new-to-Azure customers as a result of our Azure AI offerings, and AWS was first. And therefore, there's just a little bit more understanding out there of how AWS cloud services work than Azure. And so we needed to do something to help all these new-to-Azure developers and customers go more quickly.
And so we did fine-tuning on our own Azure models inside of GitHub Copilot, which means that a developer can now just say, "@azure, I want to do blah," and it will respond to them with the way you would build that experience, that development task, in an Azure service. So it's really fast-pathing their adoption of the Azure way of doing things. And then there are so many more, but maybe one more: we announced two things around Microsoft Fabric. And we can talk about this in a second.
But for sure, at their base level, the foundational models are fantastic because they understand all the world's languages and they understand the corpus of the world's knowledge, because they've been trained on it. But they know absolutely nothing about a customer's individual business, right? And so that's where RAG and fine-tuning the models on a customer's data come in -- that's where they get to go do things like, as we like to say (I am a marketer), bend the curve of innovation. That's where these rich new experiences are being built by our customers, because the very, very smart models are now very, very smart about their business data.

And so Microsoft Fabric has been a big hit in the market since we GA'd it at Ignite, which was November of last year. And we expanded its remit at Build this year. We added real-time intelligence capability to it. So we are going after that vast stream of at-the-edge IoT data, data in motion. There has not really been a clear, dominant winner in that space. And so we are very excited about the capabilities we brought to Microsoft Fabric to go after that set of data, to help our customers do the real-time intelligence they need to do on such a large set of data. And then we announced the Workload Development Kit -- not a very exciting name, no doubt. I could probably have done better on that one.
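As a rough illustration of that grounding pattern, here is a minimal RAG-style sketch using the openai package's AzureOpenAI client; the deployment name, endpoint variables and the toy in-memory retrieval step are assumptions for illustration, not the specific setup described in the talk.

```python
# Minimal RAG-style sketch: ground a foundation model on a company's own data
# by retrieving relevant snippets and passing them in as context.
import os

from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",
)

# Stand-in for a real retriever (e.g., a vector or keyword index over company docs).
KNOWLEDGE_BASE = {
    "returns": "Our return window is 30 days with the original receipt.",
    "warranty": "All appliances carry a two-year manufacturer warranty.",
}

def retrieve(question: str) -> str:
    """Toy retrieval: pick snippets whose topic keyword appears in the question."""
    hits = [text for key, text in KNOWLEDGE_BASE.items() if key in question.lower()]
    return "\n".join(hits) or "No matching internal documents found."

def answer(question: str) -> str:
    context = retrieve(question)
    response = client.chat.completions.create(
        model="gpt-4o",  # your Azure OpenAI deployment name (placeholder)
        messages=[
            {"role": "system",
             "content": "Answer using only the provided company context."},
            {"role": "user",
             "content": f"Context:\n{context}\n\nQuestion: {question}"},
        ],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    print(answer("What is the returns policy?"))
```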
But the key takeaway on that one -- and this is one of the things that broke through to applause in Satya's keynote when he was unveiling it -- is we recognize that our customers' data lives in lots of different places. And the Fabric vision all along has been to make it easy for them to go apply reporting, analytics and AI to data wherever it lives, rather than put them through a very, very long journey of having to migrate. Of course, ultimately, we would hope that they will, but we're giving them a faster path to get started.

And so what the Workload Development Kit does is it allows our partners -- of which we have many; Microsoft has over 400,000 partners around the world, and many of them are ISVs in the data and AI space who create data-specific solutions, think SAS or Prophecy, Informatica, large names in the data space -- they can now work with the Workload Development Kit to create what will look like a native Fabric workload experience with their technology. And so what that means is customers are excited about the investments they've already made, and we want them to feel good about that. We want them to feel like they can continue to get ROI -- maybe they've invested deeply in Informatica, for example. They can now bring their Informatica data and all the experience that Informatica offers into their Fabric environment.
Question-and-Answer Session
Q - Joe Gallo
Awesome. That's a perfect summary of several days in a few minutes, so I felt like I was there. I want to follow up on, I think, the second point you made. How are customers thinking about their AI road maps and timing? And we've heard anecdotally of customers selecting Azure because of your AI lead. Are you seeing that? Is that impacting customers' cloud road maps as well?
Jessica Hawk
I think -- so for sure, every customer conversation we're having today includes AI. And we actually put together a pretty comprehensive go-to-market strategy early last year to help customers think through all of the different options, because we've heard it loud and clear. There's a lot of Copilots running around out there and this Azure AI thing seems to be hunting, what's that?
And so we were trying to just help people understand what the opportunity is. And it's evolved, for sure, since we first launched it in, I think, January or February or March of last year. But it does help customers understand where those 1P SaaS app purchase options are -- so the Copilot with a capital C. And we see a lot of focus, of course, on employee productivity. You see that with all the Microsoft Copilot customer evidence and go-to-market that we're delivering.
We see a lot of focus on call center innovation. And that can either be through our SaaS service, which is Copilot for Service, delivered through our BizApps platform, or customers are attaching the Azure AI experience to ISVs that they have brought in to deliver that customer call center experience.
And then moving past that, business process automation has gotten cool again. Suddenly, everybody is talking about how can I get my Visio diagram of my entire end-to-end flow going, and then where can I apply AI to sort of accelerate that or reduce friction. I think everybody's felt this deeply. There have never been enough people to do all of the work across the business process chain.
And so finding ways to provide a Copilot experience, or to augment the humans who already have too much to do so they can get their work done more quickly -- we're certainly seeing a lot of that. And then the most recent one is really around -- this is where I say bending the curve of innovation. Particularly since we started to bring the multimodal experiences to Azure AI, customers are building -- we GA'd the Azure OpenAI service on January 16 of last year.
And so it was first to market in many ways. GitHub Copilot came first, and then the Azure AI API service that customers can go build on came next. Then we put Copilot out there, and all the rest of the Copilots came along afterwards. And I would say initially a lot of people used it for enterprise search, which has never been great. It's never been easy to go find -- I've got to find a document, I can't remember, was it in a call or was it an e-mail or was it texted to me? So for sure, there was a lot of initial adoption of just building a better enterprise search service, but that was really just kind of the beginning.
And now with multimodal in particular, we're seeing all kinds of really exciting use cases that get into the other media formats. And I should start by saying multimodal is an industry term, which I'm going to change -- I think we can -- because it's a very confusing term, because everybody also understands that applications will use lots of different models. And I think when people hear multimodal, they think, oh, different models. Multimodal actually means multimedia.
So it's getting beyond text and getting into images and speech and video. So it's these other modalities -- that's what it's meant to convey. That's one of the reasons why I think OpenAI went with GPT-4o, to try and convey omni, like it's many, many different types of modalities. But the use cases are coming out of what happens when you can do text to speech, or when you can do image processing, or when you can create videos from scratch. Look, we all know it's much easier to edit something than to start from a blank page. And so all of these new experiences are starting to come forward. And I think particularly now that we've started to offer some of these multimodal capabilities, we're going to see more of that.
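As one concrete example of what a multimodal call can look like, here is a sketch that sends an image alongside text to a GPT-4o deployment through the AzureOpenAI client; the deployment name, endpoint variables and image URL are placeholders, not specifics from the talk.

```python
# Minimal multimodal sketch: ask a GPT-4o deployment a question about an image.
import os

from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",
)

response = client.chat.completions.create(
    model="gpt-4o",  # your multimodal deployment name (placeholder)
    messages=[
        {
            "role": "user",
            "content": [
                {"type": "text",
                 "text": "Describe any visible damage on this returned product."},
                {"type": "image_url",
                 "image_url": {"url": "https://example.com/returned-item.jpg"}},
            ],
        }
    ],
)

print(response.choices[0].message.content)
```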
Joe Gallo
Is there anything further that's a gating factor to enterprises adopting AI? Are people waiting for version 2.0, or are they waiting for more on pricing? Like, what can you guys do to kind of drive an acceleration of adoption?
Jessica Hawk
Yes. I think probably three things. Internal organizational readiness, for sure, is the number one thing that I think customers are focusing on. There's a little bit of natural trepidation in the system around what this is going to do to my role or to my job, and are we going to be able to do this in a safe and secure way? So I think people are starting to really internalize that this is a change management opportunity within the organization unlike anything before. You might say cloud computing was similar; I don't think so. Cloud computing was largely confined to one part of the org in terms of making that mental shift of releasing on-prem control.
With AI, it's really across the org. And so what I saw last year was a deep desire to learn and understand how we're thinking about responsible AI and what tooling we're creating for our customers to go do that safely. I think this year, we're seeing more understanding that it's a broad company culture change, and you need an AI champ in every organization.
I think the fact that agile software methodology came before gen AI has been extremely helpful, because it empowers customers to think, I just want to quickly do a POC and learn and adjust and move on. Those kinds of approaches bring the business and the tech team more closely together -- this has been kind of a divide in the market forever. I do think that in the gen AI projects I've seen our customers do, there's always this deeply connected business and tech function working together. So that's been really good. So I would say we're moving out of a broad concern and more into, okay, I'm starting to understand how to think about doing this, and we're actioning it. So that's number one.
The second is the data estate. In fact, at the CEO Summit that we did last year -- so this would have been May, so still pretty early in the gen AI moment but kind of past that initial "is this just another hype cycle" kind of moment -- I was doing a roundtable session with Eric Boyd, who runs the Azure AI platform, my engineering counterpart. And we had gone through our slides, and it was just kind of an informal room about this size.
And one of the CEOs of a very large global firm put his hand up and he says, "Does this mean I can cancel my data estate modernization project that I feel like has been going on forever, because the AI can just figure it all out?" And the answer is no. In fact, if anything, you should feel really, really good about the investments you've made, and keep going and go faster. And we see that with customer after customer after customer.
There's just a massive advantage in terms of latency. These apps cannot be slow. If you have to wait a long time for your response, you've lost your users. And so being colocated in the cloud -- and I do think this is some of where that interest in Azure all up is coming from, because technology teams understand this -- being colocated in Azure, where the Azure AI services are running, there's an absolute performance benefit, there are security benefits. There are all kinds of efficiencies that come from that.
And so for those who are already there, like some of our earliest customers that adopted the Azure AI service -- CarMax is a great example -- they had already done all the data estate modernization. They turned the Azure OpenAI service on, I think it was like February 5th or so, just weeks after we had made it available, because at the end of the day, it's just an API call. So they already had an app, they already had their data organized, they were already running in Azure, and they were able to flip to the Azure OpenAI service. And the time it would have taken their editorial team to review all the customer feedback reviews about [CarMax] that they're putting onto their Web site -- they estimated it at, I think it was like 11,000 days or hours, I can't remember the stat anymore -- they were able to do in just a few hours, because that's the pace at which you can move. But if the data is not all organized, if it's not modern, if it's not in the cloud, there are going to be some challenges there as well.
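For a sense of what "it's just an API call" means in a case like that, here is a toy sketch of batching customer reviews through an Azure OpenAI deployment for summarization; the deployment name, endpoint variables and review text are made up, and this is not CarMax's actual implementation.

```python
# Toy sketch: summarize a batch of customer reviews with a single chat completion call.
import os

from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",
)

reviews = [
    "Smooth pickup process, but the detailing could have been better.",
    "Great price, and the online paperwork took minutes.",
    "The car arrived with a warning light on; support resolved it quickly.",
]

def summarize(batch: list[str]) -> str:
    joined = "\n".join(f"- {r}" for r in batch)
    response = client.chat.completions.create(
        model="gpt-4o",  # your Azure OpenAI deployment name (placeholder)
        messages=[
            {"role": "system",
             "content": "Summarize these customer reviews in two sentences."},
            {"role": "user", "content": joined},
        ],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    print(summarize(reviews))
```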

Joe Gallo
Maybe just following up on data. We keep hearing more and more about Fabric. How are customers thinking about that and what's unique about that platform?
Jessica Hawk
Yes. It is fun to talk about Fabric in this room. Kendra's like, the investors are not always necessarily interested in data products, but Fabric has broken through. So thank you for your interest. I think it kind of goes back to what I said before, which was, number one, there's nothing new about the need to wrangle your data. These systems were built over the last several decades by lots of different teams in lots of different places. And every customer is dealing with some version of on-prem, in the cloud, different point players. We have some crazy slide that lists, in 6-point font, all the data providers in the world -- there are so many tools out there. And so the challenge has always been there.

And so with Fabric, I think one of the things that's truly unique about it is we said, we know your analytics needs are mission critical to the organization. Rather than put you on the two-year modernization journey to go get the outcomes you're looking for, let's give you a way to easily connect to the data wherever it lives -- which includes other clouds and certainly other partners' environments -- and give you a simple experience. It's called mirroring, where you can just create a copy, without moving the data, and run your reporting right on top of that.
And so the simplification, the time to production on these Fabric projects -- we do win wires at Microsoft: when the account team closes a deal, they send a mail and everyone's really excited. With Fabric, we added live wires, which is when they went live. And it's a SaaS experience, so it's pretty easy to get started. It's the same experience for the entire data team, because there are seven different workloads in Fabric, and these different teams in an organization are all working in the same place. And it's easy for them to go grab their data wherever it lives. And so we're proud of the 11,000 paying customers just a few months after we GA'd the service. And the pace at which they've been able to adopt the service, I think, has been pretty spectacular.
Joe Gallo
How do you think about the evolution of large language models -- will open source continue to be relevant? And then are we just going to have a handful of models that rule them all, or are there going to be more than that?
Jessica Hawk
We are very, very intentional when we answer this question or when we talk about this with customers. Because at the end of the day, Microsoft's core -- I don't know that every company lives their mission statement as deeply as we live ours. Our perspective is always to empower our customers and partners. And so we recognize that a sense of choice and selection and control is super critical to anybody making a decision about their tech platform or any individual service within it.

And so we have nearly 1,700 models in the Azure AI model catalog. So that's what lives within that Azure AI Studio I mentioned earlier. And it is a huge collection of open source models. We have a great partnership with Hugging Face that we just extended at Build last week, which Satya announced, and which gives us some of that at-scale ability to bring in new models. We partnered with a company named Hidden Layer to go apply safety scanning on those open source models, which is really interesting when you think about open source in and of itself -- customers, again, especially in our enterprise accounts, are very concerned about making sure that any tech they bring in is safe and secure. And so Hidden Layer is now scanning every single one of those nearly 1,700 models in the catalog.
And then in addition to our amazing partnership with OpenAI, we are also building our own family of models called the Phi family. This is Microsoft Research, developed in partnership with our Azure AI team and Kevin Scott's organization. And these are SLMs. So we've got LLMs, we've got SLMs. I've not seen MLM come up yet, but I'm certain it's around the corner. And so I do think it's right, and it's expected, that customers will want to continue to have that choice. We just want to make it easy for them to pick the right model, and that's really where the conversation is going. I think we're past the one model that will rule them all. It's more about what is my price point, what is my use case, what are my unique requirements, and which model is best for that. And the performance and evaluation tooling that we built into Azure AI Studio makes it very easy for developers to go in there and do those kinds of estimates and just pick the right model for the job.
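To illustrate "pick the right model for the job" in the simplest possible terms, here is a toy comparison that runs the same prompt against two hypothetical Azure OpenAI deployments (a larger model and a smaller, cheaper one) and reports latency; real evaluation in Azure AI Studio is considerably richer, and the deployment names are placeholders.

```python
# Toy model comparison: send the same prompt to two deployments and compare latency.
import os
import time

from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",
)

PROMPT = "Classify this support ticket as billing, technical, or other: 'My invoice is wrong.'"

for deployment in ["gpt-4o", "gpt-35-turbo"]:  # hypothetical deployment names
    start = time.perf_counter()
    response = client.chat.completions.create(
        model=deployment,
        messages=[{"role": "user", "content": PROMPT}],
    )
    elapsed = time.perf_counter() - start
    answer = response.choices[0].message.content
    print(f"{deployment}: {elapsed:.2f}s -> {answer!r}")
```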
Joe Gallo
Makes sense. We're out of time, but maybe if we're here in one year -- which I hope we all are -- what's one thing you're hoping that we're talking about?
Jessica Hawk
That's the fastest 25 minutes of my life. I would say it's the multi -- first of all, I hope we're calling it multimedia or something other than multimodal. So I'll get right on that. I think the world deeply understands the chat experience at this point. Getting into some of the more interesting multimedia experiences -- like we're having amazing conversations with CMOs right now, which has not necessarily been Microsoft's jam in the past, right, about what they can do in campaign development, in marketing asset creation. There's so much there that I think we're just scratching the surface of.

And then some of the more connected analytics-to-gen-AI applications. To me, that's the other big frontier that's sitting in front of us. A lot of what's been done today is based on unstructured documents, or unstructured data as we refer to it. Think product docs -- that's where that call center enablement is happening. It's been a problem for two decades: the product team changes something, they try to send mails, they try to do trainings, they try to do all these informs, and the person on the edge takes that first call. The first call is always really tough. Whereas the models can read all the product documentation that developers produce by nature of their role, so that first call can be augmented with the Copilot experience to make it a little bit easier as they're transitioning to the new version of the service, which is when the customers call. So all of those things, I think, are getting really well understood. But connecting that to analytics outcomes, and what customers can then learn from what's happening in the call center and put back into their product innovation life cycle -- I think that's another one that's going to be really exciting.
Joe Gallo
Awesome. Jessica, thank you so much for your time. Thanks, Kendra.
Jessica Hawk
Thanks, guys.