Alphabet Inc. (NASDAQ:GOOG) Goldman Sachs Communacopia & Technology Conference September 7, 2023 4:05 PM ET
Company Participants
Thomas Kurian - CEO, Google Cloud
Conference Call Participants
Eric Sheridan - Goldman Sachs
Eric Sheridan
Okay. I know we're moving from session-to-session, so appreciate it if everyone can find their seats and thanks so much for being here. For those who don't know me, my name is Eric Sheridan, I'm Goldman Sachs' US Internet Analyst, and it's my pleasure to have Thomas Kurian, the Google Cloud CEO for our next session.
Thomas Kurian joined Google Cloud as CEO in November 2018. He has deep experience in enterprise software. Prior to Google Cloud, he spent 22 years at Oracle, where most recently he was the President of Product Development.
And before Thomas comes up on stage, I will read a Safe Harbor statement. Some of the statements that Mr. Kurian may make today could be considered forward-looking. These statements involve a number of risks and uncertainties that could cause actual results to differ materially. Any forward-looking statements that Mr. Kurian makes are based on assumptions as of today and Alphabet undertakes no obligation to update them. Please refer to Alphabet's Form 10-K for a discussion of the risk factors that may affect its results.
Please join me in thanking Thomas Kurian for a presentation and fireside chat today.
Thomas Kurian
Thank you, Eric. Google Cloud is the enterprise software division of Google. We take the technology that Google's building and make it available to enterprises and governments to help them with their digital transformation.
At the foundational level, we build on a global infrastructure. We have 39 regions around the world live, many more under construction, connected over a broad network.
On top of it, we build a foundational set of AI models -- we have many generations of foundational AI models -- and we expose them to clients through five product lines: infrastructure to help them modernize their core systems; developer platforms to help them build new cloud-native applications; data analytics to help them understand data, analyze it, and manage it more efficiently; cybersecurity tools to keep their systems, users, and data safe; and collaboration to help them communicate and collaborate better.
I want to talk through a little bit of why customers choose our cloud and how, with the creation of generative AI, we're seeing increased demand for various pieces of our portfolio.
So, why do we lead in infrastructure and why do customers choose our cloud for infrastructure? When you're modernizing IT systems, you're looking for two primary things: performance and scale, which give you better cost efficiency, because you can run your applications more cost efficiently if you have a better performing and scalable system; and then from a risk point of view, you're worried about security and reliability, meaning uptime and security risks.
We're highly differentiated in a variety of areas. Just two examples. If you look at raw disk, we deliver 10 times better storage performance, measured in IOPS per disk volume, than competitors, which means you get much faster and cheaper access to disk.
From an uptime point of view, we're materially differentiated. We have two to three times fewer outage hours, which means for mission-critical systems, a lot better experience for our customers.
As generative AI came along, we've evolved our infrastructure portfolio. Remember, Google has been building large scale infrastructure for AI for over 10 years. So, we have over a 10 year head start on any competitor.
As AI matures, we see a range of needs on infrastructure. First, we offer very high performance systems for training models. Model training happens on the large-scale infrastructure that we have; over 50% of all funded AI companies are our customers on Google Cloud, and over 70% of AI unicorns use Google Cloud. In other words, those that know how to do AI understand our differentiators.
When you shift from model training to serving, the needs are quite different. For inferencing, you need a range of different types of accelerators to accelerate performance. You may have dense models or sparse models. You may be using a technique called vectorization. You may be doing embeddings to expose data from your databases.
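For illustration, here is a minimal sketch of creating embeddings for rows pulled from a database so they can be used for vector search at serving time. It assumes the Vertex AI Python SDK (google-cloud-aiplatform); the project, region, and sample rows are hypothetical.

```python
# Minimal embedding sketch; project, region, and rows are hypothetical.
import vertexai
from vertexai.language_models import TextEmbeddingModel

vertexai.init(project="my-project", location="us-central1")

model = TextEmbeddingModel.from_pretrained("textembedding-gecko@001")

rows = [
    "Order 1042: 3x industrial pump, shipped 2023-08-01",
    "Order 1043: 1x control valve, backordered",
]

embeddings = model.get_embeddings(rows)
for text, emb in zip(rows, embeddings):
    print(f"{text[:30]!r} -> {len(emb.values)}-dimensional vector")
```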
We offer 13 different types of accelerators for clients. The value is that they get the best choice of system for price performance. We also have the most diversified supply because these accelerators give us a choice of what to offer clients.
In addition to having a range of accelerators, we've made many innovations at Google over 10 years around system design. So, just saying I have an accelerator doesn't mean that it runs well. There are many elements of system design, from optical switching, how we handle dense memory configurations, what we do with network design, and even water cooling, which delivers over 30% improvements in system throughput.
And finally, as Google, we've invented many pieces of software with DeepMind and Google to parallelize training and inferencing across a large cluster of machines, which gives people way more efficiency on a per unit basis when they're running.
So, you see these results in our ability to offer two times better efficiency for training and serving models compared to the best systems of any competitor.
Developers love to build applications with Google Cloud. There are two reasons for it. First, developers love using open source tools, and we have the deepest and broadest collection of open source tools.
Second, we also give them the ability to build applications using a set of technology that can run on any environment that they have. When we say any environment, we mean on premise, on our cloud, or on any other cloud. So, in other words, they can learn once, write once, deploy anywhere, and we make money no matter where they deploy.
An example of that is a recent product we introduced called AlloyDB. It's the fastest performing relational database in the market. We run it in all four environments: our cloud, on premise, and on other clouds, and it's the only relational database that can run in any of those configurations.
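Because AlloyDB is PostgreSQL-compatible, the same client code works wherever the instance runs -- in Google Cloud or, with the downloadable AlloyDB Omni variant, on premise or elsewhere. A rough sketch with a standard PostgreSQL driver; host, credentials, and table are hypothetical.

```python
# Hypothetical connection to an AlloyDB (PostgreSQL-compatible) instance.
import psycopg2

conn = psycopg2.connect(
    host="10.0.0.5",        # hypothetical instance address
    dbname="orders",
    user="app_user",
    password="change-me",
)

with conn, conn.cursor() as cur:
    cur.execute("SELECT region, SUM(amount) FROM sales GROUP BY region;")
    for region, total in cur.fetchall():
        print(region, total)
```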
You see that in our adoption, both at the top end of the market, where a system like Spanner, for example, which is our large-scale database, scales 20 times better than the largest scalable system of any competitor. So, for the high end, we work extremely well. And also, because we've made it so easy to use, startups and small businesses are growing very quickly in their adoption of our platform.
When we introduced our AI systems, we introduced a platform called Vertex AI. Vertex is used today by 50,000 companies, and it's grown 15 times quarter-over-quarter in its adoption because of the interest in generative AI.
Vertex offers four primary capabilities. It offers the broadest collection of models. We've got over 100 models from Google, open source, and third parties like Cohere, Anthropic, and others, all available through the platform, so people have a choice of the model they want.
Vertex also gives you all the services that you really need to use and build applications with AI -- things like grounding, to ensure that you have fresh but also factual results from a model; watermarking; automation of reinforcement learning from human feedback; synthetic data generation; and ML pipelines.
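As a small illustration of the developer-facing surface, here is a minimal sketch of calling a foundation model through Vertex, assuming the 2023-era text-generation interface of the Vertex AI Python SDK (PaLM 2 "text-bison"); the project ID and prompt are hypothetical.

```python
# Minimal Vertex AI text-generation sketch; project and prompt are hypothetical.
import vertexai
from vertexai.language_models import TextGenerationModel

vertexai.init(project="my-project", location="us-central1")

model = TextGenerationModel.from_pretrained("text-bison@001")
response = model.predict(
    "List the key risks mentioned in this supplier contract: ...",
    temperature=0.2,
    max_output_tokens=256,
)
print(response.text)
```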
Google has been in the market the longest in using AI in products, and the tools that we use internally are made available to developers through Vertex to use for their own needs.
We also recognize that to use AI, you don't just want models, you need other capabilities like search, conversational AI and all those are made available through Vertex.
And finally, we add various kinds of controls in the system, so people don't have to worry about the privacy of their data or intellectual property protection, and areas around, for example, responsibility controls, so that you don't have to worry about the tone of the models.
Analysis. Many people are using AI for the purpose of analyzing data. We start from a position of real strength, because we've always said you want to keep your data in as few places as possible to be analyzed, and for many years we have consolidated all the different things that people want for analysis into one set of systems.
So, we offer a single system to analyze structured and unstructured data. A single system that is a data warehouse and a data lake. A single system that supports the ability to analyze using SQL, Python, Spark, and generative AI tools.
And you can analyze data from any SaaS application, on-premise environments, and any other cloud without needing to move the data to Google Cloud. That's led us to manage over 40 times as much data as any other data cloud provider. And we have five times more customers than other data clouds.
When generative AI came along, we've always said that we wanted to provide a digital assistant, think of it as a digital expert, to help people with their tasks for analysis. So, we created something called Duet AI, which is a real-time AI system to help you do all your tasks you need for your analysis.
What does Duet AI help you do? You can ask in natural language for the system to help you clean and prepare your data for analysis. You can ask it a question: tell me how revenue is trending. It not only tells you how it's trending, it also helps you analyze why it's trending. It can generate visualizations for you to see it. It can even write a set of slides for you, put the charts in it, and write the narration for you.
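For illustration, this is the kind of SQL such an assistant might generate behind a question like "tell me how revenue is trending," run here with the standard BigQuery Python client (google-cloud-bigquery); the project, dataset, and table names are hypothetical.

```python
# Hypothetical revenue-trend query via the BigQuery Python client.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")

sql = """
SELECT
  DATE_TRUNC(order_date, MONTH) AS month,
  SUM(revenue) AS monthly_revenue
FROM `my-project.sales.orders`
GROUP BY month
ORDER BY month
"""

for row in client.query(sql).result():
    print(row.month, row.monthly_revenue)
```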
Now, two core things with that: we already run large scale AI models in our data cloud. We're running over 300 million prediction operations a year in BigQuery, so we know how to run it super efficiently.
Additionally, because our AI platform is right in line with our data cloud, you don't need to copy data out of our environment to another cloud or to another AI system, which saves you cost in network egress and other things and can be five times more cost efficient.
We want to help customers from a security point of view and we do it in two ways. The first is if you run your systems on Google Cloud, we want to keep you secure. From that point of view, the data speaks for itself. We have far fewer security issues than competitors and so we start from a strong foundation. We also provide solutions to help an organization secure their data and systems across their enterprise, on premise, in other clouds, and on Google.
Now, if you think about it, there are four pieces that you need to solve for that purpose. First, to get the best threat intelligence on what is going on and what new threats are emerging. The reason we acquired Mandiant was for that purpose. Mandiant has the best threat intelligence in the market. They bring real expertise from the frontline handling threats.
We take that and put it into a platform for analysis. And what that platform allows you to do is answer: what security threats are emerging? Which ones are going to affect me? How is an intruder likely to come in and attack me? What do I need to take as an action to protect myself? How can I validate that I've closed the door? And the combination of Mandiant and our security platform gives us the ability to offer customers that.
When AI came along, we added, similar to Duet AI for analytics, a Duet AI for cyber. What it allows you to do is to get an AI-powered security pro sitting right next to you. It allows you to get a summary of all the threats that are emerging. It prioritizes the threats. Often, organizations are overwhelmed by the number of threats coming in, so this can help you prioritize the incidents.
It gives you what we call attack path management, which allows you to see what's the best path that an intruder is likely to come in with, and then it allows you to automate and document your controls.
Now, in macro, if you think about it, cyber is a relatively simple problem; understanding threats and threat intelligence, searching for patterns that are occurring, and the ability to lock down those patterns.
We have expertise as Google in search, and we have the world-leading threat intelligence from Mandiant. So, that combination allows us to offer material differentiation. You can see some of the analysis that's come from actual customers. Our security operations platform allows them to search four times faster to find issues than competitors. And our Duet AI tool allows them to handle security incidents by improving the productivity of security professionals by seven times.
Finally, our collaboration platform Workspace. Workspace has 10 million paying companies as customers. People choose to use Workspace because it is easy to deploy and manage. It's secure by design. It's easy for first-line workers.
First-line workers are people in a retail store, people like nurses in a hospital, pilots at an airline, concierges in a hotel -- they don't want to carry around a laptop in their backpack in order to do their work. They have native access, secure access, from these environments, and it's super easy to operate.
If you compare from a security point of view since 2019, independent studies have shown we are far, far, far, far more secure. We've also integrated AI inside Workspace since 2015. So, this is our eighth year, and we run it at scale. Just to give you a sense, in email alone, we handle 45 billion operations a quarter. So, we know how to make AI run in line extremely cost effectively.
When generative AI came along, we thought about providing people, through Duet AI for Workspace, a helper: an author that can help you write, a digital graphics artist that can help you design images and slides and music to go with it, a project manager that can help you do analysis using spreadsheets, and a meeting assistant that can record a meeting, transcribe it, summarize it, assign action items, and catch you up if you're late for meetings.
Recognizing that customers want AI through solutions that they buy, we're also working with a broad ecosystem of partners. I wanted to touch on three types of partners we're working with. Data providers: last week at our conference, we announced partnerships with Bloomberg, Dun & Bradstreet, MSCI, and others to provide cleansed, high-quality data both for customers who want to train their models with it and for those who want to use it for grounding. Grounding is validating an answer that the model gives you.
We also work with SaaS companies, Workday, SAP, Box, UKG, and many others to take our AI models and embed it in their platforms, so that they can empower the business process or business line application that they want.
And third, we're working with a broad network of system integrators. Just the five largest system integrators in the world have committed to train 150,000 people on our generative AI models, and they have trained over 100,000 since March alone.
So, how do we make -- how do we monetize generative AI? Generative AI gives us five key opportunities. The first is to win new customers and we are rapidly gaining customers who may be using another cloud, maybe still on premise and because they're choosing AI as a platform, they're switching to us given the strengths we have in AI.
It also allows us to upsell products, and there are three flavors of that. Infrastructure products: for example, our machine learning systems are sold similar to general-purpose compute, on a per virtual CPU per hour basis.
Our platform Vertex is sold with a platform fee and then there's fees for the different services, the models, grounding, et cetera. We also have the option to sell Duet to our installed base. Duet for Workspace is sold on a subscription. Duet for GCP, we are in preview. We will talk about pricing when we make it generally available. We have a very broad portfolio of products that we have an opportunity to upsell into our installed base.
Over and above that, we see two additional opportunities. One, many of the solutions being deployed are outside of IT; in marketing, for example, to do print advertising; in human resources, to automate the employee help desk; in sales and customer service response; in product design; in field service. So, we have many more opportunities outside of IT to sell our technology and we also see the rapid growth of more projects within the IT department.
All of our generative AI products, whether it's infrastructure, Vertex, our models, and Duet for Workspace are generally available. Duet for Google Cloud Platform is in preview and we said at our conference last week, we'll make it available very shortly.
As a result of our product differentiation, our go-to-market reach, we're seeing rapid growth both in existing and new customers. On average, customers use over 14 products from Google Cloud, which tells you the depth of the relationships we have with them.
We have over 100,000 partners in our partner program, giving you a sense of the scale of our ecosystem. We've expanded our go-to-market organization by more than four times in the last four years in a very disciplined way, and we are seeing growth from new customers as well as from existing customers.
Finally, we're being very focused on cost discipline as we grow. And there are four primary drivers of cost discipline: how efficient is your engineering team in building products? How efficient are you in deploying capital, particularly machines and data centers? How efficient is your go-to-market organization and salesforce? And how differentiated are your products? You see all these results in our performance.
In 2019, we were a very small organization. Four years later, we're one of the five largest enterprise software companies in the world as a standalone business. We've also grown both topline and operating income over those -- that period.
So, just in closing, the market for cloud is still in its very early stages. As we have proven, every single customer who chose Google could have chosen another cloud, because the other clouds have existed for longer than we have. So, they chose us because we have differentiated products, a strong go-to-market organization, and partners. We have strong customer momentum. And most importantly, we have a proven track record for growing both topline and operating income.
Thank you.
Question-and-Answer Session
Q - Eric Sheridan
Great. Thank you, Thomas. As Thomas makes his way over, we're going to have a fireside chat for the next 12 minutes or so. Thomas, thanks for being part of the conference. Thanks for all the information in the slides, and congratulations on a really interesting Next event last week.
Thomas Kurian
Thank you.
Eric Sheridan
Maybe to kick us off, the public cloud space and Google cloud in particular has been through a lot of evolution over the last couple of years. What is your view of where we are in terms of cloud adoption, cloud usage, and where Google Cloud fits competitively inside that world view?
Thomas Kurian
Cloud adoption is still in its early stage. If you look at all the industries, all the countries, and the need -- and many industries are being reshaped as software-powered. If you just look at the percentage of spend that's in public cloud today, it's relatively small percentage. So, there's a long way to go.
The second thing we've always said, if you look at what we see with generative AI, it's just an evolution of cloud computing. So, cloud computing was always about simplifying technology to make it accessible to everybody.
The first version was infrastructure, so that instead of having to buy data centers and stand up machines, you simply have an API or a UI to go to get compute on demand. The second step was managed services. Now, with generative AI, you can go into a chat system and say, I'm building a mobile app, I need [Indiscernible] of availability. I need less than half a second of response time, a one millisecond response time, please create a cluster for me that's Kubernetes-based, manage it for me. Think of how simple that is if you can ask it like that, a question in English or Spanish or whatever language you want and have it created. And the more we make things simpler, the wider the opportunity.
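For a concrete sense of what such a natural-language request would boil down to today, here is a rough sketch of creating a managed Kubernetes cluster through the GKE admin API, using the google-cloud-container client library; the project, region, name, and sizing are hypothetical, and a chat-driven assistant would assemble a request like this from the description.

```python
# Hypothetical GKE cluster creation via the google-cloud-container client.
from google.cloud import container_v1

client = container_v1.ClusterManagerClient()

operation = client.create_cluster(
    request={
        "parent": "projects/my-project/locations/us-central1",
        "cluster": {"name": "demo-cluster", "initial_node_count": 3},
    }
)
print("Provisioning started:", operation.name)
```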
Eric Sheridan
Understood. Okay. In your slide deck, you had a chart of the revenue growth you've seen over the last four-plus years since you took over Google Cloud. Can you look backwards first and help us understand what some of the key growth drivers were looking at that ramp in revenue trajectory?
And what are you most excited about as primary growth drivers looking forward that will allow Google Cloud to win new clients?
Thomas Kurian
I mean, the growth drivers have been relatively simple. You have to have great and differentiated products, because customers could always choose somebody else. Two, you have to have a great go-to-market organization. And third, you have to have a big partner ecosystem.
When we started, we were so small, we didn't have a partner ecosystem. We have over 100,000 partners now. When we started, we had a very small go-to-market organization. To scale a go-to-market organization to the size we are, with the discipline to grow topline and operating income during that period, obviously took a lot of work, but we have a very strong foundation.
Looking forward, I think, we've been working at Google on AI since 2004. So, that's our 20th year. There's a lot we have learned in not just building models, but building products that are enabled by AI. And you're seeing that -- most people ask us, how come you guys launched so many products at Next last week? It's because we've been working on it for a while, but we also have extraordinary expertise in doing it over the last 10 years.
Eric Sheridan
Understood. And referencing Next, you did talk a lot about generative AI last week. Can you lay out your world view of how generative AI tools will be adopted and utilized by customers and how you as an organization are sort of helping the deployment of those tools into your customer base?
Thomas Kurian
[Indiscernible], we see people using it for a couple of purposes. One is within their organization to become more efficient, for example, to give their software engineering teams models that help them code; helping their marketing organizations create advertising copy using our image model and using text generation, for example, to do print ads; automating the employee help desk. The employee help desk is where employees go to ask questions about benefits and all these things.
Looking at the procurement system, for example: find me all my contracts that have no indemnification and warranties, so that I can make sure that I have the right contracts in place. There are hundreds of these. We have over 500 use cases that customers have already solved with our generative AI platform, and we're making those available, and those are all around. So, the first thing that people are doing is asking how can I use it to become more efficient.
The second is how do I transform the way that I'm working with my customer base? And the speed of the projects are super quick. Orange, a telecommunication company, the national carrier in France, built a customer service agent in two weeks. It handles over 10,000 questions a day. And so it gives you the sense of just transforming that customer interface, transforming the way that products are created, et cetera.
Eric Sheridan
When you take a step back, how should investors think about the sizing of the opportunity for Google Cloud between a few companies needing to train foundation models versus the application layer and other AI tools and services that you referenced in your presentation?
Thomas Kurian
So, the way we think about it at the infrastructure level, we offer a range of different kinds of accelerators, think of it as another flavor of compute. It's much more differentiated because we have material differences in the way we design systems.
And system design is far more important than just the chip itself to get the right performance. Because we have a range of accelerators, we can offer the broadest choice. We're not supply constrained, because, for example, the manufacturing process for one type of chip doesn't depend on chip-on-wafer-on-substrate packaging, which is bottlenecked. But it's priced on a compute hour basis.
Then Vertex, which is our developer platform, you pay a platform fee, and then you pay a fee for every operation you do. And an operation could be calling a model, an operation could be calling the grounding service to validate your results, et cetera.
And then when you go up one more layer, you're using Duet. Duet for Workspace is priced on a per user per month basis, similar to a subscription. Now, the thing you should know is that because we have great efficiency in the way we run models, we have the ability to choose the most efficient model to answer a question. So, for instance, if you use Gmail, we have, on the mobile phone, a model that's very efficient because it runs on the phone.
When you ask, help me write, which a number of you can try, it actually writes on the phone. It doesn't go to the cloud. If you can't use -- if you can't satisfy the result locally, it'll transparently go back. But that's an example of how we're making models run in the most cost efficient way in the right place. And so every step on efficiency also helps overall for us in terms of thinking of the opportunity.
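A purely illustrative sketch of that routing idea: serve from a small on-device model when it is confident enough, and transparently fall back to a larger cloud-hosted model otherwise. Every name below is a hypothetical stand-in, not Gmail's actual implementation.

```python
# Illustrative local-first model routing; all names are hypothetical stand-ins.
from dataclasses import dataclass

@dataclass
class Draft:
    text: str
    confidence: float

def on_device_model(prompt: str) -> Draft:
    # Stand-in for a small local model: cheap and low latency.
    return Draft(text=f"[on-device draft for: {prompt}]", confidence=0.6)

def cloud_model(prompt: str) -> Draft:
    # Stand-in for a larger server-side model: better quality, higher cost.
    return Draft(text=f"[cloud draft for: {prompt}]", confidence=0.95)

def draft_reply(prompt: str, threshold: float = 0.8) -> str:
    draft = on_device_model(prompt)
    if draft.confidence >= threshold:
        return draft.text              # satisfied locally; never leaves the device
    return cloud_model(prompt).text    # transparently fall back to the cloud

print(draft_reply("Help me write a thank-you note to the team"))
```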
Eric Sheridan
Okay. If we take the theme of generative AI, what synergies do you see between generative AI and other aspects of your portfolio -- data analytics, security? How do those work together from a synergistic standpoint?
Thomas Kurian
Two elements. We've always felt that AI, in the end -- our goal was to build all the skills that humans have in an AI-powered system. And so when you look at Duet, it's a manifestation of that vision. We offer an analyst, a professional developer, a cybersecurity expert, a graphics designer, et cetera. So, the first thing: because we make it so much more efficient for people to do analysis, people are willing to pay for that productivity benefit.
Second, because we're making it much easier to use, many more people will use the system. For example, very few people outside of professional analysts feel comfortable going to a reporting or dashboarding system. CEOs certainly don't go to that to do get the numbers.
But if a CEO could type in, tell me how revenue is trending, and give me the answer, not only on what's happening, but why it's happening, the seat count in terms of number of people using it will also grow because you're making it available to many more people. So, in addition to capturing the productivity benefit, we also see it widening the aperture of users within these accounts.
Eric Sheridan
Okay. When you were here a year ago, you were in the midst of the Mandiant acquisition, we spoke about that a year ago. Can you talk about your focus on security and how you think that differentiates Google Cloud and bring us up to speed about how Mandiant fits into that broader security strategy?
Thomas Kurian
As I said, two basic things. If you're a security provider, you've got to keep your house clean first. Why would anybody trust you as a security provider if you don't know how to secure your own cloud? And whether that's in Workspace or Google Cloud Platform, the numbers speak for themselves.
Now, when we acquired Mandiant, we said the rate at which threats are growing and the sophistication of the threat actors -- even prior to AI, but now being armed with AI -- requires you to have three basic elements: one, the best threat intelligence, which is why we acquired Mandiant; second, a super scalable search and analytics platform; and then the addition of AI to help you prioritize and analyze and remediate.
Now, when we talk to customers, for example, they've invested in lots of cybersecurity tools. When you turn on alerting, one customer showed us they had 15,000 alerts. We put Duet on it, it was able to find the two that are the most likely threat vector. And so it helps them prioritize and act and as a result keep themselves more secure.
So, that was the rationale behind Mandiant. We always knew that we were going to integrate it with our security operations platform and our AI capability to offer a differentiated cybersecurity offering.
Eric Sheridan
I know we're coming to the end of our time, but if I could just squeeze in one more question, I think a key topic for investors is custom silicon. Can you talk about your strategy to develop custom silicon? How should we think about the journey you're on, from where it is today to where it might go long-term? And talk a little bit about the benefits of offering both TPUs and GPUs to customers?
Thomas Kurian
So, we've been working on AI systems in production environments for over 10 years. And huge environments, search, email, et cetera. So, we've always known that to accelerate performance, lower cost, you'll need a range of kinds of accelerators and it's not about TPU or GPU, but a variety of different kinds of accelerators.
So, we started that, and there are many, many elements of system design that we have within our systems. For example, if you're running an application to do inferencing but accessing a vector, we have optical switching that dynamically translates that and can find the right amount of memory.
The way we do floating point is materially different. And we also allow you to train on a TPU and serve on a GPU, for example. So, that gives us much more efficient systems overall, which means that we can be more efficient in serving workload. It gives us flexibility and supply, and it allows us to offer not just customers, but our own internal needs with the most cost efficient capital utilization.
Eric Sheridan
Understood. Well, Thomas, thanks so much for making yourself available, to have another conversation this year. Please join me in thanking Thomas Kurian for being part of the conference this year.