### **Attendees learn how generative AI is transforming organizations across all kinds of industries and applications.**
In his keynote speech, Swami Sivasubramanian, vice president of Database, Analytics, and Machine Learning at Amazon Web Services (AWS), said he expects that AWS services and capabilities will democratize the use of generative artificial intelligence (generative AI)—broadening access for all types of customers, across all lines of business—from engineering to marketing to customer service to finance and sales.
“Generative AI has captured our imaginations,” Sivasubramanian said. “This technology has reached its tipping point.”
![image.png](https://dev-media.amazoncloud.cn/d036fd4e89d440a9ab796f9b1cf3aca5_image.png "image.png")
Swami Sivasubramanian, vice president of Database, Analytics, and Machine Learning at Amazon Web Services
[What is generative AI?](https://www.aboutamazon.com/news/aws/generative-ai-is-the-future?trk=cndc-detail) It’s a type of [machine learning](https://aws.amazon.com/what-is/machine-learning/?trk=cndc-detail) (ML) powered by ultra-large models, including large language models (LLMs). These models are pre-trained on a vast amount of data and are known as “foundation models” (FMs).
Generative AI will help improve experiences for customers as they interact with virtual assistants, intelligent customer contact centers, and personalized shopping services. An employee might see their productivity boosted by generative AI–powered conversational search, text summarization, or code generation tools. Business operations will improve with intelligent document processing or quality controls built with generative AI. And customers will be able to use generative AI to turbocharge the production of all types of creative content.
![image.png](https://dev-media.amazoncloud.cn/7e32b0bd14b84c0f8e292307f939e9a7_image.png "image.png")
![image.png](https://dev-media.amazoncloud.cn/b37a9362c5d44ae2aa410383e931e65f_image.png "image.png")
![image.png](https://dev-media.amazoncloud.cn/adb70b4d7d1f480195359dd3611c29a0_image.png "image.png")
Sivasubramanian underscored how all this value from generative AI will be unlocked with AWS, and how AWS customers will bring these AI-powered experiences to life.
First, model choice will be paramount. No one model will rule them all. Rather, organizations will need to be able to choose the right model for the right job. Then, customers will need to be able to securely customize these models with their own data. For example, an advertising company may want to fine-tune a model by showing it the company’s top-performing ad copy, while an online retailer may want to give the model access to its inventory details so it can pull up the right information when a customer asks a question.
[![image.png](https://dev-media.amazoncloud.cn/5c186a609f4d4a438cef874b2e848fed_image.png "image.png")](https://www.aboutamazon.com/news/aws/aws-generative-ai-innovation-center?trk=cndc-detail)
Easy-to-use tools are also a key part of democratizing AI within organizations, along with the ability to deliver low-cost, low-latency responses, thanks to purpose-built ML infrastructure. Much of this innovation will be built with [Amazon Bedrock](https://www.aboutamazon.com/news/aws/aws-amazon-bedrock-generative-ai-service?trk=cndc-detail), a service offered by AWS that helps organizations of any size, in any industry, easily build and scale their own generative AI applications. It does this by giving customers access to a wide range of FMs through a simple API and by making it easy to customize those models with their existing data stores.
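To give a concrete flavor of that API, here is a minimal sketch of calling a text model through the Bedrock runtime with the AWS SDK for Python (boto3). The Region, model ID, and request format are illustrative assumptions; each provider on Bedrock defines its own prompt schema and parameters.

```python
import json

import boto3

# Bedrock exposes many foundation models behind one runtime API.
# The Region and model ID below are assumptions; swap in whichever
# Bedrock-hosted model your account has access to.
bedrock_runtime = boto3.client("bedrock-runtime", region_name="us-east-1")

body = json.dumps({
    "prompt": "\n\nHuman: Summarize why managed foundation models matter.\n\nAssistant:",
    "max_tokens_to_sample": 300,
})

response = bedrock_runtime.invoke_model(
    modelId="anthropic.claude-v2",      # any available FM ID works here
    contentType="application/json",
    accept="application/json",
    body=body,
)

# The response body is a stream; Claude-style models return a "completion" field.
print(json.loads(response["body"].read())["completion"])
```

Swapping models is largely a matter of changing the `modelId` and adjusting the request body to that provider’s schema, which is what makes experimenting with different FMs inexpensive.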
Amazon has been developing AI and ML technology for more than 25 years, and recent ML innovations have made the capabilities of generative AI possible. **Here are seven generative AI updates announced at the AWS Summit in New York.**
### 1. AWS expands Amazon Bedrock with a new model provider and additional FMs
Since model choice is paramount, [Amazon Bedrock is expanding to include Cohere as an FM provider, along with the latest FMs from Anthropic and Stability AI](https://press.aboutamazon.com/2023/7/aws-expands-amazon-bedrock-with-additional-foundation-models-new-model-provider-and-advanced-capability-to-help-customers-build-generative-ai-applications?trk=cndc-detail). Cohere will add its flagship text generation model, Command, as well as its multilingual text understanding model, Cohere Embed. Additionally, Anthropic has brought Claude 2, the latest version of its language model, to [Amazon Bedrock](https://aws.amazon.com/cn/bedrock/?trk=cndc-detail), and Stability AI announced it will release the latest version of Stable Diffusion, SDXL 1.0, which produces improved image and composition detail for more realistic creations in film, television, music, and instructional videos. These FMs join AWS’s existing offerings on [Amazon Bedrock](https://aws.amazon.com/cn/bedrock/?trk=cndc-detail), including models from AI21 Labs and Amazon, helping to meet customers where they are on their machine learning journey with a broad and deep set of AI and ML resources for builders of all levels of expertise.
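Because the catalog keeps growing, a first practical step is often simply listing which FMs are available to your account. Below is a minimal sketch using the Bedrock control-plane client in boto3; the Region is an assumption, and the models returned depend on what your account has been granted access to.

```python
import boto3

# The "bedrock" control-plane client (distinct from "bedrock-runtime")
# lists the foundation models available in the chosen Region, including
# models from Anthropic, Cohere, Stability AI, AI21 Labs, and Amazon.
bedrock = boto3.client("bedrock", region_name="us-east-1")

for model in bedrock.list_foundation_models()["modelSummaries"]:
    print(f'{model["providerName"]}: {model["modelId"]}')
```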
### 2. Customers can now create agents for Amazon Bedrock that automate complex tasks and deliver customized, up-to-date answers for their applications based on proprietary data
![image.png](https://dev-media.amazoncloud.cn/f1fc840d78954cbda56d3d5560ef1f13_image.png "image.png")
While FMs are incredibly powerful on their own for a wide range of tasks, like summarization, they need additional programming to execute more complex requests. For example, they don’t have access to company data, like the latest inventory information, and they can’t automatically access internal APIs. Developers spend hours writing code to overcome these challenges. With just a few clicks, [agents for Amazon Bedrock will automatically break down tasks and create an orchestration plan, without any manual coding, making it easier for developers to build generative AI applications](https://aws.amazon.com/blogs/aws/preview-enable-foundation-models-to-complete-tasks-with-agents-for-amazon-bedrock/?trk=cndc-detail). To service a request like “I want to exchange these black shoes for a brown pair instead,” the agent securely connects to company data, automatically converts it into a machine-readable format, provides the FM with the relevant information, and then calls the right set of APIs to complete the exchange, as the sketch below illustrates.
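For a sense of how an application hands a request like the shoe exchange to an agent, here is a minimal sketch using the Bedrock agent runtime client in boto3. The agent ID and alias ID are placeholders for values created when the agent is configured (the feature was announced in preview, so details may differ), and the streaming-response handling is simplified.

```python
import uuid

import boto3

# The agent runtime client forwards a natural-language request to a
# pre-configured agent, which plans the steps and calls the company
# data sources and APIs attached to it.
agent_runtime = boto3.client("bedrock-agent-runtime", region_name="us-east-1")

response = agent_runtime.invoke_agent(
    agentId="AGENT_ID",             # placeholder: assigned when the agent is created
    agentAliasId="AGENT_ALIAS_ID",  # placeholder: points at a deployed agent version
    sessionId=str(uuid.uuid4()),    # keeps multi-turn context together
    inputText="I want to exchange these black shoes for a brown pair instead",
)

# The agent's answer arrives as an event stream of text chunks.
answer = "".join(
    event["chunk"]["bytes"].decode("utf-8")
    for event in response["completion"]
    if "chunk" in event
)
print(answer)
```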
### 3. Vector engine support for Amazon OpenSearch Serverless gives customers a simpler way to leverage vectors for search
Vector embeddings allow machines to understand relationships across text, images, audio, and video content in a format that’s digestible for ML—making everything from online product recommendations to smarter search results work. Now, with [vector engine support for Amazon OpenSearch Serverless](https://aws.amazon.com/blogs/big-data/introducing-the-vector-engine-for-amazon-opensearch-serverless-now-in-preview/?trk=cndc-detail), developers will have a simple, scalable, and high-performing solution to build ML-augmented search experiences and generative AI applications without having to manage a vector database infrastructure.
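As an illustration of what skipping the infrastructure work looks like from the developer’s side, here is a minimal sketch of creating a vector index and running a nearest-neighbor query against an OpenSearch Serverless vector collection with the opensearch-py client. The collection endpoint, index name, field names, and the tiny four-dimensional embeddings are placeholders; real embeddings would come from an embedding model of your choice.

```python
import boto3
from opensearchpy import AWSV4SignerAuth, OpenSearch, RequestsHttpConnection

# Sign requests for an OpenSearch Serverless collection (service name "aoss").
# The endpoint host and Region are placeholders for your own collection.
credentials = boto3.Session().get_credentials()
auth = AWSV4SignerAuth(credentials, "us-east-1", "aoss")
client = OpenSearch(
    hosts=[{"host": "my-collection-id.us-east-1.aoss.amazonaws.com", "port": 443}],
    http_auth=auth,
    use_ssl=True,
    connection_class=RequestsHttpConnection,
)

# Create an index with a k-NN vector field; the dimension must match
# the embedding model you use.
client.indices.create(
    index="products",
    body={
        "settings": {"index.knn": True},
        "mappings": {
            "properties": {
                "title": {"type": "text"},
                "embedding": {
                    "type": "knn_vector",
                    "dimension": 4,
                    "method": {"name": "hnsw", "engine": "faiss", "space_type": "l2"},
                },
            }
        },
    },
)

# Index a document, then search for the nearest neighbors of a query embedding.
# (In Serverless collections, new documents can take a moment to become searchable.)
client.index(index="products", body={"title": "trail running shoes",
                                     "embedding": [0.1, 0.3, 0.2, 0.9]})
results = client.search(index="products", body={
    "size": 3,
    "query": {"knn": {"embedding": {"vector": [0.1, 0.2, 0.3, 0.8], "k": 3}}},
})
for hit in results["hits"]["hits"]:
    print(hit["_source"]["title"], hit["_score"])
```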
### 4. Generative business intelligence (BI) in Amazon QuickSight answers natural language questions, making insights more accessible
[Amazon QuickSight](https://aws.amazon.com/cn/quicksight/?trk=cndc-detail) is a unified business intelligence service that helps organizations’ employees easily find answers to questions about their data. Now, QuickSight is combining its existing ML innovations with new LLM capabilities available through [Amazon Bedrock](https://aws.amazon.com/cn/bedrock/?trk=cndc-detail) to provide generative AI capabilities, called [generative BI](https://aws.amazon.com/blogs/business-intelligence/announcing-generative-bi-capabilities-in-amazon-quicksight/). These capabilities will help break down data silos, making it even easier to collaborate on data across an organization and speeding up data-driven decision-making. Using everyday natural language prompts, analysts will be able to author or fine-tune dashboards, and business users will be able to share insights with compelling visuals within seconds.
### 5. AWS HealthScribe will use generative AI to ease the paperwork burden for health care professionals, giving them more time for patients
![image.png](https://dev-media.amazoncloud.cn/b16b02230dac4e709a11cd31a1d099c5_image.png "image.png")
Updating electronic health records is one of the most cumbersome tasks for doctors and nurses. Clinicians will find relief when [this HIPAA-eligible service empowers health care software vendors to more easily build clinical applications that leverage generative AI](https://press.aboutamazon.com/2023/7/aws-announces-aws-healthscribe-a-new-generative-ai-powered-service-that-automatically-creates-clinical-documentation?trk=cndc-detail). HealthScribe uses speech recognition and [Amazon Bedrock](https://aws.amazon.com/cn/bedrock/?trk=cndc-detail)–powered generative AI to create transcripts and generate easy-to-review clinical notes, with built-in security and privacy features designed to protect sensitive patient data.
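For builders curious what this looks like programmatically, here is a heavily hedged sketch of starting a HealthScribe job through the Amazon Transcribe API in boto3. The service was announced in preview, so the operation and parameter names here reflect assumptions about the API rather than confirmed behavior, and every job name, bucket, and role ARN is a placeholder.

```python
import boto3

# HealthScribe jobs are assumed to be started through the Amazon Transcribe client.
# All identifiers below are placeholders for resources in your own account.
transcribe = boto3.client("transcribe", region_name="us-east-1")

transcribe.start_medical_scribe_job(
    MedicalScribeJobName="visit-001",
    Media={"MediaFileUri": "s3://my-bucket/consultations/visit-001.wav"},
    OutputBucketName="my-healthscribe-output-bucket",
    DataAccessRoleArn="arn:aws:iam::123456789012:role/HealthScribeDataAccessRole",
    Settings={
        "ShowSpeakerLabels": True,   # distinguish clinician and patient speech
        "MaxSpeakerLabels": 2,
    },
)

# The job runs asynchronously and writes a transcript plus draft clinical
# notes to the output bucket; poll its status until it completes.
job = transcribe.get_medical_scribe_job(MedicalScribeJobName="visit-001")
print(job["MedicalScribeJob"]["MedicalScribeJobStatus"])
```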
### 6. New Amazon Elastic Compute Cloud (Amazon EC2) P5 instances harness NVIDIA H100 graphics processing units (GPUs) for accelerating generative AI training and inference
These [Amazon EC2 P5](https://aws.amazon.com/blogs/aws/new-amazon-ec2-p5-instances-powered-by-nvidia-h100-tensor-core-gpus-for-accelerating-generative-ai-and-hpc-applications/?trk=cndc-detail) instances—now generally available—are powered by NVIDIA H100 Tensor Core GPUs, which are optimized for training LLMs and developing generative AI applications. (An “instance” in cloud lingo is virtual access to a compute resource—in this case, compute powered by H100 GPUs.) AWS is the first leading cloud provider to make NVIDIA’s highly sought-after H100 GPUs generally available in production. These instances are ideal for training and running inference for the increasingly complex LLMs and compute-intensive generative AI applications, including question answering, code generation, video and image generation, speech recognition, and more. With access to H100 GPUs, customers will be able to create their own LLMs and FMs on AWS faster than ever.
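As a concrete example, requesting one of these instances uses the same EC2 launch call as any other instance type. A minimal sketch with boto3 follows; the AMI ID, key pair, and subnet are placeholders, and launching a p5.48xlarge assumes your account has quota for P5 capacity in the chosen Region.

```python
import boto3

# Launch a single P5 instance (a p5.48xlarge carries 8 NVIDIA H100 GPUs).
# All identifiers below are placeholders for resources in your own account.
ec2 = boto3.client("ec2", region_name="us-east-1")

response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",   # e.g., a Deep Learning AMI with GPU drivers
    InstanceType="p5.48xlarge",
    MinCount=1,
    MaxCount=1,
    KeyName="my-key-pair",             # placeholder key pair for SSH access
    SubnetId="subnet-0123456789abcdef0",
)

print(response["Instances"][0]["InstanceId"])
```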
### 7. AWS offers seven free and low-cost skills training courses to help you use generative AI
![image.png](https://dev-media.amazoncloud.cn/0f7a7d0490d54924a31324e47f95058b_image.png "image.png")
More than 75% of organizations plan to adopt big data, cloud computing, and AI in the next five years, according to the [World Economic Forum](https://www.weforum.org/reports/the-future-of-jobs-report-2023?trk=cndc-detail). To help people train for the AI and ML jobs of the future, AWS released [on-demand skills training to support those who want to understand, implement, and begin using generative AI](https://www.aboutamazon.com/news/aws/7-free-and-low-cost-aws-courses-that-can-help-you-use-generative-ai?trk=cndc-detail). Amazon has designed training courses specifically for developers who want to use [Amazon CodeWhisperer](https://aws.amazon.com/cn/codewhisperer/?trk=cndc-detail), engineers and data scientists who want to leverage generative AI by training and deploying FMs, executives seeking to understand how generative AI can address their business challenges, and AWS Partners helping their customers harness generative AI’s potential.
[![image.png](https://dev-media.amazoncloud.cn/2228638012694643a8fb433a9e6b719c_image.png "image.png")](https://www.aboutamazon.com/news/aws/7-free-and-low-cost-aws-courses-that-can-help-you-use-generative-ai?trk=cndc-detail)
Learn more about [AWS innovations in generative AI](https://aws.amazon.com/generative-ai/?trk=cndc-detail).
Access more [AWS Summit New York highlights](https://aws.amazon.com/events/summits/new-york/?trk=cndc-detail).
Source: https://www.aboutamazon.com/news/aws/aws-summit-new-york-generative-ai?trk=cndc-detail
Click here for the [**Chinese version**](https://dev.amazoncloud.cn/column/article/64c212f769c6a22f966ac5fa) of this article.