Seemingly overnight, Amazon has moved to overtake its rivals.
While the world's major technology giants have been loudly embracing today's most popular large models and AI-generated content (AIGC), Amazon has given only one impression: staying out of sight.
Although AWS has been supplying machine learning compute to large-model stars such as Hugging Face and Stability AI, Amazon rarely discloses details of those partnerships. Some observers have counted that in recent earnings calls, Amazon mentioned AI almost zero times.
But now, Amazon’s attitude has changed dramatically.
On April 13, Amazon CEO Andy Jassy released the 2022 annual shareholder letter, expressing confidence that Amazon can control costs while continuing to invest in new growth areas. In the letter, he said Amazon will invest heavily in the currently booming fields of large language models (LLMs) and generative artificial intelligence (AI).
Over the past few decades, Amazon has used machine learning in a variety of applications, Jassy said. The company is now developing its own large language model, which has the potential to improve "almost any customer experience."
Almost before he had finished speaking, Amazon's large models and services were unveiled.
"Most companies want to use large language models, but truly useful language models require billions of dollars and years to develop. Training, people don't want to go through that," Andy Jassy said. "So they're looking to take an already huge base model and then be able to customize it for their own purposes. That's Bedrock."
Yes, Amazon's version of ChatGPT is part of its cloud services.
In its latest announcement, AWS introduced a new set of models — collectively called “Amazon Titan.”
The Titan family comprises two types of models: a text model for content generation, and an embedding model that creates vector embeddings for building efficient search and other functions.
The text generation model is similar in purpose to OpenAI's GPT-4 (though not necessarily comparable in performance) and can handle tasks such as writing blog posts and emails, summarizing documents, and extracting information from databases. The embedding model translates textual inputs (such as words and phrases) into numerical representations, called embeddings, that capture the semantics of the text.
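To make the embedding idea concrete, here is a minimal Python sketch of how vector embeddings support semantic search. The texts, vectors, and their values are invented for illustration; real embeddings from a model such as Titan would have hundreds or thousands of dimensions and would be returned by an API call rather than written by hand.

```python
# Minimal sketch: ranking documents by semantic similarity to a query.
# The embedding values below are made up for illustration only.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Higher values mean the two texts are semantically closer."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Pretend these 4-dimensional vectors came from an embedding model
# (real embeddings typically have hundreds or thousands of dimensions).
documents = {
    "How do I reset my password?":     np.array([0.91, 0.10, 0.05, 0.02]),
    "Steps to recover account access": np.array([0.88, 0.15, 0.07, 0.01]),
    "Today's lunch menu":              np.array([0.03, 0.02, 0.95, 0.10]),
}
query_embedding = np.array([0.90, 0.12, 0.06, 0.02])  # e.g. "forgot my login"

# The two password-related documents score far higher than the unrelated one.
for text, emb in sorted(documents.items(),
                        key=lambda kv: cosine_similarity(query_embedding, kv[1]),
                        reverse=True):
    print(f"{cosine_similarity(query_embedding, emb):.3f}  {text}")
```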
ChatGPT, and the Microsoft Bing chatbot built on OpenAI's language models, sometimes produce inaccurate information due to a behavior called "hallucination," in which the output looks very convincing but is not actually grounded in the training data.
AWS Vice President Bratin Saha told CNBC that Amazon “cares deeply” about accuracy and making sure its Titan model produces high-quality responses.
Customers will be able to customize Titan models with their own data. But Swami Sivasubramanian, another AWS vice president, said that data will never be used to train the Titan models themselves, ensuring that other customers, including competitors, do not end up benefiting from it.
Sivasubramanian and Saha declined to discuss the size of the Titan models or identify the data Amazon used to train them, and Saha would not describe the process Amazon follows to remove problematic content from the training data.
The release of the Titan models is in fact part of Amazon's "Bedrock" plan. As the world's largest cloud infrastructure provider, Amazon is clearly not going to cede such a fast-growing field to rivals like Google and Microsoft.
Bedrock comes a month after OpenAI released GPT-4. Microsoft has invested billions of dollars in OpenAI and supplies it with computing power through its Azure cloud service, which makes this some of the strongest competition Amazon's AWS business has ever faced.
The Bedrock cloud service offers access to models similar to the engine behind ChatGPT, the chatbot from Microsoft-backed startup OpenAI. Amazon Web Services will provide access to models such as Titan through Bedrock, its generative AI service.
The initial set of foundation models supported by the service also includes models from AI21 Labs, Anthropic, and Stability AI, alongside Amazon's own new Titan family. Bedrock's debut builds on the partnerships AWS has struck with generative AI startups over the past few months.
A key benefit of Bedrock is that it integrates with the rest of the AWS cloud platform, which means organizations can more easily reach data stored in the Amazon S3 object storage service and benefit from AWS access control and governance policies.
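As a rough sketch of what calling such a service could look like from code, the snippet below assumes access through the AWS SDK for Python (boto3) with a runtime client and a Titan text model identifier. Because Bedrock was still in limited preview at the time, the client name, model ID, and request schema shown here are assumptions, not documented API.

```python
# Hedged sketch of invoking a text-generation model through Bedrock via boto3.
# The client name, model identifier, and request body below are assumptions
# based on how other AWS services are typically invoked.
import json
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

request_body = {
    "inputText": "Summarize the key benefits of vector search in two sentences.",
    "textGenerationConfig": {"maxTokenCount": 200, "temperature": 0.5},
}

response = bedrock.invoke_model(
    modelId="amazon.titan-text-express-v1",  # hypothetical Titan model ID
    contentType="application/json",
    accept="application/json",
    body=json.dumps(request_body),
)

result = json.loads(response["body"].read())
print(result)  # the generated text is returned in the JSON payload
```

Because the call goes through the standard AWS SDK, the same IAM credentials and governance policies that protect other AWS resources would apply to it, which is the integration benefit described above.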
Amazon is not yet revealing how much the Bedrock service will cost because it is still in a limited preview; a spokesperson said customers can add themselves to a waiting list. Microsoft and OpenAI have already announced prices for using GPT-4, starting at a few cents per 1,000 tokens, with one token equivalent to roughly four English characters, while Google has yet to announce pricing for its PaLM language model.
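For a back-of-the-envelope sense of what per-token pricing means, the short calculation below uses a purely illustrative rate of $0.03 per 1,000 tokens and the rough four-characters-per-token rule of thumb mentioned above; it is not a quote of any vendor's actual price list.

```python
# Back-of-the-envelope token cost estimate. The rate is illustrative only;
# actual GPT-4 and Bedrock pricing varies by model and tier.
PRICE_PER_1K_TOKENS = 0.03   # assumed rate in USD, for illustration
CHARS_PER_TOKEN = 4          # rough rule of thumb for English text

def estimate_cost(text: str) -> float:
    tokens = len(text) / CHARS_PER_TOKEN
    return tokens / 1000 * PRICE_PER_1K_TOKENS

document = "x" * 20_000      # a ~20,000-character document (~5,000 tokens)
print(f"~${estimate_cost(document):.2f}")  # roughly $0.15 at the assumed rate
```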
We know that programming will be one of the areas where generative AI technology will be rapidly applied. Today, software developers spend a lot of time writing fairly plain and undifferentiated code, and a lot of time learning complex new tools and techniques that are always evolving. As a result, developers have very little time to actually develop innovative features and services.
To cope, developers try copying code snippets from the internet and modifying them, but they may inadvertently copy broken code or code with security flaws, and the search-and-copy routine eats into time they could spend building the product itself.
Generative AI can take much of this drudgery off their hands by "writing" the largely undifferentiated code, letting developers write code faster and leaving them more time to focus on more creative programming work.
In 2022, Amazon announced a preview of Amazon CodeWhisperer, an AI coding assistant that uses an underlying foundation model to generate code suggestions in real time, based on developers' natural-language comments and the existing code in the IDE, to improve productivity. The preview drew an enthusiastic response: compared with developers who did not use the assistant, users completed tasks 57% faster on average and were 27% more likely to complete them successfully.
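In practice the interaction looks roughly like this: the developer writes a natural-language comment (and perhaps a signature), and the assistant proposes a completion inline. The completion below is hand-written to illustrate the workflow, not actual CodeWhisperer output, and the CSV column names are made up.

```python
# The developer types a comment describing the intent:
# "Function that reads a CSV of orders and returns total revenue per customer"

# ...and the assistant suggests a completion along these lines (hand-written
# here for illustration; the columns "customer_id" and "amount" are invented).
import csv
from collections import defaultdict

def revenue_per_customer(csv_path: str) -> dict[str, float]:
    totals: defaultdict[str, float] = defaultdict(float)
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            totals[row["customer_id"]] += float(row["amount"])
    return dict(totals)
```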
Now Amazon has announced that CodeWhisperer is generally available and free for all individual users, with no qualification requirements or limits on usage time, and it includes reference tracking and 50 security scans per month. Users only need to register with an email address; no Amazon cloud account is required. Enterprise customers can choose the Professional tier, which adds more advanced administration features.
In addition to Python, Java, JavaScript, TypeScript, and C#, CodeWhisperer has added support for Go, Kotlin, Rust, PHP, and SQL, bringing support to 10 development languages. Developers can access CodeWhisperer through the Amazon Toolkit plug-in in integrated development environments such as VS Code, IntelliJ IDEA, and Amazon Cloud9, and it can also be used in the Amazon Lambda console.
Amazon says that in addition to being trained on billions of lines of public code, CodeWhisperer is also trained on Amazon's own code, which the company says makes it currently the most accurate, fastest, and most secure way to generate code for Amazon cloud services, including Amazon EC2 and others.
Code generated by AI coding assistants can contain hidden security vulnerabilities, so CodeWhisperer provides built-in security scanning (implemented with automated reasoning) and is the only coding assistant to do so. The feature finds hard-to-detect vulnerabilities and recommends remediations, such as issues in the OWASP (Open Web Application Security Project) Top 10 and code that does not follow cryptographic library best practices.
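To illustrate the kind of problem such a scan is designed to catch (this is a hand-written example, not output from CodeWhisperer's scanner), consider a query built with string formatting versus its parameterized remediation:

```python
# Hand-written illustration of an OWASP-style injection risk and its fix;
# not output from CodeWhisperer's security scan.

def find_user_unsafe(cursor, username: str):
    # Interpolating user input directly into SQL makes the query injectable.
    cursor.execute(f"SELECT * FROM users WHERE name = '{username}'")
    return cursor.fetchall()

def find_user_safe(cursor, username: str):
    # Parameterized queries let the database driver escape the input safely.
    # (Placeholder syntax varies by driver: %s for psycopg2, ? for sqlite3.)
    cursor.execute("SELECT * FROM users WHERE name = ?", (username,))
    return cursor.fetchall()
```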
Additionally, to help developers code responsibly, CodeWhisperer filters out code suggestions that might be considered biased or unfair. And because customers may need to cite open source projects or obtain licenses to use their code, CodeWhisperer is also the only coding assistant that can filter and flag suggestions that resemble open source code.
Amazon has been working in AI for more than 20 years, and AWS already has more than 100,000 AI customers. Sivasubramanian said Amazon has been using a tweaked version of Titan to serve search results on its retail homepage.
Still, Amazon is just one of many large companies to launch generative AI capabilities after ChatGPT emerged and took off. Expedia, HubSpot, Paylocity, and Spotify have all committed to integrating OpenAI's technology, while Amazon has not. "We always act when everything is ready, and all the technology is already there," Sivasubramanian said. Amazon wants to ensure Bedrock is easy to use and cost-effective, thanks in part to its use of custom AI processors.
Companies such as C3.ai, Pegasystems, and Salesforce are already preparing to adopt Amazon Bedrock.