Amazon Forms Dedicated Team to Train Ambitious AI Model 'Olympus'
Updated: Jan 8
Amazon is making a significant investment in training a large language model (LLM) called 'Olympus' in an effort to compete with top models from OpenAI and Alphabet, according to sources familiar with the matter.
Olympus is said to have 2 trillion parameters, which would make it one of the largest models being trained. OpenAI's GPT-4, considered one of the best models available, is reported to have one trillion parameters.
The project, led by Rohit Prasad, former head of Alexa, aims to unite AI efforts across Amazon by bringing in researchers from the Alexa AI and Amazon science teams. Prasad, now the head scientist of artificial general intelligence (AGI) at Amazon, reports directly to CEO Andy Jassy. The team's goal is to develop homegrown models that can enhance Amazon's offerings on its cloud computing platform, Amazon Web Services (AWS).
Amazon has already trained smaller models like Titan and has partnered with AI model startups such as Anthropic and AI21 Labs to offer their models to AWS users. The company believes that having its own high-performing models will attract enterprise clients who seek top-performing AI models on AWS.
Training larger AI models like Olympus requires far more computing power and is correspondingly more expensive. Nonetheless, Amazon executives have expressed their commitment to increasing investment in LLMs and generative AI, while reducing investment in fulfillment and transportation in the retail business.
While Amazon has not provided a specific timeline for the release of Olympus, the formation of a dedicated team and the investment in training this ambitious AI model demonstrate the company's determination to compete in the AI space.
Key points:
- Amazon is investing in training an ambitious AI model called 'Olympus' with 2 trillion parameters.
- The project is led by Rohit Prasad, former head of Alexa, who now reports directly to CEO Andy Jassy.
- The team aims to develop homegrown models to enhance Amazon's offerings on AWS.
Source: REUTERS