Produced by | 51CTO Technology Stack (WeChat ID: blog51cto)
At VivaTech, the annual technology conference for startups held in Paris, Yann LeCun, Meta’s Chief AI Scientist, suggested that students who want to work in the AI ecosystem should not work on LLMs (large language models).
“If you are interested in building the next generation of AI systems, you don’t need to work on LLMs. That is in the hands of big companies, and there is nothing you can contribute,” LeCun said at the conference.
He also said that people should develop next-generation AI systems that can overcome the limitations of large language models.
Interestingly, discussion of alternatives to LLMs has been going on for some time. Recently, Mufeed VH, the young founder of Devika (an alternative to Devin), talked about why people should move away from the Transformer model and start building new architectures.
While everyone is building the same thing, he argued, focusing on a different architecture, such as RWKV (an RNN-based architecture), could be very beneficial. Mufeed pointed to the infinite context window and reasoning capabilities of this particular architecture, and he believes that with this approach it might even be possible to build something as impressive as GPT-4.
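The appeal of RNN-style architectures such as RWKV is that they carry context in a fixed-size state rather than a growing attention cache, which is where the “infinite context window” claim comes from. Below is a minimal, illustrative sketch of that recurrent idea in Python; it is not the actual RWKV implementation, and the matrices, shapes, and decay parameter are assumptions made up for the example.

```python
import numpy as np

# Illustrative only: a toy recurrent token mixer in the spirit of RWKV's
# WKV recurrence. This is NOT the real RWKV code; weights are random and
# untrained, and the names/shapes are assumptions for this sketch.

d_model = 64
rng = np.random.default_rng(0)

W_k = rng.standard_normal((d_model, d_model)) * 0.02  # "key" projection
W_v = rng.standard_normal((d_model, d_model)) * 0.02  # "value" projection
W_r = rng.standard_normal((d_model, d_model)) * 0.02  # "receptance" projection
decay = np.full(d_model, 0.95)  # per-channel decay applied to older context

def recurrent_mix(tokens):
    """Process tokens one by one with O(1) state, unlike attention's O(T) cache."""
    num = np.zeros(d_model)  # running weighted sum of values
    den = np.zeros(d_model)  # running sum of weights
    outputs = []
    for x in tokens:
        k = np.exp(np.clip(x @ W_k, -10, 10))    # positive per-channel weight
        v = x @ W_v                              # value for this token
        r = 1.0 / (1.0 + np.exp(-(x @ W_r)))     # receptance gate in (0, 1)
        num = decay * num + k * v                # fade old context, add the new token
        den = decay * den + k
        outputs.append(r * num / (den + 1e-8))
    return np.stack(outputs)

# Example: mix a sequence of 10 random token embeddings.
out = recurrent_mix(rng.standard_normal((10, d_model)))
print(out.shape)  # (10, 64)
```

Because the state has a fixed size, the per-token cost stays constant no matter how long the sequence grows, which is what makes very long contexts tractable in principle.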
2. However, LLMs keep making progress
Despite LeCun’s objections, research on LLMs, and on the Transformer training model in particular, continues to advance. AI/ML consultant Dan Hou discussed GPT-4o and emphasized that its training model is considered the foundation of all complex models. GPT-4o is designed to understand video and audio natively, which affects the amount of data that future versions can be trained on.

“How smart can the AI become? With a native multimodal architecture, my guess is the answer is very, very good,” Hou said.

Additionally, Sam Altman said in a recent interview that data will no longer be a problem, removing one of the main concerns about training LLMs. If the data problem really is solved, the scaling laws of LLMs can be expected to continue to hold.
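For reference, the “scaling law” mentioned above is commonly discussed in the parametric form fitted in the Chinchilla paper (Hoffmann et al., 2022), where loss falls as a power law in parameter count N and training tokens D. The sketch below uses the published constants purely to illustrate the trend; it is not part of Hou’s or Altman’s argument.

```python
# Sketch of the Chinchilla-style parametric scaling law (Hoffmann et al., 2022):
# predicted pretraining loss as a function of parameters N and training tokens D.
# The constants are the published fits, used here only to show the trend.

def scaling_law_loss(N, D, E=1.69, A=406.4, B=410.7, alpha=0.34, beta=0.28):
    """L(N, D) = E + A / N^alpha + B / D^beta."""
    return E + A / N**alpha + B / D**beta

# More training data (larger D) keeps lowering the predicted loss, which is
# why "data is no longer a problem" matters for continued scaling.
for D in (1e12, 1e13, 1e14):
    print(f"N=70e9 params, D={D:.0e} tokens -> loss ≈ {scaling_law_loss(70e9, D):.3f}")
```

If the token budget D can keep growing alongside model size N, this curve keeps bending downward, which is the optimistic reading of Altman’s remark.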