Exclusive: AI startup Tenyx’s refined open-source Llama 3 model outperforms GPT-4




In an exclusive interview with VentureBeat, Itamar Arel, founder and CEO of AI startup Tenyx, unveiled a groundbreaking achievement in natural language processing. Tenyx has fine-tuned Meta’s open-source Llama 3 language model (now known as Tenyx-70B) to outperform OpenAI’s GPT-4 in certain domains, marking the first time an open-source model has surpassed the proprietary gold standard.

“We have developed a sophisticated technology that allows us to take a fundamental model and polish it or train it beyond what it was trained for,” Arel explains. “What we’re becoming more and more excited about is that we can use that technology, which essentially allows us to exploit some redundancy in these large models, to enable what is probably better called continuous learning or incremental learning.”

A radial graph shows that the Tenyx-optimized Llama 3 model outperforms GPT-4 on math and coding, while outperforming the base Llama 3 model on all capabilities, a first for an open-source AI model according to Tenyx founder Itamar Arel. (Image credit: Tenyx)

Overcoming ‘catastrophic forgetting’

Tenyx’s new fine-tuning approach addresses the problem of “catastrophic forgetting,” in which a model loses previously learned knowledge when trained on new data. By selectively updating only a small portion of the model’s parameters, Tenyx can efficiently train the model on new information without compromising its existing capabilities.

“If you end up changing, say, only 5% of the model parameters, and everything else remains the same, you can do that more aggressively without the risk of distorting other things,” Arel said. This selective parameter-updating method has also enabled Tenyx to achieve remarkably fast training times, fine-tuning the 70-billion-parameter Llama 3 model in just 15 hours on 100 GPUs.
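The core idea Arel describes can be sketched in a few lines: pick a small fraction of parameters to remain trainable, apply gradient updates only to those, and leave everything else frozen. The sketch below is purely illustrative, assuming a random 5% selection and a plain gradient step; Tenyx has not disclosed how it actually chooses which parameters to update.

```python
import random

def selective_update(params, grads, update_fraction=0.05, lr=0.01, seed=0):
    """Hypothetical sketch: apply a gradient step to a randomly chosen
    ~5% of parameters, leaving all other parameters frozen. This is an
    illustration of the general technique, not Tenyx's actual method."""
    rng = random.Random(seed)
    n = len(params)
    k = max(1, int(n * update_fraction))
    trainable = set(rng.sample(range(n), k))  # indices allowed to change
    new_params = [
        p - lr * g if i in trainable else p  # frozen params pass through
        for i, (p, g) in enumerate(zip(params, grads))
    ]
    return new_params, trainable

# Toy example: 100 parameters, uniform gradients.
params = [1.0] * 100
grads = [0.5] * 100
new_params, trainable = selective_update(params, grads)
changed = sum(1 for p, q in zip(params, new_params) if p != q)
print(changed)  # 5 of 100 parameters were updated; the other 95 are untouched
```

Because 95% of the parameters are bit-identical before and after the update, knowledge encoded in the frozen weights cannot be distorted, which is the intuition behind avoiding catastrophic forgetting.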

At the time of release, Llama3-TenyxChat-70B is the highest-ranked open-source model available for download on the MT-Bench evaluation. (Credit: Tenyx)

Commitment to open source AI

Tenyx’s commitment to open-source AI is reflected in its decision to release the fine-tuned model, called Tenyx-70B, under the same license as the original Llama 3. “We are big believers in open source models,” Arel told VentureBeat. “The more progress shared with the community, the more cool applications and the better for everyone.”

The potential applications of Tenyx’s post-training optimization technology are vast, ranging from creating highly specialized chatbots for specific industries to enabling more frequent incremental updates of deployed models, keeping them fresh between major releases with the latest information.

Reshaping the AI landscape

The implications of Tenyx’s breakthrough are profound, potentially leveling the playing field by giving companies and researchers access to state-of-the-art language models without the high costs and restrictions associated with proprietary offerings. This development could also spur further innovation in the open source community as others look to build on Tenyx’s success.

“It kind of raises questions about what does that mean for the industry? What does this mean for the OpenAIs of the world?” Arel mused. As the AI arms race intensifies, Tenyx’s achievement in fine-tuning open-source models could reshape the AI industry and transform the way companies approach natural language processing tasks.

While the Tenyx-optimized Llama 3 model shares the same limitations as the base model, such as occasional illogical or unwarranted responses, the performance improvements are significant. Arel highlighted the model’s gains in math and reasoning, where accuracy rose to almost 96% from the base model’s 85%.

While Tenyx opens the door to a new era of open-source AI innovation, the impact of their breakthrough on the AI ​​ecosystem remains to be seen. However, one thing is certain: Tenyx has shown that open source models can compete with and even surpass their proprietary counterparts, paving the way for a more accessible and collaborative future in artificial intelligence.