Hugging Face sees small AI models advancing robotics

Grafa · 2024/11/13 08:00
By: Liezl Gambe

Hugging Face, the AI startup known for open-source innovations, is developing smaller language models aimed at enhancing robotics and on-device AI.

Speaking at Web Summit in Lisbon, Thomas Wolf, Co-Founder and Chief Science Officer of Hugging Face, emphasised that smaller models are key for real-time applications.

"We want to deploy models in robots that are smarter, so we can start having robots that are not only on assembly lines, but also in the wild," Wolf said, noting the importance of low latency.

"You cannot wait two seconds so that your robots understand what's happening, and the only way we can do that is through a small language model," he added.

Wolf highlighted that these smaller models can handle many tasks previously thought to require larger models and can be deployed directly on devices like laptops and smartphones.

"If you think about this kind of game changer, you can have them running on your laptop," he said.

Earlier this year, Hugging Face introduced SmolLM, a small-scale language model designed to deliver strong performance with a fraction of the parameters of mainstream models.
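For context, here is a minimal sketch of what running such a model locally looks like with Hugging Face's transformers library. The checkpoint name HuggingFaceTB/SmolLM-360M is one of the published SmolLM sizes; the prompt and generation settings are illustrative assumptions, not anything Hugging Face recommends:

```python
# Minimal sketch: run a SmolLM checkpoint locally and time generation.
# Model ID is one of the published SmolLM sizes; prompt and settings are illustrative.
import time
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "HuggingFaceTB/SmolLM-360M"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)  # small enough for a laptop CPU

prompt = "Robots working outside the assembly line need"
inputs = tokenizer(prompt, return_tensors="pt")

start = time.perf_counter()
output = model.generate(**inputs, max_new_tokens=30)
elapsed = time.perf_counter() - start

print(tokenizer.decode(output[0], skip_special_tokens=True))
print(f"30 new tokens in {elapsed:.2f}s")
```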

Wolf explained that a one-billion-parameter model today can match the performance of the larger, ten-billion-parameter models of last year.

"You have a 10 times smaller model that can reach roughly similar performance," he pointed out.

According to Wolf, training these models on tailored datasets enhances their utility for specific tasks such as data processing and speech recognition.

These adaptations include embedding what Wolf called "very tiny, tiny neural nets" into the base model to refine its specialisation further, an approach he likened to "putting a hat for a specific task."
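The article does not name the technique, but adapter methods such as LoRA are one common way to graft tiny trainable nets onto a frozen base model. A hedged sketch using the peft library, assuming a Llama-style architecture (the target module names vary by model, and nothing here confirms Hugging Face uses LoRA specifically):

```python
# Sketch: attach small trainable adapter layers (LoRA) to a frozen base model.
# One plausible realisation of "tiny neural nets" for task specialisation;
# the article does not confirm this is the exact method Wolf describes.
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

base = AutoModelForCausalLM.from_pretrained("HuggingFaceTB/SmolLM-360M")

config = LoraConfig(
    r=8,                                   # low-rank dimension keeps the adapters tiny
    lora_alpha=16,
    target_modules=["q_proj", "v_proj"],   # typical attention projections; architecture-dependent
    task_type="CAUSAL_LM",
)
model = get_peft_model(base, config)
model.print_trainable_parameters()         # typically well under 1% of total weights
```

Because only the adapter weights train, the same small base model can carry several such task-specific "hats" while staying cheap to run on-device.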
