pmarca shares details of a 3.3-billion-parameter model trained on historical text
ME News, April 3 (UTC+8). pmarca recently shared details on social media about a model pretrained exclusively on historical text. According to his post, the pretraining corpus consists of American and British books and newspapers published before January 1, 1900, sourced from Hugging Face and the Internet Archive; after extensive filtering, roughly 22 billion tokens were assembled into the training corpus. The model's best checkpoint is a 3.3-billion-parameter model. pmarca said he had been expecting projects like this since December 1, 2022. (Source: InfoQ)
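For illustration, the date-based corpus filtering described above could look roughly like the following sketch using the Hugging Face `datasets` library. This is not the project's actual pipeline, which has not been published here; the dataset identifier (`example_org/historical-books`) and the field names (`publication_year`, `text`) are hypothetical placeholders.

```python
# Minimal sketch: stream a document collection and keep only pre-1900 items.
# Dataset name and field names are assumptions, not the project's real pipeline.
from datasets import load_dataset

CUTOFF_YEAR = 1900  # keep only documents published before January 1, 1900

# Stream the corpus so the full dump never has to fit in memory.
corpus = load_dataset(
    "example_org/historical-books",  # hypothetical dataset identifier
    split="train",
    streaming=True,
)

# Drop anything with a missing date or published in 1900 or later.
pre_1900 = corpus.filter(
    lambda doc: doc.get("publication_year") is not None
    and doc["publication_year"] < CUTOFF_YEAR
)

# Rough token accounting while iterating (whitespace split as a crude proxy
# for the tokenizer-based counts a real pipeline would use).
total_tokens = 0
for doc in pre_1900:
    total_tokens += len(doc["text"].split())
print(f"approximate tokens kept: {total_tokens:,}")
```

In practice, the "extensive filtering" mentioned in the post would also involve deduplication and quality heuristics beyond a simple date cutoff; the sketch only shows the date-gating step.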