Last week we released NanoGPT Slowrun, an open repo for data-efficient learning algorithms. The rules are simple: train on 100M tokens from FineWeb, use as much compute as you want, and the lowest validation loss wins. Improvements are submitted as PRs to the repo and merged if they lower val loss. The constraint is the inverse of speedruns like modded-nanogpt, which optimize wall-clock time. Those benchmarks have been hugely productive, but optimizing for speed filters out expensive ideas: heavy regularization, second-order optimizers, alternatives to gradient descent. Slowrun is built for exactly those ideas.
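The merge criterion boils down to a single comparison. Here is a minimal sketch of that rule, assuming validation loss is mean negative log-likelihood over held-out tokens; the function names (`val_loss`, `should_merge`) and the GPT-2-style vocabulary size used in the example are my own illustrative assumptions, not part of the repo:

```python
import math

def val_loss(token_logprobs):
    # Mean negative log-likelihood (in nats) over held-out tokens.
    return -sum(token_logprobs) / len(token_logprobs)

def should_merge(candidate_loss, baseline_loss):
    # A PR is merged iff it strictly lowers validation loss.
    return candidate_loss < baseline_loss

# Sanity check: a uniform predictor over a 50257-token vocab
# (GPT-2's vocab size, assumed here for illustration) scores ln(50257).
uniform = [-math.log(50257)] * 1000
baseline = val_loss(uniform)          # ~10.82 nats
candidate = baseline - 0.5            # a hypothetical improvement
print(should_merge(candidate, baseline))
```

Any training trick is fair game under this rule; only the final number on the fixed validation split matters.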