News
Lower memory requirements are the most obvious advantage of reducing the precision of a model's internal weights. The BitNet b1.58 ...
Microsoft's BitNet b1.58 2B4T model is available on Hugging Face, but it doesn't run on GPUs and requires Microsoft's own bitnet.cpp framework.
Microsoft's New Compact 1-Bit LLM Needs Just 400MB of Memory
The 1-bit LLM (1.58-bit, to be more precise) uses -1, 0, and 1 to represent weights, which could be useful for running LLMs on small devices such as smartphones. Microsoft put BitNet b1.58 2B4T on ...
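The ternary weight scheme the snippet describes can be sketched in a few lines. This is a minimal illustration, not Microsoft's implementation: it assumes the absmean-style rounding described for BitNet b1.58 (scale by the mean absolute weight, then round and clip to {-1, 0, 1}); the function name `ternary_quantize` is hypothetical.

```python
import numpy as np

def ternary_quantize(w: np.ndarray):
    """Round weights to {-1, 0, 1} with a per-tensor scale.

    Sketch of absmean-style quantization: divide by the mean
    absolute value, round to the nearest integer, clip to [-1, 1].
    """
    gamma = np.abs(w).mean() + 1e-8            # per-tensor scale factor
    w_q = np.clip(np.round(w / gamma), -1, 1)  # ternary weights
    return w_q.astype(np.int8), gamma

# Example: quantize a small random weight matrix.
rng = np.random.default_rng(0)
w = rng.normal(size=(4, 4)).astype(np.float32)
w_q, gamma = ternary_quantize(w)
assert set(np.unique(w_q).tolist()) <= {-1, 0, 1}
```

At inference, the dequantized weight is approximately `w_q * gamma`, so matrix multiplies reduce to additions and subtractions of activations plus one scale, which is what makes CPU-only inference practical.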
Culminating a year-long project, [Usagi Electric] aka [David] has just wrapped up his single-bit vacuum tube computer. It is based on the Motorola MC14500 1-bit industrial controller, but since ...
Microsoft researchers build 1-bit AI LLM with 2B parameters — model small enough to run on some CPUs
Microsoft researchers just created BitNet b1.58 2B4T, an open-source 1-bit large language model with two billion parameters, trained on four trillion tokens. But what makes this AI model unique ...
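The ~400MB figure from the earlier headline follows directly from the parameter count: a back-of-envelope check, assuming two billion weights at log2(3) ≈ 1.58 bits each versus 16 bits for fp16.

```python
import math

params = 2_000_000_000                      # 2B parameters
bits_ternary = math.log2(3)                 # ≈ 1.585 bits per ternary weight
mb_ternary = params * bits_ternary / 8 / 1e6
mb_fp16 = params * 16 / 8 / 1e6

print(round(mb_ternary))  # ≈ 396 MB, matching the ~400MB headline
print(round(mb_fp16))     # 4000 MB for the same model in fp16
```

The roughly 10x reduction relative to fp16 is what puts a two-billion-parameter model within reach of commodity CPUs and phones.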