News

Memory requirements are the most obvious advantage of reducing the complexity of a model's internal weights. The BitNet b1.58 ...
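To make the memory advantage concrete, here is a back-of-the-envelope comparison for a two-billion-parameter model at a few weight precisions. The figures are illustrative assumptions, not measurements of BitNet b1.58 2B4T itself:

```python
# Rough weight-storage comparison for a 2B-parameter model.
# Illustrative only; real models also store embeddings, activations, etc.

PARAMS = 2_000_000_000

def gib(n_bytes: float) -> float:
    """Convert a byte count to binary gibibytes."""
    return n_bytes / 2**30

fp16_bytes = PARAMS * 2          # 16 bits per weight
int8_bytes = PARAMS * 1          # 8 bits per weight
ternary_bytes = PARAMS * 2 / 8   # ternary weights packed at 2 bits each

print(f"FP16:    {gib(fp16_bytes):.2f} GiB")
print(f"INT8:    {gib(int8_bytes):.2f} GiB")
print(f"ternary: {gib(ternary_bytes):.2f} GiB")
```

A practical packing would use 2 bits per ternary weight (the theoretical minimum is log2(3) ≈ 1.58 bits), which already cuts weight storage roughly 8x versus FP16.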
Microsoft’s BitNet b1.58 2B4T model is available on Hugging Face, but it doesn’t run on GPUs and requires Microsoft’s custom bitnet.cpp framework.
The 1-bit LLM (1.58-bit, to be more precise, since encoding three states takes log2(3) ≈ 1.58 bits) restricts each weight to -1, 0, or 1, which could make it practical to run LLMs on small devices such as smartphones. Microsoft put BitNet b1.58 2B4T on ...
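As a sketch of how weights end up in {-1, 0, 1}, here is a ternary quantizer in the style of the "absmean" scheme described in the BitNet b1.58 work. The function names and details below are assumptions for illustration, not Microsoft's actual implementation:

```python
import numpy as np

def quantize_ternary(w: np.ndarray):
    """Map full-precision weights to {-1, 0, 1} plus a per-tensor scale.

    Hypothetical sketch: scale by the mean absolute weight ("absmean"),
    then round and clip each entry to the nearest ternary value.
    """
    scale = np.abs(w).mean()
    q = np.clip(np.round(w / (scale + 1e-8)), -1, 1)
    return q.astype(np.int8), scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover approximate full-precision weights from ternary codes."""
    return q.astype(np.float32) * scale

w = np.array([0.9, -0.05, -1.2, 0.4, 0.0])
q, s = quantize_ternary(w)
print(q)  # every entry is -1, 0, or 1
```

Small weights collapse to 0 and large ones saturate at ±1, with the single floating-point scale preserving the tensor's overall magnitude; this is what lets matrix multiplies reduce to additions and subtractions.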
Culminating a year-long project, [Usagi Electric] aka [David] has just wrapped up his single-bit vacuum tube computer. It is based on the Motorola MC14500 1-bit industrial controller, but since ...
Microsoft researchers just created BitNet b1.58 2B4T, an open-source 1-bit large language model with two billion parameters, trained on four trillion tokens. But what makes this AI model unique ...