News
Memory requirements are the most obvious advantage of reducing the complexity of a model's internal weights. The BitNet b1.58 ...
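For a rough sense of why this matters, here is a back-of-envelope sketch (not a measurement of the released model) comparing the weight-storage footprint of a 2-billion-parameter model in 16-bit floats versus ternary weights packed at an assumed 2 bits each; activations, embeddings, and the KV cache are ignored.

```python
# Back-of-envelope weight-memory comparison for a 2-billion-parameter model.
# Figures cover only the weight matrices, under an assumed 2-bit packing.
PARAMS = 2_000_000_000

fp16_bytes = PARAMS * 2         # 16-bit weights: 2 bytes per parameter
ternary_bytes = PARAMS * 2 / 8  # assumed packing: 2 bits per ternary parameter

print(f"FP16 weights:    {fp16_bytes / 2**30:.2f} GiB")
print(f"Ternary weights: {ternary_bytes / 2**30:.2f} GiB")
```

Even under this crude 2-bit packing assumption, weight storage drops by roughly a factor of eight, which is what makes CPU-only inference plausible for a model of this size.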
Microsoft researchers have developed — and released — a hyper-efficient AI model that can run on CPUs, including Apple's M2.
Microsoft’s new BitNet b1.58 model significantly reduces memory and energy requirements while matching the capabilities of ...
Microsoft’s General Artificial Intelligence group has introduced a groundbreaking large language model (LLM) that drastically ...
Tom's Hardware on MSN (8d): Microsoft researchers build 1-bit AI LLM with 2B parameters, a model small enough to run on some CPUs. Microsoft researchers developed a 1-bit AI model that's efficient enough to run on traditional CPUs without needing ...
Microsoft’s BitNet b1.58 2B4T model is available on Hugging Face, but it doesn’t run on GPUs and requires a proprietary framework.
Microsoft (MSFT) researchers claim they’ve developed the largest-scale 1-bit AI model, also known as a “bitnet,” to date. Called BitNet b1.58 ...
Microsoft researchers have created BitNet b1.58 2B4T, a large-scale 1-bit AI model that can efficiently run on CPUs, ...
Microsoft Research has introduced BitNet b1.58 2B4T, a new 2-billion parameter language model that uses only 1.58 bits per ...
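The "1.58" in the name is the information content of a ternary weight: each weight takes one of three values {-1, 0, +1}, so it carries log2(3) ≈ 1.585 bits. The sketch below illustrates that figure along with one hypothetical packing scheme (five ternary digits per byte, since 3^5 = 243 ≤ 256); it is not necessarily the packing the released model uses.

```python
import math

# Information content of one ternary weight {-1, 0, +1}.
bits_per_weight = math.log2(3)          # ~1.585, the "1.58" in the model name
print(f"bits per ternary weight: {bits_per_weight:.3f}")

# Hypothetical packing: 5 ternary digits fit in one byte, since 3**5 = 243 <= 256.
def pack5(trits):
    """Pack five values from {-1, 0, +1} into a single byte via base-3 encoding."""
    assert len(trits) == 5 and all(t in (-1, 0, 1) for t in trits)
    value = 0
    for t in trits:
        value = value * 3 + (t + 1)     # map {-1, 0, +1} -> {0, 1, 2}
    return value

def unpack5(byte_value):
    """Invert pack5: recover five ternary weights from one byte."""
    trits = []
    for _ in range(5):
        trits.append(byte_value % 3 - 1)
        byte_value //= 3
    return trits[::-1]

weights = [1, -1, 0, 0, 1]
assert unpack5(pack5(weights)) == weights
print(f"5 weights in 1 byte -> {8 / 5:.1f} bits per weight in practice")
```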