Quote:
Originally Posted by DownInFlames
Will suck a dick for some DDR5.
I bought 32GB last weekend for over $500. Totally makes sense why, though.
Being able to run LLMs locally means you probably want > 24GB of VRAM (ideally 96GB or even 192GB) and > 64GB of RAM for partial offload (ideally 128GB or 256GB).
RAM used to sit unused; now it's basically the difference between running a small LLM, a medium one, or a big one.
I also bought a 5090, so I have 32GB of VRAM and 64GB of RAM, which is about the threshold for a quantized medium LLM. I kind of regret not shelling out for an RTX 6000 Blackwell Pro (96GB) or 4x Intel dual-B60 (for 192GB), but I'm not a corporation here, and that's who's buying up the RAM.
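For anyone wondering where those thresholds come from: weight memory is roughly parameter count × bits per weight ÷ 8, before you add KV cache and runtime overhead. A rough sketch (the 4.5 bits/weight figure is an assumption, typical of 4-bit-ish quants):

```python
# Back-of-envelope memory for a quantized model's weights.
# Real runtimes add KV cache, activations, and overhead on top,
# so treat these numbers as lower bounds.

def weight_gb(params_billions: float, bits_per_weight: float) -> float:
    """Approximate weight memory in GB (1 GB = 1e9 bytes)."""
    return params_billions * 1e9 * bits_per_weight / 8 / 1e9

# ~30B model at ~4.5 bits/weight: fits in 32GB VRAM with room for KV cache
print(round(weight_gb(30, 4.5), 1))  # 16.9

# ~70B model at ~4.5 bits/weight: needs VRAM + RAM (partial offload)
print(round(weight_gb(70, 4.5), 1))  # 39.4
```

That's why 32GB VRAM + 64GB RAM lands you in "quantized medium model" territory: the 30B class fits on the card, the 70B class spills into system RAM.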