
Although as I understand it, M3 chips with more unified memory handle larger LLMs better, because they can load more into memory than a 4090 can fit in its VRAM.


This is true, but that is only an advantage when running a model larger than the 4090's VRAM. If your models are smaller, you'll get substantially better performance on a 4090. So it all comes down to which models you want to run.
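To make the "larger than the VRAM" threshold concrete, here is a rough back-of-envelope sketch (my own estimate, not from any library) of how much memory a model's weights need at a given quantization, and whether that fits in a 4090's 24 GB. It ignores KV cache and activation overhead, which add several GB on top:

```python
def weight_memory_gb(params_billions, bits_per_weight):
    """Approximate weight memory in GB (weights only; ignores KV cache/activations)."""
    return params_billions * 1e9 * bits_per_weight / 8 / 1e9

for params in (7, 13, 34, 70):
    for bits in (16, 8, 4):
        gb = weight_memory_gb(params, bits)
        print(f"{params}B @ {bits}-bit: {gb:5.1f} GB  fits in 24 GB VRAM: {gb <= 24}")
```

By this estimate a 13B model at 4-bit (~6.5 GB) fits comfortably on a 4090, while a 70B model needs ~35 GB even at 4-bit, which is where a 128 GB unified-memory Mac pulls ahead.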


13B models were running fine on my 4090, but when I tried the more fun or intelligent ones, they became very slow and would have performed better on an M3.


Yes, M3 chips are available with 36GB unified RAM when embedded in a MacBook, although 18GB and below are the norm for most models.

And even though the Apple press release does not even mention memory capacity, I can guarantee you that it will be even less than that on an iPad (simply because RAM is very battery-hungry and most consumers won't care).

Hence my remark: it will be interesting to see how this chipset lands in MacBooks.


But the M3 Max should be able to support up to 128GB.



