modeler@lemmy.world to Technology@lemmy.world • "The first GPT-4-class AI model anyone can download has arrived: Llama 405B" (English)
4 months ago
Typically you need about 1 GB of graphics RAM for each billion parameters (i.e. one byte per parameter). This is a 405B-parameter model. Ouch.
Edit: you can try quantizing it. Quantization reduces the memory required per parameter to 4 bits, 2 bits, or even 1 bit. As you shrink the size, the quality of the model's output can suffer. So in the extreme case (1 bit per parameter) you might be able to run this in under 64 GB of graphics RAM.
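The arithmetic above can be sketched in a few lines. This is a rough back-of-envelope estimate for the weights alone (it ignores KV cache, activations, and framework overhead, and uses 1 GB = 10^9 bytes); the function name is just for illustration.

```python
def vram_gb(params_billion: float, bits_per_param: float) -> float:
    """Approximate graphics RAM needed to hold the weights, in GB.

    Rough sketch: ignores KV cache, activations, and runtime overhead.
    """
    total_bytes = params_billion * 1e9 * bits_per_param / 8
    return total_bytes / 1e9

# Llama 405B at various quantization levels
for bits in (8, 4, 2, 1):
    print(f"{bits}-bit: ~{vram_gb(405, bits):.0f} GB")
```

At 8 bits (one byte per parameter) that's roughly 405 GB, and even at 1 bit per parameter you're still looking at around 50 GB, which matches the "under 64 GB" figure above.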