Run a 1B LLM on a 0 board? Check out Picolm!
Hey everyone, I just saw this on GitHub and thought it was pretty incredible. The project claims you can run a 1-billion-parameter LLM on a 0 board with only 256MB of RAM. This could be a game-changer for local AI enthusiasts and those looking to experiment with LLMs on...
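For some context on why that claim is striking, here's a back-of-the-envelope sketch (my own arithmetic, not numbers from the project): even with aggressive weight quantization, 1B parameters at 4 bits is roughly 477 MiB for the weights alone, well over 256MB, so a setup like this would presumably need sub-4-bit quantization and/or streaming weights from flash (e.g. via mmap) rather than keeping the whole model resident in RAM.

```python
# Rough weight-memory footprint for a 1B-parameter model at various
# quantization levels. Hypothetical sketch, not Picolm's actual numbers.

def weight_footprint_mib(n_params: int, bits_per_weight: float) -> float:
    """MiB needed to hold the weights alone (ignores activations/KV cache)."""
    return n_params * bits_per_weight / 8 / (1024 ** 2)

for bits in (16, 8, 4, 2):
    mib = weight_footprint_mib(1_000_000_000, bits)
    fits = "fits" if mib < 256 else "does NOT fit"
    print(f"{bits:>2}-bit: {mib:8.1f} MiB -> {fits} in 256MB RAM")
```

Even the 2-bit case leaves very little headroom for activations and the KV cache, which is why weight streaming from storage seems like the more plausible approach.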