Anybody here set up a local LLM? Looking to learn more about them.
Look for a tutorial for oobabooga (text-generation-webui).
Quote:
You'll need a fast PC with a modern GPU with at least 12 GB of VRAM, or an Apple M1/M3/M4 Mac Mini with lots of RAM. It's pretty easy, fun, and impressive. Enjoy!
Mark,
With a local LLM, do you know if it's possible to use API functionality by connecting to the machine's local IP address? You mentioned setting one up a few weeks ago in a thread of mine; I've been looking into it but haven't been able to find any solid information about letting others use the local machine through an API.
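For what it's worth, text-generation-webui can expose an OpenAI-compatible API: launching it with `--api` serves the endpoint (port 5000 by default), and adding `--listen` binds it to all interfaces so other machines on the LAN can connect. A minimal client sketch, assuming a server at the hypothetical LAN address 192.168.1.50 (the payload builder is split out so the request shape is easy to see):

```python
import json
import urllib.request


def build_chat_request(prompt, model="local-model", max_tokens=200):
    """Build an OpenAI-style chat-completion payload.

    The server uses whatever model is currently loaded; max_tokens
    caps the length of the reply.
    """
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
    }


def chat(base_url, prompt):
    """POST a chat request to the server's OpenAI-compatible endpoint
    and return the assistant's reply text."""
    payload = build_chat_request(prompt)
    req = urllib.request.Request(
        f"{base_url}/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]


# Example (uncomment once the server is up and reachable on your LAN):
# print(chat("http://192.168.1.50:5000", "Say hello in five words."))
```

Anything that speaks the OpenAI API (libraries, front-ends) can point at that same base URL, which is usually easier than writing a raw client like the one above.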