Thread: Business / GPT API question.
04-02-2025, 10:25 AM
2MuchMark
Industry Role: Videochat Solutions
Join Date: Aug 2004
Location: Canada
Posts: 48,529
It probably depends on your use case, but why not just run an LLM on your local machine instead? It's probably faster depending on your hardware, it's definitely private, and it's pretty f'in cool too.
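As a rough illustration of how simple local inference can be, here's a minimal Python sketch that queries a locally hosted model through Ollama's HTTP API instead of the GPT API. It assumes Ollama is installed and serving on its default port (11434) and that a model such as "llama3" has already been pulled; swap in whatever model you actually run.

# Minimal sketch: query a locally hosted LLM via Ollama's HTTP API.
# Assumes Ollama is running on localhost:11434 with the "llama3" model pulled.
import json
import urllib.request

def ask_local_llm(prompt: str, model: str = "llama3") -> str:
    payload = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # return the full answer in one JSON object
    }).encode("utf-8")
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    print(ask_local_llm("Summarize why local inference keeps data private."))

Nothing leaves your machine, so the privacy point above comes for free, and latency depends only on your own hardware.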
__________________
VideoChat Solutions | Custom Software | IT Support
https://www.2much.net | https://www.lcntech.com