Thread: Business GPT API question.
Old 04-02-2025, 10:25 AM  
2MuchMark
Videochat Solutions
Join Date: Aug 2004
Location: Canada
Posts: 48,529
It probably depends on your use case, but why not just run an LLM on your local machine instead? It's probably faster depending on your hardware, it's definitely private, and it's pretty f'in cool too.
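
If anyone's curious what that looks like in practice, here's a rough sketch of hitting a locally hosted model over HTTP, assuming you're running something like Ollama on its default port. The model name "llama3" is just a placeholder for whatever you've actually pulled.

# Minimal sketch: query a local LLM through Ollama's HTTP API.
# Assumes Ollama is installed, serving on its default port (11434),
# and a model (here "llama3", an assumption) has been pulled already.
import requests

resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llama3",   # swap for whatever model you have locally
        "prompt": "Summarize this support ticket in one sentence.",
        "stream": False,     # return a single JSON object instead of a stream
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["response"])   # generated text comes back in the "response" field

Nothing leaves your box, and once the model is loaded in RAM/VRAM the round trip is just local HTTP.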
__________________

VideoChat Solutions | Custom Software | IT Support
https://www.2much.net | https://www.lcntech.com