Quote:
Originally Posted by porkramen
Yeah, I agree, but for good results you need something like 80 GB of VRAM. Consumer GPUs don't go that far; they top out around 24 GB. My friend who works at the university uses an RTX Quadro designed for AI, and his generations are way better than mine. I'm on a 5070 with 12 GB of VRAM. All I can do is some 5-second gens.
|
Nice discussion

but I have stated in my OP NOT to post....
Quote:
|
......about any technical mumbo jumbo we do NOT know any and do NOT want to learn.
|
I did NOT get any of it, nor do I want to. I am only interested in whether anybody can deliver video like the one posted above in my OP. But feel free to discuss VRAM, RTX Quadros, or any other mumbo jumbo; I will just ignore it.
