It does seem to use a lot more weights than Stable Diffusion. Apparently there's a way to get it running on 12GB of VRAM in ComfyUI, but when I tried something similar in Python it still overloaded my graphics card.
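For reference, this is roughly the kind of thing I mean by offloading in Python with diffusers. It's just a sketch: the model id is a placeholder, and whether any of this actually fits in 12GB depends on the checkpoint.

```python
# Sketch of the low-VRAM knobs in diffusers -- model id is a placeholder,
# not the actual repo name for the 12B model.
import torch
from diffusers import DiffusionPipeline

MODEL_ID = "some/large-12b-diffusion-model"  # placeholder

pipe = DiffusionPipeline.from_pretrained(
    MODEL_ID,
    torch_dtype=torch.bfloat16,  # half-ish precision roughly halves weight memory
)

# Keep weights in system RAM and stream submodules onto the GPU as needed,
# which is roughly what ComfyUI's low-VRAM mode is doing.
pipe.enable_model_cpu_offload()

# Even more aggressive (and slower): move individual layers on/off the GPU.
# pipe.enable_sequential_cpu_offload()

pipe.enable_attention_slicing()  # trade speed for lower peak activation memory

image = pipe("a simple crosshair icon, flat design", num_inference_steps=20).images[0]
image.save("test.png")
```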
RunPod isn't expensive, but I still wish I could run everything locally. I still think smaller models will happen.
Like, this is edge tech, which is cool. But they're gatekeeping their 12B model so it's only accessible through their front-end service, which is most def a way to do things if you wanna control it. But for my/our needs, we don't need a 12B model to render custom crosshairs for the video game and stuff. We could have unique crosshairs for each weapon?
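Something like this is what I'm picturing for the per-weapon idea, just a rough sketch with a small text-to-image checkpoint; the model choice, weapon list, and prompt are all placeholders, nothing we've settled on.

```python
# Hypothetical per-weapon crosshair loop with a small model -- everything here
# (model id, weapon names, prompt template) is a placeholder.
import torch
from diffusers import AutoPipelineForText2Image

pipe = AutoPipelineForText2Image.from_pretrained(
    "stabilityai/sd-turbo",   # small/fast checkpoint; any light SD variant would do
    torch_dtype=torch.float16,
).to("cuda")

weapons = ["pistol", "shotgun", "sniper rifle", "plasma cannon"]  # placeholder list

for name in weapons:
    prompt = f"minimal {name} crosshair icon, flat vector style, centered, black background"
    # sd-turbo is distilled for 1-step, guidance-free generation
    image = pipe(prompt, num_inference_steps=1, guidance_scale=0.0).images[0]
    image.save(f"crosshair_{name.replace(' ', '_')}.png")
```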
I was just playing with this.
https://img.gvid.tv/i/3RAL4yRr.webp