Frigate.video is so cool! I got AI object detection working with a Google Coral USB Edge TPU: coral.ai/products
Fun to see what our cats get up to during the night
(The camera is just on the floor to try it out.)
I wonder if I can do anything else with this Coral TPU. It can run some TensorFlow Lite models, but they have to be compiled specifically for the Edge TPU. coral.ai/docs/edgetpu/m
It would be cool if I could get it to run Stable Diffusion or OpenAI Whisper
Also wondering when I might be able to run something comparable to GPT-3 on a personal server at home.
It looks like the BLOOM model needs about 400GB of GPU memory huggingface.co/bigscience/blo
So you would need 10 × 40 GB Tesla A100 GPUs at $23,578 each = $235,780
Or even 16 of these: "I’ve read in the official Slack channel that it requires something like 8*80GB A100 or 16*40GB A100 to perform inference locally"
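The cost math above can be sketched as a quick back-of-envelope script. The 400 GB memory figure, the A100 sizes, and the $23,578 unit price all come from the posts above; everything else is just ceiling division, so take it as a rough estimate rather than a real sizing exercise.

```python
# Rough estimate of GPUs needed to hold BLOOM's weights for inference.
# Figures from the thread: ~400 GB of GPU memory, A100s in 40 GB and
# 80 GB variants, $23,578 per 40 GB card.
model_memory_gb = 400
gpu_price_usd = 23_578  # 40 GB Tesla A100 list price quoted in the thread

for gpu_memory_gb in (40, 80):
    # Ceiling division: partial GPUs don't exist, so round up.
    gpus_needed = -(-model_memory_gb // gpu_memory_gb)
    print(f"{gpu_memory_gb} GB cards: {gpus_needed} needed")

total_cost = (-(-model_memory_gb // 40)) * gpu_price_usd
print(f"10x 40 GB A100s: ${total_cost:,}")  # -> $235,780
```

The 8×80 GB / 16×40 GB figures quoted from the Slack channel are higher than this minimum, which makes sense: inference also needs memory headroom for activations and the KV cache, not just the raw weights.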
