Which GPU should I buy for ComfyUI
This is a tier list of which consumer GPUs we would recommend for using with ComfyUI.
In AI the most important thing is the software stack, which is why the list is ranked this way.
Nvidia
All Nvidia GPUs from the last 10 years (since Maxwell/GTX 900) are supported in pytorch and they work very well.
3000 series and above are recommended for best performance. More VRAM is always preferable.
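If you already have a card and want to confirm what pytorch sees, a quick check like the one below works (a minimal sketch using standard pytorch calls; it reports the device name, VRAM, and compute capability):

```python
import torch

# Report the first visible Nvidia GPU, its VRAM, and its compute capability.
if torch.cuda.is_available():
    props = torch.cuda.get_device_properties(0)
    print(f"GPU: {props.name}")
    print(f"VRAM: {props.total_memory / 1024**3:.1f} GB")
    print(f"Compute capability: {props.major}.{props.minor}")
else:
    print("No CUDA device visible to pytorch.")
```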
Why you should avoid older generations if you can.
Older generations of cards will work, but performance might be worse than expected because they don't support certain operations.
Here is a quick summary of what is supported on each generation:
- 40 series (ada): fp16, bf16, fp8
- 30 series (ampere): fp16, bf16
- 20 series (turing): fp16
- 10 series (pascal) and below: only slow full precision fp32.
Models are inferenced in fp16 or bf16 (depending on the model) for best quality, with the option of fp8 on some models for less memory use and more speed at lower quality.
Note that this list doesn't mean it's completely unsupported to use fp16 on the 10 series, for example; it just means it will be slower because the GPU can't handle it natively.
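To see roughly which of these precisions your own card handles natively, you can query pytorch; the sketch below uses the usual compute-capability cutoffs (Turing 7.5, Ampere 8.0, Ada 8.9) as a heuristic:

```python
import torch

# Rough native-precision report based on CUDA compute capability.
# Assumes an Nvidia GPU is visible to pytorch.
cap = torch.cuda.get_device_capability(0)  # e.g. (8, 9) for a 40 series card
print(f"Compute capability: {cap[0]}.{cap[1]}")
print("fast fp16:", cap >= (7, 0))                   # 20 series (turing) and newer
print("bf16:     ", torch.cuda.is_bf16_supported())  # 30 series (ampere) and newer
print("fp8:      ", cap >= (8, 9))                   # 40 series (ada) and newer
```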
Don't be tempted by the cheap Pascal workstation cards with lots of VRAM; your performance will be bad.
AMD (Linux)
Officially supported in pytorch. Works well if the card is officially supported by ROCm, but these cards are slow compared to price-equivalent Nvidia GPUs, mainly because of the lack of an optimized implementation of torch.nn.functional.scaled_dot_product_attention for consumer GPUs.
Unsupported cards might be a real pain to get running.
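For context, scaled_dot_product_attention is the single op that most of the attention computation in these models goes through, so a missing fast kernel affects almost every sampling step. A minimal sketch of calling it directly (standard pytorch 2.x API; the shapes are made up for illustration):

```python
import torch
import torch.nn.functional as F

# Toy attention call with (batch, heads, sequence, head_dim) tensors.
# On supported Nvidia cards this dispatches to a fused attention kernel;
# on consumer ROCm cards it often falls back to a slower path.
device = "cuda" if torch.cuda.is_available() else "cpu"
dtype = torch.float16 if device == "cuda" else torch.float32
q, k, v = (torch.randn(1, 8, 4096, 64, device=device, dtype=dtype) for _ in range(3))

out = F.scaled_dot_product_attention(q, k, v)
print(out.shape)  # torch.Size([1, 8, 4096, 64])
```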
Mac (Apple Silicon)
Officially supported in pytorch. It works, but Apple loves randomly breaking things with OS updates.
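You can check whether pytorch's Metal (MPS) backend, the backend used on Apple Silicon, is available with a couple of standard calls (a minimal sketch):

```python
import torch

# Check that the Metal (MPS) backend is built into this pytorch and usable.
print("MPS built:    ", torch.backends.mps.is_built())
print("MPS available:", torch.backends.mps.is_available())
```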
Intel (Arc)
It works, but it requires a custom pytorch extension and there are sometimes some weird issues.
I expect things to improve over time, especially once it is officially supported in pytorch.
AMD (Windows)
It requires a pytorch extension (pytorch DirectML) or a custom zluda pytorch build.
You will have a painful experience.
Things might improve in the future once they have pytorch ROCm working on Windows.
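If you do try the DirectML route, the basic pattern is to get a DirectML device object and move tensors (or the model) onto it; a minimal sketch, assuming the torch-directml package (exact behaviour varies between versions):

```python
import torch
import torch_directml  # the pytorch DirectML extension package

# Get the DirectML device and run a tiny computation on it.
dml = torch_directml.device()
x = torch.randn(4, 4).to(dml)
print(x.device, x.sum().item())
```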
Everything else
Pytorch doesn't work at all.
Some quotes from someone with knowledge of the hardware and software stack: "Avoid", "Nothing works", "Worthless for any AI use".