Hi,
I'm new to Faceswap. When I try to train models with the "Original" trainer, I get a low-memory error unless I enable "Optimizer Savings", even with the batch size set to 2. With it enabled, I can train with a batch size of up to 128.
This seems very strange: with my old card (Quadro K4200, 4 GB) I could train at batch size 16 without any memory options. It was much slower, but it worked.
Thanks for any answers.
GTX 1650 Super 4Gb - Optimizer savings enabled - train
- bryanlyon
- Site Admin
- Posts: 793
- Joined: Fri Jul 12, 2019 12:49 am
- Location: San Francisco
Re: GTX 1650 Super 4Gb - Optimizer savings enabled - train
This is probably due to drivers/Windows more than the card. Quadro cards use different drivers and probably reserve less memory for OS use, leaving more memory for Faceswap. Windows will often reserve 1/4 to 1/2 of your total VRAM. 4 GB is really close to the minimum we support, and it's likely that you're just running into that limitation. You can try using Linux, or turning the card into a non-display card to prevent Windows from reserving that memory (this requires another card or an on-chip GPU to run the display). Beyond that, you can use the various memory-saving options, but they do cause a slowdown in training.
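To give a feel for why the optimizer-savings option frees so much VRAM, here is a rough back-of-the-envelope sketch (illustrative only, not Faceswap's actual memory accounting): with fp32 weights, an Adam-style optimizer keeps two extra fp32 state tensors (momentum and variance) per parameter, so keeping that state on the GPU roughly triples parameter-related memory. The 60M-parameter figure below is hypothetical, and this ignores activation memory, which is what scales with batch size.

```python
def param_memory_mb(num_params, bytes_per_param=4, adam_state_on_gpu=True):
    """Rough estimate of parameter + optimizer-state memory in MB.

    With Adam state on the GPU there are three fp32 tensors per
    parameter (weights, momentum, variance); offloading the optimizer
    state leaves only the weights in VRAM.
    """
    copies = 3 if adam_state_on_gpu else 1
    return num_params * bytes_per_param * copies / 1024**2

# Hypothetical 60M-parameter model (Faceswap models vary in size):
n = 60_000_000
print(f"optimizer state on GPU: {param_memory_mb(n, adam_state_on_gpu=True):.0f} MB")
print(f"optimizer state offloaded: {param_memory_mb(n, adam_state_on_gpu=False):.0f} MB")
```

On a 4 GB card where Windows may already reserve a large slice, that difference alone can decide whether training fits at all.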
Re: GTX 1650 Super 4Gb - Optimizer savings enabled - train
Many thanks. For now I'm only testing; I'll probably buy an RTX card.