
Mixed_float16 error and extremely slow training

Posted: Fri Aug 18, 2023 2:26 pm
by unkempt

I just downloaded and installed the newest version of Faceswap. When I tried training on this new file (which is 1080p - I usually go smaller than that, but I've been having blur issues and thought a larger source file would help), I got this error:

08/18/2023 07:16:40 WARNING Mixed precision compatibility check (mixed_float16): WARNING
The dtype policy mixed_float16 may run slowly because this machine does not have a GPU. Only Nvidia GPUs with compute capability of at least 7.0 run quickly with mixed_float16.
If you will use compatible GPU(s) not attached to this host, e.g. by running a multi-worker model, you can ignore this warning. This message will only be logged once

I do have an Nvidia GPU, though, and specifically set Python to use it in Windows graphics settings. However, Faceswap is maxing out my CPU instead, and my GPU is showing next to no use. Iterations are usually much faster, but are crawling now (fewer than 100 in the last 5 minutes). I lost all my settings when I reinstalled (since I can't seem to update properly), but I don't recall one related to this issue, and a forum search hasn't revealed any answers. What do I do?


Re: Mixed_float16 error and extremely slow training

Posted: Fri Aug 18, 2023 10:15 pm
by torzdf

1) It's a warning, not an error.
2) You don't say which GPU you are using, which will affect whether you get optimal speeds with mixed precision or not.
3) The speed may not be related to 2), but until I know the answer to 2), I can't tell you.
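
If you want to check for yourself what TensorFlow actually sees, something along these lines should work from the Faceswap Python environment (a rough sketch, assuming a reasonably recent TensorFlow; mixed_float16 only runs quickly on GPUs with compute capability 7.0 or higher):

import tensorflow as tf

# List the GPUs TensorFlow can see (an empty list means training falls back to the CPU)
gpus = tf.config.list_physical_devices("GPU")
print("GPUs visible to TensorFlow:", gpus)

# Report each GPU's name and compute capability; mixed_float16 needs >= (7, 0) to run fast
for gpu in gpus:
    details = tf.config.experimental.get_device_details(gpu)
    print(details.get("device_name"), "compute capability:", details.get("compute_capability"))

If the list comes back empty, the problem is that TensorFlow cannot see the GPU at all, not mixed precision itself.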


Re: Mixed_float16 error and extremely slow training

Posted: Sat Aug 19, 2023 12:39 pm
by unkempt

I'm using an Nvidia RTX 3070 M.

Regardless, I had a suspicion it had something to do with me forcing Python to run through the administrator account/permissions. I removed that, and it seems to have done the job: it's using my GPU again.

I had changed that to try to solve the "file in use" error, but I also had luck changing the preview interval to every 1000 iterations instead of 250. I'll keep trying.


Re: Mixed_float16 error and extremely slow training

Posted: Sat Aug 19, 2023 1:48 pm
by unkempt

Yup, I had to change it back to admin to train, because otherwise it dies after a while with "file in use" errors. And when I do, it only uses the CPU - no idea why that makes a difference.


Re: Mixed_float16 error and extremely slow training

Posted: Tue Sep 26, 2023 9:06 am
by Eace1971

unkempt wrote: Sat Aug 19, 2023 1:48 pm

Yup, I had to change it back to admin to train, because otherwise it dies after a while with "file in use" errors. And when I do, it only uses the CPU - no idea why that makes a difference.

And then, did you solve the problem?