Do AI filters work equally well when applied to footage of people from different backgrounds and ethnicities?

We have taken care to select our training sets so that our filters perform consistently on footage of people across all protected categories, including skin color and sex. Even so, it is possible that a filter could produce results reflecting an underlying bias in the training data that we unintentionally failed to account for. If this happens, please reach out to our support team to report the problem.
