Google has always allowed people to request that certain content, like a malicious website or copies of their contact information, be removed from search results. The company already had a policy for removing private images or videos, but now it's taking fake pornographic content more seriously.

Google now has a support page dedicated to removing "involuntary fake pornography," with instructions that people can follow to report such content. For a request to be accepted, the person reporting the content has to be the person depicted in the fake imagery. The company notes that this process only removes the requested content from Google search results, not from the sites actually hosting it.

This updated policy is likely a response to the growing popularity of "deepfakes": sexual videos and images in which someone's face is replaced with another person's, usually a celebrity's. As the name suggests, deepfakes are generated using deep learning software, such as Google's own TensorFlow. Many popular sites have banned these videos and images, as Reddit did earlier this year.