The relationship may have faded long ago, but the intimate images you shared have not. If you’re lucky, your ex deleted them. If you’re not, the photos have sprouted up online.

Victims of such nonconsensual posts, often referred to as “revenge porn,” now have some help in preventing their spread: On Wednesday, Facebook announced new artificial intelligence tools designed to keep such content, once flagged, off its site for good.

“It’s wrong, it’s hurtful, and if you report it to us, we will now use A.I. and image recognition to prevent it from being shared across all of our platforms,” Mark Zuckerberg, the social network’s founder and chief executive, said in a Facebook post.

The tools announced on Wednesday are intended to address a uniquely modern and pernicious form of harassment, often but not exclusively aimed at women, that has attracted increasing attention.

In March, for example, a report that active-duty and veteran Marines had used Facebook to share naked and private photos of thousands of women in the Marine Corps prompted a congressional hearing and a Defense Department investigation.

The company has been sued in the past by victims of revenge pornography who accused it of not doing enough to prevent the spread of their intimate images.

Now, when such content is reported to Facebook, a trained member of its community standards team will review it, Antigone Davis, Facebook’s head of global safety, said Wednesday in a post on the site. In most cases, the image will be removed and the account of the user who posted it disabled.

The photo-matching technology will then work to identify and thwart the future posting of similar images, not only on Facebook but also on its instant messaging service and on Instagram.

The company also published a guide on reporting and removing such intimate images, and said that it had partnered with safety organizations such as the Cyber Civil Rights Initiative, which operates a hotline for victims of nonconsensual pornography.