Facebook is rolling out a feature built specifically to stop perhaps the site's creepiest behaviour.
The new tool will look for accounts that impersonate people by using their name and profile picture, and attempt to help get those accounts shut down.
The site is also introducing new features intended to stop intimate images of people being shared online without their consent. These are meant to help stop abuse by allowing people to report inappropriate images of themselves; those pictures will then be removed, and the site may offer links to resources for people who are being abused or harassed.
If the automated tool spots a potential example of impersonation, it will send a notification to the person it believes is being trolled. That user will then have the option of confirming whether they are genuinely being impersonated or if the site has made a mistake.
If a user confirms that they are being abused, then the complaint will be reviewed and the account may be taken down, according to Mashable, which first reported the news.
The tool has already rolled out to 75% of accounts on Facebook, according to reports.
Facebook said that impersonation isn't necessarily a widespread problem on the site, but for those it happens to, it can cause significant problems.
"We heard feedback prior to the roundtables and also at the roundtables that this was a point of concern for women," Antigone Davis, Facebook's head of global safety, told Mashable. "And it's a real point of concern for some women in certain regions of the world where it [impersonation] may have certain cultural or social ramifications."
Facebook will also introduce a photo checkup feature, it said, which is meant to help educate users about who can see which photos.