Google+ Rolls Out New Reporting Tools to Keep the Community Safe and Secure

Google rolled out new safety and security controls for Google+ that let users tell Google in detail why they are blocking someone or why that person's content or actions are offensive.

Google announced the changes in a Google+ post covering expanded reporting for content in the stream and new controls for public Hangouts. These features join existing tools, such as comment moderation and Hangouts controls, already built into Google+ to weed out bad behavior.

In the post announcing the features, Google's Pavni Diwanji writes, "We understand how much you value safety and security online. But we're always looking to do better. Today we're excited to roll out new controls that will ensure our community remains a place where people look out for each other."

Reporting content in the stream. With this tool in place, you'll now have more options when reporting bad posts, comments, profiles or photos. "This will mean we can deal with misconduct more quickly, and prevent abuse from happening again," Diwanji said.

Reporting people in public hangouts. If you ignore someone in a hangout, "we immediately mute their audio and video to keep you safe. Now when you report someone in a public hangout, we'll also automatically record a small snippet and notify the room. We can then check for bad behavior--and once we've done that we'll delete the clip," she added.