MeetMe Introduces New Livestream Moderation Features

The Meet Group has added a range of new safety features to its flagship dating and livestreaming app, MeetMe.

The updates aim to ensure that only suitable video content appears on the platform and that any streamers violating the company’s policies are quickly shut down.

First, a prominent ‘Report Abuse’ button will be displayed on every livestream. If viewers believe the content is inappropriate, they simply tap the button and a screenshot is sent to MeetMe’s human moderation team for instant review.

A moderator will then join the stream to assess the live activity alongside the submitted screenshot.

Streamers will also be shown a full-screen reminder of MeetMe’s strict guidelines each time they want to go live. The user must read and manually accept the terms before they are allowed to begin their broadcast.

In addition, new streamers must complete a waiting period, during which they are encouraged to review the policies and guidelines before broadcasting their first show.

Geoff Cook, The Meet Group’s CEO, said in a press release: “We are taking meaningful steps to further strengthen what we believe are already industry-leading online safety standards for our apps.

“We expect to add each of these features to our other apps in the coming months, as we continue to lead in livestreaming safety and moderation practices.”

Larry Magid, CEO of ConnectSafely, added that the updates “[make] sense for the whole industry.”

Last week, Cook joined representatives from Match Group, Spark Networks and the Online Dating Association in a panel discussion about the future of safety in the online dating industry. 

There, he said that his company continually improves its moderation practices in line with industry trends and user feedback.

Visit The Meet Group website here.