TikTok adds the option to mute comments in live broadcasts
As it looks to develop the use of live broadcasting for e-commerce, as part of its broader monetization drive, TikTok is adding a new control that enables live broadcasters to mute comments from individual viewers for variable time periods.
Live hosts now have the option to mute specific viewers for part of a broadcast - or for the entire broadcast, if they choose.
The platform explained: "Now the host, or their helper, can temporarily mute a viewer for a few seconds or minutes, or for the duration of the live broadcast. If an account is muted for any period of time, that person's entire comment history is also removed."
"Live hosts can also turn off comments or limit potentially harmful comments by using a keyword filter," the platform added. "We hope these new controls will enable hosts and audiences alike to have a safe and enjoyable live broadcast."
The added ability to remove all of a muted user's previous comments is a big plus, helping hosts manage live broadcast interaction and reduce unwanted distractions.
This has always been a problematic element of live streaming. Twitter, for example, was forced to update its rules around live broadcast interaction in 2018 after various investigations showed that women, in particular, were targeted with all kinds of offensive remarks and comments while broadcasting.
As noted, with TikTok exploring direct commerce, through various partnerships with big name brands, it also needs to provide a safe environment for the brand and the consumer, in order to maximize its appeal.
TikTok makes user safety a major focus
With this in mind, the ability to quickly mute inappropriate commenters and limit their influence could be a valuable addition. The platform also added a new option for live stream moderators back in July, providing additional management tools in this regard.
The announcement comes as part of a broader overview of the platform's latest Community Guidelines Enforcement Report, which outlines the actions it took against violations of the platform's rules between April and June of this year.
The platform notes that it removed more than 81 million videos in this period, which equates to less than 1 percent of all videos uploaded to the platform - implying that roughly 90 million videos are now uploaded to the platform every day.
Of the removed videos, 93.0 percent were identified and taken down within 24 hours of being posted, 94.1 percent were removed before a user reported them, and 87.5 percent of the removed content had received no views.
The platform also says that its prompts urging users to reconsider potentially offensive comments are having an impact.