Meta’s Oversight Board announces a new “expedited review” process.

Meta’s Oversight Board has announced a change in approach that will see it hear more cases, faster, and allow it to make even more recommendations on policy changes and updates for Meta’s apps.

As explained by the Oversight Board:

“Since we began accepting appeals over two years ago, we have published 35 case decisions covering issues ranging from the Russian invasion of Ukraine to LGBTQI+ rights, as well as two policy advisory opinions. As part of this work, we provided Meta 186 recommendations, many of which are already improving people’s experiences with Facebook and Instagram.”

Complementing its ongoing in-depth work, the Oversight Board is now also introducing a new expedited review process, enabling it to provide advice and respond more quickly in situations where content decisions could have urgent real-world consequences.

“Meta will submit cases for expedited review, which our co-chairs will decide whether to accept or reject. If we accept an expedited case, we will publicly announce it. A panel of board members will then consider the case and draft and approve a written decision. This will be published on our website as soon as possible. We have developed a number of new processes that allow us to publish an expedited decision as early as 48 hours after accepting a case, but in some cases it may take longer – up to 30 days.”

The board says that expedited decisions on whether to remove or restore content are binding on Meta.

In addition, the board will now also provide more insight into its various cases through a new category of summary decisions.


“After our case selection committee has established a list of cases eligible for selection, Meta sometimes realizes that its initial decision on a post was wrong and reverses it. While we publish full decisions for a small number of these cases, the rest have only been briefly summarized in our quarterly transparency reports. We believe these cases hold important lessons and can help Meta avoid the same mistakes in the future. Therefore, our case selection committee will select some of these cases to be reviewed as summary decisions.”

The board’s new review timeframes are set out in the table below.

This will allow many more of Meta’s moderation calls to be examined, and more of its policies assessed, which should help establish fairer, more workable approaches to similar cases in the future.

Meta’s independent oversight body remains an intriguing case study of what social media regulation might look like if there could ever be an agreed-upon approach to content moderation to replace independent app decisions.

Ideally, that’s exactly what we should aim for – instead of having management at Facebook, Instagram, Twitter, etc. each making the call on what is and isn’t acceptable in their apps, there should be an overarching, and ideally global, body that reviews the difficult calls and dictates what can and cannot be shared.

Because even the most staunch supporters of free speech know that there has to be a degree of moderation. Criminal activity is generally the line in the sand that many point to, and that makes a lot of sense. But there are also harms that can be amplified by social media platforms, with real-world implications, that are not illegal as such, and that current regulations are not adequately equipped to mitigate. And ideally, it shouldn’t be Mark Zuckerberg and Elon Musk making the ultimate decision on whether or not such content is allowed.


This is why the Oversight Board remains such an interesting project, and it will be interesting to see how this change in approach, facilitating more and faster decisions, impacts its ability to offer a truly independent perspective on these types of difficult calls.

Really, all regulators should look at the Oversight Board example and consider whether a similar body could be formed for all social apps, either within their region or through a global agreement.

I suspect that a truly global approach is a step beyond what’s possible, given the different laws and attitudes toward different styles of speech in each nation. But perhaps individual governments could try to implement their own Oversight Board-style model for their nation, taking these decisions out of the hands of the platforms and maximizing harm reduction more broadly.