We missed the dark side of social media. Let’s get smarter about the metaverse

When I became the Pentagon’s deputy spokesman in 2009, Secretary of Defense Robert Gates asked me to draft the department’s social media guidelines. I got it wrong.

I focused on the benefits of these new technologies, not on the barely perceived problems that would grow into national security threats. As today’s tech giants push a new vision of social media, an immersive experience sometimes referred to as extended reality or the metaverse, national security leaders must not make the same mistake.

The dangers these social media platforms pose are well known today. Authoritarian and repressive governments (Russia, Iran, China, North Korea) use Twitter, Facebook, YouTube and the like to attack their own people and the citizens of democratic countries. This digital axis of evil works hard to divide the populations of democratic nations and to weaken their governments from within.

Our policy of allowing free access to and use of these technologies has not increased civic engagement or spread democracy. It has done exactly the opposite.

We overlooked these threats because no one took the time to consider where these technologies could take us, and we have not held anyone accountable.

We didn’t ask the right questions: How will you protect people’s personal data? Will you allow other nations and non-state actors access to this data? How could this data be exploited by opponents of free and open societies? Few of us even took a step back and asked: How is this platform free to use? How do you make money from my participation?


We saw only that an ever-growing audience could be reached through these platforms; we failed to understand that they were, in effect, new and unregulated networks.

We were extremely naive.

The question we must ask now is: will we repeat our mistakes, trusting the same digital players to be responsible corporate citizens? Will they protect our data, our privacy and our human rights? Can we allow them to run virtual worlds without regulation?

There is a need for in-depth research on the impact that extended- and virtual-reality technology will have on users and communities, and it is imperative that we conduct it before these technologies become widespread. The areas to examine include the health and physical effects on users, privacy and security, governance, and diversity and inclusion.

For this research to be effective, it will likely mean that the companies developing these technologies will need to make their data accessible to researchers and academics. Here are just a few of the many questions that should be asked and answered:

How and where is the data collected from users stored and backed up? What are companies doing with that data now? How are participants’ responses to stimuli used during the design and development phase? (Let’s not wait until deployment to find out.) What is done at the design stage to keep children safe? How will companies ensure that predators cannot hunt children on these platforms? Do the creators and designers themselves come from diverse backgrounds? Is unconscious bias understood and addressed?

The answers to these and many other questions can feed into research that should, in turn, lead to concrete policy and regulatory recommendations.


Companies may protest the intrusion and perhaps cite risks to their intellectual property, but we should not let that stop us from protecting our national security and the health and well-being of the very people these companies claim to serve.

Some of this research is already underway at think tanks and nonprofits such as the Center for a New American Security. But funding for these projects is tiny compared with the reported $4 billion that Meta/Facebook alone spends each quarter developing its version of the metaverse.

Congress will likely have to step in eventually, but the Department of Defense can take the first steps by ensuring these technologies are not adopted by troops and personnel until this research is complete. We simply cannot afford to make the same mistakes again. There is precedent for this: in the early days of social media, the Pentagon blocked access to MySpace. At the time, this was seen as a ham-handed move by leaders who didn’t understand the power of the new communications platform. Now those leaders look prescient.

The good news is that we can get this right. We have the time, but we have to move now. With the insights researchers gain, we can create virtual worlds that are innovative, safe and collaborative while preserving free and open expression.

Price Floyd was Assistant Secretary of Defense for Public Affairs from 2009 to 2010.