The Social Dilemma: What Are We Doing to Fix the Problems?

Shashkes
Sep 24, 2020

There is a lot to be said about Netflix’s new documentary, “The Social Dilemma”, which combines interviews with engineers and designers who built our social platforms with a fictional story about a family in today’s America. While parts of the fictional storyline and its depiction of AI algorithms seemed exaggerated to me, the basic problem it showcases is very real.

There is a misalignment between our needs and the needs of the companies creating the social networks we use.

The advertising monetization model is at the heart of the problem. This model turns us, the people using the platforms, into the product being sold to advertisers. This attention economy leads companies to promote addictive technology designed to glue our eyeballs to the screen for as long as possible. It is also behind the lack of privacy and consent around selling our data to third-party companies.

The documentary ended with some recommendations for what we can all do. Leaving my phone out of the bedroom is the recommendation I’m choosing to adopt as a user, but as the CEO of the company building Meu, the future of human communication in 3D, there is a lot more the team and I have been doing. I want to share some of the design and business choices we’ve made to promote our users’ wellbeing.

Many of the ideas I’ll present are based on my cognitive neuroscience research, but also on the research and writing of Cris Beasly, Douglas Rushkoff, Robin Hunicke, Dan Ariely, Bret Victor and others.

Philosophy:

  1. What we are building is a tool to augment people’s ability to communicate non-verbally and play with friends in 3D. As such, we aim to give people the maximum amount of agency on the platform and to create unique feedback loops between users and their creations. Tools are things you can pick up and use, extending your capabilities. Tools are also things you put down and stop using when you don’t need them.
  2. Never build anything we wouldn’t want to use ourselves, or that we would have concerns about our younger siblings or children using.
  3. Utilize and invest in scientific research that can help us make sure what we are building is healthy for our users.

Design choices to prevent addiction:

  1. Users have full agency over Meu message playback. Many platforms automatically present the next video or post, or keep looping the media you are watching, so they can maximize the time you spend on their platform and sell your attention at a higher price. We chose to show a Meu message once and let users decide if they want to see it again, continue to the next message, or close the app.
UI that maximizes for movement… stretch to send and check out the keyboard
  2. In the mobile version we are building, phone notifications will be off by default. You will have control over notifications, choosing between email and phone and deciding whom you want to receive notifications from (a minimal sketch of what such opt-in defaults could look like follows this list).
  3. The metrics we hope to improve for humanity are movement time and movement variety. Our design and UI, both in VR and AR, are based on this, prioritizing physical interactions over passive browsing.
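To make the “off by default” choice concrete, here is a minimal sketch of how opt-in notification preferences could be modeled, written in TypeScript. The names (`NotificationPrefs`, `defaultPrefs`, `shouldNotify`) are hypothetical illustrations, not Meu’s actual code.

```typescript
// Hypothetical sketch: notifications are opt-in, never opt-out.
// A new account starts with the master switch off and an empty
// allowlist, so no one can notify the user without their consent.

type NotificationChannel = "email" | "phone";

interface NotificationPrefs {
  enabled: boolean;              // master switch, off by default
  channel: NotificationChannel;  // where notifications go once enabled
  allowedSenders: string[];      // only these user IDs may notify you
}

const defaultPrefs: NotificationPrefs = {
  enabled: false,
  channel: "email",
  allowedSenders: [],
};

// A notification is delivered only if the user both enabled
// notifications and explicitly allowed this sender.
function shouldNotify(prefs: NotificationPrefs, senderId: string): boolean {
  return prefs.enabled && prefs.allowedSenders.includes(senderId);
}
```

The design choice the sketch encodes is that silence is the default state, and every interruption has to be explicitly requested by the user rather than suppressed by them.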

Design choices to prevent anorexia/dysmorphia:

Embody playful avatars that let you share your inner strange
  1. Meu’s avatar system lets people share their inner creatures, promoting creativity and playfulness. You can currently play as a cat, an owl, a truffle, or even a swarm of sparkling particles. Just about every other avatar system, by contrast, focuses on letting users present a Disney-like, cute, thin version of themselves, which promotes unrealistic expectations.
  2. The effects of embodying a first-person avatar are incredibly powerful in VR and have been shown to affect our sense of self on very deep levels. That’s why, in the Meu VR version, we decided to record messages while puppeteering an avatar from a third-person view. This allows for unique feedback mechanisms that show users their movements without interfering as much with the first-person embodied self.

Design choices to create consensual communication between company and users:

  1. On most social platforms, advertisements frequently pop up, interrupting users and taking over their attention without consent. With Meu, we decided that all sponsored content will have a dedicated section, allowing users to choose whether or not to engage with it. We believe that if we curate our sponsored content to actually provide added value, users will choose to engage with it, whenever and however they want.
  2. We are building settings that let users decide the exact shareability of what they create in Meu. They will be able to decide whether their message is a “Snowflake”, a unique message that can only be sent and received by one person with no further sharing or remixing, or whether they want their friends to be able to mutate and reshare their creations (see the sketch after this list).
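As an illustration, per-message shareability could be represented as a small permission type. This is a minimal sketch under assumed names (`Shareability`, `MeuMessage`, `canReshare`, `canRemix`), not Meu’s actual implementation.

```typescript
// Hypothetical sketch of per-message sharing permissions.
// A "snowflake" message is delivered to exactly one recipient and
// can never be reshared or remixed; the other levels progressively
// open a creation up, always at the creator's explicit choice.

type Shareability = "snowflake" | "reshare" | "remix";

interface MeuMessage {
  id: string;
  authorId: string;
  shareability: Shareability;
}

// Resharing is allowed unless the creator chose "snowflake".
function canReshare(msg: MeuMessage): boolean {
  return msg.shareability !== "snowflake";
}

// Mutating (remixing) a friend's creation requires explicit opt-in.
function canRemix(msg: MeuMessage): boolean {
  return msg.shareability === "remix";
}
```

The point of modeling it this way is that the most restrictive option is a first-class state of the message itself, not a setting the platform can quietly override.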

Legal Actions to Protect Users’ Data:

  1. Typical user agreements give a company sole discretion to unilaterally modify the terms at any time. “Notice” is given in an obscure corner of the company’s website, and “consent” is assumed unless a user terminates her account. A user thus has no power beyond choosing whether or not to click “I Agree”, and in any event cannot even determine what she is agreeing to. To give our users full agency, we are working with our legal advisor Graham Pechenik to create user agreements that are clear about our obligations to protect our users’ data. Moreover, we are yielding power back to our users with provisions that are legally binding on us as well and will commit us to maintaining the same protections as we grow.
  2. We are going a step further by making sure these legal protections are also binding on our corporate successors. There are many stories of startup founders who ended up selling their company and, despite their best intentions to protect their users’ data, the acquiring companies didn’t follow suit. We are working on changes to our bylaws that will protect our users’ data even if the current co-founders are removed from the leadership of the company, or corporate control otherwise falls into different hands. While the cynical may think of this as a poison pill that diminishes the “value” of the company as an acquisition target, this very notion is exactly what we are trying to prove wrong. The idea that corporate value centers on the ability to auction off user data to the highest bidder underscores how “value” needs to be reexamined from the ground up. To us, value is not merely a tally of corporate profits, but a measure of our contributions to our users’ wellbeing and the wellness of society as a whole.
  3. Our recently filed provisional patent application, and our IP strategy in general, will also be used to support our ethos. Taking inspiration from Creative Commons licenses and the copyleft movement, we plan to use our IP as a means to guarantee that our technology will only be used by other companies who likewise obligate themselves to steadfastly protect their users’ data and are similarly committed to their users’ wellbeing.

Business choices:

https://wefunder.com/radix.motion
  1. We are crowdfunding to build the platform. This aligns the incentives of our investors and users, because many of them are one and the same. Our lead investor on Wefunder is Dr. Alan Jacobson, a clinical psychologist and the CEO of Italian Home for Children, who is helping us build human technology that supports our mental health.
  2. One of the main reasons we are crowdfunding is to hire a “Players’ Health and Wellness Officer”. We believe any company, especially one in the immersive/human interaction space, should have a person whose job is to use scientific research to promote users’ wellbeing and raise red flags if research shows that features in the product are unhealthy. I’ve been searching for someone at Facebook Reality Labs whose job that is. A company with as many resources as Facebook should have at least one person looking into questions like how long it is safe for us to wear the headsets they are building. In previous years I’ve given feedback to the Facebook Spaces design team about research showing that their avatar designs may promote anorexia and dysmorphia, and that the game mechanic of showing users they are blinking even when they aren’t may be unhealthy. As far as I know, this hasn’t led to any change in the design. I want the company we are building to be different, putting users’ wellbeing at the center of what we build.
  3. We will provide art grants and early access to our technology to creators and dancers in underserved communities to increase diversity.

Future thinking about harassment, hate and gender equality on the platform:

  1. What if, instead of banning users whenever they harass someone, which just isolates them further and leads them to open another profile where they lash out even more, social networks invested in engaging with such users through AI chatbots trained in non-violent communication? I’m not sure what the results of such an experiment would be, but I think it is one way to try to address some of the divisiveness and hatred that plague many social networks.
  2. Meu is specifically designed to promote prosocial interactions and build empathy using an embodied cognition approach focused on physical play. While I’m a big believer in free speech, we will remove messages promoting hate or violence in order to protect the safety of our community.
We won’t ban your tentacles… experimental avatar for Valentine’s Day
  3. No body part will be banned from the platform. The “female” nipple ban most social platforms adhere to is oppressive and stuck in binary gender thinking. As we open Meu’s avatar system to let users create their own avatars, we expect we’ll be seeing a lot of strange stuff. We will encourage users to tag their avatars with NSFW tags so others can filter what they do or don’t want to see (a minimal sketch of such tag-based filtering follows below).
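To show what viewer-side tag filtering could look like, here is a minimal sketch; the `Avatar` shape and `visibleAvatars` helper are hypothetical names assumed for illustration only.

```typescript
// Hypothetical sketch: creators tag their avatars, and each viewer
// chooses which tags to hide. Filtering happens on the viewer's
// side instead of the platform banning content outright.

interface Avatar {
  name: string;
  tags: string[]; // e.g. ["nsfw"]
}

// Return only the avatars that carry none of the viewer's hidden tags.
function visibleAvatars(avatars: Avatar[], hiddenTags: Set<string>): Avatar[] {
  return avatars.filter(
    (avatar) => !avatar.tags.some((tag) => hiddenTags.has(tag))
  );
}

// Example: a viewer who opted out of NSFW content sees only "owl".
const filtered = visibleAvatars(
  [
    { name: "owl", tags: [] },
    { name: "tentacle-heart", tags: ["nsfw"] },
  ],
  new Set(["nsfw"])
);
```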

I’m writing this in the hope of collaborating with other people who are building technology companies and products they would actually want their kids to use. I’d love to hear what metrics you are using behind the Silicon Valley mantra of “make the world a better place”. What am I missing? What else can I do?
