
Plugged In with Greta Van Susteren-Social Media: Who Decides?


[[GRETA]]

On Plugged In ---



Social media …

was supposed to …

bring us closer together.



But it can also be used …

to spread disinformation ...

driving us further apart.



[[SOT – SWISHER “They’ve got rules but they don’t enforce them all terrifically. You know they sometimes enforce them, they sometimes don’t.”]]



Governments …

around the world …

are considering …

new regulations …

on big technology.



[[SOT – YORK “Although I do think there are some reasonable restrictions to be put in place, I do think denying people access to these platforms really does great harm.”]]



Is there enough oversight?

Are civil liberties being violated?

Is your privacy protected?



Next on Plugged In …

Social Media: Who decides?



###



[[GRETA]]



Hello and welcome …

to Plugged In



I’m Greta Van Susteren …

reporting from Washington.



In 2020 more than …

three-and-a-half ...

billion people...

used some form …

of social media …



And by 2025 use is expected to grow …

to nearly four-and-a-half billion.



Social media is connecting people …

in ways never imagined.



And it is being used …

to distribute news and information …

relying on shared connections …

for circulation …

and validation.



###



[[GRETA]]



Social media companies …

are coming under …

increasing pressure …

from users and governments ...

to moderate the content …

that crosses their platforms …



The pressure includes checking facts …

and screening for inflammatory speech.



VOA Technology correspondent …

Michelle Quinn …

examines the debate …

here in the U.S. ...

over social media’s …

gatekeeping role.



[[QUINN PKG]]



((NARRATOR))

In the wake of the January 6 attack on the U.S. Capitol, there is renewed interest in examining the power of technology giants. Protesters reportedly used social media to plan. Facebook and Twitter kicked former President Donald Trump and others off their sites. Google, Apple and Amazon booted Parler, an app used by Trump supporters.

((Jerry Davis, Professor of Management, University of Michigan))
“The real source of power is being a gatekeeper, being an intermediary that everyone has to pass through. That's really the thing that we want to look at.”

((NARRATOR))

Scrutinizing so-called Big Tech is nothing new. Google, Facebook and Amazon are already facing antitrust investigations, and tech CEOs routinely testify in Congress. While many agree that tech has too much power, they differ on what to do about it. Break up the companies? Restrict their ability to collect user data and allow users to sue if their privacy is violated? Put laws in place to make the firms responsible if they are conduits for online falsehoods?

((India McKinney, Director of Federal Affairs, Electronic Frontier Foundation))

“The Republicans are upset at what they see as cancel culture or companies moderating conservative speech on their platforms, and Democrats tend to be more concerned that platforms aren't moderating enough and are leaving up too much hate speech or too much harassment.”

((NARRATOR))

When it comes to online misinformation, something both parties say is a problem, creating new rules can be difficult, particularly in the U.S., where the First Amendment protects speech.

((India McKinney, Director of Federal Affairs, Electronic Frontier Foundation))

“How do you write a piece of legislation that makes illegal misinformation but protects satire and parody and comedy and commentary and hyperbole and like all of these other things that we use for entertainment and that are important parts of political discussion and discourse?”

((NARRATOR))

What’s needed, say some observers, is a fresh look at Big Tech’s power, including how the technology itself works in uniting or dividing people.

((Nicol Turner Lee, Brookings Institution))
“I think a national conversation on this … what are the rules of the road, what is the appropriate conduct, how do we keep people safe. That’s really a turning point.”


((NARRATOR))

Over the coming months, more users and governments worldwide will debate Big Tech’s role as a powerful gatekeeper for society.

((Michelle Quinn, VOA NEWS))



[[GRETA]]



THE U.S. CONGRESS...

HAS BEGUN …

A SERIES OF HEARINGS …

ABOUT THE SPREAD ...

OF MISINFORMATION ...

AND COMPETITION ...

IN THE DIGITAL ECONOMY.



KARA SWISHER IS AN OPINION WRITER …

FOR THE NEW YORK TIMES …

HOSTING THE PODCAST “SWAY.”



SHE IS A NOTED AUTHORITY …

ON BIG TECH …

AND THE RISE OF SOCIAL MEDIA …

AND HAS REPORTED ON TECHNOLOGY …

SINCE 1994.



WE TALKED ABOUT ...

THE EVOLUTION OF SOCIAL MEDIA …

AND THE ISSUES OF …

FREEDOM OF SPEECH ...

AND CONTENT MODERATION.



[[SWISHER INTERVIEW PART 1]]



####



GVS: Okay, Kara, how long have you been reporting on the internet and social media?

KS: Oh, social media came a little later, but the internet since the early 1990s, so 30 years.

GVS: And what got your attention about social media when it came along?

KS: Well, I think, you know, every bit of technology has a different iteration, every time something changes, and social media was sort of the natural extension from everybody getting on the internet. And so what caught my attention was the ability to actually talk and reach out to people in a much more significant way. Now, this had happened early in the internet with AOL, if you remember their chat rooms and different things, so social media isn't a new, fresh idea, but it certainly took a huge leap forward. There were companies like Friendster and then Facebook and many others, and then it just moved into a much more integral part of people's lives, like a utility almost.

GVS: So what are the big ones? Facebook, Twitter, Instagram and YouTube, are those sort of the big ones?

KS: Yeah, well, YouTube is a video network, but it's a social network in a lot of ways. But you know, there's Reddit, there's Snapchat, there's TikTok, and they all take it from a different angle. Snapchat is more communications, TikTok is more media, Facebook is more community, good and bad, Twitter is more instant news observation, Reddit is much deeper discussions. So they all take a different slice of the pie, essentially.

GVS: In looking at Twitter and Facebook, are they more like news organizations that publish, or are they like a phone company, sort of connecting people?

KS: Well, that's the problem, we don't quite know what they are. They're calling themselves "platishers," if you want to have a weird, horrible English word, which is a publisher and a platform. They tend to say platform when they don't want to take responsibility for things; they never say publisher, because they don't want to take responsibility for the media that's on their platform. But in fact they're kind of a new kind of media company, in my estimation.

GVS: So do they have responsibility for the content that's on their platform?

KS: Well, no, they don't, because of a thing called Section 230, which was passed a long time ago, also in the 1990s, and which protects them and gives them broad immunity for third-party content that's published on their platform. So no, right now they do not have liability, except in certain cases having to do with sex trafficking and pedophilia and things like that.

GVS: When you talk about Section 230, that's a U.S. law. Does that impact whether or not they're vulnerable to lawsuits or responsible for content around the world?

KS: No, it does not. They're not protected from a lot of things around the world. You know, the First Amendment doesn't exist around the world. So in Germany they have to behave quite differently than they do here, right, around Nazi symbolism, things like that; they have to take it down pretty quickly. So they act much more like a publisher than anything else in those places.

GVS: What's the impact of Twitter and Facebook, those are the main ones I'm talking about, on international politics and on domestic U.S. politics?

KS: Huge. I mean, Twitter has been the way Donald Trump communicated: he campaigned on it, he made political decrees, he attacked enemies. That was his mode of communication with most of his followers and most reporters, everybody else. You know, Facebook is more a place where people gather, and the Trump campaign, for example, used it for a lot of political advertising and political targeting of content and things like that. Abroad it's used in a variety of ways by a variety of people. Sometimes, you know, they use it to abuse people; sometimes they use it to create campaigns and different things. It just depends on the country.

GVS: Well, if you look at Myanmar, where there was recently a political coup, they've taken down the internet, they have taken people off Facebook and Twitter. And you see parts of India right now, where there's a protest about farmers, as another problem. I mean, how do Twitter and Facebook handle these international problems?

KS: Well, not well. I mean, I think that's the issue, these are so complex. In Myanmar that's happened several times; there were a lot of incidents of Facebook doing a bad job monitoring its platform, and then riots and deaths resulting. That was several years ago. In this case, the governments really want to control the flow of information, and most people there get their news from Facebook; it's some number in the 90s of how many people get their daily news from Facebook. It's pretty high in the United States, too. And so they definitely want to control the flow of information, the ability to organize and things like that. In other cases, like in the Philippines, the government uses it to put out false information about its opponents. So it can be used, you know, both in ways that are good for dictators and bad for dictators, and so they want to control it.

GVS: But social media can do things like promote democracy. I mean, it gets cut out of certain places if a country like Myanmar doesn't want people gathering to protest, for instance, a military coup. So it has had value in terms of promoting democracy around the world.

KS: Yeah, so was the fax machine, right? That's what happened in China many, many years ago; they used the fax machine to reach people. It can be used just like anything. Brad Smith, who's the president of Microsoft, says these digital tools, digital technologies, are either a tool or a weapon: you can use them as a tool to create democracy, or you can use them as a weapon to kill democracy. I think, Greta, the problem is human beings, that's really the situation. We tend to take these tools and use them often for good, and believe me, when I started covering the internet, I had that feeling, like here's a way to unite the world, in sort of the Star Trek vision. And for those who are big sci-fi fans, and I think a lot of tech people are, I use this analogy because I think it's easy for them to understand: you're either a Star Trek or a Star Wars person. In Star Trek you have great hope for all the technology, and you go out and meet new people, and even the villains are going to change their minds through good, smart debate, bringing goodness and democracy and diversity to the universe. And then there's Star Wars, where even the heroes are flawed and the villains win a lot, and even when you win, you can lose, and people die. And so it's a darker vision of the future, where even when they win the sword fight, it's never over. And so that's really, I think, probably the real world, but I like to live in a Star Trek universe.



[[GRETA]]



FOR A GROWING PART …

OF VOA’S AUDIENCE …

SOCIAL MEDIA …

IS THE PRIMARY WAY …

OR ONLY WAY …

TO REACH THEM.



[[FS]]



ALL 49 OF VOA’S …

LANGUAGE SERVICES …

USE A COMBINATION …

OF SOCIAL MEDIA PLATFORMS.



FACEBOOK IS THE STRONGEST …

OF THE SOCIAL MEDIA PLATFORMS …

FOR VOA TO REACH AUDIENCES ...

STREAMING LIVE REPORTS …

AND PROGRAMS THERE.



IN 2020 …

VOA’S DIGITAL …

AND SOCIAL MEDIA EFFORTS …

EXCEEDED ITS GOALS ...

REACHING MORE THAN …

10-MILLION PEOPLE …

EACH WEEK.



###



[[GRETA]]



SOON AFTER...

THE JANUARY 6TH ATTACK …

AT THE U.S. CAPITOL ...

SOCIAL MEDIA GIANTS ...

TWITTER AND FACEBOOK ...

SUSPENDED AND THEN BANNED ...

PRESIDENT DONALD TRUMP...

FROM THEIR PLATFORMS.



A SHORT TIME LATER …

TWITTER MADE ITS BAN …

OF THE FORMER PRESIDENT …

PERMANENT.



FACEBOOK IS CONSIDERING …

REVERSING ITS DECISION.



BUT THAT CALL WILL BE MADE ...

BY THE COMPANY’S ...

INDEPENDENT OVERSIGHT BOARD.



VOA’S TINA TRINH

HAS MORE ON THIS PROCESS.



[[TRINH PKG]]



####



Now that former President Donald Trump has been acquitted at an impeachment trial in the U.S. Senate, he faces another court of sorts – the Facebook Oversight Board. It will decide whether the internet company’s decision to indefinitely block the former president from Facebook and Instagram was the right call.

((Endy Bayuni, Facebook Oversight Board Member))

“In a way, it’s an experiment and I think many people believe that this is the way forward because the alternative is for the government to be making the rules, or Facebook, Mark Zuckerberg, to be making those decisions.”

((NARRATOR))

Endy Bayuni, an Indonesian journalist, is one of 20 members of the oversight board.

((Endy Bayuni, Facebook Oversight Board Member))

“We take into account, of course, Facebook's standards, Facebook values, and international human rights laws.”

((NARRATOR))

Members recently issued their first rulings, and in four out of five cases overturned Facebook’s decisions to remove content. Facebook complied with the board’s decisions and reinstated the posts.

After Trump’s appearance at a January 6 rally, Facebook referred its decision to ban Trump to the board for review and also asked for a review of how it should handle speech by world leaders.

Some are asking if the board itself makes sense.

((Lilian Edwards, Newcastle University Professor))

“This is sort of like Walmart or Aston Martin or Jaguar deciding to have a court, right? Which is really very strange.”

((NARRATOR))

The social media giant is acting as if it were a sovereign nation with its own court structure, says law professor Lilian Edwards.

((Lilian Edwards, Professor, Newcastle University))

“The more people discuss what the board does, the less they discuss whether there should be a board at all.”

((NARRATOR))

Others blame an absence of government regulation, which leaves a vacuum that tech companies can easily fill.

((Marietje Schaake, International Policy Director, Stanford Cyber Policy Center))

“It doesn't solve the structural problems of the business models, of the algorithmic amplification of hatred that I think have to be addressed.”

((NARRATOR))

Board member Bayuni says the goal is not to replace government regulations.

((Endy Bayuni, Member, Facebook Oversight Board))

“We don't see ourselves as a substitute to the laws in the different countries. That's for them to decide. But we are here to help in content moderation.”

((NARRATOR))

The board is accepting public comment on Facebook’s Trump ban and expects to make a decision as soon as possible.

((Tina Trinh, VOA News, New York))



[[GRETA]]



AS FACEBOOK LOOKS...

TO REVOLUTIONIZE...

ITS APPROACH...

TO CONTENT MODERATION...

AND POLICY DECISION MAKING...


THERE ARE QUESTIONS ABOUT HOW...

FACEBOOK’S NEW POLICY...

MIGHT CAUSE A RIPPLE EFFECT...

FOR OTHER PLATFORMS.


IN PART 2 OF MY DISCUSSION...

WITH KARA SWISHER...

WE TALKED ABOUT...

WHAT THE FUTURE...

OF SOCIAL MEDIA...

WILL LOOK LIKE.



[[SWISHER INTERVIEW PART 2]]



####



KS: Initially they have taken Donald Trump off of Facebook right now. So what they did is they elevated the issue to a new thing: an independent group, allegedly independent, a lot of people question that, but I believe it is independent, though it's funded by Facebook. It's made up of, right now I think it's 20 people, it's supposed to be 40 at some point, people from across the world, and they are going to take up this case of banning Donald Trump permanently, or not, from Facebook, and this international group will decide. There's a variety of people; one was the former prime minister of Denmark, for example, and all kinds of legal scholars and different people, different walks of life, different parts of the world, and they all decide

Donald Trump's fate, because they elevated this one question. But there's no doubt he broke the rules of these platforms, over and over and over again. The question is whether, in the case of Twitter, he's off completely for the rest of his life; in the case of Facebook, we have to wait and see what this Oversight Board decides.



GVS: But it's sort of interesting. I can also add a situation where I was in a Rohingya refugee camp in Bangladesh and took a picture of some adults, and unbeknownst to me, because I wasn't paying attention, there was a child, probably under two, who was naked. It was something you could see in the old days in National Geographic, you know; there was nothing particularly pornographic about it. You couldn't really see much of the child, but I never saw it. I put it up and it was taken down as objectionable, and I protested it because I didn't want to be seen as putting, you know, naked children on the internet. I never saw it in the picture, and it was not an objectionable picture. It's news.



KS: Yeah, Facebook has been struggling, not Twitter, I think it was Facebook, right. It has been struggling with this; they had issues around breast cancer pictures, and a while ago, I forget how many years ago, over the My Lai pictures. There are all kinds of pictures; there's a famous picture of the girl running from napalm, do you remember that one, where she's naked. They took it down initially and then they put it back up. And so what probably happens is a lot of this stuff is algorithmic and it naturally pulls it down, which it should, to try to protect against pornography or child pornography or things like that. And then when they review it, they tend to put things back up, most of the time, when it's legitimate news pictures. But their systems, especially around nudity, spend a lot of time focused on nipples, for example, and a lot of people think that's not where they should be focusing, instead of on the issue of a much more robust content moderation system. They've tried all kinds of ways to do it, algorithmically and with people. And there's some very good reporting on what happens when they have content moderators: they tend to go crazy after doing a lot of this moderation, because some of it is so vile, whether it's abuse or conspiracy theories or all kinds of other things that happen.



GVS: There must be billions of posts that they look at every day on Twitter and on Facebook, trying to determine whether or not something is inappropriate, and that's a squishy standard.



KS: It is, and that's the problem: they've got rules, but they don't enforce them all terrifically. You know, they sometimes enforce them, they sometimes don't, and they make mistakes, like anybody would. The volume of content coming over these platforms is so vast, I think it'd be hard not to make mistakes. But I do think that they haven't put enough guardrails in place at all, in order to stop the mistakes in the first place or to deal with them. That's because it's really expensive to moderate all this stuff, right, and so they tend to not want to moderate it at all, to let it go and pick up the mess afterwards, rather than create a system that's safer.



GVS: A lot of people on the conservative part of the political spectrum got upset with Twitter, especially after President Trump was banned permanently from it, and went on to a social platform called Parler. What is Parler?



KS: Well, it's back up again, but what was it? It was a typical, it was like a Facebook-Twitter kind of amalgamation where people posted things, so it would look very familiar to anyone who uses Facebook or Twitter, in some ways. And so people were posting all kinds of things, and many people felt a lot of the organization happened there, along with on Facebook, not just Parler. And I actually did an interview with the CEO that got him in a bit of trouble, where he said he didn't really take responsibility for anything on the platform; well then, the vendors he relies on do take responsibility. And so Apple and Google banned the app, and then Amazon said, "We're not going to host your service anymore, because you refuse to fix the moderation issues." And so somewhere down the line there's someone who just doesn't want to proceed with certain platforms. In the case of other conservatives complaining about conservative bias, there's no proof of it, and they keep saying it. Some people say things, Greta. But like a lot of things, it's not true. There's been no evidence of this, they continue to say it, there's study after study not showing it, but they believe it. So, you know, I don't know what to say about that.



GVS: There's a new app called Clubhouse that people are just beginning to join. What is it, and is that social media?



KS: It is, in a way. It's sort of social media, but social media through audio. It's a little bit more like LinkedIn, a little bit; you could say it's a little like a business conference, or it could be a class at a university, it could be a dinner party or a bar. It's kind of got a weird thing where you go in and there are all these rooms that get created, some by the people on the service, some by people just doing it, and you become the host of a room and then discuss a certain topic, but the host gets to limit who gets to talk or not. And then there are discussions, and it could be a lecture, it could be an interview, it could be a wide-ranging discussion among everybody. And so it's an audio version of a social network, and it truly underscores "social," because that's what people are doing: socializing or being entertained by other people.



GVS: And is it self-moderating? Does it moderate itself because people can toss people out of the room?



KS: The host can. If the host is offensive, I don't know what you do. I mean, you could make anything on these services; they say they don't want hate speech, but how can you stop it, really? It's the same thing: Facebook says it doesn't want hate speech, but you know a lot of hate speech is on Facebook. And so the question is, can they moderate audio? That's much harder to do. You know, YouTube faces a much harder challenge than a Facebook or Twitter, because it's easier to monitor text algorithmically than video, and audio presents another quantum level of difficulty. So it could quickly degenerate into some bad things. It could also be some good things. We'll see if people continue to need to do this after the pandemic's over, but it's certainly tailor-made for a pandemic situation where people are stuck at home and want some socializing.



GVS: All right, let me go back to where I started, where I asked how long you've been covering this, and I asked you that because I think you've been covering it more deeply than anyone else I know. Has your opinion about the value of social media changed from when you first started covering it to now?



KS: I was worried about it from the beginning, because I saw what happened on the internet, you know what I mean? And then this was a quantum level of more ability to spy on people, to collect data. I always had an issue with that, that people were uploading all kinds of private information, all kinds of personal information, to these services and getting very little in return, except for a chat or a date or a map. And so I've always been worried about the data collection around them, and the kind of information people share as they live their lives online, and whether these companies are fully protecting your privacy. The second thing is, I don't think they care about the consequences of these tools. You know, you can look at nuclear energy in a good way, or you can look at it like a bomb, right? There are just lots of different ways, and I think they don't take enough care to deal with the misinformation implications, with the ability of people to cheat and to lie and to steal on these platforms. And they don't do enough about it, and I'm not sure they can, given the amount of information, but they certainly don't seem to care enough to think about the consequences before they make things. And so I've always had an issue with that.



GVS: All right, now look into your crystal ball. Ten years from now, we're having a conversation about social media. What do you think the conversation is going to be?



KS: I think that's going to happen a lot. I think eventually you'll be plugged in all the time, with something in your ear, so you'll always be connected. You'll be going somewhere wearing a pair of glasses, or there'll be some element where, say you're in Paris and you look at the Eiffel Tower, it will tell you everything about the Eiffel Tower while you're standing there, without a book, without a phone, not staring down, you'll be looking up. And so I think that's probably the way it's going to go, and hopefully we'll be able to sort out some of these terrible problems that we've had, and the lies. But it's a medium designed for lies and propaganda. And every medium can be abused that way, but this one really can reach a million different people with a million different lies, and it gives great power to do great damage and also great good.



GVS: Kara, thank you very much. Always nice to see you.



KS: Thank you. All right. Thanks Greta.



[[GRETA]]



Australia is taking …

a step toward regulating …

social media …



requiring tech giants …

to pay publishers …

to have their content …

shared on their platforms.



And after a five-day …

Facebook ban on news stories …

for Australian users …

Facebook has reached …

an agreement with Australia …

to pay news providers …

for news shared with Australian users.



Google has struck deals …

with several news providers …

to include their content …

in its news feed.



More government regulation …

of social media …

is under consideration …

in many other countries.



Jillian York …

is the director for …

international freedom of expression …

at the Electronic Frontier Foundation …

a non-profit organization …

that defends civil liberties …

in the digital world.



We talked about …

the challenges of moderating …

speech on social media.



[[YORK INTERVIEW]]



JY: What makes moderating speech so difficult is the sheer scale of it. Back when social media platforms started out 10, 15 years ago, they had a much smaller number of people, so it was easier to moderate speech using human moderators. As time has gone on, we've seen a lot more automation brought into the picture, and frankly, automation just doesn't do this as well as a human touch. And yet at the same time, you know, human moderation requires quite a bit of attention, and it's simply too costly for companies to get this right.

GVS: Where's the line, if there is even a way to describe it, between government censoring, and you might take the example of the heavy hand by India against Twitter, and social media content moderation: questions of decency, or what's appropriate, or what might be inflammatory or inciting violence. I mean, where is the line?

JY: The line is often really blurry. I mean, you know, on the one hand, you've got democratic governments that put laws in place, such as Holocaust denial being illegal in Germany, and companies are going to want to comply with that to keep operating there. On the other hand, a lot of the rules that they put in place are really going to be the ideologies, or ideals, of a given executive. So we've seen this week, from a report, that Mark Zuckerberg's political beliefs have played into the way that he moderates speech on his platform. That's really, you know, one of the more problematic aspects of this: the fact that a lot of these decisions are really not up to a democratic process, but rather up to, you know, the whims of a handful of executives at a company.

GVS: Well, these are big companies; they have a lot of money. So I mean, there's not a shortage of money for these companies?

JY: That's true. A lot of these companies would rather invest in, you know, acquisitions, invest in their engineering teams, their PR teams, whereas content moderators are often some of the lowest-paid workers. Some of them are employed through third-party firms in places like the Philippines and the southwest of the U.S. They're simply not given the care that they deserve.

GVS: Do Twitter and Facebook pretty much use the same methodology for moderating content or is it different?

JY: It's a little bit different. Facebook is a much bigger company with a lot more money, and they do have offices around the world where their content moderators work. They also employ a lot more automation from the very beginning, where the speech may not even pass through human eyes, whereas Twitter keeps most of their moderation in-house, and there is much more of a human touch to it. They're also less likely to take things down entirely, particularly at governments' requests, but rather use something called geo-locational blocking to make sure that people in a given country cannot see the content if a government doesn't want them to.

GVS: Well, it must take some degree of sophistication to decide what might be inappropriate, for instance on Facebook, which was criticized heavily for inciting violence against the Rohingya because people used it as a forum. So how do you moderate against something like that? Who determines the level of sophistication, and what gets taken out and what doesn't?

JY: Yeah, I mean, it's a really difficult thing to get right, but a lot of these companies aren't really trying that hard. When it comes to Facebook, what we saw in Myanmar was really a lack of attention, a lack of local expertise being brought in, and also a lack of individuals moderating content in the Burmese language. Back when Facebook was first getting reports that genocide was happening there, they only had something like eight or ten moderators who had expertise in the given language.

GVS: What about the mixed standards for different countries? For instance, in India recently there was a fight between India and Twitter: the government demanded that Twitter block a number of accounts, but here in the United States, because we have a First Amendment right, something like that might not have happened. Along the same lines is the protest over the Indian farmers. So how do they reconcile the different standards in the different countries?

JY: Yeah, these companies are absolutely subject to local jurisdiction, especially when they have boots on the ground, so to speak, or individuals in a country. India is one of those countries; it's obviously, you know, a pretty profitable country for a lot of these platforms. And so they do have to comply with local law, whereas in the United States, of course, it's very unlikely that law enforcement or the executive branch of the government would even demand that speech be taken down, although it's not unheard of. And so when it comes to India, it's really difficult for these companies to refuse an order, because it means that they'll probably get blocked or kicked out of the country.

GVS: Should the government, any government around the world have the right or the force to shut down or ban social media or control social media?

JY: I don't think that they should. Although I do think that there are some reasonable restrictions to be put in place, I think that denying people access to these platforms really does great harm. You know, it is of course governments that use these platforms, but it's also the people, and we've seen over the past decade and a half the ways in which these platforms can provide a powerful space for community engagement, for protest and for organizing.

GVS: What do you see as the future?

JY: I think that we have to put the power back in the hands of users in order to have a different future than the one that we're headed towards. So what I would like to see is a much more democratic process being put into place, where users and people around the world can have more of a say in the way that these platforms govern their spaces.

GVS: Jillian, thank you very much for joining me.

JY: Thank you for having me.

[[GRETA]]



That’s all the time …

we have for now.



My thanks to Kara Swisher …

and Jillian York.



Check VOANews.com …

For the latest news updates.



And yes, I am on social media.



Follow me on Twitter @greta.



Thank you …

for being Plugged In.



###
