Aug 30, 2024
Telegram CEO Arrested For Allowing Illegal Activity On His App
Telegram founder Pavel Durov has been charged with multiple crimes.
- 17 minutes
Pavel Durov, the Russian-born entrepreneur
and founder of Telegram,
the hugely popular messaging app
with over 900 million users,
was arrested in France, with a wide range
of crimes alleged, all
essentially falling under his failure
to prevent illicit activity on the app.
[00:00:20]
This is incredibly noteworthy,
as it is a very rare instance of
authorities trying to hold a top executive
accountable and personally liable for the
behavior of users on a tech platform.
Mr. Durov, 39 years old, was detained
by the French authorities on Saturday
[00:00:37]
after a flight from Azerbaijan.
He was charged on Wednesday with
complicity in managing an online platform
to enable illegal transactions
by an organized group, which could lead to
a sentence of up to ten years in prison.
He was also charged with complicity in
crimes such as enabling the distribution
[00:00:53]
of child sexual abuse material,
drug trafficking and fraud, and refusing
to cooperate with law enforcement.
He was ordered to pay bail
equal to about $5 million,
has to check in with the police
twice a week and cannot leave France.
[00:01:09]
Telegram has played a role
in multiple criminal cases in France
tied to child sexual abuse,
drug trafficking and online hate crimes,
but has shown a near-total absence
of response to requests for cooperation
from law enforcement.
The Paris prosecutor said
the case has turned up the heat
[00:01:26]
on a long-running debate about free speech online,
and whether companies themselves
should be policing what their users say
and do on their tech platforms.
Governments, especially in the UK
and the EU, are putting increased pressure
on these companies to do something about
very important categories, of course,
[00:01:44]
of child safety, terrorism, disinformation
and other harmful content, which often
spreads like wildfire on these apps.
After the arrest, the Dubai-based company said
that it follows all EU laws
and that, quote, it is absurd to claim
that a platform or its owner are
[00:02:03]
responsible for abuse of that platform.
Mr. Durov now joins a small list
of high-ranking tech figures who have
been indicted in connection with crimes
committed by users of their services,
including Ross Ulbricht,
the creator of the Silk Road
online black market, and Changpeng Zhao,
[00:02:19]
the founder of Binance,
who pleaded guilty last year
to US money laundering violations that
took place on his cryptocurrency platform,
according to a New York Times article.
Daphne Keller, professor of internet law
at Stanford Law School,
said Mr. Durov and Telegram
were conspicuously different from other
[00:02:37]
major platforms such as Meta and Google,
which have more robust
safety and trust teams
that take down illegal content
and respond to law enforcement requests.
Quote, I continue to assume the reason
they can indict is because Telegram
[00:02:52]
forfeited their immunity by not taking
down things they were notified about.
If that's true, the indictment seems
like a not surprising next step.
In Mr. Durov's case,
Telegram did not answer a request
from the French authorities to identify
one of its users in an investigation
[00:03:08]
into child sexual abuse materials,
a person with knowledge
of the matter said.
The article continued.
Telegram,
which Mr. Durov founded in Russia in 2013,
has more than 900 million users.
It works as a messaging app similar to
WhatsApp or iMessage, but also hosts
groups with up to 200,000 users and offers
channels with broadcasting features
[00:03:27]
to help reach even larger audiences.
Light oversight of content on the platform
has, on the positive side, helped people
living under authoritarian governments
to communicate, but has also
made the app a haven for harmful content.
Durov has infused
Telegram with an anti-authority ethos
[00:03:45]
and commitment to free speech.
He said his worldview was informed
by his experience in Russia,
and he came to believe strongly that
governments should put few restrictions
on people's online speech and actions,
and that digital privacy trumped security.
Quote, privacy ultimately is
more important than our fear
[00:04:02]
of bad things happening, he said in 2015.
This could, however, also serve
as precedent for authoritarian regimes
to try to restrict or imprison executives,
even for content or expression
that they do not like.
[00:04:18]
So it's a really thorny issue.
It's a crazy precedent.
- I'm curious to hear your guys' thoughts.
- Hey, don't scroll away.
Come back, come back.
Because before the video continues,
we just want to urge you
to lend your support to TYT.
You power our honest reporting.
You do it at tyt.com/team
and we love you for it.
[00:04:35]
Super hard issue.
So let me give you the spectrum here.
So you see why it's so hard.
So if someone sneaks child porn onto a
platform, whether it's Twitter or Telegram
or whatever it is, and you say, well,
they did not take it down quickly enough,
efficiently enough, etc., I'd say:
[00:04:53]
are you going to arrest somebody over that?
No. That's crazy.
They didn't put the child porn up
and they don't want the child
porn up on their platform.
Yeah, sometimes it's super hard
to run a business.
In Elon Musk's case, when you
fire two thirds of your employees.
Right.
And okay, so that was a mistake,
or that was a thing that they didn't want.
[00:05:11]
And it's not like the founder
of the company or the CEO of the company,
like was advocating for it
or trying to make it.
No, if you arrest somebody over that,
that's nuts, because then you can't
run any business online.
Right. Because, guys,
if you don't know this, there's always
people who are trying to put this kind
[00:05:30]
of crazy material on every platform,
on different websites, 24/7.
There's like a whole industry of this.
So when you start anything online,
you have to like figure out,
how do I block the trolls?
Dragons.
Monsters like dragons
is a bad thing to use here because.
[00:05:47]
But anyway.
But like all these ghouls and goblins,
I mean, from coming on.
Now, if you don't do that perfectly well
and they arrest you for it,
I think that's nuts.
On the other hand, if the platform knows
oh, there's a ring on here that's doing
[00:06:03]
child porn, money laundering, a string of
robberies, a string of rapes, child rape.
Oh, my God.
And you've got evidence and a warrant,
and a country comes and says,
we know there is illegal activity going on,
and somebody is going to get harmed.
[00:06:18]
Somebody is being harmed right now,
and they go, whatever,
I'm not going to do anything about it.
Okay. Well, that's a different issue.
You see the range of possibilities here.
And so when you get to issues
in the middle, it gets really hard.
And there's one other complexity here.
And I'm going to give you
a curious analogy here.
[00:06:35]
But I think you'll see why I'm doing this.
So I'm watching Suits with my kids
and I'm explaining the law to them
as we go along.
It's a fun show,
but it's heavy on legalese, right?
And I happen to go to law school
and I'm a lawyer, etc.,
and so I can at least explain the basics.
[00:06:51]
And so what I explained to them is lawyers
have a responsibility to their clients.
And that responsibility is so sacred in
that profession that in that context, we
put different burdens and responsibilities
on them than we would an average person.
[00:07:09]
So if you know your client did something
wrong, there's a range of things there.
If you know they're about
to commit a crime,
you still have to turn them in, right?
But if you know they did something
wrong in the past, a normal person
would have to turn them in,
but their lawyer can't and shouldn't.
It's a different moral code
for a good reason,
[00:07:26]
which is we need you to advocate for your
client in this system unreservedly.
Right.
And we're superseding other moral factors
we care about to make that possible.
So when you're running a platform
like this, there's a really interesting
argument to have over: is privacy on these
platforms so important, like serving
[00:07:47]
your client is if you're a lawyer,
that it supersedes
other moral factors involved?
Again, I don't have that answer
because it's a super hard question,
but it's one that we're going
to have to decide as a society here.
So how this applies to this particular guy
depends on the details of the case.
[00:08:07]
And right now we don't know
the details well enough.
Are they on the end of the spectrum
where they're like,
oh, there's bad things on there.
Let's just arrest the guy.
Crazy. I would be 100% against that.
Or is it on the other side, where they go,
no, no, we have evidence of lawbreaking
happening right now.
[00:08:22]
We're trying to stop it
and they won't let us.
- And we'll get
a few more details in a second.
But I'm curious to hear
what his thoughts are before we get
to a little bit more of the nuance.
This is a really interesting case.
As Jake just mentioned,
there's a couple of things.
One, Telegram has gone out of their way,
and I think quite dishonestly,
[00:08:41]
in marketing themselves
to the people who use the app as the most
encrypted or private of all of them,
and it's just not true.
In terms of which is the most
privacy-forward of the messaging apps,
you can take, for example, iMessage,
where, you know, you can only do about 32
[00:09:02]
people at a time in a group message.
And there's a reason for that,
because Apple's just like, we're encrypting
every single message end to end.
When the feds come to us,
our stuff is so encrypted, we literally
can't even go in there and tell you
what people are texting to each other.
[00:09:18]
However, the FBI has figured out
how to hijack iPhones,
so it's a moot point, right?
Like when the feds want
to get into a phone, they can.
So it kind of doesn't really matter.
WhatsApp can have up to 1,000 people
in a group chat, and it's also
[00:09:34]
end-to-end encrypted, but it's only 1,000.
How Telegram got popular is because they
do have these 200,000-person chats,
and they also have what's called
like these channels where as many people
as they want can kind of subscribe
and get a news feed, or almost like a blog
[00:09:52]
from the Telegram people.
But Telegram, again, is the most loosely
encrypted, meaning all of the information
that happens on Telegram is sitting on
a server that they can go in and look at.
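(A minimal sketch of that difference, in Python with the third-party `cryptography` package: this is not Telegram's actual MTProto protocol or Signal's protocol, just an illustration of who holds the decryption key in the "cloud chat" model versus the end-to-end model. In real end-to-end apps the two endpoints derive a shared key together; a single symmetric key stands in for that here.)

```python
# Minimal sketch: who can read the message depends on who holds the key.
# Requires the `cryptography` package (pip install cryptography).
from cryptography.fernet import Fernet

plaintext = b"meet at 6pm"

# "Cloud chat" model: the provider encrypts data at rest but also holds the
# key, so it can decrypt the message when asked (e.g. under a court order).
server_key = Fernet.generate_key()          # key lives on the provider's servers
stored = Fernet(server_key).encrypt(plaintext)
print(Fernet(server_key).decrypt(stored))   # the provider can read it back

# End-to-end model: only the recipient's device holds the key; the server
# just relays ciphertext it has no way to open.
device_key = Fernet.generate_key()          # key lives only on the user's phone
relayed = Fernet(device_key).encrypt(plaintext)
# The server stores and forwards `relayed` but never has `device_key`, so it
# cannot produce the plaintext even if it wants to cooperate.
print(Fernet(device_key).decrypt(relayed))  # only the endpoint can read it
```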
And so the idea that the French
authorities are coming to these guys,
[00:10:10]
like bro, we found the freaking, you know,
pedophiles and money smugglers and all of
these people, this is what they're doing.
We want the freaking information.
We know you have access to it,
so can you give it to us?
And they're just like,
nah, can't help you.
[00:10:26]
It's insane.
And lastly, what I will say about,
you know, Facebook, Instagram, WhatsApp,
all of these people, trust me, in America,
they play ball with our government's
law enforcement agencies, period.
Yeah, I mean, they often do.
Sometimes they say no to that.
[00:10:43]
But you know, these things always have,
you know, pros and cons because people
use it for illicit activities, for sure.
But I'm also in a few Telegram groups
that keep me up to date on things breaking
on the ground in the Israel-Hamas war.
[00:10:59]
I mean, a group that talks
about atrocities and difficult things
happening to Israelis.
That group's got like 20,000 people.
And I'm in a group called Gaza Now in
English that talks about horrible things
happening to the Gazan people
that has almost 200,000 people.
[00:11:14]
And there's another group that I can't
read, but it's Gaza Now in Arabic,
and it's got 2 million people.
They're broadcasting out constantly.
Those are not encrypted.
So they can also just get
that information themselves
by joining those groups;
I was able to join with no problem.
[00:11:31]
But it does bring up the details
of encryption, and it really puts
encryption in the spotlight here.
French authorities said Telegram
had provided cryptology services
aimed at ensuring confidentiality
without a license.
Encryption has been a long-running
point of friction between governments
[00:11:47]
and tech companies around the world.
For years, tech companies have argued
that encrypted messaging is crucial
to maintain people's digital privacy.
While law enforcement and governments have
said that the technology enables illicit
behaviors by hiding illegal activity, U.S.
tech companies, including some of the ones
that were mentioned, Signal,
[00:12:05]
Apple and Meta's WhatsApp, all give
their users end-to-end encryption.
These apps are encrypted by default
as well, which means users' conversations
are immediately private
at the moment they start chatting,
whereas Telegram handles
encryption differently
and is a lot less encrypted.
[00:12:21]
You have to opt in via a pretty
hard-to-find setting, and it is only
offered, again, for one-on-one
communication, not for large groups.
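(For the curious, here is a rough sketch of the key-agreement idea that makes default end-to-end encryption possible, again using the third-party `cryptography` package. It is simplified; apps like Signal and WhatsApp layer the double ratchet and other machinery on top, but the core point is that the relay server only ever sees public keys.)

```python
# Sketch of Diffie-Hellman-style key agreement: the server relays only public
# keys, yet both endpoints end up with the same secret chat key.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

def derive_chat_key(shared_secret: bytes) -> bytes:
    """Stretch the raw shared secret into a 32-byte message key."""
    return HKDF(algorithm=hashes.SHA256(), length=32,
                salt=None, info=b"chat-key").derive(shared_secret)

# Each device generates its own key pair; only the public halves ever pass
# through the provider's servers.
alice_private = X25519PrivateKey.generate()
bob_private = X25519PrivateKey.generate()

# Each side combines its own private key with the other's public key and
# arrives at the same shared secret, which the relay server never sees.
alice_secret = alice_private.exchange(bob_private.public_key())
bob_secret = bob_private.exchange(alice_private.public_key())

assert derive_chat_key(alice_secret) == derive_chat_key(bob_secret)
print("Both endpoints derived the same key; the server never had it.")
```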
The debate has grown more heated
as encrypted messaging apps
have become mainstream.
Signal has grown by tens of millions
of users since its founding in 2018.
[00:12:37]
Apple's iMessage is installed
on the hundreds of millions of iPhones
the tech company sells each year.
WhatsApp is used by more than 2 billion
people globally. And one example of them
not turning things over: in 2020,
Apple pushed back
against an FBI request
that the company break its encryption
in order to access data on two iPhones
[00:12:54]
that belonged to a gunman who opened fire
at a naval base in Florida.
I didn't even know until researching
this story that Apple's iMessage
is encrypted end to end.
I did not know that.
It takes away a lot of the reason
to even use some of these other apps,
but to my mind, if something
genuinely is encrypted end to end,
[00:13:11]
in a world where privacy is disappearing
and it becomes very hard to have
truly private conversations,
and there are times when you have
to rise up against a government or against
an invading force, like I just
mentioned with some of these
other groups that I'm part of,
if they're truly end to end encrypted, the
company can't get into that communication.
[00:13:28]
I don't think they should be asked to.
There has to be other ways
to investigate those crimes.
- That's my thought.
- Yeah, so devil's in the details.
So this is a trial
I'll probably follow very closely
to see what exact evidence they have.
Is it a crime they know is happening?
Is it imminent?
[00:13:45]
Can it be stopped?
And you know, how are they
holding the founder of the company
accountable for that?
Is there someone else to hold accountable?
So I just think it's super interesting,
but I'm worried on both sides.
I'm a little worried that
if they found a way to create a safe
[00:14:06]
haven for significant criminal activity
that we're all worried about,
that's a problem.
And on the other hand,
if they basically block us
from having any private conversations,
I'm massively worried about that.
And then saying, oh yeah, if you
create a safe space for anyone online
[00:14:24]
to have any private conversation,
you're going to get arrested.
That's a giant problem,
that's Big Brother, etc.
So this case could be deeply problematic
or it could be logical
depending on the details.
And so I think a lot of the world
is going to be watching this case.
[00:14:43]
This is not a little thing.
This is a giant, giant issue
for the whole planet.
Yeah.
And just a couple of things
for people to understand at home,
like this concept of freedom of speech.
That's a very American thing.
That's our right.
That's why like, you know,
a lot of times companies can go
[00:14:58]
to court based on that, right?
Like it's literally codified
here in our laws.
That's not the case
in a place like France or England;
there's no First Amendment over there.
And so, this is just a different animal
altogether over there because like,
[00:15:16]
this Telegram guy can say, I believe
in the principles of freedom of speech.
And these other governments
can be like, we don't care.
You don't get to operate here
under that principle.
And the second thing that I will say,
because I think all of us here
on the show today are pretty staunch
free speech advocates.
[00:15:33]
I think when the framers
invented this amendment,
like, you know, they were thinking
about some wacko on the corner, you know,
howling at the moon, right?
Like, of course he should
have the freedom to do that.
I don't think they could conceive of
an internet where, theoretically,
[00:15:52]
from your phone,
you can reach the entire planet, you know,
it's just a different way of viewing
these principles that we hold so dear.
Yeah.
One super last thing, which you both
kind of alluded to, that
[00:16:07]
I want to make clear: remember, there's
different laws in different countries.
So if a country goes, well, in this country,
it's illegal to criticize
the leader of the country.
And you allowed your app to have
private conversations where they were
criticizing the leader of this country.
That's illegal. You're under arrest.
[00:16:24]
That's a huge problem. Right.
And how do we solve that problem?
Of course,
one way is don't go to that country.
- Right.
- Yeah.
Just stay out of Saudi Arabia. Right.
- That's for sure.
- And, Jake, the implications here are exactly
that, because Elon Musk obviously
[00:16:40]
does not want this to be happening
and does not want to be held accountable.
He immediately tweeted out on X, free Pavel,
because he doesn't want this happening.
But by the same token, it also doesn't
make a ton of sense to hold some
of these platforms accountable. Like,
why is it different than a telephone
[00:16:57]
once you have conference calling?
If somebody plans a drug deal
on a four-way phone call,
is AT&T responsible?
You should stop the crime,
not the ability to talk to people.
Thanks for watching The Young Turks,
really appreciate it.
Another way to show support
is through YouTube memberships.
[00:17:13]
You'll get to interact with us more.
There's live chat emojis, badges.
You've got emojis of me,
Ana, John, JR. So those are super fun.
But you also get playback
of our exclusive member only shows
and specials right after they air.
[00:17:29]
So all of that, all you got
to do is click that join button
right underneath the video.
Thank you.