In this episode of the podcast, we welcome back Sascha Meinrath for a timely and urgent conversation about surveillance, digital infrastructure, and the growing tension between connectivity and civil liberties.
Sascha explores how modern broadband networks are increasingly intertwined with systems of data extraction, monitoring, and behavioral tracking.
The conversation digs into how surveillance operates at multiple levels: through corporate data collection, government monitoring, and emerging technologies that blur the line between public safety and privacy intrusion.
This episode challenges us to think beyond speed and access, and to grapple with what kind of digital future we are actually building.
If communities are investing in digital infrastructure, Sascha argues, they must also ask: Who controls the data? Who benefits? And who is being watched?
This show is 44 minutes long and can be played on this page or via Apple Podcasts or the tool of your choice using this feed.
You can also check out the video version via YouTube.
Transcript below.
We want your feedback and suggestions for the show. Please e-mail us or leave a comment below.
Listen to other episodes or view all episodes in our index. See other podcasts from the Institute for Local Self-Reliance.
Thanks to Arne Huseby for the music. The song is Warm Duck Shuffle and is licensed under a Creative Commons Attribution (3.0) license.
Christopher Mitchell (00:12)
Welcome to another episode of the Community Broadband Bits Podcast. Perhaps one of the last. Maybe we'll have a new name sometime. I keep teasing that and we'll be talking more about that in the future. But in the meantime, we've got information to share and I'm back. I'm Chris Mitchell. I'm back with Sascha Meinrath, who is the Palmer Chair of Telecommunications at Penn State University, as well as the founder of X-Lab. Welcome back to the show, Sascha.
Sascha Meinrath | X-Lab (00:38)
Great to be here. I can't wait to see what we're slinging today. Should be fun.
Christopher Mitchell (00:42)
Yes.
Well, that, and I took a red-eye home from hanging out with our friend Matt Rantanen at Wiring the Rez, and, you know, I didn't sleep a whole lot. So we'll see where I go with things. And I don't think I noted it: I'm Chris Mitchell at the Institute for Local Self-Reliance in St. Paul, Minnesota, where I direct the Community Broadband Networks Initiative. So Sascha, we're gonna talk a little bit about the stuff that you work on. The stuff that I usually ask you to talk about is not where you focus a lot of your time.
Sascha Meinrath | X-Lab (00:51)
So it will be punchy. I love it.
Christopher Mitchell (01:10)
You spend a lot of time on surveillance and privacy. And this is something that I was always looking for an excuse to talk more with you about. We're going to talk about that for a fair amount of time today, and then we might wander into a few other topics if there's time available. So, you know, I live in Minnesota. I feel like this is a bigger part of my identity now, and it was always a big part of my identity. And looking around, it is fascinating to me
the role that Signal plays in telecommunications. And I wanna talk about that, but we're gonna get there first, I think, by talking a little bit more about telecommunications and surveillance and that sort of thing, which you know more about than I do. But I just think it's really important for people to understand that when we're talking about telecommunications, and particularly in this time, it's really important to understand where information is copied, and the power of that. And...
Sascha Meinrath | X-Lab (01:39)
sure.
Christopher Mitchell (02:01)
An example people should be aware of is the databases that East Germany put together on citizens, where they had this massive apparatus and collected information. And I feel like most people are like, I don't have anything to hide. And then it turns out that they could be blackmailed, or any number of things, at any point, because of anything. So why is it important to care about privacy and surveillance in an open society? Like the one that we're trying to preserve.
Sascha Meinrath | X-Lab (02:07)
Yeah.
I mean, it's not just important. It is constitutionally guaranteed, right? Whether it's the First Amendment, freedom of speech; the Fourth Amendment, right, that people need to get a warrant to search your stuff; the Fifth Amendment, the right not to self-incriminate; the Fourteenth Amendment, the right to due process. There's a whole variety of constitutional protections predicated upon our rights to
privacy, to self-determination, to the integrity of our communications... That's right.
Christopher Mitchell (02:55)
Rights, rights that were not,
sorry, rights that were not granted by the government. These are rights that are supposed to be preserved by the government, that we get because we are human. We are endowed by our creator, according to the words of the Constitution or the Declaration of Independence, I forget which.
Sascha Meinrath | X-Lab (02:58)
Correct.
Correct. Correct. That's right. And
they are afforded everyone. Simply by dint of being in the United States, everyone is afforded these rights.
Christopher Mitchell (03:13)
Mm-hmm.
Papers or no papers, you get those rights.
Sascha Meinrath | X-Lab (03:19)
Correct. And when it comes to the digital sphere, we have a 100-plus-year history of taking over and, in essence,
removing the integrity of our communications, surveilling our communications. Actually, I should say it goes way back into the analog space, not just digital. But in essence, in the telecommunications space, the federal government has consistently abrogated
our rights to privacy and has continually attempted to increase its surveillance power over the populace in ways that are both illegal and unconstitutional. And in this regard, when people are shocked, simply shocked, to learn about the latest iteration, I say, well, let me tell you a story about the last hundred years.
Christopher Mitchell (04:06)
But
let me, I feel like context is important, right? I agree with you. That's one way of looking at it. I think another way of looking at it is we've done pretty good compared to the competition. You know, China's easy to pick on, right? I mean, they're trying to exterminate Muslims. So, not great on recognizing human rights on a lot of these things, the Chinese government, not the Chinese people. But
Sascha Meinrath | X-Lab (04:10)
Yeah.
here.
Christopher Mitchell (04:32)
but even in Europe, I think at times the United States government has done better. So over the past hundred years, we've done better and we've done worse. And so I don't want to pretend like the government is this thing, right, that is always just trying to grab our rights away. I feel like who is there matters. And I think I'm not in a position to argue this deeply with you, but I'm interested. I do feel like it's arguable that the time we're in
is not the worst time in the history of the United States. The Palmer Raids were probably worse, but we're racing right up to that point right now. And so, like, this is a bad time. The government has done better and it has done worse. And I guess one of the things I would certainly concede to you is there's probably not a time when the United States government has hit the point of what we want to be in terms of respecting privacy and not engaging in surveillance.
Sascha Meinrath | X-Lab (05:00)
Ha
Well, and again, I would certainly agree, right? This is not as bad a time as has been normative throughout the United States' history, whether it's the genocide of Native Americans, slavery, or Jim Crow during the civil rights era. Like, we have definitely been much worse to our populace. And...
Christopher Mitchell (05:35)
Jim Crow, yeah.
Sascha Meinrath | X-Lab (05:43)
what one needs to appreciate is the unprecedented ability to infiltrate our lives. That in essence, the tools that are available today are unprecedented. Their reach, their scope, their integration in everything that we do is new. And so whereas the moment, the snapshot in time that is today is not as bleak as it has been, the danger
that we face today is potentially far grander. In essence, the weaponization of surveillance, the abrogation of privacy has never been more grandly dangerous than it is today.
Christopher Mitchell (06:16)
Mm-hmm.
And this is,
this is not solely a critique of the Trump administration. Like, AT&T is called a tier one telecommunications provider because basically most of the traffic of the world flows over AT&T lines, or in the mid-2000s it did, right? I mean, all of South America basically flowed over lines in the United States, a lot of them controlled by AT&T.
Sascha Meinrath | X-Lab (06:27)
Correct. Not at all.
Christopher Mitchell (06:46)
And in San Francisco, AT&T was like, hey, NSA, would you like a copy of everything on our network? And they did that. And that was illegal. And President Obama, when he was a senator, was like, cool, let's give them immunity for that illegal act that they did, which was grossly violating our rights.
Sascha Meinrath | X-Lab (06:53)
Correct.
Yeah.
Correct. And we've seen this time and again, after the Snowden files of 2013 showed that pretty much every major platform was turning over private information illegally to law enforcement and to the intelligence community, we passed the USA Freedom Act of 2015, which granted, I kid you not, retroactive immunity for illegal surveillance of American citizens.
Christopher Mitchell (07:17)
Mm-hmm.
I think Google claimed, and I believed them at the time, I haven't gone back to check, that they were not willingly doing it, that they had a few weaknesses that they had not been aware of that were being exploited. So I see by your look, you're not as convinced as I was of that.
Sascha Meinrath | X-Lab (07:43)
Ha
No, you can literally look at the PowerPoint presentations showing Prism uptake, and it shows the dates by which each of these companies came on board to voluntarily provide information to the NSA. Yep, yep, yep. Now, the higher-ups in Google may not have been fully briefed on everything that Google was doing,
Christopher Mitchell (07:58)
Okay, that I didn't remember.
Sascha Meinrath | X-Lab (08:10)
but Google was a willing participant. In fact, you can see at the bottom of some of the presentation materials that the Snowden files released, where they are proudly saying that they're running all of these surveillance systems on, like, free open source software from private corporations. And you have to imagine:
so the NSA comes to you and it's like, hey, we need, like, the largest data storage ever with the most processing power ever. Can we do that with you? It's not like the companies were like, yeah, that sounds totally not at all involving surveillance. I mean, come on. Come on.
Christopher Mitchell (08:50)
Okay. So
we've gotten to the point of getting a sense that in this century, we have unprecedented ability to sweep up all communications. And even if you cannot crack them, you can store them until the future, when you will be able to crack them, we believe.
Sascha Meinrath | X-Lab (09:06)
Yeah, Q-Day. We could talk about that. That's a very scary thing amongst technologists. So most of our encryption today is predicated upon these mathematical models for securing our data
based upon prime numbers. I don't wanna get too far down into those weeds, but long story short, traditional digital computers have a very difficult time. It takes them years to many, many millions of millennia to crack these codes.
Quantum computers, when they get powerful enough, can do it in seconds. And we're already seeing that with encryption that's, like, a few tens of bits, right? They can do that. But like...
Christopher Mitchell (09:39)
They do it on their breaks.
I'm less positive
about that. We don't have to get into it, but I'll just say that some people have critiqued that the researchers are cooking the books on that. So maybe, maybe not.
Sascha Meinrath | X-Lab (09:51)
But long story short, yeah.
Yeah, but there
is this phenomenon, it's called Q-Day. And the idea is, at some point, all of these missives, all of this information that has been encrypted with non-quantum-proof encryption will retroactively become available. So one of the reasons why the NSA, but also China, Russia, and Israel, and who knows who else, is just hoovering up copies of encrypted data
is in anticipation of a moment in time where all those things that you sent, that you thought were encrypted and therefore couldn't be read, become readable. And you can imagine, there's a lot of information out there that people think is secure. "I did use encryption." It's like, did you use quantum-proof encryption? Because if not, guess what? Within our lifetimes, it's very likely.
Like, I don't know the exact year, but if I'm looking forward over the next decades, it's very likely that baseline non-quantum-proof encryption, the standards of today, will no longer be secure.
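[A quick editorial aside for readers who want the weeds Sascha is skipping: the prime-number dependence he describes can be shown with a toy RSA example. This is purely illustrative Python with laughably small numbers; real keys use 2048-bit-plus moduli, and the factoring step below is exactly what a sufficiently large quantum computer running Shor's algorithm would make fast.]

```python
from math import gcd

def toy_rsa_keypair(p, q, e=17):
    """Build a toy RSA keypair from two (tiny) primes."""
    n = p * q
    phi = (p - 1) * (q - 1)
    assert gcd(e, phi) == 1        # e must be invertible mod phi
    d = pow(e, -1, phi)            # private exponent
    return (n, e), d

def crack_by_factoring(n, e):
    """Trial division: instant for a toy modulus, infeasible classically
    for a real one, fast on a powerful quantum computer."""
    for p in range(2, int(n ** 0.5) + 1):
        if n % p == 0:
            q = n // p
            return pow(e, -1, (p - 1) * (q - 1))

(n, e), d = toy_rsa_keypair(61, 53)
message = 42
cipher = pow(message, e, n)            # encrypt with the public key
assert pow(cipher, d, n) == message    # decrypt with the private key
assert crack_by_factoring(n, e) == d   # factoring n recovers the key
```

The whole scheme stands or falls on how hard it is to turn `n` back into `p` and `q`, which is why pre-quantum ciphertext recorded today could become readable on Q-Day.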
Christopher Mitchell (10:59)
Yes. Okay. So governments have unprecedented capabilities, and it's growing at a rate that is disturbing. And this is sort of a natural byproduct of the technology, I feel like, which is that digital communication can be copied, and seems inevitably to be copied; the whole "information wants to be free" claim, right?
Sascha Meinrath | X-Lab (11:06)
Okay.
Yeah, yeah, yeah. And there are mechanisms to ameliorate this problem, things like multipathing, where you break up a message, you encrypt it, and then you send it via multiple different avenues, so that you would have to surveil all those different avenues to put it back together in a form that could be decrypted. But, you know, we don't have a whole lot of protocols that do that. And again, this cat and mouse game, for the average
Christopher Mitchell (11:37)
Mm-hmm.
Sascha Meinrath | X-Lab (11:45)
person is well beyond their technical ability. So you really have this asymmetry, where the ability to surveil and eventually abrogate your privacy, that part is unprecedented. And whereas the administrations of the last
several decades maybe haven't been as terrible as, like, the Hoover FBI era or the Palmer Raid era, the trajectory we're on is pretty bleak. And I'm not talking about just Trump. Trump is continuing a trajectory that has been in play for the last quarter century or more, dating actually back to the DEA and the war on drugs era.
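[The multipathing idea Sascha mentioned a moment ago can be sketched in a few lines. This is an illustrative toy, simple XOR secret sharing in Python, not any deployed protocol, and a real system would also encrypt each share; the point is just that an eavesdropper who misses even one path learns nothing.]

```python
import secrets

def split_message(data: bytes, paths: int) -> list[bytes]:
    """Split data into `paths` random shares; all of them XOR back to data."""
    shares = [secrets.token_bytes(len(data)) for _ in range(paths - 1)]
    last = data
    for s in shares:
        last = bytes(a ^ b for a, b in zip(last, s))
    return shares + [last]

def recombine(shares: list[bytes]) -> bytes:
    """XOR every share together to recover the original message."""
    out = bytes(len(shares[0]))
    for s in shares:
        out = bytes(a ^ b for a, b in zip(out, s))
    return out

msg = b"meet at noon"
shares = split_message(msg, 3)       # send each over a different route
assert recombine(shares) == msg      # all three together reconstruct it
assert recombine(shares[:2]) != msg  # any strict subset is just noise
```

The cost, as Sascha notes, is that a surveillant must now capture every avenue at once, while a legitimate recipient only needs to collect the shares as they arrive.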
Christopher Mitchell (12:12)
Well, in-
Yeah. So I don't want to get into it; I want to get more into the future and what's happening now. But if people are interested, look into the Clipper chip. We almost didn't have encryption, arguably. There were efforts from the federal government to prevent Sascha and Chris from having the ability to encrypt anything. And there are still arguments about this between law enforcement and the spies and whatnot, about how much we want the government to make it illegal to do this. There are fights constantly about whether Apple should have to be able to open up any tech.
Sascha Meinrath | X-Lab (12:34)
You
Correct.
Christopher Mitchell (12:52)
And Apple, I think, has set a decent standard of trying to develop technology that it cannot open. You may or may not agree. That's my read of the sort of standard non-specialist security understanding of what's going on.
Sascha Meinrath | X-Lab (12:59)
That's right.
Correct. And the Clipper chip was a battle in the 1990s around putting a physical chip into our devices that had a back door, a key. In essence, it was available only to the government, officially, to open up those communications. And it took... yeah, yeah, I think it took a guy named Matt Blaze less than a year
Christopher Mitchell (13:21)
only in special circumstances though.
Sascha Meinrath | X-Lab (13:27)
to figure out how to crack it. And had this thing gone into place, you would have had this hole in all communications that was predicated upon the hardware, so you would never be able to undo it. But fast forward to today, and we're seeing more and more of these hardware-based systems being baked into our laptops in ways that have not been fully vetted to ensure that there is not another back door being secretly
Christopher Mitchell (13:35)
Thank you.
Sascha Meinrath | X-Lab (13:55)
put into our computers. Now here's where it gets really interesting. The guy who was the single most important backer of the clipper chip, then Senator Joe Biden.
Christopher Mitchell (14:04)
I did not remember that. Now that you say that, I think I did know it at the time. Yeah, interesting. So one of the things that I want to make sure we touch on is the amount of information that is out there. While individual pieces are boring: what does Chris buy at the store? Who does Chris talk to? When does he talk to them?
Sascha Meinrath | X-Lab (14:09)
Yeah.
Christopher Mitchell (14:22)
If you have access to all of that, not only can we now collect it, but we have the machine learning ability to assemble profiles, to a point at which, if you had access to all my data, you'd probably know more about me than I do, because my brain isn't that good at it, right? Like, there's all kinds of stuff we forget about ourselves, and this and that. And so our behaviors can be more predicted. So one of the things that Congress has done is to try to make sure that, for good reasons,
Sascha Meinrath | X-Lab (14:36)
Yeah. Yeah.
Yeah, that's the theory.
Christopher Mitchell (14:48)
you could not use some databases of government information by other agencies, right? So there's some data the government collects that they don't want others to have access to, because people wouldn't give that information to the IRS, perhaps, or to the Census, that we need in order to keep our economy functioning. People would be much less willing to share accurate information if they felt like it was going to be used, not by police departments, but by the policing apparatus of the federal government and whatnot.
Sascha Meinrath | X-Lab (15:17)
correct yep
Christopher Mitchell (15:17)
That all seems to have collapsed, not because
the law was changed, but because people just decided that technically you could do this and probably the law wouldn't stop you, because you can combine databases. So, Sascha, you've tracked this closely. What do we know about what Elon Musk sort of started, and what is happening at the federal government in terms of combining databases to potentially be able to build more profiles and things like that?
Sascha Meinrath | X-Lab (15:38)
Yeah, so it's important to understand that Elon Musk was able to accelerate this breakdown, but it very much predates him. You can look back to the last week in office for Obama, right? So we already know, like, Trump's taking over, et cetera. Obama signed an executive order that opened up data sharing amongst twenty-odd different agencies.
That is what actually laid the groundwork for DOGE to do what it did, right? To then say, okay, get us the IRS stuff, get us all these other things, we're gonna mush this all together in one big database. It's Obama that made that possible.
Christopher Mitchell (16:12)
And often
this is justified out of a fear of a repeat of 9/11, right? And I'm not gonna go into conspiracy spaces, but it is fairly well established that if the federal government did a better job of connecting the dots, the attack could have been prevented. And the argument is that perhaps, and I don't think this is accurate, the reason those dots weren't connected was because of a...
Sascha Meinrath | X-Lab (16:16)
Yes.
Christopher Mitchell (16:35)
of well-intentioned, deliberate fire breaks between agencies and things like that.
Sascha Meinrath | X-Lab (16:40)
Yeah, well, the actual reason why this wasn't caught is because there was a surveillance program built by a guy named Bill Binney at the NSA, who turned whistleblower. That's right. Like, he built a system that, when they spun it back up after 9/11, it's not that it magically catches everyone, but it did flag several of the hijackers from 9/11 as people of interest.
Christopher Mitchell (16:50)
Who we met. Yeah, we've talked to them about it.
using
information that was only available prior to that. And so, I guess I'm gonna say...
Sascha Meinrath | X-Lab (17:10)
That's right. So
the government shut down ThinThread, his system, in favor of a system called Trailblazer, which didn't work. Right.
Christopher Mitchell (17:20)
So
incompetence is often the reason that we have this, but then there's a boogeyman that is created, which is that we need the ability of all government agencies to be able to share data in order to prevent terrorism and to save lives and to make sure that buses of orphans don't fly off the road.
Sascha Meinrath | X-Lab (17:24)
Yes!
So when one talks to the actual analysts, they are completely inundated with false positives, right? And when you get false positives, like, everyone's a terrorist, so you have to investigate all the terrorists, then you can't really focus. And this is one of the great limits of signals intelligence. So if I go onto a podcast and say, hey, I'm gonna go kill the president, the context matters. The fact that this is an example of the kind of false flag that leads to a false positive, like,
AI doesn't know this. It just says, holy crap, like we have a legitimate fear that like Sascha's seriously going to go and kill the president.
Christopher Mitchell (18:11)
You
can't treat all assertions as being equal.
Sascha Meinrath | X-Lab (18:13)
Correct.
Correct. And this means that you are going to get false positives, that an over-reliance on signals intelligence, as it's called, and under-reliance on human intelligence, that anyone watching this actual podcast would understand. This is an example of the failure of AI to correctly mark something as an actual threat versus an illustrative example of how things go wrong. That matters greatly.
Christopher Mitchell (18:32)
Mm-hmm.
Sascha Meinrath | X-Lab (18:42)
And again, the actual analysts are saying, hey, look, you can't just rely on signals intelligence. It's this blend of human judgment and signals intelligence that leads to the diminution of what's called type one, type two error. False positives. You're a terrorist when you're not. False negatives. You're not a terrorist. I fail to find the terrorist, but you in fact are. These are fundamental problems with statistics and there's no magical way out of them.
Christopher Mitchell (19:08)
Right. We have hundreds, if not thousands, of years of history of dealing with this problem.
Sascha Meinrath | X-Lab (19:12)
Correct.
Correct. So as we chase, as the intelligence community does, this white whale of like, if we could just get more data, we'll find the terrorists. Well, actually, when you get more data, you get more false positives.
Christopher Mitchell (19:26)
Right, it's trying to find the needle in the haystack by piling more and more hay into the barn.
Sascha Meinrath | X-Lab (19:32)
Correct.
Right. And to bring this to the absurd conclusion, there is a way to stop all the terrorists, to lock up all the terrorists in the United States.
Christopher Mitchell (19:40)
It requires 360 million jail cells, 350 million jail cells.
Sascha Meinrath | X-Lab (19:43)
Exactly. You lock up
everybody, you get the terrorists. Right? Now, are your false positives high under that scenario? Yes, yes they are. But you will have a 100% success rate in jailing all the terrorists.
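[The statistics behind this exchange is the classic base-rate problem, and the arithmetic is worth seeing. A back-of-the-envelope Python sketch with hypothetical numbers, none of which come from the episode: a 99%-sensitive system with a 1% false-positive rate, scanning 330 million people of whom 100 are genuine threats.]

```python
def positive_predictive_value(sensitivity, false_positive_rate, base_rate):
    """P(actual threat | flagged), via Bayes' rule."""
    true_pos = sensitivity * base_rate
    false_pos = false_positive_rate * (1 - base_rate)
    return true_pos / (true_pos + false_pos)

base_rate = 100 / 330_000_000          # 100 real threats, 330M people
ppv = positive_predictive_value(0.99, 0.01, base_rate)
print(f"{ppv:.6%} of flags are real")  # roughly 0.003%
```

Even with a system that accurate, about 3.3 million innocent people get flagged for every 99 real hits, roughly 33,000 false leads per true one, which is exactly the analyst-drowning dynamic Sascha describes. The lock-up-everybody extreme is the same trade-off taken to its limit: 100% recall with the worst possible precision.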
Christopher Mitchell (19:52)
you
Okay, let's skip over some of this stuff. So basically, the government has often found that by saying that you'll prevent terrorism, they are able to justify these things. And this happens under both Democratic and Republican presidents, although I will continue to say we don't see it across all presidents. Like, there are people in law enforcement and there are people in Congress that do take
Sascha Meinrath | X-Lab (20:11)
Correct.
I'm
Christopher Mitchell (20:24)
it seriously, to try to limit government power. And so there's a push and pull throughout history on this.
Sascha Meinrath | X-Lab (20:26)
Correct. Correct.
Okay.
And it must be understood that this is coming from both sides of the aisle. This is an area of vociferous agreement amongst the libertarian right and the progressive left that we actually have to rein in illegal and unconstitutional surveillance. This will come to the fore over the next two months, because there is a section of the bill that was passed after 9/11 to allow for foreign intelligence surveillance, which has been applied to domestic
intelligence and surveillance, that's coming up for renewal. And in the last renewal battle, then-not-President Trump said to kill this thing. It's a terrible bill, I hate it. And in Trumpian style, he was just like, kill it with fire. Well, now he gets to decide, now that he's in power, whether he still wishes to kill it.
And that is coming up between now and April. And what I anticipate, what I know will happen, is you will have progressives and libertarians fighting the leadership of both parties to rein in warrantless government surveillance domestically.
Christopher Mitchell (21:36)
Chuck Schumer is a big civil libertarian, right? He's gonna be on our side on this?
Sascha Meinrath | X-Lab (21:40)
Now you're just like poking the bear here. No, of course not. So Schumer consistently whips in favor of warrantless government surveillance.
Christopher Mitchell (21:51)
So this gets us to this question, and you can bring me back to the key stepping stones that we want to hit, but I wanted to get down to this issue of what I found interesting, among many different things: what's happening as the federal government has targeted Minneapolis and St. Paul, and actually a lot of Minnesota. I get annoyed at people just forgetting that it's not only Minneapolis. People there have done an amazing job and they deserve tremendous credit, but a lot of us outside of Minneapolis have also been
Sascha Meinrath | X-Lab (22:06)
Yeah.
Yes.
Christopher Mitchell (22:18)
active and been targeted, and resisting. But one of the things that happened early on is that people were creating accounts on Instagram to track ICE and to alert people when ICE was in certain neighborhoods and things like that. And those kept disappearing, because Mark Zuckerberg's folks were like, nope, this violates the terms of service, because the federal government asked us to decide that it violates the terms of service, right? ICEBlock.
Sascha Meinrath | X-Lab (22:20)
percent.
Thank
Thank you
Christopher Mitchell (22:44)
And ICEBlock, an app on Apple, I don't think violates any law; I don't think any judge, if this went to a decision about whether it violated any laws, would find that it really had. But Apple sure didn't wanna upset the Trump administration. And so it's blocked from the App Store, and so it is blocked from people. And so the sole app that is used by people, really, is Signal, which is this technology that you know a lot about.
Sascha Meinrath | X-Lab (22:58)
Correct.
Christopher Mitchell (23:12)
Signal is beloved by a lot of people around the world who want to speak without government surveillance. Tell us where it came from and just what the key points about it are.
Sascha Meinrath | X-Lab (23:22)
Yeah. So, Signal. If you send text messages, you should know that a lot of text messages are inherently insecure. Inherently. The protocols are built so that they are not secure.
Christopher Mitchell (23:34)
If you use the default
SMS app on your phone, and then also some other apps too. And we might talk about WhatsApp, but WhatsApp has different issues. But you're saying that the SMS...
Sascha Meinrath | X-Lab (23:38)
Correct.
At this point, I would say
WhatsApp is insecure, even though the core technology is the same as Signal. They have layered in additional tracking of their users on top of that, so they can take a secure communication and make it not as secure.
Christopher Mitchell (23:59)
Right. And then also I think I
misspoke because SMS is probably long gone and now it's RCS, right? So, yeah.
Sascha Meinrath | X-Lab (24:04)
Yeah.
But long story short.
there is a functional equivalent, a freely available, end-to-end secure application called Signal. And I was very much involved in the transition from just super geeks using what was called WhisperCore at the time, into Signal being a widely utilized technology. And here's the 30-second history. When I was running the Open Technology Institute,
with Libby Liu, who was then the president of Radio Free Asia, we created the Open Technology Fund. And the Open Technology Fund, which is today the largest funder of open technology for security in the country: in the 2012 portfolio, our first portfolio, we decided to fund taking WhisperCore
and turning it into something non-geeks could use. And that was the goal.
Christopher Mitchell (25:00)
And just out of curiosity,
do you want to share who helped fund that significantly?
Sascha Meinrath | X-Lab (25:05)
The federal government.
Christopher Mitchell (25:06)
This is what I love!
Sascha Meinrath | X-Lab (25:08)
That's right. But it was done. You have to imagine, back in 2010 and 2012, this is the Arab Spring and post-Arab Spring era, where the US government was like, hey, you know what? If you want to bolster democracy, you've got to protect the privacy of a citizenry. You've got to ensure that they have the freedom to assemble and freedom of speech and freedom to communicate, and that they don't self-incriminate. Like, all these rights that are baked into our Constitution that don't exist under authoritarian dictatorships around the globe. The US
government rightfully said, we need to ensure that people have the technology to ensure that these fundamental human and civil rights cannot be abrogated by an authoritarian regime.
Christopher Mitchell (25:51)
And this is a time when women were organizing, well, women and men, but people were organizing for women to have basic human rights in multiple countries. And the technology that they were using allowed government and non-government forces to find them and kill them. And so this is a serious thing.
Sascha Meinrath | X-Lab (25:59)
Correct.
That's right.
So the use cases we were facing were quite fraught, the need immediate, the ramifications for failure literal life and death.
And Signal was the breakout product. In essence, it was a mechanism for having end-to-end encryption, so that you could send texts off of the phone that you have in your pocket, using an app that you can download from pretty much anywhere, and do that securely. And that set this
trajectory for a series of different services and apps. So now Signal does the text messaging, but it also does things like video conferencing. You can have one-to-many communications. You can have all sorts of different things sitting on top of that app.
Christopher Mitchell (26:53)
You are limited,
because if you do it right, there's an overhead. So for the texting, you have a limit of a thousand people in a text room. And for the live video and audio, you have a limit of 50, because devices can only encrypt to so many unique keys within a given time period without turning your device into a flaming hot mess.
Sascha Meinrath | X-Lab (26:57)
Ha
Correct.
Correct, which leads to the need today for the federation of different services and apps that can interlink these kinds of systems, for the building of a secure ecosystem of products and services and facilities to enable this next generation of scalable end-to-end encrypted systems. We know how to do that.
What we lack are the resources to actually accomplish that. And if you look at the success of Signal, it's really due to both an incredibly talented team of tinkerers and hackers building this thing, and to a single patron that was like, hey, this is super important, I will continue to fund Signal to make it possible to not worry about paying rent next month, so you can just focus on building the product.
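The group-size limits Chris mentions come down to this fan-out cost: with end-to-end encryption, a message key has to be wrapped separately for every recipient. Here is a minimal toy sketch of that linear scaling. This is not Signal's actual protocol (Signal uses the Double Ratchet and sender-key schemes), and the XOR-based `wrap_key` below is just a stand-in for a real asymmetric key-wrap operation:

```python
import os
import hashlib

def wrap_key(recipient_key: bytes, message_key: bytes) -> bytes:
    # Stand-in for a real per-recipient key wrap (in practice an
    # asymmetric operation, e.g. sealing to each member's public key).
    pad = hashlib.sha256(recipient_key).digest()
    return bytes(a ^ b for a, b in zip(message_key, pad))

def fan_out(message_key: bytes, recipients: list[bytes]) -> list[bytes]:
    # One wrap per recipient: cost grows linearly with group size,
    # which is why real-time calls cap membership far below text groups.
    return [wrap_key(k, message_key) for k in recipients]

call = [os.urandom(32) for _ in range(50)]  # a 50-party call
wrapped = fan_out(os.urandom(32), call)
print(len(wrapped))  # → 50, one encrypted copy of the key per participant
```

The point is just the shape of the cost: a 1,000-member text group means 1,000 wraps per message, and a 50-party call has to repeat this work continuously, which is why real-time limits are so much lower.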
Christopher Mitchell (28:05)
This is a scale where you require millions of dollars to be able to do this, right? Ten million?
Sascha Meinrath | X-Lab (28:09)
Yeah, Signal's a non-profit, so anyone can download its 990 forms. We're at $5 to $10 million. It's not a lot when you think about what is out there, what people spend, what billionaires have available. This is like the dust mites on the dust in the pockets of billionaires. And that's all that is between, say, having secure communications for everyone and the current state of much of our communications.
Christopher Mitchell (28:17)
Exactly.
Christopher Mitchell (28:40)
Yeah, this does remind me, there's this question, and I'll come back to DOGE a little bit because I do think there's an executive order from Obama that you cited that allows this sharing. There's this question of what can be done. Larry Lessig gave this a lot of thought 20 years ago in a book called Code: basically, the law of the Internet is what is technologically possible to do within code.
Sascha Meinrath | X-Lab (29:00)
Yeah. ⁓
Yep.
Christopher Mitchell (29:08)
And I just think, when you're not constrained by the law, when you feel that you can act in an unaccountable manner, suddenly things are quite different. One of the things that's great about Signal is that it's based on math, not policy, right? And so it's not possible for one decision maker, there's no Mark Zuckerberg of Signal, to be like, you know what? I think I'm just going to open this up so that everyone's conversations are exposed. It doesn't happen.
Sascha Meinrath | X-Lab (29:16)
Go.
Yeah.
Yes, Signal does not have access to, and cannot access, your communications if you don't want it to.
Christopher Mitchell (29:37)
Like the entity that builds it.
Sascha Meinrath | X-Lab (29:38)
Correct. Signal also uses quantum-proof encryption. So when Q-day arrives and it's like, no, secure communications are no longer secure, Signal will still be like, ha, but we are. And that to me is really important, that forward thinking: what do we need to do to secure communications, not just today, but over the long period of time? How do we make it secure against both legal threats and upcoming technological threats to the integrity of these communications? That has been very well thought out. And Signal being an open technology stack allows any crazy mathematician that wants to look and see, is this really secure, to do so. And likewise, if we find anything where it's like, hey, look.
Something new came out, we need to figure out, is this still secure given this new thing? Anyone can look and audit and figure that out. It makes for far more secure systems to have an open technology stack. That's why at the heart of every AI system, every cloud computing platform, every giant digital data processing center, they're all running Linux. And there's a reason for that.
But when you look at the services and applications that we use for our everyday communications, they're all proprietary, closed, and insecure. If people knew how insecure our data, our information, our most private spaces actually were, we'd all be horrified. So you keep that out of the public domain as much as possible in order to promulgate these business models that are inherently
disempowering to all of us.
Christopher Mitchell (31:17)
Well, this is where, I'm not sure how to ask this question, but I think you'll know what I'm getting at, which is that email is something that's not controlled by anyone, and not particularly secure. Even Proton Mail, right? People love Proton Mail, but it only solves a couple of problems. If you've got ten problems and you solve three, you can't say you're secure. But email is also something that's not under the control of a single company.
Sascha Meinrath | X-Lab (31:25)
Yeah. Not at all secure. Terribly insecure.
Yeah, it does.
Yeah, yeah.
Christopher Mitchell (31:46)
Whereas communications on Facebook... I hadn't been on Facebook. I would check in periodically, once a month or so, to tell people that I wasn't answering instant messages on Facebook, and then I'd poke in. And then when this happened, I was like, well, shit, I want people to know what we're experiencing here in Minnesota. So I started writing on Facebook again. And I was talking with Sean earlier today about this, and I'm hopeful that Bluesky can accomplish
Sascha Meinrath | X-Lab (31:54)
Yeah, yeah.
Christopher Mitchell (32:11)
one of the things it set out to do, which I think Threads is a little bit open to, and which was the goal of Mastodon: this idea of, what if we're able to build audiences and share things without any one person, any Mark Zuckerberg, able to tell us, nope, because we have business interests and we want to get this merger approved, we're going to crack down on your communications because you're unpopular with the current leader. That's one of the things we have to avoid. Signal gets us there, right? But
Sascha Meinrath | X-Lab (32:16)
Yeah. Yeah.
Right.
Correct.
Christopher Mitchell (32:39)
Do we have anything else in that direction? Is Bluesky heading in that direction?
Sascha Meinrath | X-Lab (32:43)
I mean, BlueSky definitely, that's the goal. You have a federated system, right? There's no center to BlueSky. You just have a whole bunch of different servers that are running these protocols that all interlink and create the platform. I mean...
You and I, we've got a bit of gray in our beards, so we remember when people used to have blogs, independently hosted wherever, and then you would link between the blogs and that would drive you down rabbit holes that were just peer to peer. We're rediscovering why that's important today. And there are a number of federated systems that are functional equivalents to the products that are centralized and disempowering, that have the algorithmic discrimination and curation of content that you yourself don't have any control over, right? What shows up in your feed, you no longer control. And so on the one hand, you have...
Christopher Mitchell (33:34)
That's what I tell my wife when she sees my algorithm. I'm like, I have no control over this.
Sascha Meinrath | X-Lab (33:37)
That's right.
That's right. Well, you get fed what the algorithm thinks you want next.
Christopher Mitchell (33:42)
Right. No, I was just, yeah, I wanted to make a cheap joke because it's fun.
Sascha Meinrath | X-Lab (33:45)
That's right. That's right. So the fact that you have all the scantily clad bikini-wearing...
Christopher Mitchell (33:50)
Why would you think that's what it was?
Sascha Meinrath | X-Lab (33:52)
That's right. That's right.
That has nothing to do with it. Yeah. But in essence, no, you are being fed this stuff. And I can see when my kids have used my account, because then for days afterwards, the ads and the videos that are fed to me, I'm like, yeah, that's not what I usually see. But that's the algorithm deciding what it is I want. So
untethering from that force-feeding, right? This is the whole battle that's happening right now, the legal battle where earlier this week Zuckerberg was forced to testify under oath about what the hell Facebook is doing. It's all predicated upon this notion that we are being force-fed a certain type of content.
Christopher Mitchell (34:24)
last week as you're listening to this.
Sascha Meinrath | X-Lab (34:38)
predicated on surveilling your activities, knowing what it is you want, and then feeding that back at you.
Christopher Mitchell (34:38)
That's right, that's a court case that people aren't paying attention to, and it's well worth paying attention to.
Sascha Meinrath | X-Lab (34:50)
That's right. It may set a major precedent for the liability of these platforms that are no longer neutral purveyors of content, but are in fact actively forcing content upon their users, some of which is actually quite harmful.
Christopher Mitchell (35:09)
Right, this is quite harmful, whether it's an individual who is encouraged to consider self-harm, or it's a peer nation-state that is trying to use it to undermine culture and create internal divisions, which has been my fear with TikTok, among the other things that this company out of China has a potential interest in doing.
Or if you're just interested in promoting certain news, as we see with the concentration of what's happening with Larry Ellison trying to wrap things up: what algorithms will they be using for the things that they control over time as they buy more and more products?
Sascha Meinrath | X-Lab (35:45)
Correct. And the US government today is like, you know, we're not sure we buy that these platforms are causing harm and could engage in propaganda and the promulgation of misinformation and so on. But we are passing a national ban on TikTok unless they sell, over fear that they will be able to use the power of propaganda on their platform to influence Americans. So you can see this paradox at play
Christopher Mitchell (36:05)
Mm-hmm.
Sascha Meinrath | X-Lab (36:13)
right now, which is to say, very obviously these platforms can engage in massive pushing of individuals and populations based upon these algorithmic prejudices that are not necessarily aligned with the best interests of democratic deliberation and an open and informed citizenry of a democracy writ large. And that's a very big new threat, right? The Stasi, which you had mentioned earlier, but, like...
Throughout history, authoritarian regimes have attempted to harness these powers of propaganda. We've just built tools to make that infinitely easier to do. And whereas we haven't entered a Goebbels-esque media sphere yet, for sure we're not there yet, it is of grave concern to those of us that study history to say, look, the trajectory we're on.
is not a good one. It's not bending towards justice or liberation. It's bending more and more towards authoritarianism, misinformation, and propaganda being weaponized by our government, not necessarily in the best interests of us, the body politic.
Christopher Mitchell (37:13)
Mm-hmm.
Yes.
And that's where, from my perspective at the Institute for Local Self-Reliance, the danger is centralization. It's not just government, it is centralized power, which is also large corporations. And my understanding is limited, and I keep meaning to go back, but haven't, to better understand the reckoning that followed the Berlin Wall coming down. But one of the things that came out of Eastern Europe was this sense that nobody, no government, no corporation, no individual, should be able to amass large databases of information about people. It is just inherently dangerous; there is no good reason to allow it. It is too dangerous for an open society where we respect rights.
Sascha Meinrath | X-Lab (38:08)
Correct. And if you look at the dangers, the damage, the weaponization of earlier iterations of surveillance systems, you look at the COINTELPRO program here in the United States that was targeting advocates of its era, of the civil rights era, you fast forward to its discovery, basically based on whistleblowers, people that...
Christopher Mitchell (38:24)
Right, as well as the KKK, among others. So, yes.
Sascha Meinrath | X-Lab (38:32)
broke into the FBI offices and carted off a whole bunch of data showing that this thing existed. And you fast forward to the Church Committee, which issued a report in 1978 that said, hey, look, we need to ensure that this never, ever happens again.
Christopher Mitchell (38:47)
Or at least not for 40 years.
Sascha Meinrath | X-Lab (38:49)
and you look at the laws that were passed, including the Foreign Intelligence Surveillance Act of 1978, and you look at what's up for renewal, like, in April.
Christopher Mitchell (39:00)
Yeah, and then there's interesting data that's come out regarding the rubber-stamped nature of it.
Sascha Meinrath | X-Lab (39:05)
Correct.
And you look at the Snowden files and what was revealed. Any time you build this kind of power, it has been abused throughout all of history. Patrick Eddington wrote a book, The Triumph of Fear, about what was happening in and around the Palmer Raids of that era. It is a fantastic compendium of knowledge about these early iterations of government surveillance and the weaponization against the public. You see, like, wow, if I look at that
and compare it and contrast it with what's happening in your backyard in Minneapolis, right, and many other communities, frankly. Today, it is the same playbook, it is the same problem, it is the same weaponization, it is the same kinds of surveillance tactics.
just using far more pernicious tools today than were ever available, whether to the Stasi or to the original Bureau of Investigation of 120 years ago.
Christopher Mitchell (40:02)
Well, on that hopeful note, we're drawing to a close. Is there anything that gives you hope, Sascha? Perhaps the uniting of people across various political beliefs that you work with to rein this in?
Sascha Meinrath | X-Lab (40:11)
Yeah, I think
one of the most hopeful elements is this shared solidarity that we have long known exists, but has really come to the fore today. You really do have the lion's share of the population being absolutely not on board with this. It's not a left issue. It's not a right issue. The general populace does not like government surveillance, period. And so the way that the government promulgates these activities is in secret.
And when they come to the fore, or are just so blatant that you cannot ignore them, there is widespread agreement that this needs to stop. Now, the problem has been the leadership of both parties refusing to acknowledge this widespread supermajority of Americans that don't like this. And my hope...
My optimism is born of the fact that people are getting pretty angry, that the more this gets abused, which is inevitable, the more people are pushing back. And so we are seeing this alignment of not just the courts, 4,000-plus times now ruling, no, this is illegal, but also the general populace saying more and more vocally, no, this is unacceptable, leading to the pressure necessary to rein in this government overreach.
Christopher Mitchell (41:27)
Excellent. Well, Sascha, I appreciate you coming on to dig into that. I think there's a lot there. And we're going to be heading more in the direction of not solely talking about telecom for future shows. But this is very related to telecom.
Sascha Meinrath | X-Lab (41:43)
It is important to acknowledge that telecommunications and modern digital communications are this wonderful, potentially liberatory technology that is also a force multiplier for pretty nefarious activities, and that we can't have one without the other. The most important thing we need to understand is, this is why you have oversight and accountability of a tool this powerful.
And of course we've been in a laissez-faire environment where it's like, no, just leave it to the private corporations. It'll all be fine.
Christopher Mitchell (42:15)
Yeah, who turn out not to have the proper controls or language specialists when there is an ethnic cleansing campaign on their platform that they'd rather just ignore while people are being brutally murdered because of the ability to use that platform in that direction. Yeah, there's danger, and we need to figure this out and get a handle on it, both here and elsewhere. So Sascha, I appreciate you building these tools while working with others so thoughtfully on this, and
Christopher Mitchell (42:43)
I know people have a better understanding of why it's important, regardless of which color, red or blue, is in charge at the White House.
Sascha Meinrath | X-Lab (42:51)
Indeed.
Christopher Mitchell (42:52)
Thank you.
