Katherine Druckman (10s): Hey everyone. Welcome back to Reality 2.0. I'm Katherine Druckman, and joining me, as every week, is Doc Searls, and we have Petros Koutoupis back today. Thank you, Petros, for joining us. We have a couple of things on our agenda today. It all kind of centers around content moderation at some level: what is acceptable content on the internet, and what can people do with it? So a couple of the things we're going to talk about are very specifically related to content moderation. One of them is Twitter's new Birdwatch experiment, which involves crowdsourcing their content moderation. Katherine Druckman (56s): So that's kind of interesting. The other thing is something going on with Signal. The Verge had an article a few days ago that got a lot of traction and was kind of interesting, in that there seems to be some internal conflict at Signal about their desire to report abuse, or moderate in some way, which seems a little complicated, but we'll get to that. And then the other thing that's big in the news this past week is the Robinhood/Reddit/GameStop stock market kerfuffle, so we'll try to touch on that a little bit, because that's interesting: to an extent it is also about content moderation, it's about deplatforming, and it's about the sort of democratization of financial markets via internet forums. Katherine Druckman (1m 51s): So with that, what do we want to talk about first? Doc Searls (1m 57s): Well, we could talk about the first one you mentioned. With Twitter, it's always the Twitter thing. Yeah. Katherine Druckman (2m 7s): Twitter Birdwatch: crowdsourcing Twitter's content moderation, opening it up for users to report, or sort of report back to, the Birdwatch system. Doc Searls (2m 16s): So I just wrote this down.
The problem with content moderation in giant silos is the giant silos. And the problem with moderating giant silos is that it can't be done. There's no way for Facebook, for example, to fully moderate all of the speech that happens in there. They're built to accelerate it. They're built to drive engagement, and they're not even doing the moderation themselves — they're doing it with algorithms. To me, the most interesting thing about that is that Tim Cook came down hard on Facebook about it. The story is usually told as Apple versus Facebook, but that's not the real story. Doc Searls (2m 59s): The story is no engagement algorithms versus engagement algorithms. If you're using engagement algorithms, you're going to drive up tribalism. You're going to drive people into gathering around people who think and talk the same way, who complete each other's sentences, say amen to what others say, retweet it, and whatever else. That's a problem in itself. But when it happens inside giant silos, those silos themselves can't fix it as long as they're using their engagement algorithms. So it's a design issue, but it's also — a lot of people would say, what else? Doc Searls (3m 44s): Well, let's bring regulation in here. But I'm not sure. The way to regulate it, maybe, is to say: okay, no more engagement algorithms, because they're bad, they're going to make bad things happen. And I don't think we're going to end up doing that. It's interesting to me that, as big as it is, Facebook isn't big enough to have the regulators captured, as captives. Google, to some degree, may — I don't think Facebook does. Katherine Druckman (4m 8s): I think I know Facebook has a team of humans that review. Doc Searls (4m 11s): Yeah, no, I'm talking about the regulators.
I mean, I don't know — they're not popular in Congress, right? So, right. Yeah. They haven't managed to capture the regulators. They're not smart enough to do that, I guess, or powerful enough. Google, to some degree, may have. I don't know. Petros Koutoupis (4m 27s): I guess I'm trying to visualize what Twitter's Birdwatch is attempting to do, because Facebook has already started to moderate content and fact-check. I mean, how many times do you see somebody's post, and it's grayed out, and it says that this is not true information, or it's not valid, or it's been proven to be false? It'll be interesting to see how Twitter could accomplish the same thing. Petros Koutoupis (5m 8s): I'm sure it's possible, but I guess the question is, is it something we want? Katherine Druckman (5m 15s): Well, so the way they're going about it, apparently, is they're opening it up to a small group of users. I don't know what the selection criteria on that is, and I don't know if they've actually said at all. But instead of just marking a tweet as misleading or spammy, or whatever sort of flag you could put on something to report it, they've added an additional feature that allows you to mark it and then add notes. So you can say: this is why this is misinformation, these are the facts, and these are not facts. Katherine Druckman (5m 58s): You can write a little blurb of your choosing and submit it back to Twitter. And presumably there's something behind the scenes that compiles all of these flagged tweets, plus the information provided, and does something with it. We don't really know. It's new and it's experimental, and who knows.
I just wonder — as somebody who dealt with comment spam for over a decade, and lost the battle, I would argue — we tried crowdsourcing it a bit, and that never really worked for us. But then, we weren't Twitter-sized at Linux Journal, either, though we had a solid audience. I don't know. Katherine Druckman (6m 42s): I question the reliability, and I question the ability to get a sample group that is representative enough to give effective feedback. Representative of what, I don't know. Who is qualified? I mean, who knows? How do you then vet the vetters? So I just wonder how it works out. Doc Searls (7m 7s): I don't know if you can. The Wall Street Journal several years ago had a story about how the worst job in the world is being one of the 20,000 people whose job at Facebook is to look at human depravity and identify it. These people get PTSD. How do they deal with that? And that's just pure human depravity — but if you're busy looking for what might incite a riot, or some other thing like that, who knows. I know of one person at Facebook — or was it Facebook? — who said one of the most difficult things for them is what they can actually identify on an individual basis. Doc Searls (7m 48s): They can scan for people who are likely to commit suicide, but they can't do anything about it, because to jump in all of a sudden and say, hey, you're about to commit suicide — that's not the right thing. I suppose they could run an ad for suicide prevention — call this number — but that's a little creepy too, right? Somebody might kill himself upon seeing that he's being followed. Right? I mean — Katherine Druckman (8m 14s): Yeah. That's an interesting thing. I hadn't thought about that.
And the other thing is, you have to wonder, psychologically, what type of person is going to take the time to mark something and give informed feedback? You have to wonder: who's going to take the time to do that, and what's in it for them? What agenda might they have, or not — or maybe they just have a lot of free time, more than some of us. It reminded me — I posted something on Facebook the other day about being in neighborhood groups. I don't know if everybody is familiar with the tendency, but we all have that Mrs. Kravitz neighbor, right? Anybody old enough to have watched Bewitched, or watched the reruns — we all have them. Katherine Druckman (8m 59s): And I wonder, is it just going to be a bunch of Mrs. Kravitzes reporting on other people's tweets? I don't know. It's food for thought. Doc Searls (9m 11s): It's an interesting thing. I actually don't have a neighbor like that. And suddenly I thought, maybe I'm that neighbor. I don't know. I'm old enough to be that neighbor. Petros Koutoupis (9m 24s): Yeah. Doc Searls (9m 25s): Yeah. Get off my lawn. If I had a lawn. I could get off my dirt. Katherine Druckman (9m 30s): It's not a specific neighbor; it's just a tendency. I don't know if you use things like Nextdoor, or if you have a Facebook group for your whole neighborhood, but it tends to bring out the worst in humanity. I think there's a lot of, frankly, racism. If somebody is reporting something they've seen as highly suspicious, it's usually not at all. Petros Koutoupis (9m 50s): Yeah. We get that. There's a neighborhood app — I'm sorry, neighborhood group — that my wife and I are a part of. And for the most part it's okay.
You have people asking for recommendations of places to eat, or services like electricians to hire, things they maybe want to offload or get rid of — I see a lot of good use out of it. But then you do have your random individuals who come on who are the Karens of the group. Petros Koutoupis (10m 35s): I feel so bad for anybody named Karen. I do apologize. Katherine Druckman (10m 42s): Yeah. That's a problematic term for many reasons, but it is convenient, and people understand what you mean. There's a little bit of sexism in there, but those people do exist. Petros Koutoupis (10m 54s): You get a hint of that every now and then, and it kind of brings the whole group down. Katherine Druckman (11m 3s): Oh, I can tell you a story, since we like to talk about privacy and basic human decency on our podcast. I once saw somebody post photos of people's license plates, and then somebody ran one of the plates — I don't know what sort of contacts they had with somebody who has the capability of doing that — and then they doxxed the person right there on Facebook. I found that shocking, and the fact that nobody else apparently found it shocking was even more shocking. Anyway, that's the kind of stuff you see, and you just wonder: is that what Twitter is going to turn into? Then again, the comments obviously would not be public — I would hope, anyway. Katherine Druckman (11m 43s): I don't know how they process all of that, but — Doc Searls (11m 47s): There's a story — I don't know where it was, one of the pubs — about how in New Jersey, I think it's New Jersey, they've got a thing on all the cop cars, so that constantly, everywhere they go, they're looking at everybody's license plates.
It's a thing that specifically reads license plates on a constant basis and keeps track of them, just constantly keeping track of them. And when, let's say, a crime is committed, they can say: oh, well, we'll go back and see that this license plate has been in all of these places in the past. They get a whole history on it, which to me is kind of a privacy violation. It's creepy. Same kind of thing. Katherine Druckman (12m 28s): Yeah. So anyway, I guess we can conclude this bit with: good luck, Twitter. It's worth trying, I guess. Doc Searls (12m 40s): I mean, it's a similar thing to — well, they're looking to crowdsource something, but Facebook convened this panel of notables that is going to tell Facebook what to do — I forget what it's called. And they're going to say whether or not to let Trump back on, which is going to be interesting. An independent review panel. Yeah, something like that. And I think all of that is problematic. I think the basic problem is still algorithmic nudging at all times, which is completely unaccountable and possibly not even fully understandable by the company that runs it. Katherine Druckman (13m 32s): They created a monster. Yeah. So the other thing that I mentioned at the beginning was what's going on with Signal. Apparently Signal's founder has pushed back against internal efforts to have some sort of mechanism to prevent misuse of the platform — however they define that.
This is something we've talked about before: technical ethics, and the desire by some to limit the harm done by one's software. Or, on the other side of the coin, the notion that it's even possible to only allow ethical uses of something without having any unethical uses. Katherine Druckman (14m 39s): In other words, backdoors — the idea that it's even possible that a backdoor could only be used by good guys. It's the same kind of thing. Doc Searls (14m 51s): Yeah. It's impossible. I think there's something oxymoronic about having a monitoring function on a system that's built for secrecy. You say: here, use this if you want to be secret with each other. And then somebody comes along: no, wait a minute, you can't be fully secret, because we want to monitor what you're doing, because you might be doing something bad. Katherine Druckman (15m 12s): Yeah. I'm confused by the controversy — that's the thing. It was a big story, but I'm frankly confused as to how it even came up, because if it's encrypted, how would they possibly know if there's abuse or misuse happening, by whatever definition? How do they know? The whole point is that they shouldn't know. They're just the pipes through which the water flows. Doc Searls (15m 37s): Well, back in the physical world, with telephony, if you were law enforcement, you had to get a court order to monitor — to get a wiretap, right? There were these safeguards that were basically made possible by the inherent difficulty of monitoring. But in the digital world it's not inherently difficult. It's real easy to monitor stuff. And if it's encrypted, well, then the whole idea is that you're not being monitored.
This is the whole appeal of it. But still, bad uses can be made of it, which is why law enforcement would very much like backdoors. Doc Searls (16m 24s): But then the techies among us, including us, will say: wait, backdoors are a bad idea, because even bad guys could get in the back doors. So you're kind of stuck until we figure out a way to make this work. There's a book a friend of mine wrote — Brett Frischmann — called Shephard's Drone, which I recommend. It's set in the future, and it's a science fiction book, but it's based on a scientific present. There's a society where everybody is jacked in — everybody has a brain implant and an ocular implant. So at one level there are no secrets, but at another level everybody can look through all the world's knowledge and look through other stuff. Doc Searls (17m 12s): But inside of it they do have ways that society can work where there are secrets, where people can communicate selectively with each other at a mental level. And it works. Of course, it works in fiction, but he's imagining a future where this problem is solved, somehow. We haven't solved it here in the digital world we have, which is still really young and really early. It's hard for me to contemplate that it can't be solved, and that bothers me too, because it's quite possible for somebody who's a truly bad actor to do some truly bad stuff without anybody knowing about it. Doc Searls (17m 60s): It reminds me — Katherine Druckman (18m 1s): Oh, go ahead. I'm sorry. I was going to say, to be fair —
There are ways that they could know about misuse, and that's because of one of their group chat features. Let's say, for example, you're with some extreme organization, whatever it is. You could post a link to your Signal group on a public forum that anybody could see. Signal could see: oh, hey, our Signal link is being used over here to recruit for this shady group. And even though Signal doesn't have access to the names, the members, or the content, they can still see that it's potentially being used to organize something they don't approve of. Katherine Druckman (18m 45s): So I can see that that's maybe how the discussion came up. But apparently Signal's founder has said that they're just not going to deal with it until they notice it being a problem. My position has always been: we either all have privacy and security, or nobody does. It's like the ACLU defending a Ku Klux Klan rally, which happened a long time ago. I don't know — would that happen today? I don't know if that would fly anymore in our current — Doc Searls (19m 28s): Climate. You can tell us, but — Katherine Druckman (19m 30s): Yeah, I don't know if they have that position anymore, but it's the same thing. Either everybody has the freedom of speech, in that case, or nobody does. And in this case, either everyone has the freedom of having private spaces on the internet, or nobody does. Doc Searls (19m 46s): There's a writer named Lewis Hyde, who taught at Kenyon for many years — he's been a colleague — and he won a MacArthur fellowship. His book The Gift is really kind of a landmark book.
But a later book that he wrote is called Common as Air, and it's about commons and governing the commons — whatever a commons might be. One of the things he talks about is that all rights are what he calls stinted. No right is absolute. One does not have an absolute right, on one's property, to graze cattle in the front yard if the community doesn't like it. There are just lots of ways in which the rights we have are not total. Doc Searls (20m 29s): And I think we made a binary kind of choice in the internet world: if it's encrypted, then it is really secret, and there is no way in. Either we have complete privacy that way, or we don't. The problem is that if you try to design the stinting of those rights into the system, then lots of us are going to feel insecure. We're going to feel like: no, wait a minute, they can get in here. One of the things I've wondered about often is whether it's possible to have fully anonymous email. You can have burner emails — there are ways around it in some ways — but are they completely secure? Doc Searls (21m 15s): I don't know. I've never tried it, but it's an interesting question. I think we're very early in the evolution of wherever this is going to go. Katherine Druckman (21m 26s): Yeah. So — Petros, did you have something you wanted to add before we move on to the next thing? Petros Koutoupis (21m 36s): It came to mind, if you recall — I think it was either 2015 or 2016 — the San Bernardino shooting. I think it was at a Christmas party, and it was homegrown terrorism, and yes, it was in an office.
It was an unfortunate event. Our federal agents got hold of the shooter's iPhone, and I remember how big of a situation it became for Apple. Even though Apple does not condone, does not support, any acts of terrorism — Petros Koutoupis (22m 28s): I want to make that clear — they refused to open it up or create a backdoor for the purpose of retrieving whatever sensitive data was on this phone, because of the potential risk it was going to create for everybody and anybody, whether they were doing shady things or not. This story is a recurring thing. We can go back even further in time, and I don't think we'll ever get to a resolution. We're just going to continue seeing this: to moderate, to not moderate; to snoop, to not snoop. It's going to keep recurring, Petros Koutoupis (23m 9s): probably indefinitely. Katherine Druckman (23m 13s): Well, but then they got access to the data anyway — through an Israeli security company, I believe — not through Apple. And Apple was right to stand their ground. Among other things, it would have hurt their brand, too. I think they were right, and I think the FBI didn't need them anyway, so Apple didn't have to compromise their ethics. Although — Petros Koutoupis (23m 42s): It would have been different if they actually did find truly incriminating Doc Searls (23m 46s): evidence on the phone that the FBI cracked — or whoever cracked it; I forget who cracked it. Katherine Druckman (23m 52s): Would it, though? I mean, would that matter? Speaking of the ethics of it: until you crack it, you don't know what you're going to find.
So — Doc Searls (24m 0s): Well, let's say a crime is committed by somebody who drops their iPhone, and Apple says, we're not going to open this iPhone up. And then the FBI takes two weeks to open it up, and in the meantime this serial killer has killed five more people. Somebody is going to say that blood is on Apple's hands, because had they put a backdoor in there for the FBI, the FBI could have prevented those killings. Katherine Druckman (24m 28s): Yeah. That will happen, you're right. It would happen. I would think that that position is wrong, but — Doc Searls (24m 37s): I think it is too, but I think we're going to keep having that argument until we find — it's not even a middle ground, it's some other ground, an other ground that we don't have right now — because it's too easy to think of things simplistically. Katherine Druckman (24m 53s): Absolutely. That's exactly the problem. On the surface it does seem kind of obvious that Apple was at fault, if that sort of thing happened. But what I think people don't consider is: by making their devices less secure, how many other people are they harming? You know what I mean? Imagine all of the crazy stalker ex-boyfriends who could now hack into their ex-girlfriends' every location. The larger implications of making a less secure device are, to me — and probably to a lot of people — far more harmful than any consequence of making them secure. Doc Searls (25m 42s): Yeah. And it's an argument we're never going to stop having until we invent that other ground. I was involved in a conversation this morning where somebody said, well, there are these two sectors: there's the public sector and the private sector — as if the whole world is divided that way — and is this a private sector solution or a public sector solution?
No, wait a minute. There are lots of things that are neither. Technology — what's technology? Technology cuts both ways. It's not that simple. There are subtleties to things, right? Katherine Druckman (26m 18s): We're probably over-simplifying too, and making false equivalences. Doc Searls (26m 21s): Well, we can't help doing that. Katherine Druckman (26m 25s): Yeah, exactly. So the other big news, and this is maybe the biggest thing, is this Robinhood/GameStop/Reddit thing, where a Reddit group — I believe they're called WallStreetBets — got together and figured out that a hedge fund had a massive short position in GameStop, and said, hey, let's play with that. I don't know how much of it was activism and how much of it was just the desire to make money with a short squeeze, but it seemed to have worked. Katherine Druckman (27m 6s): I want to say it was at $19 last week? And I don't know where it closed out today. At some point I looked at it and it was like $380 or something. Definitely over $400 at one point. Yes. Doc Searls (27m 19s): So somehow the traders have kept it up there. I advise anybody who owns GameStop: unload it now. It isn't worth that much — unload it now. But the problem — again, it's one of those things where people are calling for, well, we have to fix this. I think it's fixing itself. We see what's going on with it. Wall Street is a casino — a casino run by the gamblers. To some extent it's overseen by the SEC and by regulators, but they're not working for the house. Doc Searls (28m 1s): Everybody is in the house, and we have to let it play out.
I think: let everybody have their lulz, and let nature take its course. That's my position on it. Katherine Druckman (28m 13s): It does tie into this idea of content moderation, though, because the consequence of this was that Robinhood — the sort of accessible, everybody's trading app — took GameStop off; they wouldn't let anybody trade it. They froze trading on it. Discord banned the WallStreetBets group from their platform, but then they reversed their decision, and apparently they're back on Discord. But Discord temporarily denied them the platform. Katherine Druckman (28m 54s): So that's kind of interesting. And then you have the whole actual circuit-breaker thing, where trading actually froze midday — was it today, or was it yesterday? I'm not sure. Anyway, a massive price surge triggered the built-in circuit breaker, and it froze trading for a while. That's not content moderation; that's just interesting. But it's kind of interesting to think about how this will shake out, because it's created a lot of market volatility. A lot of tech stocks have been hit a little bit. Katherine Druckman (29m 34s): And I think there are a few reasons. Some of it is that the big institutional investors are having to cover various short positions. Who knows how many hedge funds had shorts on some or all of these stocks that are suddenly surging? And if you have to cover those positions, well, you might have to sell your Apple stock, or your other big positions.
So I don't know — that's kind of a bummer, and it's creating volatility. I mean, AMC, Bed Bath — Doc Searls (30m 4s): and Beyond, and others. Katherine Druckman (30m 7s): Yeah. And now apparently they're going into a cryptocurrency, because they got bored over the weekend, so now they're going to go trade crypto. But like I was going to say: so what do you do now if you're a hedge fund? You're like, oh, the game is up? Doc Searls (30m 30s): Yeah. Get out — go into value investing and stuff, for a while anyway. I mean, the weird thing about shorting — and I admit I don't know that much about this stuff, and I don't invest — is that I had assumed, okay, you make a bet that the stock is going to drop: I'm betting that the stock is going to go down to five, and then I collect if it ever does go there. But that's not it. No, you borrow the stock. And so, in the real world, all of a sudden the people securing that, the people you depend on, are demanding collateral. Doc Searls (31m 13s): And that's why people had to throw a billion dollars or something in to shore these guys up. The one problem I have with what has happened is actually a moral one — and a moral equivalency one, too. Which is: well, screw those guys, we're going to stick it to them. And on the part of the guys they're sticking it to: that they were being bad in the first place. This is assuming that they were. So we have two wrongs not necessarily making a right, but just creating a whole lot of havoc. And to me, trying to screw somebody because you don't like them, and bankrupt them because you don't like them, is really icky ethics. Doc Searls (31m 56s): It's not good.
It's not a good way to act. It's not a good way to be, if you're seeking vengeance for a perceived slight or a perceived wrong. And let's say the hedge funds really are big, bad, awful, and evil, and all that stuff. Then let's find a way to fix them, or make them impossible — but not just outright screw them for the sport of it. Katherine Druckman (32m 24s): Well, I don't know if it was sport. I mean, I think — Doc Searls (32m 28s): I dunno, I read some of the threads. They seem pretty sporting to me. Katherine Druckman (32m 31s): Some of it was, yes, absolutely. But I think a lot of it is also young people going: hey, wait a second. You mean I can make these big gains too? You mean I can do it by myself? What, seriously? I think it was an opportunity. People thought: these millionaires and billionaires have been doing it — why can't I? I think there was a little bit of that too. Right? Doc Searls (32m 52s): And gaming the market that way is, I suppose, illegal — Katherine Druckman (33m 0s): Gaming the market? I mean, that's what hedge funds do every day. Doc Searls (33m 4s): Colluding to get together. And yeah — Katherine Druckman (33m 6s): Like a position where they've sold more stock than even exists? You know what I mean? That's literally what they were doing. And anybody who knows anything about investing in the stock market knows that a short position is high risk. You just know, because with a regular buy-low-sell-high kind of position — buying based on fundamentals, the old-fashioned way, so to speak — you have a limited loss, right?
Say you buy, I don't know, Google at a hundred, as if. If you buy Google at a hundred and sell it at 150, great, but even in the worst case you can't lose more than that hundred dollars, right? But with a short position your loss is unlimited. You've sold it at a hundred, speculating that it will drop to fifty, and then you'll buy it back, Katherine Druckman (33m 55s): And you'll have just made fifty bucks. Well, if you sell it at a hundred, there's no limit to how high it can go, and therefore there's no limit to your loss. Doc Searls (34m 4s): You say it's risky, but think about it. Everything that you've seen that's being targeted right now, it's all retail. Katherine Druckman (34m 14s): Yeah, no, I totally understand why they were shorting it, and I do understand how these people identified their companies. Sure, they thought they had a safe yet risky position, whatever that means, a safe risk, a calculated risk I suppose is the word. Well, apparently not. I don't know. It'll be interesting to see what happens. I think I read that Biden appointed this guy Gensler as the new head of the SEC, and it'll be interesting to see where he falls, who he decides maybe he wants to investigate. Katherine Druckman (34m 57s): So that'll be interesting. Doc Searls (34m 59s): I know we're not talking about this, but somebody just sent me a note. Shoshana Zuboff, who is a friend and who wrote The Age of Surveillance Capitalism, has a new piece in The New York Times called "The Coup We Are Not Talking About," and the subtitle probably says it: we can have democracy, or we can have a surveillance society, but we cannot have both. So that sort of plays into what we're talking about. I haven't read it yet, just the headline, but it plays into it.
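The loss asymmetry the hosts are describing can be sketched in a few lines of Python. The prices here (a $100 entry, hypothetical exits) are illustrative numbers, not anything from the episode:

```python
# Compare the worst-case loss of a long position vs. a short position.
# Prices are hypothetical; "entry" is the price at which the position opens.

def long_pnl(entry, exit, shares=1):
    """Buy at `entry`, later sell at `exit`. Worst case: stock goes to 0."""
    return (exit - entry) * shares

def short_pnl(entry, exit, shares=1):
    """Sell borrowed shares at `entry`, buy them back at `exit`.
    The loss grows without bound as `exit` rises."""
    return (entry - exit) * shares

# A long position bought at $100 can lose at most $100 per share:
print(long_pnl(100, 0))      # -100 (the stock goes to zero)
print(long_pnl(100, 150))    # 50   (the upside case)

# A short opened at $100 profits if the stock falls to $50...
print(short_pnl(100, 50))    # 50
# ...but the loss is unbounded if the stock is squeezed upward:
print(short_pnl(100, 500))   # -400
```

The last line is the short squeeze in miniature: every dollar the price rises past the entry is another dollar of loss, with no ceiling.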
Katherine Druckman (35m 27s): Yeah. It's interesting to see that politicians of all sorts, and business people like the Mark Cubans of the world, and AOC and also Ted Cruz, all seem to be on the same side, which is, well, it's weird, but interesting. I mean, maybe people perceive this as the democratization of finance. I dunno. Doc Searls (35m 55s): What is democratization at this point, anyway? I mean, what does that mean? Katherine Druckman (35m 59s): I think we want to believe that it's power to the people, but I don't know if that's what it actually means. Doc Searls (36m 5s): Yeah. I mean, Plato warned that democracy is not a great idea in some ways, but then you have some elite in charge, and that may not be the best idea either. William F. Buckley said he'd rather be governed by the first 2,000 names in the Boston phone directory than by the Harvard faculty. Right? So that's people versus elites. Katherine Druckman (36m 30s): Yeah. At bare minimum, I think people are realizing the power of grassroots organizing. Yeah. Doc Searls (36m 43s): That's true. I have a bet, though, that in the long run the really positive kind of grassroots organizing is: we're going to vote for people who are working for us in our constituency, actually doing things that work for us, giving us laws that get us better infrastructure and better accountability and better courts and schools and whatever else truly matters. That would be good. I'm concerned about the kind of activism that's basically: we want this guy for emperor, and he's right and everybody else is wrong, which is kind of, Katherine Druckman (37m 20s): Yes, that is a bit disturbing, but I don't know.
I mean, maybe some good comes out of this Reddit GameStop thing. Maybe there will be a little bit of gentle regulation that benefits us all in some way. I don't know. I mean, we have a history of allowing predatory investment products. You know, Doc Searls (37m 47s): I've seen this scene before, Petros, in The Dark Knight Rises. I don't know if you remember it, however long it's been, with Bane and how he gets the market to crash. Oh, right. Yeah. I mean, we're living in a science fiction future right now. The interesting thing is the world is so messy, you don't need a Bane to make this stuff happen, or a Lex Luthor or some other evil super genius. All you need is chaos. Doc Searls (38m 29s): You just need the chaos that tends to happen when a bunch of people with weird influence, at a given moment in time, do something strange, and big things happen. Or, I mean, chaos could be caused by one bad person. An interesting thing to me is that Timothy McVeigh did what he did, he killed 168 people, but it had no cascading effects. I think he expected it to, but it didn't. So there we are. Yeah. Katherine Druckman (39m 4s): Well, I'm interested. Oh, that was it. We actually covered all three, maybe for the first time. Doc Searls (39m 9s): No, the Signal one. Okay. Yeah. I know there was another one. Katherine Druckman (39m 14s): Nope. We've got Signal, we've got Twitter's Birdwatch crowdsourcing, and we had this Robinhood thing. Yeah. Well, it is the hour, we hit the hour. Virtual high five. Doc Searls (39m 31s): Virtual high five. Yeah.
Katherine Druckman (39m 34s): So yeah, I think we conclude that there's a lot of information on the internet, and some of it can be pretty powerful. And now what? What do we do? Everyone's answer is different. Doc Searls (39m 48s): Oh, one of my favorite headlines from The Onion, from 1997 or something like that, was "Error Found on Internet." Katherine Druckman (39m 59s): It reminds me of the XKCD comic, "Someone is wrong on the internet." I love that one. Oh, I love that one. Petros Koutoupis (40m 7s): I shared that one with my wife, because she has a tendency to argue with people on Facebook, in groups and stuff. So as soon as I saw that comic strip, I immediately sent it to her, because I said, this is, you know, Katherine Druckman (40m 23s): Oh, it's perfect. It's evergreen, it's always relevant. Yeah. And maybe never more so than today, when everyone's trying to figure out, they've identified the problem that someone is wrong on the internet, and now we don't know what to do about it. And everybody's taking a different approach. Signal is saying, well, maybe it's not a problem yet, maybe we'll just worry about that later. And Twitter is saying, well, maybe all the users can fix it. And Robinhood said, no, let's just kick GameStop off our trading platform. Petros Koutoupis (40m 57s): Yes. I miss the days when the internet was just a bunch of Strong Bad videos and random blogs here and there. Doc Searls (41m 6s): Yeah, I miss it too. I had a lot of readers when my blog was a thing, a thousand years ago. Yeah. So, yeah. Let's see. Katherine Druckman (41m 19s): Cool. Well, if you've made it this far, thank you so much for listening to us ramble on about current events and try to solve the problems of the world in less than an hour. So please check us out next week. I'm sure we'll have something interesting.