Jonathon Morgan

Chitra Ragavan Founders Stories, When It Mattered

Ep. 4 — A Single Dad’s Quest for Parenting Advice Positions Him to Expose Russian Interference in U.S. Elections / Jonathon Morgan, CEO and Co-Founder, New Knowledge

In this episode, Jonathon Morgan, CEO and Co-Founder of New Knowledge, describes how the social media skills he acquired as a young single dad on a quest for parenting advice and online community gave him the tools, 15 years later, to uncover Russian interference in the 2016 U.S. presidential election.

Morgan talks about the difficult period in his life when he and his cybersecurity company were accused — he says mischaracterized — of using social media manipulation tactics similar to the Russians’ to influence 650,000 likely voters in an Alabama Senate election to vote for the Democratic candidate. And he shares what that experience taught him about leadership.

Morgan also looks at what it will take for society to move from what he describes as online mob rule toward an authentic Internet, and the price society will pay if we fail in that effort.

Transcript

Download the PDF

Chitra:   Hello and welcome to When it Mattered. I’m your host, Chitra Ragavan. I’m also the founder and CEO of Good Story Consulting, an advisory firm helping technology startups find their narrative. On this weekly podcast, we invite leaders from around the world to share one personal story that changed the course of their life and work and how they lead and deal with adversity. Through these stories, we take you behind the scenes to get an inside perspective of some of the most eventful moments of our time.

Chitra:   On this episode, we will be talking to Jonathon Morgan. He is CEO and co-founder of New Knowledge, one of the first organizations outside the US intelligence community to identify Russia’s systematic campaign to influence the 2016 presidential election. A social media and counterterrorism expert, Jonathon has advised the State Department and Congress, and he and his team produced the Senate Intelligence Committee’s extensive report that revealed the scope of Russia’s effort to sway the 2016 elections. Jonathon, welcome to the podcast.

Jonathon:   Thanks so much for having me. Excited to be here.

Chitra:   Tell us a little bit more about yourself and how you first became involved in social media and understanding its importance.

Jonathon:   Well, I mean, I guess it goes back kind of a long way. I lived outside the US when I was younger, especially in high school, and one of the ways that I stayed in touch with all the friends that I had back home was AOL Instant Messenger. This was the early days, and it was kind of like a social network at the time. It really got me into how much personal connection people could make online, and then I got into coding and building websites. It was really early days, and I was super geeky. I mean, this was almost 20 years ago now. I think ultimately I thought my career was going to go a different direction, but I kind of stumbled back into social media in the early days of online communities, just pre-Twitter, to properly date myself, in early adulthood, right after I’d had a baby.

Chitra:   So how old were you then?

Jonathon:   Well, it would have been 21, which was daunting at the time, and actually I think that was a big part of it. I was kind of in a weird situation where I had a kid at a really young age. For me anyway, it was very unexpected, and it was an odd time. I think now it’s pretty commonplace for young parents to retain some sense of their identity outside of being a parent. There was that whole, you know, hipster parenting movement 10 years ago, so now it’s kind of normal. But at the time it still felt to me like the only model that I had for parenting was my parents, which, again, at 21 felt really weird.

Jonathon:   It’s like, oh, am I done? Am I done being cool? Am I done being relevant? Is the idea now that I just, you know, kind of get a job and act like an old person? Now, as a parent with a 15-year-old, I look back at the way that I used to think about parenting and I’m super embarrassed. But nevertheless, there I was at the time.

Jonathon:   But yeah, so I kind of needed to connect with other parents who were parenting differently, or parenting in a way that felt right to me, and I just didn’t know that many people where I lived. It was the early days of parent blogging, when the first couple of parents were starting to talk about their experiences online in a really real and vulnerable way. And it was a great insight into how other people thought about it and how they were going through the process. I built a really strong community of parents who were going through some of the same stuff as I was.

Chitra:   So when your daughter was a baby 15 years ago, in a strange way, social media, and social media influencing and building community using social media, was also in its infancy. And as she grew, so did your knowledge of and sophistication about social media. So how did that happen? What was your next career move, and how did that shape your understanding of social media?

Jonathon:   Well, weirdly, the two things happened at kind of the same time. As parents of young kids probably know, when they’re little the schedule is really weird. They wake up at weird times in the morning, they have to nap, and if they don’t eat right on schedule, the entire day blows up in your face. And there’s daycare, but daycare is only open from some weird time in the middle of the morning, after most jobs start, and it finishes at three or four in the afternoon. So just trying to make that scramble work was pretty tough. At that point I’d been writing about my experiences and building this community, and I actually started working as a professional blogger, initially writing about parenting. But also, and it feels weird to say now, but just to anchor everybody, it was back in the days when people still used to talk about the important difference between print and digital as if they were two totally different worlds.

Jonathon:   And so those of us who were early on and kind of understood blogging and understood how content worked online would get put into positions where we were running websites, or kind of running publications, about things we didn’t need to know anything about. I knew a little bit about parenting, but I didn’t know anything about fashion. I didn’t know anything about green technology. I didn’t know anything about these other websites that I was running. And the only reason that people like me were in those positions is that somebody would say, “Well, look, all the real journalists are writing for the print version of Teen Vogue or something,” if it was a fashion publication. But they would look at me and say, “Well, that nerdy guy, he knows how the Internet works, he gets this whole blogging thing.”

Jonathon:   And that really just meant that I was comfortable like filling out forms and a like a content management system or something. But it was totally foreign to all of the who were used to working off-line. And so, and just basically started to build that, build that profession in the really early days of professional blogging kind of right around 2005 just before Twitter launched. And right at the age when my, when I needed to have this super flexible schedule because my, you know, daughter was in and out of daycare, she was on the normal little kid schedule where, you know, every nap is super important. And so it just, I don’t know, just kinda found each other, which was, and it was kind of a fun time to be online because I don’t think it just, it was, it was really the wild west. There just wasn’t a sense of whether or not this was even going to be real longterm or whether it was just kind of a temporary fascination.

Chitra:   So your next kind of layer of understanding of it came, I guess, when you were at AOL, right? And you had to promote a lot of these blogs.

Jonathon:   Yeah, I mean, this is the other really strange thing about the Internet at the time. People were already starting to figure out that it was really difficult to keep track of the vast volumes of content that were on the Internet, and so there were systems for discovering new content. It was pretty much very nerd-centric, you know, kind of for-nerds-by-nerds social media communities. These were websites like StumbleUpon, if anybody remembers that, or Digg, and Digg was a precursor to Reddit. The basic premise of a website like Digg was that you would submit stories to the Digg community, and then people would up-vote stories that they liked and down-vote stories that they didn’t like. And the more up-votes your story got, the more it was shared with a wider audience.

Jonathon:   And the goal for anybody submitting stories was to get on the Digg front page because that got a ton of traffic and everybody went to the Digg front page because it was curating the best content on the Internet. And so it was just kind of a, it was, it was the early days of crowdsourcing when people believed that there was wisdom in the crowd and that if something was popular, it was probably inherently valuable. And it was part of this, I think, kind of this grand idea that we were removing all of the traditional gatekeepers for information and content and we were just giving that power directly to the crowd. And so if people believed in it and people wanted to read it, then, then it was better. And so it was kind of a, I mean, so sort of a beautiful idea and I was really attracted to it at the time.

Jonathon:   However, I got a little bit disillusioned because it’s, that’s not really how it worked for better for worse. It turns out that in order to reward people for participating in that system, websites like Digg would start to give certain users more authority than others based on their historical success. So you know, if you, if you posted something to Digg and it got popular, you were like a more valuable user to Digg than somebody who’d never had a story get popular on their platform. And so whenever you submitted something, their algorithm would prefer it and it kind of give it a little bit of a boost right out of the gate.

Jonathon:   And so kind of very early on, you could see these social media platforms recognize that users who generate high levels of engagement are valuable. But of course that’s not the way that we’re thinking about it at the time. At the time we were, all I’m doing is writing a bunch of silly stories about parenting and celebrity fashion and and whatever. And so, and it’s important for me as the, you know the person running these websites to get traffic to those websites.

Chitra:   And you figured out a way to do that.

Jonathon:   Yeah, that’s right.

Chitra:   How did you do that?

Jonathon:   Well, it turns out there was a very popular user on Digg whose handle was Mr. Baby Man, and I, to this day, have no idea why he chose that. Later he became kind of well known; around 2007, 2008 you can actually go find stories about this guy called Mr. Baby Man. He was the most popular user on Digg. Everything that he submitted went to the front page, and he was basically curating the Internet for what was, at the time, a super popular website. Mr. Baby Man was its most powerful user. And before he was being written about in, you know, Wired and whatever Internet-focused publications were fascinated by this idea, he was just some guy who everybody knew was super popular. And so I thought, well, here’s what we need to do.

Jonathon:   We need to make friends with this Mr. Baby Man guy because everything that he submits goes to the front page. And so, and with a little bit of a sense, kind of reverse engineering how the Digg algorithm probably worked. It started just, chatting with Mr. Baby Man over AOL Instant Messenger, kind of making friends, learning what he was into and would send him stories that we were publishing. And so if he thought they were cool, he’d, he’d put them on Digg for us which was way better than me doing it myself. And then as soon as he submitted it, like the trick was, that then you’d go back into the AOL chat room where all of the different writers from all of the different blogs that AOL owned hung out, and you’d say, “Hey everybody, I need you to go up-vote this story on Digg right now.

Jonathon:   It just got submitted. It’s going to be a thing and I need your up-votes. And so then you’d get dozens or hundreds of up-votes within like 30 minutes or an hour, which was a signal to the Digg algorithm that that story must be really popular. And so it was like clockwork. Like Mr. Baby Man would submit it, all the writers in the AOL chat room would go up-vote it and it would get to the front page of Digg and it would send lots of traffic to our websites. Which, at the time I was like, oh this is great. Like it was just kind of a, you know, a way to hustle and get more attention to the content that we were publishing. And it was pretty innocuous, you know, it was like silly celebrity stories or whatever. But it, I think in the back of my mind it was like an itch.

Jonathon:   Like wait a second. This was supposed to be based on what was inherently valuable and it’s clearly not. It’s clearly possible to figure out how to identify how the system works and then almost find the gaps in the system and then you can sort of capture the attention of tens of thousands or hundreds of thousands of people. I can kind of hijack it basically, which again, it, it didn’t mean that much at the time. I think everybody sort of knew that this whole idea of like the wisdom of the crowd was a facade. It was a nice idea that didn’t really work in practice, but it was, I don’t know, it was kind of an interesting moment. I think it, it revealed a little bit, I guess opened a door that that maybe the Internet didn’t work like people thought it did.

Chitra:   And that was more than a little breakthrough for you. I think it was a big leap in understanding, because from that you started to move toward really understanding how social media could be used to manipulate the public. And the area that you next stepped into, moving away from the lighter teen fashion stuff, was actually conflict and terrorism. How did that happen?

Jonathon:   Years later my career changed. I got into software engineering and then ultimately machine learning and data science, so I kind of left that social media world behind for a little while. Then I started working with a nonprofit group that was interested in crowdsourcing information about natural disasters and conflict. It’s a group called Ushahidi. They’re still actually used around the world for monitoring conflict in the aftermath of elections, in parts of the world where there’s a lot of instability around elections and democracy is still a new system. They had a platform that would receive text messages from people, so it was kind of a one-to-one platform, and what I was focused on was trying to figure out whether we could get that same information from general social media: Facebook, Twitter, whatever. This was maybe 2012, 2013, right around the time when people were starting to say, “Hey, promote your small business on Facebook.”

Jonathon:   Like are you a business owner? Don’t forget you need a Facebook page. And that was kind of this like the, the most innovative online digital marketers were giving that kind of advice to, to people with small organizations that they were trying to promote. Which seems super naive now, but I think it’s important to remember at the time that was actually kind of useful advice for people, many of whom didn’t even have personal Facebook pages at the time. From that premise, it turns out that it wasn’t just people trying to promote their business or their sports team or their afterschool club or whatever. Literally everybody who had an interest in promoting their organization was on social media, including terrorist groups.

Jonathon:   And I think that was the, as I was hunting for information about conflicts online, started to discover Facebook pages, YouTube channels, Twitter accounts, Twitter networks that we’re using the platforms exactly as they were intended. But from a point of view that I don’t think any of the platform designers had ever anticipated. And so these were like in the conflict in Syria had just sort of fractured. It was no longer like the resistance against the Assad regime. It was dozens of different groups, different factions, some aligned with Al-Qaeda, some aligned with what later became ISIS, some rebels aligned with the US and then the Assad regime itself. And so it was just as real hornet’s nest. It was a mess inside Syria of all these different groups. And they were all vying for attention on social media so that they could recruit people to their, cause. It turned out just by monitoring these online groups, we were able to get a sense about the level of violence in any given region inside Syria that was on par with the, on the ground reporting that NGOs in the area were doing.

Jonathon:   And so, you know, NGOs at the border would try and get a sense from, you know, refugees leaving conflict areas, how many, like how many attacks there had been in the past, seven days or how many people had been killed or how many of what sorts of bombs had been dropped. Was anybody using kind of the types of chemical weapons that the Assad regime was rumored to have used. And we could get a similar level of intelligence just by scanning social media posts from Facebook and Twitter. Just showing like how much information was actually available and how these groups were, again, like using these platforms as designed but to capture the public attention in ways that I think, again, nobody really anticipated.

Chitra:   And understanding these facades of fake popularity, right? Pretending that you have a bigger reach than you actually do. Understanding that led you to ISIS and how they were using that on social media.

Jonathon:   Yeah, that’s absolutely right. That discovery led to investigating how groups were manipulating the mechanics of social media. And shortly after, in a weird way, it was like, oh, I’ve seen this before. In fact, I know exactly how this works. You just need to figure out how popularity is determined in any of these online systems, and if you can mimic that popular behavior, then you can, in effect, make something popular. Just in the way that we realized the path to popularity on a platform like Digg was to find an influential user and then get a lot of up-votes really quickly, it turns out that the path to popularity at the time on a platform like Twitter was to use a popular hashtag and tweet the same thing at the same time from lots of accounts.

Jonathon:   Groups like ISIS had figured that out. And so this was maybe 2014, 2015 when ISIS was dominating Western media. All anybody talked about all the time was ISIS, ISIS, ISIS, and for sure they were a serious threat in a very volatile part of the world. And they’d committed, they’d done some very like atrocious things. But what I think most people didn’t recognize is that the real success of ISIS was that they, at least in terms of kind of raising their profile, was that they’d figured out how to manufacture popularity on a platform that was heavily used by journalists who would then take the fact that they were, that ISIS was influential on Twitter as a proxy for their influence in real life and journalists were in effect laundering the ISIS narrative and publishing it in the mainstream media and kind of making it real and, and in effect kind of telling the ISIS story on their behalf.

Jonathon:   That led to some research with the Brookings Institution with a a really fantastic researcher on extremism called J.M. Berger. And our mission was basically to figure out how many real ISIS supporters were there on Twitter and then what were the ways in which they were using this large network of accounts, some of which were run by individuals, some of which were automated. Kind of what were their tools and tactics and techniques to manipulate social media to create the impression of popularity and strength. And that was it turns out that in fact they had a very small number of accounts. They, it was maybe 40,000 accounts that they controlled at any given time. But even with that small number, because of the way the Twitter operated, they were able to create the impression of being a huge global force.

Chitra:   So then comes the next level of your understanding of social media manipulation. And that came not from terrorism but from the political arena, in the lead-up to the presidential election. How did that happen?

Jonathon:   Well, I think what was also important to understand about the way a group like ISIS could manipulate social media is that it wasn’t just manufacturing popularity. What they were also able to do was exploit the fact that the Internet is a series of echo chambers. The way that everybody thinks the Internet works is that it’s like a big cocktail party or something, you know, everybody’s just hanging out and having chitchat and conversation, and then the most interesting chitchat kind of percolates to the top, and that’s the stuff that most of us in the mainstream interact with. Which, of course, as we’ve been talking about, has just never been true.

Jonathon:   And so the, at the same time as these, we’ve been talking about how different, how the system had kind of vulnerabilities over time. And the reason that that happened is because the designers of these social media platforms recognized that the most value in their platforms came from high engagement users and that small networks of highly engaged users, were kind of good predictors of the zeitgeist.

Jonathon:   And so those are like, they’re, the systems became, their algorithms were designed to encourage this type of behavior. And so what they encouraged was small insular networks of hyperactive users who were hyper engaged and would promote content at high volume and those were the most successful valuable users on their platform. And they designed everything to accommodate it. c

Jonathon:   And so that was an important foundation to understand because right after working with the Brookings Institution, that led to some work with the State Department where we were trying to investigate this, almost like the systems level problem. How did we get to the point with our modern information ecosystem that a small group like ISIS could not only manufacture popularity but exploit this echo chamber dynamic and pull people into online spaces where it was easy to radicalize them very quickly. And, and you can use radicalization as a proxy for kind of most extreme social behavior, even political polarization, obsessive fandoms. There’s not, any time when people get kind of hyper obsessed and hyperactive around a single idea, at least online, the mechanics are very similar.

Jonathon:   And so kind of how did we get to this point and what were the implications. And I think that, it led us to develop some technology approaches where we can kind of quantify this. How can you, how can we A. Identify these networks quickly? How can, and then how can we measure their influence on the larger conversation and start to understand some of these dynamics? And we started to find things that if you had, if an, if an online group could coordinate its activity, whether through automation, like social media bots or, or whether it was by kind of galvanizing large groups of humans. If they could own 1% of the conversation, 1% of the posts on a given topic, on any platform, Facebook, Twitter, YouTube, whatever. If they could capture that number of posts, they could, they could start to measurably change the language of the larger community that they were operating in.

Jonathon:   So, you know, 10,000 accounts operating on a major candidates Facebook page, those 10,000 accounts by posting in high volume could, could kind of fill 1% of that conversation. And over time, the other millions of people of people who were on that platform or who were participating in conversation on Facebook page would start to change their language in a measurable way over time. Well now we’re like in early 2016 what I thought was happening as I was kind of designing technology that would absorb content from different social media communities like Facebook and Twitter and Instagram. And I was measuring the change and the language of these communities over time and whether or not they were showing signs of radicalization. And it certainly seemed like they were, like their, their language was becoming increasingly like the, the kind of the, the rhetoric in these online communities was becoming increasingly antisemitic, increasingly violent and increasingly polarized in a way that if it were real, if it actually represented the kind of the point of view or the ideology of the mainstream American public, we would’ve been looking at, you know, like a civil war level uprising.

Jonathon:   An actual, like a fracture in the country that, that the, you know, was going to lead to violence in the streets, which didn’t really seem consistent with how things were happening in the physical world. So as scary as it seemed that it was almost like this, this couldn’t possibly be accurate, and the only way that this much change in ideology and language could happen across these many different social media platforms, like the only way that that could happen that quickly is if somebody was doing it on purpose. You know what I mean?

Jonathon:   So either it was a public uprising and we were gonna see like militia in the streets or somebody was trying to manipulate the public or trying to orchestrate a large scale social media campaign, the size of which we’d never seen before and kind of predicted that it was possible based on the way that we’d seen smaller groups like ISIS manipulate the conversation but had never seen anybody follow through with. And so pretty quickly started to believe that it would have to be a campaign at the size of a nation state, like a, it’d have to be funded by some organization as big as a government because it had clearly been planned and executed over a number of years. Just the technology infrastructure required to run a campaign of that size would probably cost millions of dollars. And so started to quickly realize that there was a, a nation state campaign to manipulate the American public and the most likely culprit was Russia.

Chitra:   And where did that knowledge take you? How did you then end up partnering with the US government to help them understand what was going on? Especially as we got closer to the elections, when there was a concerted move by the Russian government, as the reports by the intelligence committees have now shown, to move the election away from Hillary Clinton and toward Donald Trump.

Jonathon:   I mean, I think at the time it was really anybody who would listen. I was on a little bit of an almost evangelical tour, speaking at events, briefing security officials, writing in publications. I was writing in the Washington Post and The Atlantic, just trying to get the message out as strongly and quickly as possible that this was happening on a scale that I don’t think was fully appreciated. There’d been a publication by the Director of National Intelligence about the fact that this type of campaign was occurring. There’d been some rumors about the scale. There was an increasing awareness of it. But I think at the time, my mission personally was just to put that information in the hands of as many people as possible, particularly security officials inside the US government. Even if we couldn’t mitigate the impact on the 2016 election, we could at least raise awareness so that we could quickly take action and stop influence on future elections.

Chitra:   And so you and your team then worked with the Senate Intelligence Committee to put together the 101-page report on how the Russians tried to manipulate the elections. And that was, I guess, a significant inflection point for New Knowledge, your company.

Jonathon:   That’s right. We knew that in order to repair this problem, not only would we need to directly address the security issue of foreign interference in US elections, specifically through online disinformation, but we’d need to start to figure out how we would repair the global information ecosystem. And so we had to develop technologies that would identify this type of activity, measure it, and describe it as quickly as possible. That led us to developing a set of tools that we could use to investigate the data that the Senate Intelligence Committee had acquired as part of their investigation into how these different social media platforms were exploited in 2016 and 2017. We really felt like it was an important part of the company’s mission to use that technology to inform the public as best we could about how those types of campaigns were conducted and what their impact was.

Jonathon:   And so there was a team inside the company that was dedicated to that work with the Senate Intelligence Committee for the better part of a year investigating, you know, millions and millions and millions of documents from Google and YouTube, from Twitter and Facebook and Instagram to really get a sense of kind of the scale of the operation, the tactics, the focus, and then the consequences. And then that team, to your point, put together a report, an exhaustive report about what we’d found by investigating and interrogating that data and made it available to first the Senate Intelligence Committee and then of course later to the public.

Chitra:   So clearly your own influence as a leader was growing, and it must’ve been a really heady time for you and the company. But then you kind of took a stumble, because you and New Knowledge were accused of using some of the same tactics as the Russians to influence 650,000 likely voters in a 2017 Senate election in Alabama between the Democrat, Doug Jones, who would win the race, and his Republican opponent, Roy Moore. You were accused of running some kind of online campaign to influence votes toward the Democratic candidate, including creating a fake Facebook page and manufacturing a bot network on Twitter and Facebook to target the Republican candidate. Could you tell us a little bit about what happened and how you ended up on the wrong side of this?

Jonathon:   Yeah, in fact, that was a really difficult time. I think what we didn’t anticipate, and what was really a surprise, is that we see ourselves as a small company, as individual researchers who’ve just been passionate about trying to understand this problem for a long time. We were really taken aback by the amount of attention and exposure the company got, and we as individuals got, after the report to the Senate Intelligence Committee was released. Like you say, it was kind of a surprising and heady time. And very shortly afterwards we found that we were the target of reporting that was much less interested in the good we were trying to do in the world and was in fact interested in trying to undermine the company, or undermine us as individuals, by taking, I guess, the most negative view possible of research that we’d done in the past.

Jonathon:   And so that was, it was just a, it was a learning experience. I think it was pretty difficult to navigate that. And I think it was just skills that we didn’t have to recognize when there was the reporters pursuing a story, didn’t have a charitable interpretation of what we were working on. And frankly, I mean, and I don’t mean this in a, oh, I guess I don’t know how I mean it, but I think weren’t really interested in our side of the story and were kind of pursuing a different objective. And I think that’s, of course well within their rights to do that, but was unexpected and just something that we didn’t know how to navigate.

Jonathon:   And so I think that was a, we, we, we had to learn a little bit like if you’re going to be that high profile, you’re going to need to learn how to take a punch and, and, really, just make sure that we were focused on our mission and focused on the things that are priorities both for us and I think for our larger mission of restoring integrity to the information ecosystem.

Chitra:   But it wasn’t just research, right? Facebook suspended five people, including yourself, for what they called “coordinated inauthentic behavior” around the special election. Were you right to take on that project? I mean, looking back as a leader, were there lessons learned from how that unfolded and how you look upon that moment in your life?

Jonathon:   I mean, looking back, this was research in 2017. The Facebook page that we created was relatively innocuous, and the people who administered the page did it using their real names. But the way that it was characterized, I think almost rightly from Facebook’s point of view, led to a situation where it was important for them to be seen to take action. And so the fact that they suspended my account, my personal account, I think was a reasonable thing for them to do. However, I think that ultimately that was because of how a relatively innocuous research project was characterized.

Jonathon:   Regardless. I think that, you know, I think what, what was important for us to understand is that in 2017 it seemed like this was an, it was an urgent issue and something that, I found myself in a position to try and test something that I thought was incredibly important and an essential way to a technique that had been widely believed to reduce the amount of influence or reduce the amount of polarization and radicalization online.

Jonathon:   And so it’s difficult to find opportunities to test that type of theory. And it was a moment where I felt like there was a lot of value in being able to test it. But I think that what I clearly failed to do was recognize the way that that research might be perceived in the future. It didn’t document the research particularly well. Didn’t share our findings in a way that was open and transparent at the time because the findings I didn’t find to be particularly interesting.

Jonathon:   But of course, that didn’t matter and I just didn’t, I think were where there was clearly had made some mistakes, was having the foresight to know how that research might be interpreted or kind of reinterpreted in the future and make sure that I had taken some steps to both communicate my intentions and be transparent about the actions that we took so that, for better for worse that we could stand behind our choices and let the public decide in a way that we at least had some input into that story as opposed to letting it be characterized only by people who had a very uncharitable view of, of the actions that we had taken.

Chitra:   And clearly you’ve moved past it now, and your focus has expanded well beyond the political realm. Now you’re primarily working with the corporate world, looking at disinformation campaigns around products and services. Are you finding things similar to what you’ve found in all of these other areas?

Jonathon:   Yeah. A lot of the work we did initially, and the way that the public has understood this problem, has been very focused on security, national security, and election integrity in particular. But I think for us what’s always been true, and what continues to be true, is that the real issue is that this is a design flaw. The reason that disinformation is possible, the reason that radicalization happens online, all come back to some fundamental design flaws in the modern information ecosystem.

Jonathon:   Ultimately, the way that we interact with each other and share information online, it just doesn’t work like people think it does. And that system, sometimes it’s exploited intentionally by bad actors like terrorist groups and or Russia and the 2016 elections. Sometimes those dynamics are exploited by groups that are operating with good intentions, that are just trying to kind of shed a light on something or, or kind of get attention for something that they’re passionate about. And I think what makes this difficult is it for anybody who’s a stakeholder in this, in this system, whether you’re a brand is trying to communicate your values to the public.

Jonathon:   Whether you’re an individual with a reputation to protect, the fact that this system is run by, it’s not really run by individuals. It’s not even really run by the social media platforms themselves. It’s kind of governed by mobs for better, for worse, it’s kind of mob rule on the Internet and the, and in fact it’s those dynamics that make disinformation and radicalization and other phenomenon like this possible. And so I think it’s always been true for us that we needed to develop a technology that would get at that root problem. And that root problem is what ultimately is valuable. Solving that root problem is what’s valuable to our commercial customers. In addition to kind of any government partner.

Chitra:   Jonathon, this has been a fascinating conversation. Do you have any closing thoughts?

Jonathon:   I think, you know, what I’d like people to take away is that this is a solvable problem, but as a society we all have to want to be behind a more authentic Internet. This is a once-in-a-generation problem that we all have some responsibility in solving, and I’d say the consequences for failing to solve it are pretty severe. There’s an end state, if we don’t address this problem sufficiently, where nobody trusts anything that they read online, where we can’t trust the connections that we make online, and where the kind of rapid spread of ideas that’s allowed for amazing social progress over the past couple of decades starts to go away.

Jonathon:   It starts to erode. And I think we’ll find ourselves back in a situation where the only reliable information we get is from gatekeepers who are ultimately vulnerable to being swayed, to shape the world from their own point of view. And as a, as a society, as a groups of individuals, we really lose access to the power of the modern Internet if we don’t safeguard it. And so I think if you look back, you know, 20 years to when the Internet was static pieces of content that at best you could search with an engine like Google. We moved from a static Internet to a social Internet where we could share information directly with each other and in a way that unlocked a lot of power but also a lot of vulnerabilities. And the next step has to be an authentic Internet where we have the same level of understanding about how we receive information, who wants us to receive that information, what their objectives are, just the same amount of context that we have in our offline life.

Jonathon:   We have to be able to take those values and encode them into our digital online lives. I think it’s possible, but I do think it will take the platforms, the brands that fund the Internet, legislators, and the public to really work together to address what I think is the most pressing issue of our generation.

Chitra:   Well, thank you so much for joining us. Where can listeners learn more about you and about New Knowledge?

Jonathon:   The best place is to go to our website. We’re at newknowledge.com.

Chitra:   Thanks so much.

Jonathon:   Thanks so much.

Chitra:   Jonathon Morgan is the CEO and Co-founder of New Knowledge, an Austin, Texas-based cybersecurity firm.

Chitra:   Thank you for listening to When it Mattered. Don’t forget to subscribe on Apple Podcasts or your preferred podcast platform, and if you like the show, please leave a review and rate it five stars.

Chitra:   For more information, including complete transcripts, please visit our website at goodstory.io. You can also email us at podcast@goodstory.io for questions, comments, and suggestions for future guests. When it Mattered is produced by Jeremy Corr, CEO and founder of Executive Podcasting Solutions. Come back next week for another episode of When it Mattered. I’ll see you then.