Long-read: Big Tech's election year, in conversation with Katie Harbath
A discussion on tech's role in political communication, the risks of AI and what happens if there's another insurrection.
I’m very pleased to bring you this collaboration piece with Katie Harbath at Anchor Change.
I first met Katie when she was working for Facebook and was in Australia to advise ahead of an election. At the time I was a political reporter in the Canberra Press Gallery, and we met to chat about the ways social media could be used for political communication.
Fast forward nearly ten years and we’ve bumped into each other here on Substack and have been reflecting on how much the tech environment as well as the way we view political campaigning and communication have changed since we first discussed the topic.
It’s fair to say we were far more optimistic about the role social media would play than events went on to show.
So we sat down to discuss these issues. Paid subscribers will be sent a link with an option to watch and listen to our full chat.
Below is an edited transcript of our discussion.
Latika M Bourke (LMB): Katie this week is quite momentous because Facebook turns 20. I can remember when I first started using Facebook, it was 2006 and I was working at Radio 2UE in Sydney and I logged on to this site — somebody had told me about it — and it was back in the day when you could graffiti each other's walls.
That was the height of Facebook! And I used to get in trouble for using this ridiculous social media site during work time. And I think the data usage was so huge that I would get reports back from my superiors from when I was working at 2 am on a Saturday night and logging onto Facebook, saying please stop using this, you will get reprimanded if you use it.
Fast forward 20 years and I think it's almost impossible to be a journalist without social media.
From where you began with Facebook how have you seen it evolve from your side?
Katie Harbath (KH): When I started I still had to use an alumni address, because Facebook wasn't open to everyone when I first signed up.
And so 2005 - 2006, I had just finished working at the Republican National Committee for President Bush's reelection.
And you know, those early 2000s, it was a lot of trying to convince politicians to be on this site, because they'd be like, ‘Why do I want to be somewhere where people are just posting what they had for lunch that day’, right?
But YouTube was coming out around that time and politicians started to see people uploading videos of themselves and they started to see other politicians start to use it.
And going into the 2010 - 2011 timeframe, then you have the Arab Spring - that was probably the height of people looking at social media and being like, ‘Wow, this could be a great democratiser.’
You now have a lot more people that can have access to those that represent them, you have a lot of different ways for people to gain awareness for the issues and the candidates and stuff that they care about.
And looking through those rose-coloured glasses, that really lasted through until about the end of 2015 and the beginning of 2016, after President Obama's re-election in the United States had seen so much use of social media.
And that's how you and I met in Australia, because all these political parties and others were kind of like, ‘how do we do what Barack Obama did?’
Politicians were getting a tonne of great press because they were seen as new and cutting-edge and using different tools to reach voters. They were using data to better target voters and that's when everyone was like, ‘This is really cool’ and not necessarily ‘Oh my gosh, what's this doing to my privacy? And how's this used to persuade me etc.’
After 15 years of having blinkers on to the negative impacts of social media, we are now in this process of really still trying to figure out how we mitigate the bad behaviour and the negative effects of social media while still amplifying the good.
It can be very easy for people to forget that there are still a lot of good aspects of social media that are a part of our society. We just have to figure out how to find that right balance.
LMB: Do you think Facebook is still the social media medium when it comes to electioneering? Because, from my experience talking to digital campaigners in various political parties across jurisdictions and in different countries, it seems like there's a medium for each election. So back when you were beginning, it was YouTube/Facebook, then it kind of moved to Twitter, to Instagram. I don't think we'll see it on Threads but we've definitely seen a huge surge in TikTok use. Where do you think Meta sits in all those platforms?
KH: This election a lot of people are talking about AI and there are a lot of different platforms.
People use a lot of different online platforms for a lot of different reasons and it kind of depends on where you are.
So in the United States, candidates are definitely still advertising on Facebook, it's still a place to get email addresses, and raise money and in many other countries around the world, Facebook is still the dominant platform.
Some places like India have banned TikTok so apps like Snapchat are also rising.
It'll be really interesting to see, as we go through this election cycle, how much activity is actually happening on the Facebook app versus moving to some of these newer apps that might not have as many protections on them.
LMB: Is Meta the most protected in terms of those protections you're talking about?
KH: It depends. I've been mapping out and comparing the tech company announcements for 2024 versus past elections and it's almost kind of hard to compare election cycle by election cycle because so much stuff changes between each one.
Places like Facebook, Google and Microsoft are probably the best equipped in terms of the enforcement of a lot of this, particularly around threat intelligence and looking for foreign interference.
But even then we're already seeing examples of these companies struggling to detect AI-generated content automatically. And I think that is going to be a place where all these companies are still trying, and the technology is just not there yet to be anywhere near 100 per cent accurate.
And we’re never going to hit 100 per cent accuracy at finding some of this stuff, so we’re going to see new challenges. But they have put the most money, time and resources into election work.
LMB: So there’s a difference there because you're talking about infrastructure companies of the internet versus the dissemination companies, which are the social media ones.
KH: Every platform is a little bit different in what they choose to do and even what the problems are on that platform. So you're right that it does depend on what kind of company they are and how those problems might manifest.
LMB: Before, you said 2015-2016 was kind of that point where things began to flip a bit for tech and their role in campaigns. And that, of course, was Brexit, that was the year of Trump, Cambridge Analytica. What changed in that year, in your view, that made tech what it is now, which I think is in many ways the villain of political campaigning and electioneering?
KH: Yeah, I think so.
The big date for me is May 9, 2016, because that was the date of the Philippines election, as well as when a controversy broke against Facebook: a contractor accused the company of suppressing conservative content in its trending topics unit.
That broke open into the public and a lot of Republicans complained about the social media platforms, which they always view as being liberal — that's not true — but that was the perception they had.
Brexit and Trump were two election results that really shocked people, they were not expecting those two things to happen. And when that happens, people are searching for answers. And they don't want nuanced answers they want something they can get their head wrapped around.
Even with Brexit there wasn't a ton of techlash towards companies; it wasn't until after November, when Trump won, that people really started searching for those answers.
And the Trump Team is like ‘We won because of Facebook’. Now, when Barack Obama did that in 2012, we were all singing from the rooftops, it was seen as a really good press story. Now, it's a really negative one.
Then people are like, ‘This thing had an impact that we didn't realise was happening and we want to dig into how Trump won,’ but ‘Wait a second, we can't see the ads that he ran’.
Then you have Trump, throughout the campaign really pushing the boundaries of content moderation, pushing the boundaries of just campaigning overall with the language and the rhetoric that he is using and also really testing these companies, their rules and whether or not they're going to apply to a politician or not.
And those are really tricky conversations, many people have different viewpoints of what should happen and it continued to snowball, as people debated this.
Right after the November election, it wasn't Russian ads, it was Macedonian teenagers running fake news, and people were like, ‘Oh, wait, was that it? Was that what caused it?’ Then we announced that we had found the Russian ads and people were like, ‘Oh, it was the Russians, the Russians caused this to happen.’
And then like you said, Cambridge Analytica: that scandal broke in March of 2018 and then people were like, ‘Oh, wait, it was the data. It was how they were targeting.’
So wave after wave after wave kept building upon itself. And ever since then you just keep having more of these stories and leaks coming out, where people are getting a better sense of what's happening underneath the hood, and of what they don't know is happening underneath the hood, when before they didn't even know to ask those questions or to be concerned.
LMB: From an outsider's perspective, it felt like two things were happening. One, Facebook was quite happy, originally — or the social media companies at large were very happy to host this content — and then when it all got too hot for them, they went ‘We're going to downgrade news, we don't want to do links. People just want to share their coffees and pet photos so we're going back to that.’
So it kind of felt like it got a bit too hot in the kitchen and then tech wanted to vacate that space. Is that the way that you see it? Or is that a bit unfair? And is that ultimately, particularly in Meta’s case saying ‘No, we're giving up?’
KH: I don't think it's that they're giving up, they’re businesses and they’ve got to keep growing and make money.
And at Meta, after so many years of challenges and scandal, their approval rating with people was quite low. Morning Consult did a poll showing that people like Facebook more with less politics and news, so if you're a business person, if you're a leader at the company, and you're looking at that, you're like, ‘What am I getting out of doing anything around politics?’
A lot of it shifted with where society as a whole was. Because Barack Obama was such an inspiring figure and people were excited about politics and being involved in it, it was a positive thing to be a part of.
Since 2016, and in this Trump world that we've been living in now, for eight-ish years, it's been a negative experience, it's gotten a lot more polarising and so people do want to pull back because they feel like they can't win.
They don't know which way to move, it can be very paralysing. And then the question is, you get distracted trying to deal with that versus constantly innovating or building new things.
I always tell platforms, you can run from politics, but you can't hide. If you're a place where people are at when an election cycle happens, people want to talk about that. And so you have to have a plan, you can't just be like no politics on our site, because that's just not going to be realistic, you're still going to have things that you have to deal with.
Even if they're showing less of that content they still know they have a responsibility.
I do worry as we go into this epic year of elections that people are burnt out on the news. It’s been a tough four years, we're coming up on that four-year anniversary of COVID.
In the United States, people are not wild about a Trump-Biden rematch - the Iowa caucuses saw some of the lowest turnout in years. Some of that was the weather, but it was a really low turnout.
Then when you have these platforms not necessarily showing as much of this content I worry about what this could mean for civic engagement going forward.
Are we going to enter a period where people put their hands up, not wanting to be involved? That tends to cede the field to the extremes, which could further exacerbate the polarisation that we see all around the world.
My hope is that we can find the right balance between how we help people get information to stay involved in politics and care about it while also trying to figure out how we hold people accountable for their speech.
It's going to be chaotic for a while.
LMB: What do the tech companies do if we are in a situation where one of the nominees is potentially back where we were on January 6, and disseminating messages that many would say are inciting violence or trying to overturn a democratic result? This is a whole new world, isn't it, for the tech companies?
KH: It is and it's a more complicated world, too, because there's a chance that Trump will only post on Truth Social. He's running ads on Facebook and other places but thus far he's been able to just post there and then others share that message.
LMB: His team is still very active on X.
KH: Yeah his supporters, his campaign staff and others, very much are on these different platforms and they are sharing it.
A lot of the platforms are going to be reluctant to kick any candidate, anywhere in the world, off the platform before an election, and I wish the Facebook oversight board would do a little bit more on this, trying to figure out what the right approach should be.
More likely than not we'll see them reducing the reach of a lot of this type of content, to try to make sure that fewer people are seeing it.
LMB: That goes back to square one, doesn't it? ‘You're suppressing us’.
KH: Exactly and so how do you pick? That's why you are seeing the platforms move a little bit back towards a little bit more of a hands-off approach.
You can't win. Whatever happens this year is not going to look exactly the same as what January 6 looked like, it's not going to look the same as anything else.
I get asked a lot, are the tech companies prepared for this year? And the truth of the matter is, we don't know.
Until we actually see their decision-making processes in real-time as these types of events are occurring, how transparent they are, and how well they have been at detecting and knowing themselves what's happening on the platform, it's going to be really hard to answer that question until after the election, or even many years after it, because I would argue that we still don't really have a good answer for how well these companies did in 2020.
And if they had done anything differently, would it actually have changed the course of history or not? That's something researchers are trying to understand, but it's a very complicated question to answer.
LMB: Take us into what happens inside a tech company when these events unfold. Who is literally making these decisions? How big is the team? How fast do you have to make these decisions? What kind of qualitative processes are you going through?
KH: There's a lot of work that we do in the build-up to an election.
When I was at Facebook we would usually start about a year-and-a-half to two years out from an election. We would do a lot of research and risk analysis, make sure that we had enough language capability because if you want to build products or hire people, that stuff takes time.
We did this first going into the 2018 midterms, where we unveiled these election war rooms and a lot of people thought it was just a PR stunt. But it wasn't, it was a real thing.
As you would get closer to the election, we would have 24/7 coverage, we had three shifts of people, we had them all in one room, and you would have a shift lead for each one.
And so we had a process: whether it was our systems flagging something, or a trusted partner or a journalist calling us up saying, ‘Hey, we're seeing this on the platform,’ it would go to that shift lead and get triaged. It could go to the content policy team, the Operations Team or the Threat Intelligence Team. It really depended on what the issue was.
Some of those things can be decided in the room and quite quickly. Other ones, let's say something Trump said, then that team would pull together a one-page memo with the facts of: here's the question, here's what happened, here's what we know. And then that would get escalated to leadership, where sometimes you could get an answer quite quickly, sometimes it would take a couple of hours because you’ve got to pull different folks together and schedules and stuff like that.
Sometimes it took days depending upon the issue, but you had a constant stream of stuff that was coming in.
LMB: And does that process at all involve liaison with the campaign teams themselves? And I'm thinking here, if we do have a repeat of a January 6-style incident, is there a point where someone at Meta sees a post go out from Trump or his aides and says, ‘Hang on, that's dangerous, that's inciting. We can call the team and ask them to take it down before we even have to escalate ourselves’?
Does that kind of conversation go on?
KH: It used to. I don't know if it still does today. And it sort of depends, right? We actually had a very philosophical and ethical conversation about whether or not that should be done.
Taking Trump out of the picture, sometimes they may not have known something was against our policies. And if we took it down, going back to people being like, ‘You're censoring us, you're taking down our content,’ we figured they'd probably run to the press and then we'd get a negative story. And then you’ve got to deal with the whole news cycle.
So instead you could call them up and be like, ‘Hey, this is actually against our rules. We wanted to give you a chance to edit it before we took it down.’
At Facebook, some people might know this as the cross-check system. We had an extra layer of protection for accounts so that nothing got taken down without review. Sometimes we would have this remediation window where they had 24 hours or something to fix it. And then if they didn't we would take it down.
There's just been a lot of controversy about whether or not the platform should do that. Whether or not the company should call Trump or Biden or anybody like that — that itself has gotten very controversial.
You're even seeing that balloon into a Supreme Court case, which is in front of the court right now, about how social media companies and the US government cooperate and partner around potential mis- and disinformation on the sites.
The suit started because the Missouri Attorney General was like, ‘This is the government violating the First Amendment, because they're telling these tech platforms to take down content by Americans.’
And so this is all being decided right now. And while it's being decided it's had a real chilling effect on any of those conversations or partnerships, because nobody wants to risk running afoul of the law.
LMB: Interesting. One of the observations I've had since Elon took over X is that so many people left that platform but they haven't looked elsewhere, they've just quit social media altogether.
And it seems like what happened to the mainstream media 20 years ago, with online advertising and then you guys coming along with social media, was fragmentation. It's been very hard to be gatekeepers in a mainstream media sense.
I feel like that is now also happening to social media: the main platforms are disintegrating further and further, and actually no one is getting that critical mass that they used to enjoy. Where do you feel like a company like Meta and a platform like Facebook, or any of the other social media companies, might be if we're having this discussion in another 20 years’ time?
KH: I think that these platforms will still exist but they’re gonna look incredibly different to what they do today. Some might end up getting merged into other platforms. A lot of it could be in virtual reality, or augmented reality, depending upon the types of tools. We're going to see so much technological innovation in the next 20 years, I don't even think I can get my head wrapped around exactly what that's going to look like.
Remember, about 20 years ago we didn't have the iPhone; you were mostly using a desktop to do a lot of this. Live streaming was really hard, and doing anything like we're doing right now was quite difficult.
But you know what? Platforms like MySpace still exist, and I even Googled to see if Friendster was still around. It looks like somebody wants to revive Friendster, so I think we're going to keep seeing these companies exist. It's just a question of what exactly that looks like and where this technological innovation takes us.
And no matter what these are going to be used for politics, by politicians, brand marketers - they're going to keep using all of this and if it's not through the mainstream platforms, it will be smaller platforms that build stuff specifically for politics.
And so that's going to be the big question: does this become its own market where people are building these tools for politicians, while the bigger platforms try to stay as hands-off as possible?
LMB: The most immediate tech challenge to that is AI. How well prepared do you think the various companies are for AI and this year of elections?
KH: They're definitely trying but that being said, detection systems have a long way to go.
There's still a lot of debate about the right way to label or watermark this content, whether it's organic or ads, and they've written policies around it. A lot of them, like OpenAI and Google's Bard, will refuse a prompt around anything political; they're just not going to allow it to happen while they try to figure out the impact of this, or what the right ethical boundaries are for it. That's a very different approach from the past, which was: we're going to let them use it and then we'll try to figure out the solutions.
Now people are like, maybe we should just not allow them to use it while we try to figure out the rules, which I think is a responsible approach. But the challenge for me there is that, again, people are still going to use AI; they're going to find other ways around it. They're going to build their own large language models, especially with open-source AI.
LMB: The scenario you raise there: I mean, what's the point if we're all stopping ourselves when there are bad guys out there who aren't going to stop themselves? Which is where a lot of the disinformation we fear as democracies might come from, right?
KH: Exactly, which is why you cannot run away from this. The responsible thing is to actually try to tackle these types of problems because if we just stick our heads in the sand, that does nothing, because like you said, other actors who want to influence these elections are going to use these tools, they already are.
And there's no finish line to all this, that's another thing that's exhausting to people. Disinformation has been around since the dawn of time, that's been happening, it’s just accelerated now with social media and AI.
While we're talking a lot about platforms pulling back, I don't want to leave us thinking that nobody's working on this, a lot of people are.
A lot of people are trying to figure this out, a lot of researchers, despite all of the attacks on them, are still wanting to do this work, there are people trying to figure this out.
It's just that the answers are not easy, it is going to take time, working through and seeing how people use it and are reacting to it, for these platforms to figure out what those guardrails are.
LMB: Ok, I think that's a great place to leave it, because we're going to have so much more to talk about this year as the elections unfold, and we'll see how many of our predictions and thoughts came true.
But I'll be watching all your analysis on Anchor Change. It's been awesome to run into you again, virtually, a decade later, and I can't wait to see you when I'm next in the US.