In this week's episode, Jamie discusses ethics and disinformation surrounding not just SEO but the internet as a whole, covering topics such as who's responsible, why it's an issue, ways to tackle these challenges and much more.
We also find out what inspires Jamie, challenges she has faced as a woman in the industry, and what empowers her to be the brilliant woman she is today.
Where to find Jamie
Sarah: Hello, and a very warm welcome to the Women in Tech SEO podcast, where your hosts are myself, Sarah McDowell, podcaster and SEO content executive at Holland and Barrett, and the absolutely wonderful Areej AbuAli, who is an SEO consultant and founder of Women in Tech SEO. WTSPodcast is your weekly podcast for all things SEO related, guest-starring brilliant women in the industry.
Hello! How are we doing?
Areej: Yeah. Good. Thanks. Super excited to be here today.
Sarah: Yes. It's the first one. How are you feeling about it?
Areej: Yeah, I'm feeling good. I'm really excited. I'm in great company. I've got you. I've got Jamie. So looking forward to it.
A very warm welcome to the show, Jamie!
Jamie: Thank you so much. It's an absolute pleasure to be here with two of my favorite humans, part of my favorite community.
Areej: It's so good to have you, Jamie. I'm sure that a lot of people who are listening already know a lot about you and about how awesome you are, but we would love to hear a little bit about you from you.
Jamie: Oh, me on me. Well, that's exciting. Well, I'm not a robot, but I speak bot, and that's pretty important. My focus as an SEO is so niche that I joke I don't understand how rankings work and I'm too afraid to ask. Really, it's focusing on the rendering process: everything from how Googlebot fetches content, to, you know, the response codes we're giving, to looking at the log files and seeing where those resource requests are going.
And when it goes into the web rendering engine, how is that executed? Where is the content available, too? Often we have a whole sneaky set of URLs that no one expected to exist, and they cannibalize our rankings. You'd be surprised how much cannibalization is out there.
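Jamie's point about checking log files to see where Googlebot's resource requests are going can be made concrete. The sketch below is purely illustrative and not from the episode: it assumes a standard combined access log format, with made-up sample lines and field names, and simply filters requests by user-agent string before tallying them per response code.

```python
import re
from collections import Counter

# Combined access log format (hypothetical sample data below).
LOG_PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<ts>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]*" '
    r'(?P<status>\d{3}) \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

def googlebot_requests(log_lines):
    """Yield (path, status) for requests claiming a Googlebot user agent."""
    for line in log_lines:
        m = LOG_PATTERN.match(line)
        if m and "Googlebot" in m.group("agent"):
            yield m.group("path"), m.group("status")

def status_summary(log_lines):
    """Count Googlebot-fetched resources per HTTP response code."""
    counts = Counter()
    for _path, status in googlebot_requests(log_lines):
        counts[status] += 1
    return counts

# Illustrative log lines, not real traffic.
sample = [
    '66.249.66.1 - - [01/Mar/2021:10:00:00 +0000] "GET /app.js HTTP/1.1" '
    '200 512 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"',
    '66.249.66.1 - - [01/Mar/2021:10:00:01 +0000] "GET /old-page HTTP/1.1" '
    '404 128 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"',
    '203.0.113.9 - - [01/Mar/2021:10:00:02 +0000] "GET /app.js HTTP/1.1" '
    '200 512 "-" "Mozilla/5.0 (human browser)"',
]

# Tally of Googlebot-fetched resources per status code.
print(dict(status_summary(sample)))
```

Note that user-agent strings can be spoofed; a production check would typically also verify Googlebot via a reverse DNS lookup, which this sketch omits.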
Sarah: Wonderful, wonderful. Something we used to do on SEO SAS with each of our guests was a quick-fire round of silly questions. How are you feeling about that?
Jamie: I love silly questions. Yay.
Sarah: Okay. So there are six in total, and I literally just want you to, like, clear your mind and answer with the first thing that comes into your head.
All right, let's do this. Question number one: can you share food?
Jamie: Something that's spiky.
Sarah: They are very spiky. Are you a tidy person?
Jamie: Absolutely not.
Sarah: Ah, they say the best creative people are untidy. I swear I've read that somewhere. When coloring in, do you stay within the lines?
Jamie: Hm, no. I'm a big fan of coloring books, and that inherently involves some lines, but no, I don't stay within them.
Sarah: And final question if it's a hot day, what drink are you craving?
Jamie: Oh, water.
Sarah: Just water?
Jamie: Yeah, I want to stay hydrated.
Sarah: You don't want anything in your water? Like a bit of lime or sparkling
Jamie: Sure, absolutely. If you're offering me sparkling water, I will take it. Some cucumber water? I am in.
Sarah: Yes, whatever you want, Jamie, I can make sure that happens. Areej, what do you think of the quick-fire answers?
Areej: Yeah, I don't know why I had a feeling you were a tidy person and you started off by saying absolutely not. Our answers are very different. I never share food. I am a very tidy person and I always color within the lines.
Sarah: I reckon we should move into the meaty bit of this podcast and discuss what we invited you on today to discuss, and that is ethics and disinformation when it comes to SEO and the internet, basically. So, to kick things off, let's start with the basics: what is ethical SEO and why is it important?
Jamie: That's a brilliant question, and it's a great place to start. I want to start by looking at the traditional definition of ethical SEO, and that's really been: is it white hat? Does it abide by the guidelines the search engine has provided? There are some interesting nuances if we take a step back and break that down.

So ethics becomes relative to each search engine. It's only about the techniques and strategies used, and the companies that build search engines get to define ethics. That's problematic when our lives are digital: we take care of everyday tasks online and rely on our phones for everything, down to getting a vaccination for COVID-19.

We rely on our digital world. So if this power is relinquished to companies defining their own relativistic morals, where does that leave humans? I spent time studying philosophy in college, so I decided to take a step back and go: well, what if we remove ourselves from the idea that these guidelines are inherently moral?

How would we create a human-centric definition of ethics in search? The simplest way to do it was really borrowed from Immanuel Kant, boiling it down to two questions: can I rationally will that everyone act as I propose to act? And do my actions respect the goals of humans themselves, rather than merely getting those humans to serve my purpose?

I found a lot of disinterest when I pitched this to conferences for a while. I'm not going to pretend that I have clear answers on what ethical SEO is, but I think it's very important, given how ubiquitous we are in everyone's lives, that we start asking the questions.
Sarah: Okay. I think you made a really good point there, because search engines have to sort of decide for themselves what is ethical, haven't they? And I suppose that's quite a hard task, because ethics is a very broad topic, isn't it? What is ethical? And when you talked about how the traditional description of ethical SEO is white hat SEO, that's more in line with, like, Google's guidelines. But I suppose ethics in SEO and the internet is much bigger, and it's a bigger thing to tackle, isn't it?
Jamie: Absolutely. It's focused on the human impact of it. I'm an American. I witnessed the actions on January 6th. I saw humans acting in what they thought was an altruistic, very passionate way for the betterment of all people. I inherently was angry. I wanted them to be punished, I wanted these folks to be held accountable for their actions, and then I had to do some introspection and realize that these were well-intended humans who'd been weaponized by disinformation, who were emboldened to take action based on data that was completely within guidelines. You can create a page like JoeBiden.info. I believe if you searched right now, you could still find it on page one for 'Joe Biden'. That particular page was created by a member of Trump's campaign, Brad Parscale. He ran the 'Death Star' initiative and considers himself very skilled in SEO. But the goal of that page was not to provide real humans information about a presidential candidate. It was to lead them astray, to give them a false representation, to skew their views.
Areej: I think it's interesting you bring this up, because actually, putting search engines aside, I know that a lot of CMSs and web hosting platforms have also been facing these issues, where, you know, specific parties or groups would use the CMS or platform to build their websites, and there's a lot of ethical questioning around that as well: what should be allowed and what should not be allowed there?
Jamie: Absolutely. And for the longest time, there was a difference between the publisher and the platform. Now, in the modern age, companies like Facebook have argued, well, we're a platform, not a publisher. But at what point should these platforms be held accountable for the information that they publish?
Facebook is a great example. They launched in Myanmar, and they launched to such an immense degree that, you know, the way we say 'let me Google that', they say 'let me Facebook that'. The problem here was that Facebook had very few content moderators who spoke the local language. There were thousands of users on this platform, which actually came free with a cell phone plan.

So there was no way to keep in check the radicalization that would occur in these echo chambers, because the algorithm is based on getting you to interact, and high-intensity emotional posts get you to interact. What we actually saw was Facebook helping to fuel a genocide in Myanmar. People had to flee their homes.

The questions this brought up have still not been answered. I'm not saying I have good answers for them, but like search engines, these platforms are imprints of power. They inherit the biases and blind spots of their creators. Even the internet in general: 54% of the internet in 2019 was in English, despite English speakers being only about 25% of users.
Areej: And do you feel this is something the industry has much visibility on? I personally haven't heard a lot of people speak up about it. Do you think there's enough talk, enough visibility, on this topic within our industry at the moment?
Jamie: This is why I think sometimes marketing doesn't touch on the ethics at all. You know, in marketing you've got a goal: you're going to launch a new site in a new language to get more users, but that content is not properly translated. And it goes even further. If you're asking users to consent to cookies, can they consent if that's not a language they speak? There is medical research going on where users are agreeing through websites to this testing, and the content isn't translated for them.
Sarah: I mean, when it comes to cookies, and I don't know what you guys are like, but I hardly ever read the terms and conditions of anything. And I suppose that's just another issue in itself, because, like, who has the time to sit and read through: okay, by me consenting to this cookie, or consenting to sharing my information, what does that actually mean for me?

And I don't know if this is a bit off topic, it's just coming into my head now, but I bet there are lots of things we're agreeing to that we don't really understand, especially when it comes to our data. There's a lot of data about ourselves being shared on the internet between companies and stuff, but we don't really know what the impact of that is. And I suppose that's the issue as well, isn't it?
Jamie: I mean, the size of Facebook's tracking pixel should be enough for you not to trust them, let's be honest. But yeah, the data mining is real. There was a bit of outrage, everyone was shocked and amazed, that Cambridge Analytica was a thing, and that they were able to identify voters who were hardcore in one base or another, but then also narrow down the ones who could be swung, or who could be persuaded not to vote. By segmenting those groups with such accuracy, they were really able to manipulate them.

I spoke with an individual who worked on a political campaign, and they shared with me that the disinformation they saw happening was people trying to spread the wrong election date.

If you look at Google Europe, they released a paper where they talked about what they were doing to keep elections secure, and they talked about polling place information being changed, like the location information on those listings. Personally, I didn't grab a snapshot, but it was really interesting.
This individual who worked on the campaign said that one of the biggest things they saw was disinformation regarding when and where to vote.

And if we go back to language: you know, cookies are annoying, yeah, we get it, banners pop up all the time. But in linguistics there's a concept called linguistic determinism, and it means that our language helps define our world. When we have a word to represent something, it has meaning and we can talk about it. The world is very fast-paced and changing right now, and there are so many languages out there where the content isn't available to users. Essentially those languages are being strangled, because in order to learn new concepts and information, the user has to adopt English or another language.

Some of them are able to bring those concepts over into their own language, but without evolving and adapting to new concepts and ideas, a language slowly dies. That's how a language flattens: it's no longer changed, no longer used. And the internet being predominantly in English is restricting the clarity of information, leaving problematic opportunities for radicalized ideas to gain volume and traction.

There's a lot to this, and we don't have the conversations about it, because our goal for the quarter is just: launch this thing.
Areej: So what would you say then, in terms of what we can do as an industry, whether we work at specific companies or for ourselves, what can we do to start tackling this issue of disinformation?
Jamie: That is an excellent question. There are really interesting things that even Google is doing: they have an initiative where they are teaching journalists how to use search engines and spot disinformation. Make friends with the journalists, perhaps; help in that way. Teach everyone, you know, that Google is a personalized algorithm, that it is going to give you a biased result based on how you interact with content. Which means if you have fallen vulnerable to a disinformation campaign and interacted with that content, similar content is now going to appear higher in your search results because of it.

That's what the search engine thinks you mean, what it thinks you want. Fundamentally, that's their lovable flaw: they're designed to give you an answer based on what they think you want, so there isn't a baseline without your personal bias placed into it.
Sarah: I suppose, like you say, there's an onus on the individual, isn't there? So when you're consuming content, being aware: okay, is this a reputable source? Are the things they're talking about backed up by research and things like that? And I suppose when we're on other websites: can you vet it from other sources? If you're a publisher, can you talk about this without amplifying it?

And I suppose what we can do when we're creating content is to be mindful of that. So when we're writing about a subject, whatever we're writing about, make sure that it's factually correct and you've got sources in there. Because by doing that, you can make sure that the content you're creating is correct, but you can also give the people reading your content that sort of security, can't you? So I suppose there are two sides to this, isn't there?
Jamie: It's important to realize, too, that experts can be manufactured. We can make them. There's a great example of this fairly recently, where a prominent newspaper basically accused two people in London of being terrorist sympathizers, and there was no person who actually had the name of the article's author. We can create fake experts. You can't simply trust audio or video reviews either, because those can be manufactured now using Amazon's voice tool. If you listen for a tinniness to the plosives, the 't' sounds I believe it is, you can recognize them, but it's getting harder and harder to spot.

So we work in this field. We work in getting people to achieve the goal that we have on that page for them. We need to recognize when that's being used against us, and go above and beyond in that area.
Areej: As marketers we need to make sure that we're not part of the problem, but we're actually part of the solution.
Jamie: We are the ones that can see this. We're aware. We know the back alleys of the internet, and there are a lot of people on the highways who aren't aware of these. We need to be the ones that help.
Not with your caps lock on. It's hard, and frustrating sometimes, but you can still have a civil conversation and lead gently with questions that make people re-evaluate, rather than telling them they're wrong and vilifying them. We can be part of a really important, healthy discourse, because the goal of disinformation is to make an in-group and an out-group, to make us fight amongst ourselves.
Areej: Awesome. Well, thanks Jamie. That was super insightful.
Since we're the Women in Tech SEO podcast, we wouldn't be doing our job if we didn't learn more about your experience as a woman in the industry. It's brilliant to have women who are very specialized in specific topics come on and discuss them, but what I'd really love to learn as well, and I'm sure a lot of people would love to learn, is: what keeps you motivated?
Jamie: Curiosity, and lots of it, keeps me going. I get to watch how humans are searching for things and interacting with content, and the trends we have, though that's only when people are clicking there. And it's quite beautiful. It's almost like we have this digital diary of the collective unconscious, of how we've changed, moved and adapted to new ideas and concepts. I love that.
Sarah: I could just listen to you talk; you've got such a lovely voice and you talk sense, and I keep getting the urge to ask questions rather than just sit and listen to you talking. It would be good to highlight any challenges that you've experienced as a woman in the industry. I mean, it's awesome if you say you haven't, but yeah, if you have any experiences of that...