Are SEOs Responsible For Google Search Bias?


In 2016, the U.N. declared that access to the internet is a human right.

This decision was made on the understanding that the internet is a tool for everyone to access information affordably and express themselves.

This resolution stirred up discussion around access in terms of infrastructure – where fiber cables are installed or upgraded – and around ensuring uninterrupted access during civil unrest and other emergencies.

While these are valid and important points, the internet is not just wires and screens, and the information accessed can be altered based on algorithms.

The internet has become so integrated into our lives that it is now part of our social infrastructure (much like medical or educational services).

It is well documented that biases exist within medical and educational spaces, including access to care and quality of care, but what about search results?

Are they fair? Are they representative of the world around us? Or are they causing more harm than good?

What’s In An Algorithm?

In digital marketing, “algorithm” is a term tossed around daily, whether or not the person using it understands what it means. Every platform has one (or several), and our job is to try to satisfy them.

An algorithm is a procedure followed when a system is performing a computation.

This process takes an input and utilizes formulas, rules, or other problem-solving operations to produce an output.

For search, this means queries entered into a search box are the input, and the SERP (search engine results page) is the output.
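To make the input/output idea concrete, here is a deliberately toy sketch in Python. It is nothing like Google’s real system; it only illustrates an algorithm taking a query as input, applying a scoring rule, and returning a ranked output:

def search(query, documents):
    # Input: a query. Rule: score each document by how many query words it contains.
    words = set(query.lower().split())
    scored = [(sum(word in doc.lower() for word in words), doc) for doc in documents]
    # Output: the "SERP", i.e. matching documents sorted with the best score first.
    return [doc for score, doc in sorted(scored, reverse=True) if score > 0]

pages = ["Best photographers of 2022", "Photography tips for beginners", "Top 10 pasta recipes"]
print(search("best photographers", pages))  # ['Best photographers of 2022']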

This is a highly simplified explanation of what is happening. Google uses multiple algorithms in combination with AI (artificial intelligence) and machine learning.

Dissecting the whole system would be well beyond the scope and purpose of this article.

The Canary In The SERPs

As a woman, I am no stranger to biases in websites, policies, and society in general.

Every day, I navigate the world with a pinch of salt. Investigating potential biases in search results is something I’ve been interested in for some time, and I began researching the topic in 2021.

An original research project called Give Us Features, Not Flowers (full disclosure: I helped conduct it) examined gender bias in the social and search landscape for professional photographers.

Several gender-neutral queries were tested, such as “best photography Instagram accounts” or “best photographers.”

The results?

Women were featured as professional photographers far less often than men in rich results and within the content on page one, despite making up 50% of the profession.

Who is responsible for these biases? The writers who wrote the articles? Search engines for rewarding those pages? SEO pros for recommending the article to their client?

My gut reaction is to blame whoever created the algorithm.

While this is true to a degree, it is not the whole story and simply isn’t fair.

Biases are rooted in our existing societal structures, woven into our culture, government, and our interactions with the world around us.

Is The PageRank Algorithm Biased?

Research published in 2011 had already called the fairness of PageRank into question.

Models show that as the web grows, the top-ranking websites become increasingly entrenched, leaving the remaining websites to fight over scraps.

Nature, a peer-reviewed journal, published an article in February 2022 examining the PageRank algorithm to see if it introduces or amplifies biases.

To put this in the simplest of terms, researchers created five potential societal models with varying degrees of homophily (“the tendency to connect to similar others”).

Each model contains 20 nodes, but let’s refer to them as websites. Each website was then assigned a PageRank score and designated as part of either the majority or the minority within the society.

Inequality was measured by applying the Gini coefficient (a statistical measure of how far a distribution deviates from perfect equality) to the PageRank scores. Inequity was measured as the percentage of minority websites appearing in the top search results.
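To get a feel for how that measurement works, here is a minimal Python sketch using the networkx library. The hand-built 20-node network below is my own assumption for illustration, not one of the paper’s generated models:

import networkx as nx

def gini(values):
    # Gini coefficient: 0 = perfectly equal distribution, values near 1 = highly unequal.
    vals = sorted(values)
    n = len(vals)
    cum = sum((i + 1) * v for i, v in enumerate(vals))
    return (2 * cum) / (n * sum(vals)) - (n + 1) / n

# Toy homophilic network: the majority (nodes 0-13) links mostly to itself,
# while the minority (nodes 14-19) links out but receives no links back.
G = nx.DiGraph()
G.add_nodes_from(range(20))
majority, minority = range(14), range(14, 20)
for i in majority:
    G.add_edge(i, (i + 1) % 14)
    G.add_edge(i, (i + 3) % 14)
for i in minority:
    G.add_edge(i, i % 14)

scores = nx.pagerank(G)  # PageRank score per website
print("Gini of PageRank scores:", round(gini(scores.values()), 3))
top10 = sorted(scores, key=scores.get, reverse=True)[:10]
print("Minority share of top 10:", sum(node in minority for node in top10) / 10)

In this contrived setup, the minority ends up with no spots in the top 10, showing how link structure alone can skew representation.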

Their findings show that PageRank’s algorithm can reduce, replicate, or amplify biases depending on the model used.

In models with a high degree of homophily, the dominant voices perpetuated those perspectives and biases while under-representing minorities.

On the other hand, when the majority group is heterophilic (tending to connect to dissimilar others), there is an over-representation of minorities.

This lays some groundwork for future research around potential interventions or bias reduction to algorithms.

The Intersection Of Culture And Google Image Search Results

Plenty of research has shown that algorithms can be biased, and that many are. As previously discussed, PageRank can amplify or diminish these biases, but algorithms do not act alone.

In the instance of Google, there are not only multiple algorithms at play but also AI and machine learning. All of these elements are evolving continuously through our (human) interactions.

Another piece of research, published this year, investigated whether societal gender inequalities are present in Google Image search results (via localized search algorithms).

Researchers plotted gender inequality by country (based on the Global Gender Gap Index) against the percentage of men who appeared in Google Image search results for “person,” queried in each country’s respective language (using a VPN to access local results).

Countries with greater gender inequality saw more images of men for the gender-neutral keyword “person,” which the researchers present as evidence of a link between societal norms and algorithmic output.
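As a rough illustration of the study’s core analysis, here is a small Python sketch correlating a country-level inequality score with the share of men appearing in image results. The numbers are invented for the example, not taken from the paper:

# Hypothetical data: one value per country.
inequality = [0.10, 0.25, 0.40, 0.55, 0.70]  # invented gender-gap scores
pct_men = [0.48, 0.55, 0.61, 0.68, 0.74]     # invented share of men in "person" results

def pearson(xs, ys):
    # Pearson correlation: +1 means the two series rise in lockstep.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

print("Correlation:", round(pearson(inequality, pct_men), 3))  # close to 1.0 here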

The second part of the study looked into how these biased results can influence individuals’ decision-making.

Participants looked at screenshots of Google Image results from low-inequality and high-inequality nations and were asked questions about gender and occupation.

Skipping the details (although I think the article is worth a read), the results showed that the cultural biases that are present in algorithms can (and do) influence individual decision-making.

When participants saw image results from low-inequality nations, their responses were more egalitarian than those of participants shown results from high-inequality nations, which reinforced gender biases.

The level of societal gender inequality is reflected in the search algorithm, which leads me to wonder just how deep this runs. These combined elements then influence individual perception with each use.

Who Is Responsible For The Biases In SERPs?

I began this journey by asking this very question, hoping for a simple answer.

Unfortunately, there isn’t one, because we are all responsible for biases in search results: the original coders, the writers, the SEO pros, the link builders, and the society, culture, and environment we exist in.

Imagine all of the algorithms you interact with daily. If exposure to each of them influences your perception of the world, untangling the threads of all those inputs gets messy.

How Can We Make It Better?

As a hopeless optimist, I can’t leave you with such a heavy burden. Let’s start the discussion around how we can make search and content a more inclusive space.

The researchers who examined biases in PageRank discussed that while homophilic networks were driving inequalities in representation, minorities can overcome this through strategic networking.

That is not a reasonable solution, so they suggested implementing DPAH (don’t worry, I will not go into the details!).

This model would remove the need for minorities to strategically network with the majority.

The other study suggested psychology-based interventions, having concluded that societal gender inequality is reflected in the algorithm. The researchers call for a more ethical AI that combines our understanding of psychology and society.

Typically, an SEO pro’s biggest concern is how to appeal to algorithms, rather than questioning their equity or equality, or how we might be perpetuating harmful biases.

As we increasingly use AI-powered software to interpret AI-powered algorithms, there should come a moment when we question the ethical component of our work.

Currently, search results are not an accurate representation of an equitable world, even though they could be.

As SEO pros, content creators, and marketers, we play a large role in reproducing inequitable content, increasing visibility to already large voices, and perpetuating our local-cultural biases.

Here are a few more suggestions to help create a more equitable search landscape.

  • Stop replicating biased content – share your platform with diverse voices and create new narratives around your niche.
  • Audit AI content – I’m not going to say no to all AI content, but it should be reviewed by a human, as it risks falling into the same patterns.
  • Audit algorithms – just as we audit websites, algorithms can be audited. There are resources for auditing potential biases and for conducting impact assessments.
  • Support education – support or volunteer with organizations that provide coding, software, or technical training to women, people of color, or other marginalized groups. Shoutout to Women in Tech SEO for being one of those spaces.
  • Create multilingual resources – create SEO and other marketing resources in languages other than English to make room for diverse voices and perspectives.
  • Create less biased algorithms and AI – easier said than done, but Google AI announced KELM last year, which shows some potential for fact-checking and reducing bias.
  • Stop the gentrification of search – to be anti-competitive is to be anti-business. It suppresses new and diverse voices, so I would like to see more companies in the search landscape and more variety in results.

I don’t intend to have the final word on this topic, as this conversation should continue in Twitter threads, at conferences, over coffee, and in our daily work.

Please share your thoughts or questions about this topic so we can start discussing the creation of a search experience that does not harm society.
