How to Exclude Websites from Google Search? [Step by Step Guide]


There are many reasons why you might want to block a website from showing up while you’re browsing the internet. Maybe it’s displaying explicit content you don’t want to see, maybe it’s a pop-up site that keeps spamming your browsing experience, or maybe it’s simply crowding out the results you actually want. Whatever the reason, we’ve all faced those annoying sites that keep popping up, and today we’ll look at how to block them.

In today’s article, we’ll learn how to exclude websites from Google search, both with Google’s own tools and with a few workarounds in case those tools aren’t behaving. We’ll also look at how to exclude Pinterest from Google search results, how to stop bots from crawling your website, and how to filter Google search results. Let’s get started.

Google is quite open about this and wants its users to have the best experience possible, which is why the process is documented on Google’s own official pages; the steps below follow that guide.

First, open the control panel for your search engine (the steps below match the control panel of Google’s Programmable Search Engine, formerly Custom Search). From there, select the search engine you want to edit, then choose ‘Setup’ from the menu on the left.

In the ‘Basics’ tab, find ‘Sites to Search’, then under ‘Advanced’ open ‘Sites to Exclude’ and press ‘Add’. Enter the URL you want to exclude and choose whether to exclude every page that matches it or only that specific page.

This choice matters: if you want to block the whole site from popping up, you have to pick the broader option. ‘Exclude all pages whose address contains this URL’ removes every page whose address contains the URL you entered, i.e. the entire site. ‘Exclude just this specific page or URL pattern I have entered’ removes only that one page, and the rest of the site stays available.

When you’re finished, press ‘Save’ and you’re done. This doesn’t always work, though; even a giant like Google has bugs, and sometimes the exclusion simply doesn’t take effect.
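As a side note, if you query your engine through the Custom Search JSON API rather than the hosted search page, you can apply the same kind of exclusion per request. Below is a minimal sketch, assuming the API’s siteSearch/siteSearchFilter parameters and placeholder credentials (YOUR_API_KEY, YOUR_ENGINE_ID); treat it as an illustration rather than a drop-in solution.

```python
import requests

# Placeholder credentials -- substitute your own API key and engine ID.
API_KEY = "YOUR_API_KEY"
ENGINE_ID = "YOUR_ENGINE_ID"

params = {
    "key": API_KEY,
    "cx": ENGINE_ID,
    "q": "online marketing",
    "siteSearch": "quicksprout.com",   # the site to act on
    "siteSearchFilter": "e",           # "e" excludes the site, "i" restricts to it
}

# Ask the Custom Search JSON API for results with the site excluded.
response = requests.get("https://www.googleapis.com/customsearch/v1", params=params)
response.raise_for_status()

for item in response.json().get("items", []):
    print(item["title"], "-", item["link"])
```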

In case the settings above aren’t behaving, I’ve gathered a few other ways to keep a page or site from popping up.

Another way to do this is to add a minus sign right before the “site:” operator in your search.

Here’s an example: say you want to search for the term online marketing. If you type just that into the search bar, a bunch of results come up. One of them is Quick Sprout, but let’s say we want to exclude it.

If we don’t want Google to show Quick Sprout, all we have to do is type: online marketing -site:quicksprout.com (the domain alone is enough; you don’t need the full https://www.quicksprout.com address).

That way, the site won’t appear in the results. The catch is that you have to add the operator every time you search, which is why the control panel method above is the primary approach; use this one when that isn’t working for some reason.
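Since the operator has to be re-typed for every search, it can help to script the query. Here’s a small sketch that builds a Google search URL with one or more sites excluded; the helper name google_query_excluding is made up for this example.

```python
from urllib.parse import quote_plus

def google_query_excluding(term, *excluded_sites):
    """Build a Google search URL for `term` with the given sites excluded."""
    query = term + "".join(f" -site:{site}" for site in excluded_sites)
    return "https://www.google.com/search?q=" + quote_plus(query)

print(google_query_excluding("online marketing", "quicksprout.com"))
# -> https://www.google.com/search?q=online+marketing+-site%3Aquicksprout.com
```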

How Do I Exclude Pinterest From Google Search Results?

Google Search results are often top-heavy with sites that aren’t related to what you’re looking for, or that require an account or subscription to view, and Pinterest is a frequent offender. Pinterest ranks very well in Google’s eyes, so its pages tend to place high, and because Pinterest relies heavily on tags, including one of those tags in your search term makes a Pinterest page even more likely to show up among your results.

If this bothers you, you can install the Unpinterested! browser extension, which excludes Pinterest from all your Google searches.
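If you’d rather not install an extension, the -site: operator from the previous section works here too. Reusing the hypothetical google_query_excluding() helper sketched above (you’d need to repeat the exclusion for country domains such as pinterest.co.uk):

```python
# Keep Pinterest's main domain out of a recipe search.
print(google_query_excluding("dinner recipes", "pinterest.com"))
# -> https://www.google.com/search?q=dinner+recipes+-site%3Apinterest.com
```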

However, there’s currently no way to exclude Pinterest from Google image searches or from search-by-image results, so you’ll just have to live with those.

How Do I Stop Bots from Crawling My Website?

Bots, spiders, and other crawlers hitting your dynamic pages can cause extensive memory and CPU usage, overloading your server and slowing down your website. Bot traffic (any non-human traffic to a website or app) has become a serious problem for developers and website owners alike. That doesn’t mean all bots are bad for your site; many are essential, especially for search engines and digital assistants.

Roughly 40% of all internet traffic is believed to be bots, and a large portion of that consists of malicious bots that actively harm the sites they visit. That’s why so many organizations and companies are looking for new ways to root out these digital pests.

Crawling bots also hurt your site by skewing its analytics. Bot traffic distorts metrics such as page views, bounce rate, session duration, user geolocation, and conversions, which means you can’t properly analyze your site’s performance or make decisions based on accurate numbers.

Google Analytics does provide an option to “exclude all hits from known bots and spiders”, and while that helps, it can’t block every bot.

The first thing you need to get right in order to get rid of bots is identifying them properly. Here are some of the most common symptoms (a rough detection sketch follows the list):

  • abnormally high pageviews: a large, sudden, and unexpected surge in views usually means bots are clicking through the site
  • abnormally high bounce rate: an unexpected rise in bounce rate can be the result of bots being directed at a single page
  • junk conversions: a sudden surge in conversions that look fake, such as throwaway email addresses or contact forms submitted with made-up names and phone numbers
  • a spike in traffic from an unexpected location: a sudden jump in traffic from a region whose users are unlikely to speak the language your site is written in is a strong sign of bot activity
  • abnormal session duration: sessions that are far too long or far too short can also indicate bot activity
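If your analytics can be exported as raw session data, even a crude script can surface the symptoms above. The sketch below assumes a hypothetical sessions.csv export with pageviews, duration_s, and country columns; the thresholds are illustrative, not recommendations.

```python
import pandas as pd

# Hypothetical per-session export; adjust the file name and columns to your data.
sessions = pd.read_csv("sessions.csv")

# Rough heuristics mirroring the symptoms above: implausibly short or long
# sessions, or an implausible number of pageviews in a single session.
suspicious = sessions[
    (sessions["duration_s"] < 2)
    | (sessions["duration_s"] > 3600)
    | (sessions["pageviews"] > 100)
]

print(f"{len(suspicious)} of {len(sessions)} sessions look bot-like")
# Are the flagged sessions clustered in an unexpected location?
print(suspicious["country"].value_counts().head())
```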

The first step towards controlling bot traffic is adding a robots.txt file to your site. The file gives crawling bots instructions and can be configured to keep them away from parts of the site, or from the site entirely. However, only well-behaved bots comply with it; malicious bots simply ignore it.
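To get a feel for how a compliant crawler reads the file, here is a small sketch using Python’s standard urllib.robotparser. The robots.txt content and the “BadBot” user agent are made up for the example; a real malicious bot would simply ignore these rules.

```python
from urllib.robotparser import RobotFileParser

# Illustrative robots.txt: ban a (made-up) "BadBot" outright and keep every
# compliant crawler out of /private/, while leaving the rest of the site open.
ROBOTS_TXT = """\
User-agent: BadBot
Disallow: /

User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

print(parser.can_fetch("BadBot", "https://example.com/"))               # False
print(parser.can_fetch("Googlebot", "https://example.com/private/x"))   # False
print(parser.can_fetch("Googlebot", "https://example.com/index.html"))  # True
```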

One way to stop malicious bot traffic is to have a network engineer identify the IP addresses the traffic is coming from and block them. This won’t stop every bot, though, since many rotate through large pools of addresses.
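For a sense of what that looks like in application code, here is a minimal sketch of IP blocking as a Flask middleware hook; the addresses are documentation examples, and in practice the list would come from log analysis (or be enforced at the firewall or CDN rather than in the app).

```python
from flask import Flask, abort, request

app = Flask(__name__)

# Example blocklist using documentation-range addresses; in reality this would
# be fed by your network engineer's analysis of the server logs.
BLOCKED_IPS = {"203.0.113.7", "198.51.100.23"}

@app.before_request
def reject_known_bad_ips():
    # Refuse the request before any view runs if the client IP is blocklisted.
    # Behind a proxy or CDN, request.remote_addr is the proxy's address, so a
    # forwarded-for header would have to be consulted instead.
    if request.remote_addr in BLOCKED_IPS:
        abort(403)

@app.route("/")
def index():
    return "Hello, humans!"
```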

The most effective way to get rid of the largest number of bots is to use a bot management solution. A good example is Cloudflare Bot Management, which identifies bots by their behavior and blocks the malicious ones.

How Do I Filter Google Search Results?

If you’re trying to narrow your Google search down to a specific kind of result, Google’s built-in filtering options will do the job.

You can narrow and customize your search results to find exactly what you want. For example, you can find sites updated within the last 24 hours or photos with license information.

Once you’ve searched for your term, select the type of results you want to find – images, news, etc.

After that, click ‘Tools’ and, in the bar that appears, pick a filter option. To remove those filters again, just click ‘Clear’.

You can filter your results by type: images, news, shopping, or videos.

The Tools options let you narrow results by location, color, size, or the date a page was published, and depending on the search you can get even more specific: verbatim, dictionary, private, nearby, recipes, applications, or patents.

And when you’re searching for places you’ve visited in the past, you can filter by past visits, rating, cuisine, price, and hours.
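Under the hood, those Tools and Filter buttons mostly translate into URL parameters. The sketch below builds such a URL by hand using the tbm (result type) and tbs (date) parameters; these are unofficial and undocumented, so treat this purely as an illustration of what the buttons do, not a stable interface.

```python
from urllib.parse import urlencode

# Unofficial result-type codes observed in Google search URLs.
TYPE_CODES = {"images": "isch", "news": "nws", "videos": "vid", "shopping": "shop"}

def filtered_search_url(term, result_type=None, past_day=False):
    """Build a Google search URL with optional type and past-24-hours filters."""
    params = {"q": term}
    if result_type:
        params["tbm"] = TYPE_CODES[result_type]
    if past_day:
        params["tbs"] = "qdr:d"  # "query date range: day", i.e. the last 24 hours
    return "https://www.google.com/search?" + urlencode(params)

# News results about "online marketing" from the last 24 hours.
print(filtered_search_url("online marketing", result_type="news", past_day=True))
```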

Author

  • Tristan

    Tristan has a strong interest in the intersection of artificial intelligence and creative expression. He has a background in computer science, and he enjoys exploring the ways in which AI can enhance and augment human creativity. In his writing, he often delves into the ways in which AI is being used to generate original works of fiction and poetry, as well as to analyze and understand patterns in existing texts.