If you are facing an error with Googlebot in WordPress, you are in the right place!
This article outlines how to resolve the ‘Googlebot cannot access CSS and JS files’ error for WordPress sites.
The Google Search Console (formerly Webmaster Tools) alert contains links to instructions on how to fix the error, but those instructions are not very user-friendly. That's why I'll show you how to fix the 'Googlebot cannot access CSS and JS files' error on WordPress.
Let’s get started!
The best way to fix an error is to first understand why it occurs.
The default WordPress settings do not prevent search bots from accessing any CSS or JS files. However, some site owners may block search bots after adding extra security measures, such as a security plugin.
When bots crawl your site to index it, they follow the instructions in your robots.txt file first.
This file tells bots which pages of your website to index and which to skip. If you are running WordPress, you will find the robots.txt file in your site's root directory.
How To Edit Robots.txt For Googlebot
Your robots.txt file should include a sitemap and a few other items if you want to fix the Googlebot cannot access CSS and JS files error.
A WordPress robots.txt file is easy to understand. Here is a typical example:
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /wp-content/plugins/
Disallow: /wp-content/themes/
Allow: /wp-includes/js/
Disallow: /folder means that the folder won’t be crawled by Googlebot.
Allow: /folder means that the folder will be crawled by Googlebot.
To fix the “Googlebot cannot access CSS and JS files” Error in WordPress, you need to allow Googlebot to crawl the folder where your CSS and JS files are stored.
Most of the time, removing the following line will fix the issue: Disallow: /wp-includes/
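For reference, here is what the example file above looks like after the fix. If your theme's stylesheets are still reported as blocked, you may also need to remove the "Disallow: /wp-content/themes/" line, since theme CSS files live in that folder.

```
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-content/plugins/
Disallow: /wp-content/themes/
Allow: /wp-includes/js/
```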
Afterward, save your robots.txt file and upload it to your site's root directory. Then click "Fetch and Render" in Google Search Console. Compare the fetch results, and you should see that many previously blocked resources are no longer blocked.
Some users may find that their robots.txt file is empty or does not exist at all. In that case, Googlebot automatically crawls and indexes all files.
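If you want to check how a given set of robots.txt rules affects Googlebot before uploading the file, Python's standard-library robots.txt parser can simulate them. This is a minimal sketch using example.com as a placeholder domain; note that Python applies rules in file order, which can differ slightly from Google's longest-match handling when Allow and Disallow lines overlap, so treat it as a rough check only.

```python
from urllib import robotparser

# robots.txt containing the problematic line
blocked_rules = """User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
""".splitlines()

# The same file after removing "Disallow: /wp-includes/"
fixed_rules = """User-agent: *
Disallow: /wp-admin/
""".splitlines()

# A typical WordPress JS file path (placeholder domain)
js_url = "https://example.com/wp-includes/js/jquery/jquery.js"

rp = robotparser.RobotFileParser()
rp.parse(blocked_rules)
print(rp.can_fetch("Googlebot", js_url))  # False: Googlebot is blocked

rp = robotparser.RobotFileParser()
rp.parse(fixed_rules)
print(rp.can_fetch("Googlebot", js_url))  # True: Googlebot can crawl the JS file
```

The same check works for any other path, such as your theme's CSS folder, so you can verify the whole fix before touching Google Search Console.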
Edit your robots.txt file with Yoast SEO
The easiest way to create or edit a robots.txt file is with the Yoast SEO plugin. To do so, follow the steps below.
Step 1: Go to your WordPress website and log in.
Once logged in, you will land on your 'Dashboard'.
Step 2: Click on the ‘SEO’ tab.
In the menu on the left-hand side of the screen, select 'SEO'.
Step 3: Then, click on ‘Tools’.
You will find additional options in the SEO settings. Click on 'Tools'.
Step 4: Click on ‘File Editor’.
And that's it! Just remove the line "Disallow: /wp-includes/" and click on "Save changes to robots.txt".
Edit your robots.txt file with SEOPress
Step 1: Go to the SEO tab.
Step 2: Make sure the green toggle associated with Robots is enabled.
Step 3: Enable the robots.txt virtual file by clicking Manage.
Step 4: Write your robots.txt file in Virtual Robots.txt and save.
Make sure to remove the line "Disallow: /wp-includes/".
If you want to see your robots.txt file, click 'View your robots.txt' or go to 'yoursite.com/robots.txt'.
Edit robots.txt with Rank Math
In Rank Math, you can edit the robots.txt file from the plugin's General Settings screen.
Once again, remove the line "Disallow: /wp-includes/".
And that’s it!
I hope this article helped you fix the 'Googlebot cannot access CSS and JS files' error on your WordPress site. You may also want to see my article on How to Fix the "Missing a Temporary Folder" Error in WordPress.