Yes, robots can actually prevent your website from being indexed by top search engines like Google and Yahoo – and no, this isn’t a sci-fi movie plot! Don’t worry, there is an easy fix. Just to clarify first though, we are talking about robots – as in a robots.txt file – in the root of your website, i.e. www.your-website.com/robots.txt.
The Quick Fix: Use a Proper robots.txt File, Don’t Delete It
If you want your website to be indexed – all of your webpages – do not delete the robots.txt file.
In the past, removing it might have helped, but today, tools like Google PageSpeed Insights expect a properly configured robots.txt file.
Having none at all can trigger a warning and even lower your performance score.
That’s why all versions of Ultimate Web Builder software come with a default robots.txt file that’s designed to work correctly – allowing indexing and keeping SEO tools happy.
How to Keep Your Entire Website Open to Search Engines
To make sure your entire website is crawlable and indexable, your robots.txt file should explicitly allow everything. A simple and effective file would be:
User-agent: *
Disallow:
This tells all search engine bots (the * wildcard) that nothing is off-limits, so your whole site can be crawled and indexed – and it keeps performance tools like PageSpeed Insights happy.
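If you want to double-check how bots will read a robots.txt file, Python’s built-in urllib.robotparser evaluates it the same way a well-behaved crawler would. This is just a quick sketch – the rules are the allow-all example above, and the page URLs are placeholders:

# Check an allow-all robots.txt with Python's standard library.
from urllib.robotparser import RobotFileParser

rules = """User-agent: *
Disallow:
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())  # parse the rules directly – no network request needed

# Any bot, any page: everything should be fetchable.
print(parser.can_fetch("Googlebot", "https://www.your-website.com/"))        # True
print(parser.can_fetch("Bingbot", "https://www.your-website.com/any-page"))  # True

# To test your live file instead, you could swap in:
# parser.set_url("https://www.your-website.com/robots.txt"); parser.read()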
Blocking Specific Crawlers (Optional)
You might come across a more advanced setup like this:
User-agent: ia_archiver
Disallow: /
User-agent: *
Crawl-delay: 120
Here’s what it does:
- Blocks ia_archiver, which is Alexa’s crawler used for archiving websites.
- Applies a 120-second crawl delay for all other bots to reduce server strain.
This is optional and depends on your goals. If you don’t want your site archived or if your server struggles under crawler load, it might make sense. But if you want maximum visibility and performance, keep it open and fast.
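If you are curious how crawlers interpret that stricter file, the same standard-library parser from the earlier sketch shows the effect – again, this is only an illustration, with the rules copied from the example above:

# How the stricter rules above are read by Python's standard robots.txt parser.
from urllib.robotparser import RobotFileParser

rules = """User-agent: ia_archiver
Disallow: /

User-agent: *
Crawl-delay: 120
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# ia_archiver is blocked from the whole site.
print(parser.can_fetch("ia_archiver", "https://www.your-website.com/"))  # False

# Everyone else may crawl, but is asked to wait 120 seconds between requests.
print(parser.can_fetch("Googlebot", "https://www.your-website.com/"))    # True
print(parser.crawl_delay("Googlebot"))                                   # 120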
Important Notes
- The robots.txt file is publicly visible – anyone can access it at yourdomain.com/robots.txt.
- It only works on bots that respect it – which includes major search engines like Google and Bing, but not all crawlers.
- For real access control (such as member-only pages), use server-side tools. For example, UltimateWB includes a Members App and Page Access Tool that uses PHP to protect content securely behind the scenes – see the sketch after this list for the general idea.
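UltimateWB’s tools handle this in PHP; purely to illustrate the concept, here is a minimal, generic sketch in Python (the session store and page content are hypothetical, not UltimateWB code) showing why a server-side check actually protects a page while robots.txt cannot:

# Generic sketch of server-side access control – not UltimateWB's actual code.
# The page is only sent to visitors whose request carries a valid session cookie;
# everyone else, including every crawler, is redirected to the login page.
from http.server import BaseHTTPRequestHandler, HTTPServer

VALID_SESSIONS = {"abc123"}  # hypothetical session store; a real app would use a database


class MemberPageHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        cookie = self.headers.get("Cookie", "")
        logged_in = any(f"session={s}" in cookie for s in VALID_SESSIONS)
        if logged_in:
            self.send_response(200)
            self.send_header("Content-Type", "text/html")
            self.end_headers()
            self.wfile.write(b"<h1>Members-only content</h1>")
        else:
            # Unlike robots.txt, this rule is enforced by the server on every request.
            self.send_response(302)
            self.send_header("Location", "/login")
            self.end_headers()


if __name__ == "__main__":
    HTTPServer(("localhost", 8000), MemberPageHandler).serve_forever()

The difference from robots.txt is that nothing relies on the visitor’s good behavior – the server simply refuses to send the protected content to anyone who isn’t logged in.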
Are you ready to design & build your own website? Learn more about UltimateWB! We also offer web design packages if you would like your website designed and built for you.
Got a techy/website question? Whether it’s about UltimateWB or another website builder, web hosting, or other aspects of websites, just send in your question in the “Ask David!” form. We will email you when the answer is posted on the UltimateWB “Ask David!” section.
