Your law firm’s website has a secret, secondary audience: bots! While your human visitors naturally matter most, search engine bots can have a bigger impact on the reach and ranking of your attorney website.
We frequently remind firms that they are writing for multiple audiences, including current clients, prospective clients, and colleagues. But it’s important to understand that you’re also writing for an invisible audience that isn’t human at all.
This article explains the basics of search engine bots, how (and if) you can write to appeal to them, and how to communicate with Google’s bots when necessary.
What are bots and why do search engines use them?
A bot is a web crawler: a piece of software that automatically visits web pages and gathers information about your website. Bots collect data so search engines can properly index your site with the most up-to-date information.
To gather that data, Google sends multiple bots to “crawl” the pages of your site and learn more about the quality of your content. Basically, bots follow links and form a map of your site.
Whenever a bot finds new information on your site, it reports back so the search engine can update its index and keep your site’s entry on the search engine results page accurate.
Bots look for keywords, but they also look at factors like freshness and load speed. Why? A recently updated website signals greater value, and a fast load time signals healthy servers. It’s important to impress bots so your website ranks highly on search engine results pages.
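To make the “follow links and form a map” idea concrete, here is a minimal sketch of the first step a crawler performs: parsing a page’s HTML and collecting the links it would visit next. The page content and URLs below are hypothetical examples, and real bots like Googlebot are far more sophisticated.

```python
# Minimal sketch: how a crawler discovers the links on one page.
# Uses only Python's standard-library HTML parser.
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collects the href of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# A hypothetical law-firm page (illustration only).
page = """
<html><body>
  <h1>Smith &amp; Jones, Attorneys at Law</h1>
  <a href="/practice-areas">Practice Areas</a>
  <a href="/attorneys">Our Attorneys</a>
  <a href="/blog/estate-planning-basics">Estate Planning Basics</a>
</body></html>
"""

collector = LinkCollector()
collector.feed(page)
print(collector.links)
# → ['/practice-areas', '/attorneys', '/blog/estate-planning-basics']
```

A real crawler would repeat this for each discovered link, which is how the “map” of your site is built: every internal link is a road on that map, so pages with no links pointing to them are easy for bots to miss.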
Writing for bots: do or do not?
This is a bit of a trick question. Here's why.
Your primary audience is real humans and your secondary audience is bots. But the real goal of bots is to determine whether your site has good content for humans!
For example, if someone stays on your site for a long time—or shares a link—these are human data points that hold significance for the search engines. The best way to write for bots is to write high-quality content for humans.
Here are a few practices for producing bot-friendly and human-friendly content:
- Use simple language that an average 10-year-old could understand (no jargon!).
- Create content that is useful by considering user intent.
- Balance readability, SEO, and keyword placement.
- Write in short sentences and short paragraphs.
- Use external links with a high domain authority when it makes sense.
How to communicate with the Google bots
You’re not at the mercy of the search engines! You can influence how often Google’s bots crawl your website (if at all). The amount of crawling Google allocates to your site is called a crawl budget. You can’t increase it simply by asking, but you can decrease it.
Why would you want to reduce or opt out of crawling?
- You’re concerned that bot visits are interfering with the user experience of real humans.
- You want to hide less-useful content (like duplicate content) from Google.
Here’s a good overview of how to maximize a crawl budget. (Note that most law firms have relatively small websites, so managing your crawl budget isn’t necessary. If you’re part of an Am Law 100 firm or in-house at a large corporation, however, it may be worth the effort.) If you want to opt out of crawling entirely, look into a robots.txt file. You can also limit the crawl rate via Google Search Console.
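As a sketch of how robots.txt rules work, here is a hypothetical set of rules (the /drafts/ and /print/ paths are invented for illustration) checked with Python’s standard-library urllib.robotparser, which applies the same rules well-behaved bots follow:

```python
# Hypothetical robots.txt rules, verified with Python's built-in parser.
from urllib.robotparser import RobotFileParser

robots_txt = [
    "User-agent: *",       # applies to all bots, including Googlebot
    "Disallow: /drafts/",  # ask bots not to crawl unfinished pages
    "Disallow: /print/",   # ...or duplicate print-friendly versions
]

parser = RobotFileParser()
parser.parse(robots_txt)

# Regular pages remain crawlable; the disallowed paths are not.
print(parser.can_fetch("Googlebot", "https://example.com/practice-areas"))
print(parser.can_fetch("Googlebot", "https://example.com/drafts/new-page"))
```

In practice this file lives at the root of your domain (e.g. example.com/robots.txt), and your developer or marketing agency would maintain it for you.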
Note that there is a difference between preventing crawling and preventing indexing: a page blocked from crawling can still show up in search results if other sites link to it. To keep a page out of the results themselves, you can use something called a noindex directive.
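A noindex directive is just a single meta tag placed in a page's HTML head, for example:

```html
<!-- Tells search engines not to show this page in their results,
     even though bots may still crawl it -->
<meta name="robots" content="noindex">
```

Google also honors an equivalent X-Robots-Tag HTTP header, which your developer can set for non-HTML files like PDFs.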
Ask your developer for help strategizing about how to present the best possible online experience to Google’s bots.
Review and next steps
Bots are an unavoidable part of managing a website. But once you know how bots work, you can use them to your advantage!
- Bots collect information about your website and share it with search engines.
- To impress bots, write human-friendly content.
- You can reduce bot crawling if necessary.
Understanding your invisible bot audience is key to the success of your attorney website. Speak with your online marketing agency to craft a plan that takes bots into account.