Use This Technical SEO Checklist to Improve Your Dispensary’s Search Ranking
Feeling overwhelmed by your dispensary website’s technical SEO?
The good news is that you at least understand its importance. Otherwise, you wouldn’t be worrying about it at all.
And since we already know that technical SEO is the backbone of your website, now you just need to get organized.
To get you started, we’ve put together this technical SEO checklist to improve your dispensary’s search ranking.
1. Install the Right Tools and Use the Right Services
These tools are a must-have for technical SEO management:
- Google Analytics – a free analytics tool that tracks and reports your web traffic (if you’re installing the tracking tag by hand, see the snippet after this list)
- Google Search Console – helps you monitor and optimize your presence on Google Search
- Yoast SEO (for WordPress users) – a user-friendly WordPress plugin that lets you optimize your pages for search engines. Other WordPress alternatives include All in One SEO Pack and SEO Ultimate.
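If you add Google Analytics manually rather than through a plugin, the standard Google tag (gtag.js) snippet goes in the head of every page. The measurement ID below (G-XXXXXXXXXX) is a placeholder for your own:

```html
<!-- Google tag (gtag.js): paste into <head>, replace G-XXXXXXXXXX with your own ID -->
<script async src="https://www.googletagmanager.com/gtag/js?id=G-XXXXXXXXXX"></script>
<script>
  window.dataLayer = window.dataLayer || [];
  function gtag(){dataLayer.push(arguments);}
  gtag('js', new Date());
  gtag('config', 'G-XXXXXXXXXX');
</script>
```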
2. Find Out How Many Pages On Your Website Have Been Indexed by Google
You can use Google Search Console for this one.
Ideally, you want the number of indexed pages to be as close to the total number of web pages as possible – excluding, of course, the pages that you don’t want indexed.
If there’s a big gap between the indexed pages and the total pages — bigger than you expected — then you’ve got work to do.
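For a quick sanity check outside of Search Console, you can run a site: query on Google itself (the domain below is a placeholder). The count it returns is only approximate, but it’s a fast way to spot a large gap:

```
site:yourdispensary.com
```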
3. Review Your Crawl Budget
Your crawl budget is the number of pages on your website that Googlebot will crawl on a given day.
Your crawl budget is determined by:
- the size of your site
- the health of your site (i.e., how many site errors Google finds)
- the number of backlinks to your site
Google Search Console’s Crawl Stats report can tell you how many of your pages are crawled each day. If you want to optimize your crawl budget, keep working your way through this checklist.
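If you’d rather see crawl activity first-hand, your server’s access logs record every Googlebot visit. Here’s a minimal Python sketch that counts Googlebot requests per day; the log path and the standard Apache/Nginx “combined” log format are assumptions you’d adapt to your own server:

```python
# Minimal sketch: count Googlebot requests per day from an access log.
# Assumes a standard Apache/Nginx combined log format; the log path is
# hypothetical. For real auditing, verify the bot via reverse DNS, since
# anyone can fake a user-agent string.
import re
from collections import Counter

LOG_PATH = "/var/log/nginx/access.log"  # hypothetical path
DATE_RE = re.compile(r"\[(\d{2}/\w{3}/\d{4})")  # e.g. [15/Jan/2024

hits_per_day = Counter()
with open(LOG_PATH) as log:
    for line in log:
        if "Googlebot" in line:
            match = DATE_RE.search(line)
            if match:
                hits_per_day[match.group(1)] += 1

for day, hits in sorted(hits_per_day.items()):
    print(f"{day}: {hits} Googlebot requests")
```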
4. Avoid Duplicate Content
As a general rule, you want to avoid having duplicate content on your website. It can confuse Google’s crawlers because they might not know how to properly index or rank it.
But avoiding duplicate content can be tricky.
For example, if you sell a specific product, and you use the manufacturer’s description of the product on your menu page, that can create duplicate content. The same thing can happen if you have a printer-friendly version of your content.
You can use Google Search Console to identify your duplicate content issues. From there, depending on the nature of the content, you can use 301 redirects, canonical tags, meta robots tags, or other methods to resolve the issue.
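For example, a canonical tag is a single line in the head of the duplicate page that points Google to the version you want ranked. The URL below is a placeholder, and if you use Yoast SEO it can output canonical tags for you:

```html
<!-- On the printer-friendly or duplicate page, point Google at the original -->
<link rel="canonical" href="https://yourdispensary.com/menu/product-name/" />
```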
5. Repair Broken Links
Broken links (links that lead to 4XX or 5XX error pages) don’t just make your site look bad. They also waste your crawl budget and hurt your rankings if they keep popping up in unnaturally high volumes. Fix them as soon as possible.
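A dedicated crawler will find broken links for you, but here’s a minimal Python sketch of the idea: it checks a hand-made list of URLs (placeholders below) and flags anything that returns a 4XX or 5XX status. It assumes the third-party requests library is installed:

```python
# Minimal sketch: flag URLs that return 4XX/5XX status codes.
# The URLs below are placeholders; requires `pip install requests`.
import requests

urls = [
    "https://yourdispensary.com/menu/",
    "https://yourdispensary.com/old-promo/",
]

for url in urls:
    try:
        # HEAD is lighter than GET; follow redirects to reach the final status
        resp = requests.head(url, allow_redirects=True, timeout=10)
        if resp.status_code >= 400:
            print(f"BROKEN ({resp.status_code}): {url}")
    except requests.RequestException as exc:
        print(f"ERROR: {url} -> {exc}")
```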
6. Don’t Let Pages With No SEO Value Get Indexed
Which pages might that be?
- Disclaimer Policies
- Terms and Conditions
- Expired Promotional Pages
You can use your robots.txt file to “disallow” crawling of these pages, freeing up your crawl budget for more SEO-worthy pages (more on robots.txt further down), or use the meta noindex tag to keep them out of Google’s index.
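The noindex tag itself is one line in the page’s head:

```html
<!-- Keeps this page out of search results while still allowing crawling -->
<meta name="robots" content="noindex">
```

One caveat: Google has to be able to crawl a page to see its noindex tag, so don’t combine noindex with a robots.txt disallow for the same URL.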
7. Test Your Website’s Loading Time
A fast-loading website doesn’t just improve user experience. Google also uses page speed as a ranking factor.
Aim to get your load time down to two seconds or less.
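Google PageSpeed Insights (or Lighthouse in Chrome DevTools) is the easiest way to measure this. If you want a rough, scriptable check of server response time (just the initial HTML response, not the full page render), a sketch like this works; the URL is a placeholder and the requests library is assumed:

```python
# Rough sketch: time the server's HTML response (not the full page render).
# Requires `pip install requests`; the URL is a placeholder.
import time
import requests

url = "https://yourdispensary.com/"
start = time.perf_counter()
resp = requests.get(url, timeout=10)
elapsed = time.perf_counter() - start

print(f"{url} responded {resp.status_code} in {elapsed:.2f}s "
      f"({len(resp.content)} bytes of HTML)")
```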
8. Make Your Site Mobile-Friendly
Google has already shifted to mobile-first indexing: crawlers now look at the phone/tablet version of your site for indexing purposes.
Yet another reason why your website should be mobile responsive. It will create a better user experience for your mobile customers, AND it will help your search indexing.
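A responsive theme usually handles this for you, but the one tag every mobile-friendly page needs is the viewport declaration in the head:

```html
<!-- Tells mobile browsers to render at device width instead of a zoomed-out desktop view -->
<meta name="viewport" content="width=device-width, initial-scale=1">
```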
9. Submit an XML Sitemap to Google Search Console
You want Google to find every essential page on your site. But if some pages don’t have internal links, Google will have a hard time finding them.
An XML sitemap can show Google all of the important content on your site, making it easier for their bots to index your pages. Once you’ve created your XML sitemap, you can add it to Google Search Console.
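Most SEO plugins (including Yoast) generate this file for you automatically, but here’s what a bare-bones sitemap looks like; the URLs and date are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://yourdispensary.com/menu/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
  <url>
    <loc>https://yourdispensary.com/about/</loc>
  </url>
</urlset>
```

Once it’s live (typically at yourdomain.com/sitemap.xml), paste that URL into the Sitemaps report in Search Console.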
10. Create and Optimize Your Robots.txt File
Your robots.txt file tells crawlers which pages to access and which to ignore. You can create different rules for different web crawlers (user agents), which will, in turn, affect how different search engines crawl your site.
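Here’s a simple example that ties the earlier items together; the disallowed paths are placeholders for whatever low-value pages you want crawlers to skip:

```
# robots.txt (lives at the root of your domain)
User-agent: *
Disallow: /terms-and-conditions/
Disallow: /expired-promos/

Sitemap: https://yourdispensary.com/sitemap.xml
```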
***
Your dispensary website’s technical SEO will be an ongoing project. There’s virtually no limit to the number of improvements you can make.
But if you successfully check off these ten items, you’ll be in a better position to reap the benefits of SEO and make future improvements.
Have a question or need help with your dispensary website’s technical SEO? Feel free to give us a call at (702) 600-9687 or email us at hello@hazymarketing.com for a consultation!