Creating content that is valuable and offers unique insights is a challenge.
There is no single content creation method that works for every industry and audience.
This field requires you to stay adaptable and develop new techniques depending on your audience, your industry, and your individual profession.
1. Content Provides Value & Unique Insights
You can only figure out what content provides value and uniqueness by analyzing your competitors' content.
There are automated ways to do that analysis, like using Python to scrape the SERPs, but we won't get into that here.
I suggest performing this check manually.
How to Check
Type your chosen keyword into the Google search engine and analyze the top 10 organic competitors that come up.
If the competitor sites aren't blocking Screaming Frog, you can also use it to pull this type of content.
Obtain your list of competitors from Google and store it in an Excel spreadsheet.
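If you want to script the bookkeeping step, here is a minimal sketch that writes a manually gathered competitor list to a CSV file that Excel opens directly. The keyword and URLs below are placeholders, and it uses nothing beyond the Python standard library.

```python
# Minimal sketch: store the top 10 organic competitors you found manually.
# The keyword and URLs are placeholders; swap in your own research.
import csv

keyword = "best running shoes"  # the keyword you searched in Google
competitors = [
    "https://example.com/best-running-shoes",
    "https://example.org/running-shoe-guide",
    # ...add the rest of the top 10 organic results here
]

with open("competitors.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.writer(f)
    writer.writerow(["keyword", "competitor_url"])
    for url in competitors:
        writer.writerow([keyword, url])
```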
2. Contact Us Page
Find the contact page on the website and make sure it contains accurate, complete contact information.
What to Check
Look through the website and see whether you can find their contact information.
There are no guarantees, but if the information is complete and matches what is in the WHOIS records, the site is probably a reliable source.
If it is not, you will most likely want to flag this in your audit.
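If you want to speed up the WHOIS comparison, here is a rough sketch that pulls the WHOIS record so you can compare it against the contact page by eye. It assumes a system whois command is available (common on macOS and Linux), and the domain is a placeholder.

```python
# Rough sketch: pull the WHOIS record for a domain so you can compare it
# against the contact details published on the site.
# Assumes a system `whois` command is installed; the domain is a placeholder.
import subprocess

domain = "example.com"
record = subprocess.run(["whois", domain], capture_output=True, text=True).stdout

# Print only the lines most likely to contain contact details for manual review.
for line in record.splitlines():
    if any(key in line.lower() for key in ("registrant", "org", "email", "phone")):
        print(line.strip())
```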
3. Site Architecture
The way your website is structured can help Google understand your content better and crawl it more easily.
There are different schools of thought on this topic.
Some believe that it is best to have a flat site architecture where the internal pages are only one click away from the home page.
Others believe that a siloed architecture is best.
In the physical world, a silo is a structure used to store or isolate something; in digital marketing, a silo is a structure used to organize content by topic.
How to Check
In Screaming Frog, check the window on the far right.
Click on the Site Structure tab.
There you can review crawl depth and identify any potential issues with the top 20 URLs on your website.
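If you also export the crawl, a small script can flag pages buried deep in the architecture. This is a minimal sketch that assumes a Screaming Frog "Internal: All" CSV export with "Address" and "Crawl Depth" columns; adjust the filename and column names if your export differs.

```python
# Minimal sketch: flag deep pages from a crawl export CSV.
# Assumes columns named "Address" and "Crawl Depth"; if your export has an
# extra title row above the headers, delete it first.
import csv

MAX_DEPTH = 3  # pages deeper than this may be hard for crawlers (and users) to reach

with open("internal_all.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        depth = row.get("Crawl Depth", "")
        if depth.isdigit() and int(depth) > MAX_DEPTH:
            print(f"{row['Address']} is {depth} clicks from the home page")
```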
4. Identify Pages That Are Under-Optimized
Under-optimized pages can prevent a website from ranking high in search engine results pages (SERPs), or leave it stranded on the third page of results and beyond.
These pages are also typical candidates for keyword cannibalization: they often target keywords that other pages on the site are already optimized for, which can confuse search engines about which page to rank for a given keyword.
What to Check
Check for pages that are missing the following (a quick script sketch follows this list):
- Pages with no clear keyword targeting: default content that looks like it was written and published without being optimized for any obvious keyword.
- Pages missing heading tags such as H1s, H2s, and H3s.
- Pages with no optimized meta tags whatsoever.
- Pages with zero outbound or internal links.
- Pages with zero text structure.
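If the list of pages is long, a rough script can do a first pass before you review anything by hand. The sketch below assumes the requests and beautifulsoup4 packages are installed and uses placeholder URLs; it only flags the most obvious gaps.

```python
# Rough first-pass check for the on-page elements listed above.
# Assumes `requests` and `beautifulsoup4` are installed; URLs are placeholders.
import requests
from bs4 import BeautifulSoup

urls = ["https://example.com/", "https://example.com/services"]

for url in urls:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    issues = []
    if not soup.find("h1"):
        issues.append("missing H1")
    if not soup.find("h2"):
        issues.append("no H2 subheadings")
    if not soup.find("meta", attrs={"name": "description"}):
        issues.append("missing meta description")
    if not soup.title or not soup.title.get_text(strip=True):
        issues.append("missing or empty <title>")
    if not soup.find("a", href=True):
        issues.append("no links on the page")
    print(url, "->", ", ".join(issues) if issues else "looks OK")
```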
5. Make Sure the Page Is Shareable
Make sure your website’s content is shareable by adding social sharing plugins or some other functionality.
A simple check here can make a world of difference to how shareable a site's content is.
What to Check
Open an article on the site, look for sharing buttons, and click the one for the social media site you want to share on to confirm it actually works.
6. Site Has Rampant Interstitials & Offensive Ads Above the Fold
What to Check
This is a visual check of the site. Browse the pages and look for interstitials and ads that match the criteria above, paying particular attention to anything intrusive above the fold.
If you have a lot of interstitials on your site, you will need to remove them or redesign them so they are not offensive or disruptive.
You should take screenshots of the pages with the interstitials and identify each page where the issue occurs.
7. Schema.org Markup Exists On-Site
It is now considered best practice to have some sort of Schema.org markup on your website.
Most websites should have Schema.org markup applied, and you can confirm that it is present by searching the page's source code.
What Is Schema.org?
Schema.org is a joint effort from Google, Microsoft, Yandex, and Yahoo! to improve the web by creating a common vocabulary for structured data.
The name is also used to refer to the vocabulary itself, which you apply to your pages as markup (for example, as Microdata or JSON-LD).
Different vocabulary terms can be used to mark up and describe the data present on a page.
Schema.org structured data is used for many different content types, including the following (a rough detection sketch follows the list):
- Restaurants.
- Events.
- Articles.
- Local businesses.
- All types of reviews.
- Product ratings.
- Many other types of contextually relevant information.
The Schema.org website provides vast opportunities for utilizing structured data.
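If you want to check for markup without reading the raw source, here is a rough sketch that looks for JSON-LD and Microdata on a page. It assumes the requests and beautifulsoup4 packages are installed, and the URL is a placeholder.

```python
# Minimal sketch: look for structured data in a page's source.
# Assumes `requests` and `beautifulsoup4` are installed; the URL is a placeholder.
import json
import requests
from bs4 import BeautifulSoup

url = "https://example.com/"
soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

# JSON-LD blocks (the format Google generally recommends)
for script in soup.find_all("script", type="application/ld+json"):
    try:
        data = json.loads(script.string or "")
    except json.JSONDecodeError:
        print("JSON-LD block present but could not be parsed")
        continue
    if isinstance(data, dict):
        print("JSON-LD found:", data.get("@type", "unknown type"))
    else:
        print("JSON-LD found (list of items)")

# Microdata markup uses itemscope/itemtype attributes
for node in soup.find_all(attrs={"itemtype": True}):
    print("Microdata found:", node["itemtype"])
```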
8. Any Important Pages That Have Low CTR (Click-Through Rates)
First of all, what the heck is a click-through rate, and why do I need to care about it?
The click-through rate (CTR) measures how many people click on your search engine results page (SERP) listing out of the total number of people who see it. A lower CTR means people are less likely to click on your listing, while a higher CTR means your listing is more appealing to searchers.
In other words, it is the number of people who clicked through to your page divided by the number of people who saw your search result.
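As a quick worked example of the formula (the numbers here are made up):

```python
# Tiny illustration of the CTR formula: clicks divided by impressions.
# The numbers are invented for the example.
clicks = 120
impressions = 3400

ctr = clicks / impressions
print(f"CTR: {ctr:.2%}")  # -> CTR: 3.53%
```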
Larry Kim at WordStream has some great examples of how you can improve your CTRs.
9. Fact-Checking Claims Made in Your Content
It is important to always check the facts behind the claims made in your articles to make sure they are accurate.
You should start incorporating fact-checking into your routine now if you have not been already.
Google has not only created fact-checks within the search results, but has also created some fact-checking tools for SEO professionals and content publishers.
The best thing about these fact-checking tools is that they make it easy to verify the accuracy of claims before you publish.
10. Ensure you have a robots.txt file available on your website
The robots.txt file is one of the first things robots request when they visit a website. It contains a list of directives that tell robots which parts of the site they may crawl and which they should stay away from. The file must be formatted correctly so that search engines can read and obey it.
If you want to keep crawlers away from certain content, you can add an appropriate rule to the robots.txt file.
For more information on such rules, check out robotstxt.org.
Keep in mind that the commands in the robots.txt file are more like suggestions than strict rules for robots to follow. There is no guarantee that a disobedient robot will not check the content you have disallowed. If you want to keep something secret or sensitive on your website, do not rely on robots.txt to keep it hidden.
Having a valid robots.txt file on your website does not guarantee that it will be quickly or correctly indexed by search engines.
The first step is to ensure that the file is set up correctly, so that important content is not accidentally excluded from indexing. If you want to see what pages on your site are restricted from being indexed by search engines, you can check the “Pages restricted from indexing” section.
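To test whether a specific URL is blocked, you can lean on Python's built-in robots.txt parser. This is a minimal sketch with placeholder URLs; it only reflects what the file says, not how every crawler will behave.

```python
# Minimal sketch: confirm robots.txt exists and test whether given URLs
# are crawlable. The domain and paths are placeholders.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.set_url("https://example.com/robots.txt")
rp.read()  # fetches and parses the file (an empty or missing file allows everything)

for path in ("https://example.com/", "https://example.com/private/page"):
    allowed = rp.can_fetch("Googlebot", path)
    print(f"{path} -> {'crawlable' if allowed else 'blocked by robots.txt'}")
```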
11. Make sure you have an XML sitemap on your website.
The sitemap should be no more than one directory away from the homepage, and it should contain all the website pages that you want indexed. In general, it serves to aid indexing and saturation. Update the sitemap whenever new pages are added to the website, and make sure those entries are coded correctly.
The sitemap also lets you set a change frequency for each page, telling search engines which pages to crawl more often because they are updated more frequently.
An XML sitemap on its own does not guarantee that every page will be indexed perfectly by all search engines. Pair it with a correctly configured robots.txt file in the root directory of your website so that no important pages are restricted from indexing.
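If you want a quick look at what the sitemap actually contains, here is a minimal sketch using only the Python standard library; the sitemap URL is a placeholder.

```python
# Minimal sketch: fetch an XML sitemap and list its URLs, last-modified
# dates, and change frequencies. The sitemap URL is a placeholder.
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://example.com/sitemap.xml"
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

with urllib.request.urlopen(SITEMAP_URL) as resp:
    root = ET.fromstring(resp.read())

# Note: if this is a sitemap index file, the children will be <sitemap>
# elements pointing to further sitemaps rather than <url> entries.
for url_node in root.findall("sm:url", NS):
    loc = url_node.findtext("sm:loc", default="", namespaces=NS)
    lastmod = url_node.findtext("sm:lastmod", default="n/a", namespaces=NS)
    changefreq = url_node.findtext("sm:changefreq", default="n/a", namespaces=NS)
    print(loc, lastmod, changefreq)
```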
12. Ensure pages with useful content are not restricted from indexing
A page can be restricted from indexing in several ways:
The three ways to prevent a page from being indexed are a noindex rule in the robots.txt file, the noindex X-Robots-Tag HTTP header, and the noindex robots meta tag.
These noindex directives tell search engine crawlers that they are not allowed to index the page, follow its links, and/or archive its contents (a quick check for the page-level signals follows below).
Make sure to have pages with unique and useful content available to be indexed.
Make sure that none of the sections of your website containing valuable content are restricted from indexing by checking the Disallow rules in your robots.txt file. If you want to keep low-quality pages from appearing in search results, you can use the robots.txt file to block them.
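For the page-level signals, a short script can check both the robots meta tag and the X-Robots-Tag header at once. This is a rough sketch assuming the requests and beautifulsoup4 packages are installed; the URL is a placeholder.

```python
# Rough sketch: check a page for the two page-level noindex signals
# (robots meta tag and X-Robots-Tag header). The URL is a placeholder.
import requests
from bs4 import BeautifulSoup

url = "https://example.com/important-page"
resp = requests.get(url, timeout=10)

header = resp.headers.get("X-Robots-Tag", "")
if "noindex" in header.lower():
    print("Blocked via X-Robots-Tag header:", header)

soup = BeautifulSoup(resp.text, "html.parser")
meta = soup.find("meta", attrs={"name": "robots"})
if meta and "noindex" in meta.get("content", "").lower():
    print("Blocked via robots meta tag:", meta.get("content"))
```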
13. Make sure you have consolidated the www and non-www versions of your URL
Websites are generally reachable both with and without "www" in front of the domain name. The fact that this issue is common does not make it any less of an issue: if people are linking to both the www and non-www versions, that needs to be sorted out, otherwise search engines may index two versions of your website.
It is best practice to pick one version (with www or without) as the priority, especially because it consolidates the link juice from links that use www and from links that do not.
You can choose which version of your website, www or non-www, to serve via a redirect in the .htaccess file. Google Webmaster Tools (now Google Search Console) also allows you to set a preferred domain, which is recommended.
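A quick way to confirm the setup is to request both versions and see where they end up. This is a minimal sketch assuming the requests package is installed; the domain is a placeholder.

```python
# Minimal sketch: see where the www and non-www versions resolve after redirects.
# Assumes `requests` is installed; the domain is a placeholder.
import requests

for start in ("https://example.com/", "https://www.example.com/"):
    resp = requests.get(start, timeout=10, allow_redirects=True)
    print(f"{start} -> {resp.url} ({resp.status_code})")

# Both variants should end on the same final URL; if they don't, add a
# 301 redirect (e.g. in .htaccess) to consolidate on one version.
```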
14. Remove duplicate HTTP/HTTPS versions.
If the HTTP and HTTPS versions of your website aren't set up properly, both can get indexed by search engines and cause problems with duplicate content. To resolve these issues, prioritize one version, HTTP or HTTPS, depending on the content of the page (HTTPS for anything handling secure content, and ideally sitewide).
You can set the primary versions of your HTTP or HTTPS pages using either the .htaccess file or the rel=”canonical” tag.
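To spot-check this, you can follow the HTTP version's redirect chain and look for a canonical tag. A rough sketch, assuming the requests and beautifulsoup4 packages are installed and using a placeholder domain:

```python
# Rough sketch: confirm the HTTP version redirects to HTTPS and that the
# page declares a canonical URL. The domain is a placeholder.
import requests
from bs4 import BeautifulSoup

resp = requests.get("http://example.com/", timeout=10, allow_redirects=True)
print("Final URL after redirects:", resp.url)  # should start with https://

history = [(r.status_code, r.headers.get("Location")) for r in resp.history]
print("Redirect chain:", history or "no redirect at all")

canonical = BeautifulSoup(resp.text, "html.parser").find("link", rel="canonical")
print("rel=canonical:", canonical.get("href") if canonical else "not found")
```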
15. Remove Meta Refresh redirects from your website
Meta refresh redirects are generally not recommended from an SEO perspective, since they may be seen as violating Google's Quality Guidelines.
One of Google's representatives has said that meta refresh-style redirects are not recommended because they can be confusing for users and search engine crawlers. He also said that, although they do not cause problems with regard to crawling, indexing, or ranking, it is still a good idea to remove them.
So stick to the permanent 301 redirects instead.
Having no meta refresh redirects does not mean that there are no redirect problems with your website. To check that your redirects are set up correctly, look for pages on your website that have a 302 redirect, as well as pages with a rel=”canonical” tag.
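A quick script can flag meta refresh tags across a handful of URLs before you dig into the wider redirect setup. This is a minimal sketch assuming requests and beautifulsoup4 are installed; the URLs are placeholders.

```python
# Minimal sketch: scan pages for meta refresh tags. URLs are placeholders.
import re
import requests
from bs4 import BeautifulSoup

urls = ["https://example.com/", "https://example.com/old-page"]

for url in urls:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    # http-equiv may appear as "refresh" or "Refresh", so match case-insensitively
    refresh = soup.find("meta", attrs={"http-equiv": re.compile("refresh", re.I)})
    if refresh:
        print(f"{url}: meta refresh found -> {refresh.get('content')}")
    else:
        print(f"{url}: no meta refresh")
```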
16. Make sure your website is mobile-friendly!
Google has announced that its mobile-friendly algorithm now affects mobile searches in all languages worldwide, and it has a significant impact on Google rankings. The algorithm evaluates each page individually to determine whether it is mobile-friendly; it is not concerned with how mobile-friendly your website is as a whole, only with whether specific pages are optimized for mobile devices. It looks at factors such as font size, tap targets/links, readable content, viewport configuration, and so on.
Each landing page on your website should be mobile-friendly in order to rank well under Google's mobile-friendly algorithm. You can check whether your pages are mobile-friendly in the Page Audit module, and you can also check the Mobile Usability report in Google Webmaster Tools (now Google Search Console) to find and fix any potential mobile issues on your site.
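If you want a very rough first pass before running the proper reports, you can at least confirm that each landing page declares a viewport meta tag, one basic signal of mobile optimization. This sketch assumes requests and beautifulsoup4 are installed and uses placeholder URLs; it is not a substitute for a full mobile-friendly test.

```python
# Very rough heuristic (not a full mobile-friendly test): check that each
# page declares a viewport meta tag. URLs are placeholders.
import requests
from bs4 import BeautifulSoup

urls = ["https://example.com/", "https://example.com/landing-page"]

for url in urls:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    viewport = soup.find("meta", attrs={"name": "viewport"})
    print(url, "->", viewport.get("content") if viewport else "NO viewport meta tag")
```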
17. Avoid Pages With Frames Where Possible.
Frames allow multiple HTML documents to be displayed in the same browser window. Because the content is split across separate documents, the framed pages often lack the important signals search engines need to find and understand them.
Heavy use of frames makes it difficult for search engines to index your content and could result in a lower ranking for your website.
If you’re using frames for specific reasons, you can add the NOFRAMES tag and insert optimized content there. If a browser does not support frames, the content within the NOFRAMES tags will be displayed. This is similar to how the content in the ALT attribute of an image will be displayed if the image can’t be rendered. We recommend avoiding using frames as much as possible.
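A short script can flag pages that still rely on frames and check whether they provide NOFRAMES fallback content. This is a rough sketch assuming requests and beautifulsoup4 are installed, with placeholder URLs.

```python
# Minimal sketch: flag pages that use frames and check for a NOFRAMES
# fallback with real content. URLs are placeholders.
import requests
from bs4 import BeautifulSoup

urls = ["https://example.com/", "https://example.com/legacy-page"]

for url in urls:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    frames = soup.find_all(["frameset", "frame", "iframe"])
    noframes = soup.find("noframes")
    if frames:
        fallback = "with" if noframes and noframes.get_text(strip=True) else "WITHOUT"
        print(f"{url}: {len(frames)} frame element(s), {fallback} NOFRAMES fallback text")
    else:
        print(f"{url}: no frames")
```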