SEO 11 tips to improve indexing skills
Once a website has been online for a while, most webmasters stop worrying about their crawl budget.
As long as you link to a new blog post somewhere on the site, it should simply appear in Baidu's or Bing's index and start ranking.
Then, after a while, you notice that your site is losing keyword rankings, and your new posts are not even reaching the top 100 for their targeted keywords.
The cause may be your site's technical structure, thin content, or a new algorithm change, but it may also be a serious crawl error.
With hundreds of billions of web pages in the Baidu index, you need to optimize your crawl budget to stay competitive.
Here are 11 tips and tricks to help you optimize your crawl rate and help your pages rank higher in search.
1. Track the crawl status using Baidu Search Console
An error in your crawl status may indicate a deeper problem on your site.
Checking your crawl status every 30-60 days is important for identifying potential errors that affect your site's overall marketing performance. It is effectively the first step in SEO; without it, all other efforts are in vain.
On the sidebar, you will be able to check your crawl status under the Index tab.
Now, if you want to remove a page from the index, you can tell Search Console directly. This can be very useful if the page is temporarily redirected or returns a 404 error.
A 410 status code will permanently delete the page from the index, so be aware that this is the nuclear option.
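As an illustration, serving a permanent 410 for a removed URL could look like this in an nginx configuration (the path is a hypothetical example):

```nginx
# Return 410 Gone for a page that has been permanently removed,
# signaling search engines to drop it from the index.
location = /old-post/ {
    return 410;
}
```

Unlike a 404, which crawlers may retry for some time, a 410 tells them the removal is deliberate and permanent.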
Common crawl errors & solutions
If your site unfortunately runs into a crawl error, it may have a simple fix or point to a much larger technical issue on your site. The most common crawl errors I see are:
- DNS errors
- Server errors
- Robots.txt errors
- 404 errors
To diagnose some of these errors, you can use the Fetch as Baidu tool to see how Baidu effectively views your site.
A failure to properly fetch and render the page may indicate a deeper DNS error that your DNS provider needs to resolve.
Resolving server errors requires diagnosing the specific error, which you can look up in this guide. The most common errors include:
- Timeout
- Connection refused
- Connect failed
- Connect timeout
- No response
In most cases, server errors are usually temporary, although persistent issues may require you to contact the hosting provider directly.
On the other hand, a robots.txt error can cause bigger problems for your site. If your robots.txt file returns something other than a 200 or a 404 (for example, a 5xx server error), search engines cannot retrieve the file and may stop crawling your site entirely.
You can reference your sitemap from robots.txt, or avoid the protocol altogether and manually noindex the pages that are causing crawling issues.
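For example, a minimal robots.txt that allows crawling and points crawlers at your sitemap might look like this (the domain is a placeholder):

```text
User-agent: *
Allow: /

Sitemap: https://www.example.com/sitemap.xml
```

The file must live at the root of the host (e.g. `/robots.txt`) and return a 200 status for crawlers to honor it.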
Resolving these errors quickly will ensure that all landing pages are crawled and indexed the next time the search engine crawls your site.
2. Create mobile-friendly web pages
With the move to mobile-first indexing, we must also optimize our pages to serve a mobile-friendly copy for the mobile index.
The good news is that if there is no mobile-friendly copy, your desktop copy will still be indexed and displayed under the mobile index. The bad news is that your rankings may suffer.
There are a number of technical adjustments that can immediately make your site more mobile-friendly, including:
- Implement responsive web design.
- Insert a viewport meta tag in the content.
- Minify page resources (CSS and JS).
- Tag pages for the AMP cache.
- Optimize and compress images to speed up load times.
- Reduce the size of on-page UI elements.
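The viewport meta tag from the list above is a one-line addition to each page's head:

```html
<head>
  <!-- Tell mobile browsers to render at device width
       instead of a zoomed-out desktop layout -->
  <meta name="viewport" content="width=device-width, initial-scale=1">
</head>
```

Without it, mobile browsers assume a desktop-width layout and shrink the page, which mobile-friendliness tests will flag.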
Be sure to test your site on a mobile platform and run it through Baidu Pagespeed Insights. Page speed is an important ranking factor that can affect the speed at which search engines crawl your site.
3. Regularly update content
If you publish new content on a regular basis, search engines will crawl your site more often. This is especially useful for publishers who need new stories posted and indexed quickly.
Regularly producing content signals to search engines that your site is constantly improving and publishing new content, and therefore needs to be crawled more often to reach its target audience.
4. Submit a sitemap to each search engine
One of the best tips for indexing today is still to submit sitemaps to Baidu Search Console and Bing Webmaster Tools.
You can use a sitemap generator to create an XML sitemap, or create one manually in Baidu Search Console, marking the canonical version of each page that contains duplicate content.
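An XML sitemap follows the structure below; this is a minimal sketch with placeholder URLs and dates:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2020-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/blog/new-post/</loc>
    <lastmod>2020-01-20</lastmod>
  </url>
</urlset>
```

List only canonical URLs here; including duplicate or noindexed pages sends crawlers conflicting signals.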
5. Optimize your internal linking scheme
Establishing a consistent information architecture is critical to ensuring that your site is not only properly indexed, but also organized.
Creating main service categories to house related web pages helps search engines correctly index content within a particular category, even when a page's intent is not otherwise clear.
6. Deep-link to isolated web pages
If a page on your site or a subdomain sits in isolation, or an error is preventing it from being crawled, you can get it indexed by acquiring a link from an external domain. This is a particularly useful strategy for promoting new content on your site and getting it indexed faster.
Beware of using syndicated content for this purpose, as search engines may ignore syndicated pages, and if they are not properly canonicalized, they can trigger duplicate-content errors.
7. Reduce page resources and improve load time
Forcing a search engine to crawl a large number of unoptimized images will take up your crawl budget and prevent your site from being indexed frequently.
Even resources like Flash and CSS can perform poorly on mobile devices and eat into your crawl budget. In a sense, it is a lose-lose scenario in which page speed and crawl budget are both sacrificed for obtrusive page elements.
Be sure to optimize your pages for speed by minifying on-page resources (such as CSS), especially for mobile devices. You can also enable caching and compression to help spiders crawl your site faster.
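If your site runs on nginx, caching and compression can be enabled with a fragment like the following (a sketch using standard nginx directives; tune the values and file types to your site):

```nginx
# Compress text resources so crawlers spend less time
# (and budget) downloading each page.
gzip on;
gzip_types text/css application/javascript image/svg+xml;

# Let browsers and crawlers cache static assets for 30 days.
location ~* \.(css|js|png|jpg|webp)$ {
    expires 30d;
    add_header Cache-Control "public";
}
```

Apache offers the equivalent via `mod_deflate` and `mod_expires`.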
8. Fix pages with the noindex tag
During your site's development process, it can make sense to apply a noindex tag to pages that may be duplicated or that are only meant for users who take a certain action.
In any case, you can use a free tool such as Screaming Frog to identify pages carrying noindex tags that are keeping them out of the index.
WordPress's Yoast plugin lets you easily switch a page from index to noindex. You can also do this manually in the backend of your site's pages.
9. Set a custom crawl rate
In the old version of Baidu Search Console, if Baidu's spiders were having a negative impact on your site, you could actually slow down or customize the crawl rate.
This will also give your site time to make the necessary changes if a major redesign or migration is underway.
10. Eliminate duplicate content
Having a lot of duplicate content can significantly slow down crawls and lower your crawl budget.
You can eliminate these issues by preventing these pages from being indexed or by placing canonical tags on the pages you wish to index.
Similarly, it is worth optimizing the meta tags of each page to prevent search engines from mistaking similar pages for duplicates while crawling.
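The canonical tag mentioned above is a single line in the head of each duplicate variant, pointing at the version you want indexed (the URL is a placeholder):

```html
<head>
  <!-- Consolidate duplicate URLs onto one preferred version -->
  <link rel="canonical" href="https://www.example.com/products/blue-widget/">
</head>
```

Every variant (tracking-parameter URLs, print versions, and so on) should point at the same canonical URL, and that URL should point at itself.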
11. Block pages you don’t want spiders to crawl
There may be situations where you want to prevent search engines from crawling specific pages. You can do this in the following ways:
- Place a noindex tag on the page.
- Place the URL in the robots.txt file.
- Delete the page completely.
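Blocking a path via robots.txt, as in the list above, is a short fragment (the paths are placeholder examples):

```text
User-agent: *
# Keep crawlers out of duplicate or low-value sections
Disallow: /search/
Disallow: /cart/
```

Note that a robots.txt Disallow stops crawling but not necessarily indexing: a blocked URL can still be indexed from external links, whereas a noindex tag requires the page to be crawled to be seen.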
This can also help your crawls run more efficiently, rather than forcing search engines to wade through duplicate content.
If you have followed SEO best practices, you may never need to worry about crawl budget at all.