Search engines want to present their users with the best possible results for their query. Therefore, robots crawl and evaluate web pages on a multitude of factors. Some factors are based on the user’s experience, like how fast a page loads. Other factors help search engine robots grasp what your pages are about; structured data, among other things, serves exactly that purpose. So, by improving technical aspects, we help search engines crawl and understand our site. If we do this well, we might be rewarded with higher rankings or even rich results.
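As a small illustration of the structured-data idea above, here is a minimal sketch of a JSON-LD block using the public schema.org vocabulary. The headline, author, and date are placeholder values, not from any real page; the snippet simply tells a search engine that the page contains an article:

```html
<!-- Placed in the page's <head> or <body>; values below are placeholders -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Example article title",
  "author": {
    "@type": "Person",
    "name": "Jane Doe"
  },
  "datePublished": "2024-01-15"
}
</script>
```

Markup like this doesn't change what visitors see, but it gives crawlers an unambiguous, machine-readable description of the page, which is what can make a page eligible for rich results.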
It also works the other way around: if we make serious technical mistakes on our site, they can cost us. We wouldn’t be the first to block search engines entirely from crawling our site by accidentally adding a slash in the wrong place in our robots.txt file. But it’s a misconception that we should focus on the technical details of a website just to please search engines. A website should work well – be fast, clear, and easy to use – for our users in the first place. Fortunately, creating a strong technical foundation often coincides with a better experience for both users and search engines. Let’s look at the best practices to follow:
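Before we do, the robots.txt pitfall mentioned above is worth making concrete. In this sketch the directory name is hypothetical; the point is that under the robots.txt rules, a single stray slash turns a narrow rule into a site-wide block:

```text
# Intended: keep crawlers out of one directory only
User-agent: *
Disallow: /drafts/

# Accidental: a lone slash blocks the entire site
User-agent: *
Disallow: /
```

Because `Disallow` rules match URL paths by prefix, `/` matches every URL on the site, so one misplaced character can deindex everything. This is exactly the kind of mistake worth double-checking before deploying.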