Last Updated on August 3, 2023
Is your site failing to rank well in the SERPs? Did your on-page SEO techniques not lead to any results?
Find out which SEO problems are the most common and how to fix them to improve your ranking.
By now, appearing among Google's first organic results has become increasingly difficult, often because basic on-page SEO techniques are missing or applied incorrectly, which does nothing to help your site's ranking.
Using anonymized data collected by SEMrush from 100,000 sites and 450 million pages, it was possible to determine which SEO problems site owners run into most often, and how to solve them.
Here are the summary data:
The 11 most common SEO problems
1. Duplicate Content
According to SEMrush data, 50% of the pages analyzed face duplicate content problems.
Although there is apparently no specific penalty for duplicate content on the same domain, the problem arises when similar pages start competing for the same position in search results and Google ends up filtering one out at the expense of the other, perhaps the very one you wanted to rank!
In a situation like this, in addition to a thorough site crawl aimed at eliminating duplicate content, the rel="canonical" attribute can help, because it tells Google exactly which of the duplicate pages should be ranked.
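For example, a quick audit script can show which of your duplicate pages declare a canonical URL and which do not. The sketch below is only illustrative: it assumes the requests and beautifulsoup4 Python packages are installed, and the example.com URLs are placeholders for your own pages.

```python
# Minimal sketch of a canonical-tag check; URLs are placeholders and
# requests + beautifulsoup4 are assumed to be installed.
import requests
from bs4 import BeautifulSoup

pages = [
    "https://example.com/red-shoes",
    "https://example.com/red-shoes?sort=price",  # likely duplicate of the first
]

for url in pages:
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    # The canonical tag looks like: <link rel="canonical" href="https://example.com/red-shoes">
    canonical = soup.find("link", rel="canonical")
    if canonical and canonical.get("href"):
        print(f"{url} -> canonical: {canonical['href']}")
    else:
        print(f"{url} -> no canonical tag declared")
```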
2. Problems with Images
The research found that 45% of sites have images without alt tags, while 10% have broken images. Let's look at both in detail.
Alt tags are used to describe your images to search engines and ensure they are correctly indexed in image search, driving additional traffic to your site.
Broken images cause the same problems as broken links: a bad user experience. One way to avoid them is to make sure the images live in your own media library, not on another image host.
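A simple script can catch both problems at once: missing alt text and image URLs that no longer respond. This is a rough sketch that assumes requests and beautifulsoup4 are installed and uses a placeholder page URL.

```python
# Illustrative sketch: flag <img> tags without alt text and images that fail
# to load. The URL is a placeholder.
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin

page = "https://example.com/blog/post"
soup = BeautifulSoup(requests.get(page, timeout=10).text, "html.parser")

for img in soup.find_all("img"):
    src = urljoin(page, img.get("src", ""))
    if not img.get("alt"):
        print(f"Missing alt text: {src}")
    # A HEAD request is enough to spot broken image URLs (4xx/5xx responses).
    status = requests.head(src, timeout=10, allow_redirects=True).status_code
    if status >= 400:
        print(f"Broken image ({status}): {src}")
```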
3. Problems with Title Tags
Title tags are used to tell search engines and visitors what each page of your site is about, in the shortest and most accurate way possible.
SEMrush found that 35% of sites have duplicate Title tags, 15% have too much text in the tag, 8% are missing the tag altogether, and 4% don't provide enough text.
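To spot these issues on your own pages, a small audit script is enough. The sketch below uses placeholder URLs, assumes requests and beautifulsoup4 are installed, and applies a rule-of-thumb 60-character limit (not an official one) to flag missing, overly long, and duplicate titles.

```python
# Rough sketch of a Title tag audit across a handful of placeholder URLs.
import requests
from bs4 import BeautifulSoup
from collections import defaultdict

urls = ["https://example.com/", "https://example.com/about", "https://example.com/contact"]
titles = defaultdict(list)

for url in urls:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    title = soup.title.string.strip() if soup.title and soup.title.string else ""
    if not title:
        print(f"Missing <title>: {url}")
    elif len(title) > 60:  # ~60 characters is a common rule of thumb before SERP truncation
        print(f"Title may be truncated in the SERP ({len(title)} chars): {url}")
    titles[title].append(url)

for title, pages in titles.items():
    if title and len(pages) > 1:
        print(f"Duplicate title '{title}' on: {', '.join(pages)}")
```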
4. Meta Descriptions
The meta description is a short paragraph of text in the HTML code of a web page that describes its content. This description appears below the page URL in the search results.
Research conducted by SEMrush reveals that 30% of sites have duplicate meta descriptions, while 25% have none at all.
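The same kind of check works for meta descriptions. The sketch below, with placeholder URLs and requests/beautifulsoup4 assumed, reports pages where the description is missing or duplicated.

```python
# Minimal sketch: report pages with a missing or duplicated meta description.
import requests
from bs4 import BeautifulSoup

seen = {}
for url in ["https://example.com/", "https://example.com/services"]:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    tag = soup.find("meta", attrs={"name": "description"})
    description = (tag.get("content") or "").strip() if tag else ""
    if not description:
        print(f"Missing meta description: {url}")
    elif description in seen:
        print(f"Duplicate meta description on {url} and {seen[description]}")
    else:
        seen[description] = url
```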
5. Internal and external broken links
The data collected shows that 35% of the sites have broken internal links that return HTTP error status codes (70% of them 4xx "page not found" errors).
Meanwhile, 25% of the sites analyzed have broken external links, which can seriously damage your site's authority. This is why it is important to keep them under control and to correct them or delete them (if the page they pointed to no longer exists).
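A page-by-page link check can also be scripted. The sketch below is illustrative only: it assumes requests and beautifulsoup4, uses a placeholder URL, and simply reports links that answer with a 4xx/5xx status or fail to respond.

```python
# Sketch of a per-page broken-link check: collect every <a href>, resolve it
# against the page URL, and report error responses.
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin, urlparse

page = "https://example.com/blog/post"
soup = BeautifulSoup(requests.get(page, timeout=10).text, "html.parser")

for a in soup.find_all("a", href=True):
    link = urljoin(page, a["href"])
    if urlparse(link).scheme not in ("http", "https"):
        continue  # skip mailto:, javascript:, and similar links
    try:
        status = requests.head(link, timeout=10, allow_redirects=True).status_code
    except requests.RequestException:
        status = None
    if status is None or status >= 400:
        kind = "internal" if urlparse(link).netloc == urlparse(page).netloc else "external"
        print(f"Broken {kind} link ({status}): {link}")
```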
6. Low Text-to-HTML ratio
On 28% of the sites analyzed, the research found a low text-to-HTML ratio.
According to SEMrush, these sites contain proportionately more HTML code than actual user-readable text; the company recommends a ratio of at least 20% as the minimum acceptable limit.
Here is a list of things you can do to improve this ratio (a quick way to measure it is sketched after the list):
- Remove whitespace
- Avoid a lot of tabs
- Remove comments in code
- Avoid tables
- Use CSS for styling and formatting
- Resize your images
- Remove any unnecessary images
- Use Javascript only if necessary
- Keep your page size under 300kb
- Remove any hidden text that is not visible to users
- Your page should always have clear, easily readable text – quality text for user information
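A quick, approximate way to measure the ratio is to divide the length of the visible text by the length of the full HTML source. The sketch below assumes requests and beautifulsoup4 and uses a placeholder URL; the 20% threshold is the one suggested above.

```python
# Approximate text-to-HTML ratio: visible text length / total HTML length.
import requests
from bs4 import BeautifulSoup

url = "https://example.com/"
html = requests.get(url, timeout=10).text
soup = BeautifulSoup(html, "html.parser")

# Drop scripts and styles so only user-readable text is counted.
for tag in soup(["script", "style"]):
    tag.decompose()

text = soup.get_text(separator=" ", strip=True)
ratio = len(text) / len(html) if html else 0
print(f"Text-to-HTML ratio: {ratio:.1%}" + (" (below 20%)" if ratio < 0.20 else ""))
```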
7. Problems with the H1 tag
It is important to know the difference between H1 tags and Title tags. The Title tag appears in search results, while the H1 tag, usually rendered as the page heading, is what users see on your page.
Let’s see some data of the sites analyzed:
- 20% had too many H1 tags on the same page
- 20% had no H1 tags
- 15% had duplicate information in Title and H1 tags.
Normally you should use only one H1 tag per page and break up long posts with H2 subheadings.
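A short script can verify both points on a page: that there is exactly one H1 and that it does not simply repeat the Title tag. The sketch below uses a placeholder URL and assumes requests and beautifulsoup4 are installed.

```python
# Sketch of an H1 audit: count H1 tags and compare the first one with <title>.
import requests
from bs4 import BeautifulSoup

url = "https://example.com/blog/post"
soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

h1s = [h1.get_text(strip=True) for h1 in soup.find_all("h1")]
title = soup.title.get_text(strip=True) if soup.title else ""

if len(h1s) == 0:
    print("No H1 tag on the page")
elif len(h1s) > 1:
    print(f"{len(h1s)} H1 tags on the same page: {h1s}")
if h1s and title and h1s[0] == title:
    print("H1 and Title tag carry exactly the same text")
```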
8. Low word count
Google increasingly favors in-depth articles in its rankings over those with thin content. Of the websites analyzed, as many as 18% had a low word count on some pages.
It is therefore important to create in-depth web pages that rank high in Google’s search results and stay there.
9. Too many on-page links
Research reveals that 15% of sites contain too many on-page links on some pages.
Having a large number of links on a page is not a problem in itself, but filling a page with unnatural links certainly is. After all, a chaotic page full of links leads to a bad user experience, especially on mobile.
As SEMrush argues, good SEO requires a natural link profile that includes high-quality links. So check the links on each page and eliminate the ones that do not provide any value to your readers or your SEO strategy.
10. Incorrect language
SEMrush found that 12% of sites omitted the declaration of the default language of the page text.
Declaring the language helps with translating and displaying the page correctly, and allows people who use text-to-speech software to hear the content read with the right pronunciation.
You can review language and country targeting in the International Targeting section of Google Search Console.
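The declaration itself lives in the HTML, as a lang attribute on the <html> element (for example, <html lang="en">). As a quick check, the sketch below (placeholder URL, requests and beautifulsoup4 assumed) reports whether a page declares it.

```python
# Minimal sketch: report whether a page declares its language via the lang
# attribute on the <html> element, e.g. <html lang="en">.
import requests
from bs4 import BeautifulSoup

url = "https://example.com/"
soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

html_tag = soup.find("html")
lang = html_tag.get("lang") if html_tag else None
print(f"Declared language: {lang}" if lang else "No lang attribute declared")
```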
11. Temporary redirects
Research has shown that 10% of the sites analyzed contain temporary redirects.
According to SEMrush, a 302 redirect can cause search engines to continue indexing the old page, while ignoring the one you’re redirecting it to.
Even though Google claims that its algorithm does not penalize 302 redirects and that, for indexing purposes, they are treated like 301s, remember that a permanent 301 redirect passes link authority to your preferred page while a temporary 302 redirect does not, so it's best to avoid the latter.
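You can spot temporary redirects by requesting each URL without following the redirect and looking at the status code. The sketch below assumes the requests package is installed and uses placeholder URLs.

```python
# Sketch that distinguishes temporary (302/307) from permanent (301/308)
# redirects by not following the redirect.
import requests

old_urls = ["https://example.com/old-page", "https://example.com/old-category"]

for url in old_urls:
    response = requests.head(url, timeout=10, allow_redirects=False)
    target = response.headers.get("Location")
    if response.status_code in (302, 307):
        print(f"Temporary redirect {response.status_code}: {url} -> {target}")
    elif response.status_code in (301, 308):
        print(f"Permanent redirect {response.status_code}: {url} -> {target}")
    else:
        print(f"No redirect ({response.status_code}): {url}")
```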
To conclude, here is the complete infographic published by SEMrush: