Google keeps releasing updates, content guidelines change, and AI is applied literally everywhere — yes, this is the reality of today. Nevertheless, some optimization points remain constant.
Today, I’m going to cover 14 SEO basics that are a must-have for any SEO strategy to beat the competition and survive through Google updates with flying colors in 2024 and beyond.
Ready, steady, go.
Every page should target a specific keyword. Still, going solely after a broad, generic search term, say, shoes, is not the best option: competition for such terms is fierce, the intent behind them is vague, and the traffic they bring rarely converts.
To avoid these traps, your aim is to refine your top-level broad keyword with modifiers. This way, you’ll be able to cover your topic in more detail, increase your chances of ranking, and make sure you attract more relevant traffic.
Besides, targeting a more focused keyword increases your chances of covering the term as an entity, i.e., describing not the keyword itself but the concept behind it. This entity-based content approach is highly appreciated by both Google and users.
Example. Your primary keyword is shoes, which has a pretty vague variety of possible content and a high Keyword Difficulty score:
But if you add a modifier, say, trekking shoes, the difficulty score gets lower. Plus, the factor/position correlation becomes more obvious.
One of the best ways to find perfect keyword modifiers (aka long-tail keywords) for your website is Rank Tracker's Keyword Research module, or rather its Autocomplete Tools section. Just enter your primary search term and enjoy the big and joyful set of keywords, all coming together with their SEO metrics (number of searches, expected visits, competition, cost per click, keyword difficulty, etc.):
Besides, try Rank Tracker's Related Searches and Related Questions methods. They work pretty similarly to Autocomplete Tools but offer more long-tail keyword options for targeted optimization.
Every keyword has a search intent behind it: the initial reason why a user types that very query. The better you understand this intent, the more relevant content you can create, and the higher your chances of improving your rankings.
The traditional approach to search intent suggests there are four types of intent:
Navigational search intent is the first stage of the marketing funnel and means a user needs to get somewhere online or offline. This type of intent is typically indicated by location mentions: McDonald's, Subway Seattle, SEO PowerSuite help center.
Sure thing, you don't need product pages or blog posts here; target home pages and Google Maps listings instead, as they will likely perform much better.
Informational search intent means that a user wants to find some kind of information to learn more about a subject, or simply to be entertained. These queries often include modifiers like what, when, or who, or are sometimes clear without any modifiers at all: who invented bricks, euro 2024 winner, Google ranking factors.
Here, informational content such as blog posts, videos, and dedicated topic pages will do best.
Commercial (or commercial investigation) intent signals that a user is getting closer to a purchase and wants more details on the product or service. This intent usually stands behind queries like best trekking shoes, top locations to see in Italy, sony vs. marshall.
To get into commercial investigation SERPs, try creating listicles and reviews — these content formats cover the investigation intent in the most detail.
Finally, we have transactional search intent. This intent means that a user is ready to buy and needs to find the best solution. Transactional queries are usually pretty long and less traffic-rich yet much more conversion-oriented: sneakers under $200 buy, optimum nutrition protein discount buy, Rank Tracker download free.
Transactional SERPs most often feature product pages, catalogs, and local results with addresses if we’re talking about offline shopping. So, do your best with product-oriented content.
The meta title and meta description are the first bite of information about your page that users see in the SERP. A well-written title and description make the snippet appealing and increase the click-through rate (i.e., traffic), so more users visit your page. And as Google treats traffic flow as a ranking signal, these meta elements become even more influential.
Besides, the meta title and description are among the first things Google reads to understand what your page is about and decide whether it matches the user's query. If it does, your page has a better chance of ranking higher.
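For reference, here is roughly what these two elements look like in a page's <head> section (the store and wording below are made up for illustration):

<head>
  <title>Trekking Shoes for Rocky Trails | Hypothetical Shoe Store</title>
  <meta name="description" content="Compare lightweight, waterproof trekking shoes tested on rocky trails. Free returns and a 30-day fit guarantee.">
</head>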
So, here are a couple of ideas on how to make your meta title and description shine bright like a diamond:
In WebSite Auditor, go to Page Audit > Content Audit and paste the URL of the page you’re optimizing and your target keyword. The tool will analyze your page alongside your top 10 SERP competitors. You will see how your target keyword is used in title tags and meta descriptions of your competitors and get more ideas for your page.
Analyze, optimize, and enjoy the result.
HTML tags remain one of the most important elements in SEO. They help Google's algorithms understand your content even in the era of growing AI usage, and thus rank it better. They enhance user experience despite being invisible to visitors, helping you earn more attractive SERP snippets. And they help you avoid issues like content duplication or crawling errors, which improves indexing.
Misusing HTML tags may even lead to manual penalties from Google; if that isn't proof of their importance, what is?
There are many, many HTML tags used in SEO in addition to the above-mentioned meta title and description. Still, let's focus on the most common and important ones.
Headings (H1-H6) are used to split your page into sections or chapters. Each heading is like a small title on the page.
H1-H6 headings cater to user experience. They help users navigate the page, thus keeping the bounce rate in check. But the best part is that H1-H6s can actually function as titles for Google.
For example, you have a page about how to assemble a bookcase, and there’s a part about how to use a screwdriver marked with the corresponding H2 tag. In this case, Google can rank your page for the query how to use a screwdriver, using this H2 title in the SERP snippet. Even if your page’s initial purpose (and, consequently, the meta title) is to talk about bookcases.
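To illustrate, the heading outline of that hypothetical bookcase page could look something like this:

<h1>How to Assemble a Bookcase</h1>
  <h2>Tools You'll Need</h2>
  <h2>How to Use a Screwdriver</h2>
    <h3>Choosing the Right Bit</h3>
  <h2>Step-by-Step Assembly</h2>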
The robots tag tells Google whether a page should be indexed or not. With its help, you can keep certain pages out of the index for whatever reason: say, a page may contain private information or still be under development. Bet you don't want anyone to see you half-dressed? Well, the same is true for your site.
The robots tag can combine several parameters (index, noindex, follow, nofollow, noarchive, nosnippet, and so on), depending on what message you want to communicate to Google.
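For example, a page that should stay out of the index, with its links left unfollowed, could carry a robots meta tag like this in its <head> (an illustrative snippet, not a recommendation for every page):

<meta name="robots" content="noindex, nofollow">

A page that should be indexed normally can simply omit the tag or use content="index, follow".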
A canonical tag tells Google which of several pages with similar or identical content is the preferred version to index. Thus, you save yourself the trouble of duplicate content issues and the manual sanctions Google can apply for them.
Another purpose of the canonical tag is to prevent keyword cannibalization — the situation when several pages from your site start competing for the same keyword, thus eating up each other’s chances to actually rank.
The canonical tag is widely used on big product sites offering variations of the same product (say, pink, white, or brown shoes). One more use case is related to international websites with various language versions of the same page — in this case, self-canonicalization is applied.
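In practice, each product variation points to the main product page with a canonical link element in its <head>; the URLs below are hypothetical:

<!-- on https://example-store.com/trekking-shoes/pink/ -->
<link rel="canonical" href="https://example-store.com/trekking-shoes/">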
In SEO, it is common to underestimate the importance of images, yet that is not the right way to go. Images improve user experience, increase your chances of ranking in image results, and make your content more helpful.
Besides, images show a pretty visible correlation with SERP positions, as one of the speakers at the Digital Olympus Event 2024 pointed out:
"The top-ranking page has 21% more images than the page ranking in position 20."
With this in mind, treat your images properly: compress them, give them descriptive file names, and write meaningful alt texts.
A good way of creating better alt texts is to upload your image to Google's Cloud Vision API and check what Google already sees on the image.
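As for the markup itself, a descriptive alt text might look like this (a made-up example):

<img src="/images/trekking-shoes-granite-trail.jpg" alt="Hiker wearing gray waterproof trekking shoes on a granite trail">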
Schema markup is the best way to directly feed your data to Google. You actually save search bots the trouble of crawling through your page by giving them all the crucial data upfront.
Structured data helps your pages win rich snippets and SERP features.
This, in turn, makes your snippet stand out and get more clicks, which positively affects rankings and your business SEO goals.
There are tons of Schemas for any type of content: recipes, articles, product listings, books, movies, hotel bookings, etc. So, your aim is to choose the most relevant Schema for your page and apply it correctly.
If you don’t feel like digging through nearly a thousand schemas, use Google Structured Data Markup Helper — its 12 schemas cover most basic needs.
Make sure to test your structured data before publishing it — Google is pretty strict about incorrect schema usage and may penalize your site if it catches you using something fishy.
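Here is a minimal, hypothetical example of Product schema added as JSON-LD; real markup would include whatever properties your page can actually support:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Trekking Shoes",
  "description": "Lightweight waterproof trekking shoes for rocky trails.",
  "offers": {
    "@type": "Offer",
    "price": "129.00",
    "priceCurrency": "USD",
    "availability": "https://schema.org/InStock"
  }
}
</script>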
Internal linking is often underestimated, yet it shouldn’t be. Proper internal linking provides efficient crawling and navigation, as both users and search bots can easily reach even the most distant pages. This consequently streamlines the PageRank flow and helps strengthen the necessary pages to make them rank better.
What's more, interlinking relevant pages helps Google build ties between topics and better understand the entity you're describing. Say, a good idea is to link from a page about pizza to a page about tomatoes, as tomatoes are actually used to make a pizza. Doing so shows Google that you cover the entity in depth, so the page's value increases.
Proper interlinking also helps wisely allocate the crawl budget, prevents the appearance of linkless orphan pages, and prolongs the time users stay on your site.
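In the markup, such a contextual link is just a regular anchor with descriptive text (the URLs are made up):

<p>Our Neapolitan <a href="/blog/pizza-dough-guide/">pizza dough</a> is topped with <a href="/blog/san-marzano-tomatoes/">San Marzano tomatoes</a> before baking.</p>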
Note. It may happen that you need to redirect your old page to the new, relevant one for whatever reason. In this case, mind the proper implementation of internal redirects. The best solution here is to apply the HTTP 301 (moved permanently) redirect type, but there are other options as well.
You may have a look at them in our comprehensive redirect guide.
Backlinks remain one of the two key pillars of a successful SEO strategy; SEOs have proved their importance many times. Besides, Google would not have introduced so many updates and link regulations if this factor were a minor thing.
Backlinks help build authority, earn link juice, and establish a brand reputation, which altogether contributes greatly to your search rankings.
So here are a few ways to equip your website with high-quality backlinks without risking a penalty from Google.
One of the best ways to get relevant and authoritative backlinks is to find backlink prospects who work with your competitors.
First, if a provider links back to several competitors, then chances are they will link back to you. Second, a prospect like this is surely relevant to your niche and trusted — your competitors have already proved this.
A nice way to find relevant prospects is to analyze your competitors’ historical backlinks via SEO SpyGlass (Historical Data).
Pay attention to your competitors' Lost Links; these are your top potential backlink providers. Besides, look at the domains' Domain InLink Rank, and check if the provided links are dofollow, as those are what you're looking for.
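If you're not sure whether a link is dofollow, peek at the page source: a nofollow link carries an explicit rel attribute, while a regular (dofollow) link does not. An illustrative snippet:

<!-- nofollow: passes no link equity -->
<a href="https://example.com/" rel="nofollow">Example</a>
<!-- no rel attribute: a regular, dofollow link -->
<a href="https://example.com/">Example</a>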
Reach out to the prospects and agree on the details of the backlink partnership.
Placing a guest post on a credible resource is a great way to build links, boost brand popularity, and prove expertise.
Reach out to relevant sources to get more details on guest post placement. For example, some websites offer a list of topics they need, while some ask you to offer the topic first. Agree on the number of links per post and check if your links are placed correctly once the post is live.
Provide enough information to include in the author's profile. If possible, add some links that prove expertise, mention professional achievements, and share if the author participated in conferences — all of that demonstrates expertise and strengthens E-E-A-T signals that also matter when it comes to page strength.
There are listicles featuring products and services in any business area, from AI technologies to dairy farms. Do research to find listicles for your niche and connect with their authors to add your product there.
Tip. You can use LinkAssistant to find guest blogging and listicle inclusion opportunities in bulk.
The tool will show relevant pages according to your preferences, display a preview of each page, and tell you whether that page mentions your competitors.
Google needs to properly see and understand your site to rank it, so do not neglect the indexing details. Otherwise, your great content and relevant backlinks will not help you.
Besides, some indexing factors have been proven to be ranking factors, so do not neglect them when building your SEO campaign.
HTTPS is a protocol for secure communication between a user and a website. It makes it harder to eavesdrop on the information you send and receive over the internet. HTTPS is the de facto standard these days, as most managed hosting providers support secure connections out of the box.
What's more, HTTPS is a proven ranking factor, as Google prioritizes websites with a secure connection. Browsers may even warn users away from sites without a valid certificate. So make sure you don't neglect HTTPS and renew your certificates regularly.
A sitemap is essentially the list of your site's pages that you want Google to index. It is submitted right in your Google Search Console. Although a sitemap works as a hint, not a directive, it is still good practice to inform Google about your pages this way.
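For reference, a bare-bones XML sitemap looks like this (the URLs and dates are hypothetical):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2024-06-01</lastmod>
  </url>
  <url>
    <loc>https://example.com/trekking-shoes/</loc>
    <lastmod>2024-05-20</lastmod>
  </url>
</urlset>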
To generate a sitemap, launch WebSite Auditor and go to Site Structure > Pages. Then, click Website Tools > Sitemap:
The tool will let you choose the pages you want to add to a sitemap, specify crawling frequency, indexing property, hreflang elements if relevant, and indexing instructions.
Once you set everything up, you can download the file and then add it to your Google Search Console admin.
Remember to update your sitemap if your website has any important changes, so Google can crawl and index the new version quickly.
Robots.txt is a file that tells search bots which parts of your site they may and may not crawl. It is also used to keep certain pages out of search for whatever reason. Robots.txt is placed in the root directory of your website.
The main thing to keep in mind here is that you shouldn't block pages you submit in your sitemap, i.e., the pages you actually want indexed. Otherwise, a page you mean to rank gets blocked by robots.txt and doesn't receive the traffic it should.
And vice versa: make sure you don't include pages blocked by robots.txt in your sitemap. Otherwise, Google (and users in search) may end up seeing pages meant to stay private (personal accounts, shopping carts, etc.).
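A basic robots.txt could look something like this (the blocked paths are purely illustrative):

User-agent: *
Disallow: /cart/
Disallow: /account/

Sitemap: https://example.com/sitemap.xml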
You can create a robots.txt file in WebSite Auditor just as you did with the sitemap. Go to Site Structure > Pages > Website Tools, and choose Robots.txt:
Slow pages bring nothing but low positions and high bounce rates. Moreover, Google may fail to see the content of a slow page and consider it empty or cloaked, which is a direct path to a penalty or even delisting.
To fix PageSpeed-related issues, make sure your pages load quickly and pass the Core Web Vitals assessment.
You can check your pages in Google’s PageSpeed Insights and Search Console or do the bulk check in WebSite Auditor.
All these tools provide fix suggestions so that you will see the opportunities for improvement.
One of the main causes of PageSpeed issues is heavy images, videos, design elements, and so on. In the era of visual perception, good-looking visuals are a must for your website. Still, mind the basics: compress your files, serve images in sizes that match how they are displayed, and lazy-load anything below the fold.
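For instance, an image that is sized correctly and lazy-loaded below the fold could be embedded like this (the file name is made up):

<img src="/images/trekking-shoes-800w.webp" width="800" height="600" loading="lazy" alt="Pair of lightweight trekking shoes on a wooden bench">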
Besides, remember that Google moved to mobile-first indexing, so having your site optimized for mobile devices is a must. Moreover, billions of users choose smartphones over PCs just because they are always at hand and allow access to information anytime.
As for the top solutions for mobile SEO, make your site design responsive so it fits any device. Don't overdo it with viewport-blocking pop-ups and ads: they turn UX into a disaster and may even get you a Google penalty if search bots fail to read the content behind the pop-up. In addition, take care of font size so the text stays readable, and please, no Comic Sans or any other hard-to-read fonts.
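On the markup side, responsive design starts with the viewport meta tag in the page's <head>:

<meta name="viewport" content="width=device-width, initial-scale=1">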
Your URLs have to be user-friendly, i.e., users should understand what a page is about just by looking at the link. Google also appreciates this kind of structure, as it prevents confusion during crawling and gives Googlebot a hint about the page's content.
To make your URLs user- (and Google) friendly, make sure to include page-relevant keywords in the address. Say, if a page is about carrots, then the URL has to feature the keyword carrots.
Besides, keep URLs short and avoid special characters. Long addresses full of various symbols may be harder to process, slow down indexing, and look super-spammy.
Many product websites run into dynamic URL issues, where different URLs point to the same page depending on how the page has been reached. If that's your case, mark the "clean," parameter-free URL as canonical, so Google knows which one to index.
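So, if a parameter-laden URL serves the same content as the clean one, its canonical element should point to the clean address (the URLs are hypothetical):

<!-- served at https://example-store.com/shoes/?color=pink&sessionid=42 -->
<link rel="canonical" href="https://example-store.com/shoes/">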
A site structure is considered good when it is shallow. This means that all the end destination pages should be no more than three clicks away from the homepage.
A shallow site structure has a positive impact on your site's search performance. First, your crawl budget is allocated wisely, so you can be sure Google sees and indexes all the necessary pages. Second, it is much easier to spread PageRank when its flow has fewer paths (i.e., URLs) to follow to reach the destination.
If destination pages are buried too deep, they may run into indexing problems. Google's crawlers draw parallels between the depth of a page and its importance, so a page located deep in the structure is considered less important. Thus, it is less likely to be recrawled on time and appear in SERPs.
Tip 1. Use the Click Depth column of WebSite Auditor to quickly assess the click depth of your pages.
Besides, a shallow site structure lets you add new pages without affecting the overall structure and click depth, so you will have enough room for new content. It also helps you stay free of orphan pages: it's much easier to spot them in time when there's order, not chaos.
Tip 2. Go to the Visualization module to have a clear view of your site structure and spot distant or lost pages at a glance.
These days, best content creation practices are mostly focused on writing valuable content for people, not search engines. Besides, as AI writing tools become more and more widespread, expertise and unique insights become an issue as well, especially in areas where real-life experience is of great importance.
Recent Google updates have also proved that content has to be useful and expert. Bet you remember that many, many websites lost traffic and positions because Google found their content spammy.
Probably, I will not be the first one to tell you to “write clear, thoughtful content as if you’re talking to a live person”. But this tip is still relevant, even if you use AI tools as writing assistants. Human supervision is important anyway.
Make sure your pages are free from keyword stuffing — this is a strong spam indicator, and Google will hardly like content like this.
Tip. To spot keyword stuffing (and other content-related issues) and do the fixing, analyze your page in WebSite Auditor’s Content Editor module:
Do not write tons of text for the sake of word count — your page has to be useful, not long. If it feels like you’ve explained everything you need in 400 words, then ok, let it be.
At the same time, creating overly thin pages is not a good idea, either. Try to write at least 300 words; according to common SEO practice, pages with fewer than 300 words are likely to be treated as thin content. Otherwise, you run the risk of getting a thin content penalty and being deranked.
A nice habit is to regularly monitor and audit your website's Google Analytics and Search Console to spot any suspicious search behavior in time and investigate it. Doing so will save you the trouble of a long recovery process if you get hit by a Google update or suffer a negative SEO attack.
In addition to your site, it’s useful to monitor your search competition to spot similar patterns in position fluctuation. Or, vice versa, see that competitors’ sites sailed through without any losses and investigate what helped them.
Sure thing, you cannot check your competitors' GSC and Analytics (at least, not legally), but you can easily monitor their sites in Rank Tracker to spot ranking changes and draw parallels with what's happening to your own site.
Go to Rank Tracking > Tracked Keywords, add competitors to your workspace, and monitor their rankings, position difference, and visibility changes together with yours:
The SEO industry reacts to technological progress and keeps developing accordingly. Still, these SEO tactics are a gold standard that remains relevant today and will stay relevant in the future. If something changes, we will add the new info ASAP so you can quickly optimize your site using the top SEO practices.
By the way, what tactic did you benefit from the most? Share your experience in our Facebook community.