What is On-Page SEO?
On-Page SEO, also known as on-site SEO, is the practice of optimizing each webpage of a website, both its content and its HTML, to earn a top ranking in search results. It covers tasks such as optimizing the content, writing Meta tags around your targeted keywords, structuring heading tags, cleaning up permalinks, and more.
Why Is On-Page SEO Important?
As we all know, On-Page SEO plays the most important role in earning a higher rank in Google search results. Google's fundamentals have not changed: it still looks for the pages most relevant to the keyword in a user's search query. The various On-Page tasks included in this guide are as follows.
Step by Step On-Page Optimization Guide:
- Keyword Research: This is the very first task you should focus on. Find the most relevant keywords for every webpage of your website according to your niche. If your domain is new, start with long-tail keywords. Next, check the search volume of every keyword, which shows how popular and competitive it is. For example, in the screenshot below you can see that a Google search for the keyword “SEO” returns about 584,000,000 results.
- Meta-Title Optimization: The Meta Title is the HTML element that specifies the title of a web page. After completing proper keyword research, the next task is to write a Meta Title for every webpage of your site and upload it in the backend, for example with the Yoast SEO plugin. Keep your title between 55 and 60 characters in length so Google displays it in full.
Meta Title Example:
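As an illustration (the title text here is made up), the Meta Title is set with the `<title>` element inside the page's `<head>`:

```html
<head>
  <!-- Meta Title: shown as the clickable headline in search results.
       Kept under 60 characters, per the guideline above. -->
  <title>On-Page SEO Guide: Steps to Rank Higher on Google</title>
</head>
```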
- Meta-description Optimization: Next, write an appropriate Meta description to go with each Meta title, using relevant keywords in the description, and upload it in the backend with the Yoast SEO plugin. Keep the Meta description between 155 and 160 characters in length.
Meta Description Example:
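For example (the description text is illustrative), the Meta description is set with a `<meta>` tag in the `<head>`:

```html
<head>
  <!-- Meta Description: the snippet shown under the title in search results.
       Keep it between 155 and 160 characters. -->
  <meta name="description" content="Learn on-page SEO step by step: keyword research, meta tags, heading tags, image alt text, internal links, permalinks, sitemaps, robots.txt and schema markup.">
</head>
```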
- Heading Tags Optimization: Heading tags are also a very important part of SEO and help you earn the best ranking on Google for your keywords. Make sure every webpage has properly formatted heading tags (H1, H2, H3, ..., H6), and use your keywords in the headings where they fit the page content.
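A properly nested heading structure (the heading text here is only an example) looks like this:

```html
<h1>On-Page SEO Guide</h1>           <!-- one H1 per page, containing the main keyword -->
<h2>Keyword Research</h2>            <!-- H2 for each major section -->
<h3>Finding Long-Tail Keywords</h3>  <!-- H3 for sub-sections under an H2 -->
```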
- Image Optimization: Image optimization helps your site appear in image search as well. For image optimization, add an Alt Tag (alternative text) to the media images on your site, and it is best to use your keywords in the alt text where they describe the image.
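For instance (the filename and alt text are placeholders), the Alt Tag is the `alt` attribute on the `<img>` element:

```html
<!-- alt describes the image and includes a relevant keyword -->
<img src="seo-checklist.png" alt="On-page SEO checklist infographic" width="800" height="600">
```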
- Internal Linking: Internal linking is the process of adding a link that points from one page to another page on the same site. Internal links should use keyword-rich anchor text pointing to a relevant page. Use only 2 internal links per page.
- External Linking: External linking means adding a link that points from your page to another website. The anchor text does not need to be keyword-rich as long as it is relevant to the destination URL. Use only 1 external link per page.
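The two link types above differ only in where they point (the URLs and anchor text below are made-up examples):

```html
<!-- Internal link: same site, keyword-rich anchor text -->
<a href="https://www.example.com/keyword-research-guide/">keyword research guide</a>

<!-- External link: another site, anchor text simply relevant to the destination -->
<a href="https://www.othersite.com/seo-basics/">a beginner's guide to SEO</a>
```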
- Permalink Structure Optimization: The permalink is the URL of the page. Because the permalink helps Google understand the content of your page, always include your main (focus) keyword in the URL. Do not use random numbers or underscores “_”, separate words with hyphens, and keep the URL entirely lowercase.
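To illustrate the rules above with a hypothetical page:

```
Poor permalink:   https://www.example.com/page_123_On_Page_SEO
Better permalink: https://www.example.com/on-page-seo-guide/
```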
- Sitemaps: Create sitemaps for your site that include all of its pages. Generate all of the sitemap files (sitemap.xml, sitemap.html, ror.xml, and urllist.txt) and upload them to the server.
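A minimal sitemap.xml (the URLs and dates are placeholders) follows the standard sitemap protocol:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/on-page-seo-guide/</loc>
    <lastmod>2024-01-10</lastmod>
  </url>
</urlset>
```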
- Robots.txt File Optimization: Robots.txt is a text file that tells web robots (search engine crawlers) which pages on your site to crawl and which pages not to crawl. Generate a robots.txt file and upload it to the root of your domain.
Here are some examples of robots.txt in action for a www.example.com site:
Robots.txt file URL: www.example.com/robots.txt
- Blocking all web crawlers from all content
- User-agent: *
  Disallow: /
- The above syntax tells all the web crawlers not to crawl any pages on www.example.com, including the homepage.
- Allowing all web crawlers access to all content
- User-agent: *
  Disallow:
- Using the above syntax in a robots.txt file tells web crawlers to crawl all pages on www.example.com, including the homepage.
- Blocking a specific web crawler from a specific folder
- User-agent: Googlebot
  Disallow: /example-subfolder/
- This syntax blocks a specific bot from crawling part of your site: the above example tells only Google’s crawler not to crawl any pages under www.example.com/example-subfolder/.
- Schema Markup: Schema Markup is code (a semantic vocabulary, typically microdata or JSON-LD) that you add to your website’s HTML to help boost its ranking. It helps search engines show detailed and correct information about you in SERPs (search engine results pages), such as organization details and reviews from other sources. Schema tells search engines plainly what your website or a particular page is about, so they can understand it better and return more accurate, informative results for users.
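As a sketch, organization details can be marked up with a JSON-LD script in the page's HTML (the organization name, URL, and logo path are placeholders):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example Company",
  "url": "https://www.example.com",
  "logo": "https://www.example.com/logo.png"
}
</script>
```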
So, those are the most important tactics you can use to earn the best ranking for your website in Google search results. Don’t wait: start with the step-by-step guide and share your experience. Please let us know in the comments how this blog helped you.