GOOGLE WEBMASTERS
Google Webmaster Tools (GWT) is the primary mechanism for Google to communicate with webmasters. It helps you identify issues with your site and can even let you know if it has been infected with malware (not something you ever want to see, but if you haven’t spotted it yourself, or had one of your users tweet at you to let you know, it’s invaluable). GWT also lets you evaluate and maintain your website’s performance in search results. Offered as a free service to anyone who owns a website, GWT is a conduit of information from the largest search engine in the world to you, offering insights into how it sees your website and helping you uncover issues that need fixing. You do not need to use GWT for your website to appear in search results, but it can offer valuable information that can help with your marketing efforts.
How can GWT help monitor your website’s performance?
- It verifies that Google can access the content on your website.
- GWT makes it possible to submit new pages and posts for Google to crawl, and to request removal of content you don’t want search engine users to discover.
- It helps you deliver and evaluate content that offers users a more visual experience.
- You can maintain your website without disrupting its presence in search results.
- It allows you to discover and eliminate malware or spam problems that may not be easily found through other means.
IMPLEMENTATION OF SITEMAP
A sitemap is a model of a website’s content designed to help both users and search engines navigate the site. A sitemap can be a hierarchical list of pages (with links) organized by topic, an organization chart, or an XML document that provides instructions to search engine crawlers. The Sitemaps protocol allows a webmaster to inform search engines about URLs on a website that are available for crawling.
- Generate your website sitemap using an online sitemap generator. Enter your website URL and start the process.
- Once the process is completed, it will generate a sitemap.xml file.
- Upload the sitemap.xml file to your root directory and enter its path in GWT to submit the sitemap (a minimal example of the file’s contents is shown below).
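A minimal sitemap.xml, following the Sitemaps protocol, might look something like the sketch below. The domain, paths, and dates here are placeholders and would be replaced with your own URLs:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://mydomain.com/</loc>
    <lastmod>2015-01-01</lastmod>
  </url>
  <url>
    <loc>http://mydomain.com/about/</loc>
  </url>
</urlset>

Each <url> entry lists one page; the optional <lastmod> tag tells crawlers when the page was last changed.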
IMPLEMENTATION OF ROBOTS.TXT
Robots.txt is a text (not HTML) file you put on your site to tell search robots which pages you would like them not to visit. Robots.txt is by no means mandatory for search engines, but generally search engines obey what they are asked not to do. It is important to clarify that robots.txt is not a way of preventing search engines from crawling your site (i.e. it is not a firewall or a kind of password protection); putting up a robots.txt file is something like putting a note “Please, do not enter” on an unlocked door – you cannot prevent thieves from coming in, but the good guys will not open the door and enter. That is why, if you have really sensitive data, it is naïve to rely on robots.txt to protect it from being indexed and displayed in search results.
The location of robots.txt is very important. It must be in the main directory because otherwise, user agents (search engines) will not be able to find it – they do not search the whole site for a file named robots.txt. Instead, they look first in the main directory (i.e. http://mydomain.com/robots.txt) and if they don’t find it there, they simply assume that this site does not have a robots.txt file and therefore they index everything they find along the way.
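For example, assuming your site lives at http://mydomain.com (the domain here is just a placeholder), crawlers will only request the file at the root of the domain:

http://mydomain.com/robots.txt – found and obeyed
http://mydomain.com/blog/robots.txt – never requested; crawlers do not look here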
Structure of a Robots.txt File
The structure of a robots.txt file is pretty simple (and not very flexible) – it is simply a list of user agents and the files and directories to be disallowed. Basically, the syntax is as follows:
User-agent:
Disallow:
“User-agent:” specifies the search engine crawlers the rule applies to, and “Disallow:” lists the files and directories to be excluded from indexing. In addition to “User-agent:” and “Disallow:” entries, you can include comment lines – just put the # sign at the beginning of the line:
- All user agents are disallowed from the /temp directory:
User-agent: *
Disallow: /temp/
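As a slightly fuller sketch (the crawler name and directory names below are only examples), a robots.txt file can combine comment lines, rules for a specific crawler, and rules for all other crawlers:

# Keep Googlebot out of the /images directory
User-agent: Googlebot
Disallow: /images/

# All other crawlers: stay out of /temp and /cgi-bin
User-agent: *
Disallow: /temp/
Disallow: /cgi-bin/

Each User-agent block applies only to the crawler it names; the * block applies to every crawler not matched by a more specific block.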