SEO Interview Questions Part 2

Q: What is Black Hat SEO?
A: The race among websites to attain high rankings has given rise to a number of methods and approaches for achieving this goal. These methods can be divided into two groups depending on their acceptability to search engines.
White Hat SEO - SEO methods that conform to search engine guidelines are called White Hat.
Black Hat SEO - SEO techniques that search engines consider less acceptable are called Black Hat.

Q: What are the other methods to restrict a webpage from the search index?
A: We can use the noindex meta tag on any page we do not want indexed in Google search; Google will still crawl the page but will leave it out of its index.
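For example, the directive is placed inside the head section of the page that should be kept out of the index:

<meta name="robots" content="noindex">

The same effect can be achieved for non-HTML resources such as PDFs by sending the X-Robots-Tag: noindex HTTP response header instead.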

Q: What exactly do you understand from the Panda update?
A: The Panda update was rolled out to improve the quality of Google's search results. Also known as the Farmer update, it appeared to target content farms that offered a poor user experience. It used machine-learning scalability as one of the important metrics for judging the relevancy of a web page. The focus shifted to the user: quality content, proper design, good page speed, suitable use of images and videos, and the content-to-ad ratio all mattered more after the Panda update. You should optimise your site for a higher click-through rate and a lower bounce rate.

Q: What areas do you think are currently the most important in organically ranking a website?
A: Text on the page! Search engines use text, and only text, in providing search results. That textual content can be found in many places, including the URL and the title of your pages as well as the visible text you place on your webpages.
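As a simple illustration (the URL and wording below are hypothetical), the same topical text can appear in all three places at once:

URL: https://www.example.com/seo-interview-questions
<title>SEO Interview Questions and Answers</title>
<h1>SEO Interview Questions</h1>
<p>Commonly asked SEO interview questions, with sample answers...</p>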

Q: Why is a Robots.txt File Used?
A: A robots.txt file is used to prevent the indexing of duplicate content on a website or to keep web robots out of a particular area of a website. Martijn Koster invented this file in the early 1990s, and it came into wider use as the well-known search engines developed. If a webmaster or owner of a website wants to direct the web robots, he must create a robots.txt file and provide exact guidelines for the robots to read before they access other files on the site. If the file is not used, web robots will simply conclude that no explicit instructions are available for the site. Also, if a website has several subdomains, a separate robots.txt file must be used for each subdomain.
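A minimal sketch of such a file, placed at the root of the domain (the /duplicate-content/ path and sitemap URL are hypothetical examples):

User-agent: *
Disallow: /duplicate-content/

Sitemap: https://www.example.com/sitemap.xml

This tells every crawler (User-agent: *) not to crawl anything under /duplicate-content/, while the optional Sitemap line points robots to the pages that should be crawled.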
