Wapka Robots.txt Code to Get Huge Traffic in Search Engines
What Is Robots.txt?
The Robots Exclusion Standard, also known as the Robots Exclusion Protocol or robots.txt protocol, is a convention for advising cooperating web crawlers and other web robots which parts of an otherwise publicly viewable website they may access. Robots are often used by search engines to categorize and archive websites, or by webmasters to proofread source code. The standard is different from, but can be used in conjunction with, Sitemaps, a robot inclusion standard for websites.
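Before configuring anything on Wapka, it can help to see how a cooperating crawler actually reads these rules. Here is a minimal Python sketch using the standard library's urllib.robotparser; the rules and example.com URLs are illustrative placeholders, not the Wapka code shown further down:

from urllib.robotparser import RobotFileParser

# Illustrative rules: block /private/ for all robots, allow everything else.
rules = [
    "User-agent: *",
    "Disallow: /private/",
    "Allow: /",
]

rp = RobotFileParser()
rp.parse(rules)

# A well-behaved robot calls can_fetch() before requesting a URL.
print(rp.can_fetch("*", "http://example.com/private/page.html"))  # False
print(rp.can_fetch("*", "http://example.com/index.html"))         # True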
How to Configure Robots.txt on a Wapka Site:
1. Go to Wapka E.S (Edit Site) >> Global Settings.
2. Select Head Tags (meta, style, ...).
3. Click on Edit Robots File.
4. Insert the code below into it.
Here is the robots.txt code:
User-agent: Mediapartners-Google
Disallow: /

User-agent: *
Allow: /

Sitemap: http://yoursitename.wapka.mobi/sitemap.xml
Note: Replace yoursitename.wapka.mobi with your own site URL.
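In plain terms: the first group tells Mediapartners-Google, the crawler Google uses for AdSense, to stay away from the entire site (Disallow: /); the second group (User-agent: * with Allow: /) lets every other robot, including Google's search crawler, fetch all of your pages; and the Sitemap line tells crawlers where to find your sitemap. One caveat worth knowing, though it is not part of the original code: if you display AdSense ads, blocking Mediapartners-Google prevents Google from reading your pages to target ads, so in that case you would change that group's Disallow value to empty (Disallow: with nothing after the colon).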
Click on Save and wait for Google's spiders to crawl your site.
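Once saved, you can confirm the file is actually live by opening http://yoursitename.wapka.mobi/robots.txt in a browser, or with a short Python check like the sketch below (the domain is the same placeholder as above, so swap in your own):

import urllib.request

# Placeholder domain from this tutorial; replace with your own site URL.
url = "http://yoursitename.wapka.mobi/robots.txt"

with urllib.request.urlopen(url) as resp:
    print(resp.read().decode("utf-8"))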