robots.txt
If you want search engines such as Google, Bing, Yahoo, and DuckDuckGo to crawl and index your site properly, start with a "robots.txt" file.
What is robots.txt?
A robots.txt file is used to manage crawler traffic to your site. It is not a program; it is a plain-text file at the root of your domain containing rules that tell search engine crawlers which pages they may or may not request. Note that Chrome, Firefox, Safari, and Edge are browsers, not search engines; the crawlers that honour robots.txt belong to search engines such as:
• Google
• Bing
• Yahoo
• DuckDuckGo
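As a small sketch of "managing crawler traffic", the rules below keep every crawler out of one folder while still allowing the rest of the site (the /private/ path is only an illustration, not something Blogger requires):

```
User-agent: *
Disallow: /private/
Allow: /
```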
Where can I find robots.txt?
If you use a hosted platform such as Blogger or WordPress, this is for you. Follow these steps to find the robots.txt setting on your Blogger website:
1. Go to your Blogger Settings.
2. Scroll to Crawlers and indexing.
3. Enable Custom robots.txt.
How to index your website in search engines
After enabling Custom robots.txt, click Custom robots.txt
and paste this:
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: https://yourURL.com/sitemap.xml
In the Sitemap line, replace the URL with your own domain followed by /sitemap.xml (YourSite/sitemap.xml).
The /sitemap.xml suffix is very important: it is where crawlers look for the list of pages you want indexed.
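You can sanity-check the rules above locally with Python's standard-library urllib.robotparser before relying on them. The yourURL.com domain is just the placeholder from the snippet; substitute your own site:

```python
from urllib.robotparser import RobotFileParser

# The same rules as the snippet above, parsed locally -- no network needed.
rules = """\
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Ordinary crawlers may fetch the homepage but not search-result pages.
print(rp.can_fetch("*", "https://yourURL.com/"))            # True
print(rp.can_fetch("*", "https://yourURL.com/search?q=x"))  # False

# The AdSense crawler (Mediapartners-Google) has an empty Disallow,
# so it is allowed everywhere.
print(rp.can_fetch("Mediapartners-Google", "https://yourURL.com/search?q=x"))  # True
```

This confirms that Disallow: /search blocks Blogger's auto-generated search and label pages (which would otherwise be crawled as duplicate content) while the rest of the blog stays open to crawlers.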
