What is robots.txt? What is its use on a blog? How can we use it according to our own choices and preferences? If you want to know what all of this means, this post is just for you. In this post I will show you how to add Custom Robots.txt to your Blogger blog, without any hassle.
What Is Robots.txt?
Robots.txt is a set of lines, definitions, and commands written for search engines and other bots, and used for your blog. Through robots.txt, Google gives us control over our published content: which content we want indexed (that is, shown) on which search engine, which images or other files we do or don't want indexed, and much more.
For example, suppose you publish a blog post. You can let Google index that particular post but not the images used in it. You can let Yahoo index the same post but not Google. Robots.txt gives us many such controls, and it is very easy to use.
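The kind of per-crawler control described above can be sketched in robots.txt itself. A minimal example, assuming you want Google's image crawler (Googlebot-Image) kept away from your blog's images while regular Googlebot still crawls your posts:

```
User-agent: Googlebot-Image
Disallow: /

User-agent: Googlebot
Allow: /
```

Each User-agent group applies only to the named bot; crawlers not named here follow the group for User-agent: * instead.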

What Is Custom Robots.txt in Blogger?
So friends, in the paragraphs above I have already told you about robots.txt; now let me tell you about Blogger's Custom Robots.txt. Whenever you create a new blog on Blogger, a default robots.txt is already added to it. You then modify it to suit your needs, and that modified (edited) text is what we call Custom Robots.txt. Below I will explain what that text means and also give the steps to set it up.
How to Add Custom Robots.txt in Blogger (Complete Steps)
Here I will give you the steps to add it, and along the way explain the text in detail, so that you understand it fully and can use it on your blog according to your own preferences.
- First, open Blogger and log in to your account.
- Now click Settings, then click the option below it, 'Search Preferences'.
- Under Search Preferences you will see the 'Custom robots.txt' option. It will be disabled; you need to enable it.
- Click the 'Edit' button next to it and copy-paste the text given below.
- After pasting, click 'Save Changes', and your robots.txt is added.

User-agent: *
Disallow: /search
Allow: /
Sitemap: http://www.a2zinhindi.com/sitemap.xml
Note - Don't forget to replace 'www.a2zinhindi.com' with your own domain; enter your domain there instead.
Now I will try to explain what those lines mean, and believe me, it is very easy.
1st Line - User-agent: *
This means the robots.txt applies to all bots, not just a particular search engine's bot; the * is a wildcard. If you want, you can instead target a specific crawler such as Googlebot, Bingbot, or YandexBot.
2nd Line - Disallow: /search
Here you can clearly see that Disallow is followed by /search. This means we don't want any URLs under that path crawled. When users search for something on your blog, those search-result pages (on Blogger they live under /search, as do label pages) can get indexed too, even though they add nothing of value. This line keeps those pages out of search results.
3rd Line - Allow: /
This means we want our naked domain, i.e. the homepage, and everything under it to be indexed on Google. Here / stands for www.domain.com/ and all its other pages and posts. With this line you are allowing all of your blog's posts and pages to be indexed at once. To keep a particular page or post out, you can add a Disallow line for it, just as the 2nd line does for search pages.
4th Line - Sitemap: http://www.a2zinhindi.com/sitemap.xml
This is the link to our Blogger blog's sitemap. By providing it, we hand search engines a list of all the posts published on the blog, and that list keeps updating automatically. This way search engines keep getting updates about every post published on your blog and index them during crawling.
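The four rules shown above can be sanity-checked offline with Python's standard-library robots.txt parser. A small sketch, assuming the post URL is a made-up example:

```python
import urllib.robotparser

# The exact rules from the robots.txt above.
rules = """\
User-agent: *
Disallow: /search
Allow: /
Sitemap: http://www.a2zinhindi.com/sitemap.xml
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Search-result pages fall under "Disallow: /search", so bots are blocked:
print(rp.can_fetch("*", "http://www.a2zinhindi.com/search?q=seo"))       # False
# The homepage and ordinary posts fall under "Allow: /":
print(rp.can_fetch("*", "http://www.a2zinhindi.com/"))                   # True
print(rp.can_fetch("*", "http://www.a2zinhindi.com/2017/11/post.html"))  # True
```

Once your changes are saved, the same parser can check the live file instead: call rp.set_url("http://www.yourdomain.com/robots.txt") followed by rp.read().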
Check - Fast Loading Blogger Templates
So friends, that was all about Blogger's Custom Robots.txt. I have kept this post very simple and easy to understand, but even so I know you will still have all kinds of doubts and questions.
You can ask your doubts and questions in the comment section below, and I will be very happy to help you.
How to Add Custom Robots.txt in Blogger?
Reviewed by Vipin Mishra on 28 November