It’s more commonly used to block search crawlers from certain parts of publishers’ sites.
But the most-visited newspaper website in the UK is using its robots.txt file as a clever hiring tool, as eagle-eyed Malcolm Coles spotted…
Part of the file, which is usually read only by crawler code, reads:
Disallow: /home/ireland/
Disallow: /home/scotland/
# August 12th, MailOnline are looking for a talented SEO Manager so if you found this then you’re the kind of techie we need!
# Send your CV to holly dot ward at mailonline dot co dot uk
# Begin standard rules
# Apply rules to all user agents updated 08/06/08
ACAP-crawler: *
A good way to filter for suitably technical applicants.
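The trick works because crawlers ignore `#` comment lines entirely, so the job ad is visible only to a human reading the raw file. A minimal sketch with Python's standard `urllib.robotparser` shows this: the hypothetical comment line below (not from MailOnline's actual file) has no effect on what gets blocked, and note that the sketch uses the standard `User-agent` directive rather than the nonstandard `ACAP-crawler` line in MailOnline's file.

```python
import urllib.robotparser

# Hypothetical lines mimicking the structure of MailOnline's robots.txt.
# The "#" line is skipped by the parser; only directives are honoured.
robots_lines = [
    "User-agent: *",
    "Disallow: /home/ireland/",
    "Disallow: /home/scotland/",
    "# Hidden note: only a human reading this raw file will ever see it.",
]

parser = urllib.robotparser.RobotFileParser()
parser.parse(robots_lines)

print(parser.can_fetch("*", "/home/ireland/page.html"))  # blocked by Disallow
print(parser.can_fetch("*", "/news/story.html"))         # not blocked
```

Running this prints `False` then `True`: the comment line changes nothing for the crawler, which is exactly why it makes a safe place to hide a message for humans.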