Robots are autonomous programs that scan a site and monitor its content. Some services, such as Google, use them to index web content automatically. Content scanned by a robot is usually stored and can be used at a later date.
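
As a rough illustration, the sketch below shows what a robot does at its simplest: fetch a page, store its content for later use, and collect the links it will visit next. The URL https://example.com/ and the in-memory index dictionary are placeholders for this example, not part of any real crawler or service.

```python
# A minimal sketch of a robot (crawler): fetch a page, keep its content,
# and extract the links found on it. example.com is a placeholder URL.
from html.parser import HTMLParser
from urllib.request import urlopen


class LinkCollector(HTMLParser):
    """Collects href values from anchor tags while parsing HTML."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def scan_page(url):
    """Fetch a page and return its content plus the links it contains."""
    html = urlopen(url).read().decode("utf-8", errors="replace")
    parser = LinkCollector()
    parser.feed(html)
    return html, parser.links


if __name__ == "__main__":
    # The stored content can be indexed or reused at a later date.
    index = {}
    url = "https://example.com/"
    content, links = scan_page(url)
    index[url] = content
    print(f"Stored {len(content)} characters, found {len(links)} links")
```

A full robot would repeat this step for each collected link, which is how an indexer such as Google's eventually covers a whole site.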