As Google's crawlers (Googlebot) traverse the web, they take a snapshot of each page they visit. These cached copies are stored in Google's index and can serve as a backup when the original page is slow or unavailable.
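Cached snapshots were historically reachable through a well-known lookup URL using the `cache:` operator. As a minimal sketch, the hypothetical helper below builds such a lookup URL from a page address; the host and query format shown are the pattern Google's cached-page links historically used, not a currently supported API.

```python
from urllib.parse import quote


def google_cache_url(page_url: str) -> str:
    """Build the historical Google cache lookup URL for a page.

    Hypothetical helper for illustration: percent-encodes the target
    URL and appends it to the `cache:` query Google's cached-page
    links historically used.
    """
    return (
        "https://webcache.googleusercontent.com/search?q=cache:"
        + quote(page_url, safe="")
    )


print(google_cache_url("https://example.com/article"))
# → https://webcache.googleusercontent.com/search?q=cache:https%3A%2F%2Fexample.com%2Farticle
```

Requesting a snapshot this way returned Google's stored copy of the page rather than contacting the original server.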