Googlebot, Google's web crawler (often called the Google spider), traverses the internet and examines websites. On each site it visits, it inspects the internal and external links, checking that the external links work and do not point to spam sites. It catalogs and indexes what it finds on every page it crawls. The crawler can only reach pages that are linked to; it cannot enter pages or sites that require a username and password. If you cannot open a link by clicking on it, neither can Google. Imagine opening a page on Wikipedia, clicking every link on it, and then clicking every link on each page that opens: that is essentially what Google does. Google favors links that are relevant and genuine rather than malicious or spammy, and if it finds a set of spam links on a site, it may penalize that site.
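The crawl-and-follow behavior described above amounts to a breadth-first traversal of a link graph. Here is a minimal sketch of that idea in Python. The `SITE` dictionary, its page names, and the use of `None` to stand in for a login-protected page are all made up for illustration; a real crawler would fetch pages over HTTP rather than from memory.

```python
from collections import deque
from html.parser import HTMLParser


class LinkExtractor(HTMLParser):
    """Collect href targets from <a> tags, as a crawler would."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


# Hypothetical in-memory "site": page name -> HTML body.
# A body of None stands in for a login-protected page the
# crawler cannot fetch, so it has no content to parse.
SITE = {
    "home":    '<a href="about">About</a> <a href="blog">Blog</a>',
    "about":   '<a href="home">Home</a>',
    "blog":    '<a href="home">Home</a> <a href="members">Members</a>',
    "members": None,  # behind a login wall
}


def crawl(start):
    """Breadth-first crawl: index each reachable, public page once."""
    seen, queue, indexed = {start}, deque([start]), []
    while queue:
        page = queue.popleft()
        body = SITE.get(page)
        if body is None:       # login-protected or missing: skip it
            continue
        indexed.append(page)   # "index" the page's content
        parser = LinkExtractor()
        parser.feed(body)
        for link in parser.links:
            if link not in seen:   # visit each page only once
                seen.add(link)
                queue.append(link)
    return indexed


print(crawl("home"))  # → ['home', 'about', 'blog']
```

Note that `members` is discovered (it is linked from `blog`) but never indexed, mirroring how Google can see a link to a password-protected page yet cannot read what is behind it.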
Debugging is twice as hard as writing the code in the first place. Therefore, if you write the code as cleverly as possible, you are, by definition, not smart enough to debug it. (Brian W. Kernighan)