The process by which a search engine collects, sorts and stores data about web pages in its index
The index must be constantly updated to make sure there are no broken links, old sites are removed and new sites are added
Search engines use web crawlers (spiders): programs that crawl across websites collecting information such as keywords, phrases and metadata. They get through multiple websites by following each site's internal and external links.
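As a rough illustration, here is a minimal crawler sketch in Python using only the standard library. The start URL, the page limit and the keyword-collection step are illustrative assumptions, not any real search engine's implementation.

```python
# A minimal sketch of a web crawler (spider), standard library only.
# The start URL and the keyword-collection step are illustrative
# assumptions, not any real search engine's implementation.
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen

class LinkAndMetaParser(HTMLParser):
    """Collects <a href> links and <meta name="keywords"> content."""
    def __init__(self):
        super().__init__()
        self.links = []
        self.keywords = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "a" and attrs.get("href"):
            self.links.append(attrs["href"])
        elif tag == "meta" and attrs.get("name") == "keywords":
            self.keywords.append(attrs.get("content") or "")

def crawl(start_url, limit=10):
    """Breadth-first crawl: fetch a page, record its metadata in the
    index, then follow its internal and external links."""
    queue, seen, index = deque([start_url]), set(), {}
    while queue and len(seen) < limit:
        url = queue.popleft()
        if url in seen:
            continue
        seen.add(url)
        try:
            html = urlopen(url, timeout=5).read().decode("utf-8", "replace")
        except OSError:
            continue  # broken link: skip it so the index stays clean
        parser = LinkAndMetaParser()
        parser.feed(html)
        index[url] = parser.keywords
        # urljoin resolves relative (internal) links against the page URL.
        queue.extend(urljoin(url, link) for link in parser.links)
    return index

print(crawl("https://example.com"))
```

Each fetched page's metadata goes into the index, and newly discovered links join the queue - that is how a crawler gets from one website to the next.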
An algorithm that checks the number of links to a website and their quality (how trustworthy the linking pages are) in order to determine the website's importance. A page's PageRank depends on the factors below (combined in the formula sketched after this list):
- How many inbound links from other web pages it has
- The PageRank of the web pages that link to it
- The damping factor
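For reference, a sketch of the commonly quoted form of the formula, where d is the damping factor, N is the total number of pages, T1..Tn are the pages linking to A, and C(Ti) is the number of outbound links on Ti (this is the normalised variant whose ranks sum to 1):

PR(A) = (1 - d)/N + d * (PR(T1)/C(T1) + ... + PR(Tn)/C(Tn))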
Determines the importance of a web page based on its incoming links
Higher damping factor - the PageRank is more heavily influenced by its inbound links
Lower damping factor - the PageRank is less influenced by its inbound links
So that the PageRank values can settle on (converge to) their true values - each page's rank depends on the ranks of the pages linking to it, so the calculation must be repeated until the values stop changing (see the sketch below)
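A minimal sketch of this iteration in Python. The toy three-page graph, the damping factor and the tolerance are illustrative assumptions, and the sketch assumes every page has at least one outbound link.

```python
# A minimal sketch of iterative PageRank on a toy three-page web.
# The graph, damping factor d and tolerance are illustrative assumptions;
# the sketch also assumes every page has at least one outbound link.
def pagerank(links, d=0.85, tol=1e-6):
    """links maps each page to the list of pages it links out to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1 / n for p in pages}  # start from a uniform guess
    while True:
        new_rank = {}
        for p in pages:
            # Each inbound link contributes its page's rank, split
            # across that page's outbound links.
            inbound = sum(rank[q] / len(links[q])
                          for q in pages if p in links[q])
            new_rank[p] = (1 - d) / n + d * inbound
        # Stop once the values have settled on (converged to) stable values.
        if max(abs(new_rank[p] - rank[p]) for p in pages) < tol:
            return new_rank
        rank = new_rank

toy_web = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
print(pagerank(toy_web))
```

Running it on the toy graph, the values settle after repeated passes, with C and A ranked above B because more rank flows into them through inbound links.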