"Google's search engine currently uses the number of incoming links to a web page as a proxy for quality, determining where it appears in search results. So pages that many other sites link to are ranked higher. This system has brought us the search engine as we know it today, but the downside is that websites full of misinformation can rise up the rankings, if enough people link to them."
A Google research team is adapting that model to measure the trustworthiness of a page, rather than its reputation across the web. Instead of counting incoming links, the system – which is not yet live – counts the number of incorrect facts within a page. "A source that has few false facts is considered to be trustworthy," says the team (arxiv.org/abs/1502.03519v1). The score they compute for each page is its Knowledge-Based Trust score.

What the heck is an "incorrect fact"? And in the very next sentence, a "false fact"?
The software works by tapping into the Knowledge Vault, the vast store of facts that Google has pulled off the internet. Facts the web unanimously agrees on are considered a reasonable proxy for truth. Web pages that contain contradictory information are bumped down the rankings.
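For intuition only, here is a minimal sketch of that scoring idea in Python. The fact store, the triple format, and the plain correct/total ratio are assumptions for illustration; the actual paper (arxiv.org/abs/1502.03519) uses a joint probabilistic model that also estimates how reliable each fact extractor is.

from typing import Dict, List, Tuple

Triple = Tuple[str, str, str]  # (subject, predicate, object)

# Hypothetical stand-in for the Knowledge Vault: facts the web broadly agrees on,
# keyed by (subject, predicate) with the agreed-upon object as the value.
KNOWLEDGE_VAULT: Dict[Tuple[str, str], str] = {
    ("Earth", "orbits"): "Sun",
    ("Earth", "shape"): "oblate spheroid",
}

def kbt_score(page_triples: List[Triple]) -> float:
    """Fraction of a page's checkable facts that agree with the reference store."""
    checkable = [t for t in page_triples if (t[0], t[1]) in KNOWLEDGE_VAULT]
    if not checkable:
        return 0.5  # no evidence either way; treat the page as neutral
    correct = sum(1 for s, p, o in checkable if KNOWLEDGE_VAULT[(s, p)] == o)
    return correct / len(checkable)

if __name__ == "__main__":
    page = [
        ("Earth", "orbits", "Sun"),   # agrees with the reference store
        ("Earth", "shape", "flat"),   # contradicts it
    ]
    print(f"trust score: {kbt_score(page):.2f}")  # prints 0.50

A page whose checkable claims mostly contradict the agreed-upon facts would score low and, under the proposal, be bumped down the rankings regardless of how many sites link to it.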
***
"Dark matter, has of course, not been proven to exist, but evidence has piled up for the existence of some sort or matter that we cannot see or truly explain. To find out if it might be at play in the sun (captured perhaps by its gravity) the team built four models. One of the models was based on standard theory, the other three all took into account the possible impact of dark matter. The dark matter models mathematically described the possibility of interactions between dark matter and regular matter and the momentum that might or might not occur." Read more