Friday, January 1, 2010

The Metaphors of the Net

I. The Genetic Blueprint

A decade after the invention of the World Wide Web, Tim Berners-Lee is promoting the "Semantic Web". Until now, the Internet has been a repository of digital content. It has a rudimentary inventory system and very crude data-location services. As a sad result, most of the content is invisible and inaccessible. Moreover, the Internet manipulates strings of symbols, not logical or semantic propositions. In other words, the Net compares values but does not know the meaning of the values it manipulates. It is unable to interpret strings, to infer new facts, to deduce, induce, derive, or otherwise comprehend what it is doing. In short, it does not understand language. Run an ambiguous term through any search engine and these shortcomings become painfully evident. This lack of understanding of the semantic underpinnings of its raw material (data, information) prevents applications and databases from sharing resources and feeding each other. The Internet is discrete, not continuous. It resembles an archipelago, with users hopping from island to island in a frantic search for relevancy.

Even visionaries like Berners-Lee do not contemplate an "intelligent Web". They simply propose to let users, content creators, and web developers assign descriptive meta-tags ("name of hotel") to fields, or strings of symbols ("Hilton"). These meta-tags (arranged in semantic and relational "ontologies" - lists of meta-tags, their meanings, and how they relate to each other) will be read by various applications, allowing them to process the associated strings of symbols correctly (place the word "Hilton" in your phone book under "hotels"). This will make information retrieval more efficient and reliable, and the information retrieved is bound to be more relevant and amenable to higher-level processing (statistics, the development of heuristic rules, and so on).
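The meta-tag mechanism described above can be sketched in a few lines of Python. This is a minimal illustration, not any real Semantic Web API: the ontology, the tag names, and the `file_entry` helper are all invented for the example. It shows the essential point that a bare string means nothing by itself, while a string paired with a meta-tag can be filed and processed correctly.

```python
# A toy "ontology": meta-tags, their relations, and where tagged
# strings should be filed. All names here are hypothetical.
ONTOLOGY = {
    "hotel-name":  {"broader": "business-name", "file-under": "hotels"},
    "person-name": {"broader": "name",          "file-under": "people"},
}

def file_entry(directory, value, meta_tag):
    """Place a tagged string in the right section of a directory."""
    section = ONTOLOGY[meta_tag]["file-under"]
    directory.setdefault(section, []).append(value)
    return directory

phone_book = {}
file_entry(phone_book, "Hilton", "hotel-name")
file_entry(phone_book, "Paris Hilton", "person-name")
print(phone_book)
# → {'hotels': ['Hilton'], 'people': ['Paris Hilton']}
```

The same surface string can land in different sections depending on its meta-tag, which is precisely the discrimination a purely keyword-based search engine cannot make.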
The shift is from HTML (whose tags are concerned with visual appearance and content indexing) to languages such as the DARPA Agent Markup Language, OIL (Ontology Inference Layer or Ontology Interchange Language), or even XML (whose tags are concerned with content taxonomy, document structure, and semantics). This would bring the Internet closer to the classic library card catalogue.

Even in its current, pre-semantic, hyperlink-dependent phase, the Internet brings to mind Richard Dawkins' seminal work "The Selfish Gene" (OUP, 1976). This would be doubly true of the Semantic Web.

Dawkins suggested generalizing the principle of natural selection to a law of the survival of the stable: "A stable thing is a collection of atoms which is permanent enough or common enough to deserve a name." He then proceeded to describe the emergence of "Replicators" - molecules that created copies of themselves. The Replicators that survived the competition for scarce raw materials were characterized by high longevity, fecundity, and copying fidelity. Replicators (now known as "genes") constructed "survival machines" (organisms) to shield them from the vagaries of an ever-harsher environment.

This is highly reminiscent of the Internet. The "stable things" are web pages coded in HTML. They are replicators - they create copies of themselves every time their "web address" (URL) is clicked. The HTML coding of a web page can be thought of as its "genetic material". It contains all the information needed to reproduce the page. And, exactly as in nature, the higher the longevity, fecundity (measured in links to the web page from other web sites), and copying fidelity of the HTML code - the higher the page's chances of survival (as a web page).

Replicator molecules (DNA) and replicator HTML have one thing in common: they are both packaged information. In the appropriate context (the right biochemical "soup" in the case of DNA, the right software application in the case of HTML code), this information generates a "survival machine" (an organism, or a web page).
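The notion of "fecundity" used above - the number of links pointing to a page from other sites - is easy to make concrete. Below is a minimal sketch with an invented link graph (the site names are hypothetical); it simply counts inbound links per page.

```python
from collections import Counter

# A toy link graph: each page maps to the pages it links to.
LINKS = {
    "siteA.example/index.html": ["siteB.example/page.html"],
    "siteB.example/page.html":  ["siteC.example/home.html"],
    "siteC.example/home.html":  ["siteB.example/page.html"],
}

def fecundity(link_graph):
    """Count inbound links for every page in the graph."""
    inbound = Counter()
    for source, targets in link_graph.items():
        for target in targets:
            if target != source:  # ignore self-links
                inbound[target] += 1
    return inbound

print(fecundity(LINKS).most_common())
# → [('siteB.example/page.html', 2), ('siteC.example/home.html', 1)]
```

In Dawkins' terms, the page with the most inbound links is the replicator with the best survival prospects - essentially the intuition that link-analysis ranking schemes later formalized.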
The Semantic Web will only enhance the longevity, fecundity, and copying fidelity of the underlying code (in this case, OIL or XML instead of HTML). By facilitating many more interactions with many other web pages and databases, the underlying "replicator" code will ensure the "survival" of "its" web page (= its survival machine). In this analogy, the web page's "DNA" (its OIL or XML code) contains "single genes" (semantic meta-tags). The whole process of life is the unfolding of a kind of Semantic Web.

In a paragraph
