What the heck is an ontology???
A (shared) expression of belief: an agreement on terminology (and sometimes on meaning) for communication and action. Ontologies bound discourse, facilitate communication within and across communities and networks, and leverage action by gathering agreement around meaning, values, objects, the way things are, and what is 'out there' that matters. They help orient newcomers and act as stores for the key learnings and distinctions accumulated through experience. Ontologies strongly influence identity and help with the tacit transfer of context. IMO, ontologies are destined to become a very influential part of knowledge work as the semantic web evolves.
See this article on ontologies and XML: http://www.semanticweb.org/knowmarkup.html
Ontologies hold promise for:
* Providing a common language for different parties
* Improving communications through sharing meaning and raising social capital
* Increasing alignment and leveraging self-organization via shared understanding
* Providing an enterprise wide schema for intuitive navigation
* Being able to leverage language as a tool
* Helping communities of practice to improve their dialog and make key distinctions
* Sparking innovation, helping to recognize emergent concepts, knowledge gaps and improving relationships
So where do I start?
Look for an existing ontology and see if it will do, or whether you can adapt it for your work. Think about people before you jump into selecting a tool or a representation. Get buy-in from key users and stakeholders, and identify evangelists. Start small: begin with the familiar, construct a glossary, identify key terms, focus on immediate issues, and map out the domain before you dive into definitions and become mired in language differences. Avoid becoming the word police.
Work on terms that carry high value for the group or organization: emergent concepts that spark awareness, troubling terms whose clarification improves common communication, important concepts that carry many names, or terms that are used in different ways. Strive for emergent consensus rather than imposing taxonomic 'laws', and look for areas where standardization gives immediate benefits, e.g. XML markup, computer application integration, and multi-agent processes such as outsourced operations or joint-venture projects.
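As a concrete sketch of the "start with a glossary" step, a shared vocabulary can be captured in a form other tools can consume, for example as XML. The domain, term names, and element names below are invented for illustration:

```python
# Minimal sketch of a shared glossary captured as XML markup.
# Terms, definitions, and element names are invented for illustration.
import xml.etree.ElementTree as ET

glossary = ET.Element("glossary", domain="customer-support")
for name, definition in [
    ("escalation", "Handing a case to a second-tier specialist."),
    ("churn", "A customer cancelling within the billing period."),
]:
    term = ET.SubElement(glossary, "term", name=name)
    term.text = definition

# Serialize so the glossary can be shared across tools and teams.
xml_text = ET.tostring(glossary, encoding="unicode")
print(xml_text)
```

Even a toy structure like this gives a community one agreed-upon place per term, which is where the "emergent consensus" above can start to accumulate.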
Here is an article on ontology development
Resources:
* Sebastian Paquet
* Ontology at Buffalo State
* KM through ontologies
* Christian Ohlms, McKinsey & Company
* Corporate memory & ontology
As to the meaninglessness of the text contained by an XML tag, keep in mind that all content on a computer is meaningless. Without the conventions that define data types and instructions, and the computational power used to act on those type assumptions, computing couldn't happen.
Markup isn't the only computing device that separates content from presentation. Processor instructions define the presentation of memory content within the processor itself. Cache coherence separates the temporal context of the content from the processor's presentation of it. HyperTransport moves data via instructions and goes one further by packetizing the content and instructions together. You might say it creates a tag, where the instructions are attributes and the data is serialized between the header tag and the trailer tag.
The maintenance of context is job one in computing, at all levels. Content and presentation become one only on the screen.
Posted by: David Locke | October 13, 2003 at 03:31 AM
Contrary to the author of the webpage behind the first link, XML markup is semantic markup. The concept of semantic markup originated in SGML. The value of semantic markup lies in how easily you can tell which otherwise meaningless chunk of text is an instance of a particular kind, such as a title or an abstract.
The quickest way to see the value of semantic markup is to ask how much computational power would have to be applied if the text weren't marked up semantically. How many pattern recognition programs were written over the history of A.I.?
This semantic markup may not be all that rich, but it is rich enough to present a task allocation boundary between man and machine. And, it puts the machine in the role of helping man rather than replacing man.
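A minimal sketch of that point: once text carries semantic tags, extracting a particular kind of content is trivial; without them, you'd need the pattern recognition the comment alludes to. The element names (`article`, `title`, `abstract`) are assumed for illustration:

```python
# Sketch: semantic markup makes extraction trivial.
# Element names <article>, <title>, <abstract> are assumed for illustration.
import xml.etree.ElementTree as ET

doc = """<article>
  <title>Ontologies and Knowledge Work</title>
  <abstract>Why shared vocabularies matter.</abstract>
</article>"""

root = ET.fromstring(doc)
title = root.findtext("title")       # no pattern recognition needed
abstract = root.findtext("abstract")
print(title)
```

The machine does the bookkeeping (which chunk is the title), while deciding what the title should say stays with the human, which is exactly the task-allocation boundary described above.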
Semantics is an onion. Container-based semantic markup is the base layer of the intended semantics.
As for serialization, narrative is the norm for man. Narrative is expressed in a language constrained by a grammar. Backus-Naur Form describes a grammar, not a schema. It should come as no surprise that XML is a grammar, not a schema.
Schema capabilities are an extension to XML. I was surprised to conclude that database tables express a grammar, but if XML can be used as a schema, then this conclusion must be valid. Not being a linguist, I wonder whether linguists use ER diagrams to define the languages they study.
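The grammar/schema distinction can be sketched in two layers: a parser enforces XML's grammar (well-formedness), while schema-level constraints are a separate check layered on top. Since full schema support is an extension, the schema check below is hand-rolled for illustration, and the `person`/`name` structure is an invented example:

```python
# Sketch: XML's grammar (well-formedness) versus a schema-level check.
import xml.etree.ElementTree as ET

well_formed = "<person><name>Ada</name></person>"
ill_formed = "<person><name>Ada</person>"  # mismatched tags: a grammar violation

def parses(text):
    """True if the text satisfies XML's grammar (i.e. is well-formed)."""
    try:
        ET.fromstring(text)
        return True
    except ET.ParseError:
        return False

def valid_person(text):
    """A hand-rolled schema check: a <person> must contain a <name>."""
    root = ET.fromstring(text)
    return root.tag == "person" and root.find("name") is not None

print(parses(well_formed), parses(ill_formed))  # grammar level
print(valid_person(well_formed))                # schema level
```

The parser rejects `ill_formed` without knowing anything about people or names; only the second function encodes what a document is allowed to mean, which is the sense in which schemas extend, rather than replace, XML's grammar.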
Posted by: David Locke | October 13, 2003 at 03:21 AM