In recent years, there has been growing interest in combining explicitly defined formal semantics (in the form of ontologies) with distributional semantics "learnt" from vast amounts of data. In this paper, we try to bridge the best of the two worlds by introducing a new metric called the "Semantic Impact", together with a novel method to derive a numerical measurement that summarises how strongly an ontological entity/concept impinges on the domain of discourse. More specifically, by taking into consideration the semantic representation of a concept that appears in documents and its correlation with other concepts in the same document corpus, we measure the importance of a concept with respect to the knowledge domain at a semantic level. Here, the "semantic" importance of an ontology concept is two-fold. Firstly, the concept needs to be informative. Secondly, it should be well connected (strongly correlated) with other concepts in the same domain. We evaluated the proposed method with 200 BBC News articles about Donald Trump (between February 2017 and September 2017). The preliminary result is promising: we demonstrated that semantic impact can be learnt. The top three most important concepts are Event, Date and Organisation, and the least essential concepts are Substance, Duration and EventEducation. The crux of our future work is to extend the evaluation with larger datasets and more diverse domains.
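The abstract does not give the scoring formula, but the two criteria it names (informativeness, and strong correlation with other concepts in the corpus) can be sketched as a toy computation. The sketch below is an illustrative assumption, not the paper's actual method: it uses an IDF-style weight for informativeness and the mean absolute Pearson correlation of document-occurrence vectors for connectedness, and combines them by multiplication. The function name `semantic_impact` and the binary occurrence representation are hypothetical.

```python
import numpy as np


def semantic_impact(occurrence, concept):
    """Toy sketch (NOT the paper's formula) of scoring a concept by
    (a) informativeness and (b) correlation with other concepts.

    occurrence: dict mapping concept name -> binary vector indicating
                in which documents of the corpus the concept appears.
    """
    names = list(occurrence)
    X = np.array([occurrence[n] for n in names], dtype=float)
    i = names.index(concept)
    n_docs = X.shape[1]

    # (a) Informativeness: IDF-style weight -- concepts appearing in
    # fewer documents carry more information (assumption).
    doc_freq = X[i].sum()
    informativeness = np.log((1 + n_docs) / (1 + doc_freq))

    # (b) Connectedness: mean absolute Pearson correlation between this
    # concept's occurrence vector and every other concept's vector.
    corrs = []
    for j in range(len(names)):
        if j == i:
            continue
        c = np.corrcoef(X[i], X[j])[0, 1]
        if not np.isnan(c):  # skip constant vectors (undefined correlation)
            corrs.append(abs(c))
    connectedness = float(np.mean(corrs)) if corrs else 0.0

    return float(informativeness * connectedness)
```

A concept that occurs everywhere gets a near-zero IDF weight, and a concept uncorrelated with the rest of the domain gets near-zero connectedness, so both of the abstract's criteria must hold for a high score under this sketch.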
| Name | Lecture Notes in Computer Science |
| Conference | 12th International Conference on Human-Centered Computing (HCC 2018) |
| Period | 5/12/18 → 9/12/18 |
- Semantic Impact
- Ontology Learning
- XYZ Model