Knowledge induced transformer network for causality prediction
Companion Proceedings of the ACM Web Conference 2024
Causal extraction from text plays a crucial role in various downstream analytical and predictive tasks, such as constructing repositories of causal insights for reasoning. However, existing models often overlook the rich contextual commonsense knowledge that could enhance the reasoning process and evaluate underlying causal mechanisms. In this study, we introduce a knowledge-induced transformer architecture for predicting causality. Our model accepts an antecedent and a set of contextual knowledge as input, then ranks plausible consequences from a given set of hypotheses. To enhance semantic understanding, we augment the transformer with a relational graph network, which computes fine-grained semantic information between the antecedent, knowledge, and hypotheses using a similarity matrix that quantifies word-to-word similarity. We evaluate the proposed architecture against state-of-the-art models using openly available datasets and demonstrate its superior performance.
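To make the ranking idea concrete, below is a minimal, hypothetical sketch (not the authors' code) of scoring candidate consequences with a word-to-word similarity matrix. It assumes PyTorch and pre-computed token embeddings; the function names and the max-then-mean pooling are illustrative stand-ins for the paper's relational graph network.

```python
# Hedged sketch: ranking hypotheses via a word-to-word similarity matrix.
# Assumes PyTorch and pre-computed token embeddings (random here for demo).
import torch
import torch.nn.functional as F

def similarity_matrix(context_emb: torch.Tensor, hyp_emb: torch.Tensor) -> torch.Tensor:
    """Cosine similarity between every context token and every hypothesis token.

    context_emb: (n_ctx, d) embeddings of antecedent + contextual knowledge tokens
    hyp_emb:     (n_hyp, d) embeddings of one candidate consequence
    returns:     (n_ctx, n_hyp) similarity matrix
    """
    ctx = F.normalize(context_emb, dim=-1)
    hyp = F.normalize(hyp_emb, dim=-1)
    return ctx @ hyp.T

def score_hypothesis(context_emb: torch.Tensor, hyp_emb: torch.Tensor) -> torch.Tensor:
    """Collapse the similarity matrix into one plausibility score.

    For each hypothesis token, take its best-matching context token, then
    average. This simple pooling stands in for the fine-grained semantic
    aggregation the paper attributes to its relational graph network.
    """
    sim = similarity_matrix(context_emb, hyp_emb)   # (n_ctx, n_hyp)
    return sim.max(dim=0).values.mean()

if __name__ == "__main__":
    torch.manual_seed(0)
    d = 64
    context = torch.randn(12, d)                    # antecedent + knowledge tokens
    hypotheses = [torch.randn(8, d) for _ in range(3)]  # candidate consequences
    scores = torch.stack([score_hypothesis(context, h) for h in hypotheses])
    ranking = torch.argsort(scores, descending=True)
    print("hypothesis ranking (best first):", ranking.tolist())
```

In practice the embeddings would come from the transformer encoder rather than random tensors, and the pooled similarity signal would be combined with the encoder's own prediction before ranking.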
