Structured Explanations
Structured explanations are explanations that are not entirely free-form: constraints are placed on the explanation-creating process, such as the use of specific inference rules or a required chain of supporting facts.
English
Dataset | Task | Explanation Type | Collection Method | # Instances | # Explanations per Instance | Total # Annotators |
---|---|---|---|---|---|---|
WorldTree v1 | science exam QA | explanation graphs | authors | 1,680 | 1 | 4 |
OpenBookQA | open-book science QA | 1 fact from WorldTree | crowd | 5,957 | 1 | n/a |
WorldTree v2 | science exam QA | explanation graphs | experts | 5,100 | 1 | n/a |
QED | reading comprehension QA | inference rules | authors | 8,991 | 1 | 3 |
QASC | science exam QA | 2-fact chain | authors + crowd | 9,980 | 1 | 62 |
eQASC | science exam QA | 2-fact chain | automatic + crowd | 9,980 | ~10 | n/a |
eQASC + perturbed | science exam QA | 2-fact chain | automatic + crowd | n/a | n/a | n/a |
eOBQA | open-book science QA | 2-fact chain | automatic + crowd | n/a | n/a | n/a |
Ye et al. (2020) | SQuAD QA | semi-structured text | crowd + authors | 164 | 1 | n/a |
Ye et al. (2020) | NaturalQuestions QA | semi-structured text | crowd + authors | 109 | 1 | n/a |
R4C | reading comprehension QA | chains of facts | crowd | 4,588 | 3 | 45 |
StrategyQA | implicit reasoning QA | reasoning steps with highlights | crowd | 2,780 | 3 | 54 |
TriggerNER | named entity recognition | group of tokens with highlights | crowd | ~7K | 2 | 3 |
COPA-SSE (Semi-Structured Explanations for COPA) | Balanced COPA (commonsense QA, causal reasoning) | ConceptNet-like triples with free-form head and tail concepts* | crowd | 1,500 | 4-9 (9,747 total) | n/a |
* The dataset authors class these explanations as structured, but note that the format is not very rigid and the explanations can also be used as free text.
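
To make two of the explanation types in the table concrete, here is a minimal Python sketch of what a 2-fact chain (QASC-style) and a ConceptNet-like triple explanation (COPA-SSE-style) might look like. The field names and example content are illustrative assumptions, not the datasets' actual schemas.

```python
from dataclasses import dataclass
from typing import List


@dataclass
class TwoFactChain:
    """QASC-style explanation: two facts that compose to support the answer.

    Field names are hypothetical; the real dataset files use their own keys.
    """
    fact1: str
    fact2: str
    combined_fact: str  # how the two facts combine to yield the answer


@dataclass
class Triple:
    """COPA-SSE-style triple: free-form head/tail, constrained relation."""
    head: str      # free-form concept
    relation: str  # drawn from a fixed, ConceptNet-like relation vocabulary
    tail: str      # free-form concept


@dataclass
class SemiStructuredExplanation:
    """An explanation given as a small set of triples."""
    triples: List[Triple]


# Toy instances (invented content, for illustration only).
example_chain = TwoFactChain(
    fact1="Metals conduct electricity.",
    fact2="Copper is a metal.",
    combined_fact="Copper conducts electricity.",
)

example_sse = SemiStructuredExplanation(
    triples=[Triple(head="rain", relation="Causes", tail="wet ground")],
)
```

The contrast shows why both count as structured: the 2-fact chain constrains *how many* and *what kind* of facts make up the explanation, while the triple format constrains the *shape* of each statement (head, relation, tail) but leaves the concepts themselves free-form, which is why COPA-SSE can also be read as free text.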