Revolutionizing question answering: BERT augmented with a CNN achieves impressive results!
The authors augmented the BERT language model with several neural-network output architectures to improve its ability to answer questions, evaluating the models on the SQuAD 2.0 dataset. By fine-tuning BERT's parameters, they showed that the pretrained model adapts well to a specific downstream language task. The best-performing variant was a contextualized CNN, which achieved high scores on both answerable and unanswerable questions.
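To make the idea concrete, here is a minimal PyTorch sketch of a CNN head placed on top of BERT's token representations for span-based question answering. This is an illustration of the general technique, not the paper's exact architecture: the class name `CNNQAHead`, the hidden size of 768, and the single convolution layer are all assumptions. A 1-D convolution lets each token's representation mix with its neighbors (contextualization) before a linear layer predicts start and end logits for the answer span; unanswerable questions in SQuAD 2.0 are typically handled by the logits at the special [CLS] position.

```python
import torch
import torch.nn as nn

class CNNQAHead(nn.Module):
    """Hypothetical contextualized-CNN head over BERT hidden states.

    A 1-D convolution mixes each token's vector with its neighbors,
    then a linear layer emits start/end span logits per token.
    """
    def __init__(self, hidden_size=768, kernel_size=3):
        super().__init__()
        self.conv = nn.Conv1d(hidden_size, hidden_size,
                              kernel_size, padding=kernel_size // 2)
        self.qa_outputs = nn.Linear(hidden_size, 2)  # start and end logits

    def forward(self, hidden_states):
        # hidden_states: (batch, seq_len, hidden), as produced by BERT
        x = hidden_states.transpose(1, 2)             # (batch, hidden, seq)
        x = torch.relu(self.conv(x)).transpose(1, 2)  # back to (batch, seq, hidden)
        logits = self.qa_outputs(x)                   # (batch, seq, 2)
        start_logits, end_logits = logits.split(1, dim=-1)
        return start_logits.squeeze(-1), end_logits.squeeze(-1)

# Stand-in for BERT output: 2 sequences of 16 tokens, hidden size 768.
hidden = torch.randn(2, 16, 768)
start, end = CNNQAHead()(hidden)
print(start.shape, end.shape)  # one logit per token for start and end
```

In practice the head would be attached to a pretrained BERT encoder (e.g. via the `transformers` library) and the whole stack fine-tuned end to end, which is the fine-tuning step the summary describes.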