Faculty of Engineering, Computer Science & Engineering
Permanent URI for this community: http://192.248.9.226/handle/123/47
Browsing Faculty of Engineering, Computer Science & Engineering by Subject "ADAPTERS"
- Item (Thesis Abstract): Exploiting adapters for question generation from Tamil text in a zero-resource setting (2022)
  Purusanth, S.; Ranathunga, S.
  Automatic Question Generation, the task of generating questions from a span of text, is a significant problem in Natural Language Processing (NLP). Question generation in low-resource languages is under-explored compared to high-resource languages. In earlier work, all the parameters of a pre-trained multilingual language model were fine-tuned to perform zero-shot question generation and other sequence-to-sequence (S2S) generation tasks. However, such full-model fine-tuning is not computationally efficient. Recent research mitigates this issue by introducing a neural module called an adapter into each Transformer layer of a pre-trained language model and fine-tuning only the adapter parameters. In this study, we explored single-task adapters and adapter fusion on the pre-trained multilingual model mBART to generate questions from Tamil text. Our best model produced a ROUGE-1 (F1) score of 16.9. Furthermore, we obtained similar results with two adapter variants, the Houlsby adapter [1] and the Pfeiffer adapter [1], which is consistent with results reported for adapters on other S2S tasks [2].
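  The following is a minimal sketch (not the thesis code) of the adapter fine-tuning setup the abstract describes: a bottleneck adapter is inserted into each Transformer layer of mBART and only the adapter parameters are left trainable. It assumes the AdapterHub adapter-transformers library, which patches the Hugging Face model classes; the checkpoint, adapter name, and language codes are illustrative assumptions.

    # pip install adapter-transformers  (provides the patched `transformers` package)
    from transformers import MBartForConditionalGeneration, MBart50TokenizerFast

    # Pre-trained multilingual mBART checkpoint; Tamil source and target for question generation.
    tokenizer = MBart50TokenizerFast.from_pretrained(
        "facebook/mbart-large-50", src_lang="ta_IN", tgt_lang="ta_IN"
    )
    model = MBartForConditionalGeneration.from_pretrained("facebook/mbart-large-50")

    # Insert a Pfeiffer-style adapter into every Transformer layer;
    # passing config="houlsby" would give the other variant mentioned in the abstract.
    model.add_adapter("tamil_qg", config="pfeiffer")

    # Freeze the pre-trained mBART weights and mark only the adapter parameters as trainable.
    model.train_adapter("tamil_qg")
    model.set_active_adapters("tamil_qg")

    # Confirm that only a small fraction of parameters will be updated during fine-tuning.
    trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
    total = sum(p.numel() for p in model.parameters())
    print(f"Trainable parameters: {trainable} / {total}")

  The same model object can then be passed to a standard sequence-to-sequence training loop or Trainer; only the adapter weights change, which is what makes this approach more computationally efficient than full-model fine-tuning.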