Pre-Trained Transformer-Based Models for Text Classification Using Low-Resourced Ewe Language
Distribution
Format: pdf
Authors
Victor Kwaku Agbesi
Wenyu Chen
Sophyani Banaamwini Yussif
Md Altab Hossin
Chiagoziem C. Ukwuoma
Noble A. Kuadey
Colin Collinson Agbesi
Nagwan Abdel Samee
Mona M. Jamjoom
Mugahed A. Al-antari
Journal ISSN
Publisher: MDPI AG
Dataset
Pre-Trained Transformer-Based Models for Text Classification Using Low-Resourced Ewe Language (MDPI AG, 2023)
Victor Kwaku Agbesi; Wenyu Chen; Sophyani Banaamwini Yussif; Md Altab Hossin; Chiagoziem C. Ukwuoma; Noble A. Kuadey; Colin Collinson Agbesi; Nagwan Abdel Samee; Mona M. Jamjoom; Mugahed A. Al-antari
Description
Keywords
URI
https://tustorage.ulb.tu-darmstadt.de/handle/tustorage/39665
Files
Original bundle
Name: systems-12-01-00001.pdf
Size: 3.4 MB
Format: Adobe Portable Document Format