HerBERT Large Pre-trained on KGR10 Data

HerBERT-large model fine-tuned on the KGR10 corpus of Polish texts. The model was trained using the DeepSpeed library.
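
The record itself carries no usage instructions. Below is a minimal sketch of loading the model for masked-token prediction, assuming the checkpoint obtained via the handle above is stored locally in a Hugging Face Transformers-compatible format; the directory path is a placeholder rather than part of this record, and DeepSpeed is not required for inference.

    import torch
    from transformers import AutoTokenizer, AutoModelForMaskedLM

    # Placeholder path to the locally downloaded HerBERT-large KGR10 checkpoint
    model_dir = "./herbert-large-kgr10"

    tokenizer = AutoTokenizer.from_pretrained(model_dir)
    model = AutoModelForMaskedLM.from_pretrained(model_dir)

    # Score a masked token in a Polish sentence
    text = f"Wrocław to {tokenizer.mask_token} miasto."
    inputs = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits

    # Report the most likely filler for the masked position
    mask_pos = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
    print(tokenizer.decode(logits[0, mask_pos].argmax(-1)))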

Identifier
PID http://hdl.handle.net/11321/851
Metadata Access https://clarin-pl.eu/oai/request?verb=GetRecord&metadataPrefix=oai_dc&identifier=oai:clarin-pl.eu:11321/851
Provenance
Creator Janz, Arkadiusz; Wątorski, Piotr
Publisher Wrocław University of Science and Technology
Publication Year 2021
OpenAccess true
Contact clarin-pl(at)pwr.edu.pl
Representation
Language Polish
Resource Type toolService
Format downloadable_files_count: 0
Discipline Linguistics