Specific aspects of using large language models for text abstracting
Given the rapid growth in the number of scientific publications, automatic abstracting based on AI technologies has become a relevant task. Existing abstracting systems rely on pre-trained large language models, whose deployment requires significant hardware resources. Specialized models built on the same transformer architecture, however, are far less resource-intensive and can therefore be run both on local servers and in cloud environments at much lower cost.
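To illustrate the latter point, below is a minimal sketch of abstractive summarization with a compact, specialized transformer model using the Hugging Face transformers library; the particular checkpoint (sshleifer/distilbart-cnn-12-6) and the generation parameters are assumptions made for the example, not models or settings named in the article.

```python
# Minimal sketch: abstractive summarization with a compact, specialized
# transformer model. The checkpoint and generation parameters below are
# illustrative assumptions, not choices prescribed by the article.
from transformers import pipeline

# A distilled summarization model runs on a single consumer GPU or even a CPU,
# in contrast to full-scale large language models.
summarizer = pipeline(
    "summarization",
    model="sshleifer/distilbart-cnn-12-6",  # assumed small checkpoint
)

article_text = (
    "The growing volume of scientific publications makes manual abstracting "
    "increasingly impractical, motivating automatic summarization systems "
    "built on transformer architectures."
)

# Generate a short abstract; length limits are set per-call.
summary = summarizer(article_text, max_length=60, min_length=15, do_sample=False)
print(summary[0]["summary_text"])
```

Such a model can be served from a local machine or a modest cloud instance, which is the cost advantage the abstract refers to.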