Version 2.2
06 February, 2025
Important Note
Starting with v2.2, text-sentiment-v15 is required; this version enables sharing the sentiment module across all tenants.
Important Note
The streaming feature was introduced in question-answering-v14. To prevent QueueOffloader from buffering streamed responses and sending them in bulk for requests to the inference/stream endpoint, the following QueueOffloader configuration must be applied in all environments.
Required configuration:
OffloaderConfig__QueuedRequestPathExcludeFilter: "inference/stream$"
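The exclude filter above is a regular expression matched against the request path; the trailing `$` anchors the match so that only paths ending in `inference/stream` bypass the queue. A minimal sketch of that matching logic (the function name and the exact matching semantics are illustrative assumptions, not the actual QueueOffloader implementation):

```python
import re

# Assumption: QueueOffloader tests the exclude filter as a regex
# against the request path and skips queuing on a match.
EXCLUDE_FILTER = r"inference/stream$"

def is_offloaded(path: str) -> bool:
    """Return True if the request would be queued by the offloader,
    False if the exclude filter lets it stream directly."""
    return re.search(EXCLUDE_FILTER, path) is None

print(is_offloaded("/api/inference/stream"))  # False: excluded, streamed directly
print(is_offloaded("/api/inference"))         # True: still queued
```

Note the `$` anchor: without it, any path merely containing `inference/stream` (for example a longer sub-path) would also be excluded.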
AI Module Note
Use the following stable AI module versions with the v2.2 update:
- sentence-embedding (v17)
- text-normalization (v24)
- text-classification-mlp (v22)
- question-answering (v14)
- text-sentiment (v15)
- text-clustering (v24)
- text-language (v10)
- text-summarization (v8) -- Requires vLLM to operate; it cannot function standalone.
Improvements
New versions of AI modules:
- question-answering-v14 has been tested and released.
- text-sentiment-v15 has been tested and released.
- text-summarization-v8 has been tested and released.