The batch_size parameter in the paperQA codebase is primarily used when embedding text documents: it determines how many documents are passed to the embedding function in a single call [1]. However, there is no information in the codebase that directly links changes in batch_size to variations in answer quality or speed for the openai/mixtral:8x7b model [1].
The impact of batch_size may therefore be unnoticeable, especially if the bottleneck is elsewhere in the system or if the model and hardware can handle larger batches efficiently.
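To illustrate why batch_size changes the number of embedding calls but not the resulting vectors, here is a minimal sketch of batched embedding. The function names (`embed_in_batches`, `fake_embed`) are illustrative, not paperQA's actual API:

```python
from typing import Callable

def embed_in_batches(
    texts: list[str],
    embed_fn: Callable[[list[str]], list[list[float]]],
    batch_size: int = 1,
) -> list[list[float]]:
    """Embed texts in chunks of batch_size, mirroring how a batched
    embedding loop groups documents per call (illustrative sketch)."""
    embeddings: list[list[float]] = []
    for i in range(0, len(texts), batch_size):
        batch = texts[i : i + batch_size]
        embeddings.extend(embed_fn(batch))
    return embeddings

# Toy embedder: one "vector" per text, so the output is identical
# for any batch_size -- only the number of calls changes.
calls: list[int] = []
def fake_embed(batch: list[str]) -> list[list[float]]:
    calls.append(len(batch))
    return [[float(len(t))] for t in batch]

docs = ["a", "bb", "ccc", "dddd", "eeeee"]
vecs = embed_in_batches(docs, fake_embed, batch_size=2)
print(len(vecs), calls)  # 5 [2, 2, 1]
```

Since the embeddings (and everything downstream, including the answer) are the same regardless of batch_size, only embedding throughput can change — which matches seeing no difference in answer quality.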
Hello,
I changed the batch size from 1 (the default) to 8 and then 32, and saw no change in paperQA's behavior (answer quality and speed), as follows:
Does batch size have any impact on paperQA's behavior?