RAGchain.reranker.pygaggle.model package
Submodules
RAGchain.reranker.pygaggle.model.decode module
- RAGchain.reranker.pygaggle.model.decode.greedy_decode(model: PreTrainedModel, input_ids: Tensor, length: int, attention_mask: Tensor = None, return_last_logits: bool = True) → Tensor | Tuple[Tensor, Tensor]
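A minimal sketch of driving greedy_decode with a seq2seq reranker. The signature follows the entry above; the checkpoint name and the "Query: … Document: … Relevant:" prompt are illustrative assumptions, not part of this module.

```python
import torch
from transformers import T5ForConditionalGeneration, T5Tokenizer

from RAGchain.reranker.pygaggle.model.decode import greedy_decode

# Illustrative checkpoint; any seq2seq T5 reranker checkpoint can be substituted.
model = T5ForConditionalGeneration.from_pretrained("castorini/monot5-base-msmarco").eval()
tokenizer = T5Tokenizer.from_pretrained("t5-base")

prompt = "Query: what is a reranker Document: A reranker reorders retrieved passages. Relevant:"
enc = tokenizer(prompt, return_tensors="pt", truncation=True)

with torch.no_grad():
    # Decode greedily for a single step; with return_last_logits=True the logits
    # of that final step are returned alongside the decoded token ids, which
    # monoT5-style rerankers then turn into a relevance score.
    token_ids, last_logits = greedy_decode(
        model,
        enc["input_ids"],
        length=1,
        attention_mask=enc["attention_mask"],
        return_last_logits=True,
    )
```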
RAGchain.reranker.pygaggle.model.tokenize module
- class RAGchain.reranker.pygaggle.model.tokenize.DuoQueryDocumentBatch(query: RAGchain.reranker.pygaggle.base.Query, doc_pairs: List[Tuple[RAGchain.reranker.pygaggle.base.Text, RAGchain.reranker.pygaggle.base.Text]], output: Mapping[str, torch.Tensor | List[int] | List[List[int]] | List[List[str]]] | None = None)
Bases: object
- output: Mapping[str, Tensor | List[int] | List[List[int]] | List[List[str]]] | None = None
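A short sketch of building a pairwise batch. Query and Text are assumed to accept the raw string as their first argument (as suggested by the RAGchain.reranker.pygaggle.base types referenced above).

```python
from RAGchain.reranker.pygaggle.base import Query, Text
from RAGchain.reranker.pygaggle.model.tokenize import DuoQueryDocumentBatch

query = Query("what is dense retrieval?")
doc_a = Text("Dense retrieval encodes queries and passages as vectors.")
doc_b = Text("BM25 is a sparse lexical ranking function.")

# Pairwise (duoT5-style) batches hold ordered document pairs; `output` stays
# None until a tokenizer populates it with encoded inputs.
duo_batch = DuoQueryDocumentBatch(query=query, doc_pairs=[(doc_a, doc_b), (doc_b, doc_a)])
```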
- class RAGchain.reranker.pygaggle.model.tokenize.QueryDocumentBatch(query: RAGchain.reranker.pygaggle.base.Query, documents: List[RAGchain.reranker.pygaggle.base.Text], output: Mapping[str, torch.Tensor | List[int] | List[List[int]] | List[List[str]]] | None = None)
Bases: object
- output: Mapping[str, Tensor | List[int] | List[List[int]] | List[List[str]]] | None = None
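The pointwise counterpart pairs one query with a flat list of candidate passages, under the same assumptions about the Query/Text constructors as above.

```python
from RAGchain.reranker.pygaggle.base import Query, Text
from RAGchain.reranker.pygaggle.model.tokenize import QueryDocumentBatch

batch = QueryDocumentBatch(
    query=Query("what is dense retrieval?"),
    documents=[
        Text("Dense retrieval encodes queries and passages as vectors."),
        Text("BM25 is a sparse lexical ranking function."),
    ],
)
# `batch.output` is None until a QueryDocumentBatchTokenizer encodes the batch.
```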
- class RAGchain.reranker.pygaggle.model.tokenize.QueryDocumentBatchTokenizer(tokenizer: PreTrainedTokenizer, batch_size: int, pattern: str = '{query} {document}', **tokenizer_kwargs)
Bases: TokenizerEncodeMixin
- traverse_duo_query_document(batch_input: DuoQueryDocumentBatch) → Iterable[DuoQueryDocumentBatch]
- traverse_query_document(batch_input: QueryDocumentBatch) → Iterable[QueryDocumentBatch]
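A sketch of how the tokenizer expands its pattern and walks a batch, reusing `batch` from the QueryDocumentBatch example above. The prompt pattern and the extra tokenizer kwargs below are assumed reasonable settings, not fixed defaults of this class.

```python
from transformers import T5Tokenizer
from RAGchain.reranker.pygaggle.model.tokenize import QueryDocumentBatchTokenizer

qd_tokenizer = QueryDocumentBatchTokenizer(
    T5Tokenizer.from_pretrained("t5-base"),
    batch_size=8,
    pattern="Query: {query} Document: {document} Relevant:",
    # **tokenizer_kwargs are assumed to be forwarded to the underlying
    # HuggingFace tokenizer when each sub-batch is encoded.
    return_tensors="pt",
    padding="longest",
    truncation=True,
)

# The traversal yields sub-batches of at most `batch_size` documents, each with
# `output` filled by encoding the pattern-expanded query/document strings.
for sub_batch in qd_tokenizer.traverse_query_document(batch):
    input_ids = sub_batch.output["input_ids"]
```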
- class RAGchain.reranker.pygaggle.model.tokenize.T5BatchTokenizer(*args, **kwargs)
Bases: QueryDocumentBatchTokenizer
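T5BatchTokenizer forwards *args and **kwargs to QueryDocumentBatchTokenizer, presumably fixing T5-appropriate defaults (prompt pattern and encoding options), so construction is a sketch of the same shape.

```python
from transformers import T5Tokenizer
from RAGchain.reranker.pygaggle.model.tokenize import T5BatchTokenizer

# Arguments are forwarded to QueryDocumentBatchTokenizer, so a HuggingFace
# tokenizer and a batch size are the essential inputs.
t5_batch_tokenizer = T5BatchTokenizer(T5Tokenizer.from_pretrained("t5-base"), batch_size=8)
```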