autorag.data.qa.query package

Submodules

autorag.data.qa.query.llama_gen_query module

async autorag.data.qa.query.llama_gen_query.concept_completion_query_gen(row: Dict, llm: BaseLLM, lang: str = 'en') → Dict[source]
async autorag.data.qa.query.llama_gen_query.custom_query_gen(row: Dict, llm: BaseLLM, messages: List[ChatMessage]) → Dict[source]
async autorag.data.qa.query.llama_gen_query.factoid_query_gen(row: Dict, llm: BaseLLM, lang: str = 'en') → Dict[source]
async autorag.data.qa.query.llama_gen_query.llama_index_generate_base(row: Dict, llm: BaseLLM, messages: List[ChatMessage]) → Dict[source]
async autorag.data.qa.query.llama_gen_query.multiple_queries_gen(row: Dict, llm: BaseLLM, lang: str = 'en', n: int = 3) → Dict[source]
async autorag.data.qa.query.llama_gen_query.two_hop_incremental(row: Dict, llm: BaseLLM, lang: str = 'en') → Dict[source]
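Each generator above takes a QA dataset row plus an LLM and returns the row with a generated query attached. The sketch below illustrates that row-in/row-out convention with a stub in place of a llama_index BaseLLM; the `retrieval_gt_contents` and `query` keys and the stub's `acomplete` method are assumptions for illustration, not the library's exact schema.

```python
import asyncio
from typing import Dict, List


class StubLLM:
    """Stand-in for a llama_index BaseLLM; returns a canned completion."""

    async def acomplete(self, prompt: str) -> str:
        return "What is discussed in the passage?"


async def factoid_query_gen(row: Dict, llm: StubLLM, lang: str = "en") -> Dict:
    # Mirror the documented shape: read the ground-truth passages,
    # ask the LLM for a factoid question, and store it on the row.
    passages: List[str] = row["retrieval_gt_contents"]
    prompt = f"Generate a factoid question ({lang}) about:\n" + "\n".join(passages)
    row["query"] = await llm.acomplete(prompt)
    return row


row = {"retrieval_gt_contents": ["Python was created by Guido van Rossum."]}
result = asyncio.run(factoid_query_gen(row, StubLLM()))
print(result["query"])
```

Because the functions are coroutines, a real pipeline would typically fan them out over many rows with `asyncio.gather`.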

autorag.data.qa.query.openai_gen_query module

class autorag.data.qa.query.openai_gen_query.Response(*, query: str)[source]

Bases: BaseModel

model_config: ClassVar[ConfigDict] = {}

Configuration for the model; should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].

query: str

class autorag.data.qa.query.openai_gen_query.TwoHopIncrementalResponse(*, answer: str, one_hop_question: str, two_hop_question: str)[source]

Bases: BaseModel

answer: str
model_config: ClassVar[ConfigDict] = {}

Configuration for the model; should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].

one_hop_question: str
two_hop_question: str
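These models define the structured output the OpenAI generators parse responses into. The sketch below re-declares them locally with the documented fields to show how they validate; the example field values are illustrative only.

```python
from pydantic import BaseModel, ValidationError


class Response(BaseModel):
    """Mirrors the documented single-query response schema."""

    query: str


class TwoHopIncrementalResponse(BaseModel):
    """Mirrors the documented schema for the incremental two-hop generator."""

    answer: str
    one_hop_question: str
    two_hop_question: str


r = TwoHopIncrementalResponse(
    answer="Guido van Rossum",
    one_hop_question="Who created Python?",
    two_hop_question="Who created the language that NumPy extends?",
)
print(r.model_dump())

# All fields are keyword-only and required, so a missing field fails validation.
try:
    Response()
    missing_field_rejected = False
except ValidationError:
    missing_field_rejected = True
```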

async autorag.data.qa.query.openai_gen_query.concept_completion_query_gen(row: Dict, client: AsyncOpenAI, model_name: str = 'gpt-4o-2024-08-06', lang: str = 'en') → Dict[source]
async autorag.data.qa.query.openai_gen_query.factoid_query_gen(row: Dict, client: AsyncOpenAI, model_name: str = 'gpt-4o-2024-08-06', lang: str = 'en') → Dict[source]
async autorag.data.qa.query.openai_gen_query.query_gen_openai_base(row: Dict, client: AsyncOpenAI, messages: List[ChatMessage], model_name: str = 'gpt-4o-2024-08-06')[source]
async autorag.data.qa.query.openai_gen_query.two_hop_incremental(row: Dict, client: AsyncOpenAI, model_name: str = 'gpt-4o-2024-08-06', lang: str = 'en') → Dict[source]

Create a two-hop question using an incremental prompt. An incremental prompt is more effective than a single prompt at creating multi-hop questions. The input retrieval_gt must include more than one passage.

Returns:

The two-hop question generated with the OpenAI incremental prompt
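The incremental strategy described above can be sketched as two steps: first derive a one-hop question from one ground-truth passage, then rewrite it so that answering also requires the second passage. Everything below (the stub client, prompt wording, and row keys) is illustrative, not the library's actual prompts or implementation.

```python
import asyncio
from typing import Dict, List


class StubClient:
    """Stand-in for AsyncOpenAI; returns canned answers per step."""

    async def ask(self, prompt: str) -> str:
        if "one-hop" in prompt:
            return "Who created Python?"
        return "Who created the language that NumPy extends?"


async def two_hop_incremental(row: Dict, client: StubClient) -> Dict:
    passages: List[str] = row["retrieval_gt_contents"]
    if len(passages) < 2:
        raise ValueError("retrieval_gt must include more than one passage")
    # Step 1: a one-hop question answerable from the first passage alone.
    one_hop = await client.ask(f"Write a one-hop question about: {passages[0]}")
    # Step 2: extend it so the second passage is also required to answer.
    two_hop = await client.ask(f"Rewrite '{one_hop}' so it also requires: {passages[1]}")
    row["query"] = two_hop
    return row


row = {
    "retrieval_gt_contents": [
        "Python was created by Guido van Rossum.",
        "NumPy is a numerical library that extends Python.",
    ]
}
result = asyncio.run(two_hop_incremental(row, StubClient()))
print(result["query"])
```

Splitting the generation into two calls lets each prompt stay focused on one passage, which is why the incremental form tends to produce questions that genuinely need both hops.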

autorag.data.qa.query.prompt module

Module contents