Class: CondenseQuestionChatEngine
CondenseQuestionChatEngine is used in conjunction with an Index (for example, a VectorStoreIndex). On each user chat message it performs two steps: first, it condenses the message together with the previous chat history into a standalone question with more context; then it queries the underlying Index with that condensed question and returns the response.

CondenseQuestionChatEngine performs well when the input is primarily questions about the underlying data. It performs less well when the chat messages are not questions about the data, or refer heavily to previous context.
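The two-step flow can be sketched conceptually. All names below (`condenseQuestion`, `query`, the prompt wording) are hypothetical stand-ins; the real engine delegates condensation to an LLM and querying to the configured QueryEngine:

```typescript
// Minimal sketch of the condense-then-query pattern (hypothetical helpers;
// not the actual llamaindex implementation).
type ChatMessage = { role: "user" | "assistant"; content: string };

// Step 1: condense the new message plus chat history into a standalone question.
function condenseQuestion(history: ChatMessage[], message: string): string {
  const transcript = history.map((m) => `${m.role}: ${m.content}`).join("\n");
  return `Given the conversation:\n${transcript}\nRephrase as a standalone question: ${message}`;
}

// Step 2: query the underlying index with the condensed question.
// Stands in for queryEngine.query(); a real call would hit the Index.
function query(condensed: string): string {
  return `answer for: ${condensed}`;
}

const history: ChatMessage[] = [
  { role: "user", content: "What is a vector index?" },
  { role: "assistant", content: "A store of embeddings." },
];
const condensed = condenseQuestion(history, "How do I build one?");
const response = query(condensed);
```

Because the condensed question carries the relevant history, the QueryEngine itself can stay stateless.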
Extends
Implements
Constructors
new CondenseQuestionChatEngine()
new CondenseQuestionChatEngine(init): CondenseQuestionChatEngine
Parameters
• init
• init.chatHistory: ChatMessage[]
• init.condenseMessagePrompt?: CondenseQuestionPrompt
• init.queryEngine: QueryEngine
• init.serviceContext?: ServiceContext
Returns
Overrides
Defined in
packages/llamaindex/src/engines/chat/CondenseQuestionChatEngine.ts:46
Properties
chatHistory
chatHistory: ChatHistory<object>
Defined in
packages/llamaindex/src/engines/chat/CondenseQuestionChatEngine.ts:42
condenseMessagePrompt
condenseMessagePrompt: CondenseQuestionPrompt
Defined in
packages/llamaindex/src/engines/chat/CondenseQuestionChatEngine.ts:44
llm
llm: LLM<object, object>
Defined in
packages/llamaindex/src/engines/chat/CondenseQuestionChatEngine.ts:43
queryEngine
queryEngine: QueryEngine
Defined in
packages/llamaindex/src/engines/chat/CondenseQuestionChatEngine.ts:41
Methods
_getPromptModules()
protected _getPromptModules(): ModuleRecord
Returns a record of sub-modules within the current module that also implement PromptMixin, so that their prompts can be retrieved and set as well. The record may be empty if there are no sub-modules.
Returns
Overrides
Defined in
packages/llamaindex/src/engines/chat/CondenseQuestionChatEngine.ts:61
_getPrompts()
protected _getPrompts(): object
Returns
object
condenseMessagePrompt
condenseMessagePrompt: CondenseQuestionPrompt
Overrides
Defined in
packages/llamaindex/src/engines/chat/CondenseQuestionChatEngine.ts:65
_updatePrompts()
protected _updatePrompts(promptsDict): void
Parameters
• promptsDict
• promptsDict.condenseMessagePrompt: CondenseQuestionPrompt
Returns
void
Overrides
Defined in
packages/llamaindex/src/engines/chat/CondenseQuestionChatEngine.ts:71
chat()
chat(params)
chat(params): Promise<AsyncIterable<EngineResponse, any, any>>
Send message along with the class's current chat history to the LLM.
Parameters
• params: ChatEngineParamsStreaming
Returns
Promise<AsyncIterable<EngineResponse, any, any>>
Implementation of
Defined in
packages/llamaindex/src/engines/chat/CondenseQuestionChatEngine.ts:92
chat(params)
chat(params): Promise<EngineResponse>
Send message along with the class's current chat history to the LLM.
Parameters
• params: ChatEngineParamsNonStreaming
Returns
Promise<EngineResponse>
Implementation of
Defined in
packages/llamaindex/src/engines/chat/CondenseQuestionChatEngine.ts:95
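The streaming overload returns an AsyncIterable that can be consumed with `for await`. The sketch below uses a hypothetical `chatStream` async generator in place of the engine; only the consumption pattern is the point:

```typescript
// Sketch of consuming the streaming chat() overload. `chatStream` is a
// stand-in async generator; the real engine yields EngineResponse chunks
// when chat() is called with ChatEngineParamsStreaming.
async function* chatStream(_message: string): AsyncIterable<{ delta: string }> {
  // Emit the response in small pieces, as a streaming LLM would.
  for (const delta of ["Con", "dense", "d ", "answer"]) {
    yield { delta };
  }
}

async function main(): Promise<string> {
  let full = "";
  for await (const chunk of chatStream("What does the data say?")) {
    full += chunk.delta; // accumulate streamed deltas into the full response
  }
  return full;
}
```

The non-streaming overload skips the loop and resolves directly to a single EngineResponse.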
getPrompts()
getPrompts(): PromptsRecord
Returns
Inherited from
Defined in
packages/core/dist/prompts/index.d.ts:58
reset()
reset(): void
Resets the chat history so that it's empty.
Returns
void
Implementation of
Defined in
packages/llamaindex/src/engines/chat/CondenseQuestionChatEngine.ts:136
updatePrompts()
updatePrompts(prompts): void
Parameters
• prompts: PromptsRecord
Returns
void
Inherited from
Defined in
packages/core/dist/prompts/index.d.ts:59
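A typical use of updatePrompts() on this engine is swapping in a custom condense prompt under the `condenseMessagePrompt` key (the key exposed by `_getPrompts()` above). The sketch below models a prompt as a plain template function and a prompts record as a mutable object; the actual template and record types live in @llamaindex/core and differ in detail:

```typescript
// Sketch of updating the condense prompt (hypothetical types; a real
// CondenseQuestionPrompt is a template type from @llamaindex/core).
type CondensePrompt = (ctx: { chatHistory: string; question: string }) => string;

const customCondensePrompt: CondensePrompt = ({ chatHistory, question }) =>
  `History:\n${chatHistory}\nStandalone question for: ${question}`;

// Mutable record standing in for the engine's internal prompts.
const prompts: Record<string, CondensePrompt> = {};

// Merge new prompt templates into the record, as updatePrompts() would.
function updatePrompts(p: Record<string, CondensePrompt>): void {
  Object.assign(prompts, p);
}

updatePrompts({ condenseMessagePrompt: customCondensePrompt });
const rendered = prompts.condenseMessagePrompt({
  chatHistory: "user: hi",
  question: "What changed?",
});
```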
validatePrompts()
validatePrompts(promptsDict, moduleDict): void
Parameters
• promptsDict: PromptsRecord
• moduleDict: ModuleRecord
Returns
void
Inherited from
Defined in
packages/core/dist/prompts/index.d.ts:57