
Commit 2159677

Merge pull request #794 from bezineb5/patch-1
Fix output_parser for generate_answer_node
2 parents b400cc5 + ebe0b0d

1 file changed: +2 -14 lines changed


scrapegraphai/nodes/generate_answer_node.py

Lines changed: 2 additions & 14 deletions
@@ -122,22 +122,10 @@ def execute(self, state: dict) -> dict:
                 partial_variables={"context": doc, "format_instructions": format_instructions}
             )
             chain = prompt | self.llm_model
-            raw_response = chain.invoke({"question": user_prompt})
-
             if output_parser:
-                try:
-                    answer = output_parser.parse(raw_response.content)
-                except JSONDecodeError:
-                    lines = raw_response.split('\n')
-                    if lines[0].strip().startswith('```'):
-                        lines = lines[1:]
-                    if lines[-1].strip().endswith('```'):
-                        lines = lines[:-1]
-                    cleaned_response = '\n'.join(lines)
-                    answer = output_parser.parse(cleaned_response)
-            else:
-                answer = raw_response.content
+                chain = chain | output_parser
 
+            answer = chain.invoke({"question": user_prompt})
             state.update({self.output[0]: answer})
             return state

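For context on why the two added lines are enough: in LangChain's LCEL, an output parser is itself a Runnable, so it can be piped onto the chain with | instead of being called by hand on raw_response.content and patched up on JSONDecodeError. Below is a minimal, self-contained sketch of the same pattern outside the node; the prompt text, the ChatOpenAI model, and the example question are placeholders for illustration, not part of the ScrapeGraphAI code.

from langchain_core.output_parsers import JsonOutputParser
from langchain_core.prompts import PromptTemplate
from langchain_openai import ChatOpenAI  # placeholder; the node uses whatever self.llm_model is

output_parser = JsonOutputParser()

# Stand-in for the node's template / partial_variables wiring
prompt = PromptTemplate(
    template="Context: {context}\n{format_instructions}\nQuestion: {question}",
    input_variables=["question"],
    partial_variables={
        "context": "Example page text",
        "format_instructions": output_parser.get_format_instructions(),
    },
)

llm = ChatOpenAI(model="gpt-4o-mini")  # placeholder model name

chain = prompt | llm
if output_parser:
    # The parser is a Runnable: piping it in replaces the old manual parse /
    # code-fence-stripping logic, since JsonOutputParser already handles JSON
    # wrapped in markdown code fences.
    chain = chain | output_parser

# With the parser in the chain, invoke() returns the parsed object (a dict),
# not an AIMessage whose .content would still need parsing.
answer = chain.invoke({"question": "What is the page title?"})
print(answer)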