| Author | Commit | Message | Date |
| Jacky Wu | ed5596a8f4 | fix: avoid llm node result var not init issue while do retry. (#14286) | 7 months ago |
| LeanDeR | 4f6a4f244c | fix(llm/nodes.py): Ensure that the output returns without any exceptions (#14880) | 8 months ago |
| yuhaowin | 1e3197a1ea | Fixes 14217: database retrieve api and chat-messages api response doc_metadata (#14219) | 8 months ago |
| -LAN- | 7a71498a3e | chore(quota): Update deduct quota (#14337) | 8 months ago |
| Yeuoly | 403e2d58b9 | Introduce Plugins (#13836) | 8 months ago |
| Vasu Negi | 8a0aa91ed7 | Non-Streaming Models Do Not Return Results Properly in _handle_invoke_result (#13571) | 8 months ago |
| -LAN- | 04d13a8116 | feat(credits): Allow to configure model-credit mapping (#13274) | 8 months ago |
| -LAN- | b47669b80b | fix: deduct LLM quota after processing invoke result (#13075) | 9 months ago |
| yihong | 56e15d09a9 | feat: mypy for all type check (#10921) | 10 months ago |
| JasonVV | 4b1e13e982 | Fix 11979 (#11984) | 10 months ago |
| -LAN- | 996a9135f6 | feat(llm_node): support order in text and files (#11837) | 10 months ago |
| Novice | 79a710ce98 | Feat: continue on error (#11458) | 10 months ago |
| yihong | 716576043d | fix: issue 11247 that Completion mode content maybe list or str (#11504) | 10 months ago |
| -LAN- | 464e6354c5 | feat: correct the prompt grammar. (#11328) | 11 months ago |
| -LAN- | 223a30401c | fix: LLM invoke error should not be raised (#11141) | 11 months ago |
| -LAN- | 044e7b63c2 | fix(llm_node): Ignore file if not supported. (#11114) | 11 months ago |
| -LAN- | 5b7b328193 | feat: Allow to contains files in the system prompt even model not support. (#11111) | 11 months ago |
| -LAN- | cbb4e95928 | fix(llm_node): Ignore user query when memory is disabled. (#11106) | 11 months ago |
| -LAN- | 20c091a5e7 | fix: user query be ignored if query_prompt_template is an empty string (#11103) | 11 months ago |
| -LAN- | 60b5dac3ab | fix: query will be None if the query_prompt_template not exists (#11031) | 11 months ago |
| 非法操作 | 08ac36812b | feat: support LLM process document file (#10966) | 11 months ago |
| -LAN- | c5f7d650b5 | feat: Allow using file variables directly in the LLM node and support more file types. (#10679) | 11 months ago |
| 非法操作 | 033ab5490b | feat: support LLM understand video (#9828) | 11 months ago |
| -LAN- | 38bca6731c | refactor(workflow): introduce specific error handling for LLM nodes (#10221) | 1 year ago |
| -LAN- | 8b5ea39916 | chore(llm_node): remove unnecessary type ignore for context assignment (#10216) | 1 year ago |
| -LAN- | 3b53e06e0d | fix(workflow): refine variable type checks in LLMNode (#10051) | 1 year ago |
| -LAN- | eb87e690ed | fix(llm-node): handle NoneSegment variables properly (#9978) | 1 year ago |
| -LAN- | d018b32d0b | fix(workflow): enhance prompt handling with vision support (#9790) | 1 year ago |
| -LAN- | 8f670f31b8 | refactor(variables): replace deprecated 'get_any' with 'get' method (#9584) | 1 year ago |
| -LAN- | 2e657b7b12 | fix(workflow): handle NoneSegments in variable extraction (#9585) | 1 year ago |