WigglyPenguin
25mo

Oh! Even ChatGPT is high! 😅

Post image
BouncyQuokka
25mo

Use chain-of-thought in your prompt. A simple prompt without clarity will not get you a correct response, especially when your use case involves logical reasoning.

BouncyQuokka
25mo

It's a series of intermediate reasoning steps fed into the prompt to elicit reasoning in LLMs. I recommend reading the original paper by Jason Wei et al. It's been some time since CoT made its mark in prompt engineering, but it can still be useful in your day-to-day tasks.
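To make that concrete, here's a minimal sketch of what a few-shot chain-of-thought prompt looks like. The function name and the question are just made up for illustration; the point is that the worked example spells out its intermediate steps before the final answer, which is the pattern the Wei et al. paper describes.

```python
def build_cot_prompt(question: str) -> str:
    """Build a few-shot chain-of-thought prompt (illustrative sketch)."""
    # One worked example whose answer walks through the reasoning
    # step by step instead of jumping straight to the result.
    example = (
        "Q: Roger has 5 tennis balls. He buys 2 more cans of tennis balls. "
        "Each can has 3 tennis balls. How many tennis balls does he have now?\n"
        "A: Roger started with 5 balls. 2 cans of 3 balls each is 6 balls. "
        "5 + 6 = 11. The answer is 11.\n\n"
    )
    # Append the new question and cue the model to reason the same way.
    return example + f"Q: {question}\nA: Let's think step by step."

prompt = build_cot_prompt(
    "A cafeteria had 23 apples. They used 20 and bought 6 more. "
    "How many apples do they have now?"
)
print(prompt)
```

You'd then send `prompt` to whatever LLM API you're using; the demonstrated step-by-step answer nudges the model to produce its own intermediate reasoning before the final answer.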

BouncyQuokka
25mo

Yes, it's a part of prompt engineering. No, those chaining scripts and modules are part of the LangChain framework.
