
Chain-of-Thought

Chain-of-Thought (CoT) is a prompt engineering technique that elicits a series of intermediate reasoning steps from Large Language Models (LLMs), substantially improving performance on complex, multi-step tasks.

CoT prompting unlocks advanced reasoning capabilities in LLMs, especially models over 100 billion parameters (e.g., PaLM 540B). The method is straightforward: include a few worked examples that explicitly show the logical progression to the final answer, or simply append the phrase 'Let's think step by step' (Zero-Shot CoT). This prompts the model to decompose complex problems, such as multi-step arithmetic or symbolic reasoning, into manageable, sequential steps. The result is a significant empirical gain in accuracy and a clear, auditable trace of the model's 'thought' process, moving performance beyond standard direct-answer prompting.
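The two prompt styles described above can be sketched as plain string construction. This is a minimal illustration, not tied to any particular model API; the arithmetic exemplar follows the style of the worked examples in the CoT literature, and the function names are ours.

```python
def few_shot_cot_prompt(question: str) -> str:
    """Few-shot CoT: prepend a worked example whose answer spells out
    the intermediate reasoning steps before the final answer."""
    # Illustrative exemplar in the style of CoT arithmetic demonstrations.
    exemplar = (
        "Q: Roger has 5 tennis balls. He buys 2 more cans of tennis balls. "
        "Each can has 3 tennis balls. How many tennis balls does he have now?\n"
        "A: Roger started with 5 balls. 2 cans of 3 tennis balls each is "
        "6 tennis balls. 5 + 6 = 11. The answer is 11.\n\n"
    )
    return exemplar + f"Q: {question}\nA:"

def zero_shot_cot_prompt(question: str) -> str:
    """Zero-shot CoT: append the trigger phrase so the model emits its
    reasoning steps before stating the final answer."""
    return f"Q: {question}\nA: Let's think step by step."
```

Either prompt would then be sent to the LLM as-is; the few-shot variant trades prompt length for a stronger demonstration of the expected step-by-step format.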

https://arxiv.org/abs/2201.11903