
Tree of Thoughts: Deliberate Problem Solving with Large Language Models

Language models are being deployed for general problem solving, but they can fall short on tasks that require exploration or strategic lookahead. To address this, a new framework called Tree of Thoughts (ToT) is introduced, which enables exploration over coherent units of text ("thoughts") that serve as intermediate steps toward a solution. This allows language models to consider multiple reasoning paths and to self-evaluate their choices. ToT significantly improves language models' problem-solving abilities on three tasks: Game of 24, Creative Writing, and Mini Crosswords. The paper also connects dual-process models of human cognition with the search and planning methods used in machine learning, suggesting that augmenting a language model's fast, associative token-level choices with a more deliberate planning process lets it function as a general problem solver.
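The search loop behind this idea can be sketched in a few lines. Below is a minimal, hypothetical illustration of ToT's breadth-first variant: `propose` and `evaluate` stand in for prompting a language model to generate candidate thoughts and to score partial solutions; here they are replaced with toy stubs so the skeleton is runnable.

```python
def propose(state):
    # Stub: in ToT, this would prompt the LM for candidate next "thoughts".
    return [state + [c] for c in ("a", "b")]

def evaluate(state):
    # Stub: in ToT, this would prompt the LM to score a partial solution.
    return sum(1 for s in state if s == "a")

def tot_bfs(root, depth, beam_width):
    """Breadth-first Tree of Thoughts search: expand every frontier state,
    score the candidates, and keep only the best `beam_width` of them."""
    frontier = [root]
    for _ in range(depth):
        candidates = [s for state in frontier for s in propose(state)]
        candidates.sort(key=evaluate, reverse=True)
        frontier = candidates[:beam_width]
    return max(frontier, key=evaluate)

best = tot_bfs(root=[], depth=3, beam_width=2)
print(best)  # → ['a', 'a', 'a']
```

The key contrast with ordinary chain-of-thought prompting is that the model maintains several partial solutions at once and prunes weak branches using its own evaluations, rather than committing to a single left-to-right token sequence.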