Tool Use
An AI model's ability to invoke external functions (APIs, code runners, browsers) instead of just generating text.
Tool use is what turns an LLM from a chat companion into a worker. The model is given a list of available tools (each described with a JSON schema), decides whether to call one and which, and the runtime executes the call and feeds the result back to the model.
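The loop above can be sketched without any particular vendor SDK. The tool name, schema fields, and the mocked model output below are illustrative assumptions, not any provider's exact wire format:

```python
# Hypothetical tool declared with a JSON-schema input, in the general shape
# frontier APIs use (field names here are illustrative, not vendor-exact).
TOOLS = [
    {
        "name": "get_weather",
        "description": "Return current weather for a city.",
        "input_schema": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    }
]

def get_weather(city: str) -> str:
    # Stand-in for a real weather API call.
    return f"18C and cloudy in {city}"

# The runtime half of the loop: dispatch the model's tool call,
# then package the result as a message to feed back to the model.
def run_tool_call(call: dict) -> dict:
    registry = {"get_weather": get_weather}
    result = registry[call["name"]](**call["arguments"])
    return {"role": "tool", "name": call["name"], "content": result}

# Mocked model output; in practice this arrives from the LLM API response.
model_call = {"name": "get_weather", "arguments": {"city": "Oslo"}}
print(run_tool_call(model_call))
```

In a real agent this runs in a loop: the tool result message is appended to the conversation and the model is called again until it answers in plain text instead of requesting another tool.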
Function calling is the API-level term; tool use is the broader pattern. Modern frontier APIs (Anthropic, OpenAI, Gemini) all support it natively, with subtle differences in how parallel calls are handled and how errors are surfaced.
MCP (Model Context Protocol) standardized how tools are described across providers, so an MCP server you build for Claude works in Cursor, Windsurf, and any other MCP-aware client. This is rapidly turning tool use from per-provider lock-in into a portability story.