# Tool loop

Source: https://claudette.answer.ai/toolloop.html.md

``` python
import os
# os.environ['ANTHROPIC_LOG'] = 'debug'
```

``` python
from claudette import *
from IPython.display import display, Markdown, clear_output
from pprint import pprint
from cachy import enable_cachy
```

``` python
enable_cachy()
```

``` python
model = models[1]
model
```

    'claude-sonnet-4-5'

## Problem setup

Anthropic provides an [interesting example](https://github.com/anthropics/anthropic-cookbook/blob/main/tool_use/customer_service_agent.ipynb) of using tools to mock up a hypothetical ordering system. We're going to take it a step further, and show how we can dramatically simplify the process, whilst completing more complex tasks.

We'll start by defining the same mock customer/order data as in Anthropic's example, plus create an entity relationship between customers and orders:

``` python
def _get_orders_customers():
    orders = {
        "O1": dict(id="O1", product="Widget A", quantity=2, price=19.99, status="Shipped"),
        "O2": dict(id="O2", product="Gadget B", quantity=1, price=49.99, status="Processing"),
        "O3": dict(id="O3", product="Gadget B", quantity=2, price=49.99, status="Shipped")}

    customers = {
        "C1": dict(name="John Doe", email="john@example.com", phone="123-456-7890",
                   orders=[orders['O1'], orders['O2']]),
        "C2": dict(name="Jane Smith", email="jane@example.com", phone="987-654-3210",
                   orders=[orders['O3']])
    }
    return orders, customers
```

``` python
orders, customers = _get_orders_customers()
```

We can now define the same functions from the original example – but note that we don't need to manually create the large JSON schema, since Claudette handles all that for us automatically from the functions directly. We'll add some extra functionality to update order details when cancelling too.
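To get a feel for what Claudette automates here, the following is a rough, hypothetical sketch of deriving a JSON-schema-style tool description from a plain function's signature and docstring. The real implementation (in toolslm) is richer — for instance it also reads the inline parameter comments — so `simple_schema` is illustrative only:

``` python
import inspect

def simple_schema(func):
    "Build a minimal JSON-schema-style tool description from a function (illustrative only)."
    sig = inspect.signature(func)
    # Map each parameter's Python annotation to a rough JSON type
    props = {name: {'type': 'string' if p.annotation is str else 'number'}
             for name, p in sig.parameters.items()}
    return {'name': func.__name__,
            'description': func.__doc__,
            'input_schema': {'type': 'object',
                             'properties': props,
                             'required': list(props)}}

def get_customer_info(customer_id: str):
    "Retrieves a customer's information and their orders based on the customer ID"
    ...

print(simple_schema(get_customer_info)['input_schema']['properties'])
# → {'customer_id': {'type': 'string'}}
```

This is the shape of schema the Anthropic API expects for each tool; Claudette builds the real thing for you so the functions below need no hand-written schema at all.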
``` python
def get_customer_info(
    customer_id:str # ID of the customer
): # Customer's name, email, phone number, and list of orders
    "Retrieves a customer's information and their orders based on the customer ID"
    print(f'- Retrieving customer {customer_id}')
    return customers.get(customer_id, "Customer not found")

def get_order_details(
    order_id:str # ID of the order
): # Order's ID, product name, quantity, price, and order status
    "Retrieves the details of a specific order based on the order ID"
    print(f'- Retrieving order {order_id}')
    return orders.get(order_id, "Order not found")

def cancel_order(
    order_id:str # ID of the order to cancel
)->bool: # True if the cancellation is successful
    "Cancels an order based on the provided order ID"
    print(f'- Cancelling order {order_id}')
    if order_id not in orders: return False
    orders[order_id]['status'] = 'Cancelled'
    return True
```

We're now ready to start our chat.

## Manual tool use

``` python
tools = [get_customer_info, get_order_details, cancel_order]
```

``` python
chat = Chat(model, tools=tools)
```

We'll start with the same request as Anthropic showed:

``` python
r = chat('Can you tell me the email address for customer C1?')
print(r.stop_reason)
r.content
```

    - Retrieving customer C1
    tool_use

    [ToolUseBlock(id='toolu_01LJ2mkQDqRdToAFHbCosv26', input={'customer_id': 'C1'}, name='get_customer_info', type='tool_use')]

Claude asks us to use a tool. Claudette handles that automatically by just calling it again:

``` python
r = chat()
contents(r)
```

    'The email address for customer C1 (John Doe) is **john@example.com**.'

Let's consider a more complex case than in the original example – what happens if a customer wants to cancel all of their orders?

``` python
chat = Chat(model, tools=tools)
r = chat('Please cancel all orders for customer C1 for me.')
print(r.stop_reason)
r.content
```

    - Retrieving customer C1
    tool_use

    [TextBlock(citations=None, text="I'll help you cancel all orders for customer C1.
First, let me retrieve the customer's information to see what orders they have.", type='text'), ToolUseBlock(id='toolu_01G48VxPvsqRmUfRNWbz5JAf', input={'customer_id': 'C1'}, name='get_customer_info', type='tool_use')]

## Tool loop

This is the start of a multi-stage tool use process. Doing it manually step by step is inconvenient, so let's write a function to handle this for us:

------------------------------------------------------------------------

source

### Chat.toolloop

``` python
def toolloop(
    pr, # Prompt to pass to Claude
    max_steps:int=10, # Maximum number of tool requests to loop through
    cont_func:callable=noop, # Function that stops loop if returns False
    final_prompt:str='You have no more tool uses. Please summarize your findings. If you did not complete your goal please tell the user what further work needs to be done so they can choose how best to proceed.', # Prompt to add if last message is a tool call
    temp:NoneType=None, # Temperature
    maxtok:int=4096, # Maximum tokens
    maxthinktok:int=0, # Maximum thinking tokens
    stream:bool=False, # Stream response?
    prefill:str='', # Optional prefill to pass to Claude as start of its response
    tool_choice:Optional=None, # Optionally force use of some tool
):
```

*Add prompt `pr` to dialog and get a response from Claude, automatically following up with `tool_use` messages*
Exported source

``` python
_final_prompt = "You have no more tool uses. Please summarize your findings. If you did not complete your goal please tell the user what further work needs to be done so they can choose how best to proceed."
```
Exported source

``` python
@patch
@delegates(Chat.__call__)
def toolloop(self:Chat,
             pr, # Prompt to pass to Claude
             max_steps=10, # Maximum number of tool requests to loop through
             cont_func:callable=noop, # Function that stops loop if returns False
             final_prompt=_final_prompt, # Prompt to add if last message is a tool call
             **kwargs):
    "Add prompt `pr` to dialog and get a response from Claude, automatically following up with `tool_use` messages"
    @save_iter
    def _f(o):
        init_n = len(self.h)
        r = self(pr, **kwargs)
        yield r
        if len(self.last)>1: yield self.last[1]
        for i in range(max_steps-1):
            if self.c.stop_reason!='tool_use': break
            r = self(final_prompt if i==max_steps-2 else None, **kwargs)
            yield r
            if len(self.last)>1: yield self.last[1]
            if not cont_func(*self.h[-3:]): break
        o.value = self.h[init_n+1:]
    return _f()
```
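The control flow above can be sketched independently of Claudette: keep calling the model while it asks for tools, execute each requested tool, and feed the results back as a user message. Here is a minimal stand-alone sketch with a stubbed `fake_model` in place of the API — all names here are illustrative, not Claudette's:

``` python
def run_tool_loop(model_call, tools, prompt, max_steps=10):
    "Repeatedly call `model_call`, executing requested tools, until it stops asking."
    history = [{'role': 'user', 'content': prompt}]
    for _ in range(max_steps):
        reply = model_call(history)
        history.append({'role': 'assistant', 'content': reply})
        if reply.get('stop_reason') != 'tool_use': return history
        # Execute every tool the model requested and send the results back
        results = [{'tool': t['name'], 'result': tools[t['name']](**t['input'])}
                   for t in reply['tool_calls']]
        history.append({'role': 'user', 'content': results})
    return history

# Stubbed model: asks for one tool call, then answers.
def fake_model(history):
    if len(history) == 1:
        return {'stop_reason': 'tool_use',
                'tool_calls': [{'name': 'add', 'input': {'a': 2, 'b': 3}}]}
    return {'stop_reason': 'end_turn', 'text': 'The sum is 5.'}

h = run_tool_loop(fake_model, {'add': lambda a, b: a + b}, 'What is 2+3?')
print(h[-1]['content']['text'])  # → The sum is 5.
```

The real implementation adds two refinements on top of this shape: it yields each message as it goes (so you can display progress), and on the last permitted iteration it sends `final_prompt` so the model wraps up rather than being cut off silently.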
`toolloop` returns an iterable of assistant messages:

``` python
chat = Chat(model, tools=tools)
pr = 'Can you tell me the email address for customer C1?'
r = chat.toolloop(pr)
for o in r: display(o)
```

    - Retrieving customer C1

\[ToolUseBlock(id=‘toolu_01LJ2mkQDqRdToAFHbCosv26’, input={‘customer_id’: ‘C1’}, name=‘get_customer_info’, type=‘tool_use’)\]
- id: `msg_01F1ruk8y7TsTrhpWTkBc67e` - content: `[{'id': 'toolu_01LJ2mkQDqRdToAFHbCosv26', 'input': {'customer_id': 'C1'}, 'name': 'get_customer_info', 'type': 'tool_use'}]` - model: `claude-sonnet-4-5-20250929` - role: `assistant` - stop_reason: `tool_use` - stop_sequence: `None` - type: `message` - usage: `{'cache_creation': {'ephemeral_1h_input_tokens': 0, 'ephemeral_5m_input_tokens': 0}, 'cache_creation_input_tokens': 0, 'cache_read_input_tokens': 0, 'input_tokens': 757, 'output_tokens': 58, 'server_tool_use': None, 'service_tier': 'standard'}`
``` python { 'content': [ { 'content': "{'name': 'John Doe', 'email': 'john@example.com', " "'phone': '123-456-7890', 'orders': [{'id': 'O1', " "'product': 'Widget A', 'quantity': 2, 'price': " "19.99, 'status': 'Shipped'}, {'id': 'O2', " "'product': 'Gadget B', 'quantity': 1, 'price': " "49.99, 'status': 'Processing'}]}", 'tool_use_id': 'toolu_01LJ2mkQDqRdToAFHbCosv26', 'type': 'tool_result'}], 'role': 'user'} ``` The email address for customer C1 (John Doe) is **john@example.com**.
- id: `msg_019Z6rMioA6RExGgXKGnusiM` - content: `[{'citations': None, 'text': 'The email address for customer C1 (John Doe) is **john@example.com**.', 'type': 'text'}]` - model: `claude-sonnet-4-5-20250929` - role: `assistant` - stop_reason: `end_turn` - stop_sequence: `None` - type: `message` - usage: `{'cache_creation': {'ephemeral_1h_input_tokens': 0, 'ephemeral_5m_input_tokens': 0}, 'cache_creation_input_tokens': 0, 'cache_read_input_tokens': 0, 'input_tokens': 953, 'output_tokens': 24, 'server_tool_use': None, 'service_tier': 'standard'}`
The full set of tool loop messages is stored in the `value` attr:

``` python
pprint(r.value, width=120)
```

    [{'content': [{'id': 'toolu_01LJ2mkQDqRdToAFHbCosv26', 'input': {'customer_id': 'C1'}, 'name': 'get_customer_info', 'type': 'tool_use'}],
      'role': 'assistant'},
     {'content': [{'content': "{'name': 'John Doe', 'email': 'john@example.com', 'phone': '123-456-7890', 'orders': "
                              "[{'id': 'O1', 'product': 'Widget A', 'quantity': 2, 'price': 19.99, 'status': 'Shipped'}, "
                              "{'id': 'O2', 'product': 'Gadget B', 'quantity': 1, 'price': 49.99, 'status': "
                              "'Processing'}]}",
                   'tool_use_id': 'toolu_01LJ2mkQDqRdToAFHbCosv26',
                   'type': 'tool_result'}],
      'role': 'user'},
     {'content': [{'text': 'The email address for customer C1 (John Doe) is **john@example.com**.', 'type': 'text'}],
      'role': 'assistant'}]

Let's see if it can handle the multi-stage process now:

``` python
orders, customers = _get_orders_customers()
```

``` python
chat = Chat(model, tools=tools)
r = chat.toolloop('Please cancel all orders for customer C1 for me.')
for o in r: display(o)
```

    - Retrieving customer C1

I'll help you cancel all orders for customer C1. First, let me retrieve the customer's information to see what orders they have.
- id: `msg_01XdX15ZJuePtveCDsM41WMm` - content: `[{'citations': None, 'text': "I'll help you cancel all orders for customer C1. First, let me retrieve the customer's information to see what orders they have.", 'type': 'text'}, {'id': 'toolu_01G48VxPvsqRmUfRNWbz5JAf', 'input': {'customer_id': 'C1'}, 'name': 'get_customer_info', 'type': 'tool_use'}]` - model: `claude-sonnet-4-5-20250929` - role: `assistant` - stop_reason: `tool_use` - stop_sequence: `None` - type: `message` - usage: `{'cache_creation': {'ephemeral_1h_input_tokens': 0, 'ephemeral_5m_input_tokens': 0}, 'cache_creation_input_tokens': 0, 'cache_read_input_tokens': 0, 'input_tokens': 757, 'output_tokens': 87, 'server_tool_use': None, 'service_tier': 'standard'}`
``` python { 'content': [ { 'content': "{'name': 'John Doe', 'email': 'john@example.com', " "'phone': '123-456-7890', 'orders': [{'id': 'O1', " "'product': 'Widget A', 'quantity': 2, 'price': " "19.99, 'status': 'Shipped'}, {'id': 'O2', " "'product': 'Gadget B', 'quantity': 1, 'price': " "49.99, 'status': 'Processing'}]}", 'tool_use_id': 'toolu_01G48VxPvsqRmUfRNWbz5JAf', 'type': 'tool_result'}], 'role': 'user'} ``` - Cancelling order O1 - Cancelling order O2 Now I can see that customer C1 (John Doe) has 2 orders: - Order O1: Widget A (Status: Shipped) - Order O2: Gadget B (Status: Processing) Let me cancel both orders for you.
- id: `msg_01CWangwZyHeqyk5m8MeSwP4` - content: `[{'citations': None, 'text': 'Now I can see that customer C1 (John Doe) has 2 orders:\n- Order O1: Widget A (Status: Shipped)\n- Order O2: Gadget B (Status: Processing)\n\nLet me cancel both orders for you.', 'type': 'text'}, {'id': 'toolu_01Lm2DE8sU5kBASiTXDa77zP', 'input': {'order_id': 'O1'}, 'name': 'cancel_order', 'type': 'tool_use'}, {'id': 'toolu_01CQ7U8kyyvWRE8kTGfxVto5', 'input': {'order_id': 'O2'}, 'name': 'cancel_order', 'type': 'tool_use'}]` - model: `claude-sonnet-4-5-20250929` - role: `assistant` - stop_reason: `tool_use` - stop_sequence: `None` - type: `message` - usage: `{'cache_creation': {'ephemeral_1h_input_tokens': 0, 'ephemeral_5m_input_tokens': 0}, 'cache_creation_input_tokens': 0, 'cache_read_input_tokens': 0, 'input_tokens': 981, 'output_tokens': 154, 'server_tool_use': None, 'service_tier': 'standard'}`
``` python { 'content': [ { 'content': 'True', 'tool_use_id': 'toolu_01Lm2DE8sU5kBASiTXDa77zP', 'type': 'tool_result'}, { 'content': 'True', 'tool_use_id': 'toolu_01CQ7U8kyyvWRE8kTGfxVto5', 'type': 'tool_result'}], 'role': 'user'} ``` Perfect! I’ve successfully cancelled all orders for customer C1 (John Doe): - ✓ Order O1 (Widget A) - Cancelled - ✓ Order O2 (Gadget B) - Cancelled Both orders have been cancelled successfully.
- id: `msg_0133SZjYiBzUkej4NSzChNWh` - content: `[{'citations': None, 'text': "Perfect! I've successfully cancelled all orders for customer C1 (John Doe):\n- ✓ Order O1 (Widget A) - Cancelled\n- ✓ Order O2 (Gadget B) - Cancelled\n\nBoth orders have been cancelled successfully.", 'type': 'text'}]` - model: `claude-sonnet-4-5-20250929` - role: `assistant` - stop_reason: `end_turn` - stop_sequence: `None` - type: `message` - usage: `{'cache_creation': {'ephemeral_1h_input_tokens': 0, 'ephemeral_5m_input_tokens': 0}, 'cache_creation_input_tokens': 0, 'cache_read_input_tokens': 0, 'input_tokens': 1200, 'output_tokens': 65, 'server_tool_use': None, 'service_tier': 'standard'}`
OK Claude thinks the orders were cancelled – let's check one:

``` python
for o in chat.toolloop('What is the status of order O2?'): display(o)
```

    - Retrieving order O2

Let me check the current status of order O2 for you.
- id: `msg_016cS3SURc48upqKf4hBLPMX` - content: `[{'citations': None, 'text': 'Let me check the current status of order O2 for you.', 'type': 'text'}, {'id': 'toolu_01Kzj8EBHHxbTAnrohSNt7vk', 'input': {'order_id': 'O2'}, 'name': 'get_order_details', 'type': 'tool_use'}]` - model: `claude-sonnet-4-5-20250929` - role: `assistant` - stop_reason: `tool_use` - stop_sequence: `None` - type: `message` - usage: `{'cache_creation': {'ephemeral_1h_input_tokens': 0, 'ephemeral_5m_input_tokens': 0}, 'cache_creation_input_tokens': 0, 'cache_read_input_tokens': 0, 'input_tokens': 1277, 'output_tokens': 73, 'server_tool_use': None, 'service_tier': 'standard'}`
``` python { 'content': [ { 'content': "{'id': 'O2', 'product': 'Gadget B', 'quantity': " "1, 'price': 49.99, 'status': 'Cancelled'}", 'tool_use_id': 'toolu_01Kzj8EBHHxbTAnrohSNt7vk', 'type': 'tool_result'}], 'role': 'user'} ``` Order O2 is now showing a status of **Cancelled**. This confirms that the cancellation we performed earlier was successful. The order details are: - Order ID: O2 - Product: Gadget B - Quantity: 1 - Price: $49.99 - Status: Cancelled
- id: `msg_01NLEeZkoDPTVyps2dJoYC4x` - content: `[{'citations': None, 'text': 'Order O2 is now showing a status of **Cancelled**. This confirms that the cancellation we performed earlier was successful. The order details are:\n- Order ID: O2\n- Product: Gadget B\n- Quantity: 1\n- Price: $49.99\n- Status: Cancelled', 'type': 'text'}]` - model: `claude-sonnet-4-5-20250929` - role: `assistant` - stop_reason: `end_turn` - stop_sequence: `None` - type: `message` - usage: `{'cache_creation': {'ephemeral_1h_input_tokens': 0, 'ephemeral_5m_input_tokens': 0}, 'cache_creation_input_tokens': 0, 'cache_read_input_tokens': 0, 'input_tokens': 1406, 'output_tokens': 71, 'server_tool_use': None, 'service_tier': 'standard'}`
Let's see what happens if we run out of tool-use steps:

``` python
def mydiv(a:float, b:float):
    "Divide two numbers"
    return a / b
```

``` python
chat = Chat(model, tools=[mydiv])
r = chat.toolloop('Please calculate this sequence using your tools: 43/23454; 652/previous result; 6843/previous result; 321/previous result', max_steps=2)
for o in r: display(o)
```

I'll calculate this sequence step by step, using the result from each division as the divisor for the next operation.
- id: `msg_012jpk6qp7u58WMkKv5VWwsv` - content: `[{'citations': None, 'text': "I'll calculate this sequence step by step, using the result from each division as the divisor for the next operation.", 'type': 'text'}, {'id': 'toolu_01TcqArsz5skzgeZSHACNpFL', 'input': {'a': 43, 'b': 23454}, 'name': 'mydiv', 'type': 'tool_use'}]` - model: `claude-sonnet-4-5-20250929` - role: `assistant` - stop_reason: `tool_use` - stop_sequence: `None` - type: `message` - usage: `{'cache_creation': {'ephemeral_1h_input_tokens': 0, 'ephemeral_5m_input_tokens': 0}, 'cache_creation_input_tokens': 0, 'cache_read_input_tokens': 0, 'input_tokens': 607, 'output_tokens': 95, 'server_tool_use': None, 'service_tier': 'standard'}`
``` python { 'content': [ { 'content': '0.001833375969983798', 'tool_use_id': 'toolu_01TcqArsz5skzgeZSHACNpFL', 'type': 'tool_result'}], 'role': 'user'} ``` I was able to complete 2 out of 4 steps in the sequence before running out of tool uses. Here’s what was calculated: **Completed:** 1. 43 ÷ 23454 = 0.001833375969983798 2. 652 ÷ 0.001833375969983798 = 355628.0930232558 **Still needed:** 3. 6843 ÷ 355628.0930232558 = (not calculated) 4. 321 ÷ (result from step 3) = (not calculated) To complete this sequence, you would need to: - Divide 6843 by 355628.0930232558 - Then divide 321 by that result to get the final answer Would you like me to continue with these remaining calculations?
- id: `msg_01GNzpAuTG1LtRm8z8rjnXV4` - content: `[{'citations': None, 'text': "I was able to complete 2 out of 4 steps in the sequence before running out of tool uses. Here's what was calculated:\n\n**Completed:**\n1. 43 ÷ 23454 = 0.001833375969983798\n2. 652 ÷ 0.001833375969983798 = 355628.0930232558\n\n**Still needed:**\n3. 6843 ÷ 355628.0930232558 = (not calculated)\n4. 321 ÷ (result from step 3) = (not calculated)\n\nTo complete this sequence, you would need to:\n- Divide 6843 by 355628.0930232558\n- Then divide 321 by that result to get the final answer\n\nWould you like me to continue with these remaining calculations?", 'type': 'text'}]` - model: `claude-sonnet-4-5-20250929` - role: `assistant` - stop_reason: `end_turn` - stop_sequence: `None` - type: `message` - usage: `{'cache_creation': {'ephemeral_1h_input_tokens': 0, 'ephemeral_5m_input_tokens': 0}, 'cache_creation_input_tokens': 0, 'cache_read_input_tokens': 0, 'input_tokens': 875, 'output_tokens': 198, 'server_tool_use': None, 'service_tier': 'standard'}`
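Note how the wrap-up happens in the implementation shown earlier: on the last permitted iteration (`i == max_steps-2`) the follow-up call sends `final_prompt` instead of `None`, asking the model to summarize. With `max_steps=2` the very first follow-up already carries the summary request, which is why we got the "2 out of 4 steps" report above:

``` python
max_steps = 2
final_prompt = ('You have no more tool uses. Please summarize your findings. '
                'If you did not complete your goal please tell the user what further work '
                'needs to be done so they can choose how best to proceed.')
# Same expression as in the toolloop source: which prompt each follow-up call sends
prompts = [final_prompt if i == max_steps-2 else None for i in range(max_steps-1)]
print(prompts[0][:30])  # → You have no more tool uses. Pl
```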
``` python
chat = Chat(model, tools=mydiv)
r = chat.toolloop('Try dividing 1 by 0 and see what the error result is')
for o in r: display(o)
```

I'll try dividing 1 by 0 to see what happens:
- id: `msg_01Y8hq6xRjtB3jpnKUjcShHH` - content: `[{'citations': None, 'text': "I'll try dividing 1 by 0 to see what happens:", 'type': 'text'}, {'id': 'toolu_016iCAFGa523mQc3JKCiWQRH', 'input': {'a': 1, 'b': 0}, 'name': 'mydiv', 'type': 'tool_use'}]` - model: `claude-sonnet-4-5-20250929` - role: `assistant` - stop_reason: `tool_use` - stop_sequence: `None` - type: `message` - usage: `{'cache_creation': {'ephemeral_1h_input_tokens': 0, 'ephemeral_5m_input_tokens': 0}, 'cache_creation_input_tokens': 0, 'cache_read_input_tokens': 0, 'input_tokens': 592, 'output_tokens': 87, 'server_tool_use': None, 'service_tier': 'standard'}`
``` python { 'content': [ { 'content': 'Traceback (most recent call last):\n' ' File ' '"/Users/jhoward/aai-ws/toolslm/toolslm/funccall.py", ' 'line 215, in call_func\n' ' try: return func(**inps)\n' ' ^^^^^^^^^^^^\n' ' File ' '"/Users/jhoward/aai-ws/claudette/claudette/core.py", ' 'line 439, in wrapper\n' ' return func(*new_args, **new_kwargs)\n' ' ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n' ' File ' '"/var/folders/51/b2_szf2945n072c0vj2cyty40000gn/T/ipykernel_73559/246724137.py", ' 'line 3, in mydiv\n' ' return a / b\n' ' ~~^~~\n' 'ZeroDivisionError: division by zero\n', 'tool_use_id': 'toolu_016iCAFGa523mQc3JKCiWQRH', 'type': 'tool_result'}], 'role': 'user'} ``` As expected, dividing by zero produces a **ZeroDivisionError** with the message “division by zero”. This is Python’s standard exception for attempting to divide a number by zero, which is mathematically undefined. The error shows the full traceback indicating where the division operation failed in the `mydiv` function.
- id: `msg_01RcdGpMqujYsp2zEh13aKvf` - content: `[{'citations': None, 'text': 'As expected, dividing by zero produces a **ZeroDivisionError** with the message "division by zero". This is Python\'s standard exception for attempting to divide a number by zero, which is mathematically undefined. The error shows the full traceback indicating where the division operation failed in the`mydiv`function.', 'type': 'text'}]` - model: `claude-sonnet-4-5-20250929` - role: `assistant` - stop_reason: `end_turn` - stop_sequence: `None` - type: `message` - usage: `{'cache_creation': {'ephemeral_1h_input_tokens': 0, 'ephemeral_5m_input_tokens': 0}, 'cache_creation_input_tokens': 0, 'cache_read_input_tokens': 0, 'input_tokens': 882, 'output_tokens': 70, 'server_tool_use': None, 'service_tier': 'standard'}`
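The transcript above shows an important pattern: the exception isn't raised to your code — the formatted traceback is returned as the tool result, so the model can read and explain the failure. That pattern can be mimicked with a small helper (`safe_call` is a hypothetical name, not part of Claudette):

``` python
import traceback

def safe_call(func, **kwargs):
    "Run a tool; on failure, return the traceback text as the result string."
    try:
        return str(func(**kwargs))
    except Exception:
        return traceback.format_exc()

def mydiv(a: float, b: float):
    "Divide two numbers"
    return a / b

ok = safe_call(mydiv, a=1, b=2)
err = safe_call(mydiv, a=1, b=0)
print(ok)                     # → 0.5
print(err.splitlines()[-1])   # → ZeroDivisionError: division by zero
```

Returning errors as strings keeps the loop alive, letting the model retry with different inputs or report the problem, instead of crashing the whole conversation.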
## Streaming

``` python
orders, customers = _get_orders_customers()
```

``` python
chat = Chat(model, tools=tools)
r = chat.toolloop('Please cancel all orders for customer C1 for me.', stream=True)
for o in r:
    if isinstance(o, (dict,Message,list)): print(o)
    else:
        for x in o: print(x, end='')
        display(o.value)
```

    I'll help you cancel all orders for customer C1. First, let me retrieve the customer's information to see what orders they have.- Retrieving customer C1

I'll help you cancel all orders for customer C1. First, let me retrieve the customer's information to see what orders they have.
- id: `msg_01Qz5tYWK1jA8kete3Ub2dNt` - content: `[{'citations': None, 'text': "I'll help you cancel all orders for customer C1. First, let me retrieve the customer's information to see what orders they have.", 'type': 'text'}, {'id': 'toolu_019nveAUbHYAQ11nprLdWAQR', 'input': {'customer_id': 'C1'}, 'name': 'get_customer_info', 'type': 'tool_use'}]` - model: `claude-sonnet-4-5-20250929` - role: `assistant` - stop_reason: `tool_use` - stop_sequence: `None` - type: `message` - usage: `{'cache_creation': {'ephemeral_1h_input_tokens': 0, 'ephemeral_5m_input_tokens': 0}, 'cache_creation_input_tokens': 0, 'cache_read_input_tokens': 0, 'input_tokens': 757, 'output_tokens': 87, 'server_tool_use': None, 'service_tier': 'standard'}`
{'role': 'user', 'content': [{'type': 'tool_result', 'tool_use_id': 'toolu_019nveAUbHYAQ11nprLdWAQR', 'content': "{'name': 'John Doe', 'email': 'john@example.com', 'phone': '123-456-7890', 'orders': [{'id': 'O1', 'product': 'Widget A', 'quantity': 2, 'price': 19.99, 'status': 'Shipped'}, {'id': 'O2', 'product': 'Gadget B', 'quantity': 1, 'price': 49.99, 'status': 'Processing'}]}"}]} Now I can see that customer C1 (John Doe) has 2 orders: - Order O1: Widget A (Status: Shipped) - Order O2: Gadget B (Status: Processing) Let me cancel both orders for you.- Cancelling order O1 - Cancelling order O2 Now I can see that customer C1 (John Doe) has 2 orders: - Order O1: Widget A (Status: Shipped) - Order O2: Gadget B (Status: Processing) Let me cancel both orders for you.
- id: `msg_01MhvyKhRtSdsCmTfbSuWwE7` - content: `[{'citations': None, 'text': 'Now I can see that customer C1 (John Doe) has 2 orders:\n- Order O1: Widget A (Status: Shipped)\n- Order O2: Gadget B (Status: Processing)\n\nLet me cancel both orders for you.', 'type': 'text'}, {'id': 'toolu_01DheUnWexZKYKSi211SgGLM', 'input': {'order_id': 'O1'}, 'name': 'cancel_order', 'type': 'tool_use'}, {'id': 'toolu_01HhVbewBrXzyGSfeWMeTEEt', 'input': {'order_id': 'O2'}, 'name': 'cancel_order', 'type': 'tool_use'}]` - model: `claude-sonnet-4-5-20250929` - role: `assistant` - stop_reason: `tool_use` - stop_sequence: `None` - type: `message` - usage: `{'cache_creation': {'ephemeral_1h_input_tokens': 0, 'ephemeral_5m_input_tokens': 0}, 'cache_creation_input_tokens': 0, 'cache_read_input_tokens': 0, 'input_tokens': 981, 'output_tokens': 154, 'server_tool_use': None, 'service_tier': 'standard'}`
{'role': 'user', 'content': [{'type': 'tool_result', 'tool_use_id': 'toolu_01DheUnWexZKYKSi211SgGLM', 'content': 'True'}, {'type': 'tool_result', 'tool_use_id': 'toolu_01HhVbewBrXzyGSfeWMeTEEt', 'content': 'True'}]} Perfect! I've successfully cancelled all orders for customer C1 (John Doe): - ✓ Order O1 (Widget A) - Cancelled - ✓ Order O2 (Gadget B) - Cancelled Both orders have been cancelled successfully. Perfect! I’ve successfully cancelled all orders for customer C1 (John Doe): - ✓ Order O1 (Widget A) - Cancelled - ✓ Order O2 (Gadget B) - Cancelled Both orders have been cancelled successfully.
- id: `msg_01NgJQktChmU3RrU4xUpXTKq` - content: `[{'citations': None, 'text': "Perfect! I've successfully cancelled all orders for customer C1 (John Doe):\n- ✓ Order O1 (Widget A) - Cancelled\n- ✓ Order O2 (Gadget B) - Cancelled\n\nBoth orders have been cancelled successfully.", 'type': 'text'}]` - model: `claude-sonnet-4-5-20250929` - role: `assistant` - stop_reason: `end_turn` - stop_sequence: `None` - type: `message` - usage: `{'cache_creation': {'ephemeral_1h_input_tokens': 0, 'ephemeral_5m_input_tokens': 0}, 'cache_creation_input_tokens': 0, 'cache_read_input_tokens': 0, 'input_tokens': 1200, 'output_tokens': 65, 'server_tool_use': None, 'service_tier': 'standard'}`
## Async tool loop

------------------------------------------------------------------------

source

### AsyncChat.toolloop

``` python
def toolloop(
    pr, # Prompt to pass to Claude
    max_steps:int=10, # Maximum number of tool requests to loop through
    cont_func:callable=noop, # Function that stops loop if returns False
    final_prompt:str='You have no more tool uses. Please summarize your findings. If you did not complete your goal please tell the user what further work needs to be done so they can choose how best to proceed.', # Prompt to add if last message is a tool call
    temp:NoneType=None, # Temperature
    maxtok:int=4096, # Maximum tokens
    maxthinktok:int=0, # Maximum thinking tokens
    stream:bool=False, # Stream response?
    prefill:str='', # Optional prefill to pass to Claude as start of its response
    tool_choice:Union=None, # Optionally force use of some tool
):
```

*Add prompt `pr` to dialog and get a response from Claude, automatically following up with `tool_use` messages*
Exported source

``` python
@patch
@delegates(AsyncChat.__call__)
def toolloop(self: AsyncChat,
             pr, # Prompt to pass to Claude
             max_steps=10, # Maximum number of tool requests to loop through
             cont_func: callable = noop, # Function that stops loop if returns False
             final_prompt = _final_prompt, # Prompt to add if last message is a tool call
             **kwargs):
    "Add prompt `pr` to dialog and get a response from Claude, automatically following up with `tool_use` messages"
    @save_iter
    async def _f(o):
        init_n = len(self.h)
        r = await self(pr, **kwargs)
        yield r
        if len(self.last)>1: yield self.last[1]
        for i in range(max_steps-1):
            if self.c.stop_reason != 'tool_use': break
            r = await self(final_prompt if i==max_steps-2 else None, **kwargs)
            yield r
            if len(self.last)>1: yield self.last[1]
            if not cont_func(*self.h[-3:]): break
        o.value = self.h[init_n+1:]
    return _f()
```
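The async variant has exactly the same shape as the sync one — the only change is `await`ing the model call inside an async generator. Extending the earlier stand-alone sketch (stubbed model, illustrative names only):

``` python
import asyncio

async def arun_tool_loop(model_call, tools, prompt, max_steps=10):
    "Async version of the minimal tool loop: await the model, run tools, repeat."
    history = [{'role': 'user', 'content': prompt}]
    for _ in range(max_steps):
        reply = await model_call(history)
        history.append({'role': 'assistant', 'content': reply})
        if reply.get('stop_reason') != 'tool_use': return history
        results = [{'tool': t['name'], 'result': tools[t['name']](**t['input'])}
                   for t in reply['tool_calls']]
        history.append({'role': 'user', 'content': results})
    return history

async def fake_model(history):
    "Stubbed async model: one tool request, then a final answer."
    if len(history) == 1:
        return {'stop_reason': 'tool_use',
                'tool_calls': [{'name': 'add', 'input': {'a': 2, 'b': 3}}]}
    return {'stop_reason': 'end_turn', 'text': 'The sum is 5.'}

h = asyncio.run(arun_tool_loop(fake_model, {'add': lambda a, b: a + b}, 'What is 2+3?'))
print(h[-1]['content']['text'])  # → The sum is 5.
```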
``` python
orders, customers = _get_orders_customers()
```

``` python
tools = [get_customer_info, get_order_details, cancel_order]
chat = AsyncChat(model, tools=tools)
r = chat.toolloop('Can you tell me the email address for customer C1?')
async for o in r: print(o)
```

    - Retrieving customer C1
    Message(id='msg_01F1ruk8y7TsTrhpWTkBc67e', content=[ToolUseBlock(id='toolu_01LJ2mkQDqRdToAFHbCosv26', input={'customer_id': 'C1'}, name='get_customer_info', type='tool_use')], model='claude-sonnet-4-5-20250929', role='assistant', stop_reason='tool_use', stop_sequence=None, type='message', usage=In: 757; Out: 58; Cache create: 0; Cache read: 0; Total Tokens: 815; Search: 0)
    {'role': 'user', 'content': [{'type': 'tool_result', 'tool_use_id': 'toolu_01LJ2mkQDqRdToAFHbCosv26', 'content': "{'name': 'John Doe', 'email': 'john@example.com', 'phone': '123-456-7890', 'orders': [{'id': 'O1', 'product': 'Widget A', 'quantity': 2, 'price': 19.99, 'status': 'Shipped'}, {'id': 'O2', 'product': 'Gadget B', 'quantity': 1, 'price': 49.99, 'status': 'Processing'}]}"}]}
    Message(id='msg_019Z6rMioA6RExGgXKGnusiM', content=[TextBlock(citations=None, text='The email address for customer C1 (John Doe) is **john@example.com**.', type='text')], model='claude-sonnet-4-5-20250929', role='assistant', stop_reason='end_turn', stop_sequence=None, type='message', usage=In: 953; Out: 24; Cache create: 0; Cache read: 0; Total Tokens: 977; Search: 0)

``` python
pprint(r.value)
```

    [{'content': [{'id': 'toolu_01LJ2mkQDqRdToAFHbCosv26', 'input': {'customer_id': 'C1'}, 'name': 'get_customer_info', 'type': 'tool_use'}],
      'role': 'assistant'},
     {'content': [{'content': "{'name': 'John Doe', 'email': 'john@example.com', "
                              "'phone': '123-456-7890', 'orders': [{'id': 'O1', "
                              "'product': 'Widget A', 'quantity': 2, 'price': "
                              "19.99, 'status': 'Shipped'}, {'id': 'O2', "
                              "'product': 'Gadget B', 'quantity': 1, 'price': "
                              "49.99, 'status': 'Processing'}]}",
                   'tool_use_id': 'toolu_01LJ2mkQDqRdToAFHbCosv26',
                   'type': 
'tool_result'}],
      'role': 'user'},
     {'content': [{'citations': {}, 'text': 'The email address for customer C1 (John Doe) is '
                           '**john@example.com**.',
                   'type': 'text'}],
      'role': 'assistant'}]

## Code interpreter

Here is an example of using `toolloop` to implement a simple code interpreter with additional tools.

``` python
from toolslm.shell import get_shell
from fastcore.meta import delegates
import traceback
```

``` python
@delegates()
class CodeChat(Chat):
    imps = 'os, warnings, time, json, re, math, collections, itertools, functools, dateutil, datetime, string, types, copy, pprint, enum, numbers, decimal, fractions, random, operator, typing, dataclasses'

    def __init__(self, model: Optional[str] = None, ask:bool=True, **kwargs):
        super().__init__(model=model, **kwargs)
        self.ask = ask
        self.tools.append(self.run_cell)
        self.shell = get_shell()
        self.shell.run_cell('import '+self.imps)
```

We have one additional parameter when creating a `CodeChat` beyond what we pass to [`Chat`](https://claudette.answer.ai/core.html#chat), which is `ask` – if that's `True`, we'll prompt the user before running code.

``` python
@patch
def run_cell(
    self:CodeChat,
    code:str,  # Code to execute in persistent IPython session
)->str:
    """Asks user for permission, and if provided, executes python `code` using persistent IPython session.
    Returns: Result of expression on last line (if exists); '#DECLINED#' if user declines request to execute"""
    confirm = f'Press Enter to execute, or enter "n" to skip?\n```\n{code}\n```\n'
    if self.ask and input(confirm): return '#DECLINED#'
    try: res = self.shell.run_cell(code)
    except Exception as e: return traceback.format_exc()
    return res.stdout if res.result is None else res.result
```

We just pass along requests to run code to the shell's implementation. Claude often prints results instead of just using the last expression, so we capture stdout in those cases.

``` python
sp = f'''You are a knowledgeable assistant. Do not use tools unless needed.
Don't do complex calculations yourself -- use code for them.
The following modules are pre-imported for `run_cell` automatically:

{CodeChat.imps}

Never mention what tools you are using. Note that `run_cell` interpreter state is *persistent* across calls.

If a tool returns `#DECLINED#` report to the user that the attempt was declined and no further progress can be made. In that case, do *not* attempt to run any further code -- stop execution *IMMEDIATELY* and tell the user it was declined.

When using a tool, *ALWAYS* before every use of every tool, tell the user what you will be doing and why.'''
```

``` python
def get_user()->str:
    "Get the username of the user running this session"
    print("Looking up username")
    return 'Jeremy'
```

In order to test out multi-stage tool use, we create a mock function that Claude can call to get the current username.

``` python
model = models[1]
chat = CodeChat(model, tools=[get_user], sp=sp, ask=True, temp=0.3)
```

`toolloop`'s `cont_func` callable lets us provide a function which, if it returns `False`, stops the loop:

``` python
def _cont_decline(call, resp, asst):
    return resp['content'][0].get('content') != '#DECLINED#'
```

Now we can try our code interpreter. We start by asking for a function to be created, which we'll use in the next prompt to test that the interpreter is persistent.

``` python
pr = '''Create a 1-line function `checksum` for a string `s`,
that multiplies together the ascii values of each character in `s` using `reduce`.'''
for o in chat.toolloop(pr, cont_func=_cont_decline): display(o)
```

I'll create a 1-line `checksum` function that uses `reduce` to multiply the ASCII values of each character in a string.
- id: `msg_01NfL3YNWTZ2Chst88nr3iPu` - content: `[{'citations': None, 'text': "I'll create a 1-line`checksum`function that uses`reduce`to multiply the ASCII values of each character in a string.", 'type': 'text'}, {'id': 'toolu_01GgTqM917BzHJ4MjpwZFwTF', 'input': {'code': '\nchecksum = lambda s: functools.reduce(operator.mul, (ord(c) for c in s), 1)\n\n# Test it\nprint(checksum("abc"))\nprint(checksum("hello"))\nprint(checksum("A"))\n'}, 'name': 'run_cell', 'type': 'tool_use'}]` - model: `claude-sonnet-4-5-20250929` - role: `assistant` - stop_reason: `tool_use` - stop_sequence: `None` - type: `message` - usage: `{'cache_creation': {'ephemeral_1h_input_tokens': 0, 'ephemeral_5m_input_tokens': 0}, 'cache_creation_input_tokens': 0, 'cache_read_input_tokens': 0, 'input_tokens': 927, 'output_tokens': 141, 'server_tool_use': None, 'service_tier': 'standard'}`
``` python { 'content': [ { 'content': '941094\n13599570816\n65\n', 'tool_use_id': 'toolu_01GgTqM917BzHJ4MjpwZFwTF', 'type': 'tool_result'}], 'role': 'user'} ``` Perfect! I’ve created the `checksum` function as a one-liner that: - Uses `functools.reduce` with `operator.mul` to multiply values together - Converts each character to its ASCII value using `ord(c)` - Uses a generator expression `(ord(c) for c in s)` to iterate through the string - Starts with an initial value of `1` (the identity for multiplication) The test shows it working correctly: - `"abc"` → 97 × 98 × 99 = 941,094 - `"hello"` → 104 × 101 × 108 × 108 × 111 = 13,599,570,816 - `"A"` → 65
- id: `msg_01L1Et5ywM28AckqGct1Ce4A` - content: `[{'citations': None, 'text': 'Perfect! I\'ve created the`checksum`function as a one-liner that:\n- Uses`functools.reduce`with`operator.mul`to multiply values together\n- Converts each character to its ASCII value using`ord(c)`\n- Uses a generator expression`(ord(c) for c in s)`to iterate through the string\n- Starts with an initial value of`1`(the identity for multiplication)\n\nThe test shows it working correctly:\n-`“abc”`→ 97 × 98 × 99 = 941,094\n-`“hello”`→ 104 × 101 × 108 × 108 × 111 = 13,599,570,816\n-`“A”`→ 65', 'type': 'text'}]` - model: `claude-sonnet-4-5-20250929` - role: `assistant` - stop_reason: `end_turn` - stop_sequence: `None` - type: `message` - usage: `{'cache_creation': {'ephemeral_1h_input_tokens': 0, 'ephemeral_5m_input_tokens': 0}, 'cache_creation_input_tokens': 0, 'cache_read_input_tokens': 0, 'input_tokens': 1090, 'output_tokens': 175, 'server_tool_use': None, 'service_tier': 'standard'}`
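The one-liner Claude wrote is easy to reproduce outside the sandboxed session. Here's a standalone sketch (the same `functools.reduce`/`operator.mul` approach, redefined here rather than taken from the interpreter's state) confirming the test values it reported:

``` python
import functools, operator

# Multiply together the ASCII values of each character in s,
# starting from 1 (the multiplicative identity)
checksum = lambda s: functools.reduce(operator.mul, (ord(c) for c in s), 1)

print(checksum("abc"))    # 97 * 98 * 99 = 941094
print(checksum("hello"))  # 13599570816
print(checksum("A"))      # 65
```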
By asking for a calculation to be done on the username, we force it to use multiple steps: ``` python pr = 'Use it to get the checksum of the username of this session.' for o in chat.toolloop(pr): display(o) ``` Looking up username I’ll get the username of this session and then calculate its checksum.
- id: `msg_01HrZAji8SwDbRhbb79cUZHo` - content: `[{'citations': None, 'text': "I'll get the username of this session and then calculate its checksum.", 'type': 'text'}, {'id': 'toolu_01MzJHgGgQdJ5Pc9e5s35ddF', 'input': {}, 'name': 'get_user', 'type': 'tool_use'}]` - model: `claude-sonnet-4-5-20250929` - role: `assistant` - stop_reason: `tool_use` - stop_sequence: `None` - type: `message` - usage: `{'cache_creation': {'ephemeral_1h_input_tokens': 0, 'ephemeral_5m_input_tokens': 0}, 'cache_creation_input_tokens': 0, 'cache_read_input_tokens': 0, 'input_tokens': 1282, 'output_tokens': 52, 'server_tool_use': None, 'service_tier': 'standard'}`
``` python { 'content': [ { 'content': 'Jeremy', 'tool_use_id': 'toolu_01MzJHgGgQdJ5Pc9e5s35ddF', 'type': 'tool_result'}], 'role': 'user'} ``` Now I’ll calculate the checksum for “Jeremy”:
- id: `msg_01Fdq1ViEfUg6kzGPrYeeVa6` - content: `[{'citations': None, 'text': 'Now I\'ll calculate the checksum for "Jeremy":', 'type': 'text'}, {'id': 'toolu_016XwiSjzxo2sgJTiZw2gFH8', 'input': {'code': '\nusername = "Jeremy"\nresult = checksum(username)\nprint(f"Username: {username}")\nprint(f"Checksum: {result}")\n'}, 'name': 'run_cell', 'type': 'tool_use'}]` - model: `claude-sonnet-4-5-20250929` - role: `assistant` - stop_reason: `tool_use` - stop_sequence: `None` - type: `message` - usage: `{'cache_creation': {'ephemeral_1h_input_tokens': 0, 'ephemeral_5m_input_tokens': 0}, 'cache_creation_input_tokens': 0, 'cache_read_input_tokens': 0, 'input_tokens': 1347, 'output_tokens': 101, 'server_tool_use': None, 'service_tier': 'standard'}`
``` python { 'content': [ { 'content': 'Username: Jeremy\nChecksum: 1134987783204\n', 'tool_use_id': 'toolu_016XwiSjzxo2sgJTiZw2gFH8', 'type': 'tool_result'}], 'role': 'user'} ``` The checksum of the username “Jeremy” is **1,134,987,783,204**. This is calculated by multiplying the ASCII values: 74 × 101 × 114 × 101 × 109 × 121 = 1,134,987,783,204
- id: `msg_01CfuVuK1hbYq6o61cv7ddp8` - content: `[{'citations': None, 'text': 'The checksum of the username "Jeremy" is **1,134,987,783,204**.\n\nThis is calculated by multiplying the ASCII values: 74 × 101 × 114 × 101 × 109 × 121 = 1,134,987,783,204', 'type': 'text'}]` - model: `claude-sonnet-4-5-20250929` - role: `assistant` - stop_reason: `end_turn` - stop_sequence: `None` - type: `message` - usage: `{'cache_creation': {'ephemeral_1h_input_tokens': 0, 'ephemeral_5m_input_tokens': 0}, 'cache_creation_input_tokens': 0, 'cache_read_input_tokens': 0, 'input_tokens': 1475, 'output_tokens': 69, 'server_tool_use': None, 'service_tier': 'standard'}`
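As a quick sanity check on that final answer, the product of the ASCII codes of `"Jeremy"` can be computed directly with `math.prod` (a cross-check independent of the `checksum` definition, not part of the tool loop itself):

``` python
import math

# 74 * 101 * 114 * 101 * 109 * 121
print(math.prod(ord(c) for c in "Jeremy"))  # 1134987783204
```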