May 27, 2024

One of the catchphrases of the famous computer game BioShock is “would you kindly”. It’s only near the end of the game that we learn that the protagonist is compelled to respond to this phrase and act accordingly. Presumably, omitting the phrase would have had unpleasant consequences for the game’s antagonists.

I was reminded of this as I was playing with the “behind-the-scenes” setup instructions that I have for the language models GPT and Claude at my site wispl.com. The models are instructed on how to use tools, specifically Google (for searches) and Maxima (for computer algebra). I was perplexed as to why both models tended to overuse Google even when the conversation began with a question or request that should have required no searches at all.

The relevant part of the instructions sent to the chatbot at the beginning of a conversation used to read as follows:

If your answer requires the most recent information or current events, respond solely with CSEARCH(query) with no additional text. For general queries or fact-checking that is not time-sensitive, respond solely with GSEARCH(query) and no additional text.

In a moment of inspiration, however, I changed this to:

If your answer requires the most recent information or current events, respond solely with CSEARCH(query) with no additional text. If your answer requires general queries or fact-checking that is not time-sensitive, respond solely with GSEARCH(query) and no additional text.

Can you spot the tiny difference? All I did was to repeat the “If your answer requires” bit.

Problem (apparently) solved. The chatbot no longer appears to run Google queries when it doesn’t really need them. I just needed to make sure that the magic phrase explicitly accompanies each request. Much like “would you kindly” in the world of BioShock.
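For the curious, here is a rough sketch, in Python, of what the receiving end of this protocol might look like. To be clear, this is not the actual wispl.com code: the names DIRECTIVE, dispatch and run_search are hypothetical, and run_search is a mere stand-in for real search API calls. The point it illustrates is that because the model is told to respond solely with the directive and no additional text, a simple pattern match on its reply is enough to decide whether a search is needed.

import re

# Match a reply that consists solely of CSEARCH(query) or GSEARCH(query).
DIRECTIVE = re.compile(r"^(CSEARCH|GSEARCH)\((.*)\)\s*$", re.DOTALL)

def run_search(query: str, time_sensitive: bool) -> str:
    # Placeholder: a real implementation would call a search API here.
    kind = "current-events" if time_sensitive else "general"
    return f"[{kind} search results for: {query}]"

def dispatch(reply: str) -> str | None:
    """Return search results if the reply is a tool directive, else None."""
    m = DIRECTIVE.match(reply.strip())
    if m is None:
        return None  # an ordinary answer; show it to the user as-is
    tool, query = m.group(1), m.group(2)
    return run_search(query, time_sensitive=(tool == "CSEARCH"))

# Example: the model's entire reply is just the directive.
print(dispatch("GSEARCH(capital of France)"))       # search results
print(dispatch("The capital of France is Paris."))  # None: plain answer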

Posted at 6:56 pm