Managing Prototyping Context Window - Pruning Conversation History

Does anyone know of a way to prune the conversation history? I keep hitting the same error, which leads me to believe I need to remove content from the context window before I can proceed. Failing that, is there a way to start a new chat?

[GoogleGenerativeAI Error]: Error fetching from https://monospace-pa.googleapis.com/v1/models/gemini-2.5-pro-preview-03-25:streamGenerateContent?alt=sse: [400 Bad Request] Request contains text fields that are too large.
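To illustrate what I mean by pruning, something along these lines is what I'm after. This is only a rough sketch based on my own assumptions: the `Turn` shape mirrors the `{ role, parts }` chat format used by the `@google/generative-ai` SDK, and `estimateTokens` is a crude characters-per-token placeholder rather than a real tokenizer, so treat the names and numbers as illustrative only.

```ts
// Sketch: drop the oldest turns until an estimated token count fits under a
// budget, keeping the most recent context intact. (Assumed shapes/heuristics,
// not the prototyper's actual internals.)

interface Turn {
  role: "user" | "model";
  parts: { text: string }[];
}

// Crude heuristic: roughly 4 characters per token. A real token count from
// the model client would be more accurate if available.
function estimateTokens(turn: Turn): number {
  const chars = turn.parts.reduce((n, p) => n + p.text.length, 0);
  return Math.ceil(chars / 4);
}

// Keep the newest turns whose combined estimate fits under maxTokens.
function pruneHistory(history: Turn[], maxTokens: number): Turn[] {
  const kept: Turn[] = [];
  let total = 0;
  // Walk backwards from the newest turn so recent context survives.
  for (let i = history.length - 1; i >= 0; i--) {
    const cost = estimateTokens(history[i]);
    if (total + cost > maxTokens) break;
    kept.unshift(history[i]);
    total += cost;
  }
  return kept;
}

// Tiny usage example with a small budget; in practice the budget would sit
// comfortably under the 1,048,575-token limit from the error message.
const sample: Turn[] = [
  { role: "user", parts: [{ text: "First prompt in a long session..." }] },
  { role: "model", parts: [{ text: "A very long generated answer..." }] },
  { role: "user", parts: [{ text: "Latest question that must be kept" }] },
];

console.log(pruneHistory(sample, 50));
```

Ideally, though, I'd like a supported way to do this from the UI rather than managing the history myself.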

Hi, we’re actively working on a fix for this; it is coming soon. In the meantime, there is a thread that may help: "Solved: [GoogleGenerativeAI Error]: The input token count (xxxxx) exceeds the maximum number of tokens allowed (1048575). Unofficial Solution".

Thanks, that’s great to hear! Looking forward to it! Unfortunately, another method I tried in the meantime has left my workspace unable to start up, so I'll go search for the relevant thread now.