⚡️ Ideas
Ken Blackman · Oct 22, 2023

Freely converse with large content (without hacking around the AI context window limitation)

Explain the problem as you see it

One of the most important things I do with my content is interact with it using AI (ask questions of it, converse with it, summarize, evaluate, etc.).

The problem in Tana is the AI's small context window.

I want to interact with any node and all its descendants, of any size.

I've seen the video that shows how to hack around the small AI context window for a specific use case in Tana. But it's pretty fragile. IMO this is a universal enough need that Tana should have some kind of built-in support. I'd love for it to feel seamless, the way it does in other AI tools.

Why is this a problem for you?

Currently I'm exporting large content to other AI tools such as Elephas just to interact with it in this way, which is inconvenient. Then I convert the relevant results and transfer them back to Tana.

Whenever I update or change my content in Tana... new export, start over.

Suggest a solution

One possible solution is to implement MemGPT, which simulates an infinite context window using techniques originally developed for memory swapping in operating systems.

This would also allow for, say, an ongoing dialog of any length, without ever forgetting what was said earlier. That would be lovely!
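To give a rough sense of the idea (this is just an illustrative sketch, not Tana's or MemGPT's actual implementation), the core trick is context paging: keep a token-budgeted working context, evict older material to an archive when it overflows, and pull relevant pieces back in before answering. In the Python below, llm_complete and count_tokens are hypothetical placeholders for whatever model and tokenizer are actually in use.

```python
# Illustrative sketch of MemGPT-style context paging (not a real API).
from dataclasses import dataclass, field

CONTEXT_BUDGET = 4000  # tokens the model can see at once (assumed)

def count_tokens(text: str) -> int:
    # Crude stand-in for a real tokenizer.
    return len(text.split())

def llm_complete(prompt: str) -> str:
    # Placeholder for a call to the underlying chat model.
    raise NotImplementedError

@dataclass
class PagedContext:
    working: list[str] = field(default_factory=list)  # what the model sees
    archive: list[str] = field(default_factory=list)  # evicted "pages"

    def add(self, message: str) -> None:
        self.working.append(message)
        # Evict oldest messages to the archive when over budget,
        # the way an OS pages cold memory out to disk.
        while sum(count_tokens(m) for m in self.working) > CONTEXT_BUDGET:
            self.archive.append(self.working.pop(0))

    def recall(self, query: str, k: int = 3) -> list[str]:
        # Naive keyword retrieval; a real system would use embeddings.
        terms = set(query.lower().split())
        scored = sorted(self.archive,
                        key=lambda m: len(terms & set(m.lower().split())),
                        reverse=True)
        return scored[:k]

    def ask(self, question: str) -> str:
        # Page relevant archived material back in, then query the model.
        recalled = self.recall(question)
        prompt = "\n".join(recalled + self.working + [question])
        return llm_complete(prompt)
```

With something like this under the hood, a long dialog over a big node tree keeps working: older exchanges get paged out of the model's view, but the relevant ones come back whenever they matter.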

This would let me do in Tana what I currently use several other apps for, and save me a ton of time and unnecessary thrash (transferring content in and out, format conversion, and so on).