Upgraded Gradio. Improved resilience to cases where LLM calls do not return valid markdown tables (the call is now retried with a different temperature). Minor fixes b9301bd seanpedrickcase committed on Mar 12
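A minimal sketch of the retry pattern described in this commit, assuming the app wraps its LLM call in a helper and varies the temperature on each attempt; the function names, temperature schedule, and table check below are illustrative assumptions, not the app's actual code.

```python
import re
from typing import Callable, Sequence

def is_valid_markdown_table(text: str) -> bool:
    """Loose check for a markdown table: a header row containing pipes
    followed by a separator row made of pipes, dashes, colons and spaces."""
    lines = [line.strip() for line in text.splitlines() if line.strip()]
    for i in range(len(lines) - 1):
        if "|" in lines[i] and re.fullmatch(r"\|?[\s:|-]+\|?", lines[i + 1]):
            return True
    return False

def call_llm_with_retries(
    call_llm: Callable[[str, float], str],   # assumed LLM wrapper: (prompt, temperature) -> text
    prompt: str,
    temperatures: Sequence[float] = (0.1, 0.5, 0.9),  # assumed retry schedule
) -> str:
    """Re-ask the model at a different temperature until the response
    parses as a markdown table, returning the last response otherwise."""
    last_response = ""
    for temp in temperatures:
        last_response = call_llm(prompt, temp)
        if is_valid_markdown_table(last_response):
            return last_response
    # Fall back to the last response so downstream code can surface the failure.
    return last_response
```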
Allowed manual modification of the output topic table. Fixed issues with deduplication and Excel file input. Allowed specification of a General topic in zero-shot topics 75d1651 seanpedrickcase committed on Mar 11
Changed default requirements to the CPU version of llama cpp. Added Gemini 2.0 Flash to the model list. Output files now contain only the final files. b0e08c8 seanpedrickcase committed on Mar 3
Added support for using local models (specifically Gemma 2b) for topic extraction and summarisation. Generally improved output format safeguards. b7f4700 seanpedrickcase committed on Dec 10, 2024
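A rough sketch of how a local Gemma 2b model could be loaded for topic extraction via llama-cpp-python (consistent with the CPU llama cpp requirement noted above); the GGUF file path, context size, thread count, and prompt are assumptions, not the app's actual configuration.

```python
from llama_cpp import Llama

# Assumed local GGUF file; download separately and adjust the path as needed.
llm = Llama(
    model_path="models/gemma-2b-it.Q4_K_M.gguf",
    n_ctx=4096,    # context window
    n_threads=8,   # CPU threads, matching the CPU-only requirements
)

example_responses = [
    "The park needs more benches.",
    "Better lighting would make the park feel safer at night.",
]

response = llm.create_chat_completion(
    messages=[
        {
            "role": "user",
            "content": "Summarise the following responses into a markdown "
                       "table of topics:\n" + "\n".join(example_responses),
        },
    ],
    temperature=0.1,
    max_tokens=1024,
)
print(response["choices"][0]["message"]["content"])
```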
Added more guidance to the README. Variables are now wiped when the create or summarise topics buttons are clicked. f8f34c2 seanpedrickcase committed on Dec 4, 2024