llm_topic_modelling / tools / helper_functions.py

Commit History

Upgraded Gradio. Made the app more resilient to cases where LLM calls do not return valid Markdown tables (it will now retry with a different temperature). Minor fixes.
b9301bd

seanpedrickcase committed on

Allowed manual modification of the output topic table. Fixed issues with deduplication and Excel file input. Allowed a General topic to be specified in zero-shot topics.
75d1651

seanpedrickcase committed on

Changed default requirements to the CPU version of llama.cpp. Added Gemini 2.0 Flash to the model list. Output files should now contain only the final files.
b0e08c8

seanpedrickcase committed on

Added presentation of summary table outputs
cc6683a

seanpedrickcase committed on

Added support for using local models (specifically Gemma 2B) for topic extraction and summarisation. Generally improved output format safeguards.
b7f4700

seanpedrickcase committed on

Added more guidance to the README. Variables are now wiped when the create topics or summarise topics buttons are clicked.
f8f34c2

seanpedrickcase committed on