Hugging Face Space: seanpedrickcase/llm_topic_modelling (status: Running)
Path: main / llm_topic_modelling / tools
3 contributors · History: 23 commits
Latest commit by seanpedrickcase (b9301bd, about 2 months ago): "Upgraded Gradio. More resilient to cases where LLM calls do not return valid markdown tables (will reattempt with different temperature). Minor fixes"
All files scanned: Safe.

| File | Size | Last commit message | Updated |
| --- | --- | --- | --- |
| `__init__.py` | 0 Bytes | First commit | 5 months ago |
| `auth.py` | 1.54 kB | Allowed for server port, queue size, and file size to be specified by environment variables | 5 months ago |
| `aws_functions.py` | 7.26 kB | Corrected line in upload_file_to_s3 function that was causing issues | 5 months ago |
| `chatfuncs.py` | 8.14 kB | Topic deduplication/merging now separated from summarisation. Gradio upgrade | 3 months ago |
| `helper_functions.py` | 15 kB | Upgraded Gradio. More resilient to cases where LLM calls do not return valid markdown tables (will reattempt with different temperature). Minor fixes | about 2 months ago |
| `llm_api_call.py` | 130 kB | Upgraded Gradio. More resilient to cases where LLM calls do not return valid markdown tables (will reattempt with different temperature). Minor fixes | about 2 months ago |
| `prompts.py` | 5.65 kB | Improved zero shot 'forced' categorisation and prompts | about 2 months ago |