# Qwen2.5-Omni-7B-Demo / requirements.txt
transformers @ git+https://github.com/huggingface/transformers@3a1ead0aabed473eafe527915eea8c197d424356
qwen-omni-utils[decord]
soundfile
torch==2.2.*  # pinned to match the flash-attn wheel below, which is built against torch 2.2
gradio
torchvision
torchaudio
accelerate
# flash-attn: installed from a prebuilt wheel (CUDA 12, torch 2.2, Python 3.12 / cp312, Linux x86_64)
https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.4.post1/flash_attn-2.7.4.post1+cu12torch2.2cxx11abiTRUE-cp312-cp312-linux_x86_64.whl