Please open a proper issue in the corresponding GitHub repo (is it a huggingface_hub issue? a Gradio one? etc.) with more details, and especially a reproducible example of the issue you are having. If it is an HTTP request issue, try to isolate it first (the more details you can provide, the better it is for the maintainers).
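For example, here is a rough sketch of isolating an HTTP issue outside of your app code (the model ID, prompt, and HF_TOKEN environment variable are placeholders, and the classic serverless endpoint shown may not match the exact route your app uses):

import os
import requests

# Hypothetical standalone check: call the serverless endpoint directly,
# outside of Gradio/Spaces, to see whether the HTTP request itself fails.
MODEL_ID = "black-forest-labs/FLUX.1-schnell"  # placeholder model
url = f"https://api-inference.huggingface.co/models/{MODEL_ID}"
headers = {"Authorization": f"Bearer {os.environ['HF_TOKEN']}"}

response = requests.post(url, headers=headers, json={"inputs": "A majestic lion"})
print(response.status_code)                   # e.g. 200, 404 (model not served), 503 (loading)
print(response.headers.get("x-request-id"))   # if present, include it when reporting the issue
if not response.ok:
    print(response.text[:500])                # error body is usually a short JSON message

If this standalone request works but your app does not, the problem is in the app layer rather than in the API itself.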
My Spaces use models that, for some reason, are reported as unavailable through any of the inference providers, despite working fine two months ago.
Hugging Face's serverless Inference API was never a production-ready service; it was only meant for easy experimentation and prototyping of ML apps. We started rolling out Inference Providers to address this and make things more future-proof. Regarding your specific problem, it's hard to help without knowing which models you were using back then. Most likely, those models have been removed from the HF Inference API infrastructure, as we are now focusing on making fewer but high-impact models available. To migrate, install the latest huggingface_hub and call the model through an Inference Provider, for example:
pip install -U "huggingface_hub[hf_xet]"
from huggingface_hub import InferenceClient
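# provider selects which Inference Provider serves the request;
# bill_to charges the usage to an organization instead of your personal account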
client = InferenceClient(provider="fal-ai", bill_to="my-cool-company")
image = client.text_to_image(
    "A majestic lion in a fantasy forest",
    model="black-forest-labs/FLUX.1-schnell",
)
image.save("lion.png")
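If you want to check whether a given model is still served anywhere, one rough way (a sketch only; the provider names and model ID below are examples, and each successful call performs, and may bill for, a real generation) is to attempt a minimal request per provider:

from huggingface_hub import InferenceClient

MODEL_ID = "black-forest-labs/FLUX.1-schnell"  # example model
# Example provider names; check the Inference Providers docs for the current list.
PROVIDERS = ["hf-inference", "fal-ai", "replicate", "together"]

for provider in PROVIDERS:
    client = InferenceClient(provider=provider)
    try:
        # A real (tiny) request: if it succeeds, this provider serves the model.
        client.text_to_image("test prompt", model=MODEL_ID)
        print(f"{provider}: OK")
    except Exception as err:  # typically an HTTP error if the model isn't deployed there
        print(f"{provider}: unavailable ({err})")

In recent versions of huggingface_hub you can also pass provider="auto" (the default) and let the client pick an available provider for the model.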