gradio.load(···)
Certain high-level attributes of a loaded Space (e.g. its custom `css`, `js`, and `head` attributes) will not be loaded.

import gradio as gr
demo = gr.load("gradio/question-answering", src="spaces")
demo.launch()
name: str
the name of the model (e.g. "google/vit-base-patch16-224") or Space (e.g. "flax-community/spanish-gpt2"). This is the first parameter passed into the `src` function. Can also be formatted as {src}/{repo name} (e.g. "models/google/vit-base-patch16-224") if `src` is not provided.
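For instance, these two calls are equivalent ways to load the model repo mentioned above (a minimal sketch; the model name is only an example):

import gradio as gr

# Explicit src:
demo = gr.load("google/vit-base-patch16-224", src="models")
# Or encode the source in the name itself and omit src:
demo = gr.load("models/google/vit-base-patch16-224")
demo.launch()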
src: Callable[[str, str | None], Blocks] | Literal['models', 'spaces'] | None
= None
function that accepts a string model `name` and a string or None `token` and returns a Gradio app. Alternatively, this parameter takes one of two strings for convenience: "models" (for loading a Hugging Face model through the Inference API) or "spaces" (for loading a Hugging Face Space). If None, uses the prefix of the `name` parameter to determine `src`.
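As a rough sketch of the callable form (the loader below is hypothetical and just builds a trivial echo app), `gr.load` passes the `name` and `token` arguments straight through to the function:

import gradio as gr

def my_loader(name: str, token: str | None) -> gr.Blocks:
    # Hypothetical loader: ignores the token and builds a trivial echo app.
    with gr.Blocks() as demo:
        gr.Markdown(f"Loaded: {name}")
        inp = gr.Textbox(label="Input")
        out = gr.Textbox(label="Output")
        inp.submit(lambda x: x, inp, out)
    return demo

demo = gr.load("my-model", src=my_loader)
demo.launch()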
token: str | None
= None
optional token that is passed as the second parameter to the `src` function. For Hugging Face repos, uses the local HF token when loading models but not Spaces (when loading Spaces, only provide a token if you are loading a trusted private Space as the token can be read by the Space you are loading). Find HF tokens here: https://huggingface.co/settings/tokens.
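For example, to load a private model repo with an explicit token (the repo name and token below are placeholders):

import gradio as gr

# Placeholder repo and token; in practice, read the token from a secure source.
demo = gr.load("your-username/private-model", src="models", token="hf_xxx")
demo.launch()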
hf_token: str | None
= None
accept_token: bool
= False
if True, a Textbox component is first rendered to allow the user to provide a token, which will be used instead of the `token` parameter when calling the loaded model or Space.
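A minimal sketch: with `accept_token=True`, the app asks the end user for a token at runtime instead of using one baked into the code (the model name is only an example):

import gradio as gr

# The rendered Textbox collects the user's token before the model is called.
demo = gr.load("models/google/vit-base-patch16-224", accept_token=True)
demo.launch()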
kwargs
additional keyword parameters to pass into the `src` function. If `src` is "models" or "spaces", these parameters are passed into the `gr.Interface` or `gr.ChatInterface` constructor.
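For example, extra keyword arguments such as `title` and `description` (standard `gr.Interface` arguments) are forwarded to the generated interface; a sketch:

import gradio as gr

# `title` and `description` are passed through to the gr.Interface constructor.
demo = gr.load(
    "models/google/vit-base-patch16-224",
    title="ViT Image Classifier",
    description="Loaded with gr.load",
)
demo.launch()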