To install Gradio from main, run the following command:
pip install https://gradio-builds.s3.amazonaws.com/bdbc210dbf09555b3bd8f647c6f19621771b5771/gradio-5.32.0-py3-none-any.whl
Note: Setting share=True in launch() will not work.
gradio.api(···)
import gradio as gr

with gr.Blocks() as demo:
    with gr.Row():
        input = gr.Textbox()
        button = gr.Button("Submit")
    output = gr.Textbox()

    # gr.api registers an API-only endpoint: the function's type hints
    # define the input/output schema, so no components need to be wired up.
    def fn(a: int, b: int, c: list[int]) -> tuple[int, list[int]]:
        return a + b, c[a:b]

    gr.api(fn, api_name="add_and_slice")

_, url, _ = demo.launch()

from gradio_client import Client

client = Client(url)
result = client.predict(
    a=3,
    b=5,
    c=[1, 2, 3, 4, 5, 6, 7, 8, 9, 10],
    api_name="/add_and_slice"
)
print(result)
fn: Callable | Literal['decorator'] = "decorator"
The function to call when this event is triggered. Often a machine learning model's prediction function. Each parameter of the function corresponds to one input component, and the function should return a single value or a tuple of values, with each element in the tuple corresponding to one output component.
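The default value "decorator" lets gr.api be used as a decorator inside a Blocks context, so the function does not have to be passed explicitly. A minimal sketch of that pattern, reusing the add-and-slice function from the example above:

import gradio as gr

with gr.Blocks() as demo:
    # With fn left at its "decorator" default, gr.api can decorate the
    # handler directly; its type hints still define the endpoint schema.
    @gr.api(api_name="add_and_slice")
    def add_and_slice(a: int, b: int, c: list[int]) -> tuple[int, list[int]]:
        return a + b, c[a:b]

demo.launch()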
api_name: str | None | Literal[False] = None
Defines how the endpoint appears in the API docs. Can be a string, None, or False. If False, the endpoint will not be exposed in the API docs. If None, the function's name will be used as the endpoint route. If set to a string, the endpoint will be exposed in the API docs under that name.
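For illustration, a sketch of the three options (slice_list is a placeholder function; the routes follow from the rules above):

gr.api(slice_list)                      # None (default): exposed as "/slice_list"
gr.api(slice_list, api_name="slicer")   # string: exposed as "/slicer"
gr.api(slice_list, api_name=False)      # False: not exposed in the API docs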
queue: bool = True
If True, the request will be placed on the queue if the queue has been enabled. If False, this event will not be put on the queue, even if the queue has been enabled. If None, the queue setting of the Gradio app will be used.
batch: bool = False
If True, then the function should process a batch of inputs, meaning that it should accept a list of input values for each parameter. The lists should be of equal length (and be up to length `max_batch_size`). The function is then *required* to return a tuple of lists (even if there is only 1 output component), with each list in the tuple corresponding to one output component.
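As a hedged sketch of that contract (assuming gr.api handles batch=True the same way as other Gradio event listeners), the handler below accepts a list per parameter and returns a tuple containing one list:

import gradio as gr

# Each parameter receives a list of queued input values; the return value
# must be a tuple of lists, one list per output component.
def batched_add(a: list[int], b: list[int]) -> tuple[list[int]]:
    return ([x + y for x, y in zip(a, b)],)

with gr.Blocks() as demo:
    gr.api(batched_add, batch=True, max_batch_size=16, api_name="batched_add")

demo.launch()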
max_batch_size: int = 4
Maximum number of inputs to batch together if this is called from the queue (only relevant if batch=True).
concurrency_limit: int | None | Literal['default'] = "default"
If set, this is the maximum number of instances of this event that can be running simultaneously. Can be set to None for no concurrency limit (any number of instances of this event can run simultaneously). Set to "default" to use the default concurrency limit (defined by the `default_concurrency_limit` parameter in `Blocks.queue()`, which is 1 by default).
concurrency_id: str | None = None
If set, this is the ID of the concurrency group. Events with the same concurrency_id will be limited by the lowest set concurrency_limit.
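For example, a sketch of two endpoints sharing one concurrency group (generate and upscale are placeholder functions, not part of Gradio):

import gradio as gr

def generate(prompt: str) -> str:
    return prompt.upper()   # placeholder for an expensive model call

def upscale(text: str) -> str:
    return text * 2         # placeholder for a second heavy endpoint

with gr.Blocks() as demo:
    # Both endpoints share the "gpu" concurrency group, so the lowest limit
    # in the group (2 here) caps how many of these events run at once.
    gr.api(generate, api_name="generate", concurrency_limit=2, concurrency_id="gpu")
    gr.api(upscale, api_name="upscale", concurrency_id="gpu")

demo.launch()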
show_api: bool = True
Whether to show this event in the "view API" page of the Gradio app, or in the ".view_api()" method of the Gradio clients. Unlike setting api_name to False, setting show_api to False still allows downstream apps and the Clients to use this event. If fn is None, show_api will automatically be set to False.
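A one-line sketch of the difference described above (internal_tool is a placeholder function):

# Hidden from the "view API" page, but still callable by downstream apps and Clients:
gr.api(internal_tool, api_name="internal_tool", show_api=False)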