Uploading Large Files: FastAPI's Server Side
FastAPI handles file uploads through its UploadFile class, which is backed by a SpooledTemporaryFile: small uploads stay in memory, while larger ones are rolled over to a temporary file on disk. A minimal endpoint signature looks like this:
```python
async def upload_file(upload_file: UploadFile = File(...)): ...
```
Problem with Client-Side Requests
When sending large files from the client, requests commonly fail or slow down for two reasons: the entire file may be buffered in memory before (or while) it is sent, and long transfers can exceed the client's default read timeout. The sections below address both.
Faster Option Using .stream()
By consuming the request body as a stream via Request.stream(), the server processes the upload chunk by chunk instead of waiting for the whole body to arrive and be spooled to a temporary file first, which reduces both memory use and upload time. To still parse multipart/form-data on the fly, combine it with the streaming-form-data library:
```python
from streaming_form_data import StreamingFormDataParser
from streaming_form_data.targets import FileTarget

parser = StreamingFormDataParser(headers=request.headers)
parser.register('upload_file', FileTarget(filepath))
# request.stream() returns an async generator; iterate it directly
# (do not await it) and feed each chunk to the parser.
async for chunk in request.stream():
    parser.data_received(chunk)
```
Alternate Option Using UploadFile and Form
If you prefer to stay with FastAPI's standard UploadFile and Form parameters, you can read and write the file in chunks so that only CHUNK_SIZE bytes are held in memory at a time:
```python
import os

import aiofiles
from fastapi import FastAPI, File, Form, HTTPException, UploadFile, status

app = FastAPI()

CHUNK_SIZE = 1024 * 1024  # read 1 MB at a time

@app.post("/upload")
async def upload(file: UploadFile = File(...), data: str = Form(...)):
    try:
        # basename() guards against path traversal in the client-supplied filename
        filepath = os.path.join('./', os.path.basename(file.filename))
        async with aiofiles.open(filepath, 'wb') as f:
            # Read and write one chunk at a time to cap memory use
            while chunk := await file.read(CHUNK_SIZE):
                await f.write(chunk)
    except Exception:
        raise HTTPException(
            status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
            detail='There was an error uploading the file',
        )
    finally:
        await file.close()
    return {"message": f"Successfully uploaded {file.filename}"}
```
Increasing HTTPX Client Timeout
HTTPX applies a 5-second timeout by default, so large uploads can fail with a ReadTimeout. Passing a custom httpx.Timeout avoids this; the configuration below disables all timeouts except read, which is set to 180 seconds:
```python
import httpx

timeout = httpx.Timeout(None, read=180.0)
```