
How to Efficiently Upload Large Files with FastAPI?

Linda Hamilton
Release: 2024-12-02 01:13:11

Handling Large File Uploads on the FastAPI Server Side

The FastAPI server can handle large file uploads using the UploadFile class. Here's an example:

from fastapi import File, UploadFile

async def uploadfiles(upload_file: UploadFile = File(...)):
    ...
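A complete, minimal application built around such an endpoint might look like the sketch below; the /upload route and the response body are illustrative choices, not anything FastAPI prescribes:

from fastapi import FastAPI, File, UploadFile

app = FastAPI()

@app.post("/upload")
async def uploadfiles(upload_file: UploadFile = File(...)):
    # UploadFile spools the incoming data to a temporary file on disk once it
    # exceeds a small in-memory limit, so large uploads do not exhaust RAM.
    return {"filename": upload_file.filename}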

Problems with Client-Side Requests

When sending large files from the client, issues may arise due to the following:

  1. multipart/form-data header: The request must send a Content-Type of multipart/form-data together with the proper boundary string. HTTP libraries generate this header for you when you hand them a file; only if you build the request body yourself do you need to set it manually.
  2. MultipartEncoder usage: If you use requests-toolbelt's MultipartEncoder, include a filename when declaring the upload_file field; without it, the part arrives as a plain form field rather than a file, and the UploadFile parameter is never populated.
  3. Library recommendation: Instead of relying on requests-toolbelt, consider plain Python requests or HTTPX, which handle multipart uploads of large files well on their own (see the client sketch after this list).
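For reference, a client-side sketch using HTTPX might look as follows; the URL, file name, and field name are placeholders that must match your own endpoint:

import httpx

url = 'http://127.0.0.1:8000/upload'      # placeholder URL for your endpoint
with open('large_file.bin', 'rb') as f:   # placeholder file
    # A (filename, file object, content type) tuple lets HTTPX build the
    # multipart/form-data body and boundary, include the filename, and read
    # the file from disk in chunks rather than loading it into memory.
    files = {'upload_file': ('large_file.bin', f, 'application/octet-stream')}
    r = httpx.post(url, files=files)
print(r.json())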

Faster Option Using .stream()

By reading the request body as a stream, you avoid loading the entire file into memory, which speeds up large uploads. FastAPI (via Starlette) exposes the body as a stream through the request's .stream() method, and the streaming-form-data library can parse the multipart data chunk by chunk as it arrives. Here's an example:

from fastapi import Request
from streaming_form_data import StreamingFormDataParser
from streaming_form_data.targets import FileTarget

@app.post("/upload")
async def upload(request: Request):
    parser = StreamingFormDataParser(headers=request.headers)
    filepath = './uploaded_file'  # destination for the incoming file
    parser.register('upload_file', FileTarget(filepath))
    # request.stream() yields the body in chunks, so the file never sits in memory in full
    async for chunk in request.stream():
        parser.data_received(chunk)
    return {"message": "Upload complete"}
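If the client also sends ordinary text fields alongside the file (such as the data form field used in the next section), streaming-form-data provides a ValueTarget that collects them in memory. A possible variant is sketched below; the /upload-stream route, the field names, and the destination path are assumptions for illustration:

from fastapi import FastAPI, Request
from streaming_form_data import StreamingFormDataParser
from streaming_form_data.targets import FileTarget, ValueTarget

app = FastAPI()

@app.post("/upload-stream")
async def upload_stream(request: Request):
    parser = StreamingFormDataParser(headers=request.headers)
    file_target = FileTarget('./uploaded_file')   # file parts are written straight to disk
    data_target = ValueTarget()                   # small text fields are kept in memory
    parser.register('upload_file', file_target)
    parser.register('data', data_target)

    async for chunk in request.stream():
        parser.data_received(chunk)

    return {"message": "Upload complete", "data": data_target.value.decode()}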

Alternate Option Using UploadFile and Form

If you prefer to stick with FastAPI's UploadFile and Form, you can read the upload in chunks and write it to disk asynchronously with aiofiles:

from fastapi import FastAPI, File, UploadFile, Form, HTTPException, status
import aiofiles
import os

app = FastAPI()

CHUNK_SIZE = 1024 * 1024  # read the upload 1 MB at a time

@app.post("/upload")
async def upload(file: UploadFile = File(...), data: str = Form(...)):
    try:
        filepath = os.path.join('./', os.path.basename(file.filename))
        async with aiofiles.open(filepath, 'wb') as f:
            while chunk := await file.read(CHUNK_SIZE):
                await f.write(chunk)
    except Exception:
        raise HTTPException(status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
                            detail='There was an error uploading the file')
    finally:
        await file.close()

    return {"message": f"Successfully uploaded {file.filename}"}
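A matching client call, assuming the app above is served locally on port 8000 and using placeholder values for the file and the data field, could be:

import httpx

with open('large_file.bin', 'rb') as f:
    r = httpx.post(
        'http://127.0.0.1:8000/upload',
        data={'data': 'Hello World'},
        files={'file': ('large_file.bin', f, 'application/octet-stream')},
    )
print(r.json())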

Increasing HTTPX Client Timeout

When using the HTTPX library, you may need to increase the timeout to prevent read timeouts during large file uploads.

timeout = httpx.Timeout(None, read=180.0)
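The timeout can then be attached to an HTTPX client (or passed per request) and reused for the upload calls shown earlier; a brief sketch:

import httpx

# No overall time limit, but allow up to 180 seconds to read the server's response
timeout = httpx.Timeout(None, read=180.0)
client = httpx.Client(timeout=timeout)
# client.post(...) can now be used for the multipart uploads shown above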
