
How to Stream Large Files to AWS S3 with Minimal Memory and Disk Usage?

DDD
Release: 2024-11-07 17:06:02


Streaming File Uploads to AWS S3 with a Minimal Memory and Disk Footprint

Problem: You need to upload a large multipart/form-data file directly to AWS S3 while minimizing memory and disk usage.

Solution: Use the s3manager.Uploader from the AWS SDK for Go:

  1. Create an uploader, customizing the part size, concurrency, and maximum upload parts as needed.
  2. Open the file to be uploaded (or any io.Reader) and pass it as the Body of the upload input.
  3. Start the upload; the uploader reads the stream in parts instead of loading the whole file at once.
  4. Handle the upload result and any error.

Example Code:

import (
    "fmt"
    "os"

    "github.com/aws/aws-sdk-go/aws"
    "github.com/aws/aws-sdk-go/aws/credentials"
    "github.com/aws/aws-sdk-go/aws/session"
    "github.com/aws/aws-sdk-go/service/s3/s3manager"
)

// Configure Amazon S3 credentials and connection
var accessKey = ""
var accessSecret = ""

func main() {
    // Determine if your AWS credentials are configured globally
    var awsConfig *aws.Config
    if accessKey == "" || accessSecret == "" {
        // No credentials provided, load the default credentials
        awsConfig = &aws.Config{
            Region: aws.String("us-west-2"),
        }
    } else {
        // Static credentials provided
        awsConfig = &aws.Config{
            Region:      aws.String("us-west-2"),
            Credentials: credentials.NewStaticCredentials(accessKey, accessSecret, ""),
        }
    }

    // Create an AWS session with the configured credentials
    sess := session.Must(session.NewSession(awsConfig))

    // Create an uploader with customized configuration
    uploader := s3manager.NewUploader(sess, func(u *s3manager.Uploader) {
        u.PartSize = 5 * 1024 * 1024 // part size: 5 MiB (the S3 minimum)
        u.Concurrency = 2            // upload up to 2 parts at a time
        u.MaxUploadParts = 10000     // S3's hard limit on parts per object
    })

    // Open the file to be uploaded
    f, err := os.Open("file_name.zip")
    if err != nil {
        fmt.Printf("Failed to open file: %v\n", err)
        return
    }
    defer f.Close()

    // Upload the file to S3
    result, err := uploader.Upload(&s3manager.UploadInput{
        Bucket: aws.String("myBucket"),
        Key:    aws.String("file_name.zip"),
        Body:   f,
    })
    if err != nil {
        fmt.Printf("Failed to upload file: %v\n", err)
        return
    }
    fmt.Printf("File uploaded to: %s\n", result.Location)
}
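A note on the part-size setting above: S3 allows at most 10,000 parts per object, so a fixed 5 MiB part size caps uploads at roughly 48 GiB. For larger objects you can compute a part size from the total size before configuring the uploader. This is a small sketch of that arithmetic (the helper name `partSizeFor` is just for illustration):

```go
package main

import "fmt"

const (
	minPartSize    = 5 * 1024 * 1024 // S3's minimum part size (5 MiB)
	maxUploadParts = 10000           // S3's maximum number of parts per object
)

// partSizeFor returns a part size large enough to fit totalSize bytes
// within the 10,000-part limit, never going below the 5 MiB minimum.
func partSizeFor(totalSize int64) int64 {
	size := (totalSize + maxUploadParts - 1) / maxUploadParts // ceiling division
	if size < minPartSize {
		size = minPartSize
	}
	return size
}

func main() {
	// A 100 GiB object needs parts larger than 5 MiB to fit in 10,000 parts.
	fmt.Println(partSizeFor(100 * 1024 * 1024 * 1024)) // prints 10737419
}
```

You would assign the result to `u.PartSize` in the uploader's configuration function when the total size is known up front (e.g. from `os.Stat`).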

By utilizing the S3 Uploader and streaming the file, you can minimize memory and file disk usage during the upload process, ensuring efficient handling of large files.


source:php.cn