Uploading large files directly to AWS S3 while keeping memory and disk usage low can be achieved with the upload manager (`s3manager`) from the AWS SDK for Go. Here's how:
Import the necessary libraries:
```go
import (
	"fmt"
	"os"

	"github.com/aws/aws-sdk-go/aws"
	"github.com/aws/aws-sdk-go/aws/credentials" // only needed if you set static credentials
	"github.com/aws/aws-sdk-go/aws/session"
	"github.com/aws/aws-sdk-go/service/s3/s3manager"
)
```
Create an AWS config:
```go
awsConfig := &aws.Config{
	Region: aws.String("us-west-2"),
}
```
You can optionally provide static credentials in the config if needed, e.g. `Credentials: credentials.NewStaticCredentials(accessKey, secretKey, "")`; otherwise the SDK falls back to its default credential chain (environment variables, shared config, IAM role).
Initialize a new session and an uploader:
```go
sess := session.Must(session.NewSession(awsConfig))
uploader := s3manager.NewUploader(sess)
```
Customize the uploader's parameters (optional). Note that this form replaces the plain `NewUploader(sess)` call above rather than following it:

```go
// Tune the part size and concurrency at construction time
uploader := s3manager.NewUploader(sess, func(u *s3manager.Uploader) {
	u.PartSize = 5 * 1024 * 1024 // 5 MB is the minimum allowed part size
	u.Concurrency = 2            // default is 5
})
```
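To see how `PartSize` interacts with file size, here is a small, self-contained sketch (no AWS calls; `partsFor` is a hypothetical helper for illustration) of how many multipart chunks a given part size implies. The uploader caps an upload at 10,000 parts by default, so very large files need a larger `PartSize`:

```go
package main

import "fmt"

// partsFor returns how many multipart chunks a file of the given
// size splits into at the given part size (the last part may be short).
func partsFor(fileSize, partSize int64) int64 {
	return (fileSize + partSize - 1) / partSize
}

func main() {
	const partSize int64 = 5 * 1024 * 1024 // 5 MiB, the S3 minimum
	const gib int64 = 1 << 30

	fmt.Println(partsFor(1*gib, partSize)) // 205 parts for a 1 GiB file
	// At the default cap of 10,000 parts, 5 MiB parts top out near 48.8 GiB;
	// beyond that, a larger PartSize is required.
	fmt.Println(partsFor(50*gib, partSize) > 10000) // true: needs a bigger PartSize
}
```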
Open the file to upload:
```go
f, err := os.Open(filename)
if err != nil {
	fmt.Printf("failed to open file %q, %v\n", filename, err)
	return
}
defer f.Close()
```
Upload the file using the uploader:
```go
result, err := uploader.Upload(&s3manager.UploadInput{
	Bucket: aws.String(myBucket),
	Key:    aws.String(myKey),
	Body:   f,
})
```
Handle any potential errors:
```go
if err != nil {
	fmt.Printf("failed to upload file, %v\n", err)
	return
}
```
Print the upload location:
```go
fmt.Printf("file uploaded to %s\n", result.Location)
```
By using the upload manager this way, large files are streamed directly to AWS S3: the uploader reads the file part by part, so memory use stays roughly bounded by `PartSize` times `Concurrency` regardless of the file's total size, and nothing is buffered to disk.