Laravel's streaming responses let you handle large data sets efficiently: data is generated and sent to the client incrementally, which keeps memory usage low and shortens the time to first byte.
The following simple example shows how to output 100 lines of text using a streaming response:
Route::get('/stream', function () {
    return response()->stream(function () {
        foreach (range(1, 100) as $number) {
            echo "Line {$number}\n";
            ob_flush();
            flush();
        }
    }, 200, ['Content-Type' => 'text/plain']);
});
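One caveat: ob_flush() emits a PHP notice when no output buffer is active, which depends on your output_buffering configuration. A slightly more defensive version of the same loop (a sketch, not something Laravel requires; the /stream-safe route name is just for illustration) guards that call:

Route::get('/stream-safe', function () {
    return response()->stream(function () {
        foreach (range(1, 100) as $number) {
            echo "Line {$number}\n";

            // Only flush the output buffer if one is actually active,
            // to avoid "failed to flush buffer" notices.
            if (ob_get_level() > 0) {
                ob_flush();
            }
            flush();
        }
    }, 200, ['Content-Type' => 'text/plain']);
});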
Next, let's look at a more practical example: streaming a large set of order data as a CSV export:
<?php

namespace App\Http\Controllers;

use App\Models\Order;

class ExportController extends Controller
{
    public function exportOrders()
    {
        return response()->stream(function () {
            // Output the CSV header row
            echo "Order ID,Customer,Total,Status,Date\n";
            ob_flush();
            flush();

            // Process orders in chunks to keep memory usage low
            Order::query()
                ->with('customer')
                ->orderBy('created_at', 'desc')
                ->chunk(500, function ($orders) {
                    foreach ($orders as $order) {
                        echo sprintf(
                            "%s,%s,%.2f,%s,%s\n",
                            $order->id,
                            str_replace(',', ' ', $order->customer->name),
                            $order->total,
                            $order->status,
                            $order->created_at->format('Y-m-d H:i:s')
                        );
                        ob_flush();
                        flush();
                    }
                });
        }, 200, [
            'Content-Type' => 'text/csv',
            'Content-Disposition' => 'attachment; filename="orders.csv"',
            'X-Accel-Buffering' => 'no',
        ]);
    }
}
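Note that the str_replace() call above only strips commas from customer names; quotes and embedded newlines would still break the CSV. One alternative sketch, assuming the same Order model and customer relationship (the exportOrdersCsv method name is hypothetical), uses response()->streamDownload() together with PHP's fputcsv() so that escaping is handled for you:

public function exportOrdersCsv()
{
    return response()->streamDownload(function () {
        // fputcsv() takes care of quoting commas, quotes and newlines.
        $handle = fopen('php://output', 'w');

        fputcsv($handle, ['Order ID', 'Customer', 'Total', 'Status', 'Date']);

        Order::query()
            ->with('customer')
            ->orderBy('created_at', 'desc')
            ->chunk(500, function ($orders) use ($handle) {
                foreach ($orders as $order) {
                    fputcsv($handle, [
                        $order->id,
                        $order->customer->name,
                        number_format($order->total, 2, '.', ''),
                        $order->status,
                        $order->created_at->format('Y-m-d H:i:s'),
                    ]);
                }

                // Push the chunk to the client before loading the next one.
                flush();
            });

        fclose($handle);
    }, 'orders.csv', ['X-Accel-Buffering' => 'no']);
}

streamDownload() sets the Content-Disposition header for you, so only the filename and any extra headers need to be passed in.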
With streaming responses, we can process large datasets efficiently while keeping the memory footprint low and giving users immediate feedback.
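If you prefer to avoid nesting callbacks inside chunk(), Laravel's lazy collections offer another way to keep memory flat. A minimal sketch of the same idea, assuming a Laravel version that provides lazyById() on the query builder:

// Iterates orders one at a time while fetching them in batches of 500 behind the scenes.
foreach (Order::query()->with('customer')->lazyById(500) as $order) {
    echo $order->id . ',' . str_replace(',', ' ', $order->customer->name) . "\n";

    if (ob_get_level() > 0) {
        ob_flush();
    }
    flush();
}

Whichever approach you choose, the principle is the same: generate the output incrementally and flush it as you go, rather than building the whole payload in memory first.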