PHP
Efficiently Processing Large Datasets with Eloquent Chunking
Learn to process millions of records in Laravel by chunking Eloquent results, preventing memory exhaustion on large datasets.
// app/Console/Commands/ProcessLargeOrders.php
<?php

namespace App\Console\Commands;

use App\Models\Order;
use Illuminate\Console\Command;

class ProcessLargeOrders extends Command
{
    /**
     * The name and signature of the console command.
     *
     * @var string
     */
    protected $signature = 'orders:process-large';

    /**
     * The console command description.
     *
     * @var string
     */
    protected $description = 'Process a large number of orders in chunks.';

    /**
     * Execute the console command.
     *
     * @return int
     */
    public function handle()
    {
        $this->info('Starting to process orders...');

        Order::chunk(1000, function ($orders) {
            foreach ($orders as $order) {
                // Process each order; for example, update its status.
                $order->status = 'processed';
                $order->save();

                // Or perform other operations:
                // $this->info("Processing Order ID: {$order->id}");
            }

            $this->info('Processed ' . $orders->count() . ' orders.');
        });

        $this->info('Finished processing all orders.');

        return Command::SUCCESS;
    }
}
// To run this command:
// php artisan orders:process-large
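One caveat worth knowing: `chunk()` pages with LIMIT/OFFSET, so if the callback updates a column that the query itself filters on, rows can shift between pages and end up skipped. For that situation Laravel provides `chunkById()`, which pages on the primary key instead. A minimal sketch of the same command's core, assuming a hypothetical workflow that filters on a `status` of `'pending'` (a constraint the original command does not have):

```php
// Safer variant when the callback modifies a column the query filters on:
// chunkById() paginates with "WHERE id > ?" rather than OFFSET, so updated
// rows cannot be skipped or double-processed as pages advance.
Order::where('status', 'pending')
    ->chunkById(1000, function ($orders) {
        foreach ($orders as $order) {
            $order->update(['status' => 'processed']);
        }
    });
```

With plain `chunk()`, marking the first 1,000 pending orders as processed would shrink the result set before the second page is fetched, causing the next 1,000 pending orders to be jumped over.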
How it works: When dealing with extremely large datasets, fetching all records into memory at once can exhaust available memory. Eloquent's `chunk()` method instead retrieves results in smaller "chunks", passing each chunk to a callback for processing. This keeps memory usage low because only a subset of the results is loaded into memory at any given time, making it ideal for background jobs or command-line scripts that iterate over thousands or millions of records.
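To make the mechanics concrete, the loop below is an illustrative sketch in plain PHP (no Laravel) of what chunking does conceptually: fetch one LIMIT/OFFSET page at a time, hand it to a callback, and stop when a page comes back short. The array stands in for a database table, and the function name is hypothetical, not part of any framework API.

```php
<?php
// Conceptual model of chunked retrieval: each iteration pulls the next
// fixed-size page (like "SELECT ... LIMIT $size OFFSET $offset") and
// passes it to the callback, so only one page is in memory at a time.
function chunkRecords(array $records, int $size, callable $callback): int
{
    $chunks = 0;
    $offset = 0;

    while (true) {
        // Stand-in for fetching the next page from the database.
        $page = array_slice($records, $offset, $size);

        if (count($page) === 0) {
            break; // no rows left
        }

        $callback($page);
        $chunks++;

        if (count($page) < $size) {
            break; // last, partial page
        }

        $offset += $size;
    }

    return $chunks;
}

// Usage: 2,500 fake orders processed 1,000 at a time.
$orders = range(1, 2500);
$processed = 0;

$chunkCount = chunkRecords($orders, 1000, function (array $page) use (&$processed) {
    $processed += count($page); // "process" each page
});
// $chunkCount is 3 (1000 + 1000 + 500); $processed is 2500.
```

The real `chunk()` does the paging in SQL, but the memory profile is the same: peak usage is bounded by one page of rows plus the models hydrated from it, not by the table size.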