This article is part of a series on building a high-performance multi-image gallery blog using Symfony Flex. (View the repository here.)
In the previous installment, we set up a basic Symfony project, created initial fixtures, and got the application running. This article focuses on populating the database with a realistic dataset for performance benchmarking. We'll also cover setting up a PHPUnit test suite.
Key Objectives:
Generating a Larger Dataset:
While small fixtures are fine during development, realistic performance testing requires a significantly larger volume of data. Simply increasing the COUNT constant in our fixture classes (as shown below) is inefficient and can lead to memory exhaustion errors:
    // src/DataFixtures/ORM/LoadUsersData.php
    const COUNT = 500;

    // src/DataFixtures/ORM/LoadGalleriesData.php
    const COUNT = 1000;
This approach is slow, prone to memory errors (PHP Fatal error: Allowed memory size of N bytes exhausted), and inefficient due to repeated image downloads via Faker.
Optimizing Doctrine for Batch Processing:
To address these issues, we'll implement batch processing in Doctrine. We'll define a batch size (e.g., 100 galleries), flush and clear the EntityManager after each batch, and trigger garbage collection with gc_collect_cycles(). This prevents memory bloat. We'll also monitor memory usage and print progress updates while the fixtures load. Crucially, $manager->clear() detaches every managed entity, so remember to re-merge the entities you still reference back into the manager afterwards to avoid "entity not persisted" errors (see the sketch after the fixture example below).
Example of the optimized LoadGalleriesData fixture:
    // Define batch size
    $batchSize = 100;

    // ... inside the for loop ...

    // Save the batch
    if (($i % $batchSize) == 0 || $i == self::COUNT) {
        $currentMemoryUsage = round(memory_get_usage(true) / 1024);
        $maxMemoryUsage = round(memory_get_peak_usage(true) / 1024);
        echo sprintf("%s Memory usage (currently) %dKB/ (max) %dKB \n", $i, $currentMemoryUsage, $maxMemoryUsage);

        $manager->flush();
        $manager->clear();
        gc_collect_cycles();
    }
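And here is a minimal sketch of the re-merge step mentioned above. The $users variable is an assumption for the example: it stands for whatever collection of previously loaded entities the fixture still references after clearing the manager.

    // $manager->clear() detaches all previously managed entities,
    // so re-merge the ones we still need before building the next batch.
    $users = array_map(function ($user) use ($manager) {
        return $manager->merge($user);
    }, $users);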
Optimizing Image Handling:
Instead of downloading images dynamically, we'll pre-select a set of images (e.g., 15 from Unsplash) and reuse them. This significantly speeds up the process. The generateRandomImage method can be updated to select from this pre-defined set:
    private function generateRandomImage($imageName)
    {
        // Array of pre-selected image filenames
        $images = ['image1.jpeg', 'image2.jpeg', ...];

        // ... (rest of the method remains largely the same) ...
    }
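For illustration, here is a minimal sketch of what such a method could look like. The file names, the local images directory, and the target upload path are assumptions for the example, not the project's actual values:

    private function generateRandomImage($imageName)
    {
        // Pre-selected images stored locally alongside the fixtures (placeholder names)
        $images = ['image1.jpeg', 'image2.jpeg', 'image3.jpeg'];

        // Pick one at random and copy it into the upload directory
        // instead of downloading a new image on every call
        $source = __DIR__ . '/images/' . $images[array_rand($images)];
        $target = __DIR__ . '/../../../var/uploads/' . $imageName . '.jpeg';
        copy($source, $target);

        return $imageName . '.jpeg';
    }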
Remember to add a command to clean the var/uploads directory in your bin/refreshDb.sh script before reloading fixtures.
Performance Testing with Siege and Docker:
We'll use Siege, a powerful HTTP benchmarking tool, within a Docker container for consistent and repeatable performance testing. This avoids the need for local installations and ensures consistent testing environments.
Testing Scenarios:
We'll define test scenarios to simulate various user interactions, including:
- A lazy-load-urls.txt file containing URLs for lazy-loaded pages, weighted to simulate realistic user behavior.
- A galleries.txt file containing URLs of individual gallery pages.

Setting up PHPUnit for Smoke Tests:
A basic PHPUnit test suite with smoke tests will ensure the core functionality remains intact during development and optimization. These tests will verify successful HTTP response codes for key URLs.
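As a rough sketch, a data-provider-driven smoke test along these lines would cover the basics; the exact URLs to check are assumptions and would be adjusted to the app's real routes:

    <?php

    namespace App\Tests;

    use Symfony\Bundle\FrameworkBundle\Test\WebTestCase;

    class SmokeTest extends WebTestCase
    {
        /**
         * @dataProvider urlProvider
         */
        public function testPageLoadsSuccessfully($url)
        {
            $client = self::createClient();
            $client->request('GET', $url);

            // A smoke test only asserts that the page responds with a 2xx status code
            $this->assertTrue($client->getResponse()->isSuccessful());
        }

        public function urlProvider()
        {
            yield ['/'];
            yield ['/login'];
        }
    }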
Conclusion:
This article detailed techniques for creating a realistic dataset for performance testing and establishing a robust testing framework. Future articles will delve into PHP and MySQL performance optimization and further performance improvements.