What is the Difference Between "Cache Unfriendly" and "Cache Friendly" Code?
"Cache friendliness" refers to code that maximizes performance by effectively using the computer's memory hierarchy, particularly its caches. "Cache-unfriendly" code, on the other hand, hinders performance by causing cache misses.
How to Write Cache-Efficient Code:
- Exploit Temporal Locality: Reuse data shortly after it was last accessed, while it is still resident in the cache; repeated touches then hit in cache instead of going back to main memory (the tiling sketch after this list relies on exactly this).
- Exploit Spatial Locality: Lay out data that is used together contiguously in memory and access it in address order, so that each cache line fetched brings in values you are about to use instead of wasting most of the line on unrelated bytes (see the traversal sketch after this list).
- Use Cache-Friendly Data Structures: Prefer contiguous containers such as std::vector over node-based containers such as std::list; contiguous storage keeps neighboring elements in the same cache lines and lets the hardware prefetcher do its job (see the container sketch after this list).
- Exploit Data Structure and Algorithm Ordering: Design data layouts and access orders with the cache in mind. Techniques such as cache blocking (loop tiling) and traversing data in the order it is stored can significantly improve performance (see the tiling sketch after this list).
- Mind Branch Prediction: Avoid hard-to-predict branches in hot loops; mispredictions stall the pipeline and disrupt prefetching, which shows up as extra cache misses (a branch-free variant is sketched after this list).
- Minimize Virtual Function Calls: Virtual dispatch adds an indirect call through the vtable, which defeats inlining and can itself miss in cache. In performance-critical code, consider alternatives such as static polymorphism or grouping objects by concrete type (one alternative is sketched after this list).
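As a concrete illustration of spatial locality, the sketch below sums a matrix stored in row-major order in two ways. The row-by-row loop walks memory sequentially, so each fetched cache line is fully used; the column-by-column loop jumps a whole row's stride between accesses and touches each line once per element. The function names and layout are arbitrary choices for this example.

```cpp
#include <cstddef>
#include <vector>

// Matrix stored row-major in one contiguous buffer: element (r, c) lives at r * cols + c.
double sum_row_major(const std::vector<double>& m, std::size_t rows, std::size_t cols) {
    double sum = 0.0;
    for (std::size_t r = 0; r < rows; ++r)       // walk memory in address order:
        for (std::size_t c = 0; c < cols; ++c)   // consecutive iterations reuse the same cache line
            sum += m[r * cols + c];
    return sum;
}

double sum_column_major(const std::vector<double>& m, std::size_t rows, std::size_t cols) {
    double sum = 0.0;
    for (std::size_t c = 0; c < cols; ++c)       // stride of `cols` doubles between accesses:
        for (std::size_t r = 0; r < rows; ++r)   // each access typically lands on a different line
            sum += m[r * cols + c];
    return sum;
}
```

Both functions compute the same result; on large matrices the row-major version is typically several times faster purely because of how it uses cache lines.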
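The container choice from the list above can be shown with a simple sum. The std::vector version reads elements that sit next to each other in memory; the std::list version chases pointers to nodes scattered across the heap, so almost every step can be a cache miss. Using std::accumulate here is just one way to write it.

```cpp
#include <list>
#include <numeric>
#include <vector>

long long sum_vector(const std::vector<int>& v) {
    // Elements are contiguous: sequential reads, prefetcher-friendly.
    return std::accumulate(v.begin(), v.end(), 0LL);
}

long long sum_list(const std::list<int>& l) {
    // Each node is a separate heap allocation; following the next pointer
    // often jumps to a cold cache line.
    return std::accumulate(l.begin(), l.end(), 0LL);
}
```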
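Cache blocking, mentioned in the ordering item, restructures a loop so that a small tile of data is reused while it is still cached, which is also what the temporal-locality item asks for. The out-of-place transpose below processes the matrix tile by tile; BLOCK = 64 is an assumed tile size you would tune for the target cache.

```cpp
#include <cstddef>
#include <vector>

// Out-of-place transpose of an n x n row-major matrix, processed in tiles so that
// both the rows read from `src` and the rows written to `dst` stay cached while
// the tile is being worked on. `dst` must already hold n * n elements.
void transpose_blocked(const std::vector<double>& src, std::vector<double>& dst, std::size_t n) {
    constexpr std::size_t BLOCK = 64;  // assumed tile size; tune for the target cache
    for (std::size_t ib = 0; ib < n; ib += BLOCK)
        for (std::size_t jb = 0; jb < n; jb += BLOCK)
            // The inner loops touch only one tile: its cache lines are reused
            // many times before being evicted (temporal locality).
            for (std::size_t i = ib; i < ib + BLOCK && i < n; ++i)
                for (std::size_t j = jb; j < jb + BLOCK && j < n; ++j)
                    dst[j * n + i] = src[i * n + j];
}
```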
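For the branch-prediction item, one common trick is to replace a data-dependent branch in a hot loop with branch-free arithmetic (or to sort the data so the branch becomes predictable). The sketch below shows a branchy loop and a branchless variant of the same computation.

```cpp
#include <cstdint>
#include <vector>

// Branchy: if `data` is random, the branch mispredicts roughly half the time.
std::int64_t sum_over_threshold_branchy(const std::vector<int>& data, int threshold) {
    std::int64_t sum = 0;
    for (int x : data) {
        if (x >= threshold)
            sum += x;
    }
    return sum;
}

// Branchless: the comparison becomes a 0/1 multiplier, so there is no
// unpredictable branch for the CPU to guess.
std::int64_t sum_over_threshold_branchless(const std::vector<int>& data, int threshold) {
    std::int64_t sum = 0;
    for (int x : data)
        sum += static_cast<std::int64_t>(x) * (x >= threshold);
    return sum;
}
```

Whether the branchless form actually wins depends on the compiler and the data distribution; measuring both versions is the reliable approach.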
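As one alternative to virtual dispatch, a closed set of types can be stored in a std::variant and dispatched with std::visit: the objects stay contiguous inside a vector and there is no per-object vtable pointer to chase. This is only a sketch with made-up Circle and Square types, and it applies only when the set of types is known up front.

```cpp
#include <type_traits>
#include <variant>
#include <vector>

// Hypothetical shape types, used purely for illustration.
struct Circle { double r; };
struct Square { double side; };

using Shape = std::variant<Circle, Square>;

double area(const Shape& s) {
    // std::visit dispatches on the variant's index: no vtable lookup, and the
    // shape objects themselves live contiguously inside the vector below.
    return std::visit([](const auto& sh) -> double {
        using T = std::decay_t<decltype(sh)>;
        if constexpr (std::is_same_v<T, Circle>)
            return 3.141592653589793 * sh.r * sh.r;
        else
            return sh.side * sh.side;
    }, s);
}

double total_area(const std::vector<Shape>& shapes) {
    double sum = 0.0;
    for (const Shape& s : shapes)
        sum += area(s);
    return sum;
}
```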
Common Cache-Related Problems:
- False Sharing: When threads on different cores write to distinct variables that happen to share a cache line, the line ping-pongs between the cores' caches, causing misses even though no data is logically shared (see the padding sketch after this list).
- Thrashing: When the working set is far larger than the available cache or physical memory, data is evicted just before it is needed again; in the extreme case the program triggers continuous page faults and spends its time waiting on the disk instead of doing useful work.
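The usual cure for false sharing is to give each thread's hot data its own cache line, for example with alignas. The sketch below pads per-thread counters to 64 bytes, an assumed line size typical of x86; where available, std::hardware_destructive_interference_size can be used instead of the hard-coded 64.

```cpp
#include <atomic>
#include <cstddef>
#include <thread>
#include <vector>

// Each counter occupies its own 64-byte cache line, so threads incrementing
// different counters no longer invalidate each other's lines.
struct alignas(64) PaddedCounter {
    std::atomic<long long> value{0};
};

void count_in_parallel(std::size_t num_threads, long long iterations) {
    std::vector<PaddedCounter> counters(num_threads);
    std::vector<std::thread> workers;
    for (std::size_t t = 0; t < num_threads; ++t) {
        workers.emplace_back([&counters, t, iterations] {
            for (long long i = 0; i < iterations; ++i)
                counters[t].value.fetch_add(1, std::memory_order_relaxed);
        });
    }
    for (std::thread& w : workers)
        w.join();
}
```

Without the alignas padding, adjacent counters would typically share a line, and the threads would slow each other down even though they never touch the same counter.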