
JavaScript startup performance bottleneck analysis and solutions

黄舟
Release: 2017-03-02 15:00:22
In web development, as requirements grow and the code base expands, the pages we ship also grow. This growth does not just mean consuming more bandwidth: after the browser downloads the scripts a page depends on, it still has to parse, interpret and execute them, so users may also get a noticeably worse experience while browsing. This article takes an in-depth look at how the browser processes JavaScript, identifies the culprits behind slow application start-up, and proposes solutions based on my personal experience. Looking back, we never specifically thought about optimizing the parse/compile step; we simply expected the parser to finish instantly once it found a <script> tag, but that is clearly wishful thinking. The following figure is an overview of how the V8 engine works:

Let’s analyze the key steps in depth.

What is slowing down the startup time of our application?

During the start-up phase, syntax parsing, compilation and script execution take up most of the JavaScript engine's running time. In other words, the delay introduced by these steps translates directly into user-perceived interaction delay: the user may already see a button, yet have to wait several seconds before tapping it actually does anything, which badly hurts the experience.

The picture above shows the analysis of one website using Chrome Canary's built-in V8 Runtime Call Stats; note that syntax parsing and compilation already take a fair amount of time in desktop browsers, and take even longer on mobile. In fact, the time spent on parsing and compilation on large sites such as Facebook, Wikipedia and Reddit cannot be ignored:

In the picture above, the pink area represents time spent in V8 as opposed to Blink's C++, while orange and yellow represent the share of syntax parsing and compilation respectively. Sebastian Markbage of Facebook and Rob Wormald of Google have both noted on Twitter that JavaScript's long parse time has become a problem that cannot be ignored; the latter added that it is also one of the main costs when Angular starts up.

With the wave of mobile devices upon us, we have to face a cruel fact: parsing and compiling the same bundle takes 2 to 5 times longer on mobile than in a desktop browser. Of course, high-end phones such as the iPhone or Pixel perform much better than mid-range phones like the Moto G4. This reminds us that when testing we should not rely only on the high-end phones in our own pockets, but also consider mid-range and low-end devices:

The picture above compares the parse time of a 1MB JavaScript bundle across several desktop and mobile browsers; the gap between differently specced phones is obviously huge. When our application bundle is already very large, modern bundling techniques such as code splitting, tree shaking and Service Worker caching can make a big difference to start-up time. From another perspective, even a small module can cost the main thread a lot of time in compilation or redundant function calls if the code is poorly written or pulls in poor dependency libraries. We must clearly understand the importance of a comprehensive evaluation in order to dig out the real performance bottlenecks.

Have JavaScript syntax parsing and compilation become the bottleneck of most websites?

I have heard more than once: "I am not Facebook, so what impact will the JavaScript parsing and compilation you describe have on other websites?" I was curious about this too, so I spent two months analysing more than 6,000 websites built with popular frameworks or libraries such as React, Angular, Ember and Vue. Most of the tests are based on WebPageTest, so you can easily reproduce the results. On a desktop browser with a fibre connection it takes about 8 seconds before the user can interact, and on a Moto G4 in a 3G environment it takes about 16 seconds.

In desktop browsers, most applications spend about 4 seconds in the JavaScript start-up phase (syntax parsing, compilation, execution):

In mobile browsers, syntax parsing takes about 36% longer:

In addition, the statistics show that not every site throws a huge JS bundle at its users: the average Gzip-compressed bundle downloaded is 410KB, which basically matches the 420KB figure previously published by HTTPArchive. The worst site, however, dumps a 10MB script straight onto the user, which is simply terrible.

From these statistics we can see that bundle size matters, but it is not the only factor: parse and compile time does not necessarily grow linearly with bundle size. In general a smaller JavaScript bundle loads faster (ignoring differences in browsers, devices and network connections), but two 200KB bundles from different developers can differ hugely in parse and compile time, so they cannot be compared on size alone.

Modern JavaScript syntax analysis & compilation performance evaluation

Chrome DevTools

Open Timeline (the Performance panel) > Bottom-Up/Call Tree/Event Log to see how much time the current site spends on syntax parsing/compilation. For more complete information, turn on V8's Runtime Call Stats: in Canary it is found in Timeline under Experiments > V8 Runtime Call Stats.

Chrome Tracing

Open the about:tracing page. This low-level tracing tool provided by Chrome lets us use the disabled-by-default-v8.runtime_stats category to gain an in-depth understanding of where V8 spends its time. V8 also provides detailed guidance on how to use this feature.

WebPageTest

When we enable Chrome > Capture Dev Tools Timeline, WebPageTest's Processing Breakdown page automatically records V8 compile, EvaluateScript and FunctionCall times. We can also enable Runtime Call Stats by specifying disabled-by-default-v8.runtime_stats.

For more instructions, please refer to my gist.

User Timing

We can also use the User Timing API, recommended by Nolan Lawson, to measure syntax parsing time. However, this method can be skewed by V8's pre-parsing; we can borrow the workaround from Nolan's optimize-js benchmark and append a random string to the end of the script to deal with that. I use a similar approach, based on Google Analytics, to measure parse time when real users on real devices visit the site:
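As a rough illustration, here is a minimal sketch of the mark-based approach; the file name big-bundle.js and the mark names are my own and purely illustrative, and a real measurement also needs the random-string trick mentioned above to defeat pre-parsing:

// In a small inline script placed immediately before the bundle's <script> tag:
performance.mark('bundle-before');

// As the very first statement inside big-bundle.js itself:
performance.mark('bundle-after');
performance.measure('bundle-parse-approx', 'bundle-before', 'bundle-after');
// The measure roughly brackets the time between the tag being reached and the
// first statement executing, i.e. mostly parse + compile for a cached script,
// and its duration can then be reported to analytics:
console.log(performance.getEntriesByName('bundle-parse-approx')[0].duration);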

DeviceTiming

Etsy's DeviceTiming tool can measure the parse and execution time of a page in a simulated, constrained environment. It wraps local scripts in instrumentation code so that our page can simulate being accessed from different devices. You can read Daniel Espeset's article Benchmarking JS Parsing and Execution on Mobile Devices for more detailed usage.

What can we do to reduce JavaScript parsing time?

  • Reduce the JavaScript bundle size. As mentioned above, a smaller bundle usually means less parsing work, which reduces the time the browser spends in the parse and compile phases.

  • Use code-splitting tools to deliver code on demand and lazy-load the remaining modules (see the sketch after this list). This is probably the best approach; patterns such as PRPL encourage route-based splitting and are already widely used by Flipkart, Housing.com and Twitter.

  • Script streaming: in the past V8 encouraged developers to use async/defer to gain a 10-20% improvement from script streaming. This technique lets the HTML parser hand script loading off to a dedicated script-streaming thread, so it does not block document parsing. V8 recommends loading larger modules as early as possible, since there is only one streamer thread.

  • Evaluate the parse cost of our dependencies. We should try to choose dependencies that offer the same functionality but load faster, for example using Preact or Inferno instead of React; they are smaller and need less parse and compile time. Paul Lewis also discussed framework start-up cost in a recent article, which agrees with Sebastian Markbage's view: the best way to evaluate a framework's start-up cost is to render an interface, delete it, then render it again. The first render includes parsing and compilation, and the comparison exposes the framework's start-up cost.
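As referenced in the code-splitting item above, here is a minimal sketch of on-demand loading with a dynamic import. It assumes a bundler such as webpack splits ./charts.js into its own chunk; the file and element names are illustrative:

document.getElementById('show-charts').addEventListener('click', async () => {
  // The chunk is fetched, parsed and compiled only when the user actually needs it,
  // so none of that work lands on the critical start-up path.
  const { renderCharts } = await import('./charts.js');
  renderCharts(document.getElementById('charts-root'));
});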

If your JavaScript framework supports AOT (ahead-of-time) compilation mode, it can effectively reduce the time of parsing and compilation. Angular applications benefit from this pattern:

How do modern browsers improve parsing and compilation speed?

Don't be discouraged, you are not the only one struggling with start-up time; our V8 team has been working hard too. We found that Octane, one of our earlier benchmarks, is a good simulation of real scenarios: it matches real user behaviour well for micro-frameworks and cold start. Based on these tools, the V8 team has achieved roughly a 25% start-up performance improvement in past work:

In this section we describe the techniques we have used over the past few years to improve syntax parsing and compilation time.

Code Cache

Chrome 42 introduced the concept of a code cache, a mechanism for storing a copy of compiled code so that when the user visits the page a second time, script fetching, parsing and compilation can all be skipped. We have also found that this mechanism avoids roughly 40% of compile time on repeat visits. Some details worth going into:

  • Code caching kicks in for scripts that are executed repeatedly within 72 hours.

  • For scripts inside a Service Worker, code caching likewise works within 72 hours.

  • For scripts cached in Cache Storage using Service Worker, code caching can take effect the first time the script is executed.

In short, for JavaScript that is actively cached, the parse and compile steps can be skipped by the third load at the latest. We can inspect the behaviour through chrome://flags/#v8-cache-strategies-for-cache-storage, or run Chrome with js-flags=profile-deserialization to see whether code is loaded from the code cache. Note, however, that the cache only stores eagerly compiled code, which mainly means the top-level code that typically sets up global state; lazily compiled code such as ordinary function definitions is not cached, although IIFEs are included in V8's heuristics, so they can be cached as well.
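To make the distinction concrete, here is a small illustrative sketch (my own example, not V8 internals) of which parts of a script are compiled eagerly and can therefore end up in the code cache:

var config = { apiRoot: '/api' };  // top-level code: compiled eagerly, eligible for the code cache

(function bootstrap() {            // IIFE: V8's heuristics treat it as code that runs right away
  console.log('boot', config.apiRoot);
})();

function rarelyUsedHelper() {      // plain declaration: compiled lazily on first call, so not cached
  return 'compiled only when first invoked';
}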

Script Streaming

Script streaming allows async scripts to be parsed on a background thread, which improves page load time by about 10%. As mentioned above, this mechanism also works for synchronous scripts.


Since this feature was first introduced, V8 has allowed all scripts, even blocking <script src=''> ones, to be parsed on a background thread. The drawback is that there is currently only one streaming background thread, so we recommend putting large, critical scripts first. In practice, it is worth adding <script defer> inside the <head> block so that the browser engine can detect the script that needs parsing as early as possible and hand it off to the background thread. We can also check the DevTools Timeline to determine whether a script was parsed in the background; in particular, when you have a critical script, make sure it is actually handled by the streaming thread.

Syntax parsing & compilation optimization

We are also committed to building a lighter, faster parser. Currently the biggest bottleneck on the V8 main thread lies in what we call non-linear parsing cost. For example, consider the following code snippet:

(function (global, module) { … })(this, function module() { my functions })

When V8 compiles the main script it does not know whether the module function will be needed, so it temporarily gives up compiling it; when we later do want to compile module, all of its inner functions have to be re-parsed. This is the cause of V8's so-called non-linear parse time: any function nested N levels deep may be re-parsed N times. V8 can now collect information about all inner functions during the first compile, so it ignores them in subsequent compilations; for module-style functions like the one above this is a big performance win. It is recommended to read The V8 Parser(s) — Design, Challenges, and Parsing JavaScript Better for more information. V8 is also looking for a suitable offloading mechanism so that JavaScript compilation can run on a background thread at start-up.

Precompiled JavaScript?

Every few years someone proposes that engines should provide a mechanism for shipping precompiled scripts: developers would use a build tool or some server-side tool to convert scripts into bytecode, and the browser would then run that bytecode directly. From my personal point of view, shipping bytecode directly means a larger payload, which inevitably increases load time, and the code would need to be signed to run safely. Our current position for V8 is to avoid the internal re-parsing described above as much as possible to improve start-up time, while precompilation would introduce additional risks. That said, we welcome discussion on this topic; V8 is currently focused on improving compile efficiency and promoting the use of Service Worker caching of script code to improve start-up. We also discussed precompilation with Facebook and Akamai at BlinkOn 7.

Optimize JS Optimization

JavaScript engines such as V8 pre-parse most functions in a script before doing a full parse, mainly because most pages contain JavaScript functions that will not be executed immediately.

This pre-parsing improves start-up time by fully processing only the minimum set of functions the browser needs to run, but the mechanism actually backfires in the face of IIFEs. Although the engine tries to avoid pre-processing such functions, its heuristics are far less effective than a library like optimize-js, which processes the script before the engine sees it and inserts parentheses around immediately-invoked functions so they execute faster. This kind of preprocessing works very well on bundles produced by Browserify and webpack, which contain large numbers of small modules that execute immediately. Although this little trick is not what V8 would like people to rely on, the corresponding optimization has to be embraced at the current stage.
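A minimal illustration of the kind of transformation optimize-js performs (the module body here is a placeholder):

// Before: the pre-parser sees a plain function expression, parses it lazily,
// and then has to re-parse it in full when it is invoked immediately afterwards.
!function () { /* module body */ }();

// After optimize-js: the added parentheses signal "invoked right away",
// so engines using the paren heuristic compile the function eagerly, once.
!(function () { /* module body */ })();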

Summary

Performance in the start-up phase is crucial. Slow parsing, compilation and execution can become the bottleneck of your page's performance. We should measure the time the page spends in this phase and choose appropriate ways to optimize it. We will also keep working to improve V8's start-up performance as much as we can!

The above is the content of JavaScript startup performance bottleneck analysis and solutions. For more related content, please pay attention to the PHP Chinese website (www.php.cn)!
