Web speed optimization

Time: 2021-6-8

I believe the Internet has become an indispensable part of people's lives. Ajax, Flex and other rich-client technologies let people "happily" use in the browser many functions that used to be possible only in C/S (client/server) applications; Google, for example, has moved even the most basic office applications onto the web. Convenient as this is, there is no doubt that pages are getting slower and slower. I do front-end development, and according to Yahoo's research, the back end accounts for only 5% of performance while the front end accounts for 95%, of which 88% can be optimized.

The above is a life-cycle diagram of a Web 2.0 page. Engineers vividly describe it as four stages: pregnancy, birth, graduation and marriage. If we are aware of this whole process when we click a web link, rather than treating it as a simple request and response, we can dig out many details that can improve performance. Today I listened to a lecture by Pongo of Taobao on the Yahoo development team's web performance research. I feel I gained a lot and want to share it on this blog.

I believe many people have heard of the 14 rules for optimizing website performance. For more information, see developer.yahoo.com

1. Try to reduce the number of HTTP requests [content]
2. Use CDN (content delivery network) [server]
3. Add the expires header (or cache control) [server]
4. Gzip components [server]
5. Put CSS style at the top of the page [CSS]
6. Move the script to the bottom (including inline) [JavaScript]
7. Avoid using expressions in CSS [CSS]
8. Separate JavaScript and CSS into external files [JavaScript] [CSS]
9. Reduce DNS query [content]
10. Compress JavaScript and CSS (including inline) [JavaScript] [CSS]
11. Avoid redirecting [server]
12. Remove duplicate scripts [JavaScript]
13. Configure entity tags (ETags) [server]
14. Make Ajax cacheable [content]

There is a Firefox plug-in called YSlow, integrated into Firebug, which lets you easily see how your website performs in these areas.

This is the YSlow evaluation of my website xifengfang. Unfortunately, only 51 points, haha. The scores of China's major websites are not high either; I just measured Sina and NetEase at 31 points. Yahoo (US), by contrast, scores 97! Clearly Yahoo has put real effort into this. Judging from the 14 rules they summarized and the 20 new points added since, there are many details we really never think about, and some of the practices even seem a bit "abnormal".

First, reduce the number of HTTP requests as much as possible

Every HTTP request has overhead, so reducing the number of requests naturally makes a page faster. Common techniques include merging CSS and JS (combining the CSS files on a page into one, and likewise the JS files), image maps and CSS sprites. Of course, CSS and JS are often split into multiple files for reasons of structure and sharing. The practice at Alibaba's Chinese site at the time was to develop them separately and then merge the JS and CSS on the back end at release time, so the browser still makes a single request but developers keep multiple files for easier management and reuse. Yahoo even suggests writing the home page's CSS and JS directly into the page file instead of referencing external files, because the home page gets so many visits that this saves another two requests. Many domestic portals actually do this.

CSS sprites merge the background images on a page into a single image, and then use the CSS background-position property to show just the needed part of it. Taobao and Alibaba's Chinese site both do this at present; if you are interested, take a look at their background images.

http://www.csssprites.com/ is a tool site that automatically merges the images you upload and gives you the corresponding background-position coordinates, with the result output in PNG or GIF format.
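As a minimal sketch of the idea (the file name, icon sizes and offsets below are made up; a tool like the one above gives you the real coordinates):

<style>
/* All icons live in one combined image, so a single HTTP request covers them all. */
.icon { display: inline-block; width: 16px; height: 16px;
        background-image: url(sprites.png); background-repeat: no-repeat; }
/* Each icon only shifts the shared background to its own offset. */
.icon-home   { background-position: 0 0; }
.icon-mail   { background-position: -16px 0; }
.icon-search { background-position: -32px 0; }
</style>
<span class="icon icon-home"></span>
<span class="icon icon-mail"></span>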

Second, use a CDN (content delivery network)

To tell the truth, I don't know much about CDNs. Put simply, a CDN adds a new layer of network architecture on top of the existing Internet and publishes the site's content to cache servers as close to users as possible. DNS load balancing works out where a user is coming from and sends them to the nearest cache server: users in Hangzhou get content from servers near Hangzhou, users in Beijing from servers near Beijing. This effectively cuts the time data spends travelling across the network and improves speed. For more details, see the explanation of CDN on Baidu Encyclopedia. By distributing its static content to a CDN, Yahoo reduced user response time by 20% or more.

CDN technology schematic diagram:

CDN networking diagram:


Third, add an Expires or Cache-Control header

More and more images, scripts, CSS and Flash are embedded in pages, and visiting them generates many HTTP requests. We can make these files cacheable by setting an Expires header, which simply tells the browser, via the response headers, how long a particular type of file may be cached. Most images and Flash files do not need to change after release; once cached, the browser no longer has to download them from the server and reads them straight from its cache, which greatly speeds up visiting the page again. A typical HTTP/1.1 response carries header information like this:
HTTP/1.1 200 OK
Date: Fri, 30 Oct 1998 13:19:41 GMT
Server: Apache/1.3.3 (Unix)
Cache-Control: max-age=3600, must-revalidate
Expires: Fri, 30 Oct 1998 14:19:41 GMT
Last-Modified: Mon, 29 Jun 1998 02:28:12 GMT
ETag: "3e86-410-3596fbbc"
Content-Length: 1040
Content-Type: text/html

Cache-Control and Expires can be set by a server-side script or by server configuration.

For example, to set the expiration date 30 days ahead in PHP:

header("Cache-Control: must-revalidate");
$offset = 60 * 60 * 24 * 30;                                   // 30 days in seconds
$ExpStr = "Expires: " . gmdate("D, d M Y H:i:s", time() + $offset) . " GMT";
header($ExpStr);

This can also be done by configuring the server itself; I am not too clear on those details, haha. Friends who want to know more can refer to http://www.web-caching.com/

As far as I know, the expiration time on Alibaba's Chinese site is 30 days. There have been problems along the way, though; in particular, the expiration time for scripts needs careful thought, otherwise it may take a long time after a script is updated before clients "perceive" the change. I ran into exactly this problem when working on the [suggest project] earlier. So think carefully about which files should be cached and which should not.

Fourth, gzip components (enable gzip compression)

The idea of gzip is to compress files on the server before transferring them, which significantly reduces the transferred size; after the transfer, the browser decompresses the content and uses it. Current browsers support gzip well, and not only browsers but also the major crawlers recognize it, so SEOers can rest easy. The compression ratio is large: a 100 KB page can typically be compressed to around 25 KB before being sent to the client. For the details of how gzip works, see the article on the gzip compression algorithm on CSDN. Yahoo specifically stresses that all text content should be gzip-compressed: HTML (PHP), JS, CSS, XML, TXT and so on. Our site does a good job here and gets an A on this rule. Our home page did not use to get an A because it carries a lot of ads, and the advertisers' JS was not gzip-compressed, which dragged our site down as well.
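The negotiation itself happens through standard HTTP headers: the browser advertises what it can decompress, and the server marks the compressed response accordingly (the path below is just a placeholder). The request says:

GET /index.html HTTP/1.1
Accept-Encoding: gzip, deflate

and the compressed response comes back with:

HTTP/1.1 200 OK
Content-Encoding: gzip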

The last three points are mostly server-side matters that I understand only superficially; please correct anything that is wrong.

Fifth, put CSS at the top of the page

Why put CSS at the top of the page? Because IE, Firefox and other browsers will not render anything until all the CSS has arrived. The reason is as simple as Brother Pony said: CSS stands for Cascading Style Sheets, and cascading means that a later rule can override an earlier one, and a higher-priority rule can override a lower-priority one. This hierarchy was briefly touched on at the end of my earlier note on CSS !important; here it is enough to know that CSS rules can be overridden. Since an earlier rule may be overridden later, it is entirely reasonable for browsers to render only once the CSS has fully loaded. The problem with putting stylesheets at the bottom of the page is that many browsers, IE among them, then stop displaying content progressively: the browser blocks display to avoid having to redraw page elements, so the user sees only a blank page. Firefox does not block display, but that means some page elements may need to be redrawn after the stylesheet arrives, which causes flicker. So we should finish loading CSS as early as possible.

Following this line of thought a bit further, there is still room to optimize. For example, this site has two CSS files: <link rel="stylesheet" href="http://www.space007.com/themes/google/style/google.css" type="text/css" media="screen" /> and <link rel="stylesheet" href="http://www.space007.com/css/print.css" type="text/css" media="print" />. The media attributes show that the first stylesheet is for the screen and the second is a print stylesheet. Judging from user behaviour, printing a page always happens after the page is displayed, so a better approach is to add the print CSS dynamically after the page has loaded, which improves speed a little (haha).
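A minimal sketch of that idea (using the print stylesheet URL above; the screen stylesheet stays in the HTML as usual):

<script type="text/javascript">
// After the page has finished loading, attach the print stylesheet
// so it never delays the initial rendering.
// (In real code, add to the existing onload handler instead of overwriting it.)
window.onload = function () {
  var link = document.createElement("link");
  link.rel = "stylesheet";
  link.type = "text/css";
  link.media = "print";
  link.href = "http://www.space007.com/css/print.css";
  document.getElementsByTagName("head")[0].appendChild(link);
};
</script>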

Sixth, put scripts at the bottom of the page

There are two reasons to put scripts at the bottom of the page. 1. Executing a script blocks the downloading and rendering of the rest of the page. While a page is loading, when the browser reaches a JS statement it interprets and executes it before reading any further content. (If you don't believe it, write an endless JS loop and see whether anything below it ever appears. setTimeout and setInterval behave a bit like multithreading: rendering of the content below continues until the corresponding delay fires.) The browser does this because the script might call something like location.href or another function that completely changes the course of the page at any moment, so it has to wait for the script to finish before it continues loading. Putting scripts at the end of the page therefore effectively shortens the time before the page's visible elements appear. 2. The second problem scripts cause is that they block parallel downloads. The HTTP/1.1 specification recommends that a browser download no more than two components in parallel per host name (IE sticks to 2; other browsers such as Firefox also default to 2, though the new IE8 can use up to 6). So if you spread image files across several host names you can get more than two parallel downloads, but while a script file is downloading the browser will not start any other parallel downloads.
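Putting rules 5 and 6 together, a typical page skeleton looks something like the sketch below (the file paths are just placeholders):

<html>
<head>
  <title>Example page</title>
  <!-- Stylesheets go at the top so the page can render progressively. -->
  <link rel="stylesheet" type="text/css" href="/css/site.css" media="screen" />
</head>
<body>
  <div id="content">This content renders without waiting for any script.</div>
  <!-- Scripts go last so they block neither rendering nor parallel downloads. -->
  <script type="text/javascript" src="/js/site.js"></script>
</body>
</html>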

Of course, for any given site, how feasible it is to load scripts at the bottom of the page is open to question. Take the pages of Alibaba's Chinese site: there is inline JS in many places, and the display of the page depends heavily on it. I admit this is a long way from the idea of unobtrusive scripting, but many "historical problems" are not so easy to solve.

Seventh, avoid using expressions in CSS

CSS expressions (supported only by IE) are re-evaluated over and over, whenever the page is scrolled or resized or even when the mouse moves, so they are expensive and should be avoided. The alternative, however, can mean two more layers of meaningless nesting, which is definitely not good either; a better way is needed.
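For reference, a typical IE-only expression looks like the sketch below (the selector and values are only illustrative); a static rule is almost always the better choice:

<style>
/* Avoid: IE re-evaluates this expression constantly, which hurts performance. */
#content { width: expression(document.body.clientWidth > 600 ? "600px" : "auto"); }
/* Prefer a plain rule (with a one-time JS fix-up for old IE if you really need it). */
#content { max-width: 600px; }
</style>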

Eighth, put JavaScript and CSS into external files (make JavaScript and CSS external)

I think this one is easy to understand, not only for performance but also for ease of code maintenance. Writing CSS and JS inline in the page saves a couple of requests, but it also bloats the page, and once external CSS and JS files are cached they cost no extra HTTP requests at all. Of course, as mentioned earlier, developers of some special pages still choose to inline their CSS and JS.

Ninth, reduce DNS lookups

On the Internet there is a correspondence between domain names and IP addresses. A domain name (kuqin.com) is easy for people to remember, but computers do not know it; for computers to "recognize" each other it must be converted to an IP address, since every machine on the network has its own IP address. This conversion is called domain name resolution, or a DNS lookup. A DNS lookup takes 20 to 120 ms, and until it completes the browser will not download anything from that domain, so reducing the time spent on DNS lookups speeds up page loading. Yahoo suggests limiting the number of host names on a page to between 2 and 4, which requires planning the page as a whole. We do not do well here at the moment; a lot of advertising systems drag us down.

Tenth, minify JavaScript and CSS

Minifying JS and CSS obviously reduces the number of bytes in a page, and smaller pages naturally load faster. Besides reducing size, minification also provides a degree of protection for the code. We do this well. Common tools include JSMin and the YUI Compressor, and sites such as http://dean.edwards.name/packer/ offer very convenient online compression. You can see the size difference between the compressed and uncompressed JS files on the jQuery download page.
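As a rough illustration (the function below is made up), minification strips comments and whitespace and shortens local names:

<script type="text/javascript">
// Before minification:
function addPrices(prices) {
  var total = 0;                       // running sum
  for (var i = 0; i < prices.length; i++) {
    total += prices[i];
  }
  return total;
}
// After a minifier such as the YUI Compressor, roughly:
// function addPrices(a){var b=0;for(var c=0;c<a.length;c++){b+=a[c]}return b}
</script>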

Of course, one drawback of compression is that code readability is lost. I believe many front-end friends have run into this: Google's effects are cool, but look at its source code and it is a mass of characters crammed together, with even the function names replaced. Sweat! Wouldn't that make your own code very inconvenient to maintain? The approach all of Alibaba's Chinese sites take now is to compress JS and CSS on the server side at release time, so we can still comfortably maintain our own readable code.

Eleventh, avoid redirects

Not long ago I read the article "Internet Explorer and Connection Limits" on the IEBlog. For example, when you type http://www.kuqin.com without the trailing slash, the server automatically answers with a 301 redirect to http://www.kuqin.com/ , which you can see in the browser's address bar. This kind of redirect naturally takes time too. Of course, this is only one example; redirects happen for many reasons, but what never changes is that every extra redirect adds another web request, so we should keep them to a minimum.
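A redirect response looks roughly like this; the browser has to receive it and then issue a second request for the URL in the Location header before it gets any real content, so each redirect adds a full round trip:

HTTP/1.1 301 Moved Permanently
Location: http://www.kuqin.com/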

Twelfth, remove duplicate scripts

This point hardly needs spelling out, and not only for performance; it is also a matter of code standards. But I have to admit that, in the rush to get things done, we often add code that may well be duplicated. Perhaps a unified CSS framework and JS framework would solve the problem better. Piggy's point is exactly right: the goal is not just no duplication, but reusability.

Thirteenth, configure ETags

I don't really understand this one, haha. InfoQ has a more detailed explanation, "Using ETags to Reduce Web Application Bandwidth and Load"; interested readers can take a look.
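Briefly, as I understand it (a general illustration, not something from the talk): an ETag is a validator the server attaches to a response (see the ETag line in the header example above); on a later visit the browser sends it back, and if the file is unchanged the server can reply with a tiny 304 instead of the whole file. For a hypothetical /logo.png:

First response:
HTTP/1.1 200 OK
ETag: "3e86-410-3596fbbc"

Later request:
GET /logo.png HTTP/1.1
If-None-Match: "3e86-410-3596fbbc"

Reply if the file is unchanged:
HTTP/1.1 304 Not Modified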

Fourteenth, make Ajax cacheable

Can Ajax be cached too? When making Ajax requests, we often go out of our way to add a timestamp to the URL precisely to avoid caching. But it is important to remember that "asynchronous" does not mean "instantaneous": even though Ajax responses are generated dynamically and may matter to only one user, they can still be cached, and caching them makes the page feel faster.
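A minimal sketch of the idea (the URL, the lastModified variable and the render() function are all made up): instead of busting the cache with a timestamp, key the request on something that only changes when the data changes, and let the normal Expires/Cache-Control headers do their work.

<script type="text/javascript">
// Anti-pattern: a fresh timestamp makes every URL unique, so nothing is ever cached.
// xhr.open("GET", "/ajax/mail_headers.php?t=" + new Date().getTime());

// Cacheable: include a value that only changes when the underlying data changes,
// e.g. a last-modified stamp the server handed us earlier (hypothetical names).
var xhr = new XMLHttpRequest();
xhr.open("GET", "/ajax/mail_headers.php?lastmodified=" + lastModified);
xhr.onreadystatechange = function () {
  if (xhr.readyState === 4 && xhr.status === 200) {
    render(xhr.responseText);   // assumed to exist elsewhere on the page
  }
};
xhr.send();
</script>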