Sergei Chikuyonok
Influence of layout on browser performance
December 23, 2008

Task: to demonstrate the correlation between browser performance and website layout.
Creating websites every day, I started to notice one rather unfortunate peculiarity: the same JavaScript code performs differently on different websites. The reason for this strange behavior was soon uncovered: script execution speed correlated with the way the page was laid out. Searching for a solution was an interesting process, but unfortunately I cannot make my findings public for a very simple reason: all the solutions turned out to be extremely local. Some construct was slow on one particular site, and replacing it with another made everything fast again. The only thing readers can do with such findings is be happy for me, since there is not much practical help in them. This became the first reason why I started to explore the problems of modern browsers.
The second reason is the hype around new browser versions. It’s hardly a secret that JS code in browsers is quite slow (compared to a similar C++ program, for example). Developers have only recently come to realize this and started rewriting the JS engines of their products (Google Chrome’s V8, Firefox’s TraceMonkey, etc.). A number of tests have appeared on the internet to measure the performance of these engines; among the best known are SunSpider and Dromaeo. The tests are of pretty high quality, but many users seem not to understand that these are JavaScript Performance Tests, not Browser Performance Tests. I think their results are far from objective for modern web applications. Can one really claim that one browser is faster than the others in all possible cases?
Let’s look at an example. Dromaeo includes a DOM Style test. I won’t go into detail on how it works; I’ll just say that it extracts all div elements of a page and tries to change their color and display properties (I’m interested only in the style modification). What if we place these elements into a different environment (that is, change the layout of the page)? The result is different. What if the element is a picture rather than a div? That’s yet another result. What if we give these elements certain CSS styles? You get the idea :-)
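Roughly speaking, the style part of such a test boils down to something like the following (a simplified sketch of the idea, not Dromaeo's actual code):

    // Grab every div on the page and repeatedly rewrite its color
    // and display properties, timing the whole run.
    var divs = document.getElementsByTagName('div');
    var start = new Date().getTime();
    for (var n = 0; n < 10; n++) {
        for (var i = 0; i < divs.length; i++) {
            divs[i].style.color = 'red';
            divs[i].style.display = 'none';
            divs[i].style.display = 'block';
        }
    }
    alert((new Date().getTime() - start) + ' ms');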
As a result we see that web application performance depends not only on the JS engine but also on the browser's rendering engine. To demonstrate this, I created a couple of simple tests.
Digression
In no way should you treat these examples as "yet another browser performance test." Their only task is to demonstrate the relationship between execution speed and page layout.
I always wondered, what would be faster, an element with position: relative or with position: absolute?
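The test itself is a simple animation; a minimal sketch of what I mean (the #box id and the distance are my own illustration, and the same script is run once with position: relative and once with position: absolute set on the element):

    // Move the test element 300px to the right, one pixel per timer
    // tick, and report how long the whole animation took.
    var box = document.getElementById('box');
    var pos = 0;
    var start = new Date().getTime();
    var timer = setInterval(function () {
        box.style.left = (++pos) + 'px';
        if (pos >= 300) {
            clearInterval(timer);
            alert((new Date().getTime() - start) + ' ms');
        }
    }, 0);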
As you can see, Opera is the only browser where performance depends on the way the element is positioned.
There is a lot of advice on the internet about cutting down the number of elements in the DOM tree. The arguments are often weak, such as "why clutter the DOM?", "the fewer elements there are, the faster the website," and so on.
Let’s see how it really influences performance. For this test I enclosed each word in the content block in a span element. As a result I had 942 DOM elements on the page. To compare:
— Google search result page — 384 elements
— Yandex search result page — 555
— Mail.ru main page — 1196
— Lenta.ru main page — 2669
In this case the test results were compared not to each other but to the results of the first test, since we want to see how the number of elements, not positioning, influences performance.
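For reference, wrapping every word in a span, as in this test, can be done with a small script like the one below (my own illustration: the #content id and the regular expressions are assumptions about the page, not the code actually used for the test):

    // Wrap each whitespace-separated word of the content block in its
    // own span, sharply increasing the number of DOM elements.
    var content = document.getElementById('content');
    content.innerHTML = content.innerHTML.replace(
        />([^<]+)</g,
        function (all, text) {
            return '>' + text.replace(/(\S+)/g, '<span>$1</span>') + '<';
        }
    );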
Well, now we can see for ourselves: fewer elements in the DOM tree mean not only better aesthetics but also better speed.
Note
Don't increase the number of DOM elements if you can avoid it.
Let’s make the previous test a bit more difficult by adding a border: 1px solid #fff CSS property to all the span elements surrounding the words and see how that changes things. Keeping in mind that the user should always see the same result, let’s cancel out the border by adding margin: -1px.
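In script form, the extra rule looks something like this (a sketch; in the real test the styles could just as well sit in a CSS rule):

    // Give every word-wrapping span an invisible white border and
    // cancel its extra size with a negative margin, so the page looks
    // exactly the same while the browser has more borders to paint.
    var spans = document.getElementsByTagName('span');
    for (var i = 0; i < spans.length; i++) {
        spans[i].style.border = '1px solid #fff';
        spans[i].style.margin = '-1px';
    }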
Even to the naked eye it is evident that Firefox suffers a serious drop in performance, even though the final result doesn’t differ much from the previous test.
I hope the guys from Microsoft fixed such a serious performance drop in the final version of their browser.
As with the test for a large number of elements, let’s check whether the depth of the DOM tree makes any difference. To do that, I created a structure of 30 elements nested inside one another. In the first case I enclosed the entire layout in this structure; in the second case I simply added the structure to the very end of the page (to equalize the number of elements on the page). To make these elements at least somewhat justified (they are usually added for a reason, after all), I gave them the CSS style margin: -1px; padding: 1px; width: 100%;
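The nested structure from the second variant can be reproduced with a few lines of script (a sketch of the idea; in the first variant the existing layout is placed inside such a chain instead):

    // Build a chain of 30 divs nested inside one another, each with
    // the styles described above, and append it to the end of the page.
    var parent = document.body;
    for (var i = 0; i < 30; i++) {
        var div = document.createElement('div');
        div.style.cssText = 'margin: -1px; padding: 1px; width: 100%;';
        parent.appendChild(div);
        parent = div;
    }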
Note
A shallow DOM tree not only looks better but also improves website performance.
What is faster: an opaque picture with the CSS opacity property applied, or a picture that is semi-transparent by itself?
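In both cases the same animation is run over the picture; the only difference is how the transparency is achieved (the element ids below are made up for illustration):

    // Variant 1: an ordinary opaque image made translucent with CSS.
    var cssImg = document.getElementById('opaque-img');
    cssImg.style.opacity = '0.5';
    // (in IE: cssImg.style.filter = 'alpha(opacity=50)';)

    // Variant 2: a PNG whose translucency is baked into its alpha
    // channel; no opacity property is applied to it at all.
    var alphaImg = document.getElementById('alpha-png');

    // Both images are then moved across the page by the same
    // animation routine and only the timings are compared.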
IE6 was removed from this test on purpose: the only way to get it to display a semi-transparent picture is to use the AlphaImageLoader filter, which seriously degrades animation performance.
Clearly, animation of an inherently semi-transparent picture is smoother in modern browsers than animation of an opaque picture with the opacity property set. The performance gain in the new WebKit looks impressive.
Despite a fair result, animation in Google Chrome is quite slow.
The case: an image in the background with an animation on top of it. What is faster, the obvious background-image CSS property or an img element?
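The two variants of the test differ only in how the background picture gets onto the page (the container id and the file name are my own example):

    // Variant 1: the picture as a CSS background of the container.
    var holder = document.getElementById('holder');
    holder.style.backgroundImage = 'url(photo.jpg)';

    // Variant 2: the same picture as a plain img element inside the
    // same container; the animation then runs on top of it.
    // holder.innerHTML = '<img src="photo.jpg" alt="">';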
As we can see, using the obvious background-image CSS property led to a serious performance drop in Safari and Opera. Interestingly, the problems are evident on Mac versions of the browsers.
Despite the high result, animation in Opera for Mac looks really poor and choppy.
In reality, in some cases img works much faster than background-image in Firefox too (although this is not evident in this test).
Note
Using img instead of background-image can seriously increase performance.
What if we stretch the background picture from the last test by at least 1 pixel in HTML?
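That is, the picture file stays the same, but its HTML size no longer matches its natural size (the file name and dimensions below are a made-up example):

    // The same image as before, but one pixel wider than its natural
    // size, which forces the browser to scale (and smooth) it.
    var img = document.createElement('img');
    img.src = 'photo.jpg';   // natural size, say, 500x300
    img.width = 501;         // natural width + 1
    img.height = 300;
    document.getElementById('holder').appendChild(img);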
It seems that modern browsers don’t know how to optimize such tasks and redraw the stretched image every time something is changed on the page. The problem comes up in those browsers that smooth stretched images.
In Firefox 3 using large pictures can even lead to page scroll being unresponsive.
Note
In IE7 it's possible to turn on picture smoothing with -ms-interpolation-mode: nearest-neighbor | bicubic, but it will lead to an almost twofold drop in IE7 performance.
As we can see, the situation is far from obvious. To conclude this article, I decided to put together a list of tips that can help maximize the performance of a project.