As is not uncommon, I ran into a snag on the way to my final result. Due to some JS sandbox issues (I think), assigning onbeforeunload directly wasn’t reaching the right object, so I had to awkwardly insert a script element to get this working.
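For illustration, the workaround looked something like this — a reconstruction of the technique, not the actual addon code, with all names made up:

```javascript
// Hypothetical reconstruction: from a sandboxed addon script, assigning
// window.onbeforeunload directly may not reach the page's own window, so
// we build the handler as a string and inject it via a <script> element
// that runs in the page's context instead.
function buildInjectedSource(handlerBody) {
  // The handler body ends up inside the page's own onbeforeunload.
  return "window.onbeforeunload = function () { " + handlerBody + " };";
}

function injectScript(doc, source) {
  var script = doc.createElement("script");
  script.textContent = source;
  doc.documentElement.appendChild(script);
  return script;
}
```

The awkward part is exactly this detour: instead of setting a property, you serialize your handler and hand it to the page as source text.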
After I got this working, I reloaded HipChat and suddenly felt a rush of excitement: something I wrote changed a product I use every day for the better! If you are a web developer and you haven’t played around with addons for browsers like Firefox, Chrome, or Safari, it’s a fun, empowering way to take a little more control of your everyday web experience. The Add-on Builder streamlines the process almost to its perfect form.
I’m a bit sad for Firefox. It used to be the fast, powerful, progressive browser that finally broke IE’s era of stagnant dominance and saved web developers’ sanity. Now, it’s a bloated, slow, unstable monster that’s often a pain in this web developer’s ass.
Siegler nods in assent. It’s funny to me, because in my sphere Firefox is alive and full of new life thanks to efforts like MemShrink and JägerMonkey. The stability cry also rings hollow; it’s a non-issue for me.
Then again, I have a few tricks up my sleeve that you may not. For web developers out there who love Firefox and its mission but worry about the bloat, I have a few suggestions:
Our designer pointed out today that Safari was re-rendering text when CSS animations were active. The affected text was not related to the animation at all. Putting a background color on the text block fixed the issue.
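A minimal illustration of that kind of fix — the selector and color here are made up, since the point is only that the block gets any background color at all:

```css
/* Illustrative only: giving the affected text block a background color
   stopped the spurious re-rendering during unrelated CSS animations. */
.sidebar-text {
  background-color: #ffffff;
}
```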
I don’t know why this worked, but I have a theory. CSS transitions and animations are hardware accelerated in some form in every browser today. Every browser now has to balance complex rendering algorithms against performance, and sometimes they don’t get it right. In Firefox, we break web pages apart into pieces called layers. The content of each layer is rendered in software, and at a later point all the layers are composited using hardware acceleration to give us what we see on our screens.
Does Firefox create a new layer for every single block element? No. There is a cost to creating a layer, so Firefox keeps layer creation cheap and makes them on the fly — for instance, when CSS animations are running. When it creates this new layer, the software renderer has to repaint some elements on the web page. The final bitmap dimensions change, because the larger paint has been broken up into two paints that will be composited together. Good text rendering depends heavily on subpixel positioning, so if you aren’t careful, the anti-aliasing calculations will subtly change when the text is rendered.
Safari does something similar through the Core Animation framework, so my guess is that setting the background color changed the regions the software renderer painted and stopped tickling the bug. A background color permits Safari to paint that div into an opaque ‘layer’ instead of a transparent one, or so it goes in my head.
Is it tacky to quote yourself? I’m in a devil-may-care frame of mind this morning, so I’m doing it anyway!
From this post:
I’m an imperfect being, and I know I should write many more tests than I do.
I will continue fighting the urge to just ignore tests and keep telling
myself why they are so crucial. I’m trying to be part of the solution, and
I’ve committed a few mochitest patches along the way. What Mozilla could do
to help me significantly is to give me great tools…
It’s worth noting that the details matter. I know we devs tend to think of
ourselves as very practical and rationally minded, but the longer I make
stuff the more convinced I am that the emotional experience of using these
tools is crucial to a happy developer. So let’s get some ideas, and I’ll be
happy to file some bugs!
Brainstorm your ideas. We’ll sort them out and file some bugs! I’m particularly interested in what the web developers in the crowd have to say, as in my experience the tools there are a pleasure to use.
…[B]rowsers have historically been very friendly to learning web-making, in part because they keep protocol information in the address bar. My guess is that removing the http:// neither helps nor hinders someone from using the basics of the web—but it definitely makes it harder to learn what hypertext is.
Atul is rightly worried. The web is evolving rapidly, and one of its genes stands trial: the URL. As a prominent piece of our user interfaces, it faces mounting pressure toward extinction. The removal of the protocol is one minor example; the native application trend on mobile is a bigger one. Although the URL remains the technical heart pumping data through the web, it no longer stands as the web’s canonical identity. On my phone, I don’t go to twitter.com; I tap on Twitter. I find myself spending more time outside the browser than ever before.
In truth, Mozilla is beginning to embrace this trend. We are now dreaming up open application markets and painting pictures of web applications that appear no different from their native cousins. The trend feels obvious. In this manner, native has already triumphantly secured its place in the gene pool.
Will the URL ultimately fade into oblivion? I can’t predict the future, but I can ask a better question. If the URL no longer stands as the bulwark defending the hackability and transparency of our web, what new things will we build up to replace it?
Then imagine tapping on a link and immediately seeing a familiar interface that is already loading your web page, in place of a loading screen. That is the goal of startup shrink.
It’s been a week since our first meeting. Some important bugs:
As usual, measuring startup time is tricky. Using wall-clock time for a warm startup doesn’t accurately represent real-world scenarios. Here are some bugs that might help us with instrumentation:
If you have ideas for improving startup time or ideas for tools, get in touch! Better yet, maybe you’d like to contribute some code. We can be found in #startup on irc.mozilla.org.
You may already be writing good browser chrome tests that properly reset any side effects your test may have caused, but do you always clean up? Instead of manually cleaning up your test after a successful run, please consider using registerCleanupFunction. You can register as many functions as you like, and they are guaranteed to run at the end of the test! I didn’t know this technique existed until recently, so I thought I’d pass it along.
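To illustrate the semantics, here is a tiny standalone model of how such a cleanup mechanism behaves. This is a sketch of the idea, not mochitest’s actual implementation — the real registerCleanupFunction is provided by the browser-chrome test harness:

```javascript
// Model of cleanup semantics: every registered function runs at the end
// of the test, even when the test body itself throws.
function makeHarness() {
  const cleanups = [];
  return {
    registerCleanupFunction(fn) {
      cleanups.push(fn);
    },
    run(testBody) {
      try {
        testBody();
      } finally {
        // All cleanups run regardless of how the test body exited.
        for (const fn of cleanups) fn();
      }
    },
  };
}
```

The appeal is exactly that `finally`-like guarantee: a test that fails halfway through still undoes its side effects, so later tests don’t inherit them.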
Unfortunately, I’m not aware of anything similar for other mochitests.
Download the Extension
Update: Bugzilla Tweaks already has this functionality.
So, how did we do it? Through a major effort across the Firefox platform and front-end teams, we split the rendering and processing of web content into a separate process from the one that handles Firefox’s user interface. Firefox developers internally refer to this split as Electrolysis, and you may be familiar with the concept from other browsers like Internet Explorer and Chrome. Furthermore, the content process can asynchronously render more content than is visible, so the main process can quickly pan and zoom by transforming the rendered content appropriately. Meanwhile, the content process is informed of the new visible region and computes the new rendering quietly in the background.
For this post, I’d like to cover the platform foundation that made this possible for mobile Firefox.
One of the largest projects completed for Firefox 4 was the ability to quickly render common kinds of transitions and animations by splitting rendering into separate parts and compositing them together using hardware acceleration. The layers system is responsible for this compositing.
The layers are stored in a tree. A container layer holds child layers, and the leaf nodes are layers that render content, such as Thebes layers or color layers. Each node has properties like a transformation, a clip rectangle, and an opacity, which affect the rendering of the node and its children. You can see what a node looks like in Layers.h. Applying transforms, clips, and opacities is straightforward for DirectX and OpenGL pipelines, so we can generate these accelerated layers on demand for fast web page animations.
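As a rough model of that tree walk — illustrative JavaScript, not Gecko’s actual C++ Layers API — compositing accumulates each node’s transform and opacity on the way down to the leaves:

```javascript
// Illustrative layer-tree model: each node carries a scale, a translation,
// and an opacity; compositing multiplies them down the tree.
function makeLayer(props) {
  return Object.assign(
    { name: "", children: [], scale: 1, tx: 0, ty: 0, opacity: 1 },
    props
  );
}

// Walk the tree, accumulating transform and opacity the way a compositor
// would when it finally draws each content-bearing leaf.
function composite(layer, parent = { scale: 1, tx: 0, ty: 0, opacity: 1 }, out = []) {
  const acc = {
    scale: parent.scale * layer.scale,
    tx: parent.tx + parent.scale * layer.tx,
    ty: parent.ty + parent.scale * layer.ty,
    opacity: parent.opacity * layer.opacity,
  };
  if (layer.children.length === 0) {
    // Leaf: this is where a real compositor would draw the layer's content.
    out.push({ name: layer.name, ...acc });
  }
  for (const child of layer.children) composite(child, acc, out);
  return out;
}
```

The point of the structure is that animating a container only changes the accumulated values, never the pixels the leaves already rendered.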
For our desktop platforms, layers are already being composited using hardware acceleration, but we have disabled it for mobile platforms for now. We hope to significantly reduce mobile Firefox’s CPU load in a near-term release through hardware acceleration.
Layers are not only useful for fast animations. They also demonstrate a semantically clean break between expensive rendering and simple rendering for web pages. The expensive rendering is done in the content process, and the simple layers compositing is done in the main process. These layers generated in the content process and forwarded to the parent process are called shadow layers. In the main process, we manipulate these shadow layers using translation and scale transformations to achieve the effect of panning and zooming. When the layers are updated from the content process, we carefully compensate for any changes in the layers so that the procedure is seamless to the user.
Our first implementation of shadow layers used sockets to transport pixels back and forth, but we quickly found this was too slow. The main process would spend too much time shoving pixels to and fro, causing obvious stalls during panning. We now use double-buffered shared memory between the processes. The main process controls how the read-only front buffer is rendered, while the content process refreshes invalidated content in the back buffer.
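The handoff can be sketched like this — a toy model of the double-buffering idea, not the actual shared-memory implementation:

```javascript
// Toy double buffer: the content side paints into the back buffer, then
// the buffers swap so the main side always reads a complete, stable frame.
function makeDoubleBuffer(size) {
  return {
    front: new Array(size).fill(0),
    back: new Array(size).fill(0),
    // Content process: repaint only the invalidated span of the back buffer.
    paint(start, pixels) {
      for (let i = 0; i < pixels.length; i++) this.back[start + i] = pixels[i];
    },
    // Swap at a safe point; the main process never sees a half-painted frame.
    swap() {
      [this.front, this.back] = [this.back, this.front];
    },
  };
}
```

Because only buffer ownership changes hands at the swap, no pixels cross the process boundary at all — which is what killed the stalls the socket approach suffered from.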
The final component allows Firefox to render more than what is visible on the screen; otherwise, there would be no content to render for hundreds of milliseconds when the user pans. We render extra content by setting a displayport, which overrides the visible region so that the rectangle we specify is painted instead. The web page is none the wiser.
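A simplified model of why the displayport helps — the rectangles and the helper names here are made up for illustration, not Gecko’s API:

```javascript
// The displayport is a rectangle larger than the visible region, so small
// pans can be serviced from already-rendered content while the content
// process quietly repaints in the background.
function contains(outer, inner) {
  return (
    inner.x >= outer.x &&
    inner.y >= outer.y &&
    inner.x + inner.w <= outer.x + outer.w &&
    inner.y + inner.h <= outer.y + outer.h
  );
}

// After a pan, decide whether we can composite immediately from the
// displayport or must wait for newly rendered content.
function canPanWithoutRepaint(displayport, visible, dx, dy) {
  const moved = { x: visible.x + dx, y: visible.y + dy, w: visible.w, h: visible.h };
  return contains(displayport, moved);
}
```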
Using these APIs provided by our platform, mobile Firefox is able to give an experience that is responsive and fast. The experience isn’t perfect yet, but we have a solid foundation to iterate on top of.
The majority of this great work was done by Chris Jones with the assistance of Robert O’Callahan and Timothy Nikkel. Without them, this project would not have happened. Thanks, guys!
Mozilla is a unique force in the browser market whose primary goal is to keep the web available to everybody. Recently, we acted on our mission statement by taking a stand on the Ogg Theora video codec. For HTML5 video, Mozilla chose to support Theora (and only Theora) for two reasons:
Frankly, this isn’t about idealism or sophomoric zealotry; it’s about ensuring a healthy future for the web. After careful consideration, we decided this was worth fighting for. I can’t tell you what the future holds for video formats, but Mozilla will ultimately do what’s best for consumers and publishers.
What’s more interesting to me is our ability to cause such a stir. Our stand would lose its meaning without our market share, which illuminates how important our user base is. Take a step back and recall where the web was just four years ago. Mozilla and our users played an (dare I say the?) important role in where the web is today. That’s actually pretty amazing for free software with grassroots origins.
I’ve come to see Mozilla in a different light through this video debate. As Firefox users, our choice is not only about features, or speed, or extensibility. It’s also about using a browser built by people whose vested interest is, well, everyone. You really are voting with your software, with every vote adding a discrete boost in volume to Mozilla’s voice. For us, it’s about the big picture. So let me gush a little here: thank you, our amazing users <3! For my part, I will do my best to repay your choice with a better browser that gets out of the way and lets you get things done.