Unrelated, but just have a look at the comments in that post. It's hilarious, because almost every comment says the site has become slow. Engineers measure load time their own way; users experience load time their own way.
Some of them complain that the site is slow in the sense that their friends' activities are pushed to their timeline a few minutes after they've happened instead of instantly. Not exactly the kind of performance problem the front-end engineers had in mind, I'm sure.
How are all these people “fans” of Engineering at Facebook?
Yeah, I became a fan of FB Engineering hoping to talk about their distributed systems and whatnot. It turns out well over 99% of the fans are simply there to complain about something on Facebook, thinking an engineer will fix it. None of them know anything about technology, and they're often complaining about apps that Facebook has nothing to do with.
It'd be nice if a group like that could have a simple admission test. I'm not trying to exclude people who are actually interested, but something really simple just to keep out the whiners could be useful.
Interesting article, but faster is only useful when everything works.
I'd much rather see a real, concentrated effort on getting rid of the constant errors that pop up on Facebook: "Oops! something went wrong", "chat not available at this time", etc.
I went months without having any significant problems... was really happy with their reliability. But for about the last month it's been really bad. It feels like I'm on their beta site or something.
It's good to make things faster, but their fix, "writing a library" instead of rewriting the same functions over and over again, is really, really basic. Did it really take them that long to do that?
And once again, Facebook creates a completely custom solution for no real reason. They don't see any advantage in basing this on jQuery or similar if they're rewriting the JavaScript anyway?
You're assuming that jQuery isn't fast. In my opinion, especially with something like JavaScript, one of the main reasons one builds upon a library is that the library's functions are faster than one could make independently.
jQuery is concerned with speed, as far as I'm aware; I see benchmarks for it fairly often, anyway, and I know it's used at a lot of big sites that are also very concerned with speed and load time, so it has been through a lot of iteration and testing to find which methods work best.
Of course, if Facebook tested jQuery and found it too slow for their needs, that's all well and good, but somehow I doubt they have. If there is a specific function that's too slow, there's no reason you can't improve or rewrite that function specifically and keep the rest of the benefits of the library.
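To make that last point concrete, here's a minimal sketch of overriding one library function while keeping everything else. The `lib` object and its methods are made up for illustration; they stand in for jQuery, not jQuery's actual internals.

```javascript
// Hypothetical stand-in for a general-purpose library like jQuery.
const lib = {
  // The "slow" generic version you want to replace.
  trim: (s) => s.replace(/^\s+|\s+$/g, ""),
  // Other helpers you keep as-is.
  each: (arr, fn) => { for (let i = 0; i < arr.length; i++) fn(arr[i], i); },
};

// Swap in a hand-tuned replacement; every other function is untouched,
// so you keep the library's testing, community, and future upgrades.
lib.trim = (s) => s.trim();  // native method, typically faster

console.log(lib.trim("  hi  "));  // "hi"
```

This is exactly the kind of surgical replacement open libraries make easy: one hot path gets a custom implementation, and the other 99% of the code stays community-maintained.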
I just doubt that Facebook came up with something faster out of the blue. In some cases, it is true that highly optimized, specific code is necessary to get maximal performance. Those cases are relatively few, I think, and usually involve code at a much lower level than JavaScript. In the case of JS, the code should be tuned to perform well across all of the major JS VMs. This sounds to me like something jQuery's authors would have familiarity with, as they spend all day working on exactly that.
jQuery is not an abstraction or an extra layer of cruft that bogs things down; jQuery is a general-purpose library. It's not like an ORM or a dynamic language, which adds a whole new abstraction that didn't exist before. Yes, ORMs and dynamic languages are slower, but they're slower because they obscure a major functional piece that the computer still has to perform (mapping onto the appropriate SQL dialect and memory management, respectively). In those cases, yes, an adequately informed human could usually make a better choice than the computer.
jQuery is not like that. jQuery doesn't abstract away any major task that the computer has assumed for you. It's just a fast library of commonly used functions and building blocks. It's frequently tested against new browser releases for compatibility and speed. It's frequently optimized and improved. There is a huge community on tap for support, and there are a lot of plugins for it. It's open-source and easy to modify if your case requires it. Lots of other people in other places are making important improvements to that base at no cost whatsoever to you or your company, changes you can apply just by installing the new version (in backward-compatible releases, naturally) and magically enjoying a speed boost that some other guy provided your website.
I don't see any reason not to use a good open-source JavaScript library. We've used jQuery in this example, but I'm sure the argument applies to many others. I don't understand any business reason why Facebook would want to write something completely from scratch when there are so many frameworks already available. The only reason I can see why this would be implemented this way is either developer indulgence or developer ignorance.
I agree that there's a lot of stuff that doesn't get used in a given site. However, I don't think it would be too difficult to pare it down -- perhaps a step in the compilation process would be parsing your code and removing any functions from the jQuery file that aren't necessary.
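As a toy illustration of that compilation-step idea, here's a naive sketch of a build step that drops library functions the app never references. Real tools (e.g. dead-code elimination in an optimizing compiler) do this far more carefully, tracking transitive dependencies; this toy version just greps for `name(`-style call sites, and all the names are made up.

```javascript
// Naive dead-code stripping: keep only library functions whose names
// appear as calls somewhere in the application source.
function stripUnused(libFns, appSource) {
  const used = {};
  for (const name of Object.keys(libFns)) {
    if (appSource.includes(name + "(")) used[name] = libFns[name];
  }
  return used;
}

// Illustrative library (values would be the functions' source text).
const lib = { fadeIn: "…", ajax: "…", trim: "…" };
const app = 'ajax("/feed"); trim(str);';

console.log(Object.keys(stripUnused(lib, app)));  // ["ajax", "trim"]
```

A real implementation would need to parse the code rather than string-match (to handle aliasing, dynamic dispatch, and functions that call each other), but the pared-down jQuery file it produces is the point.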
>> "The only reason that I can see why this would be implemented in this way is either developer indulgence or developer ignorance."
That's quite ridiculous. Any framework is there to solve most problems reasonably well, and to make things easier for developers who may not have the skillset to delve into the inner workings, etc. A custom solution will always be better, and if you're working at the scale Facebook is, it could be orders of magnitude better.
>> "You're assuming that jQuery isn't fast."
'fast' probably isn't the best word to use, since it's all relative. I would say it is certainly non-optimal.
This statement is so absurd to me. I guess it depends on what you load into the word "better". Taking your idea to the extreme, why doesn't Facebook just develop a computer from the ground up, with custom hardware, a custom OS, and everything else? Wouldn't that be faster?
I don't agree with your statement even on the basic level. Why do you even use any pre-made tools if you could just crank out everything and have it work "better"? Why use an established programming language at all?
Custom-made is not always better, especially not at a place where it just now dawned on the developers "if we make a library, we won't have to keep rewriting the same things!"
Let's take the example of Google, a more stable and mature company than Facebook. They certainly have the resources to come up with a completely custom stack if necessary. Why, then, does Google run on Linux? Sure, they modify their systems to meet their needs, but it's still Linux. And, why does Google occasionally rebase their customized kernel off of mainline? Shouldn't Google just rewrite all of those patches so that they are applicable specifically to their situation if custom is always better?
I mean, seriously, be reasonable here. The communities behind Linux or BSD or LLVM or many other open-source projects are skilled and experienced. They may make some sacrifices for the general-purpose nature of their systems, but I sincerely doubt one could sit down and churn out from scratch an OS that's "better" than Linux just like that for their specific use case. Linux or BSD is almost always going to be better than a custom solution. There are a few cases where this wouldn't be true, but in most cases, it is true, and to assume otherwise is silly.
The only advantage of writing something custom comes if the authors of an open system haven't considered your specific problem set, but even then, unless the open system is trivial (relatively) or its implementation is totally off-kilter for your project, you're better off modifying something than starting from scratch.
I agree that there's different levels of complexity there, but the same principles apply.
jQuery is non-trivial. It's been worked on and improved for years. There are many big-time users of it, and many people who spend lots of time developing for it. It's fast and regularly tested for speed, which improves with each release. It's easily extensible and easily modifiable. It's not abstracting anything major, and there is no major speed hit in using it.
It seems that this argument can't be solved without additional data, but if we assume that Facebook's custom library is the same speed or even somewhat faster than an implementation of the same based on jQuery, what advantage is there in the custom library? jQuery would be better here.
The only time Facebook's custom library would be better is if it were several orders of magnitude faster than a jQuery-based implementation of the same thing. I would be very surprised if it were, but it's not outside the realm of possibility, I guess.
That's our point of contention here. You think that jQuery is slower by default and I think it's faster by default. Does anyone know of any data to help corroborate one of these points?
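The kind of data that could settle this is a micro-benchmark: time a generic, chained library-style helper against a hand-rolled loop doing the same work, on each major JS engine. Here's a minimal harness sketch; the workload is arbitrary, and the numbers will vary wildly by engine and input size, so no conclusion is baked in.

```javascript
// Tiny benchmark harness: run fn() n times and return elapsed milliseconds.
function timeIt(fn, n) {
  const start = Date.now();
  for (let i = 0; i < n; i++) fn();
  return Date.now() - start;
}

const data = Array.from({ length: 1000 }, (_, i) => i);

// Generic "library style": composable, allocates an intermediate array.
const generic = () => data.filter((x) => x % 2 === 0).map((x) => x * 2);

// Hand-rolled equivalent: one pass, one output array.
const handRolled = () => {
  const out = [];
  for (let i = 0; i < data.length; i++) {
    if (data[i] % 2 === 0) out.push(data[i] * 2);
  }
  return out;
};

console.log("generic:", timeIt(generic, 1000), "ms");
console.log("hand-rolled:", timeIt(handRolled, 1000), "ms");
```

Run on several engines, a harness like this (with realistic DOM-heavy workloads rather than array math) is the data the thread is asking for.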
The BigPipe technique used on the homepage is quite cool. The initial request just returns <script> blocks as each partial is rendered. I wonder how they work it on the server side; they use PHP, so they're not using any kind of threading.
Perhaps the initial PHP request passes off most of the heavy lifting to their backend services over asynchronous Thrift calls. If that's the case, then their PHP layer wouldn't be doing much work, which doesn't really tie in with their releasing HPHP.
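For what it's worth, the flushing idea described above can be sketched in a few lines. This is not Facebook's implementation (theirs is PHP), just a guess at the shape of it: flush the page shell first, then emit a <script> block for each "pagelet" the moment its data is ready. `renderPage`, `insert`, and the pagelet names are all made up for illustration.

```javascript
// BigPipe-style progressive rendering sketch.
// "write" is whatever flushes a chunk to the open HTTP response;
// each pagelet arrives as a <script> block that a client-side
// insert() helper (hypothetical) would splice into the DOM.
async function renderPage(write, pagelets) {
  write("<html><body><div id='shell'>loading…</div>");
  // Kick off all pagelets in parallel; flush each as soon as it resolves,
  // in completion order rather than document order.
  await Promise.all(pagelets.map(async ({ id, fetch }) => {
    const html = await fetch();
    write(`<script>insert(${JSON.stringify(id)}, ${JSON.stringify(html)})</script>`);
  }));
  write("</body></html>");
}
```

The PHP version would presumably do the same thing with `flush()` after each pagelet completes, with the parallelism living in the async backend calls rather than in PHP itself, which fits the Thrift speculation above.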
"We noticed that a relatively small set of functionality could be used to build a large portion of our features yet we were implementing them in similar-but-different ways."