MemShrink progress report, week 23 (blog.mozilla.com)
211 points by llambda on Dec 5, 2011 | 88 comments



I know people like to complain about the new rapid release cycle, but there's been a definite upswing in the work going on at Mozilla since it started. I wonder if knowing that your code will be on people's desktops in three months rather than two years makes a big difference.


Does Firefox plan to use the same style of forced updates that Chrome does? As a web developer I am very grateful that Chrome is not only updated regularly but that these updates are forced on the end user. IE10 could be the best browser ever, but unfortunately there will still be people using IE7.


It does, but it's nowhere near as smooth, and I'm not talking about dialogs and user intervention.

Whereas Chrome has a stable extension API, Firefox really breaks at least one very popular plugin with each "release." So then you get nasty warnings about incompatible plugin versions and whether or not you want to look up a new version of the plugin. Chrome, by contrast, a) doesn't break plugin APIs very often, even in the alpha and beta channels, and b) automatically updates plugins anyway.


Firefox also upgrades plugins automatically, provided they pass a bunch of automated tests. Unfortunately, it seems many don't.

The openness of their API is certainly a double-edged sword: on one hand, it lets developers dig deep into the browser's internals, which means addons can be immensely more powerful than on Chrome; on the other hand, it means they have a dependency on those same internal APIs.

They've launched the Jetpack SDK as a stable API for addons that doesn't require browser restarts to install them, but almost no one uses it.


I wonder if there's anything other than momentum holding Jetpack back. I would be interested in trying to get some add-ons ported just for fun. I do agree that the add-ons in Firefox are way more powerful; I like that Adblock on FF can block ads before they're even retrieved (leaving my Hulu experience punctuated by silence rather than noise).


> Whereas Chrome has a stable extension API, Firefox really breaks at least one very popular plugin with each "release."

It's not really the API which breaks, it's the versioning.


Doesn't matter, the end-user result is still the same: the browser breaks on every upgrade.

It's beyond inexcusable that this is still not fixed, and I'm sure it's part of the reason the userbase declined further.


I use a good number of extensions, and the upgrade from 7 to 8 went cleanly. I think they've started to get a handle on it.


This is fixed in Firefox 10 IIRC; addons default to compatible (it's preffed off right now, but the plan is to enable it for release).


I keep Firefox 3.6 around for this exact reason. I've been burned too many times by losing plugins.


> It's not really the API which breaks, it's the versioning.

It is definitely the API which breaks in many cases. These 6-week updates change APIs, both Web APIs (used by both websites and addons) and internal APIs.

For example, websites - not addons - can and do break with Firefox and Chrome 6-week updates, because even Web APIs are changed in these rapid updates.

It is true that versioning is a problem as well, however.


That's correct, and that's an issue.

Firefox has been forced to follow Chrome on this, AFAIK, because otherwise some website features (popular ones, especially GDocs) wouldn't work optimally. Even right now, loading Google sites (especially GMail) is faster on Chrome because only Chrome has SPDY support (coming to a Firefox near you in a few weeks, thanks to the fast release mechanism!).

The Web APIs aren't stable at many levels, and HTML5 isn't a finished standard. It's a bunch of drafts, and some are even conflicting (hello, audio APIs).

It seems to me that Google is the main company pushing new drafts and protocols right now, using them to make other browsers incompatible. Generally the drafts are technically fine and good; the issue is the way they're used to kill diversity and obtain complete web (or "internet") control.

You can start to see a lot of "you need Chrome to see this website. Chrome, the fastest browser on earth by Google! <click to download>".

Especially true if you use Opera or IE, which do not update as often as Firefox and Chrome.


Firefox does have a new versionless API, called the bootstrap API. It also allows extensions to stop, start, and hence update, without restarting Firefox.

https://developer.mozilla.org/en/Extensions/Bootstrapped_ext...
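For the curious, a bootstrapped add-on is essentially a single bootstrap.js file exposing four lifecycle hooks that Firefox calls directly; the hook names below are the real ones, but the bodies are a hypothetical sketch:

```javascript
// Sketch of a restartless (bootstrapped) add-on's bootstrap.js.
// Firefox itself invokes these hooks; the bodies here are made up
// for illustration, not taken from any real add-on.
let active = false;

function startup(data, reason) {
  // Register listeners, add UI, etc. -- runs without a browser restart.
  active = true;
}

function shutdown(data, reason) {
  // Undo everything startup() did, so the add-on can be updated live.
  active = false;
}

function install(data, reason) {
  // One-time setup when the add-on is first installed.
}

function uninstall(data, reason) {
  // Final cleanup when the add-on is removed.
}
```

Because shutdown() must fully reverse startup(), updates can swap the add-on out from under a running browser, which is exactly what makes restartless updates possible.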


It already does automatic updates, and work is progressing to make them silent (like Chrome's).


I believe automatic, silent updates are coming in Firefox 10.


I don't see silent updates in FF10 yet. True silent ones are only possible if you move the install into the Roaming area; otherwise Win7 with reasonable security settings will ask for confirmation.

From what I know, IT-departments don't like that.


From what I know, IT-departments don't like that.

It's important to note that, yes, it is true that some IT departments are reactionary and dislike the new release process, BUT that doesn't mean they are right.

There are plenty of analogous automatic update systems which are used every day in IT departments. Virus checkers are the canonical example, but increasingly SAAS applications are the same.

Many IT departments are stuck in a 1999 mode of trying to control the exact version of every piece of software they run.

Fortunately businesses are wising up. First IT tried to tell them they couldn't use iPhones because they "weren't approved". That worked until the CEO demanded one.

Now some try and say rapid browser versioning and automatic releases are bad. They'll get over that, too, or otherwise they'll be ignored.

(And don't bring up the example of the IE6 websites that must use IE6 forever more. That's a problem to solve, not an excuse for keeping an entire enterprise locked to outdated technology.)


You're just wrong. There are several very good reasons why the 1999 mode is both useful and important, at least for the time being (and the next several years). Here is a brief list:

- Cross-testing. My team is responsible for roughly 150 enterprise apps of varying scale, platform, age, and quality, ranging from a few disgusting classic ASP apps to monstrosities we pay nearly $1m/yr in licensing fees for. It is neither feasible nor the best use of people's time to keep up with 12-week release cycles.

- Security. There are good reasons not to allow local admin privileges for most users. Chrome is the first browser we've deployed that we've allowed auto-update to remain enabled, and if it didn't provide silent updates that didn't break extensions AND install in user space, we'd never have been able to do that.

- Antivirus. I'd have thought you'd know this based on the verbiage in your comment, but apparently not. Enterprise AV doesn't work like consumer AV. Orchestrating server software is installed, on which configurations are defined and from which rule sets are distributed to clients.

- Big ERP. As an example, Oracle ERP isn't compatible with every version of Java, and god help you if your users are running Linux and get confused between the official JRE and independent distributions like OpenJDK & IcedTea. If your apps team has built extensions/interfaces to the ERP that also rely on Java, it's entirely possible or likely that they're dependent on a specific version & patch number, too.

I could go on for hours.

I don't think anyone -- at least none of my peers or colleagues -- would say rapid browser versioning & automatic releases are bad. Not being able to manage them can be risky. We have recently standardized on Chrome, but have deployed Firefox 8 as a fallback just in case, and our Windows users have IE8 (or 9, in the case of the few running Win7). For both Firefox & Chrome we included IETab in the MSI we distributed, customized with rules based on app compatibility. We also distributed a custom PAC file with proxy rules. Coincidentally, most of the web developers -- even if they prefer Chrome for browsing -- still prefer Firefox for dev & testing due to the superiority of Firebug & a few other add-ons.


I've always wondered: why can't you just install two browsers?

The old IE6 browser that can run your ancient intranet apps, and the new Chrome that can run the new SaaS apps. The new one auto-updates, and the old one doesn't.

Clearly IT departments are capable of supporting multiple programs, since that's what they do when someone purchases native software instead of an ASP app.


Most companies do. IE & Safari come with their respective OS, and if you're a Windows shop you're probably going to also install either Chrome or FF. We had been standardized on FF until they expedited their release cycle, which made keeping current too much work (this will change once it updates silently without breaking add-ons), but since we were a Google Apps shop anyway, this doubled our resolve to switch to Chrome. We kept FF around as a fallback in case we ran into situations where Chrome didn't work (Juniper SSLVPN meetings in Windows, for example). We actively try to prevent users from accessing IE (removed shortcuts from desktop & Start menu, etc), btw, except for the couple hundred users who actually need it to run legacy apps we have no control over.

Note: about 80% of our employees with computers don't have internet access at all, with the exception of a proxy rule that allows them to access Google Apps (and a few other select SAAS apps). You can imagine this complicates things a wee bit.
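For what it's worth, the PAC whitelist described above is just a JavaScript function the browser evaluates per request; a rough sketch, with made-up hostnames and a deliberately dead proxy address for everything else:

```javascript
// Sketch of a PAC file implementing a whitelist: only approved SAAS
// hosts go out directly; everything else is routed to a non-routing
// proxy, which effectively blocks it. Hostnames are hypothetical.
function FindProxyForURL(url, host) {
  var allowed = ["google.com", "googleapis.com"]; // example whitelist
  for (var i = 0; i < allowed.length; i++) {
    if (host === allowed[i] || host.endsWith("." + allowed[i])) {
      return "DIRECT"; // or the real outbound proxy for allowed traffic
    }
  }
  return "PROXY 127.0.0.1:9"; // dead end: blocks all other hosts
}
```

Real PAC engines provide helpers like shExpMatch() and dnsDomainIs(); the plain string check above is just to keep the sketch self-contained.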


You can't run IE6 on a modern version of Windows. You can run it in a VM, which is Microsoft's "approved" approach (they provide old versions of browsers for this kind of thing).

Most people don't understand the "old IE6 app" issue - it actually applies to a very, very small number of apps. It isn't that these apps don't display in other browsers because of bad HTML - it is because they use weird IE6-only technologies. Microsoft came up with some very strange things over the years (look at Data Islands: http://msdn.microsoft.com/en-us/library/windows/desktop/ms76...), and fixing these apps isn't just HTML tweaking. Of course, if you still have a Data Island app around these days your IT department should be fired...


Most IE-only tech, including this one, was still supported in IE7, and IE8 and later can be set to IE7 standards mode using Compatibility View.


IE6 is so five years ago. Intranet apps that require IE6 are rapidly disappearing, and odds are good that your intranet apps will run best in Firefox 3.6.


Do you know that, security-wise, since Chrome does not need administrative privileges to install, any bug, trojan, etc. can replace the Chrome binary and use it to capture/control/send anything users are doing?

That's the main reason why app INSTALLS should require admin privileges in general.

That's also why Chrome's model for updates is a double-edged sword, and security-wise it's bad.

For enterprises, you should be able to manually push out Chrome updates without getting any admin prompts or the need for admin privileges.

Regardless, for non-enterprise use, seamless browser updates are something one wants to have.

AFAIK Firefox plans to have an authenticated updater service running for that. It means you can't subvert the Firefox binaries to capture/control/whatever the user's PC, as you need admin privileges to upgrade.

And the updater process is a tiny process that verifies signatures before upgrading, thus the likelihood that it gets compromised or has bugs is very low (especially compared to browsers, which are among the most complex pieces of software ever made).


Oh, I understand all of this well.

I just think it is totally, absolutely and completely wrong.

I think that the "risk-averse" nature of corporate IT doesn't serve business, it holds it back.

Things like "custom ERP extensions" are excuses. If a business builds its own software, then it should expect to maintain it.

(WRT the anti-virus thing: yes, I understand this. I also know how this really works in 95% of businesses: the vendor sends a new virus definition file, and IT rolls it out to the business when they get in the next day. In another 4.9% of businesses someone will load the definitions on an SOE desktop, check that it reboots, and then roll it out. It is only in that 0.1% of businesses (if that!) where IT actually adds any value to the virus definition rollout process at all. In all other cases they remove value by holding back updates until they are run through the "IT process".)


IT departments need to stop complaining and get their act together. That, and enterprise software firms should stop charging an arm and a leg for updates, and stop making software that will only run in IE (6). If you buy that software, shout at your boss, not at Firefox or Chrome.


From my understanding there will be a single one-off UAC popup which will allow a service to be installed with higher rights - that service will then carry out installations from that point on, and will be able to do so silently, as it will have the rights to do so.


Why is there still a need to deal with UAC? Why does it need admin rights in the first place? Sounds entirely avoidable (see: Chrome).


Chrome avoids it by installing into the User's own folder. Firefox is installed in Program Files, which requires user authorisation.

Arguably, this is the correct place to install, as it means that the application is usable by anyone on the PC, rather than having to be installed for each user individually.

You can find more information here: http://www.brianbondy.com/blog/id/125/


Chrome does a non-standard install and also can be self-modified, which isn't secure. Chrome could be replaced by a keylogger+Chrome by anything running on the system.

That's why admin privileges are required for accessing Program Files to begin with.

Using a separate updater process is actually the clean way to go, on Windows.


It already does, I believe; or at least, they are hard not to accept.


I am more pleased with Firefox 8 than I ever was with Firefox 3.

But I have to use several more extensions to get back some features that were taken out.


While I'm sure it was an annoyance to lose some functionality and have to use plugins I think the move towards a leaner, more efficient version of Firefox is definitely a great move. It'll bring speedier web browsing to the majority of users.


Honestly I think this is a good thing.


I can't find any public headcount figures, but their weekly meeting notes often show multiple new hires for many weeks now (unsure about the amount of departures). I'm guessing part of the faster improvement is that they've increased headcount and these people are becoming productive.


Mozilla doubled in size over the last year, to nearly 700 people I think.


Mhh. LuaJIT has shown quite well that a trace JIT can be awesome for dynamic languages. Sure, JS has less potential to be optimized than Lua (see this LtU thread, http://lambda-the-ultimate.org/node/3851, somewhere in the middle; read the rest anyway if you're interested in JITs), but the general approach could still be used.

LuaJIT does it by using a very fast interpreter and a trace compiler. The secret sauce seems to be an SSA-based bytecode. I'm not sure how the Mozilla interpreter works, but since it's a relic I doubt that its bytecode is optimized for interpreting and compiling. If Mozilla adopted the same method LuaJIT uses, they would have to reimplement everything from the ground up, which probably wouldn't be a smart move business-wise.

If the language is too hard to interpret, one could write a simple first-stage method JIT and put a trace JIT behind it. Exciting times in JIT land, especially for dynamic languages. Let's see who comes out with the best implementation in the long run.


The article already explains this, doesn't it? The code had both a Trace-JIT and a Method-JIT [1], but switching between the two to always use the fastest is problematic, and it's apparently easier to get the Method-JIT to incorporate more advanced optimizations and outperform the Trace-JIT in the end.

If Mozilla adopted the same method LuaJIT uses, they would have to reimplement everything from the ground up, which probably wouldn't be a smart move business-wise.

I'm not an expert, but the existence of "SpiderMonkey", "TraceMonkey", "JaegerMonkey", and "IonMonkey" (which can apparently be disabled independently) seems to suggest they have no problem with that.

[1] I presume there is always a fallback interpreter as well.


The method I talked about would work differently; let me expand a little. I can imagine that jumping between two JITs is problematic, especially if there is an interpreter thrown in. The method I'm talking about would use a simple method JIT that generates assembly plus the code needed to enable tracing. This compiler would not optimize anything, only get rid of the interpreter overhead. Then the trace JIT would jump in if it finds a worthy trace. There would be no switching back and forth between the two, only from one to the other. In the LtU thread they talk about this too.

I don't read anywhere that the method JIT is better at optimizing, only that the trace JIT doesn't find enough traces to win. Once you have a good trace, you can optimize it like crazy.

Mozilla seems to be moving in this direction: they have thrown out TraceMonkey, but in the long run they'll add one again, and as far as I know they want to throw out the SpiderMonkey interpreter too. The difference would be that they would put more power in the method JIT and less in the trace JIT.

The reason I said they would have to reimplement a lot is that for the interpreter/simple-JIT plus trace-JIT approach to work well you need a nicely designed bytecode, and I'm not sure they have one at the moment (I don't really know). Their existing implementation would have to be reworked to use a new bytecode.
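To make the tiering idea concrete, here is a deliberately silly toy model in JavaScript: a function starts out on a slow "baseline" path, and once an invocation counter crosses a threshold it gets promoted to a "compiled" version. The threshold and stages are invented; real JITs emit machine code, not closures.

```javascript
// Toy model of tiered execution: run a cheap baseline path until the
// function gets hot, then swap in a "compiled" (here: just another
// closure) version. Purely illustrative; numbers are made up.
function makeTiered(interpret, compile, threshold) {
  let count = 0;
  let compiled = null;
  return function (...args) {
    if (compiled) return compiled(...args); // hot path, no counting
    if (++count >= threshold) compiled = compile(); // promote once hot
    return interpret(...args); // cold path
  };
}

// Example: "interpreted" vs. pretend-"compiled" addition.
const add = makeTiered(
  (a, b) => a + b,        // baseline path
  () => (a, b) => a + b,  // stand-in for optimized code
  10                      // hypothetical hotness threshold
);
```

The point the comment makes is that in this scheme control only ever flows forward (baseline to trace), never bounces between two competing compilers.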


Part of the reason is also that Mike Pall is actually a hyper-advanced cyborg from the future.

In case you're not sure what I'm talking about, read through the LuaJIT source code at some point. It's a thing of beauty - he's handcrafted assembly to make everything run as fast as possible, on all of x86, x64, PowerPC and now ARM.


Yes.

"LuaJIT 2.0 intellectual property disclosure and research opportunities" (http://article.gmane.org/gmane.comp.lang.lua.general/58908) is further proof of that.

Rumor has it that Mike Pall is actually a collective pseudonym used by an army of cycle counting language expert cyborgs (from the future).


Mike Pall is awesome, but Mozilla has a horde of very talented compiler guys. They have to fight against legacy code and a badly designed language.

I have learned a lot from LuaJIT already, and I will dig into it more in the future.


The article says the tracing JIT can be faster for tight, non-branchy code. I imagine that Lua programs would be tighter and less branchy than JavaScript, which has many small callback functions invoked by native code.


I would rather say that the article says that the trace JIT they built is only faster for tight, non-branchy code. I would say that the general JS program has this property, but Mike Pall has shown that in the end trace JITs are better. He wrote LuaJIT 1, which is a method JIT, and LuaJIT 2, which is a trace JIT, and his conclusion with LuaJIT 1 was that he hit a performance wall.


"Lets see who comes out with the best implementation in the long run."

How does this differ from what in fact happens?

When theory and reality conflict, reality is correct. If tracing JIT didn't work as well as you expected, it is not reality that is wrong.


It's not different. I was describing what is happening.

Well, I think the evidence is not fully in on that yet. Method JITs are much older and there is a lot of research done on them, especially the old Self papers. Almost everything done in JS has already been used in Self. Even some of the same people work on the JS implementations.

Tracing JITs are relatively recent, and LuaJIT shows how they can work out. It could well be that in the end a combination of both is best.


"As to the size of plugin-container.exe: that is all flash’s fault. There is literally nothing we can do to fix this other than to kill off flash usage on the web."

Interesting comment w.r.t. efforts to shrink Firefox's memory footprint being at the mercy of Flash.


I think it's in a separate .exe from firefox.exe specifically for that reason, i.e. so the browser doesn't get blamed (or crash) when dealing with crap plugins. I'm not sure if Firefox can kill off that process (and reclaim memory) when no plugins are running.

The other browsers should have the same problem, right? They have to run Flash somehow.


No criticism of Firefox intended, more just one more reason to despise flash.


Tracing is no silver-bullet. Mike Pall is.


Very true. Mike Pall is truly exceptional.

Also worth mentioning is that LuaJIT and JS engines have different constraints. For example, JS engines must be very careful about security; LuaJIT does not have to be. So JS engines need to do more cleanup/initialization, for example to prevent things like heap spray attacks, which LuaJIT does not. Also, JS engines are written to be maintainable by a large team, while LuaJIT is really focused entirely on speed.


LuaJIT is written with security in mind. LuaJIT 2.x had permanent write-protection enabled for the generated machine code in the first release, i.e. before anyone else did so.

I do not see how maintainability could somehow have an inverse relation to the engine speed. IMHO the LuaJIT code base is significantly more maintainable than the Mozilla code base.

Related: Fate of Unladen Swallow and TraceMonkey predicted about two years ago, in February 2010: http://www.reddit.com/r/programming/comments/azibq/a_rant_ab...


Ah, very interesting, thanks for replying!

Can you elaborate on what you mean by "permanent write-protection"? Is that to prevent code being overwritten? If so, that is just one possible attack.

Does LuaJIT implement techniques to mitigate heap spray attacks and heap inspection problems? JS engines work against that by initializing things like typed arrays to 0, and by overwriting objects when they are freed. These have a runtime cost though; I'm curious if LuaJIT incurs it as well.
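For context, the typed-array zeroing mentioned here is observable from any script; a freshly allocated buffer never exposes stale heap contents:

```javascript
// JS engines must hand out zeroed memory for typed arrays rather than
// raw heap pages, so a new buffer can never leak another object's data.
const buf = new Uint8Array(1024);
const leaked = buf.some((byte) => byte !== 0);
console.log(leaked); // false: every byte starts at 0, per the spec
```

That zero fill is one of the per-allocation costs the comment is asking about.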

Regarding maintainability, your handwritten interpreter in assembly is not something I would expect other people to easily get up to speed on. The JS engine codebases are more approachable IMHO.


67,000 lines of C++ is actually not a lot of code. It really depends on _where_ these lines are (i.e. are they in a hot path? How _often_ they are the executed?) and how well written it is.

EDIT: I know it's been disabled for months - I'm referring to the original disabling of this code.


It says in the first paragraph that TraceMonkey was actually disabled months ago. So it says more about keeping the codebase clean than anything about the memory usage or speed of the browser.


Removing code from codebases is actually really great for keeping it maintainable. It really should happen more but I get the feeling in a lot of places code is basically never removed. So IMO removing 67k lines of code is good news for the long-term maintainability of any project! (As long as there's some source code left, heh)


As said in the second sentence of the linked post, they were disabled even before they were removed.


On an aside, relating to the performance of these two Firefox technologies: it is interesting that in the battle of the JavaScript engines, artificial metrics are driving the game. No one is measuring the number of clocks taken to run the JavaScript on typical pages like Twitter or Gmail, or whether the variance is enough to even noticeably matter; instead we run looping benchmark code entirely unlike anything actually used in practice (SunSpider being the most egregiously invalid test).

I would posit, based upon nothing, that in practice JIT systems can be a net negative to the overwhelming majority of run-once JavaScript encountered across the tubes.


TraceMonkey, JaegerMonkey, and IonMonkey. Well, the names are nothing if not consistent, but impossible to keep straight. I think IonMonkey is the new one?


TraceMonkey is/was a tracing JIT.

JaegerMonkey is a method JIT.

IonMonkey is an SSA-optimizing JIT.

Type Inference is additional code that analyzes JS both globally and locally for types, and feeds that info into the JITs to make them more effective.

Previously Firefox used TraceMonkey, then TraceMonkey+JaegerMonkey, and now JaegerMonkey+Type Inference. The plan is to move to IonMonkey+Type Inference (not sure where JaegerMonkey fits into that).
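As a rough illustration of why type information helps the JITs, here is a toy "inline cache" in plain JavaScript: once a call site has seen a type, it keeps using that type's specialized handler until the guess fails. The handlers and the typeof-based "types" are invented for the example; real engines cache hidden-class shapes and compiled stubs.

```javascript
// Toy inline cache: remember the last type seen at a call site and
// reuse its specialized handler until a different type shows up.
// Real JITs cache object shapes and machine-code stubs instead.
function makeCachedDispatch(handlers) {
  let cachedType = null;
  let cachedHandler = null;
  return function (value) {
    const type = typeof value;
    if (type !== cachedType) {   // cache miss: look up and remember
      cachedType = type;
      cachedHandler = handlers[type];
    }
    return cachedHandler(value); // cache hit: straight to the fast path
  };
}

// Hypothetical specialized handlers per type.
const describe = makeCachedDispatch({
  number: (v) => v * 2,
  string: (v) => v.length,
});
```

Type inference essentially lets the engine prove many call sites are monomorphic ahead of time, so the guard check itself can often be dropped.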


I also did a write up on this called "Mapping the monkeysphere" that you may find helpful: http://blog.cdleary.com/2011/06/mapping-the-monkeysphere/


[deleted]


Here's how to make Firefox fast: uninstall Firebug.

A clean modern Firefox install is about as fast as Chrome and probably uses less memory per tab.

As far as I'm concerned, there's no reason to use Chrome. Firefox is about as fast, with similar memory usage. But Firefox has two major advantages.

The first is that Firefox has a lot more useful extensions available. You have to be careful because a lot of Firefox extensions suck, but Firefox extensions can do things that they aren't allowed to do on Chrome.

The second advantage is that Firefox is open source and it's easier to feel that it won't be tracking you. Yes, Chromium is open, but Chrome isn't Chromium.


But you don't understand the thinking behind this. He's had a problem with the software and therefore it is bad software. The fact that you and millions more have used said software without similar troubles, quite possibly no troubles at all, is irrelevant. He has a problem, therefore the software sucks as a whole.

This is much the same thinking behind the hate on most software these days, most commonly aimed at Flash and Windows.

How long before we see similar thinking towards his beloved Chrome?

Although the downvotes on the guy's post are unwarranted.


Well, when a piece of software is sluggish and crash prone, I think it's fair to consider it "bad software". Is that really such a controversial position to take?

If Chrome goes down the same path as Firefox, I'll happily trash it as well.

I'd rather be honest about the quality of a piece of software instead of sitting around and making excuses for it.


It's controversial because we suspect that it's your fault that you think Firefox is sluggish and crash prone.

Most likely either you've got some bad plugins installed, or you haven't tried it recently, or both. It's possible that a clean install of a modern Firefox would also be bad for you, but we doubt it.


And I can assure you that it is not my fault. I am a software engineer, for one, and I run a tight ship - my Windows system is perpetually bare-bones. I keep as little running as possible. It's not my system (which is modern and up-to-date, including keeping up with Windows Updates).

Plugin-wise, almost nothing. Flash (groan), but I tend to keep that turned off unless I actually need it.

Under that scenario, Firefox is still sluggish and crash-prone during normal browsing. Innocuous browsing, to boot - I'm talking things like news sites, reddit, HN, StackExchange.

It has left such a bad taste in my mouth that I have very little interest in trying to figure out what is actually behind this - especially when I can install Chrome and have a squeaky-clean browsing experience in an identical environment. I think it is pretty fair to place the blame squarely on Firefox.

While in retrospect I regret the wording of my original post, I stand by my position that I do not feel Firefox is fit to be considered a top browser.


Well, I have had none of the problems you describe, with a machine that's not maintained on your level I might add, so I guess my good experience cancels out your bad experience.

Therefore, Firefox is neither good nor bad, it is neutral.


No.... Firefox may be good for you, but it is bad for him. Both experiences coexist, and yours does not negate his.


But that's my point. The attitude is that since he has had issues with the software he labels it bad software on that basis. I don't have those problems but for some reason I'm not allowed to dispute his label of the software.

I'm just trying to use the false logic presented to me in the same manner. I do not have problems with Mozilla Firefox, therefore, I demand that the world agree with me that it is "good software". How is that any different?

But in the end, it's just "software" and different people have different experiences.


Since I cannot reply to burgerbrain below for some reason...

Yes, he is. He labels the software as bad and apparently expects everyone to agree.

Anyone who states that Product A is bad because of his negative experience with said product while ignoring/discounting other people's positive experience with the same product is demanding agreement.

A better response, to me at least, from someone in that situation is, "I have had problems with Firefox and therefore I choose to use Chrome as I feel it is a better browser."

There's a difference between saying one product is better than another and just labeling one of them as bad. Especially when it's based roughly on personal experience.


Utter nonsense. He is merely expressing his frustration with the software, and you are attempting to silence and discredit dissent.


Well, then I guess we'll agree to disagree.


Is he demanding that the world agree with his assessment?


Here is the problem: "Try it without your plugins/extensions" and "Try a clean installation" have been the standard responses out of Mozilla when faced with any complaints for oh, say, half a decade? And every time I gave them any more of my time, I've been burned. Why should I believe it now? Particularly now that I've found someone^Wsomething that hasn't hurt me and treats me well?

Who knows, maybe firefox really did go to rehab, cleaned up his act, and worked out his problems, but it's too late.


I'm not making excuses for it, I'm questioning your decision-making process of determining what is "bad software".

I simply disagree with you, is that a controversial position?


Disagreement certainly isn't (or shouldn't be) controversial :)

But my questions are pretty straightforward: Does the software crash during "normal" workflows? Is the software sluggish when compared to other similar software, on a similar platform? Is the memory consumption of the software unacceptably high when compared to other similar software on a similar platform?

If the answer to any of those questions is "yes", then I think you have a serious problem on your hands. Those are questions I ask myself when I am building software. I don't see why that decision making process should be controversial - those are fundamental indicators of quality in a software project.


Well, the people who down-voted my post disagree with you. Apparently it is controversial. But I'm glad you don't have an issue with disagreement as without it there is no debate.

Does the software crash during "normal" workflows? Without defining normal, I would say yes. But then again, I have various software packages crash during my workflow. Does that mean all of them are "bad"?

Is the software sluggish when compared to other similar software/platform? Define sluggish. What if on my machine Firefox is sluggish in loading new tabs but Chrome is sluggish in closing old tabs? Can I call them both "bad software"?

Is the memory consumption unacceptably high? What's unacceptably high? My work machine has 10 gigs of RAM while my home machine has two. Which one should I judge the high memory usage on? What was I doing at the time that might have added to the level? Does the high memory usage cause sluggishness on my machine? Is it because of a memory leak in the browser, a memory leak in a third-party plugin or maybe because I just felt like leaving thirty tabs open for more than 24 hours?

My point being that there are way too many factors involved for someone to just label software "bad" when so many people do not report the same problems. People act as if their experience and opinion is common knowledge when most likely it is not. What if the problems, whether real or perceived, are due to new features in the software? Should we all just roll back to Firefox 3.6?

Now, is it possible that Chrome is just a better browser than Firefox? Of course, I'm not disputing that at all and I would tend to agree. I just have an issue with labels that don't necessarily apply.

Now show me Firefox having these problems for, say, a substantial number of people, and then I would have to agree that there's an issue. But I'm talking something significant, such as a 60 to 70% failure rate, and then you have me. Even then it's possible the problem can be fixed so that it is no longer "bad software", but I'm sure no one will listen at that point. Because it seems that in these debates the negative ("bad experience") far outweighs the positive ("good experience"), to the point that my positive opinion of Firefox means nothing.

If that's the criteria to label a product as "bad software", then I say that all of it is "bad software".

Or should I just say, "Chrome good, Firefox bad!" to avoid down votes?


"I have various software packages crash during my workflow. Does that mean all of them are "bad"?"

Yes?... And if you think otherwise, then I must say that I believe you have an unnaturally high tolerance for shit.


Fine then, you guys win. Every piece of software I have ever used that gave me a little trouble, including Firefox, is now bad software in my eyes.

Congrats.


The second advantage is that Firefox is open source and it's easier to feel that it won't be tracking you. Yes, Chromium is open, but Chrome isn't Chromium.

Even if Chromium is open source, development obviously goes where Google wants it. The (IMHO overdone) minimalism in browser chrome (no pun intended) and complete blurring of local history and online Google searches are nice examples. You can't really undo that without forking it.


I find that it is specifically the 'net' tab in Firebug that causes the problem. If you have it enabled it will cause your browser memory to balloon since it is keeping track of everything the browser downloaded.


Your Firefox profile is broken. If you do decide to give it another try make sure you delete it and start afresh. A stock Firefox is not sluggish or crashy.


What's the purpose of using Firefox if you lose all your customizations? Just like Emacs, Firefox rules because it's customizable. On my Linux box, Firefox crashes tens of times a day. I've not dumped it yet because it's the only browser which allows me to customize pages' colors in a reliable way.


I’m not saying that your customisations are breaking Firefox, I’m saying that something in your profile has gone awry and is causing it to crash. I’ve not used the page colour customisation but it’d be worth trying a stock Firefox then dropping your customisations back in to see if that fixes it.

You can create a new profile by following these instructions: http://kb.mozillazine.org/Profile_manager#Linux

You can also see a list of crash reports from your install by going to about:crashes . These link through to the Mozilla crash log server, which often then links on to the appropriate bug in Bugzilla. You might find that what's causing the crashes has been fixed in an upcoming release.


Is this a mozilla.org build? If so, what does about:crashes say? Can you paste some of the links here?

If this is a distribution-provided build, it might or might not have a working crash reporter, depending on the distribution. Check about:crashes, and if the links there don't do anything useful, try reproducing the crashes in a mozilla.org build.


firebug > inspector


Does the world really need Chrome fanboys selling the religion?

This story has nothing to do with Chrome. Yet, as with every single story about Firefox, here's some guy slurring Firefox while praising Chrome.


[deleted]


If I remove all mention of chrome from your post, it's still wrong. Millions of people have no problems with Firefox. If you are having problems, it is either an isolated bug or you screwed something up. Either way, it's not an indication of the overall quality of Firefox.

