Web Fonts Collateral Damage of Ad Blockers (miranj.in)
127 points by morisy on Oct 4, 2015 | hide | past | favorite | 147 comments



While we're discussing problems with web fonts, let's not forget the other common ailment: setting body text in a 300-weight font. Plenty of people don't seem to think to test web fonts on other OSes, as if font rendering were consistent. But what looks good on a Retina MacBook Pro is completely unreadable on Linux or Windows. That light grey that produces a pleasing level of contrast on a high-end IPS panel is nearly invisible on a three-year-old corporate-issue HP laptop.

Really, it's just the usual cycle of exuberance, overuse, and understanding that all web platform features seem to go through, but I for one am overjoyed to be able to fix a page's text via font blocking. For the light grey stuff there's always reader mode.


Some things never change. I've been surfing the web since 1995, and I've watched designers and "designers" ignore readability the whole time; it's really been 20 years now. As soon as it became possible to put background images on pages, almost every random page became unreadable unless you turned off images altogether.

Then the designers discovered they could shrink text to unreadable sizes. They did that too, because it's about the design; they don't care about the text anyway, and ant droppings look better on the page than actual letters.

More recently, some claim that gray text on a lighter gray background reads better. I've always used IPS screens, and it was never readable. I guess these guys crank their brightness and contrast controls to 110% and then wonder why black text looks unreadable. Repeat after me: your readers don't set up their screens the way you do.

And all these pages that display long stretches of nothing because their Njdxjdfj font is the only one which can present their glorious design. Yes, it's annoying.

I'm extremely happy there's a reader mode in iOS and in Firefox. But guess what? The most annoying pages are, almost as a rule, also not available in reader mode. I wonder how they manage to achieve that. Do they tweak the page until readers can't show it? Or is it just grandiose ignorance?

And remember these glorious Flash-only pages?

One thing I'm sure of: whenever technology gives someone a chance to produce unreadable pages, they will produce unreadable pages. The last 20 years have been consistent on that point.

My favorite old page, fully readable:

https://www.bell-labs.com/usr/dmr/www/chist.html

Compare with the current home page of Bell Labs:

https://www.bell-labs.com

Is the goal of modern design, as used in this example, to convey information or to confuse?

The designer is certainly not the only one responsible. The title of the page is: "Redefining Our Relationship with Information." Yes indeed.


It's about the user.

If your user is a developer or designer with a rMBP then you should design around their wants, needs, use cases, etc. If your user is a middle manager at a finance company using a Dell from 2009 then you have to design and develop to their use cases.

Very rarely will these two overlap; only on very, very large products. As a designer, I'm not going to forgo using a font or color palette that my user finds attractive simply because a random person using antiquated hardware/software might stumble upon my product.


This is such a bizarre response that I'm wondering if it's sarcastic or something. To paraphrase you (though not by much, IMO), there are only two types of people in the world: those who have rMBPs (in the same lighting and calibration your page was designed in, with the same eyes and brain you have), and those who don't. (And it doesn't even appear you were using these as two extremes with a spectrum in between, because you said "Very rarely will these two overlap.") Somehow, you know exactly which group your audience belongs to and what they "find attractive", and you don't care if they disagree, or might have been interested in your content but it's not working because they are on the opposite of the "rMBP binary" side you chose.

Even if some of this is true, thinking "very rarely will they overlap" seems so short-sighted. You're just going to turn away possible users because you decided your product is niche enough that you can predict exactly what environment (electronic, biological, and physical) your user is viewing it through?

I mean, I knew the stereotype that designers are opinionated, but this is a whole new level...


I most certainly didn't say there are only two types of people in the world. My comment clearly focused on designing around the user. Your user is different than my user, etc. I simply supplied two common use cases.

It's also a matter of designing around net positivity. If 70% of my users respond positively to a design then I've done pretty well. 30% in most cases is a small number. That doesn't mean alienating and ignoring that 30%, but it also doesn't mean throwing out the approval of the 70%. It's a compromise. (I didn't think I needed to be so explicit in my first comment).

With larger products, say Facebook, even 1% is a million users. So of course they can't ignore them.

I've obviously struck a sensitive chord here with the HN crowd so I won't go any further.


The more general version of your argument makes more sense, but I still think you are taking for granted that you can know your users that well, and how much "your user is different than my user" really applies. You don't seem to account for the magnitude of how positively/negatively the user is responding. If 70% of your users rate it a +2 on a scale of -5 to +5, and 30% rate it -4, that would seem to meet your scenario, but is that really a good goal, to be barely tolerable to a "mere" 30%? Maybe I'm being fallacious here since my own argument is that you won't have data that good, but I mean it as an analogy for why I disagree with the idea of knowingly pushing a design that won't work for some users.

Around here at least, it's weird to bake the assumption of a non-large product into your design philosophy. You don't want to have a large, diverse userbase? Is your design actually defining who your userbase is, rather than the other way around? If you want that, we can't stop you, but it seems strange to us.


> it's not working because they are on the opposite of the "rMBP binary" side you chose

It's a losing cause. It's not just the person you're responding to, even Apple does the same.

I'm running OS X 10.10 on my desktop and it is OK, so I tried it on my non-retina Macbook Pro. But the overall appearance, including the fonts, was just "yeech". I'm not a graphic designer and so I couldn't put my finger on exactly what was wrong. But a bunch of fiddling around with fonts and Accessibility didn't help.

So I reverted to OS X 10.8 and everything is nice again. Time Machine made it so so so easy to revert.


As a designer, if your work is not usable by people with disabilities you have failed at the most basic tasks you are supposed to do.

Read this https://en.m.wikipedia.org/wiki/Web_accessibility and repent


If the target audience for your website is so very limited that they can be approximated as one single guy browsing from a single specific machine, then that's...nice for you, but it's not actually applicable to pretty much anyone else.


> Very rarely will these two overlap; only on very very large products.

This is the Kool-Aid drunk by designers who actually don't give a fuck about their users. 'rarely', 'very very large products'... this is the cry of the lazy, crappy designer.


From a marketing perspective, we kept getting really excited seeing screen resolutions rise and font rendering improve for our audience. Now designers can just design, not "for-web", which used to be a very restrictive medium compared to what could be achieved with a pixel editor.

We realize some strange folks somewhere are still running IE6, but they aren't the ones responding to our campaigns anyway.


Or as they said ten years ago:

"Did you know your web page is unreadable in Firefox?"

"Yeah, but our analysis shows very few of our visitors use Firefox."


But really, it's the only sane way.

Sorry, my QA guy is not going to test with Konqueror just because you happen to like it.


I imagine my electric company sees low Chrome usage, because their site doesn't actually allow you to log in using Chrome.


Somewhere along the way, designers forgot that design is not just about pretty things, but also about being able to comfortably use said things.


This is why people don't like marketers. It honestly doesn't even occur to you that there's a problem with, e.g., screwing over vision-impaired users, because they're a small minority not responsible for many ad click-throughs. "The right thing" begins and ends with money, and you don't even have enough moral compass to realize that normal people don't feel the same way.


You must not live in the same world I do. I get paid to show people messaging they want to see.

Do you get paid by vision-impaired users to represent them? Great! Do that.

And this nonsense about design ignoring the vision impaired is misinformed. Now that we're able to accurately represent content using the full spectrum of HTML, accessibility is better than ever. Most of the sites I build these days have very little non-semantic markup. Disable the stylesheet entirely, if you're into living like it's 1999, and you'll still be able to see the content.

You may not care for an ultramodern brand's style, but that only tells you one thing: it's not for you!


> You may not care for an ultramodern brand's style, but that only tells you one thing: it's not for you!

One of the reasons I minimize my use of web apps (e.g., Gmail) is that I got tired and frustrated with the restyling every 6-12 months: I was no longer comfortable, because it didn't look the same; I was no longer efficient, because it was all reorganized; and I was annoyed, because I was having conscious thoughts about the interface instead of just using it to get stuff done. (This happens to a lesser extent with desktop apps, but at least I can avoid updating, etc.) Did someone consciously decide their product was no longer for me, so they didn't care what I think? Will they care if their decisions make me stop using their product?

I'm hesitant to accept the opinionated, brusque ideas about design being thrown around here, but even if I did, they don't seem to account for the effects of evolving the product while the userbase stays the same. I get the idea of incremental progress, but can you seriously believe that every change is good and is worth the costs? Yesterday, we missed the mark with skeuomorphism, today we know we need to throw it all out, and then we will have achieved perfection!


See, this is exactly what I mean. You don't seem to be aware that it's possible for there to be a difference between "good design" and "maximize immediate profits." This is why shoddy clickbait is taking over the web.


Good design maximizes profits.

You seem to be unaware of the human element in all of this; designers are proud to create the best experience they can, developers are proud to implement it, and clients are proud to let it speak on their behalf.

You answer your own question, and emphasize my point with your perception of "shoddy clickbait" being popular. Obviously, for some reason, people like it. Who are you to say it should be eliminated?

So, again, if you don't like it, it's not intended for you.


I hope this entices developers to reconsider the use of private use area (PUA) [0] based icon fonts for key navigational elements. I see them used time and again, and a slow connection or spotty CDN can render these sites unusable. [1]

If you absolutely have to use icon fonts, at least use ligatures instead of the PUA. That lets you fall back to a whole word if the specific font isn't present, or if the user relies on assistive technology like a screen reader.

[0] https://en.wikipedia.org/wiki/Private_Use_Areas

[1] http://i.imgur.com/GR9Jk2b.png
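The ligature approach can be sketched roughly like this (the font name and class are hypothetical; the idea is that the markup carries a real word, which the icon font, if it loads, collapses into a glyph via a ligature):

```html
<!-- The element contains a real word: screen readers read it,
     and it stays visible if the icon font never arrives. -->
<span class="icon">search</span>

<style>
  .icon {
    /* "MyIcons" is a hypothetical ligature-based icon font. */
    font-family: "MyIcons", sans-serif;
    font-feature-settings: "liga"; /* make sure ligatures are enabled */
  }
</style>
```

If "MyIcons" fails to load (slow connection, spotty CDN, font blocker), the fallback font simply renders the word "search".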


There is even an actual Unicode symbol for the left-pointing magnifying glass ([1]) that they are using, yet they use the PUA anyway.

I use the [1] Unicode symbol semantically on my blog with a custom font at http://eligrey.com/blog/

1. Edit: Hacker News seems to have stripped the character from my message. I am referring to U+1F50D left-pointing magnifying glass.


NB I still get a broken character and not the magnifying glass on your site: https://imgur.com/a/3Rb7t


I've made a point of installing a variety of extra Unicode fonts like Symbola, and I still find sites with characters that don't render properly. I do see the magnifying glass on the parent comment's site though...


I think the main problem is that popular icon fonts like Font Awesome don't use ligatures.

Is there any reason why they don't? It seems it would indeed be much better.


Thanks for explaining the issue so clearly. I see this a lot after installing Privacy Badger, and I wondered what it was called. In the rare instances that I needed one of these for page navigation, I've just hovered the mouse pointer over it and guessed at its purpose by looking at the URL. It's one of those things that are an annoyance, but not quite a big enough issue to spend my time solving or learning the cause of.


Same here. I knew that something was broken, but meh ;)


Me too. I blocked web fonts in Firefox because a lot of them rendered text like trash when I had font anti-aliasing/sub-pixel rendering disabled (I prefer blocky-crispness to blurry-smoothness). Those stupid icon fonts have been the collateral damage.

And what's the point of sticking a handful of icons into a downloadable font rather than just downloading a few images?


> And what's the point of sticking a handful of icons into a downloadable font rather than just downloading a few images?

Some reasons:

* They are scalable vector images without the download bloat and processing load of rendering them with SVG.

* Raster icons need to closely match the screen DPI or they look awful; e.g., Retina displays made raster icons look like blocky, pixellated afterthoughts.

* They inherit colors, sizes etc from CSS the same way as their surrounding elements.
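As a sketch of that last point, an icon-font glyph restyles with plain CSS exactly like the text around it (selectors here are illustrative):

```css
/* The icon glyph is just text, so it follows the cascade. */
.button .icon {
  font-size: inherit;  /* scales with the button's label */
  color: inherit;      /* recolors along with the label */
}
.button:hover {
  color: #c00;         /* label and icon glyph both turn red */
}
```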



As a NoScript user, I rarely see web fonts and I prefer it this way. The text shows up instantaneously and the font will look great and be perfectly readable.

Most web fonts don't look that nice on Windows. Windows heavily relies on hinting and doing this properly is a lot of hard mind-numbing work.

Most web fonts also aren't that readable. Sure, your wide/thin/square font looks very modern and stuff, but it's not as readable as a Verdana or Arial. When I visit your site, I'm there for the text, not the font.


I know you are specifically talking about web fonts but let's face it: a lot of the modern web sucks ass.

I don't want my browser downloading custom fonts, autoplay videos, animated GIFs, or CSS elements that move around on their own (or refuse to move when I scroll). I am also not interested having my browser download javascript or tracking cookies from dozens of third party domains when I visit a site.


Your complaint is about design and how the site serves the page, not about web fonts. Knowledgeable developers know how to deliver web fonts quickly, and web fonts that look good on your system.

Unless you're paying by the byte, what a well-developed site downloads shouldn't concern you, as long as it doesn't hang on your phone.

Unfortunately, in this copy/paste era, too many sites shoot themselves in the foot.


Neither you nor anyone else gets to tell people what concerns them. If they care about it, it's a concern.

The cause of the concern should then be addressed, not the person having the concern.


People are "concerned" about a lot of crazy things that shouldn't bother them.


It's good to have someone who defends these modern practices. My problem concerns NPR (i.e., npr.org) and their direction for the use of Drupal with a specific config for their local affiliates' sites, which often requires Google JS for the site to function properly. Practically speaking, this will not impact most visitors, but most of these people are being pigeonholed for advertising purposes without knowing it. I have a big problem with this. It is now affecting non-technical people because of the use of (ad) blockers on iOS.


> Knowledgeable developers know how to deliver web fonts quickly and web fonts that looks good on your system.

1. Most developers aren't good, or at least aren't paid well enough to do what they consider a good job.

2. You cannot deliver web fonts faster than you can deliver a page without web fonts. Web fonts will always add latency, more often than not in a noticeably annoying way.

Besides... downloading a 1 MB font to render a typical 4 KB of text? What sort of madness drives this behaviour?

I can definitely see the potential for a web-font blocker extension, like the ad blocker I have now and the Flash blocker I had in the past (which no longer seems needed).


Again, your complaint is with developers and designers who did not do their job properly, not with web fonts.


I have to wait for it to load, view it and charge my battery later. I don't get that time back. It concerns me.


You, too, are complaining about what the developer did and how he did it, not the web font. I use web fonts almost everywhere and our pages "blink" on.


You're overlooking the "view it" part. I don't want to see your web font in the first place.


> I'm there for the text, not the font

Me too. In Firefox on OS X, I simply set Lucida Grande as my default font, and I uncheck the box that allows pages to choose their own fonts. I don't think I'm missing much.


I did the same in FF/Win - web fonts were a huge step back in my opinion - but immediately discovered a bunch of sites (including GitHub) abusing web fonts for button images, which made them pretty much unusable. Have you found a selective workaround for this?


With uMatrix you can selectively enable various features for sites. I tend to start with minimal sets and re-enable to the point I've achieved sufficient functionality for my needs.


I use Chrome and a plugin called Font Changer that lets one override fonts on pages. I don't need to have it turned on for all sites but it can be done. Mostly I just have everything set to Arial (I don't like reading any serif fonts on screen). Perhaps there is a plugin for Firefox that does a similar thing.


Web fonts done properly don't require JavaScript. You are correct about Windows having objectively terrible font rendering though, and not just in web browsers.


> Web fonts done properly don't require JavaScript.

NoScript blocks fonts, video, audio, and JavaScript.


I love typography and have no issue with the fonts themselves; font servers are the problem here, making several requests to download not only fonts but also JS scripts. We could easily see Google Fonts as Google Analytics in disguise; same for Adobe. Encoding fonts directly into stylesheets works pretty well.


"font servers are the problem here, making several requests to download not only fonts but also js scripts."

Right. You no longer need JavaScript to download fonts. You don't need multiple versions of the font any more, either. TTF, OTF, or WOFF work on everything current.[1] Early iPhone models (iOS < 4) only accepted fonts in SVG, which was strange, but now they're on board.

If you need to convert from one font format to another, see [2].

[1] http://caniuse.com/#feat=ttf

[2] http://www.fontsquirrel.com/tools/webfont-generator
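In practice, a single-format rule suffices for current browsers; a minimal sketch (the font name and path are placeholders):

```css
@font-face {
  font-family: "BodyFace"; /* placeholder name */
  src: url("/fonts/bodyface.woff") format("woff");
}
body {
  /* Fall back to a common system font if the download fails or is blocked. */
  font-family: "BodyFace", Verdana, sans-serif;
}
```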


Thanks. You can also put base64-encoded fonts directly inside your stylesheets; this works very well, just be careful not to choose overly large fonts.
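A sketch of the inlining approach, with the base64 payload elided and the font name a placeholder:

```css
@font-face {
  font-family: "BodyFace"; /* placeholder name */
  /* The entire font travels inside the stylesheet: one request,
     cached together with the CSS. Keep the font file small. */
  src: url("data:application/font-woff;base64,...") format("woff");
}
```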


Font Squirrel lets you do custom subsetting, so you can generate a tiny custom font with only a few characters. I do exactly that with Font Awesome, so I'm only shipping the characters I need.


Yes, I did that too in the past for icons, but now I just use inline SVG. It's tiny, and you can style it, so... multicolor and animated icons!
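For instance, an inline SVG inherits the surrounding text color via currentColor (this magnifying-glass icon is hand-sketched; the coordinates are arbitrary):

```html
<button>
  <svg width="16" height="16" viewBox="0 0 16 16" aria-hidden="true">
    <!-- stroke="currentColor" makes the icon follow the button's text color -->
    <circle cx="6" cy="6" r="4" fill="none" stroke="currentColor" stroke-width="2"/>
    <line x1="9" y1="9" x2="14" y2="14" stroke="currentColor" stroke-width="2"/>
  </svg>
  Search
</button>
```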


Not really: the browser caches these files. They're only loaded once; no requests for subsequent usages. I don't see how that helps with spying on visitors?

Google must be doing it to improve performance.


Sure, I give Google the benefit of the doubt in regard to user data collection on GF.

As for caching, mobile phones have very small caches and will reload fonts frequently, especially with the large variety of fonts available. And if the browser can cache font files, it can also cache stylesheets with base64-encoded fonts, saving requests at the same time.


> They're only loaded once. No requests for subsequent usages.

Are you sure? Typically a <script> tag pointing to a cached script will still trigger a conditional GET.


If you include an expiry date with a cache-control header then the server won't be contacted. I typically use this in combination with a unique suffix (for changes) for my web applications.
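For instance (illustrative URL and header values), a response like this lets the browser reuse the file for a year without ever re-contacting the server, and the hash in the filename changes whenever the content does:

```http
GET /assets/app.3f9a1c2.min.js HTTP/1.1

HTTP/1.1 200 OK
Cache-Control: public, max-age=31536000
Expires: Wed, 28 Sep 2016 15:17:51 GMT
```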

It appears that Google is not doing that; they are indeed using these scripts to track your users.

Take jquery for example:

Response:

  Remote Address:[2a00:1450:4009:80a::200a]:443
  Request URL:https://ajax.googleapis.com/ajax/libs/jquery/2.1.3/jquery.min.js
  Request Method:GET
  Status Code:304 OK
Response Headers:

  age:414824
  alt-svc:quic=":443"; p="1"; ma=604800
  alternate-protocol:443:quic,p=1
  date:Tue, 29 Sep 2015 15:17:51 GMT
  expires:Wed, 28 Sep 2016 15:17:51 GMT
  server:GFE/2.0
  status:304


If Google isn't doing that, wow, they ARE tracking GF users!


If you use an ad blocker for privacy reasons, you probably also clear your cache once in a while to get rid of tracking cookies. Then you also lose all your cached fonts/jquery/etc.


Oh, I block all 3rd party links from all sites with uMatrix. The only things that ever get through are CDNs.


Fuck your web fonts. Seriously, just fuck them to hell.

I've been online since long before the World Wide Web was a thing, and I've watched the evolution of Web design from TBL's first proposals, through background wallpapers, animated flames, multicoloured layouts, spacing pixels and table-based layouts through to the saviours that were supposed to be CSS and AJAX.

It's all been a terrible mistake.

I'm leaning toward a model in which a standard set of templates exists: article page, index, gallery, catalog, search; and the client has a set of standard (or custom) templates to view them with. Client overrides server and author.

Yes, this means putting all but three Web designers in the world out of work. Couldn't happen soon enough.

A recent discovery of mine was that the combination of uMatrix and Stylish is fantastic. As the first blocks by default all CSS, fonts, and JS, I'm given a blank canvas from which I can apply my own preferred stylings (look up Edward Morbius's motherfucking Web page for a general taste). https://ello.co/dredmorbius/post/GwGDOuSqWn91CRkQBUQeYQ

Before this it was a stable of over 1800 local stylesheets, most quite brief and/or standard, to fix common gripes.

And it's not just me. A visually disabled friend, totally blind in one eye, 20/60 vision at best with correction in the other, and generally not particularly computer literate, has endless frustrations with gimmicky websites with all the usual crap: low contrast, tiny font sizes, hard-to-read fonts, poor colour choices, content which re-renders multiple times, etc.

I just spent an hour at the local Apple store exploring various accessibility options. While there are some for Mac and iOS products, they're terribly insufficient. No way to globally set user font sizes. No way to make "reader mode" the default for either Safari or Firefox. No alternative to mouse and icon interactions in far too many cases.

And that's just the OS. The Web as a whole is many times worse.

Readability Mode for the entire Web, with a small header space for branding, would be a huge improvement over the status quo.


I mostly agree with you. The web itself is wholly incorrectly designed, or should I say, evolved incorrectly. Instead of generic content that gets modified to individual users' preferences, we have the inverse: every site has some special functionality/code and therefore needs a web designer. This not only increases complexity, but also cost. With a generic web, anyone could make a web page, no coding required, and anyone could read it. You can see that there is actual demand for this, as there are emerging solutions out there that try to solve it. Unfortunately, they are solving it the only way possible: by using the current web standards and building stuff on top of more stuff.


Generic standards are the standards of the web. You don't need to use css. You can use semantic tags. It looks very bland, as it must, since no specific design thought has gone into it.

But this will never suffice for companies that have a marketing budget. They need to promote their brand more than they need to distribute information. It's not the web's fault. And any scheme you come up with is going to suffer the same fate.
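A page built that way needs no stylesheet at all; a bare-bones sketch (the content is made up):

```html
<!DOCTYPE html>
<html lang="en">
<body>
  <article>
    <header><h1>An Article Title</h1></header>
    <p>Body text renders readably in every browser's default style.</p>
    <nav><a href="/">Home</a></nav>
  </article>
</body>
</html>
```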


Using generic tags and allowing clients to specify presentation is pretty much precisely where I'm headed.

For an excellent example of the first half of this, see Mark Pilgrim's excellent "Dive into HTML5". I'm referring here to the structure of the document, not the content (though that's also excellent). The document is virtually entirely bare-bones HTML5 tags, with a typical nesting depth of 2-3 elements, rarely more.

Mark has also applied an excellent stylesheet to the site (the exception that proves the rule that virtually all CSS sucks). Again with a minimum of chrome and glitter.

My thought, again, is that, similar to how LaTeX offers a few basic document templates, a set of standardised semantic site layouts, for which clients offer standard presentation formats, with variants for high and low contrast, "night mode", and simplified design, might be preferable.

http://diveinto.html5doctor.com/


Don't call them the standards, if almost nobody is using them.


In that case, the standards being proposed can't exist based on your definition.


I just tried out uMatrix and disabled all CSS.

Am I now expected to either write, or hunt down a stylesheet for every single host?

Without the CSS, the navigation sidebar just goes at the top, which makes most sites unusable.


I have a single stylesheet I call "unstyled", which I apply to those sites I don't either explicitly style myself or allow to use their own styles via uMatrix.

I'm not claiming that the user experience of individually styling each website is preferable. I am stating that the design experience of utterly nuking virtually all sites' CSS is vastly preferable to suffering through what's shoved at me.

Using Stylish, I can:

1. Create a default CSS which applies to all sites.

2. Create individual stylesheets matching a specific site, or regex, or domain.

3. Create one or more stylesheets which apply to multiple sites, though those have to be individually specified.
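In Stylish, that scoping is expressed with @-moz-document rules; an illustrative sketch (the sites and styles are made up):

```css
/* A stylesheet scoped to one specific site. */
@-moz-document domain("example.com") {
  body { font-family: Verdana, sans-serif !important; }
}

/* One stylesheet applied to several individually listed sites. */
@-moz-document domain("example.org"), domain("example.net") {
  body { max-width: 40em !important; margin: 0 auto !important; }
}
```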

Again: I'm not claiming that the overall result is perfect or even good, simply that, for me, with reasonable CSS chops and sufficient frustration with the status quo to care and take the effort, that what I end up with is better than what I started with, and worth the (varying levels of) effort (on a per-site basis).

The more often I use/visit a site, the more likely I am to change its CSS to suit my tastes. Sites I visit only once I'll simply leave unstyled, or apply "unstyled" to (manually).

I've posted a few iterations of "unstyled" to Pastebin, one from August: http://pastebin.com/7TRnwG78

(This may not be the most recent, system on which that lives is currently unavailable.)


Thanks for sharing your stylesheet. Skimming the code, am I right in concluding that the navigation bar will still be shoved to the top of the page?


Since that applies to sites for which CSS is entirely disabled, it tends not to be an issue (some sites use in-line styles, causing problems).

That said, there's a separate "annoyances" CSS (which I don't believe I've posted, though you might find a copy via DDG/Google) which attacks a whole slew of other gripes: headers, footers, interstitials, modals, flyovers, etc. It's very broad (e.g., [class="modal"], [id="modal"], modal, ..., with a bunch of similar patterns for other annoyances). I've got to explicitly override those on selected sites, particularly where they use confirmations and other dialogs.
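A sketch of that kind of rule set (note that [class="modal"] matches only the exact attribute value; substring matching needs [class*="modal"], which is presumably what a broad annoyance sheet wants):

```css
/* Hide common overlay and interstitial patterns wholesale. */
[class*="modal"],
[id*="modal"],
[class*="overlay"],
[class*="interstitial"] {
  display: none !important;
}
```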


Thanks. It sounds like it might be more trouble than it's worth for me at this point, but I definitely appreciate the effort.

Of course, the root of the problem is that HTML is not really a good basis for implementing hypermedia.


I forget just who it is that said this, but the problem isn't so much HTML, as that there's no single entity in the role of enforcer to require that what's posted as HTML is actually sane HTML.

It's possible to do _very_ good things with HTML, CSS, tables, and Javascript even. I've done a few things myself of which I'm somewhat proud (my CodePen, referenced earlier in this thread, carries a few examples). But, as with much else, it's possible to throw out complete crap.

And browsers will render it. They've got multiple quirks and brokenness modes to deal with just such issues.

Search engines will index that crap. Hell, many SEO techniques are themselves based on abuse of HTML (keyword stuffing and other semantics stuffing).

Tools that simply say "fuck you" and stop rendering shit content, or just do what the user actually wants, would go a long way toward fixing this. Various reader-mode tools (now in Safari and Firefox) are a key example. I'm actually using the latest Firefox for Android, and it specifically addresses a number of my own key browser gripes, at least in part (though by no means all, or completely).

Google's Chrome is actually about the worst of the browsers from this perspective; it leans far too much in the direction of yielding full power to web authors. Yes, it's possible to create some slick websites in the process, but you end up with gobs and gobs of shite.


"The Web is an Error Condition" -- not an essay of its own but a comment within this one:

http://deirdre.net/programming-sucks-why-i-quit/

Honestly, I miss the days when Netscape Navigator would just halt rendering in the middle of your page, saying, “No, I will not parse any more of your shit until you fix it.”


Remember when the way that you styled body text was by putting it inside the <body> tag? It just baffles me that writers and creators took one look at the no-nonsense information delivery systems of the early Web and promptly spent 20 years working their hearts out making it slower, heavier, and harder to read.


Did you just refer to your own website in third person?


I referred to the title and a useful search-engine keyword set. Edward Morbius strives to avoid third-person references to himself.


Twitter Bootstrap is in a way one step in this direction. All pages look kind of the same, but since they all build on the same framework, it's easy to customize it over and over to get some kind of branding applied, or to revert branding client-side for accessibility.


Not familiar with that, though microformats generally, somewhat encouraged by Readability, seem like progress as well.


Yes. Somehow having a consistent look and feel is considered to be good for native apps but bad for websites. On Windows you have to make sure your controls look and behave the same as every other app or people will get lost and think it's some clunky port of an open source Linux app. On the web, somehow every designer wants to reinvent the wheel.


Apple and to some extent Microsoft impose the look and feel of apps on their platforms (desktop and mobile), but nobody has control of the web. It's just too large and decentralized. Best practices are all we have, but not everybody knows them. Just look at the number of pages with a light background and text too light to read comfortably, or pages that block pinch-to-zoom on a tablet so you need a lens to read them.

Edit: check this http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3129953/ for an example of reinventing the wheel: a horizontal scrollbar to flip pages instead of the browser's plain vertical scrollbar.


That horizontal scrollbar is similar to the one on the Internet Archive's BookReader Web app. I find it appropriate there.

The NIH page suffers in that it doesn't scroll down to allow full-screen viewing on tablets such as the one I'm using: the browser nav and tab bar remain permanently visible, and I'd prefer to have them go away. Many sites with persistent headers suffer similarly (e.g., G+).


Google and other search providers are probably best positioned to encourage improvements in design and presentation. They've already done quite a bit of this with regard to accessibility generally, as well as mobile and HTTPS usage. I'm hoping they'll tackle usability and design simplicity in the future.


I worked on a project that had to implement an ebook-like UI like that. Not trivial. Why that site had to have it is a mystery.


I'm with you, buddy. I have buttons for turning off all that stuff. Let me choose the font I find best, at the size my old eyes like, for the device I'm using.

The arguments I've had when higher-ups want the web page to render pixel-perfect to the "design"... "But all renderers are different!"


In my view, fonts are just another case of potential conflicting interests between content creators and content consumers. I can easily imagine wanting to override the fonts chosen by content creators, for instance if a vision impaired person wants to substitute a more readable font (or even let a text-to-speech program take over).

Content blocking has been around forever. I remember when I got my first real Internet connection, and the first thing I did was to disable the automatic downloading of all images. It greatly improved my browsing experience on my 9600 baud dial-up connection.


I disable web fonts, because the most legible font is the font you're most used to reading. Fancy typography is the graphic design version of loudness war music mastering - something to gain attention and seem attractive in the short term, but unpleasant in the long term. In both cases there's a tragedy of the commons situation. People compete to stand out and make things worse for everybody. Blocking web fonts helps prevent this market failure.


Legibility is important, but it's not the only reason to choose a font. The form of words (font, colour, size, context) also helps tell you what they mean.

I agree the reader should be able to override appearance when it suits them, but there's nothing bad about content creators using form as well as content to communicate. Feel free to block web fonts, but you should recognise you are losing something more important than mere decoration by doing so.


I agree that the form of the words can be a good way to convey information. That's why we have bold and italic. There's no need to change the entire font when you could use those standard variations of a familiar font. You're also losing something important (familiarity) by allowing designers to change fonts at will, and I argue that's a greater loss.


Just to nitpick, your connection probably wasn't 9600 baud; I suppose you mean a V.32 modem which was 9600 bit/s but 2400 baud. (9600 is transmission speed, 2400 is symbol rate.)


Fonts are relatively small compared to media. Full screen top page images are only viewed once and mostly show nearly immediately on any decent connection, and are already several times larger than a web font or two used to style entire web sites. Full screen video is far worse in terms of bandwidth without user consent. And if you're on a shopping site or any site with photos, a web font is practically one photo. It is not that expensive at all.

We must not also forget that web fonts have replaced text graphics. They can be small, but when used on every header or with multiple states in interfaces, they added up quickly. Not only are web fonts lighter, they are real text. We don't want to revert to text graphics.

What we have is the designer on one end, and the user on the other. The designer wants to decide how the page is served. The user wants to choose how they consume the page. Some users may add salt and pepper, or remove all the tomatoes. Giving users power cannot be a bad thing. But do we want features that automatically change our food no matter where we go? And there may be some who are allergic to everything. Maybe they'll turn off JS and not load any images, and be okay with half the sites being broken. But they will always be a minority that others shouldn't be too concerned about. And they will always do whatever they want.

To know if a web font is ugly requires tasting it first, so at that point, you've already paid for what you may have tried to save. But considering how little weight web fonts are and how important they can be as part of what is being expressed by the composer of a page, ignoring it would be like randomly removing an ingredient deemed important by the chef.

Why risk those great sites for the ones that suck? Why would we allow the worst food to dictate all of our meals? Shouldn't we just call out bad websites for what they are, instead of crippling our hard-earned technology?

Imagine iOS 9 with Arial. Fonts are important.


> Fonts are relatively small compared to media ... a web font is practically one photo. It is not that expensive at all.

That may be true in the United States and Western Europe, where a typical font only needs to encode a hundred glyphs or so. But even Google webfonts begin to get rather hefty if you include Central European and Cyrillic characters.

Things get even worse with CJK fonts that have (tens of) thousands of glyphs. I have yet to find a Korean webfont that comes in at less than a megabyte, and most common webfonts in my country take up several megabytes apiece. It's really annoying if you're on a metered connection, not to mention s-l-o-w to load.

> We don't want to revert to text graphics.

Of course we don't want to... but things are so bad in CJK-land that I once wrote a library [1] that automatically generates text graphics given a string and a font file. It creates a separate image for each word, reuses them where possible, and automatically adds alt-text for search engines and screen readers. So at least it's an improvement over old-fashioned text graphics.

[1] https://github.com/kijin/imgtext


There are two issues:

  The use of bad fonts
  The loading of fonts via third parties
iOS did have a very bad font in the past but the font was embedded. With an OS you don't have to worry about a third party that might track you or serve you some infected font.

Ad blockers block those third-party fonts.

Personally, I block all fonts via a Firefox setting, because my dinner is served much faster and still tastes good (or better).
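For anyone wanting the same setup: the relevant knob is a real Firefox preference (double-check the exact behavior in your version's about:config), and it can also be set in a user.js file:

```
// user.js: 0 = never use document-specified fonts,
// always use the fonts chosen in Firefox's preferences.
user_pref("browser.display.use_document_fonts", 0);
```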


This title doesn't accurately describe the content of the article. The author's argument is that web designers should be allowing default fonts while including @font-face fonts, and that Web Fonts had it coming. It also states that ad blockers are just an implementation of the idea of disabling crappy/slow web fonts.

And my opinion: Please make your website legible. Serif fonts, or sans-serif fonts with high weights and large sizes and appropriate widths. Black rather than light gray. It would be really nice if I didn't have to cURL your website or hack up its CSS in the Inspector to actually read it. The new design trends are cute, but there's nothing like actually being able to read a page.

(and if you're using Canvas or something, I've already closed the tab.)


Please make your website legible.

I suspect a huge part of the legibility problems a lot of sites see is the gamma, DPI, and font rendering of the designer's system not matching their audience. A very thin font with gray-on-gray might look great on a calibrated 5K iMac with Safari, but be completely unreadable on Firefox for Windows on a standard 1080p screen with bad colors.


I am guilty of only color calibrating my screens (mainly for contrast/gamma reasons, less for straight up color accuracy), and I do this so I get an accurate look at what contrast actually looks like.

I have 3x Dell U2414H (absolutely fantastic monitors; if you can still find any for sale (recently discontinued), buy them all up). I test in the newest stable Chrome, the newest stable Firefox, and MSIE11 on Windows with unadjusted ClearType settings (only MSIE11 listens to ClearType settings, and it basically destroys ultra-thin fonts if you adjust ClearType, so just leave it alone, trust me). Then Safari on a retina MBP, because Safari has different (possibly better, possibly worse, clearly different) font rendering, and being on HiDPI widens the gap in how things are rendered on top of it just being Safari. If I don't like what I see in all four, I fiddle with it until the text is clearly readable.

I'm okay with thin fonts; I actually adore them because they're easier to read at larger font sizes than thicker text. But please check your work on a standard non-HiDPI Windows machine, especially if you're a Mac fanatic who loves Safari on a Retina Mac. Just do it.


> I have 3x Dell U2414H (absolutely fantastic monitors, if you can find any for sale still (discontinued recently), buy them all up)

The Dell IPS monitors are great, but I'd recommend the Dell U2415. It has a now-hard-to-find 16:10 aspect ratio, which gives more vertical pixels.


Problem with the U2415 is that it doesn't have the amazingly low total latency of the U2414H. Including the latency from the panel itself, it's about 4ms from pixels flowing into the monitor to them reaching your eyeballs. That's better than every single gamer-oriented monitor on the market (virtually all of which buffer one frame, which at 60Hz is 16.6ms, plus another 2-4ms from the panel itself).


This.

I interviewed a designer whose personal "showcase" site didn't work on Chrome/Windows.

I'm an rMBP user, but I make a point of testing on all major browsers/platforms.


And hardware? The Windows 10 VM I keep around has significantly different rendering properties than the Windows 10 box on my office desk; even more so when the VM runs on my rMBP, and, surprisingly, even when running on the same monitor.


Yeah, at home at least. Physical Win 8 box with two different monitors (Dell IPS, Samsung TN).

I'm of the opinion that web designers should work on $500 Dell laptops with those lovely 1366x768 TN panels.

Don't even get me started on colour-blind support (no, red is not a good choice as a 'highlight' colour when, for roughly 8% of men, it's a flat 'dark' colour).


I'm of the opinion that web designers should work on $500 Dell laptops with those lovely 1366x768 TN panels.

Especially when deciding what background color will distinguish ads from normal content.


You probably shouldn't be setting your body copy in a downloaded font. That feature is more suitable for display and heading fonts, which are usually much larger than fonts used for body copy. People used to put headings in as images, which is a pain for search engines, screen readers, and everything else that understands HTML.


I don't block fonts to make my web experience faster, I block them to reduce my browser's attack surface on untrusted sites. Generally speaking, font-handling code is not exactly notorious for being well-hardened against attacker-controlled data.


I agree with the author on his main point, but his statement, "[w]ebsites should not come with minimum software requirements," only works in a perfect world. Unless it can be shown in ROI, there is no reason to develop for anything but a modern browser (I'm looking at you old versions of IE).


You might have the ROI question upside down. The overhead of developing for IE - even older ones - is small if you ignore bells and whistles. There are a few CSS gotchas to worry about and that's about it. A webpage shouldn't require megs of JS and assets and what have you to work.

The real question is whether the time you spend on these bells and whistles contributes a positive ROI.


It might, for a commercial site. Flashy can sell, depending on the business.


> "...if you’re a user, chances are, you’re quite relieved (or even ecstatic) at the ability to block web fonts and experience a faster web."

These days, with increased processing power and broader standards adoption generally available, we really have been able to push up the weight of websites.

Conversely, battery-powered devices and sketchy wideband internet access are growing rapidly.

Sure, we have the power, but it's becoming apparent that at this scale, efficiency is very important. More users than ever want more content, faster than ever.

Globalized (literally) asset sharing, with technologies like digest references in script and link tags could make a big difference. Why not even pre-package really popular stuff (jQuery, Helvetica, etc) with the browser or OS?


You may have forgotten that Helvetica licenses cost money. http://www.linotype.com/1308886/Helvetica-family.html?numLic...

Feel free to buy me one, I'm a starving designer :)

Also, it's not in the spirit of the web to package content natively with the browser, because then only certain clients are able to understand the content correctly. The thought is to make everyone able to see exactly the same thing, with nothing stored on a local machine.


Buy? For the web it's worse - they nickel-and-dime you based on the number of visitors to your site.

It's a subscription.


The popular stuff should be just as performant as if it were packaged with the browser - most sites pull jQuery from one of the big CDNs and cache it for as long as possible. You aren't downloading jQuery for every site that needs it; you download it once for the thousands of sites that use the same CDN. Packaging it into the browser would just introduce a whole bunch of complexity around versioning, updates, and standardization.
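The usual belt-and-braces version of this pattern (the version number and local path below are illustrative, not prescriptive) also guards against the CDN being unreachable:

```html
<!-- Try the shared CDN copy first; if it didn't define window.jQuery,
     fall back to a self-hosted copy (local path is an assumption). -->
<script src="https://code.jquery.com/jquery-1.11.3.min.js"></script>
<script>
  window.jQuery || document.write(
    '<script src="/js/vendor/jquery-1.11.3.min.js"><\/script>'
  );
</script>
```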


You aren't downloading jQuery for every site that needs it - so long as they all use the same CDN and the same version of jQuery. I wonder what the hit rate really is...


I think people seriously overestimate how useful CDNs are for static assets. Even if the page you visit happens to call for the same version of a resource you might have cached, there is no guarantee it is requested from the same CDN. And all these hostname lookups slow things down too. I frequently see a site request assets from multiple different CDNs when those assets are all available from a single CDN. The designer clearly copied a snippet of HTML without regard for the extra lookups when they could have found all the necessary assets on a single CDN.

I tried finding reliable data for cache hit/miss rates for popular CDNs but was unsuccessful. Is there a site out there tracking this that designers could refer to? It would be good to know if simply concatenating all your CSS (for example, or JS) into one file and serving it from Cloudfront would beat scattering it across several providers and hoping the majority of visitors had cached copies. I suspect the single file would perform better but I have no reliable data to support this.


Personally, I avoid it. The client can afford CDN delivery and then we can assure availability. Donated project hosting doesn't have that guarantee.


As someone fascinated with web design, typography, UI and the like, it may be worthwhile to point out that design can markedly enhance the rate at which salient information is digested, the real bitrate, if you like.

The website of an individual, a company, or even a framework like Django relies on more than the digestion of facts, the factual bitrate if you like. Perhaps not entirely scientific, but things like 'aura', 'emotion', 'atmosphere' and 'feel' can stimulate in a way that a generic layout cannot. Design creates memorable impressions and literally colours our intuition and usage, allowing us to emote with others.

Although I acknowledge that the brain is not well understood, it seems plausible that some minds rely more on design (used in a very loose way here) than on a more scientific analytical mindset to interpret content and what is going on.

Note I did not use the word rational to refer to the latter mindset.

That's because humans are not particularly rational. Trying to systematise the internet and remove all 'colour' would take away the beauty and the poetry of what the internet allows us to do.


As a user, I hope this pushes designers to consider the aesthetics of a page rendered in a font the browser already supports. Unlike JavaScript, a web page rarely needs a custom font to function, so designers must assume users will block custom fonts if they prefer.


Yeah, but in reality, designers may just turn back to static images to maintain a consistent appearance. That would force a step backwards.


Hopefully we can find a more pleasing medium. For example, if designers used web fonts with metrics that match common built-ins the FOUT could be made much less jarring. (Yes, ligatures and such make that difficult in the general case, but it surely could be good enough for headlines.)
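Assuming a browser that supports the newer `size-adjust` and `font-display` descriptors (an assumption about the reader's targets, not something every browser offers), one sketch of the metric-matching idea looks like this; the font name, file path, and the 105% figure are all placeholders to tune by eye:

```css
/* Sketch: a local fallback face tuned to roughly match the web font's
   metrics, so text doesn't reflow as badly when the real font arrives. */
@font-face {
  font-family: "BodyFont";
  src: url("/fonts/bodyfont.woff2") format("woff2");
  font-display: swap; /* show fallback text immediately, swap in later */
}

@font-face {
  font-family: "BodyFont Fallback";
  src: local("Georgia");
  size-adjust: 105%; /* tune until line lengths match the web font */
}

body {
  font-family: "BodyFont", "BodyFont Fallback", serif;
}
```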


Seems like use of fonts isn't the issue, it's the fact they have to be served. Why can't browsers embed more fonts?


Fonts cost money to license. http://www.linotype.com/1308886/Helvetica-family.html?numLic...

Also, it's not in the spirit of the web to package content natively with the browser, because then only certain clients are able to understand the content correctly. The thought is to make everyone able to see exactly the same thing, with nothing stored on a local machine.


It may not be in the spirit of the web, but the current state of things is a failure. Less of a failure than 10 years ago, but still a failure.

Packaging the most popular / representative, say, 256 free & open fonts together and distributing them with Firefox and Chrome would be a useful thing for designers of everything on the web other than fonts.


A gzipped TrueType font is on the order of 100 KB, though plenty are bigger. 256 of these is ~25 MB; Chrome and Firefox are only ~45 MB to begin with.


It's supposed to be a decentralized network. Blocking all non-google or non-typekit webfonts is an example of breaking that vision for the web, which is a shame.

I think the better solution is more browser security work and transparent & common limits on computation time and download size that browsers will do the accounting for and enforce.


*Some fonts.


Yeah, honestly, Chrome, Firefox, MSIE, and Safari would all do well if they all included Open Sans and Droid Sans (Google's; Droid Sans is essentially a semi-condensed variant of Open Sans with less glyph coverage), Fira (Mozilla's), Source Sans Pro (Adobe's), and Helvetica Neue (Apple's).

Windows (duh), OS X, and (optionally, see your distro) Linux include several Microsoft fonts (such as Arial, Verdana, Georgia, and Times New Roman, among others; Verdana being the closest to Open Sans, Fira, and Source Sans Pro in modern usage), so Microsoft/MSIE sidestepped the need to always download fonts before web fonts even existed.

Using Google Fonts for distribution of web fonts is great and all, but it doesn't always prevent the need for font download, and Google Fonts always includes the local font name just in case you have it.

The weird part is, Chrome doesn't seem to include a copy of Open Sans (even though it's Google's official font, used almost everywhere), and neither does Android (though it does include Droid Sans and/or Roboto, with, I believe, both Droid Sans and Open Sans aliased to Roboto on Android 5.x). Firefox doesn't include Fira either, even though it's the official font for Firefox OS and they're trying to make Firefox itself an app host the way Chrome is for a subset of Chrome apps. (I'm not sure how Chrome apps vs. Chromebook apps work here; are they the same thing? Is it a subset that works on both as long as you don't use Chromebook-only APIs?)

Also, why don't Microsoft and Mozilla spawn their own CDNs for font distribution? Google refuses to host Wingdings-type fonts, even though this is the canonical way of doing vector images meant to be paired with text (due to the ease of use and better handling of outlines; WOFF2 eats fonts like Font Awesome up, while SVG + gzip would never perform like that: 63 KB for several billion icons). Their API for producing subsets of fonts would also make icon fonts a shitload smaller, and give a much higher chance of pre-cached icon fonts (just like popular real fonts enjoy).


They can have a thousand (no, really, they have a thousand). But your little snowflake of a font will always be left out.


As old as the web can feel at times, we are so blessed to have Georgia as one of the standard fonts. It is such a good serif font, which is also why I think a lot of the typography out there is more ornamental than necessary.


I have blacklisted Google web fonts with a hosts-file entry, because many websites use terrible fonts from there.

For instance, Open Sans seems to be used more and more. Here's what it's like without antialiasing (top) vs. Arial (bottom): http://files.benjiweber.co.uk/b/fonts.png

Antialiasing hides some of the terribleness but it is still much harder to read than alternatives.


There are some interesting issues at play here. There is obviously some desire to have levels of standardized-form content that can be consumed in the way that the user desires.

However, I disagree with a closed internet. Websites, like brick-and-mortar businesses, should be able to do what they want. They are responsible for the user's experience and should be able to craft it as they see fit. A browser (or its extensions) should not be forcing an agenda for how consumers may or may not want to experience the internet. (Ads in particular are a unique part of the discussion.)

I appreciate websites like Medium, Reddit, Facebook, Pinterest, and others that standardized content formats for users to experience in a consistent way.

The "death of RSS" is related here too. It's a complex issue. With RSS the user has significantly more control, but the problem is that you lose context and it can often become hard to browse large amounts of content.


I also disagree with a closed internet. Users, like TV viewers skipping channels, recording to VHS, or leaving the TV on but muted for a house party, should be able to do what they want. A website should not force an agenda for how consumers may or may not configure their browsers to request and present assets freely served by sites.

I appreciate that websites want to look good. But browser rendering has always varied between devices and user preferences, and fonts in particular have always been a prominent configuration item. User displays have had varying DPIs and widths, users can set their preferred minimum font size and preferred window size, their preferred serif and sans-serif fonts, and zoom text size with or without other page elements. Some users have to use screen-reading software! This has always been in the interest of the user's readability with their particular setup, and the openness of the web to be used with whatever setup the user has.

My only point is, your argument has nothing to do with "a closed internet". Or "it can often become hard to browse" (if you let users configure their browsers as they prefer). Right...


Sorry, but the idea that the client (browser) doesn't get to choose the experience is fundamentally incompatible with the architecture of the web. I don't see how you can call that a "closed" internet; quite the opposite.

From the very beginning, almost every standard related to the web has made it very clear that ultimately clients get to choose what to display, what to retrieve, and how to display it.

Yes, that freedom comes with a price, but it's a price worth paying. For users that are visually impaired, it's particularly important to be able to override the suggestions of the content being displayed.

I don't see how there's an "agenda" involved in what any web browsers are doing today. The end user is the one choosing to use the client, choosing to use the content blocker, etc.

If Apple or Google made the choice to start blocking content using the browser itself (on their own), you might have a point, but that hasn't happened (yet).


Absolutely, both the user and the website/webmaster should be able to freely make their own decisions.

My issue is with the idea of grouping web font blocking with ad blocking. A user should choose separately if they'd like to block:

• Ads

• Tracking scripts

• Creative assets that may slow web performance (like web fonts)

• Other non-essential, non-creative scripts that may slow web performance


Agreed; I'd also argue that web fonts shouldn't be blocked by default unless retrieval of them presents a privacy or tracking concern (unique source url, etc.).

In short, I don't disagree with what you've said at all. However, content providers must keep in mind that ultimately, they don't control how their content is presented or what part of it is displayed.


> Yes, that freedom comes with a price,

What price, and who pays it?


The provider of the content must knowingly give up control over their content in exchange for the client's attention. They can make suggestions as to how the content is presented, and in some cases, requirements, but overall, the client is in control.

Put another way, content may be king, but the client is the emperor.


Browsers should let users choose which kind of experience they desire.

Unfortunately, both websites AND web browsers are set on taking choices away from users and forcing an agenda on them.


Web browsers are becoming better at accommodating user choice, though. More and more browsers have extensions or content filtering tools, which are an extension of the user's power into their view of the web. And if there's a site that really forces its agenda on you, don't use it. If enough people agree that it's not useful enough to be worth the trouble, it'll stop existing eventually.


Brick and mortar is a closed model. Quite literally.

Flash, ActiveX, and Java applets were the "brick and mortar" of the web, and thank god they're long dead as a means of web design.

Luckily, there is a "brick and mortar" of the modern times where publisher or retailer has full control over experience. It's native apps and users seem to love 'em.


Site was down when I tried to load it today. Here's a mirror via Google cache: https://webcache.googleusercontent.com/search?q=cache:miranj...


What blockers/browsers should do is just tell scripts that they finished loading. Then sites would continue to work just fine. This kind of technique is gonna be needed to continue ad blocking once a lot of pages start trying to detect blockers and not display content.
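One sketch of the "pretend it loaded" idea: before page scripts run, a blocker could install no-op stand-ins for the globals a blocked ad script would normally create, so naive detection code finds them and renders the page anyway. The global names below (`adsbygoogle`, `ga`) are illustrative assumptions about what a page might check for.

```javascript
// Minimal sketch: no-op stubs standing in for blocked scripts' globals.
const stubs = {
  adsbygoogle: { push: function () {}, loaded: true },
  ga: function () {}, // pretend an analytics loader ran
};

// Only fill in globals the page hasn't already defined.
for (const name of Object.keys(stubs)) {
  if (!(name in globalThis)) {
    globalThis[name] = stubs[name];
  }
}
```

A real blocker would need to keep these stubs in sync with whatever properties detection scripts actually probe, which is exactly the arms race the replies below describe.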


That would start a needless arms race, there are a million ways to detect if you're using an ad blocker.


Ultimately winnable by the end user. It's DRM all over again. Worst case: An image recog system identifies and blanks out areas that are showing ads. (Or hell, a robot arm puts pieces of paper over the screen.)

May as well start the arms race when we know who will finish it.


re: a robot arm:

websites will simply require you to connect a webcam showing you browsing the site before allowing viewing.

We'd better devise a way to painlessly charge users ¢0.1 per view.


True. I suppose they could also serve up an ad-captcha before serving the content.


¢0.1 per clickbait.


And this blog entry is a prime example: it chose a font that was small and faint against a bright background, making me click my DFT Firefox extension to override it.

Worse, there was no particular web-design reason requiring them to choose a small serif font.


Small and faint? Looks fine to me? http://imgur.com/g6Ue5Fj


Your image is 404.

The text on the site is not black, but it's 90% OK on my tablet (8" and about 2500x1600 px). Black text would probably be better. There is also a lot of useless () margin on the left, but I can pinch to zoom and make the text fill the whole screen. No issues with font color at that size.

() Useless because I want to read the text comfortably on the tablet. The screen of my computer is larger and I probably won't mind that margin there.


this is one part of why i love my rss reader.



