Hacker News | celebril's comments

That's... really anachronistic.

The Death of the Author came about in 1967; even New Criticism, which foreshadowed it, emerged in the 1930s-40s at the earliest, after I. A. Richards published his Practical Criticism.

Shelley wrote in the Romantic period, when the artist's personal "genius" was paramount as a conduit to the "sublime". Saying that he somehow whipped up a criticism of "author as the arbiter of meaning" in what is practically a product of a friendly poetry contest is plainly absurd.

Strangely, you also attributed this anachronistic reading to Shelley, making it somehow the author's intent to be counter-author-intent. Just... why?


Ah I see, thanks for setting me straight. I must have conflated someone's later reading of the poem with the author's original intent.


This is not a new idea. Playing with the boundaries of presentation in literature is just a subset of post-modernist lit.

If you want a prime example, OP, you should look into 4chan /lit/'s massively collaboratively written post-ironic tome "Hypersphere"[1]. It's the epitome of this sort of hyperliterature.

[1] https://docs.google.com/document/d/1L10Cbgj7CUEAe1-LwwNfYY1B...


And it's still faster than PCs. :)


Apple makes some nice hardware but they are not alone in making good stuff. They have some stiff competition.

For example my Sony Vaio Pro is significantly faster than any of the numbers mentioned in the article. It gets nearly 1GB/sec in read speeds: http://i.imgur.com/24GdPiz.png

The drive is partitioned so ignore the odd size.
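(Aside: if anyone wants a quick sanity check of their own drive, here's a rough sketch in Python. It times a freshly written file, so the OS page cache will inflate the number well past what the disk can actually do; real benchmarks bypass the cache.)

```python
import os
import tempfile
import time

SIZE_MIB = 64            # size of the test file
CHUNK = 1024 * 1024      # read/write in 1 MiB chunks

# Create a throwaway file filled with random bytes.
fd, path = tempfile.mkstemp()
with os.fdopen(fd, "wb") as f:
    for _ in range(SIZE_MIB):
        f.write(os.urandom(CHUNK))

# Time a full sequential read. Much of this is served from the page
# cache since we just wrote the file, so treat it as an upper bound.
start = time.perf_counter()
with open(path, "rb") as f:
    while f.read(CHUNK):
        pass
elapsed = time.perf_counter() - start
os.remove(path)

print(f"~{SIZE_MIB / elapsed:.0f} MiB/s sequential read")
```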


They won't get much stiff competition from Sony:

http://www.cnet.com/news/sony-agrees-to-sell-pc-business-rev...


Going forward, no, but they did in the past and others will step in to fill those shoes. I hear ASUS makes some good stuff currently.


The thing is, you don't need 1GB/sec for an 11-13" laptop.


This comment is so bad it was featured on @shit_hn_says: https://twitter.com/shit_hn_says/status/462524610844442626


Why would someone in this day and age say something like "X is enough" when talking about tech? Is that a wish to be featured alongside Bill Gates?

(yes, I know Gates' quote is apocryphal)


You can do much worse than being featured alongside Bill Gates!


Why? I boot VMs, and 1 GB/sec makes that happen in 5 seconds.


It has nothing to do with them being Macs vs. PCs, it just depends on who they buy the SSDs from.


Your statement is not necessarily true. I bought a Vaio T11 almost two years ago (it has the footprint of the 11" MacBook Air) and was able to upgrade the SSD. It also has an HDD, in addition to the SSD, that I can upgrade very easily, and a free SO-DIMM slot let me double the RAM from 4GB to 8GB. That's what I like about PCs. With the Mac you get only one drive that you cannot change or upgrade, and the RAM is soldered.


my brain just exploded


No mention of Steve Jobs?

Disappointed.

Everybody knows that Apple's OS X is the most advanced and secure OS.

If Steve Jobs isn't one of the people responsible for protecting the Internet, I don't know who is.


>bigger != better

It's hard to ignore the hint of irony here when it's the Westerners' turn to tout this line.


Actually not a westerner :-)


A second-hand Thinkpad.


Well, which one? I like them and they're nice, but they don't really compare to the Air or the Retina Pro in handling. But maybe you have something in mind.


>MIUI, which is like an iOS themed android

I hear this line getting trotted around a lot, mostly from people who have never used MIUI themselves.

MIUI is an "iOS-themed Android" as much as Ubuntu's Unity is an "OS X-themed Linux DE".


>I hear this line getting trotted around a lot, mostly from people who have never used MIUI themselves.

Well, I use it every day.


I never thought I'd live to see the day the word "mansplaining" is used on Hacker News unironically.


There's a certain beauty in an ideographic system: you may not know how to pronounce the characters, but you can still figure out a meaning from them.


It's bad because it's not as hip and cool as LLVM's offering.


No, gdb is bad for C++ debugging because it doesn't include full parsing support for C++.

At least, I'm not aware of being able to type C/C++ into gdb on the fly the way you can in lldb's expression contexts.

So what exactly is "hip and cool" about having your debugger use the same parsing engine as the compiler infrastructure? If you have a technical point, make it; otherwise stop being the discussion equivalent of acid.
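For what it's worth, this is roughly what on-the-fly evaluation looks like (a paraphrased lldb session; `myprog` is a hypothetical binary, and exact prompts and output vary by version):

```
$ lldb ./myprog
(lldb) breakpoint set --name main
(lldb) run
(lldb) expr int x = 21; x * 2
(int) $0 = 42
```

Because lldb embeds clang, the `expr` command can compile arbitrary C/C++ in the context of the stopped process; gdb's hand-rolled expression parser covers far less of the language.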

