
I think the mistake may be in discounting the long tail: even if the top n% are still in print, the bottom (100-n)% aren't. The volume from a million out-of-print old books that each sell on average fifty copies a year can easily dwarf that of a few thousand in-print old books that each sell a thousand copies a year: fifty million copies annually against a few million.

As for the e-book explanation, even if that turns out to be the case, isn't that an even stronger argument for shorter copyright terms? Now that public domain books can be had as free e-books, the public gets much more benefit from a work entering the public domain: instead of the price dropping from $17 to $15, it drops from $17 to $0, which puts the book into far more people's hands.

If the million books are out of print, Amazon certainly can't sell them as physical copies. (OK, there's print-on-demand, but I can't see that skewing these numbers in a big way.) And even Project Gutenberg has "only" 36,000 e-books, which I assume corresponds pretty closely to all the public domain works that have been converted to text. That's a big number, but it's still a fairly small slice of all published books, even allowing for the fact that the number of books published has gone up over time.

> As for the e-book explanation, even if that turns out to be the case, isn't that an even stronger argument for shorter copyright terms?

Sure, at least in principle. Insofar as copyright law is supposedly about balancing the public good with encouraging the creation of new works, making the public good relatively more valuable (through wider distribution) once a work is out of copyright would presumably change the balance point. As I said, I'm certainly not going to argue for the current copyright regime.



