Fox News was notorious for extremely negative coverage of African Americans from the 1990s through 2014, and it still is. National Review was extremely famous for it, too. The New York Times was also a conservative outlet, as I mention elsewhere, having hawked for every war since Vietnam.
Even the person who inspired Damore to write his manifesto (SlateStarCodex) admitted it went too far.
Biased reporting isn't new; it's always been universal, and it'll never go away. You just sound like you want more conservative outlets. Here's a tip: turn on your local news. Local news in America is almost always owned by one of three conservative companies (Sinclair and Nexstar come to mind most immediately), and all of them have an extremely conservative bent.
That's not even getting into media bias on scientific issues. HIV/AIDS never got proper coverage, and the coverage it did get during the 1990s was usually outright wrong and hostile. Climate change? Completely debatable! A matter of emotion! They were still pushing that line until just a few years ago. Encryption? Can be government-crackable and still secure! That's been a constant topic since the 1990s, too!
According to a report by the progressive research center Media Matters, New York City television stations give disproportionate coverage to crimes involving black suspects.
The Media Matters study found that between August 18 and December 13, 2014, the stations (WCBS, WNBC, WABC, and WNYW) used their late-night broadcasts to report on murder, theft, and assault cases in which African Americans were suspects at rates that far exceeded African-American arrest rates for those crimes.
There's always been bias, and there always will be. The issue above was even worse in the past.
Yes, Fox is part of the problem and there will always be bias, but again, there's a difference between being biased but aspiring to the truth and committing oneself to one's biases (partisanship). The media landscape of yore was mostly the former, even though Fox and other outlets were exceptions to the rule. Today's media landscape is almost entirely partisan, with few outlets that seem to aspire toward neutrality and objectivity.
Journalism rarely even feigned objectivity: from the 1920s through the 1970s, some of the strongest voices in favor of labor were journalists. "Neutrality" is largely a construct, or meme, pushed by Rupert Murdoch since he founded his "news" network. Naturally, his own properties are nowhere near objective, despite some people claiming the WSJ magically escapes bias.
Everything is naturally biased, the only distinction is whether an entity is open about their bias. This goes for anything: whether it was the New York Times and Fox hawking for nearly every war during the 2000s, or Newsweek favoring MLK Jr. while Life described his speeches with phrases like "demagogic slander" during the 1960s, everyone naturally has an opinion. That doesn't stop applying when writing about a subject.
Granted. The extent to which journalism is valuable is the extent to which it is neutral and objective. When it abandons even the pursuit of truth, it becomes a social ill.
> Everything is naturally biased, the only distinction is whether an entity is open about their bias.
I don't think this is true on any level of analysis (though it is one of the most popular and obvious of mistruths). At the individual level, one can choose to counterbalance his biases or commit himself to them. He can choose to lean on rhetoric or reason. He can debate against his most competent opponents or he can choose stooges. He can choose between straw men and steel men. He can choose to be honest (if fallible) or dishonest.
At an organizational level, we can choose between orthodoxy and heterodoxy. We can have an ideological monoculture or a diverse culture. We can build a culture that calls out rhetoric and favors reason. The net result of a heterodox organization isn't the absolute lack of bias, but the bias is severely attenuated compared to the modern newsroom.
That link is transparently pushing something, and what it's pushing definitely isn't "the truth."
The only thing, and I repeat: the only thing, that absolutely ridiculous, fearmongering, slanderous article actually accuses them of doing outright, rather than just blatantly speculating about, is downloading PDFs.
> Then, over a weekend (when spikes in usage are less likely to come to the attention of publishers or library technical departments) they accessed 350 publisher websites and made 45,092 PDF requests.
What's the harm in this? There's none! They're literally just requesting PDFs. The article insinuates murder but doesn't even try to substantiate its claims of "Oh maybe they're doing something, just maybe, maybe maybe maybe they're doing something evil, yes indeed, maybe they are!"
No, they say that hackers "not only broke into their database; they changed the names and passwords of profiles" but they admittedly do not attribute that to the group.
>What's the harm in this? There's none! They're literally just requesting PDFs
Via stolen, cracked, or phished credentials, though. I'm not arguing against this, I wholeheartedly believe in the Guerrilla open access manifesto and its beliefs, and it is admittedly not proven to be Sci-hub, just a random attack.
> No, they say that hackers "not only broke into their database; they changed the names and passwords of profiles" but they admittedly do not attribute that to the group.
You can't negate "They don't accuse Sci-Hub of actually doing anything!" with "They accused hackers of Doing Evil, but admittedly they don't attribute this to Sci-Hub."
> Via stolen, cracked, or phished credentials, though. I'm not arguing against this, I wholeheartedly believe in the Guerrilla open access manifesto and its beliefs, and it is admittedly not proven to be Sci-hub, just a random attack.
So if there's no proof, and you'd agree with it even if there were, then why bother posting this awful article?
I suppose to see what others thought about it. I specifically mentioned in the parent comment that I was on the fence and that "This might just be a hit piece by the same companies who are losing money". I did mention the proof in the article, which is real. I'll admit my initial judgement of the article was off, but not entirely wrong given that I never said I wholly agreed with it. Or maybe I'm moving goalposts or whatever. Anyway, I thank you for pointing out what I did not realize.
>You can't negate "They don't accuse Sci-Hub of actually doing anything!" with "They accused hackers of Doing Evil, but admittedly they don't attribute this to Sci-Hub."
I am not negating it, I am admitting that I am wrong.
My guess would be that Sci-hub probably isn't doing this, because my guess is that they don't need to. Given how widespread support and usage of Sci-hub is within academia, I suspect they have access to voluntarily donated credentials on the order of hundreds if not thousands (remember that it's not only faculty and staff that have access to journal articles: students do too).
Now that I agree with; the article specifically avoids attributing it to them, and if they could, you can bet they would. So I'm assuming they're taking a mostly unrelated incident and pushing an agenda with it.
Drew is an Alpine maintainer. Alpine, which you might know as probably the most-used server-side Linux distribution in the world right now, bootstraps everything in its repositories, just like every major Linux distribution on the planet.
Morten Kromberg (a bigwig at Dyalog and, coincidentally, the author of the post we're discussing), "justifying" why Dyalog isn't libre just a few days ago:

> There is little risk of the demise of Dyalog APL any time soon. Our customers run businesses that are based on Dyalog APL with a combined annual turnover in excess of a billion euros/dollars.
There are a few other companies with proprietary APL implementations that are also doing really well.
k, which is pretty similar, is even more of a money-maker than Dyalog APL, having been responsible for a company worth a billion or more (Kx) and an entirely different company that also seems to be doing pretty well.
J is also used fairly commonly, though less so, and its users seem to be doing more than well, too (my current employer and most of my previous employers have been J shops, and none of them have gone under yet; I think a couple of them had record-breaking years in 2020). J is also free, libre software, so instead of paying for a Dyalog or k license, you should really just use it (or ngn/k, if k is more your thing).
I'd also recommend checking out J, which isn't a recent development, but has the best syntax out of all array languages, has the best development environments, is the easiest to learn (it has a way to learn it built into the language itself!), and is the only one that treats making GUIs as a first-class feature (and, also, critically, is not proprietary, unlike Dyalog):
https://jsoftware.com (Has so many previous discussions I just recommend using HN search to find them.)
The chat is biased in favor of Dyalog APL, but a lot of the modern additions Dyalog has made to the language make it (in my opinion) worse as a notation, so ideally don't let it turn you off of the concept of array languages entirely if Dyalog doesn't "click" with you.
If you haven't already, you should also check out Notation as a Tool of Thought, the paper Iverson delivered as his Turing Award lecture:
Dyalog clicks a lot more than J; "the best syntax"? "Easiest to learn"? Can you expand more on these positions?
And this one: "a lot of the modern additions Dyalog has made to the language make it (in my opinion) worse as a notation". I don't know what you mean by "modern", but as a casual user, {} functions, trains, nest (⊆), and rank adjustment (⍤, like J's) seem to make things more convenient?
Yes, absolutely. By a long shot. For starters, J can actually be parsed. (k can also be parsed, for what it's worth.)
"Easiest to learn"?
Spend ten minutes using J's built-in Labs feature. Or read J for C Programmers (also ships with the language), if you come from a non-array background. Iverson was able to teach this stuff to public school children in no time at all; modern array languages seem to deliberately make themselves obtuse to outside observers. APL was doomed to obscurity because the people making it decided to please existing customers rather than try and make it approachable.
> I don't know what you mean by "modern"
Pretty much every APL2 feature and everything that came after it that they didn't borrow from J.
While J has English control statements, they generally aren't used, but nearly every time I come across something written in Dyalog APL it's full of :If :EndIf and all sorts of atrocious English words which mock the ideal of a better notation than ALGOL.
Don't know where you got that one. J source can't even be tokenized except during execution!
    NB. Compute n from reading a file of something
    n : 0
    Is this the source of a function, or a string?
    )
Besides that, J has the same context-dependent issues as APL, where the values of variables determine how the syntax fits together. It's missing a few dark corners like niladic functions, but these are small differences; it's still fundamentally unparseable.
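To make the point concrete, here's a toy sketch (plain Python, not real APL/J machinery; all names are invented for illustration) of why such grammars resist static parsing: one and the same token sequence parses differently depending on the run-time class of a name.

```python
# Toy illustration of value-dependent parsing in APL-family languages.
# Whether "2 f 3" is a dyadic function application or a three-element
# strand depends on what f is bound to, which is only known at run time.

def parse(tokens, name_class):
    """Parse the three-token phrase `x NAME y` either as a dyadic
    function application or as a three-element array strand."""
    x, name, y = tokens
    if name_class.get(name) == "function":
        return ("apply", name, x, y)   # e.g. APL: f <- +  means 2 f 3 is 2+3
    return ("strand", [x, name, y])    # e.g. APL: f <- 10 means 2 f 3 is 2 10 3

# Identical source text, two different parse trees:
print(parse(("2", "f", "3"), {"f": "function"}))  # ('apply', 'f', '2', '3')
print(parse(("2", "f", "3"), {"f": "value"}))     # ('strand', ['2', 'f', '3'])
```

In APL terms: `2 f 3` is an addition when `f←+`, but the vector `2 10 3` when `f←10`, so no parser can pick a tree without knowing what `f` holds.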
Regarding that, I am surprised that a 'more limited APL' lacking this ambiguity and opening the way to easy compilation never appeared (to my knowledge). I essentially use J for data wrangling and pretty much never use the metaprogramming features of the language (". and the like). An 'APL NumPy' would be quite nice, IMO.
K can be parsed. Even Q (the English version of K) can be parsed if you consider the prelude a part of the language (rather than part of the runtime environment).
An area that's still pretty different between J and APL (IMO) is that ⋄ makes APL much easier for procedural programming, while J is easier for functional code. As a result, I (a journeyman in J and a beginner in APL) tend to write procedural code much more in APL. I also get this impression from APLcart, where ⋄ and ← are not at all uncommon. Of course, you can do that in J too, using ([ x =.). A second area of difference is that J is arguably more math-oriented, which is ironic considering the origins of APL, but just look at the primitives! J's higher-math primitives are really useful for preparing datasets.
Also, I'm mainly doing preparatory data wrangling with J (i.e. quick-and-dirty work), and not having to activate the APL keyboard every time is a nice feature. The ability to write a simple #! script, dispensing with the "APL machine" of workspaces, also counts (a feature that only just appeared in APL). As a data wrangler, though, the true killer feature of J is Jd, the integrated columnar store. Of course you can mmap through it all in APL, but having a completely integrated solution does the trick for me! Actually, I find the concept of "fetch your data SQLish and model it with J" such a good idea that I'm playing with the idea of extending the concept to the Racket data science world (which is mostly non-existent) :-)
So if you're doing serious programming as done in Fortune 500 companies I'm pretty sure APL wins due to consistency and tooling integration, but for one-off work J is a killer!
He has a lot of really good blog posts and software toys on his site and on itch. I recommend looking through them. He also has a web comic that's pretty good, but he hasn't updated it in a while.
The SE chat is really worth lurking if you're at all curious about APL; nearly everyone who has written a publicly-available array language implementation who isn't dead or over sixty is active on it.
He wrote solutions for Shootout (predecessor to that Debian page) problems a few years ago; a lack of k in benchmarks is not due to a lack of k being volunteered: http://www.kparc.com/z/comp.k
k is really fast. Half of the things on the Shakti mailing list are just Arthur getting really excited about how significantly he's beating x or y or z in performance and giving numbers for it. `grep`ping it now I see 40 in half a year that explicitly contain the word "benchmark," though not all of these are comparing to other things (some are just comparing to different k releases), and there are more comparisons without that word.
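That kind of tally is easy to reproduce over a local dump of the list; here's a minimal sketch (the mbox contents below are invented for illustration, a real dump would be read from a file):

```python
# Count messages (not lines) in an mbox-style archive that mention
# "benchmark". The sample archive text is made up for this sketch.
mbox = """\
From arthur Mon Jan  4 2021
k9 benchmark vs sqlite: 100x
From arthur Tue Jan  5 2021
new release, no numbers this time
From arthur Wed Jan  6 2021
more benchmark results against csv parsers
"""

def count_matching_messages(text, word):
    """Split on mbox 'From ' separators; count messages containing word."""
    messages = text.split("\nFrom ")
    return sum(word in m for m in messages)

print(count_matching_messages(mbox, "benchmark"))  # prints 2
```

A real run would load the archive with `open(...).read()` and, as noted above, still undercount, since plenty of comparisons don't use the word "benchmark" at all.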
Arthur doesn't work at Kx anymore, by the way. He's at Shakti now. Shakti has a different (but still draconian/non-(A)GPL) license. It probably doesn't have the benchmark clause, but I don't care enough to check (I prefer J to k and don't have a proprietary k on my system).
> He wrote solutions for Shootout (predecessor to that Debian page) problems a few years ago; a lack of k in benchmarks is not due to a lack of k being volunteered
That's 6 of the programs; there were at least 4 others ;-)
I lacked and still lack the knowledge to figure out if those snippets are doing what they should.
For example, do those scripts set arg n from the command line and read from stdin? Do those scripts write correctly formatted output to stdout?
What a pity that page does not show measurements for those scripts, and a comparison with some of the C programs written for the benchmarks game.
Thanks for that link. Do you know if the results are posted anywhere?
https://shakti.com/license.php says "Customer shall not... distribute ... any report regarding the performance of the Software benchmarks..." which I would need to agree to if I want to download the binaries at https://shakti.sh/benchmark/
You could check and see if it was ever posted on the Shootout site with archive.org, but when I checked coverage was spotty. The point of these benchmarks is everyone being able to run them themselves, though.
If you look at my post history, it's actually primarily free software APL implementations! ngn/k is amazing. John Earnest's work with his own variants is really fantastic. I dislike klong and Kona.
Not a developer, just a lone data wrangler, but FWIW Jd has made me much faster at exploring data. I now use Jd to create my datasets and use R just for the models. IMO, it's far better than using R to tidy things.
If you have concerns over someone's behavior, please just message the mods at hn@ycombinator.com. Pushing discussions into meta-arguments about reposting poisons threads.
https://www.vox.com/2015/3/26/8296091/media-bias-race-crime