ABP is not a one-man show; it's now developed by a company (Eyeo GmbH) that makes a lot of money getting companies like Google to buy into its "acceptable ads" scheme.
ABP also considers ads that track people (such as Google's) to be acceptable and whitelists them by default.
If you make a modern-looking long-scrolling article that has an ad somewhere in the middle, it's not "acceptable". If you get a crappy CMS that splits every article into 9 pages with an ad at top and bottom, then it is.
The main weird thing is that 3rd-party tracking is "acceptable" (!)
I run multiple exits and consider much of what is written about the risks of running exit nodes to be overblown scaremongering, so it is nice to see an article about the actual reality of running exits: it's boring.
There are legitimate fears for people living in certain countries but many of the people handwringing about it aren't in those countries.
This policy was created in 2009 in response to the controversy when NoScript started messing around with ABP and broke functionality so that ads would display on the NoScript developer's website.
Wow, thank you for posting that. Very informative. The post and the NoScript reply [1] both raise some interesting points about how to compensate plugin authors for their hard and continued work, points that seem to still be unaddressed six years later. I have never really stopped to consider compensating plugin authors for their work, and I rely on them daily.
The problem is that even IF it did do this, it wouldn't help publishers. In fact it would probably hurt publishers because ad performance would take a hit. This would lower ad pay rates and possibly even sell-through.
Google Analytics had a self-hosted version, or more specifically Google Analytics was a self-hosted solution when it was Urchin before Google acquired it.
You could still get the self-hosted version for a price until Google killed it in 2012.
The reason Google doesn't want you to do this is simple: they want the data.
Why should one group of people acquiesce to another group of people's demands just because they claim offensiveness?
Should everyone live to the strictest thinkable standards because somewhere out there is a semi-large group of people who find your behavior offensive?
Such semi-large groups might be religions, for example, that consider various behaviors you take for granted to be offensive.
It's not that you have to. It's a social grace to not offend people. If you really want to, you are allowed to. But when society says to fuck off -- well, you got what you paid for.
People getting offended is not the issue. If everyone chose to never offend anybody else, my mother would never know that she makes bad pasta and we'd all have to eat bad pasta 3 times a week.
You want to draw the line at marginalizing a group of people or trivializing something serious like racism. But you want to make sure you are doing it in a way that isn't inhibiting legitimate thoughts, studies, projects, etc. from making it into the public sphere.
Every time you censor speech you stifle legitimate dialog because people will be uncertain whether their thoughts are legitimate or just offensive.
Millions of people find all kinds of things offensive. One half of the political spectrum finds the other offensive. That doesn't mean we just ban everything.
Entropy that changes when your local IP does? That's worse than useless for ad targeting. Even if it were useful, I don't think there's much of a chance that adtech companies will build out STUN servers to handle the kind of traffic they do, just to track down the 0.001% of users who don't accept third-party cookies. Can you even do WebRTC from an iframe?
Browser fingerprinting is absolute FUD. It makes no sense for advertisers, and it's pretty useless for anyone else, too. Every time I visit the EFF site that checks my fingerprint, it tells me I'm still unique. That's perfect anonymity!
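For context, the "uniqueness" number a Panopticlick-style test reports is just summed surprisal across attributes. A toy sketch of that arithmetic (the attribute names and bit values below are illustrative, not EFF's actual measurements):

```python
# Hypothetical surprisal (bits) contributed by each browser attribute.
# Surprisal = -log2(fraction of users who share that attribute's value).
attribute_bits = {
    "user_agent": 10.0,
    "installed_fonts": 14.0,
    "plugin_list": 15.0,
    "timezone": 3.0,
}

total_bits = sum(attribute_bits.values())  # 42.0 bits in this toy example
distinguishable = 2 ** total_bits          # ~4.4 trillion distinct fingerprints

print(f"{total_bits:.1f} bits -> distinguishes ~{distinguishable:.2e} fingerprints")
```

The catch, which is the point being made above: if any attribute changes between visits (a font gets installed, the browser updates), the fingerprint changes too, so high entropy alone doesn't give advertisers a stable identifier.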
Revealing a user's personal IP when they're using a VPN is a real problem, though, where the computer isn't doing what even an experienced user would expect.
> They talk about FreeBSD in the original article and the guy tests that on other OS and say it's not a serious vuln?
Mr Hansteen is saying it is not a serious OpenSSH vuln, like the tech media is claiming it is.
> This is a serious vuln for FreeBSD. Period.
That's why the original disclosure and subsequent news articles clearly stated it was a FreeBSD and/or PAM vulnerability, and didn't run with headlines such as "OpenSSH keyboard-interactive authentication brute force vulnerability"[0] or "Bug in widely used OpenSSH opens servers to password cracking"[1].
The messenger reported the problem privately 9 months ago[0] and it was only fixed as they were preparing to go to the press. The day before the "stunt hacking" Jeep were claiming it was not a problem[1].
Not something that I would do, at least not on that stretch of road. But, nobody was actually harmed, nor was anyone even inconvenienced beyond perhaps having to make a lane-change. It could have caused a pile-up accident, but to give some perspective, I commuted into Houston on that same day and counted half a dozen disabled vehicles on my route, and two police officers trying to catch speeders; both of which add at least the same magnitude of risk to other motorists.
>They deserve to be charged/locked up for that portion of it.
"They" the two that did the risky road demo, or "they" the ones who waited 9 months to mitigate the safety defect in thousands of cars? Locking one of those groups up makes us more safe, and the other less.
That's ends-justify-the-means reasoning. I'm not arguing that someone was hurt; I'm arguing they put people in a situation with higher-than-necessary risk.
> I commuted into Houston on that same day and counted half a dozen disabled vehicles on my route, and two police officers trying to catch speeders; both of which add at least the same magnitude of risk to other motorists.
You can't control what other drivers around you do to cause road hazards, but in this case that's exactly what the hackers did. By cutting the transmission they caused a possibly dangerous situation.
They could have done this test safely in numerous ways: on a rig designed for testing horsepower, in an abandoned parking lot, on a private track, or by asking police to close a section of road.
Instead they did it in about the most dangerous way possible: live on uncontrolled roads with other traffic.
> "They" the two that did the risky road demo, or "they" the ones who waited 9 months to mitigate the safety defect in thousands of cars?
The 'researchers'/hackers. They directly put people's lives at risk in a stunt to prove their point.
FCA screwed up big, and deserve some sort of penalty from the government. This shouldn't have been possible in the first place. But they didn't modify a running vehicle at 70mph surrounded by unsuspecting motorists.
If the hackers had done this reasonably safely, I'd have no issue at all. But they don't deserve the title 'researcher' after behaving like that.
The "ends justifying the means" is not a fallacy, unless you think Consequentialism is completely wrong[1]. In this case, it seems a lot of people do think the ends justify the means. In other words, the risk was morally justified given the results.
I don't like it either, but it's a common philosophy that we experience in our daily lives, whether we want it to be that way or not.
>Instead they did it in about the most dangerous way possible: live on uncontrolled roads with other traffic.
There are a number of ways they could have made it worse.
>The 'researchers'/hackers. They directly put people's lives at risk to have a stunt to prove their point.
We tolerate the same kinds of risks daily by allowing police to conduct traffic stops on busy roadways. We do so ostensibly for safety, but the reality is that it's mostly for some pretty dubious financial reasons. That's the ends justifying the means: the ends are traffic fines collected, and the means are the lives of police officers and motorists, plus man-decades of time lost in traffic every day.
>FCA screwed up big, and deserve some sort of penalty from the government.
But not jail, like for the evil hackers-not-researchers?
>This shouldn't have been possible in the first place.
Another uncomfortable reality for you. Software verification is a huge challenge which nobody has gotten right yet. There will be more of these vulnerabilities. We have to get this disclosure/update process right. The best thing that automakers can do to prevent disclosure stunts like this is to fix vulnerabilities ASAP when they're discovered.
>But they didn't modify a running vehicle at 70mph surrounded by unsuspecting motorists.
What they did/do (months' worth of nothing) is far worse, and endangers far more lives. Imagine if someone/somegroup/somegov't had managed to bundle this vuln with a popular cell-phone app, or mobile website, and decided to activate it one day at rush hour.
>If the hackers had done this reasonably safely, I'd have no issue at all. But they don't deserve the title 'researcher' after behaving like that.
That's just petty, and nobody really cares what you call them. It does nothing to move the discussion in a fruitful direction; but it does make you appear a bit shallow in your reasoning.
> We tolerate the same kinds of risks daily by allowing police to conduct traffic stops on busy roadways.
We've chosen to accept that risk, and have control over it through government. People could choose to make it illegal.
Also note that when the police do it they take precautions such as having bright multi-colored lights on the car to draw attention.
It also bothers me that the Wired guy didn't know what was going to happen, so he couldn't prepare as well. Even that would have helped (though I still think the stunt was dangerous enough for someone to go to jail).
> But not jail, like for the evil hackers-not-researchers?
We can't put a company in jail and we don't usually do it with the CEO for much bigger crimes. I doubt the CEO had any idea this could happen.
I imagine this is one of those things where dozens of people in different departments (and even different companies, due to auto parts suppliers) all made small but not terribly bad decisions that added up to a huge problem. I doubt there is a smoking-gun email somewhere where a manager says 'I know someone could disable the car on the freeway but it will save us millions!'
And while this is happening to FCA, there are other cars with cell-connected systems (VW, BMW, Audi, others). I imagine if we had enough time we'd find at least 2 or 3 other companies with vulnerable systems in other car brands' 2015 models.
I just don't see how we could jail anyone in FCA. That's why I didn't call for it. On the other hand the hackers seems like a pretty cut and try case.
> That's just petty, and nobody really cares what you call them.
The words you use to describe something matter. Having taken such a stupid risk, I don't see why they should expect to be acknowledged on the same level as security professionals who don't put people in danger for headlines.
Frankly the number of people in these threads defending the overly dangerous demo scares me, as does the number of people who seem to tacitly encourage such behavior.
>We've chosen to accept that risk, and have control over it though government. People could choose to make it illegal.
I'll expect to see you down at the legislature lobbying for reform and threatening jail for the opposition.
>Also note that when the police do it they take precautions such as having bright multi-colored lights on the car to draw attention.
It's hard not to notice something so distracting! That's another argument against the practice, isn't it?
>We can't put a company in jail and we don't usually do it with the CEO for much bigger crimes. I doubt the CEO had any idea this could happen.
It's a funny thing. We can never seem to find anyone in a company who knows anything or has any responsibility. We just have to satisfy ourselves that if popular opinion moves against a company strongly enough, or the gov't gets shamed into prosecuting them, that maybe then they might address some problem, usually after it actually kills people, so long as nobody has to admit fault. It's almost like a huge stunt is needed to get peoples' attention sometimes!
>I imagine this is one of those things where dozens of people in different departments...
Yeah, we all know about how corporate structures insulate decision-makers from the consequences of their decisions.
>cut and try
"Cut and dry" So, because it's easy to prosecute these two, and hard to prosecute the others, justice should take the easy route, even though one may have endangered tens of people on one occasion, and the other endangers tens of thousands of people for months? I see a different value proposition here than you.
>Frankly the number of people in these threads defending the overly dangerous demo scares me
I guess it would surprise you to learn that I feel the same way about you after this conversation?
I'm gonna take this argument in a slightly different direction. While I agree with you that the way this experiment was conducted was needlessly unsafe and irresponsible, I don't believe these researchers deserve jail time given that only positive outcomes resulted from this event. Pragmatically, society has nothing to gain from punishing these individuals; in the best-case scenario doing so costs us money and likely devastates several lives for no purpose. For that matter, I am also against sending people to jail even when their actions have negative effects, if it is not clear that isolating them from the public will reduce future harm.
My reason for suggesting jail is that I worry that if they're not punished (just a week would be perfect), other people will think such needlessly unsafe public demonstrations are acceptable.
I don't know another way to say 'this is not acceptable'. To literally just say it but not punish... I don't think that would be heard, and someone else would take a stupid risk.
They certainly don't need 5 years or something like that. Just a very noticeable slap on the wrist.
> What's wrong with you? [...] You're a bad person and you should feel bad.
That's unnecessary (and I get the reference).
I have no problem with them fining and disclosing this hole. I'm glad someone is proving that cars are woefully insecure.
I have a problem with the unnecessarily dangerous way it was demoed.
Why do I have to choose between 'white hat hackers saving the world' and 'evil computer terrorists'? Why can't I judge one part of the story (the discovery) separately from another (the demo)?
In this case, your judgement is poor. St. Louis area prosecutors would like nothing better than a showy media trial of suburban white dudes to distract from the well-deserved asskicking they've gotten over the last year. Fiat-Chrysler would like nothing better than pictures of these guys in handcuffs to distract from their fuckup. Those of us who hack shouldn't be so quick to turn on our own, and those of us who have ever driven on a public road know that sometimes vehicles slow down. Driving is dangerous, and as a society we've accepted that, because trade-offs. If you don't make your kids wear crash helmets while riding the school bus, you have no business calling the fury of the state down on these guys.
A lot of reasonable people would say that it is because driving is dangerous that we shouldn't be tolerant of people needlessly adding more danger to it.