Hacker News | ghjm's comments

The question here is what motivates individual developers to write big projects and then release them as open source. I think vague dreams of million-dollar deals are part of this for a lot of people. As the developer community becomes more aware of what a grind open source maintainership is, people are already less interested in taking on that responsibility. If we also prevent big money buyouts from happening, I wonder what's left to motivate a future developer to create the next redis.


Redis was created for the same reason most of us create open source tools: to scratch an itch, to solve a problem (or improve a solution).

I find it hard to believe that many, if any, would see "create an open source tool" as a method to become a millionaire.


Maybe there's already a way to do this, but it seems to me that it would be useful to be able to refer to the answer on a previous line, like the Ans key on a TI calculator. So:

2 + 3 (=5)

Ans * 2 (=10)


x = 2 + 3

x * 2
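A minimal sketch of how such an Ans feature could work, assuming a toy evaluator that textually substitutes the previous line's result (the function name and approach here are hypothetical, not from any real calculator app):

```python
# Toy line-by-line evaluator with a TI-style "Ans" that refers to the
# previous line's result. Purely illustrative.
def evaluate_lines(lines):
    results = []
    ans = None
    for line in lines:
        if ans is not None:
            # Substitute the prior result wherever "Ans" appears
            line = line.replace("Ans", repr(ans))
        # eval() is fine for a sketch; a real tool would use a proper parser
        ans = eval(line, {"__builtins__": {}}, {})
        results.append(ans)
    return results

evaluate_lines(["2 + 3", "Ans * 2"])  # -> [5, 10]
```

Python's own interactive shell does something similar with `_`, which is bound to the last displayed result.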


Out of curiosity, would you be concerned with giving a managed service vendor all the passwords to everything in your infrastructure?


Yes, I would. Which is why I love the idea of having AWX run within my own infra, in the way that I'd want to run it.

Arrogant as it may sound, I'd much prefer to take ownership of that, personally. I'd be happy to self-host it and take the operational pains that come with it, and sleep (at least slightly more happily at night) knowing that I'm managing my secrets myself.


Even if your playbooks run everything as sudo, that doesn't mean you have to grant AWX/Tower users the ability to create arbitrary playbooks or run anything else as sudo. You certainly can do that, but the point of the RBAC feature of Tower is that you don't have to.


You are absolutely correct that these factors should be carefully considered prior to any deployment of AWX or Tower. You are granting Tower a lot of authority over your networks and systems, and should not do so without good reason. If you don't need the features of AWX/Tower, then the best practice is not to use it. There's a tremendous amount you can do with Ansible itself, without AWX/Tower, and lots of people use it happily that way.

That being said, I think you are overstating some of the risks. You don't need to grant every Tower user root access to everything on your network. If you're at a scale where Tower makes sense, you probably already have some sort of separation of privileges. I agree that a malicious Chrome extension could do a lot of damage - just like it could with all your other management tools like DRAC/ILO, network equipment GUIs and so forth. Yes, every web application carries the risk of CSRF/XSS or other vulnerabilities, and Tower is not immune to this, but we do spend a lot of time worrying about it, conducting audits, etc.

If your operation can succeed with nothing but a sudoers rule and command-line Ansible, then by all means use that. Nobody wants to force AWX/Tower on people who don't need it. But if you do need the feature set of Tower, I think it's one of the safer options available.


Yes - you can either use your own tooling to distribute your playbooks to the hosts and then run ansible-playbook using the 'local' connection method, or you can use ansible-pull, which retrieves the playbooks from an SCM (git) repo and runs them locally. You don't need AWX/Tower for any of this.


Don't you have to give Jenkins the same credentials and attack surface that you would have had to give to AWX/Tower?

With the Jenkins model, can't someone just add a job that dumps the execution environment, and get all your credentials?
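The concern above can be made concrete: any job that can execute arbitrary code can read its own process environment, which is where CI systems commonly inject credentials. A hypothetical sketch of what a malicious build step could do (the function and the filter terms are made up for illustration):

```python
import os

# Any build step running attacker-controlled code can harvest whatever
# secrets were injected into the job's environment variables.
def leaked_credentials(environ):
    suspicious = ("TOKEN", "PASSWORD", "SECRET", "KEY")
    return {k: v for k, v in environ.items()
            if any(s in k.upper() for s in suspicious)}

# Inside a malicious job, this would be: leaked_credentials(os.environ)
```

This is why both Jenkins and AWX/Tower face the same fundamental problem: credentials made available to a job are, in the end, available to the code the job runs.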


I would like to object to the following statement from the article:

> While lifeguards are taught all the possible signs of a person who is drowning, pilots don’t receive elaborate training on all the things that can go wrong, precisely because the many things that can go wrong so rarely do.

In fact, pilots are given extensive training in the failure modes of their airplanes. When Casner got his private pilot license, he should have been told to memorize the emergency procedures for his airplane. De novo private pilots are not given the same level of training as airline pilots, but failure modes and "interesting-looking instrument panels" are crucial components of instrument training.

More generally, I don't disagree with his conclusions regarding the problems with cockpit automation, but they are hardly novel - the industry has been aware of this for years. However, the service life of an airliner is measured in decades, and the regulatory environment, for excellent reasons, doesn't allow us to change the avionics on existing airliners without re-certifying the new systems and re-training all the pilots.

Last but not least, I'd like to point out that even if the airplane is on autopilot, there is still plenty for the pilots to do. For example, they are supposed to brief the instrument approach procedure and the missed approach procedure every single time they fly it. A lot of pilots relax this discipline (if you've flown into the same airport hundreds of times, it's hard to argue that re-reading the procedure yet again has much value). But relaxing it to the point of just idly chatting all the way down the glideslope? That's not a problem with the cockpit systems, and it's hard to see how better human factors would have magically turned these incredibly irresponsible people into good pilots.


I'd also disagree with that same assertion with regard to lifeguards:

Drownings (and near-drownings prevented by a rescue) are pretty uncommon, too. It is precisely because of the rarity that one needs to be trained to spot the problem. If they happened all the time, then it would merely be a matter of the journeyman pointing one out to the apprentice when it happens. Instead, one has to rely on a combination of book study and roleplay.

This is even more the case in a pool situation (as described in the article). I was a lifeguard at a very busy pool for five years. In that time, I performed very few rescues, some of which would better be described as "assistances"; only one would likely have turned into a drowning without a rescue.

I might add, good lifeguarding is not about watching for signs that someone might be drowning; it's about watching for signs that might precede trouble.

I'm no pilot, so I have no idea how many different things might go wrong on a plane, but I assume by the comparative duration and cost of training that it's a bit more complex than watching people swim. I'd be shocked if pilot training was strictly happy-path. I'd expect (as you state) that they would be assessed on their ability to interpret and react to things going wrong. Otherwise, what's the point of all the instrumentation?


Isn't the argument from the article that the pilots are just idly chatting all the way down because of excessive automation?

If, like the article claims, human error is the leading cause of accidents (and I assume it is implied that distraction is a major factor in human error, though the article doesn't explicitly claim this), isn't mindless automation a major problem that should be addressed? To quote NASA's researcher: "Companies were introducing increasingly specialized automated functions to address particular errors without looking at their over-all effects [...] As it stood, increased automation hadn't reduced human errors on the whole; it had simply changed their form."

I don't know if it's accurate to call these pilots "irresponsible people". Probably it is. But how does this help in reducing accidents?


> Isn't the argument from the article that the pilots are just idly chatting all the way down because of excessive automation?

Yeah, but there seems to be an implicit argument appended: "and this causes more accidents than the automation prevents". I'm not sure about this one. Aren't jet planes ever safer to fly in? I was under the impression that the death rates kept going down. So the backfire effect from this automation can't be too bad or else net safety wouldn't increase.

I think you could turn it into a weaker but more defensible argument: 'the mind-wandering sets a bound on how safe air travel can ever get with human pilots, because automation itself introduces human error'. But when you make it explicit like that, it starts to look like an argument for taking humans out of the loop entirely...


It's definitely worth remembering how rare commercial airline crashes are: http://en.wikipedia.org/wiki/List_of_accidents_and_incidents...

There was also drama around a whistleblower being ignored and demoted after bringing up safety concerns about Colgan Air: http://www.nytimes.com/2009/06/04/nyregion/04colgan.html?_r=...

Check-ride stall recovery is now graded differently too, thanks to this incident: http://www.flyingmag.com/pilots-places/pilots-adventures-mor..., under the belief that the old standards could have been making pilots instinctively afraid to lose altitude during stall recovery.

It's also believed that the pilots on this flight were fatigued and that their performance was impaired. Given how bad humans are at even driving cars while exhausted, it's not surprising that flying a plane in that condition is hard.

Full disclosure: All but the first link are cited by the same Wikipedia article: http://en.wikipedia.org/wiki/Colgan_Air_Flight_3407. I immediately recognized the story of this crash because I have a bit of a fascination with how things can go horribly wrong.

This layman also cringed while watching a TV documentary about this flight the moment the reenactment pilot pulled up during a stall warning, and again when the co-pilot retracted the flaps. "Are you trying to crash‽" is what I wanted to yell at my monitor.


In the near/mid-term it would probably be a good course of action to have entirely automated airplanes, and "pilots" (or technicians or whatever you would call them) specialized in handling automation failures. That way, there would be no question about what their role is in the plane.


Pilots are all about handling failures - that is pretty much what their whole career is focused on. The normal operation of a modern aircraft is pretty boring - what makes a pilot worth his salary is all the continuous training and drilling that embeds emergency procedures in his mind at the reflex level. It is a lifelong process, regularly updated through new hardware and modified methods in response to incidents.


> Isn't the argument from the article that the pilots are just idly chatting all the way down because of excessive automation?

We can make them play computer games that test their attention.


I actually thought something similar while typing my first reply. What if they gave pilots some sort of busywork to keep them engaged? But I guess it wouldn't work; as soon as you identify something as busywork, you stop giving it your full attention.


In trains they have busywork to test your attention. (I rode a simulator once; it's like a flight simulator, but for locomotives.)

There's a deadman pedal that you need to keep pressed to keep the train moving. Once every half a minute, you need to lift your foot off the pedal. If you fail, the train will soon brake and come to a halt.
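The vigilance logic described above can be sketched as a simple state machine. The class name, timings, and API below are illustrative only, not those of any real train system:

```python
# Illustrative dead-man/vigilance sketch: the pedal must be held down,
# but must also be released (and re-pressed) within each vigilance
# window, proving the driver is alert rather than a weight on the pedal.
class VigilancePedal:
    def __init__(self, window=30.0):
        self.window = window        # seconds allowed between releases
        self.last_release = 0.0
        self.pressed = False

    def press(self, t):
        self.pressed = True

    def release(self, t):
        self.pressed = False
        self.last_release = t

    def brake_demanded(self, t):
        # Brake if the pedal is up, or if it has been held continuously
        # past the vigilance window with no release.
        return (not self.pressed) or (t - self.last_release > self.window)
```

For example, holding the pedal down for 31 seconds straight without a release would trigger the brake, just as letting go of it entirely would. (A real system would tolerate a brief release interval; this sketch brakes immediately when the pedal is up.)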


"Each agency may promulgate regulations, pursuant to notice and receipt of public comment, providing for the aggregation of certain requests by the same requestor, or by a group of requestors acting in concert, if the agency reasonably believes that such requests actually constitute a single request, which would otherwise satisfy the unusual circumstances specified in this subparagraph, and the requests involve clearly related matters."

They would just publish one response and send us all a link to it.


They want to be able to take large donations from wealthy contributors, without (a) having to pay tax on them as income, or (b) having to follow the normal reporting requirements for political groups.


> without (a) having to pay tax on them as income,

But that is my point: The expenses write off against the revenue in the biz, so there's no point in creating a separate org. There can't be sizeable income left in the separate org, or they get nailed for having assets.

> (b) having to follow the normal reporting requirements for political groups.

That's different, but I got the sense from the article that it was more about advertising than politics. FTA: "In short, the IRS is concerned that some of these organizations exist simply to market companies' software, and perhaps the associated services sold alongside them."


The tax issue actually goes the other way--individuals can take a personal tax deduction on most donations to 501c orgs, but not to for-profit companies.


Aha, it's because they think their income will go up from donations from individuals? That makes more sense.

