
Simple. They...

1. Take steps that make it easier to identify leakers (like embedding secret numbers into promotional photos so they can tell who leaked them if they get published)

2. They pursue anyone who dares to leak information with the ferocity of a rabid dog.

No real mystery here.



While I don't disagree with your observations, I think it would be remiss to exclude the fact that many Apple employees are incredibly loyal and believe in the company.


When I started working for Apple (about 5 years ago now) it was my dream job (I had serious fanboitis). The new hires got a security orientation which was your basic "We spend lots of money protecting things; don't do anything lame."

But the reality of working makes the magic unicorn dust wear off. I saw a bunch of prototypes and I knew a bunch of information that nobody else 'knew' but lots of outsiders suspected. I became more interested in my work than in selling other people on this idea that I had secrets I could pass on to others. (We were also told to 'eat our own dog-food', which makes me highly suspicious of every Software Update to this day. I always wait a couple of weeks before installing anything new.)

That's an important part of the equation. If your employees are more interested in leaking information than working, maybe you're doing something wrong. (Just to clarify, I still consider Apple a great place to work; I left because I wanted to do something different, not because I felt that it was a poor place to work.)

[Edited for grammar]


Surely if you were encouraged to 'eat your own dog food', that would mean people inside Apple would be testing stuff themselves, so eventual public releases would be better?


You might think so. It would be hard to test this empirically, since I would need an alternate universe where people inside Apple did not test unreleased software on themselves before releasing. But seeing the process that goes into deciding which fixes go in and which don't has educated me to NOT update unless I have a real need to. I saw lots of examples where a fix for something broke something else that was seemingly unrelated, or where a feature that I liked depended on a bug that was going to be fixed. That's why I wait on Software Updates. And if I don't need the Update, I don't install it.

The problem with 'eating your own dog food' is that it is dog food, so it's really hard to eat. To comply with this edict, I would set up a separate machine and install the latest build and play around with it for about 15 minutes and spend a half hour (or more) filing bugs. Thus, I wasn't really eating my own dog food...

Installing new builds turned out to be a big time-sink. It took time for the build to install and then you had to go through the paces of setting the machine up. You also had to do a clean install because of incompatibilities with previous releases.

After all of this, you started testing. Then you started finding bugs, and since I was in engineering, I tried to make sure the bug was reproducible and even tried to track down which component was responsible (which is a mess if you're dealing with overlapping stacks of software). THEN, you had to crawl through the maze of reported bugs to see if there was something already reported that was close... Hopefully they've made the bug reporting system better since I was there (but I doubt it… the system I used had been in place for a long time).

On top of that, I needed to get actual work done. There was no way I was truly going to use an ever-changing system of dubious quality to do work. I tried, and it was too difficult (although I knew of some folks who were faithful and did so).

'Eating your own dog-food' is an OK idea. It's not clear to me that it was the best idea for this particular situation. As a general practice, it seems that if your product's scope is small enough, it's a good idea. If it's larger, hiring a set of dedicated test engineers would be awesome. (The ironic thing is that Apple had those too!)


True, but every company has mostly loyal employees (imho). The trick is keeping the dishonest ones in line, and the best way to do that is through fear. Apple makes it clear they'll go nuclear on anyone who leaks information, and the dishonest ones realize it probably isn't worth the risk.


I disagree. In a company with tens of thousands of employees, it still only takes one person to leak something. Loyalty is (IMHO) far more important than those other factors (although they too contribute).

Consider a more open example: Google. Internally Google is more open than I believe Apple to be (I have experience with the former, none with the latter other than what I've read).

You may think that a lot of stuff leaks from Google based just on TC stories and so forth, and while there are things I'm sure we could manage better, when I joined I was (and still am) amazed at what doesn't leak.

Google doesn't have the same militant, even draconian, approaches to secrecy that Apple does. What they have in common is that most people in both companies (IMHO) are very loyal.


Right - because as for the ones that didn't: see the previous comment :)


3. They do not employ very large teams, and those small teams are very dedicated.

Apparently they didn't just read Brooks and Peopleware; they also seem to act on them.



