I like pretty much everything someone takes the time to write about development, but this rubbed me the wrong way. I'm not sure what it is exactly the author is trying to express here: that engineers need to stay current on new, hot technologies? Everyone knows that. But then he seems to deride people who, based on his interviewing, haven't used a <new technology X> in production / in a new project.
You can't have it both ways. You need to find a way to evaluate people and, through that evaluation, grok whether or not they can pick up a new technology that they aren't already familiar with. If you can't figure that out, your interview process has failed. Think about it: the complexity and difficulty of your own problems vastly exceed the difficulty a senior engineer faces learning a few new frameworks or APIs. If not, you're doing something pretty trivial.
The best people I've ever worked with weren't people who a priori understood the technology stack we were working with -- instead, they were seasoned engineers who knew how to go out and learn, and then apply what they already knew in a new environment. This seems so obvious I feel silly even writing it down here, but this is on the front page, so it must be resonating with folks.
What tptacek has said about .NET developers working at IT departments at huge organizations really rings true here. If you immediately write someone off because they work with technologies you don't work with[1] then you're doing it wrong.
1. Assuming you're not looking for a quick, contracted hired gun for something fairly straightforward.
> If you immediately write someone off because they work with technologies you don't work with[1] then you're doing it wrong.
I've had a hell of a time interviewing for just this reason. For the first set of interviews, I was emphasizing my experience with patterns as opposed to technologies. That didn't work. I was routinely dismissed, despite decades of industry experience.
When I decided to grind through the latest buzzwords, and build a few sample apps, that's when interviewers got much more positive.
It's not how I would have interviewed people, and I do agree with you and tptacek, but it was an eye opening experience on how to play the game.
HR folks and interviewing engineers don't get this. Idioms, best practices and methodologies should be sought in senior people. Buzzword hiring shops are likely full of bad people to work with. It's a signal of an organizational lack of intelligence and perspective.
It may be frustrating and require a lot of patience on your end, but if you look at it in another light it may be a sign you don't want to work at that company.
Exactly what you said. The other part that got me was his "as an entry-level programmer, be prepared for grinding, endless tedium."
Maybe that's how his workplace treats junior devs, but the reality is that a developer with very little experience but a lot of skill is someone who can add a lot of value, and can find a workplace where they'll be able to do that. If you try to make a great junior dev "pay their dues," well, they have better options, with companies that will use them more sensibly.
I see where you're coming from, but in context I don't think the author was advocating for treating junior devs badly on purpose. More like everyone ends up banging their head against the wall sometimes as a developer, and you should be prepared for that if you're a new programmer and considering doing it as a career. In some sense it is paying dues but it's not imposed by anyone else, it's just that the less experienced you are the more often you get stuck.
His related advice actually really resonated with me, basically just to be humble.
"However with one or maybe two of the students I sensed overbearing confidence that now that could write some code and create prototypes they were valuable commodities in the industry."
I've also come across this kind of person who's sure they can do no wrong, and there's no one worse to bring into an organization. Anybody that's gonna come in and do things their own way, not bothering to spend time learning how the team and codebase works and why, will cause a lot of friction. But when it's a junior dev it can be a lot more destructive because "their own way" will often be flat out bad as opposed to just incongruent with the way the rest of the team works.
> The other part that got me was his "as an entry-level programmer, be prepared for grinding, endless tedium."
Tediousness is in the eye of the beholder. I enjoy coding, and as a junior developer (~3 years exp.) I get to spend the majority of my time doing that.
When I look at the more senior developers, all I see is that the better you get at writing software, the less of it you're asked to write. Their days are filled with meetings, emails, triaging bugs, and reviewing other people's code.
If this is the case, it sounds like OP does not adhere to "KISS" or "DRY" principles. On the contrary, anyone who is going to take up space in my office better not be engaging in "endless tedium". It's a sure sign they're not doing their job right.
"So I think the lack of a real computer science today, and the lack of real software engineering today, is partly due to this pop culture."
"Like I said, it’s a pop culture. A commercial hit record for teenagers doesn’t have to have any particular musical merits. I think a lot of the success of various programming languages is expeditious gap-filling."
It's been said that the population of programmers doubles every ~5 to 10 years. And, that loosely correlates to the hype cycle of "hot technologies".
I don't think it's possible to avoid a pop culture with that many young and inexperienced programmers, especially with a cultural emphasis on self-training over mentoring.
> They have been working in the same job, with the same technologies, on the same product but when asked what makes you a senior developer, they answer “Because I have been coding for 5 years”.
> The sad fact is that these people have not been aware of the realities of the industry they have been working in. Their job and their job security is no longer secure. Never let this become you!
It sounds like the author values language/framework knowledge above expertise. Speaking as a backend dev, I find that as you move up in seniority it becomes impossible to keep up with the latest technologies. Learning management skills and being more involved in architecting and business development all take time. You may end up doing code reviews, project management and cleaning up the 'legacy' (read: broken) parts of the system. Does that make you less valuable?
To a company that insists on every part of their project being built in the latest cool language/framework (and yes, I still class NodeJS as this), yes. To a pragmatic company, which may choose NodeJS if it's the best fit, or Rails, or PHP, or Java - no.
We're back at the skills vs language debate, one which I thought has been done to death.
"when asked what makes you a senior developer, they answer 'Because I have been coding for 5 years'."
Maybe 5 years is insufficient, to be considered senior, but isn't the amount of time you've been coding what defines how senior you are? I get that technologies come and go and you have to stay up to date, but I would think that if someone were looking for a senior developer they would be looking for someone with a lot of experience, not necessarily someone who knows the ins and outs of technology X.
Can anyone shed some light on this? It just seems bizarre to me that if you want to be a senior developer you'd need to either stick with the same technologies you used before or constantly start over with whatever is new.
I have 12 years of experience and my interviews have turned into complete train wrecks in the past - I cringe to think the interviewer might have had the same point of view as this guy, that I had just watched some videos and knew things at a high level. What a flat-out ignorant viewpoint.
This is often the case: the interviewers are completely clueless about selecting the right person for the job. Wasn't it in the news recently that random selection did just as well?
To me, the 'Senior' adjective in a job title corresponds to the individual's position within the organization that has bestowed that title, not necessarily to the individual's abilities compared to the larger community.
Sometimes you need someone who knows every detail of whatever system. It can really pay off and you are willing to throw money and titles to get someone good. It doesn't mean they are a great senior dev, but if you need that person, then you need that person.
I've done embedded work, application work and now I'm doing web development. I've often seen the "5 year senior wannabes". Hell, after 5 years of work, I was one of them! I refer to my old self as a "hotshot asshole programmer". They are common.
These days, I think a junior developer is someone who needs assistance to get their job done. An intermediate developer is one who can get everything done without assistance (although they may still benefit from assistance!). A senior developer is someone who can bring up everybody else's level (getting assistance in this seems to be a critical prerequisite for success).
I tell this to people on my team, but again it is hard to judge your own effectiveness. There is a lot of hubris. I know -- been there, done that, slept in the t-shirt for over 20 years. Thinking about it, I think the key piece of experience that can help you be a good senior developer is to watch a system that you built collapse under its own weight. Coming to grips with the idea that all of those things you really believed would definitely lead to wonderful software were just... wrong (and it doesn't matter what it was that you believed here).
Seniority means, to us at least, experience (i.e. beyond "don't know what you don't know" and at the "know I know some stuff but there's a lot I don't know" stage) and the ability to rapidly pick up and grok new tech, languages, ways of working, and what have you.
It's not about a certain level of experience in Node.js, or whether you have used the shiny new tech that came out last Thursday - it's about whether you can.
5 years! A career these days is 40-45 years. If you are "senior" in only 5 then where do you expect to go from there?
20 years experience, on a wide range of technologies, in a variety of roles, then we'll talk senior. Until then calling yourself "senior" just invites people not to take you seriously.
Exactly what I've seen and experienced first-hand (as verified in recent salary negotiations). It's a systemic software industry issue, a peacock game we're expected to play in order to get ahead.
20 years in this industry invites people to treat you as a fossil and not even hire you. Technology moves too fast; nobody cares what you did in 1995 anywhere near as much as what you did in 2013.
I'm a somewhat older programmer, and fwiw, I've met as many genuinely unhireable fossils as I've heard this complaint.
I've heard it from people who've been programming for 30+ years, whose idea of code versioning is the 'Save As' button. I've heard it from people whose entry to technology development was coding websites in HTML 1.0, who've learned very little else since.
Many people, at least from the previous generations of programmers, either don't appreciate or aren't able to manage the lifelong learning required by this profession.
As I said on another thread, 2010's MongoDB is just 1970's MUMPS. Once you have been through a few cycles, you see that actually technology comes round and round again. Centralize, distribute, centralize again, just one repeating pattern. Thick clients, thin clients, thick again. Compile, interpret, compile again. Just the buzzwords and the syntax changes.
All that is probably true, but what we are discussing here is perceived value to an employer. It's rather obvious in software that perceived value does not accrue with experience as it does in fields like law or medicine. Even though data shows doctors get worse over time, a 25 year veteran is held in much higher regard than a 5 year practitioner. In software circles, you are far more likely to be looked at as old and out of date, regardless of the skills you have amassed.
Oh, sure, trends in the libraries come and go, but if you think they're only going in circles, you should consider which you'd rather be programming: a Ruby on Rails web app, or a COBOL-backed green screen application on an IBM AS/400.
Myself, I'll take those 45 years of progress, thank you kindly :)
> If you are "senior" in only 5 then where do you expect to go from there?
Depends. After 5 years of being a senior developer, you might begin to think about being a junior architect or tech lead of some sort.
Now, I think I was a pretty senior developer after 5 years, but then I also had preprofessional experience going back another 5 years, and a few years of hobby experience before that... :P
For me, John Allspaw's article On Being A Senior Engineer[1] lays it all out pretty well. It addresses the issues we all seem to have with this article better than I could.
> Generation X (and even more so generation Y) are cultures of immediate gratification. I’ve worked with a staggering number of engineers that expect the “career path” to take them to the highest ranks of the engineering group inside 5 years just because they are smart. This is simply impossible in the staggering numbers I’ve witnessed. Not everyone can be senior. If, after five years, you are senior, are you at the peak of your game? After five more years will you not have accrued more invaluable experience? What then? “Super engineer”? Five more years? “Super-duper engineer.” I blame the youth of our discipline for this affliction.
Web development is like an entry point for programming these days, the lowest hanging fruit if you will. I would not exactly call anyone who does exclusively web development "senior". It's like being a nurse - I suppose you can become a senior nurse, but you're not going to be a doctor. There's only so much you can progress professionally in web development, and each new stack gets easier to pick up, almost no challenge, patterns repeat and boredom sets in. I started from website development, then went to enterprise web application development and now moving on to data engineering (data mining, machine learning).
"Web development is like an entry point for programming these days"
Web development is just a _type_ of development. You're creating a hierarchy where none ought to exist.
I've known many system programmers who said the same thing about data engineering: "what kind of a lame made up job is that? They write Python and SQL queries, wow, that's barely programming."
It's all nonsense. You can employ techniques which vary over a wide range of sophistication to create web pages, embedded systems, or ETL pipelines. There is no clear cut hierarchy.
source: Have worked as a web dev, data warehouse engineer, and release engineer.
"The sad fact is that these people have not been aware of the realities of the industry they have been working in".
IMO the observations provided by the OP are not sound reasoning for the above conclusion.
In a recent gig where I was responsible for helping the company hire new developers I made a conscientious effort to avoid making such sweeping generalizations. Web development is one of the most diverse fields that I know of. So much so that in my mind you can't assume a qualified employee has ever touched the specific technologies you use in your stack.
I think the hiring practices described by the OP may be the dinosaur here.
This is why development is ageist. Keeping programming skills up to date after work, and spending hours after work trying different languages and libraries, is fine when you're young.
It's not so fine when you have kids and a wife to look after.
It's not only ageist. As long as developers keep accepting that they have to spend 15 hours a month on their own dime just to keep up, it's outright exploitative. Developers are generally expected to keep current in the breakneck technology rat race with no support from their employing organization. This could (maybe) be considered reasonable if technology moved more slowly, but not the way the world of computer software works.
It's one thing if you're a consultant in a web development position where your team is constantly jumping from the previous best practice to the next; keeping up in such an environment is easy and you'll get paid to do it. The problem occurs if you're in (e.g.) a back-end position where your job duties require you to focus on slower-moving technologies that might have lesser demand in the market. What happens if you lose your job after 5 years? Unless you spent all your weekends working with the latest hot frameworks, you're (mostly) screwed.
There was a good essay a while ago which stated that software developers need a professional association. If we were organized, we could put forward the very reasonable demand that a small portion of our on-the-clock time should be spent for education. I think this is a requirement if we are not aiming for a world of 40-year old burnouts and early retirees.
That's a very good point. Other fields have Continuing Education, sabbaticals, in-service training, etc. (e.g. medicine, education).
But then, computers and software change so much more rapidly--and arbitrarily--than other fields, that it's very different. Software is as much art as science, and a lot of "keeping up-to-date" with software is simply learning whatever is popular at the moment, not what is necessarily more advanced, better, or newly discovered.
> As long as developers keep accepting that they have to spend 15 hours a month on their own dime just to keep up, it's outright exploitative.
Why? How is this any different from working an extra... 4 hours a week? It sounds like long hours are fairly common, and we tend to be on salary and not get paid extra for that either.
I think the US system of salaried employees and no overtime restrictions (or even pay) encourages exploitation in the first place, so that is my background for making this statement. I live and work in Europe, and I will never work for any US-based company that does not place firm restrictions on how overtime is managed. Most US companies simply have the wrong incentives in place.
But yes, having the legal right to take 4 hours a week out of your schedule to stay up-to-date would go a long way towards alleviating the problem.
If you've done C++ or DB2 or something for 5 years or 10 years or 20 years you will have no trouble finding the next gig. Because these skills genuinely do take time to develop, and people with the tenacity to do it are rare.
It's pointless too. Every new language that comes out, everyone reinvents the wheel in it. You see it here on HN, on the front page[1] every day <some piece of software that has existed for years> now rewritten in <trendy language of the day> and everyone cheers, but is this program doing anything that it didn't do before? Is it faster, better, more reliable? Generally no, and this happens in the workplace too, just reinventing the wheel over and over and over again.
We could have stuck with C, and concentrated on developing new things, but the programming community is like Hollywood remaking the same film over and over with different stars.
For one, (speaking as an American) unions are sneered at as a waste of resources by your average citizen.
This leads to the second problem: there will always be a pretty decent sized group who do not believe in unions, meaning companies will have a choice in who to hire. It's usually not in a corporation's best interest to hire a unionized worker, at least not a corporation which views "profit" as their primary goal.
Well, speaking as an American from the Midwest, unions are usually sneered at as a waste of resources by people who have to pay for union workers.
We need to make sure that we aren't getting taken advantage of by our employers (examples: shitty pay, terrible work environments, bad equipment, conspiracies to lower wages), aren't implementing poor practices (things like backdoors for the government), and aren't subject to things like sexism, racism, and ageism (as you age and learn more and get better at what you do, it's going to be harder to get a job - does that seem fair?).
I'd prefer to be part of a tech union that helped ensure my safety in these areas.
Admittedly, I don't know enough about unions to know how the current structure would benefit our industry, but we need something similar at least.
> A comparison to law/medicine is more appropriate.
Which are both regulated and treated completely differently than the tech industry. I speak of unions because the way we are treated (not the level of skill) is more akin to a construction worker.
One minor point I disagree with is mobile development. For certain apps it's extremely important... But there's a lot of complicated applications that don't work on mobile at all. So if you wanted to support mobile you'd have to design and build two separate applications. That requires a lot of resources and manpower. Another alternative is to hide most of the functionality that won't work well in mobile, but depending on what problems you solve, this could make your application useless.
As for the overall topic, I think I agree. It seems to me that server-side technologies are better researched and understood. But then you look at client-side technology, it's chaotic. Building and maintaining user interfaces is hard... Recently I've been playing around with flux and react, and I feel like it's definitely a step in the right direction towards building simple components that can be easily maintained and extended.
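To illustrate what that direction looks like, here is a minimal sketch of a small, self-contained component. It is not the parent's actual code: it uses React's later useReducer hook to stand in for a Flux store, and the component and action names are made up.

```tsx
import React, { useReducer } from "react";

// Flux-style unidirectional flow: the view dispatches actions, a reducer
// (standing in for the store) computes the next state, and the component
// re-renders from that state.
type Action =
  | { type: "add"; text: string }
  | { type: "remove"; index: number };

interface State {
  items: string[];
}

function reducer(state: State, action: Action): State {
  switch (action.type) {
    case "add":
      return { items: [...state.items, action.text] };
    case "remove":
      return { items: state.items.filter((_, i) => i !== action.index) };
    default:
      return state;
  }
}

// A small component that only knows about its state and dispatch,
// which keeps it easy to maintain and extend.
export function ItemList() {
  const [state, dispatch] = useReducer(reducer, { items: [] });
  return (
    <ul>
      {state.items.map((item, i) => (
        <li key={i} onClick={() => dispatch({ type: "remove", index: i })}>
          {item}
        </li>
      ))}
      <li>
        <button onClick={() => dispatch({ type: "add", text: "another item" })}>
          Add item
        </button>
      </li>
    </ul>
  );
}
```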
> Some people love to cook in their spare time. Would these people be as enthusiastic about cooking if they were doing it for 40+ hours a week every week and there was a superior telling them what and how to cook?
> The honest answer for most people is no!
I find this weird. I advertised for calculator program requests in high school because I wanted someone to tell me what to code. (There were no takers. One guy wanted a program to convert decimal numbers into fractions, but that's already built into the calculators.) To me writing a program is solving a puzzle. You can't do that unless there's a puzzle to solve.
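For what it's worth, that fraction request is a nice little puzzle in its own right. A rough sketch of one way to do it (the function name and tolerance are mine, purely for illustration): walk the continued-fraction expansion of the decimal and stop once a convergent reproduces it.

```typescript
// Convert a decimal like 0.375 into a fraction (3/8) via its
// continued-fraction expansion, stopping when a convergent is close enough
// or the denominator gets too large.
function toFraction(x: number, maxDenominator = 10_000): [number, number] {
  let [hPrev, hCurr] = [0, 1]; // numerator convergents
  let [kPrev, kCurr] = [1, 0]; // denominator convergents
  let value = x;

  for (let i = 0; i < 64; i++) {
    const a = Math.floor(value);
    const hNext = a * hCurr + hPrev;
    const kNext = a * kCurr + kPrev;
    if (kNext > maxDenominator) break;
    [hPrev, hCurr] = [hCurr, hNext];
    [kPrev, kCurr] = [kCurr, kNext];
    if (Math.abs(x - hCurr / kCurr) < 1e-12) break;
    value = 1 / (value - a);
  }
  return [hCurr, kCurr];
}

console.log(toFraction(0.375)); // [ 3, 8 ]
console.log(toFraction(0.3333333333)); // [ 1, 3 ]
```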
I was baffled and enraged by the prevalence of the question "what do you want to work on?" and its related forms in job interviews. My philosophy of work is that the whole point of having a boss is that they tell you what to do and you do it. Sure, I don't want to fix cars, but that's why I'm applying as a software developer and not a mechanic.
I perceived the emphasis as being on the how to work. I once worked for a company that "believed" in "the one" PHP framework (developed in-house; terrible), "the one" cache (memcached - had to fight to get redis for the richer structures that we really needed; lost), "the one" DB (MySQL 5.0?), no staging environments for devs (we won a battle and allowed ourselves to run vagrant! woo), etc.
In your car mechanic analogy: you don't get to pick which cars to work on, and this is completely reasonable. But it should be up to you & your team to pick the tools. If some idiot in the company once managed to undo a bolt with chopsticks and liked it so much that he made it a company mandate, you'll soon be looking for another garage.
Long story short: I've seen the extreme of non-techies micro-managing technical decisions, and it is extremely frustrating.
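To make the memcached-versus-Redis aside above concrete: memcached is essentially a flat string-to-string cache, while Redis offers richer server-side structures such as hashes and lists. A minimal sketch, assuming the node-redis v4 client and a local Redis instance (the key names here are made up for illustration):

```typescript
import { createClient } from "redis";

async function main() {
  const client = createClient(); // defaults to redis://localhost:6379
  await client.connect();

  // A hash: structured fields under one key, updatable individually.
  await client.hSet("user:42", { name: "Ada", plan: "pro" });

  // A list used as a capped activity feed -- awkward to model in memcached.
  await client.lPush("recent:user:42", "viewed:item:7");
  await client.lTrim("recent:user:42", 0, 99); // keep only the latest 100

  console.log(await client.hGetAll("user:42"));
  console.log(await client.lRange("recent:user:42", 0, -1));

  await client.quit();
}

main().catch(console.error);
```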
You can literally work on software for any industry, company or government in the world. I would think you would have some sort of preference. At the least you should be interested enough in the place you are interviewing at to find out what they do, and then tell them you are interested in working on that. Otherwise you just come off as a mercenary or desperate.
The thing is, some people really don't care what they work on. As long as the pay is good and they are getting to program (or whatever other task they enjoy doing), the actual application doesn't matter.
Okay, so he doesn't like that someone is a senior developer because they have been coding for 5 years. Fair enough, I guess.
Then he goes on to say that you need to keep your technology current, and always be moving on. Okay....
So how does one become a senior developer in his eyes when you're always supposed to be moving on? By being a master with PHP? Not much has survived/been popular for >5 years that will make him happy.
This is a debate I am having right now. Should I go the sysadmin route (Hadoop clusters, etc.), become a data scientist, or get into cybersecurity? I have been leaning against choosing the web development career path.
Trite, but accurate. This role is currently in high demand, with correspondingly good salaries. It's worth noting, though, that companies expect DevOps to do more operational tasks than programming or procedure generation.
There aren't a lot of folks who understand how to administer a modern Linux system anymore; even companies using AWS or Rackspace or GAE will need someone versed in the arcane ways of System Administration as they grow.
This applies not just to web development. Mobile development is just as challenging: as an iOS developer, every iOS update brings new features and a lot of new guidelines, and now even a new programming language.
For "Web development as a career", the OP
seems mostly to mean as an employee. Then
I'd say, go down that road if you have to,
but ASAP learn the business side, get
your own software development environment
(likely you already have it) in your own
room at home, and then just be a sole
proprietor in business much as a guy
who owns and runs, say, a pizza shop,
is a CPA, mows grass and plows snow,
is a plumber or electrician, a
car repair guy, an auto body guy, etc.
Then you get some big advantages:
(1) You are the one who gets the
revenue and, thus, keeps the
earnings.
(2) Likely for each dollar of revenue,
you have lower overhead than
any employer and, thus, get more money.
In particular, you get to cut out
the fraction of the revenue going
to the owners, managers, marketing guys,
lawyers, HR,
landlords, business insurance
agencies, etc.
(3) You have some tax advantages,
often well known to sole proprietors.
(4) For your technical qualifications, you
only have to learn and use what
you actually need and find useful.
So, you don't have to play buzz word
ice hockey with some HR clown who
doesn't know fixed from float
but has a checklist of C++, Python,
Ruby, MongoDB, Jason, HTML5, ....
Point (4) is crucial: Likely
your customers just want their
Web site developed, running, and
occasionally revised. Okay,
that's what you need to be able to
do. But, you get to select the
framework, languages, libraries,
development environment, etc.
If you need a new tool, then, sure
get, learn, and use it. But you
don't have to spend a lot of time
learning, say, HTML5 or
SVG if you don't need to use them
yet.
"The business of America is business",
and in practice from crossroads up
to the largest cities a lot
of sole proprietor small businesses.
To me, it's interesting that an article on web development is hosted on Google Docs. This is not to point out an irony, but to note that, like any technology, web development can be democratized pretty rapidly. The website for my side business is hosted on a similar platform.
What was considered best in class 5 years ago is literally frowned upon now as the technologies and the standard of our applications have evolved rapidly.
Moreover, what was considered best in class 5 years ago can now be set up by a non-expert.
That's perfectly fair. They might also not want to use their employer's website for their personal writings. And using the right technology for the job is a good attribute.
Just wanted to note that this was posted to the CSCareerQuestions subreddit[1] 6 days ago, and there's a bit of discussion there that may interest people as well.
I fear there is a new generation of developers who have jumped into programming using Node.js as their core language and fit the description you mention: "On more occasions than I could count, the candidate for this type of role would have some cool technologies listed on their resume and completed an adequate coding test which is part of our hiring procedure." If you don't know algorithmic complexity, don't know how to run all sorts of tests, and don't have any clue about security measures, but can set up a Node.js Express webserver in 5 minutes, you are a script kiddie at best. Unfortunately these types of developers are now getting jobs because it's the new web development buzzword.
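For readers who haven't seen it, this is roughly the five-minute Express server being described (a hedged sketch, not anyone's production code); getting this far says nothing about complexity analysis, testing, or security.

```typescript
// The canonical five-minute Node.js/Express server: one route, one port.
// Knowing this much is table stakes, not seniority.
import express from "express";

const app = express();

app.get("/", (_req, res) => {
  res.send("Hello, world");
});

app.listen(3000, () => {
  console.log("Listening on http://localhost:3000");
});
```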
What new technological advances in back-end web development have occurred in the last five years? Why should senior developers spend their spare time learning new ways of doing exactly the same thing?