I suspect it's not as easy as many people make out.
I was a great C++/OO dev for 10 years before I got bored and started doing management, business-focused roles and more high-level devops-style work. After 10 years of that I wanted to move back into pure coding. With great experience it should be easy, right? Wrong.
I got a mid-level Java/Python dev job and it was difficult. I was out of touch and everything was different or new. Languages, styles, CI/CD, DI, git, containers, unit testing: it's a huge amount to learn. After a year I got laid off because I was getting paid like a senior but not keeping up with the grads.
A few years later I'm productive and useful in this new world, but I don't really like it. I'm enjoying Scala and functional programming, but with so many libraries and tools I feel like everything is so difficult and complicated. It takes a lot of study effort to keep up. Also, I'm never sure if it's because the applications I work on are badly designed, or I just don't really understand modern design. I have a business specialty which keeps me employable, but I miss the old days when things were simpler.
Being "old" at 40+ really isn't so easy - I'm not sure you can ever redeem yourself in the eyes of the industry. Best you can hope for is get a non-tech domain specialty and find a big stable company that values experience and try to keep working on interesting projects. Once you're laid off or fired once its really hard to be the super confident hacker you were at 25.
EDIT - thinking about it, if you want advice: get a business specialty or technical niche. Don't get too lazy; if you aren't learning on the job for a few years in a row, change the tech in the project or leave. It should be easy to stay employed, but you have to keep working at it. The best career move money-wise is management, but it's difficult to move back. Don't take the high-paying job on a dead-end project without a plan to get out.
I'm only 32 but that's kind of where I'm at right now too.
Mastered Objective-C and Mac & iOS development 10 years ago, now it's all Swift and nobody's really hiring iOS devs in my area (Chicago suburbs), and nobody's hiring native Mac developers anywhere, period. And all the best practices in iOS have changed drastically since then, too, in terms of both coding and UX.
Learned Ruby on Rails 8 years ago, but it changed so fast that most of what I knew about it has become irrelevant, and I never was very good at Rails in the first place.
All the best practices I've learned in HTML/CSS/JS/jQuery/Less/Sass are becoming outdated pretty quickly.
Spent 5 years mastering Clojure but it's obviously very niche and I don't have any experience with big data or anything else Clojure is usually used for, only traditional web apps.
It feels like there's no way to keep up with the industry while staying relevant and employable.
When you get into technology, you should operate under the assumption that you will be a student for the rest of your life. Otherwise you will get left behind. My Dad is in tech (I followed in his footsteps). For as long as I can remember growing up, he always had a book with him in his free time. He started with punchcards.
I don't mind learning, definitely. I got into this field because I'm passionate about software and I enjoy programming a lot. But while I have a full time job and a large-ish family to support, it's hard to fit "3 years of professional React.js experience" into my spare time to put on my resume, so that when nobody's hiring iOS developers anymore, I can still get a job. That's what I'm talking about. A lot of the skills are very transferable, but I've already been turned down for a few jobs simply because I just don't have the in-production experience with the exact technology they're hiring for, even though I could pick it up pretty quickly.
Want to get some production experience with react? Tons of webdev projects are using it at Mozilla. Come on over and find yourself a good first bug :-)
"But while I have a full time job and a large-ish family to support, it's hard to fit "3 years of professional React.js experience" into my spare time ..."
I'm sure you've got the bugs, but the gp ain't got the time
I was in your position - I'd been working with iOS since the beginning, until I decided to just get the fuck out of iOS and start doing some useful things.
Went all in with building single page applications in JavaScript and Node, front end, back end, and database administration, and haven’t looked back.
Fuck iOS now, native mobile development benefited from a craze where everyone thought mobile apps were the new web apps, but the truth is, most mobile apps only make sense in the context of a larger application ecosystem, usually supporting a web app.
Node and javascript? Those technologies change and move faster than anything else in the industry. What you're doing now will likely be obsolete within a couple years.
In most jobs you will not get a foot in the door if all you claim to know is fundamentals. If you don't know the languages or frameworks they work in, the interview is over - they are not going to spend the time training you. They want you to get in and be effective as soon as possible, not pay for your on-the-job training.
Personally I have found that this is relevant only for front end work. For the backend you just need rudimentary knowledge of a framework/language to get your foot in the door, then the actual testing is language agnostic.
I'm willing to bet that Node/JS devs actually have an easier time staying current, since their workplace actually has a need to change technologies, so they're getting paid to keep rewriting their web app in $the_latest_framework, which they can then put on their resume.
I know a few iOS devs in Chicago. I've definitely heard about the trials and tribulations of switching from Objective-C to Swift, but I'm surprised there aren't more jobs out there. Is the commute to Chicago too long for you?
The iOS devs I know are either in marketing or work freelance though. So I'm not sure how much like "a normal job" it is for them.
HN doesn't have any private message functionality but many people (including the poster you replied to) have a means of contact in their profile - just click on the user's name.
> Also, I'm never sure if it's because the applications I work on are badly designed, or I just don't really understand modern design.
FWIW, I think there are a lot of people out there who would claim that Scala is a bit of a step backwards for the professional community. It's a fine language, but not a (socially) scalable one due to its immense expressiveness.
Scala is rough IMO not so much because of the expressiveness, but because you have at least four paradigms in common use, none of which play well with each other. Let's see, I count: (1) Futures/async, (2) threads, (3) Akka/Play/actors, (4) Finagle, (5) Reactive Streams.
There are so many impedance mismatches between them and so many diamond dependency conflicts that it's impossible to get any moderate size project done without going to microservices so that e.g. your Redis client and your HTTP server aren't complaining about different subtly incompatible versions of Netty.
> Languages, styles, CI/CD, DI, git, containers, unit testing: it's a huge amount to learn
I feel that a key trend in modern development is a needless focus on mastering the complexity of the coding infrastructure/deployment pipeline while ignoring the actual business needs. Not suggesting these practices are useless by any means, but for most startups it's putting the cart before the horse.
> After 10 years of that I wanted to move back into pure coding. With great experience it should be easy, right? Wrong.
The trouble is that the "just coding" experience gets outdated very quickly. Your knowledge of the Win32 API or WPF isn't really relevant on a Python job. Interestingly, the skills that don't get outdated mostly lie outside the technical part: your ability to manage people and resolve conflicts, your ability to present ideas and convince others, your ability to spot business niches and shape products under limited resources.
In the old times one could argue that being a good engineer (i.e. finding simple solutions to tough problems) made a difference, but I'm not sure how much this holds anymore for software engineers, since the complexity isn't about fitting your business processes into the SQL/backend/frontend bounds, but rather about defining those processes and making proper assumptions about the market.
You might want to look into Python, specifically in the data science / machine learning area. So far the philosophy there seems much more pragmatic to me: if you can accomplish something with 10 lines of code, it seems acceptable in the community to use just 10 lines of code, rather than writing some elaborate framework of 30 classes, interfaces, factories, DI, and unit testing as if you're sending people to the moon, all spread over numerous application tiers for maximum obfuscation.
Also, due to the uncertainty inherent in ML, I find there's a much lower prevalence of dogmatic know-it-all zealots who insist you must(!!!!) do something a certain way, because <a whole bunch of reasons that don't make any sense>, who inevitably will have moved on by the time the entire thing starts to collapse on top of itself.
> Once you're laid off or fired, it's really hard to be the super-confident hacker you were at 25.
This is the point of all those self-help books about grit and things like that. Their point isn't really to accelerate your progress when you're feeling good and confident. Their purpose is to help you regain the confidence once you lose it, to make you robust to setbacks.
It sounds like you were away from coding for a really long time (10 years!), then you got back into it, picking an area you didn't know much about, expected it to be easy, maybe didn't put in the time outside of work...this approach wouldn't work for anyone, no matter their age.
Here's a manager-y question for you: how could you have done things differently when you went back to being an IC, to have had a better transition? Maybe doing more prep work before you left the manager track? Maybe going back to C++ instead, or an area that was closer to what you used to work on? Maybe doing open source work in that new java/python target area, to get experience, build a portfolio, and be sure you liked it, before you bet your livelihood and reputation on it?
Dependency Injection. It's an overcomplicated term for a really trivial pattern in development.
    function addOp(a, b, op) {
      return op(a) + op(b);
    }
Here op is a function being injected into another function. Nobody uses the term in functional/procedural programming because this pattern is obvious and ubiquitous in higher-order functions like map, reduce, or filter.
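For example, in Python the function you depend on is just handed in as an argument:

    # the abs function is "injected" into map, a higher-order function
    print(list(map(abs, [-1, -2, 3])))   # [1, 2, 3]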
When people use the term DI, though, it's in the context of OOP, where a dependency is an object getting injected into another object through a constructor. People using this pattern tend to create programs littered with very abstract objects that can only work when injected with dependencies. The injected objects may themselves be DI-dependent, leading to crazy dependency chains (reminds me of inheritance, ugh). All of this is done in the name of "modularity" and "unit testing." As you can imagine, this pattern produces a sort of false composability, where your code is littered with very abstract objects that are rarely ever reusable. If you want to compose object A and object B, you have to specifically design/code object A so that it can handle object B, or vice versa. Because all object composition involves custom modifications to the code of an object, it is a sort of broken implementation of modularity and reusability.
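Roughly what I mean, as a Python sketch (class names made up just to show the chain):

    # each object only works once its dependency is injected, and that
    # dependency needs its own dependency injected, and so on
    class Connection:
        pass

    class Repository:
        def __init__(self, connection):
            self.connection = connection

    class Service:
        def __init__(self, repository):
            self.repository = repository

    class Handler:
        def __init__(self, service):
            self.service = service

    # wiring the whole chain up by hand at the "composition root"
    handler = Handler(Service(Repository(Connection())))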
In a function, the only requirement for composability is matching types. All functions that return ints can be composed with one another without additional modifications. This is true composability and true modularity. The addOp example above doesn't need dependency injection; the same result can be achieved with this:
    function add(a, b) {
      return a + b;
    }

    add(op(a), op(b));
So you can see why the OP listed DI as one of his complaints. He must be dealing with a highly Object Oriented code base that employs this pattern extensively.
> It's an over complicated term for a really trivial pattern in development.
Yes, but...
The objective of most business-oriented programs is to create the One True Function(tm), usually known to developers as the "application." For everyone's sanity, this should be composed of smaller functions, which are sometimes called "objects." An object is just a function that was created using a specific creation template, usually called a "constructor."
For a moderately sized application, the One True Function might be a tree of sub-functions with hundreds or thousands of branches. Sometimes you might want to write the straightforward boilerplate code to attach items to this tree, but that can get tedious to maintain.
Other times, you might find a library that will automatically assemble your tree for you, as long as you specify up front in configuration that all of your fruits are apples and all of your leaves are green. This program is usually called a "DI framework."
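As a rough illustration (a toy sketch in Python, not any particular framework), the "assemble the tree for me" idea looks something like this:

    import inspect

    # toy container: builds an object by looking at the type hints on its
    # constructor and recursively building whatever those hints name
    class Container:
        def resolve(self, cls):
            params = inspect.signature(cls.__init__).parameters
            args = [
                self.resolve(p.annotation)
                for name, p in params.items()
                if name != "self" and p.annotation is not inspect.Parameter.empty
            ]
            return cls(*args)

    class Database:
        pass

    class UserService:
        def __init__(self, db: Database):
            self.db = db

    # the container walks the tree and wires everything up for you
    service = Container().resolve(UserService)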
Hopefully, this saves you a lot of typing, which you can then use to type up HN comments.
Objects and functions are different. An object retains state; a function should be stateless. Objects cannot be evaluated and do not have return values.
I think conflating data structures and objects is bad, and muddles the discussion. An app is mostly a pipeline of functions over an initial data structure.
I think you forgot the word mutable somewhere. A true Business Application(tm) has no mutable state that is not stored in the database, so all objects are immutable and therefore functions (except those dirty dirty data objects).
some_object.apply(method_name, param, ...) // sure looks like evaluation to me
You never mentioned immutability. If objects were immutable, then yes, you are correct.
I'm not sure what you mean by "true business application", but there is a transient form of mutable state that an object possesses in between processing a request and returning a response that is separate from the state the database holds. It is up to you whether you want that transient state to be mutable or not.
An object itself cannot be evaluated. Methods can be evaluated but not the object itself. Is your example referring to a specific language?... because in general objects are not functions.
Python disagrees with you. Given a Python script with a single class, if you instantiate an object and try to "call" or "evaluate" it:
    class A(object):
        def __init__(self):
            pass

    x = A()
    x()
you get this output to stderr:
    Traceback (most recent call last):
      File "python_temp.py", line 6, in <module>
        x()
    TypeError: 'A' object is not callable
Of course you can mess with the __call__ magic method, but that's more of a trick than a standard. You may be referring to how functions in Python are implemented as callable objects rather than primitives. That's an implementation detail that is language specific, similar to how in JavaScript functions are also objects.
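For reference, that trick looks like this:

    class B(object):
        def __call__(self):
            # defining __call__ is what makes an instance callable
            return "called"

    y = B()
    y()   # works, returns "called"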
In standard programming vernacular and practice, objects and functions are two different primitives. They are not the same thing. An object is a noun, a function is a verb, and just like their English grammar counterparts, functions and objects are primitives because a noun is not really a special kind of verb, nor is a verb a special case of a noun.
DI is Dependency Injection, which is a $1,000 term for a $5 concept.
A Python class without DI:
    class SomeClass(object):
        def __init__(self):
            self.someDependency = GetDependency()
A Python class WITH DI:
    class SomeClass(object):
        def __init__(self, dependency):
            self.someDependency = dependency
            ...
That's it. DI is just explicitly passing ("injecting") a dependent object into an object, rather than requiring the object to call a function to get a reference/pointer to the dependency or make one.
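The practical payoff (a quick sketch, reusing the class above with a made-up stub) is that a test can hand in whatever it wants:

    class FakeDependency(object):
        # stand-in used only by the test; do_work is made up for the example
        def do_work(self):
            return "canned result"

    # no real dependency needed to exercise SomeClass
    obj = SomeClass(FakeDependency())
    assert obj.someDependency.do_work() == "canned result"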
I'm guessing Dependency Injection? In context meaning writing code that's easily testable, and writing lots of tests, which goes hand in hand with the other TDD/BDD buzzwords that job descriptions often throw out there.