Transfinity's comments (Hacker News)

I think Target isn't the right comparison here - the skills required for this project are worth much more than minimum wage bagging groceries. If you assume something like $50 an hour (on the low end for a skilled electrician), you get to the $6800 number in the parent post pretty quickly.


Getting certified and hired as a skilled electrician is a lot more complicated and much harder than acquiring the knowledge to be a skilled electrician. There are many people working Target-level jobs with that level of skill in some area.


That number is from 2016, so it's useful for judging whether the project was worth it then, but not whether it would be worth it starting today, since the number has changed in the intervening 9 years. It will keep changing, with an estimate of $80/kWh by 2030.


There's no incentive for an engineer to do that. Saying yes and delivering crap gets you a bonus, hard truths get you shuffled around or made redundant. There's no real consequence for delivering crap, so that's what happens.

Contrast this with other engineering fields, where the engineer is truly responsible for the decisions they make. My civil engineer friends face losing their licenses, fines or jail time if they are found professionally negligent. The same is true of other high stakes professions - think doctors, lawyers, even accountants. It's probably not appropriate for most software engineering roles, but for safety critical systems it doesn't seem far-fetched to me.


The product team for the touchscreen control system scoffs at the engineering team’s concerns because “customers don’t care, they’re wowed by the touchscreen at the dealer lot.” It’s only after purchase that the regret sets in. Product teams know this and exploit it. The business side knows they’re selling a steaming pile to customers and don’t really care for engineering’s concerns. In most situations they’ll override these concerns forcefully. It’s a hard pill to swallow as an engineer in these companies.


"My civil engineer friends face losing their licenses, fines or jail time if they are found professionally negligent."

These standards ought to be applied more widely and done in conjunction with tightened consumer law. In many cases the quality of electronics equipment has gone to the dogs. I could give instances of appliances I use that can only be used in a hobbled mode—numbers of published functions simply don't work—because their firmware bugs are so bad.

These devices are so bad they wouldn't pass as early developmental mockups, let alone early prototypes, in a professional engineering establishment. I'm damned if I know why the hell consumers put up with the situation and haven't revolted; it remains a mystery.

Things won't improve until they do.


I don't think jail time for making software too bloated and slow for your liking is a serious proposal.


I'm not a strong advocate of this, or of jail time in general for non-violent offenders, but as a thought experiment, suppose that Acme Auto releases updates to their cars' software which make the UI more laggy and less intuitive to navigate. After they do, there is a cluster of similar accidents: a distracted driver hits a pedestrian when they should have stopped. These can be shown statistically to affect Acme models with the software update significantly more than any other make of car, and more than Acmes which don't have the update.

A class action lawsuit is started against Acme by both crash victims and drivers. In discovery, correspondence between software engineers is found. Engineer A writes to Product Manager B and says that they don't think the new build is safe, because they were forced to compromise latency performance, and button placement is now more surprising, having changed again. QA Engineer C chimes in and says that since the changes apply to features critical to driving, such as de-misting, they won't be prepared to sign off on the change. PM B says that they have to go with the new version in order to meet internal targets on engagement with entertainment apps. They overrule A and C, as company rules allow them to do.

Do you think B should face any personal consequences within a public justice system? Or Acme is just liable for a big payout and then upper management decide who takes the blame?


"Do you think B should face any personal consequences within a public justice system?"

Yes, he should. Reason: he now knows the consequences of proceeding if the problem is not fixed first (he was told them by Engineer A).

Once aware, everyone has the responsibility to act. The Occupational Health & Safety laws of many jurisdictions are written on exactly this principle. Such laws don't just apply to managers and decision-makers; a floor sweeper who overheard the conversation would also be culpable if it were proven that he did not inform authority of the fact and/or that he had good reason to suspect Management would do nothing.

The same goes for Engineer A: he would still be culpable if, after telling Product Manager B the facts, he knew or had good reason to suspect that Product Manager B or others responsible did not or would not act to fix the problem. Moreover, unlike the floor sweeper, Engineer A has extensive knowledge of the facts and a senior decision-making position (as an engineer, even if not in charge of marketing or production), so the Law would still require him to follow through with senior management and/or external authority until he was satisfied (to the level of his professional ability) that the problem was sufficiently in the hands of responsible others.

Whilst these laws vary between jurisdictions, the common themes are these. First, if anyone, inside or outside the company, knows there is danger and/or the potential for someone to be harmed or killed, then that person has to act, irrespective of anything else, full stop. Second, the more responsible or more knowledgeable someone is about the consequences of something or some process going wrong, the more incumbent it is on that person to act (the floor sweeper in Boeing's factory would not be expected to know the wrong alloy had been used in engine turbine blades, but the engineer would).

These laws were introduced to avoid disasters like the Challenger and the Boeing 737 MAX, and the Purdue Pharma opioid crisis. Unfortunately, the US lags behind in either implementing them or making existing laws sufficiently strong.


> I don't think jail time for making software too bloated and slow for your liking is a serious proposal.

Especially since the boundaries of "making software" is pretty blurry.

Would creating a suboptimal Excel spreadsheet count as "making software too bloated"? A pretty strong case could be made for that.

Would creating a clunky personal homepage count as "making software too bloated"? A pretty strong case could be made for that.


You don't find the idea both kinda hilarious and somehow vaguely appealing even though it's a bit nonserious at the same time? I love it as a thought experiment.


Ideas that are silly thought experiments have a way of actually getting implemented once too many people start paying attention to them.


"too bloated and slow for your liking"

I neither said nor implied this. To be clear, the products in question were sold under false pretenses, as they were sold with features that, as far as the lay consumer is concerned, don't exist (that I'm a technical person who knows they are almost certainly software bugs and/or deviations from the design specifications is immaterial). In essence, by deliberately selling a substandard product they've committed fraud.

Here's one of the many examples I could list, and it's a clear-cut one that's easy to understand. I have three identical PVRs/STBs (Personal Video Recorders/Set Top Boxes) of one brand and type, so the problem is not just a single faulty unit. These are the type to which you add external storage via USB: 2.5" drives or thumb drives.

Advertised on the outside of their boxes is the statement that they will take external storage up to 2TB in size. The scanty manual, if you can call it that, which is sealed in the box and can't be read until one unboxes the device, makes a very clear-cut statement that the maximum limit of external storage is ONLY a 700MB drive (a rather strange limit, methinks), a fraction of what's published on the box. In practice, these units simply will not work with ANY external USB 2.5" rotary or SSD drive, even the smallest current SSDs of 120GB or less, in direct contradiction to what's stated on the box and in the so-called manual.

They will however work with thumb drives up to 128GB (I haven't tried bigger). Incidentally, have you ever seen a 2TB thumb drive? Right, I haven't either.

That's not all: there are software bugs and an UNSTATED limitation that only six programs can be programmed at one time (this is an unheard-of restrictive limit; I've never reached the limit on my other units, although one type, which has other bugs and problems, says its limit is 32).

I also have three other PVRs of a different brand (a well-known international mob). All three have the SAME model number, but two have completely different electronics, and their firmware operates in a totally different fashion to the first (clearly built by a different subcontractor). Even the boxes they came in are all identical.

I discovered this when the first unit failed and I bought two more of the same. Moreover, the first unit wasn't even out of warranty so the second purchase was only about six months on from the first.

To make matters worse, before the first unit failed, and after getting nowhere with the local distributor, I'd hunted around the internet for a firmware upgrade to fix the annoying bugs but couldn't find one (little wonder, if different hardware exists for a given model). The so-called identical replacements are not only operationally very different, but they have so many bugs that they are actually unusable. I'm still working on exchanges/warranties and such.

Those two brands are not alone in having masses of bugs; I've three other brands, five all up, with even more model numbers (yes, I've boxes of these damned things). The bugs in a third brand are so bad that it lets one program the same timeslot on different channels simultaneously; which channel takes precedence and is recorded is pot luck. At other times, about one in three, it fails to record the scheduled program, showing only a black screen (it switches to blank instead of a channel, but give credit where credit's due, it does switch to blank at the correct time)!

And believe it or not, that brand/model has been on the market for several years, and it is still without any firmware upgrades being available.

Here, I've presented only the tip of the iceberg—and that's only the PVR/STB story. Where else would you like me to start?

People should not have to put up with this shit. It wastes time and human effort, not to mention resources, and the environment is clogged up with dead and discarded e-waste and other junk. A simple way around the problem would be to license both companies and their design engineers and threaten them with loss of license for producing junk. With importers, bring in junk and they'd lose their import license.

Implement these rules and most of the problems would soon disappear. In extreme cases where irresponsible designs threaten safety and life then loss of license and jail time would be a just measure.


You'd just get a lot less software as people instituted enough checks to make progress glacial. Not everything needs to be developed like it is a medical device or aerospace software.


Singapore has criminal liability for software malfunctions. I don't think they've sent anyone to jail for a software bug yet, but the law allows for it.


Rightly so, if justified by the consequences—to the extent of causing injury or death.

As with other professions, civil, chemical engineering etc., when the outcomes are the same (people killed or injured etc.) then the punishment should also be the same.

Software design should be no exception to any other profession just because it's common for programs to have bugs.

Moreover, the profession of programming now calls itself Software Engineering; if it wants to play with the Big Boys then it must face the same consequences when things go wrong.


The incentive is having a rewarding job where you develop products you are proud of. Once I have food to eat, this is by far the most important incentive for me and it greatly outweighs e.g my desire for promotions, raises and bonuses. If I can have both, great. If I need to choose one, it’s the fulfilling job and product pride every day.


No thanks. I’ve got a family to support for at least 18 more years, and I could be laid off at any second.

I need to make as much money as possible in as little time as possible, and the best way to do this is to stop worrying and learn to love the bomb.


kinda weird that expressing pride in one's job and prioritizing excellence over financial remuneration would be downvoted so vigorously


Yep agreed. If you raise a flag, you'll be looking for work. Head down and build crap, and you have a job for life. I see it all the time. I've lived it.

Sometimes you have to decide, do I build a better system, or do I feed my family. The craft and world suffer, but...


It should be, at least in countries where Software Engineering actually means something, and not a title that one can easy-peasy claim after a six-week bootcamp.


Oh this again. Yeah a certification is going to solve all the problems.


It will, when it comes with a liability just like any other Engineering position.


I think it's the liability that matters, not the certification -- which usually translates to "X years in a government facility, pretending to learn something which may or may not be misguided and out of date."


I don't usually write in fiction, but with nonfiction and especially technical writing (like O'Reilly books) I find taking notes helpful, and the book itself is the most convenient place to do so. I'll underline important words or phrases, ask questions, raise concerns, and recall definitions from earlier.

I find doing this helps keep me honest about whether I'm understanding what I'm reading or just glossing through it, and it helps pace my engagement. If I can't come up with one question or comment per page, I've probably lost focus.


I've heard that the Kanji make Japanese and Chinese much easier to scan quickly once you're fluent.


I've only learned Japanese as far as the N4 test (the second-lowest of the five levels of the standardized tests), but my experience backs this up. Those tests preferred syllabic symbols over kanji, and that just made them harder for me to read.


Chinese doesn’t have a writing system that mixes different sets of pictographs. It has traditional and simplified versions of the writing system but they’re not intermixed the way Japanese does it.


Such mixing is simply not required in Chinese since words in all Chinese varieties are not inflected. There are some particles that are quite common (like 了 and 子), but they are very easy to write. The Simplified Characters are further optimized for writing speed. Most importantly 儿 for 兒 and 个 for 個, since these characters are very common in Modern Standard Chinese.


Hi! I'm that person! Senior engineer, decade of experience. I've used debuggers in the past, both for running code and looking at core dumps, but I really don't find them to be cost effective for the vast majority of problems. Just write a print statement! So when I switched from C to python and go a couple jobs ago, I never bothered learning how to use the debuggers for those languages. I don't miss them.


I am also this person. I'm a systems programmer (kernel and systems software, often in C, C++, golang, bit of rust, etc)

What I find is that if my code isn't working, I stop what I'm doing. I look at it. I think really hard, I add some print statements and asserts to verify some assumptions, and I iterate a small handful of times to find my faulty assumption and fix it. Many, many times during the 'think hard' and look at the code part, I can fix the bug without any iterations.
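As a minimal sketch of that loop in Python (the function and the assumption being checked are invented for illustration):

```python
def mean_latency(samples):
    # Assumption I want to verify: samples is never empty here.
    assert samples, "faulty assumption: got an empty sample list"
    total = sum(samples)
    print(f"mean_latency: n={len(samples)} total={total}")  # temporary trace
    return total / len(samples)

print(mean_latency([12, 18, 30]))
```

The assert fails loudly at the faulty assumption instead of letting the bug surface somewhere downstream, and the print confirms intermediate state without reaching for a debugger.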

This almost always works if I really understand what I'm doing and I'm being thoughtful.

Sometimes though, I don't know what the hell is going on and I'm in deep waters. In those cases I might use a debugger, but I often feel like I've failed. I almost never use them. When I helped undergrads with debuggers it often felt like their time would be more productively spent reasoning about their code instead of watching it.


Your list of programming languages excluded Java. Please ignore this reply if Java is included.

Are you aware of the amazing Java debugger feature of "Drop to Frame"? Combined with "hot-injection" (compile new code, then inject it into the currently debugged JVM), it is crazy and amazing. (I love C#, but its hot-injection feature is much worse than Java's: more than 50% of the time the C# compiler rejects my hot-injection, while about 80% of the time the JVM accepts it.) When working on source code where it is very difficult to acquire data for the algorithm, having the ability to inspect in a debugger, make minor changes to the algorithm, re-compile, inject new class/method defs, drop to frame, then re-exec the same code in the same debug session is incredibly powerful.


Yes, that sounds pretty cool, and it doesn't take a lot of imagination to see the utility in it. I've done a lot of work on lower-level software, often enough on platforms where the debuggers are tough to get working well anyway.

The plus side of less capable tooling is that it tends to limit how complex software can get; the pain is just too noticeable. I haven't liked Java in the past because it seems very difficult to work with without the tooling, and I never had to do enough Java to learn that stuff. Java's tooling does seem quite excellent once it is mastered.


This part: "I haven't liked java in the past because it seems very difficult without the tooling"

If you are a low level programmer, I understand your sentiment. A piece of advice, when you need to use Java (or other JVM languages), just submit to all the bloat -- use an IDE, like IntelliJ, that needs 4GB+ of RAM. The increase in programmer productivity is a wild ride coming from embedded and kernel programming. (The same can be said for C#.)


I think there's a sort of horseshoe effect where both beginners and some experienced programmers tend to use print statements a lot, only differently.

When you're extremely "fluent" in programming code and good at mentally modelling code state, understanding exactly what the code does by looking at it, stepping through it doesn't typically add all that much.

While I do use a debugger sometimes, I'll more often form a hypothesis by just looking at the code, and test it with a print statement. Using a debugger is much too slow.


> Using a debugger is much too slow.

This varies, but in a lot of environments, using a debugger is much _faster_ than adding a print statement and recompiling (then removing the print statement). Especially when you're trying to look at a complex object/structure/etc where you can't easily print everything.


I think there is a bit of a paradox: the debugger can seem heavyweight, but when you've added enough print statements that you've spent more time and created more cleanup than if you had just taken the time to fire up the debugger, well, you should have debugged. But you don't know until you know. The same thing appears with "it would've been faster not to try to automate/code a fuller solution" than to just address whatever you were doing.


> using a debugger is much _faster_ than adding a print statement and recompiling (then removing the print statement)

Don't remove the print statement. Leave it in, printing conditionally on the log level you set.
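For example, with Python's stdlib logging, the trace stays in the code permanently and only prints when the level is turned up (the function and names here are illustrative):

```python
import logging

logger = logging.getLogger("checkout")

def apply_discount(price, pct):
    discounted = price * (1 - pct / 100)
    # Permanent trace: silent at the default WARNING level,
    # visible once the level is set to DEBUG.
    logger.debug("apply_discount(%r, %r) -> %r", price, pct, discounted)
    return discounted

logging.basicConfig(level=logging.DEBUG)  # flip to logging.INFO to mute
print(apply_discount(200.0, 25))
```

Next time the bug bites, you flip one level setting instead of re-adding the print.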


In what way is using a debugger "slow"? I find that it speeds up iteration time because if my print statements aren't illustrative I have to add new ones and restart, whereas if I'm already sitting in the debugger when my hypothesis is wrong, I can just keep looking elsewhere.


I find I use the stepping debugger less and less as I get more experienced.

Early on it was a godsend. Start program, hit breakpoint, look at values, step a few lines, see values, make conclusions.

Now I rely on print statements. Most of all though, I just don't write code that requires stepping. If it panics it tells me where and looking at it will remind me I forgot some obvious thing. If it gives the wrong answer I place some print statements or asserts to verify assumptions.

Over time I've also created less and less state in my programs. I don't have a zillion variables anymore, intricately dependent on each other. Less spaghetti, more just a bunch of straight tubes or an assembly line.

I think it's possible that over the years I hit problems that couldn't easily be stepped. They got so complicated that even stepping the code didn't help much, it would take ages to really understand. So later programs got simpler, somehow.


I find I use the stepping debugger more and more as I get more experienced. Watching the live control flow and state changes allows me to notice latent defects and fix them before they ever cause actual problems. Developers ought to step through every line of code that they write, of course.


It’s because we moved to request-response, and allow for deep linking / saved states. So the state of the session is more easily reproduced.


> I never bothered learning how to use the debuggers for those languages. I don't miss them.

This could be causal.


You'd have to assume that the python and go debuggers do something that C debuggers don't do.


Or assume that python debuggers aren't as nice to use, or that python does not lend itself to inspecting weird memory and pointer dereferences, or a bunch of other possibilities.


Most of what I worked with code-wise growing up was either very niche or set up in such a way that debuggers weren't an option, so I never really used them much either. I don't understand their appeal when print statements can give you more context to debug with anyway. I'm definitely no senior, but I'm used to solving things the "hard way", as one developer told me. He wondered how I could even work because of how "bad" my tools were, but I didn't know any better, being self-taught, and with certain software the tools he mentioned just aren't compatible anyway.


It depends what you're doing. Sometimes inserting a print and capturing state works. Sometimes you're not sure what you need to capture, or it's going to take a few iterations. That's where pdb / breakpoint() / more interactive debuggers can be very helpful.
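A sketch of that interactive route (the reconcile function is hypothetical): Python's built-in breakpoint() drops you into pdb at the failing spot, where you can inspect state over as many iterations as you need without editing prints in and out:

```python
def reconcile(orders, payments, debug=False):
    """Return orders with no matching payment (hypothetical example)."""
    unmatched = [o for o in orders if o not in payments]
    if unmatched and debug:
        # Drops into pdb right here; `orders`, `payments`, and
        # `unmatched` are all in scope for interactive inspection.
        breakpoint()
    return unmatched

print(reconcile(["a", "b", "c"], ["a", "c"]))
```

With debug=True you land in a pdb prompt exactly when the surprising state exists, rather than guessing in advance what to capture.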


What makes you talented?


I agree. IMO, Java-style OOP conflates 2 different concepts, polymorphism and inheritance.

Polymorphism (including dynamic dispatch and duck typing) is a game changer, in that it encourages simple, stable interfaces, enables testing, encourages encapsulation, etc. It's a key technique for building big projects.
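A small sketch of that duck-typing flavor in Python (the mailer classes are invented for illustration): the calling code depends only on a send method, so a fake can stand in during tests with no shared base class at all:

```python
class SmtpMailer:
    """Real implementation (stubbed out here)."""
    def send(self, to, body):
        raise NotImplementedError("would talk to an SMTP server")

class FakeMailer:
    """Test double satisfying the same implicit interface."""
    def __init__(self):
        self.sent = []
    def send(self, to, body):
        self.sent.append((to, body))

def notify_shipped(mailer, user):
    # Depends only on the `send` interface, not on any base class.
    mailer.send(user, "Your order has shipped")

fake = FakeMailer()
notify_shipped(fake, "alice@example.com")
print(fake.sent)
```

This is the stable-interface and testability win: notify_shipped never changes when the mail transport does.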

Inheritance is just one tool, among many others (things like code generation and composition), for reducing the amount of code written by a human. I haven't seen it unlock other important conceptual domains the way polymorphism does.

Unfortunately many undergraduate curriculums get overly excited about inheritance when teaching OOP. I guess animal-cat-dog is an easy example (though totally unrealistic), but the problems polymorphism solves don't often show up in classroom-sized projects.


I saw a similar effect teaching at a coding school for adults: one of our strongest predictors for success was whether or not the student had someone close to them who was in tech. Having someone outside the system to practice lingo, talk about culture, or give an opinion on what topics matter makes a huge difference. I see no reason why grade school math would be different - I know my parents made a huge difference for me, not by "pushing", but by just being there to talk.


Kids are incredibly perceptive as to "what matters in this family". If parents show an interest in how the kid is doing in school, kids see that school is important. If parents show an interest in whether they won or lost a sports game, kids learn that's important. If parents show no interest but someone else does, whatever that person values takes on an outsized importance in the kids' life.

"Just being there to talk" is more than half the battle (but also more than half the time investment) pretty much no matter the topic.


I will occasionally file bug reports or even submit fixes to open source projects as part of doing my job. As an engineer, one of my responsibilities is to maintain the commons - without it I would be much less productive. On the other hand I have a lot of autonomy in my work, a chill boss and a team that doesn't need to crunch. If someone I report to got grumpy I would stop, and I definitely don't do this on my own time.


> The “silent majority” was used by President Richard Nixon during his presidency and his campaign against the Vietnam war.

Excuse me? The "silent majority" was the set of Americans who were for the war, not (loudly) protesting against it. Nixon was all for continuing the war.


That's literally what it says in the very next sentence.

"In this usage, it referred to those Americans who did not join in the large demonstrations against the Vietnam War at the time"


It was stealthily changed. Someone else quoted the original further down in the comments

> The “silent majority” was used by President Richard Nixon during his presidency and his campaign against the Vietnam war. He spoke to the people who were not actively voicing their opinions and who were overshadowed by the vocal few who were supporting the war.


That depends heavily on the shape of your data, what your workload looks like, what sort of consistency guarantees you need, etc. I recommend Designing Data Intensive Applications for getting a handle on this - it's the book I suggest to SDE IIs who are hungry for the jump to senior. Not a quick read, but well written, and there's not really a shortcut to deep understanding other than deep study.


> I suggest to SDE IIs who are hungry for the jump to senior

Any suggestions for seniors who are hungry for a staff title? Is there any value to specializing in tech anymore after senior, or is it all 'soft skills' at this point?


It is not all any one thing. If you’re completely lacking in soft skills, you’ll probably have more trouble the higher up you go, and you may get stuck at some point. Hard technical skills become less important the higher you go, but even the CEO of a tech company probably still needs to have some basic understanding of technology. Being good at both things is obviously the best way forward.


I enjoyed this: https://staffeng.com/book

Not a staff engineer myself though, so I can't attest to how true/accurate/effective it is.


thanks, will look that up

