This is too reductionist. When you go to work, do you go to maximize shareholder value? Were you ever part of a team and felt good about the work you were doing together? Was it because you were maximizing shareholder value?

> When you go to work, do you go to maximize shareholder value?

Yes. The further up the ladder you go, the more this is pounded into your head. I was at a few Big Tech companies and this is how you write your self-assessment: "Increased $$$ revenue due to higher user engagement, shipped xxx product that generated xxx in sales," etc.

If you're a level 1/2 engineer, sure. You get sold on the company mission. But once you're at a senior level, you are exposed to how the product/features will maximize the company's financial and market position, and to how each engineer's hours directly benefit the company.

> Were you ever part of a team and felt good about the work you were doing together?

Maybe some startups or non-profits can have this (like Wikipedia or Craigslist), but definitely not OpenAI, Google and Meta.


Of course the business needs to work as a business too. I'm not saying that's not real, I'm saying it's reductionist to say it's only that.

Put another way, you need to have an answer to the question: Why should I work towards optimizing the success of this business rather than another one's?

If there isn't a great answer to this, you'll have employees with no shared sense of direction and no motivation.


Most of the work I as an engineer do is jumping through hoops that engineers from other departments have drawn up. If someone up high really cared, wouldn't they have us work on something that matters?

I usually try not to be so cynical but just couldn't resist here: What if the future of human connection is to replace it with parasocial relationships that can be monetized?

That said, I am not cynical about mission statements like that per se; I do think that making large organizations work towards a common goal is a very difficult problem. Unless you're going to have a hierarchical command-and-control system in place, you need to do it through shared culture and mission.


The counterargument used to be that the heavy lifting would be offloaded to Python modules written in C, like numpy.

Which was true, but maybe not the strongest argument. Why not use a faster language in the first place?

But it's different now. There are huge classes of problems where pytorch, jax & co. are the only options that don't suck.

Good luck competing on performance with Python code that uses them.
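
To make the "heavy lifting happens in C" point concrete, here's a rough, unscientific timing sketch (the numbers will vary by machine, but the gap is typically a couple of orders of magnitude):

    import time
    import numpy as np

    n = 10_000_000
    xs = np.random.rand(n)

    # Pure-Python loop: every addition goes through the interpreter.
    t0 = time.perf_counter()
    total = 0.0
    for x in xs:
        total += x
    print("python loop:", time.perf_counter() - t0)

    # The same reduction in numpy: a single call into compiled C code.
    t0 = time.perf_counter()
    total = xs.sum()
    print("numpy sum:  ", time.perf_counter() - t0)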


> Why not use a faster language in the first place?

Well, for the obvious reason that there isn't really anything like a Jupyter notebook for C. I can interactively manipulate and display huge datasets in Python, and without having to buy a MATLAB license. That's why Python took off in this area, really.


I believe I've been hearing that argument since before Jupyter became popular.

Usually it was accompanied by the point that the time needed to write code often matters more than the time it takes to run, which is also often true.

All that said, Jupyter is probably part of Python's success, although I'm not the only one who actively avoids it and views it as a bit of a code smell.


I love Jupyter! What I don’t love is people writing large projects in a notebook, then asking how to run it as-is in production so they can continue to iterate on it in that form.

It’s not impossible, but neither is it the sort of thing you want to encourage.


I agree - the Jupyter notebook is really the key feature Python has that makes it attractive for research/scientific computing. I would say the REPL too, but until very recently it was extremely shoddy, so I doubt many people did any serious work in it.


> I would say the REPL too but until very recently it was extremely shoddy

Can you elaborate? I've been using the Python REPL for more than two decades now, and I've never found it to be "shoddy". Indeed, in pretty much every Python project I work on, one of the first features I add for development is a standard way to load a REPL with all of the objects that the code works with set up properly, so I can inspect them.


Very obvious example - you can't paste code containing blank lines.
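
For instance, a minimal (made-up) snippet like this can't be pasted into the old REPL: the blank line terminates the def, and the indented return then lands on the top-level prompt and raises an IndentationError, even though the same code runs fine from a file.

    def greet(name):
        msg = f"hello {name}"

        return msg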

Another example: navigating the history is done line by line instead of by whole inputs.

It's just bare-minimum effort - probably GNU readline piped directly into the interpreter or something.

I think they did improve it a lot very recently by adopting the REPL from another Python implementation (PyPy, I believe), but I haven't upgraded to that version yet, so I don't know how good it is now.


> probably gnu readline piped directly into the interpreter or something

That is more or less how the REPL originally was implemented. I think there's more under the hood there now.

I still don't think what you describe qualifies as "shoddy". There are certainly limitations to the REPL, but "shoddy" to me implies that it's not really usable. I definitely would not agree with that.


The predecessors were really popular before it too - MATLAB in engineering in particular, and Mathematica in physics and maths departments, where the symbolic functionality was more useful. I used both in academia, and IPython (later renamed Jupyter) was clearly a natural extension of those, but open source and without the baggage of MATLAB (only one function definition per file, etc.).


>Which was true, but maybe not the strongest argument. Why not use a faster language in the first place?

Because most faster languages suck donkey balls when it comes to using them quickly and without ceremony. Never mind trying to teach them to non-programmers (e.g. physics or statistics people)...


It's the internal surface area. Like saying 10 grams of Swiss cheese has X surface area in its holes on average.


> You should also remove any students from classrooms who routinely distract from others' learning.

Might very well be the bored gifted ones...


"...here, let me teach you how to sweep a floor, wise Bartholomew III" [hands intelligent student broom] "...now push. It's a simple task for a simple person."

src: I was a bored gifted one; only swept the floors long enough to want to change my behavior(s). I was also once a teacher for children with behavior issues.


I don't think it's luck. They invested in CUDA long before the AI hype.

They quietly (at first) developed general purpose accelerators for a specific type of parallel compute. It turns out there are more and more applications being discovered for those.

It looks a lot like visionary long term planning to me.

I find myself reaching for Jax more and more in places where you would have used numpy in the past. The performance difference is insane once you learn how to leverage this style of parallelization.


Are you able to share a bit - enough to explain to others doing similar work that this "Jax > numpy" aspect applies to their work (and thus that they'd do well to learn enough Jax to make use of it themselves)?


This should be a good starting point:

https://docs.jax.dev/en/latest/jax.numpy.html

A lot of this really is a drop-in replacement for numpy that runs insanely fast on the GPU.

That said, you do need to adapt to its constraints somewhat. Some things you can't do in jitted functions, and some things need to be done differently.

For example, finding the most common value along some dimension in a matrix on the GPU is often best done by sorting along that dimension and taking a cumulative sum, which sort of blew my mind when I first learnt it.
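
Here's a rough sketch of that idea in jax.numpy, in case it's useful (the function and variable names are mine, and I'm using a cumulative max over run-start indices rather than a plain cumulative sum, but it's the same sort-then-scan trick):

    import jax
    import jax.numpy as jnp

    @jax.jit
    def mode_along_last_axis(x):
        # Sort so equal values end up adjacent.
        s = jnp.sort(x, axis=-1)
        # Mark positions where a new run of equal values starts.
        first = jnp.ones(s.shape[:-1] + (1,), dtype=bool)
        new_run = jnp.concatenate([first, s[..., 1:] != s[..., :-1]], axis=-1)
        # For each position, the index where its run started
        # (cumulative max of the run-start positions).
        idx = jnp.arange(s.shape[-1])
        run_start = jax.lax.cummax(jnp.where(new_run, idx, 0), axis=s.ndim - 1)
        # Length of the run ending at each position.
        run_len = idx - run_start + 1
        # The element at the end of the longest run is the most common value.
        best = jnp.argmax(run_len, axis=-1)
        return jnp.take_along_axis(s, best[..., None], axis=-1)[..., 0]

    x = jnp.array([[3, 1, 3, 2, 3, 1],
                   [5, 5, 2, 2, 2, 7]])
    print(mode_along_last_axis(x))  # [3 2]

Everything stays fixed-shape, so it jits (and vmaps) cleanly, with no data-dependent branching.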


When you look at total real returns over longer time scales, the current period in time is not unusual.

https://totalrealreturns.com/s/SPY

We're not far from the long-term trend of 6.15%/yr on the S&P 500, just slightly above.

The deviation above the long-term trend was much larger in the run-up to 2001. It's like we're in 1997 now.

Not saying a correction isn't likely, just that it's easy to scare yourself by looking at linear, non-inflation-adjusted plots.


> Deutsche Bank recently said that the AI hype is the only thing holding the US stock market together.

Doesn't sound right. The Russell 2000 is up ~10% YTD.


That 10% is largely due to the weaker dollar, though.


I'm perfectly happy reading different, well-argued cases in a magazine even if they contradict each other.


> I've never had any issue with Cursor

Glad it's working for you but I think you might be the only one!

