Hacker News | jm547ster's comments

Yeah, the Linux language gets most people's goats...


Sounds like you picked some obscure tasks to test it that would obviously have low representation in the training data? That's not to say it can't help augment less-represented frameworks/tools - you'll just need to equip it with better context (MCPs/docs/instruction files).

A key skill in using an agentic LLM tool is being discerning about which tasks to delegate to it and which to take on yourself. Try to develop that skill and maybe you'll have better luck.


The opposite may be true: the more effective the model, the lazier the prompting, since it can seemingly handle not being micromanaged the way earlier versions had to be.


I can guess your background (and probably age) from this comment


Finishing sentences with a full stop would put me above 30, yes.

EDIT: incidentally, Suchir Balaji was 26 when he held those views.


Aider is not agentic - it is interactive by design. Copilot agent mode and Cline would be better comparisons.


OpenAI launched Codex two days ago; there are already open forks that support other providers too.

There are also Claude Code proxies to run it on local LLMs.

You can just do things.


Phase is relative, you are trying to sound intelligent


Of a single sinusoidal component, sure, this is true. But phase differences between sonic features are absolutely detectable.

The effect is most noticeable on raw synthesized tones: sawtooth, square wave, etc. These tones have sonic energy concentrated at the discontinuities in their waveforms, which the ear hears as a "buzzing" sound.

Run these tones through Paulstretch (even with 0 stretch), and the sonic energy is distributed throughout the wavecycle. These tones retain their spectral character, but noticeably lose the buzzing character.
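The underlying effect is easy to demonstrate without Paulstretch itself. Here is a minimal NumPy sketch (my own illustration, not Paulstretch's actual algorithm) that keeps a sawtooth's magnitude spectrum but randomizes its phase spectrum - the magnitudes are identical, yet the waveform shape, with its energy-concentrating discontinuities, is destroyed:

```python
import numpy as np

sr = 8000                      # sample rate (Hz), arbitrary for this demo
f0 = 55                        # fundamental (Hz), matching the demo file
t = np.arange(sr) / sr         # one second of samples
saw = 2 * (t * f0 % 1.0) - 1   # naive (aliased) sawtooth; fine for illustration

spectrum = np.fft.rfft(saw)
mags = np.abs(spectrum)

rng = np.random.default_rng(0)
phases = rng.uniform(0, 2 * np.pi, mags.shape)
phases[0] = 0.0                # keep DC bin real
phases[-1] = 0.0               # keep Nyquist bin real
scrambled = np.fft.irfft(mags * np.exp(1j * phases), n=len(saw))

# Same magnitude spectrum, very different waveform: the energy is no
# longer concentrated at the sawtooth's discontinuities.
print("spectra match:", np.allclose(np.abs(np.fft.rfft(scrambled)), mags,
                                    rtol=1e-6, atol=1e-6))
print("waveforms match:", np.allclose(scrambled, saw, atol=0.1))
```

Played back, `scrambled` has the same pitch and timbral brightness as `saw` but loses the buzz, which is exactly the difference audible in the demo below.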

I've uploaded a demo here: https://chris.pacejo.net/temp/phase.wav It is a 55 Hz sawtooth tone, alternating every 2.5 s between the raw tone, and the tone fed through Paulstretch with no stretching.

There was even a paper written on this. Laitinen, Disch & Pulkki, "Sensitivity of Human Hearing to Changes in Phase Spectrum". [1]

Paulstretch muddies up percussive transients (like hi hat strikes) as well.

Anyway it's the reason things like gammatone filters exist for analyzing audio. They reveal phase correlations in the same way the ear is able to. Windowed Fourier transforms (used by e.g. Paulstretch and Audacity for various purposes) obfuscate these relationships.
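For concreteness, here is a short sketch (mine, not taken from any particular toolkit) of a 4th-order gammatone impulse response, using the standard Glasberg & Moore ERB bandwidth formula; the bandwidth value and normalization here are illustrative choices:

```python
import numpy as np

# Gammatone impulse response:
#   g(t) = t^(n-1) * exp(-2*pi*b*t) * cos(2*pi*fc*t)
# with bandwidth b tied to the ERB scale:
#   ERB(f) = 24.7 * (4.37*f/1000 + 1),  b = 1.019 * ERB(fc)

def gammatone_ir(fc, sr=16000, dur=0.05, order=4):
    t = np.arange(int(sr * dur)) / sr
    erb = 24.7 * (4.37 * fc / 1000.0 + 1.0)
    b = 1.019 * erb
    g = t ** (order - 1) * np.exp(-2 * np.pi * b * t) * np.cos(2 * np.pi * fc * t)
    return g / np.max(np.abs(g))   # normalize peak amplitude to 1

ir = gammatone_ir(1000.0)
# Convolving audio with `ir` gives one channel of a cochlea-like
# filterbank; a bank of these at ERB-spaced center frequencies keeps
# within-band phase structure that windowed-FFT magnitudes throw away.
```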

Aside: please try to avoid snarky armchair dismissals on HN: https://news.ycombinator.com/newsguidelines.html "you are trying to sound intelligent" does not advance discourse.

[1] https://www.researchgate.net/profile/Ville-Pulkki/publicatio...


Seems to have been heavily downvoted as well; it's flown off the front page. Times have changed for HN. Also a double standard compared with the likes of DeepSeek R1 earlier this week :shrug:


CIA/NSA? The KGB hasn't existed for more than 30 years; no doubt the FSB is engaged in similar pursuits.


If you're ever in Limerick you can pick up some OG KFC-recipe chicken https://www.ilovelimerick.ie/chicken-hut/


Apple Music if you're in that ecosystem


What ecosystem? It’s on Android and Windows too, and has a web version for Linux use.

