
The link at the bottom of the update post should probably point here https://ochagavia.nl/blog/solving-the-jit-calculator-challen... instead of back to the article


Oops, good catch! Thanks!


What happens if you do:

x = read_guess(); y = io.read_int("what's your name: "); z = read_guess();

How would your system ensure x and z are the same?


You don't have that API. You offer

    x = read_guess(0)
    y = read_name("what's your name: ", len(x))
    z = read_guess(len(x)+len(y))
If you read(0) twice, you get the same value both times.


But now every function that reads from stdin needs to pass around an offset of where to read from, which is very unergonomic. It also isn't really pure, since the offset is now state that changes whenever a function is called.

    fn do_stuff(offset: int) -> (string, int) {
      x = read_guess(offset)
      y = read_name("what's your name: ", offset + len(x))
      z = read_guess(offset + len(x) + len(y))
      new_offset = offset + len(x) + len(y) + len(z)
      return ("name is: {y}, guess is: {x}", new_offset)
    }


Ergonomics isn't really the point of a hypothetical extreme system. And no, you can compute offset as a result of all inputs to the system, like I did in my example - just keep passing a "previous state" value around, and compute the next state and return it. Like any other functional system already does 99% of the time.
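The state-threading idea can be sketched in TypeScript (all names here are invented for illustration; this is a toy model of input as an explicit state value, not anyone's actual API):

```typescript
// Hypothetical sketch: "reading" is a pure function of an explicit state value,
// so evaluating it twice with the same state yields the same value both times.
type IOState = { input: string[]; offset: number };

// Deterministic: same state in, same value and next state out.
function read(state: IOState): [string, IOState] {
  const value = state.input[state.offset];
  return [value, { input: state.input, offset: state.offset + 1 }];
}

// Thread the state through explicitly and return the next state,
// as the comment describes.
function doStuff(s0: IOState): [string, IOState] {
  const [x, s1] = read(s0); // guess
  const [y, s2] = read(s1); // name
  return [`name is: ${y}, guess is: ${x}`, s2];
}

const start: IOState = { input: ["42", "Alice"], offset: 0 };
console.log(doStuff(start)[0]); // "name is: Alice, guess is: 42"
```

Nothing here mutates, so re-running `read(start)` always gives the same answer, which is the referential-transparency property the thread is arguing about.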


This is an interesting approach, but it has some issues with more advanced effects. For example, if you have a function that uses mutable state, you would need to have the state both as an input and an output (for the original and the updated state): `update_mut: State -> String -> State`. This isn't very ergonomic and is probably worse than just `update_mut: String -> State ()`. If you commit to just having the parameters act as "markers" that don't actually relate to the implementation, then you can get away with `update_mut: State -> String -> ()`, but then you lose one of the big benefits of effect systems, which is that you can change the way effects work and have multiple different implementations.


You're approaching this from a PL design standpoint, where the distinction is important, but from a user perspective it doesn't matter whether it's just "syntax sugar" or super complicated to implement; all that matters is whether the feature is available or not.


Typing features affect the way we design APIs. Libraries written in languages with type classes and without them can have completely different designs. If nested pattern matching is not available, this will not affect the APIs, only the function bodies -- because desugaring is local by definition.


That doesn't matter in practice. If two programming languages have the same underlying feature but one has syntactic sugar to make it very easy to use and the other does not (so is quite cumbersome to use) then you'll find that the library ecosystem for the former language will see the feature in widespread use whereas the ecosystem of the latter will tend to shun the feature.

This is one of the social factors of programming language design and it's one of the main reasons successful programming languages work so hard to establish a coherent philosophy and a set of best practices or idioms within the language. For similar reasons, I believe this is why "anything goes" languages such as LISP have struggled to gain widespread adoption: with no philosophy every programmer becomes an island unto themselves.


> "anything goes" languages such as LISP have struggled to gain widespread adoption: with no philosophy every programmer becomes an island unto themselves.

There are already two misconceptions.

First: "Lisp has no programming philosophies" and styles.

Not every program starts from zero. Lisp has existed since the late 1950s, so it has seen quite a lot of programming styles over the years and may contain traces of several. Generally it may support more than one programming paradigm. For example, during the Common Lisp standardization there was a wish to have a standardized object system. So instead of the multiple possible approaches (actors, message passing, prototype-based, ...), Common Lisp has just one: CLOS, the Common Lisp Object System. So, much of the object-oriented code written in CL is implemented in one particular object system: CLOS. Object Lisp, Flavors, LOOPS, Common Objects, and a bunch of others had thus been replaced by one standard.

CLOS also defines a bunch of user-level macros: DEFCLASS, DEFMETHOD, DEFGENERIC, ... Everyone using CL & CLOS will use those macros.

Second: "every programmer becomes an island unto themselves". If we look at the way CLOS was designed: there was a core group of six people from three companies. Around that there was a mailing-list based communication with a large group of interested people. Early on a prototype was implemented as a portable implementation of CLOS. This was widely distributed among interested parties: implementors, companies, research groups, ... Then reports about the language extension and its layers were published, books were published, application & library code was published.

One of the famous books coming out of this effort is "The Art of the Meta-Object Protocol". It also contained a toy implementation of CLOS in Common Lisp. The book and the implementations of CLOS (both the larger prototype and the toy implementation) showed, with excellent quality, how to write object-oriented Lisp code.

https://mitpress.mit.edu/9780262610742/the-art-of-the-metaob...

So, there are communities, which share code and coding styles. Not every programmer is alone and starts from zero.


> First: "Lisp has no programming philosophies" and styles

You misquoted me. I said no philosophy, singular. In the programming language context, a philosophy is a convention or a standard. Just as many standards imply that there is no standard, many philosophies imply no philosophy.

Everything else you said is evidence for my premise. Hire 3 different programmers, one from each of the communities, and you might as well have 3 different programming languages. That’s not a standard. That’s not a philosophy. That’s anything goes!


There is rather no evidence for your premise.

Most of the Common Lisp code that is accessible via public repositories conforms to conventions and is understandable.

Lisp programmers are highly motivated toward encouraging collaboration, since there aren't that many people in Lisp where you can afford to be turning people away toward projects that are easier to get into.

Also, you can easily hire 3 developers and get 3 different languages in, oh, Java or JavaScript. One person is doing Kotlin, another one Scala, ...

Three C++ programmers in the same room could also not understand each other. The greybeard speaking only C++98 with a bit of 2003 doesn't grok the words coming out of the C++20 girl's mouth and so it goes.


> You misquoted me. I said no philosophy, singular. In the programming language context, a philosophy is a convention or a standard. Just as many standards implies that there is no standard, many philosophies implies no philosophy.

That makes no sense.

> Hire 3 different programmers, one from each of the communities, and you might as well have 3 different programming languages.

Maybe not. They build on the same foundation: a language with a large standard, which has been largely unchanged for three decades. A language which can be incrementally extended without invalidating the rest. A language where extensions can be embedded without invalidating the rest of the language or its tools. Many projects use SBCL (now itself 25 years old and only incrementally grown) and a bunch of core libraries for it.

> That’s not a standard. That’s not a philosophy. That’s anything goes!

Most languages support widely different software development practices. Take JavaScript: it includes imperative, object-oriented and functional elements (similar to Lisp). It has huge amounts of competing frameworks (many more than any Lisp), where many of them have been superseded and many of them are built on a multitude of other libraries. The developer can pick and choose. Each project will be different from other projects, depending on which libraries and programming frameworks it uses - and which of those the developer re-invents.

Any half-way powerful language (C++ -> templates, Ruby -> meta objects, Java -> objects & byte code & reflection & class loader, Smalltalk -> meta objects, Rust -> macros, C++ -> language interpreters, Java -> external configuration languages, C -> macro processor, ...) has ways to adapt the language to a certain style & domain.

Any large Java framework defines new standards, new configuration mechanisms, new ways to use new Java features (lambdas, streams, ...). See the large list of features added to the Java language over time: https://en.wikipedia.org/wiki/Java_version_history For many of them there were competing proposals. Each Java code base will use some subset/superset of these features, depending on what is needed and what the personal preferences are. And then the Java architect in the project will not be satisfied and will invent yet another configuration system, this time not using XML or JSON for the syntax, but developing a new embedded scripting language for the JVM and integrating that with his software. I have seen Java architects who eventually didn't understand their own configuration system any more.

If you think a language like Common Lisp is special here, then in reality it is not. It's just slightly different in that it has extensibility as one of its philosophies and provides defined interfaces for that. There is one macro mechanism, which is powerful enough, that for decades it has not been replaced. Every syntactic extension mechanism will use this macro system, which is documented, stable and widely used.


> I believe this is why "anything goes" languages such as LISP

Why do you think that Lisp is an "anything goes" language? What's your baseline? I think that C is no less an "anything goes" language, but with a much less pleasant UI.

> with no philosophy every programmer becomes an island unto themselves

Some people actually think that Lispers tend to be too philosophical.


Lisp is anything goes because of macros. Hire ten different Lisp programmers and you’ll get ten different domain specific languages and no one will understand what everyone else has done. If there is a unifying philosophy of Lisp, it’s probably “don’t use macros” and yet so many ignore it!

For all its faults, C is quite easy to read and understand, even by beginner C programmers. Yes, C also has macros but their clumsiness helps to discourage their use.


I'm a Common Lisp programmer. I don't really know how to respond to that except to say, what you describe is not the case at all. Reading others' Common Lisp code is a joy compared to reading others' code in any other language.


Yes, features that are easy to use will be more often used, while inconvenient features will be less used. I don't quite see any controversy with my comment.


The point is that in both cases the underlying feature is present, so APIs will be compatible. However, the lack of syntactic sugar in the one case will make any API that uses the feature cumbersome to use in that language, so in practice it will be avoided.


Abstractly this is true, but software development is a human practice, so it matters not what's technically possible but what people actually do.

That's why the most important difference between C++ and Rust isn't some technicality even though the technical differences are huge, it's cultural. Rust has a Safety Culture and everything else is subservient to that difference.

Sugar matters, Rust's familiar looking loops are just sugar, it only "really" has a single way to do loops, the loop construct, an infinite loop you can break out of. But despite that, people deliberately write the other loops - and the linter strongly recommends that they write them, because the programs aren't just for machines to compile, they're for other humans to read, and a while let loop is an intuitive thing to read for example, so is the traditional for-each style iterator loop.


> the situation is, in the worst case, as good as when types are mandatory

The worst case is actually worse than when types are mandatory, since you can get an error in the wrong place. For example, if a function has the wrong type inferred, you get an error when you use it, even though the actual location of the error is at the declaration site. Type inference is good, but there should be some places (e.g. function declarations) where annotations are required.
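A small TypeScript sketch of the error-location problem (function names invented for illustration):

```typescript
// With no annotation, the return type is inferred from the (buggy) body, so
// the type error surfaces at every call site rather than at the declaration.
function parsePortInferred(raw: string) {
  return raw.trim(); // oops: meant to return a number; inferred as `string`
}
// const port: number = parsePortInferred("8080"); // error reported *here*

// Annotating the declaration moves the error to where the bug actually lives:
function parsePort(raw: string): number {
  return Number(raw.trim()); // the annotation forces the body to be fixed
}

console.log(parsePort(" 8080 ")); // 8080
```

With the annotation, a wrong body fails to compile at the declaration site, which is the behavior the comment argues for.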


I'm assuming `timeout_return_value` would be a user-provided value that serves as the default. But most effect systems also support a `return` effect that lets you change the return type of a function [1]. So you could make it return `Just<result>` when it succeeds or `Nothing` when it hits the timeout.

[1] https://koka-lang.github.io/koka/doc/book.html#sec-return
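A rough analogy in TypeScript (this is not Koka's `return` effect, just a sketch of the `Just`/`Nothing` result shape with an invented deadline check):

```typescript
// Sketch: a computation whose return type is changed to Maybe<T> --
// Just on success, Nothing on "timeout". The deadline check is invented
// purely for illustration.
type Maybe<T> = { tag: "just"; value: T } | { tag: "nothing" };

function withDeadline<T>(deadlineMs: number, compute: () => T): Maybe<T> {
  const startTime = Date.now();
  const value = compute();
  return Date.now() - startTime <= deadlineMs
    ? { tag: "just", value }
    : { tag: "nothing" };
}

const r = withDeadline(1000, () => 2 + 2);
// r.tag is "just" here, since the computation is effectively instantaneous
```

The caller is then forced to match on the tag, which is exactly how the timeout becomes observable in the types.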


That'd almost be partial functions with extra steps. Take the Kleisli category with the Maybe monad, and you get partial functions.

Unless you are manually matching on the Maybe, and thus observing the timeout, that isn't the case. You'd probably also want a nondeterminism effect, which you cannot handle unless you specifically build your timeouts to be deterministic, which I think Lean 4 does, but you can't go from partial to total with it afaik.


If you replace the ternary with an if, then it's easy to understand for anyone who knows what ... means (which IMO every JS developer should know)

  const count = (amount: number): number[] => {
    if (amount > 0) {
      return [...count(amount - 1), amount];
    } else {
      return [];
    }
  };


You missed the point entirely I’m afraid. It’s not about replacing the if with …, it’s about complexity.


With JSX you can do

  const todo = <DoMoreStuff args={...}><DoEvenMoreStuff args={...}/></DoMoreStuff>;
  <DoStuff args={...}>{todo}</DoStuff>


> With any financial innovation, there's often a cycle: initial excitement, over-extension, contraction, and then matured understanding

I'm a little more cynical; I think the cycle is: a new financial invention comes out, greed causes people to pump money into it, and eventually the bubble pops or regulators step in. We've seen it time and time again: the dot-com bubble, the subprime mortgage bubble, SPACs, and now/soon (IMO) PE. My question is how many times this has to happen before we stop viewing it as an isolated issue with a specific product and start to see it as a systemic issue with our financial system.


I'm curious what your reasoning on PE is. I know the space is struggling (I'm not exactly rooting for them), but could you give a little context on what bubble and potential regulation is called for?


Under this interpretation nothing can ever be revolutionary. There is absolutely an ideology that is dominant in society at large, and trying to say that the ideology that runs 95% of world powers and one that opposes it are equally "revolutionary" is counterfactual.


You can be revolutionary by seeking big change at personal risk because you're not socially dominant (or militarily in other contexts). There have been times this happened. Who is doing that today?


> Who is doing that today?

So many people. Look no further than the activists who are continuing to protest against Cop City in Atlanta despite the fact that the cops have already murdered one person [1] and arrested many more on trumped up charges [2].

[1] https://theintercept.com/2023/04/20/atlanta-cop-city-protest... [2] https://theintercept.com/2023/06/21/cop-city-georgia-attorne...


Indigenous people resisting copper mines, gas pipelines, lithium mines, etc that are being proposed on territories they've been pushed back onto.


I doubt critical theorists do them any good.


Oddly enough the OG Critical Race Theory that originated in law schools around the globe decades ago addresses systemic inequality issues fairly directly.

This seems to have been forgotten since Tucker Carlson et al started redefining woke | CRT to mean whatever they currently hated, to avoid addressing policy.


"Countering disinformation about Critical Race Theory" (2022)

[0] https://edwardfeser.blogspot.com/2022/08/countering-disinfor...


Your material:

    As readers of my book *All One in Christ* will find, the content of CRT is even more disturbing than this brief summary indicates – and it is also riddled with blatant logical fallacies, crude social scientific errors, and assumptions and policy recommendations that are utterly contrary to the natural moral law and the Catholic faith. 
appears to be a lengthy pro-Catholic | anti-CRT self-promoting strawman.


Based on what? If it's your dislike of critical theory, that's merely circular.


Is running a casino more moral than a copper mine?


There's a clear distinction between

* people choosing to build a casino on a specific portion of their land that they choose to build upon, and

* people having no say over third parties excavating a giant hole beneath a large area of their land, at a particular location they believe has deep religious meaning | spiritual connection.


Stealing people's land to build a casino is no more or less moral than doing the same to build a copper mine.

However, only one of these is actually going on.


