Hacker News | Maxatar's comments

What's awful is thinking that making mistakes is a form of shame, and that being corrected is a form of dominance. That view is something that is taught and acquired and I am very thankful that the people who taught me never made me feel that way. I make mistakes all the time and I never have to feel ashamed about it, nor do I feel that the people in my life who I've learned from hold some kind of position of power over me.

It's good that no one did it to you, but are you sure that you never try to do it to anyone else?

Brought to you by Steven Pinker, another one of Epstein's associates.

The main purpose of sarcasm is not humor; it's to use irony as a form of contempt. To the extent that humor is involved, it's usually as a form of mockery.

I am perfectly OK with having contempt for powerful pedophiles. The opportunity for laughter is a bonus.

I just hope that the fallout doesn't begin and end with Prince Andrew and Larry Summers.


And VSCode is based on Chromium (a fact you won't find on VS Code's website except in relation to updates).

VSCode isn't a Chromium fork; it's an Electron app. Utilizing something is different from making a derivative of it. For example, an empty "Hello World" Electron app wouldn't have any value for an app developer, but creating a web browser derived from Chromium means you've already finished 99.9% of the work.

It uses Electron which itself uses the Chromium rendering engine.

That's like saying news.ycombinator.com is based on Chromium.

VSCode runs on Chromium, like any website you visit when using a Chromium browser.


To the downvoter:

VSCode -> Electron (essentially purpose specific web browser) -> Chromium

news.ycombinator.com -> General Purpose Web Browser -> Chromium


Ever heard of Firefox or Safari? I can open HN in any browser I want, but I can't change the browser used by the VSCode desktop app.

That's due to Electron, not Chromium.

Which is exactly why VSCode has a near-parity version that runs on the web, under any browser engine.

Saying VSCode is based on Chromium doesn't make any sense.


That's quite a poor comparison.

There isn't much of a final version shipped. It's pretty well understood that modules are underspecified and their implementation across MSVC, clang, and GCC is mostly just ad-hoc based on an informal understanding among the people involved in their implementation. Even ignoring the usual complexity and ambiguity of the C++ standard, modules are on a whole different level in terms of lacking a suitable formal specification that could be used to come close to independently implementing the feature.

And this is ignoring the fact that none of GCC, clang, or MSVC have a remotely good implementation of modules that would be worth using for anything outside of a hobby project.

I agree with the other commenter who said modules are a failure of a feature, the only question left is whether the standards committee will learn from this mistake and refrain from ever standardizing a feature without a solid proof of concept and tangible use cases.


You should get in there and put all your expertise to work.

I did, prior to 2017. I realized the committee was about 75% politics and people with a lot of time and devotion pushing their pet projects, and about 25% addressing actual issues faced by professional engineers, so I decided it was no longer worth the time, effort, and money.

The committee is full of very smart and talented people, no dispute about that, but it's also very siloed: people work on one particular niche or another based on their personal interests, and then they trade support with each other. In discussions it's almost never the case that features are added with any consideration for the broader C++ audience.


You implemented modules in 2017 and they didn't use it?

Today I learnt that Office is a hobby project.

You learned nothing because the extent of your knowledge tends to be rather superficial when it comes to C++.

Office does not use C++ modules. What Office did was make use of a non-standard MSVC feature [1] which reinterprets #include preprocessor directives as header units. Absolutely no changes to source code are needed to make use of this compiler feature.

This is not the same as using C++20 modules which would require an absolutely astronomical amount of effort to do.

In the future, read more than just the headline of a blog post if you wish to actually understand a topic well enough to converse in it.

[1] https://learn.microsoft.com/en-us/cpp/build/reference/transl...


My dear, I have written more C++20 modules code than you ever will.

Feel free to roam around on my Github account.

Also go read the C++ mailings regarding what is standard or not in modules.


MSVC now supports C17.

Does it support C99 with VLAs yet?

It supports C99 minus VLAs. Worth noting that only C99 mandates VLAs. They are optional in C11, C17, and C23.

MSVC is also made out of a dozen javascript processes which makes typing text need a beefy computer.

MSVC is a C++ compiler toolchain and it does not contain any JavaScript. You're thinking of VSCode, probably, but your comment was an off-topic rant either way.

Microsoft Visual Studio, the IDE (not VS Code, the Electron program), has lots of JavaScript processes running in the background doing all sorts of things.

Also my comment was a single sentence with a single fact so it can't be a rant.


My understanding is that the ISS is not self-sustaining even in principle. It consistently needs to be resupplied with water and breathable air, as the station continuously leaks both. These resupplies happen about once every month or two. This article goes into quite a few details about what would be needed for actually self-sustainable human space exploration, and it looks like there are quite a few engineering challenges to work out.

https://www.frontiersin.org/journals/space-technologies/arti...


Not only that, but the station's orbit has to be routinely reboosted, as there is still a little atmospheric drag at that altitude.

No they're not.

Being isomorphic is not the same as being identical, or substitutable for one another. Type theory generally distinguishes between isomorphism and definitional equality and only the latter allows literal substitution. So a Nine type with a single constructor is indeed isomorphic to Unit but it's not the same type, it carries different syntactic and semantic meaning and the type system preserves that.

Another false claim is that type theory does not distinguish between types of the same cardinality. Type theory is usually intensional, not extensional, so two types with the same number of inhabitants can have wildly different structures, and this structure can be used in pattern matching and type inference. Cardinality is a set-theoretic notion, but most type theories are constructive and syntactic, not purely set-theoretic.

Also parametricity is a property of polymorphic functions, not of all functions in general. It's true that polymorphic code can't depend on the specific structure of its type argument but most type theories don't enforce full parametricity at runtime. Many practical type systems (like Haskell with type classes) break it with ad-hoc polymorphism or runtime types.
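A small Lean 4 sketch of the distinction (my own illustration, not from the thread): a one-constructor type is isomorphic to Unit, yet the two types remain distinct:

```lean
-- Nine has a single constructor, so it is isomorphic to Unit...
inductive Nine where
  | mk

def toUnit : Nine → Unit := fun _ => ()
def ofUnit : Unit → Nine := fun _ => Nine.mk

-- ...witnessed by the round trip:
theorem roundTrip (n : Nine) : ofUnit (toUnit n) = n := by
  cases n <;> rfl

-- But the types are not definitionally equal:
-- `example : Nine = Unit := rfl` fails to typecheck.
```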


This comment contains a lot of false information. I’m first going to point out that there is a model of Lean’s type theory called the cardinality model, in which all types of equal cardinality are modelled as the same set. This is why I say the types have no useful distinguishing characteristics: it is consistent to add the axiom `Nine = Unit` to the type theory. For the same reason, it is consistent to add `ℕ = ℤ` as an axiom.

> So a Nine type with a single constructor is indeed isomorphic to Unit but it's not the same type, it carries different syntactic and semantic meaning and the type system preserves that.

It carries different syntax but the semantics are the exact same.

> Type theory is usually intensional not extensional so two types with the same number of inhabitants can have wildly different structures

It is true that type theory is usually intensional. It is also true that two types equal in cardinality can be constructed in multiple different ways, but this has nothing to do with intensionality versus extensionality – I wouldn't even know how to explain why, because it is just a category error – and furthermore, just because they are constructed differently does not mean the types are actually different (because of the cardinality model).

> Cardinality is a set-theoretic notion but most type theories are constructive and syntactic, not purely set-theoretic.

I don’t know what you mean by this. Set theory can be constructive just as well as type theory can, and cardinality is a fully constructive notion. Set theory doesn’t have syntax per se but that’s just because the syntax is part of logic, which set theory is built on.

> most type theories don't enforce full parametricity at runtime

What is “most”? As far as I know Lean does, Coq does, and Agda does. So what else is there? Haskell isn’t a dependent type theory, so it’s irrelevant here.

---

Genuine question: where are you sourcing your information about type theory? Is it coming from an LLM or similar? Because I have not seen this much confusion and word salad condensed into a single comment before.


I will let Maxatar respond if he wants to, but I will note that his response makes much more sense to me than yours, as someone who uses traditional programming languages and used to do a little math a couple of decades ago.

With yours, it seems that we could even equate string to int.

How can a subtype of the integer type, defined extensionally, be equal to Unit? That really doesn't make any sense to me.

> it is consistent to add `ℕ = ℤ` as an axiom

Do you have a link to this? I am unaware of it. Not saying you're wrong, but I would like to explore this. Seems very strange to me.

As he explained, an isomorphism does not mean equality, as far as I know. cf. Cantor. How would anything typecheck otherwise? In denotational semantics, this is not how it works. You could look into semantic subtyping with set-theoretic types, for instance.

type Nine int = {9} defines a subtype of int called Nine that contains only the value 9. All variables declared of that type are initialized to int(9). They are equal to Nine. If you erased everything and replaced it with Unit {}, it would not work. It is a whole other type/value.

How would one be able to implement runtime reflection then?

I do understand that his response to you was a bit abrupt. Not sure he was factually wrong about the technical side, though. Your knee-jerk response makes sense, even if it is a shame that it risks making the conversation less interesting.

Usually types are defined intensionally, e.g. by name, not by listing a set of members (extensional encoding) in their set-theoretic semantics. So maybe you have not encountered such a treatment in the languages you are used to?

edit: the only way I can understand your answer is if you are considering the Universe of types as standalone from the Universe of values. In that universe, we deal only with types, and with types structured as composites, as you are perhaps familiar with? Maybe then it is just that this Universe, as formalized, is insufficiently constrained/underspecified/over-abstracted?

It shouldn't be too difficult to define specific extensional types on top, for which singleton types would not have their extensional definitions erased.
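One way to sketch such an extensional singleton in Lean 4 (my own illustration, mirroring the Nine example above) is as a subtype of Int, whose sole inhabitant still carries the value 9, unlike Unit:

```lean
-- A singleton subtype of Int: exactly one inhabitant, but the
-- value 9 is not erased, so it is not interchangeable with Unit.
abbrev Nine := { n : Int // n = 9 }

def theNine : Nine := ⟨9, rfl⟩

-- The underlying integer is still recoverable:
example : theNine.val = 9 := rfl
```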


> Many practical type systems (like Haskell with type classes) break it with ad-hoc polymorphism or runtime types.

Haskell does not break parametricity. Any presence of ad-hoc polymorphism (via type classes) or runtime types (via something like Typeable, itself a type class) is reflected in the type signature and thus completely preserves parametricity.


I think what they meant is that it is not purely parametric polymorphism. Not that parametricity is broken.

Hmm, maybe

> most type theories don't enforce full parametricity at runtime

means "sometimes types can appear at run time"? If so it's true, but it's not what I understand by "parametricity".


Not sure either: parametric polymorphism is compile time. Adding runtime behaviors is a sort of escape hatch.

Which is fair, and arguably necessary. It's not so much a breakage as it is a necessity, since non-parametric code may rely on this too.

Or maybe it is about constrained parametricity. In any case this doesn't seem a big issue.


Yes, the compiler contains an interpreter for the entire language, not just a subset. It is circular as you rightly point out, but there's no contradiction here.

Typically the interpreter limits itself to only evaluating pure functions or functions that do not perform any kind of IO, but in principle this restriction is not necessary. For example Jai's compiler allows completely arbitrary execution at compile time including even downloading arbitrary data from the Internet and using that data to make decisions about how to compile code.


Thank you for the explanation. This reminds me somewhat of an F# type provider, which can execute arbitrary code at compile time.

https://learn.microsoft.com/en-us/dotnet/fsharp/tutorials/ty...

