Haha, thanks for the plug, Don!

I just fleshed out the README for my Croquet resurrection yesterday, so others may have an easier time trying it, and maybe even contributing :)

https://github.com/codefrau/jasmine



Vanessa, it has always amazed me how you squared the circle and pulled a rabbit out of a hat by getting garbage collection to work efficiently in SqueakJS, making Smalltalk and JavaScript cooperate without ending up with two competing garbage collectors battling it out (since you can't enumerate "pointers" that are JavaScript references by just incrementing them).

https://freudenbergs.de/bert/publications/Freudenberg-2014-S...

>The fact that SqueakJS represents Squeak objects as plain JavaScript objects and integrates with the JavaScript garbage collection (GC) allows existing JavaScript code to interact with Squeak objects. [...]

>• a hybrid garbage collection scheme to allow Squeak object enumeration without a dedicated object table, while delegating as much work as possible to the JavaScript GC,
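For a flavor of that scheme without reading the whole paper, here is a loose JavaScript sketch of the idea as I read it (class and field names are mine, not the actual SqueakJS internals): every Squeak object is a plain JS object linked into an enumeration list, newly allocated objects stay off the list until a lazy mark pass tenures the survivors, and the JavaScript GC does the real reclamation.

    // Loose sketch of the hybrid GC idea (names are illustrative, not the
    // real SqueakJS internals). Squeak objects are plain JS objects linked
    // into a list for enumeration; the JS GC does the actual reclamation.
    class SqObject {
      constructor(sqClass) {
        this.sqClass = sqClass;   // the object's Squeak class, itself a SqObject
        this.pointers = [];       // instance variables referencing other objects
        this.nextObject = null;   // enumeration link, set once the object is tenured
      }
    }

    class ObjectMemory {
      constructor(rootObject) {
        this.firstObject = rootObject;  // head of the enumeration list
        this.lastObject = rootObject;
        this.newObjects = [];           // allocated since the last sweep, not enumerable yet
      }
      instantiate(sqClass) {
        const obj = new SqObject(sqClass);
        this.newObjects.push(obj);
        return obj;
      }
      // Run lazily, e.g. before an allInstances enumeration or a snapshot.
      fullSweep(roots) {
        // Mark: trace everything reachable from the VM roots.
        const reachable = new Set();
        const stack = [...roots];
        while (stack.length > 0) {
          const obj = stack.pop();
          if (obj instanceof SqObject && !reachable.has(obj)) {
            reachable.add(obj);
            stack.push(obj.sqClass, ...obj.pointers);
          }
        }
        // Unlink dead old objects so the JS GC can reclaim them
        // (the list itself would otherwise keep them alive).
        let prev = this.firstObject;
        for (let obj = prev.nextObject; obj; obj = obj.nextObject) {
          if (reachable.has(obj)) { prev.nextObject = obj; prev = obj; }
        }
        // Tenure surviving new objects by appending them to the list;
        // the rest simply become unreachable and the JS GC takes them.
        for (const obj of this.newObjects) {
          if (reachable.has(obj)) { prev.nextObject = obj; prev = obj; }
        }
        prev.nextObject = null;
        this.lastObject = prev;
        this.newObjects = [];
      }
      // Smalltalk-style enumeration, no object table needed.
      *allObjects() {
        for (let obj = this.firstObject; obj; obj = obj.nextObject) yield obj;
      }
    }

The nice property is that the enumeration list only ever holds objects that were still reachable at the last sweep, so enumeration never resurrects garbage, and anything the mark pass drops becomes unreachable from JavaScript too.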

Have you ever thought about implementing a Smalltalk VM in WebAssembly, and how you could use the new reference types for that?

https://bytecodealliance.org/articles/reference-types-in-was...


I would like to speed up some parts of SqueakJS using WebAssembly. For example, BitBlt would be a prime target.
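Something like this is the shape I have in mind, everything here hypothetical (there is no bitblt.wasm module, and the exported names are made up): keep the plugin logic in JavaScript and hand only the hot word-copying loop to wasm.

    // Hypothetical sketch, not actual SqueakJS code: bitblt.wasm and its
    // exports (memory, copyWords) are assumptions for illustration only.
    async function makeWasmBitBlt() {
      const { instance } = await WebAssembly.instantiateStreaming(fetch("bitblt.wasm"));
      const { memory, copyWords } = instance.exports;   // assumed exports

      return function copyBits(destForm, sourceForm, combinationRule) {
        const srcWords = sourceForm.bits;               // Uint32Array of pixel words
        const dstWords = destForm.bits;
        const srcPtr = 0;                               // naive layout; assumes memory is big enough
        const dstPtr = srcWords.length * 4;
        const heap = new Uint32Array(memory.buffer);
        heap.set(srcWords, srcPtr >> 2);                // copy pixels into wasm memory
        heap.set(dstWords, dstPtr >> 2);
        copyWords(dstPtr, srcPtr, dstWords.length, combinationRule);
        dstWords.set(heap.subarray(dstPtr >> 2, (dstPtr >> 2) + dstWords.length));
      };
    }

The copies in and out obviously eat into the win; to really pay off, the Form bits would have to live in (or be views onto) the wasm memory.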

For the overall VM, however, I’ll leave that to others (I know Craig Latta has been making progress).

I just love coding and debugging in a dynamic high-level language. The only thing we could potentially gain from WASM is speed, but we would lose a lot in readability, flexibility, and to be honest, fun.

I’d much rather make the SqueakJS JIT produce code that the JavaScript JIT can optimize well. That would potentially give us more speed than even WASM.

Peep my brain dumps and experiments at https://squeak.js.org/docs/jit.md.html
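To make the idea concrete, here is a purely hypothetical illustration (not the actual SqueakJS JIT, and not what jit.md proposes verbatim): translate a Smalltalk method into a small, monomorphic JS function with a SmallInteger fast path (SmallIntegers are plain JS numbers in SqueakJS) and a bail-out to the normal send machinery, so the JavaScript engine's own JIT can inline and optimize it.

    // Purely hypothetical illustration, not the actual SqueakJS JIT.
    //
    // Smalltalk source:  between: low and: high
    //                      ^ self >= low and: [self <= high]
    function compileBetweenAnd(vm) {
      // vm.sendFallback is a stand-in for bailing out to the regular
      // send machinery when the operands are not SmallIntegers.
      return new Function("vm", `
        return function between_and_(rcvr, low, high) {
          if (typeof rcvr === "number" && typeof low === "number" &&
              typeof high === "number") {
            return rcvr >= low && rcvr <= high;   // fast path the JS JIT can inline
          }
          return vm.sendFallback(rcvr, "between:and:", [low, high]);
        };
      `)(vm);
    }

The generated code stays boring on purpose: simple shapes and typeof guards are exactly what the JS engines optimize best.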


>Where this scheme gets interesting is when the execution progressed somewhat deep into a nested call chain and we then need to deal with contexts. It could be that execution is interrupted by a process switch, or that the code reads some fields of thisContext, or worse, writes into a field of thisContext. Other “interesting” occasions are garbage collections, or when we want to snapshot the image. Let's look at these in turn.

This sounds similar to Self's "dynamic deoptimization", which forges virtual stack frames for calls into inlined code so the debugger can show you the return stack you would have had if the functions were not inlined.
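Sketch of the idea, not Self's actual machinery: the compiler records, per safepoint, which methods were inlined into one physical frame and where their slots ended up, and the debugger uses those records to materialize the virtual frames the programmer expects to see.

    // Illustrative sketch only, not Self's actual implementation.
    const deoptInfo = {
      safepoint_42: [   // one physical frame built from three source methods
        { method: "Rectangle>>area",  receiverSlot: 0, tempSlots: [1, 2] },
        { method: "Rectangle>>width", receiverSlot: 0, tempSlots: [] },   // inlined
        { method: "Point>>x",         receiverSlot: 3, tempSlots: [] },   // inlined
      ],
    };

    function virtualFrames(physicalFrame) {
      // Rebuild one debugger-visible frame per inlined method, pulling the
      // receiver and temps out of the physical frame's slot array.
      return deoptInfo[physicalFrame.safepoint].map(scope => ({
        method: scope.method,
        receiver: physicalFrame.slots[scope.receiverSlot],
        temps: scope.tempSlots.map(i => physicalFrame.slots[i]),
      }));
    }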

I always thought that should be called "dynamic pessimization".

Debugging Optimized Code with Dynamic Deoptimization. Urs Hölzle, Craig Chambers, and David Ungar, SIGPLAN Notices 27(7), July, 1992.

https://bibliography.selflanguage.org/dynamic-deoptimization...

That paper really blew my mind and cemented my respect for Self: they delivered on such idealistic promises of simplicity and performance, and then, oh by the way, you can debug it too.


Absolutely. And you know Lars Bak went from Self to Strongtalk to Sun’s Java HotSpot VM to Google’s V8 JavaScript engine.

My plan is to do as little as necessary to leverage the enormous engineering achievements in modern JS runtimes.


Glad I asked! Fun holiday reading to curl up with a cat for. Thanks!

I love Caffeine, and I use Craig's table every day! Not a look-up table, more like a big desk, which I bought from him when he left Amsterdam. ;)



