
For anyone finding CLOS interesting, "The Art of the Metaobject Protocol" is the next logical step. A good way to dig deep into how CLOS is built.

It should be a mandatory read for anyone looking to design a programming language with object concepts.


I think I'd recommend reading Sonya Keene's Object-Oriented Programming in Common Lisp: A Programmer's Guide to CLOS first, though. I really don't know of any other way to learn things like method combinations that make the MOP so powerful. All the CLOS tutorials I am aware of lack a real discussion of these features of CLOS.
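
To give a flavor of what Keene covers, here is a minimal sketch (class and function names made up for illustration) of the standard method combination, where :before and :after methods are wrapped around a primary method:

    (defclass account ()
      ((balance :initarg :balance :accessor balance)))

    (defgeneric withdraw (account amount))

    ;; Primary method: does the actual work.
    (defmethod withdraw ((a account) amount)
      (decf (balance a) amount))

    ;; :before method: runs first, here as a guard clause.
    (defmethod withdraw :before ((a account) amount)
      (when (> amount (balance a))
        (error "insufficient funds")))

    ;; :after method: runs last, here as a notification.
    (defmethod withdraw :after ((a account) amount)
      (format t "~&new balance: ~a~%" (balance a)))

:around methods and define-method-combination take this much further.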


Unrelated to your point: Sonya Keene's book is the only one I know of that was authored and typeset using Symbolics's in-house publishing system Concordia (besides the Symbolics documentation sets, of course).

Concordia (and its display component, Document Examiner) had a pretty novel hypertext approach to authoring. All the content was made up of individual notes that you could link together, either with explicit links or implicit one-follows-the-other links. I don't know if that's how Sonya used it, but when I authored some documents using Concordia, I discovered that it was very easy to focus on addressing specific points in an almost throwaway fashion. Write a note on some subject you want to address. If you don't like it, write another take on the same subject. Now you have the option to put either note into the book flow, refine them in parallel, and defer editorial decisions to the very end. The process is very reminiscent of exploratory programming in Lisp; in other words, Symbolics people figured out how to write books the same way they wrote their code. Ultimate vertical integration.


"Lisp Lore: A Guide to Programming the Lisp Machine" by Bromley&Lamson was also written using Concordia AFAIK. The Symbolics documentation also has Jensen&Wirth "Pascal User Manual and Report" and Harbison & Steele, C: A Reference Manual.


> Symbolics's in-house publishing system Concordia

What you describe regarding Concordia sounds very intriguing. I can search for information on Concordia myself, but do you have any go-to references (books, papers, videos) on it that you'd recommend?


I once recorded a demo video, running Concordia on an actual Symbolics Lisp Machine.

https://vimeo.com/83886950


Cool video, interesting. It looks, though, like a very complicated way of creating and entering content, compared to just typing in a buffer with some markup language. When I first saw the video I thought it was just textual hyperlinks, sort of what we have with clickable text in Emacs. Info and help-mode are Emacs apps that use it quite heavily. But then I looked up Concordia on Wikipedia, and I see it has the same ancestor as texinfo :).

But very interesting to see; I'll watch your other videos when I have more time, it is a bit of history. Thanks for recording and uploading them. Is that machine still alive and running, or did you record it on your Linux port?


One doesn't need to use the menus, one can use the keyboard commands. The UI provides several ways to interact: key commands, interactive command line and menus.

> I see it has the same ancestor as texinfo

Genera comes with a Scribe-based markup language and formatter.

> Is that machine still alive and running, or did you record it on your Linux port?

I made this video years ago on a Lisp Machine. The new emulator for the Mac & Linux is many times faster and runs silently on something like a MacBook... Thus it's much more convenient to use that for a demo, unless the software does not run there. The emulator has its own native code format and, for example, lacks emulation of the console hardware (graphics hardware).


> One doesn't need to use the menus, one can use the keyboard commands. The UI provides several ways to interact: key commands, interactive command line and menus.

> Genera comes with a Scribe-based markup language and formatter.

You mean humanity has not come very far in human-computer interaction since those days? :-) Just kidding; it sounds like they were quite modern back in the '80s. I saw the other video on YT about their graphics software and hardware. While it looks relatively simple compared to modern image editors, modellers, fx and animation packages, it still feels like they had all the right ideas. What do you think put them out of business? Just the economy, or some other more technical reason?

> The new emulator for the Mac & Linux is many times faster and runs silently on something like a MacBook.

Yeah, I saw another video and noticed "machdd" or something similar on the modeline somewhere, so I assumed you made it on a Mac.

> lacks emulation of the console hardware (graphics hardware)

That explains why all the demos are black and white.

I don't have much time to install and configure virtual machines and programs, but one beautiful day I'll try it, just out of curiosity; I have seen the repo on GH.


> What do you think put them out of business? Just the economy, or some other more technical reason?

The main reason was the end of the Cold War and, with it, the end of high-tech military spending. That meant there were too few commercial customers. Where they had commercial customers (like the Graphics & Animation business), there was disruption by other technology, like SGIs (RISC CPUs with powerful graphics accelerators) and also Windows NT. The graphics software was sold to Nichimen and ported to SGIs and Windows NT.

> That explains why all the demos are black and white.

All the early Symbolics consoles were black & white, so all the software used b&w. Typically the machines had an additional color screen, driven by an additional color graphics board. All driven by Lisp. But the megapixel color screens and graphics boards were very expensive. They also might have been too slow to use as an interactive console screen.

The emulator supports graphics. It's X11, and one can use color graphics, but the graphics & animation software hasn't been ported to X11, AFAIK. It's just that the normal tools don't use color in their UI, though there were applications which used color.

> I have seen the repo on GH.

Don't expect too much. That's an old, unsupported emulator, which has a bunch of technical problems.


> That meant there were too few commercial customers. Where they had commercial customers (like the Graphics & Animation business), there was disruption by other technology, like SGIs (RISC CPUs with powerful graphics accelerators) and also Windows NT. The graphics software was sold to Nichimen and ported to SGIs and Windows NT.

And now even SGI is out of business. It is a little bit unfortunate, but I think there is a history lesson to learn. Symbolics, SGI, Sun, Xerox, IBM, AT&T: they are all gone from the software business, more or less. I mean, IBM, AT&T and Xerox are alive, but they are just a shadow of their former selves, at least on the software front. It seems like all the companies that target high-level industry with big profits and ignore the consumer market fade away.

Compare that to Microsoft, which exploded in market share after DOS/Windows, and Intel, which exploded after the 8086/8088. It just shows how important it is to put the technology out to consumers. Not because mass consumers will create so much value, though they will do that too, but foremost because they will learn how to use the technology, and once they come to businesses and have to solve problems, they will use it. I think that is a problem Symbolics faced. They ran on dedicated hardware that cost a fortune and was used for specialized problems, while worse technology was cheaper and more accessible. People used what was accessible, and when a generation grew up and went to work, of course it was cheaper to let them use what they had learned than to buy specialized hardware and train them in a specialized language. I think the same thing happened to SGI when the big graphics software names released their software for Windows. I think it is a circle, or a rolling stone. It is important to put the technology and knowledge out in the hands of people.

It is a bit sad that LW and Franz are keeping their software behind locked gates instead of letting it out in the free. I bet some middle-tier manager is sitting at Boeing as of this writing, trying to figure out how to save $$$ by cutting that crazy expensive expert-knowledge Lisp thing out of their software stack, just to get a promotion or a bigger bonus.

If LW and Franz are going to survive and not go the same way as Symbolics, Sun & Co, they should probably rethink their strategy and license their stuff free for GPL/non-commercial use, similar to what Qt and some other companies do. Perhaps SBCL is good enough, but the Lisp community needs more and better tools. In expert hands Emacs is a superb tool, but it is not the average mass tool.

It is a bit of a shame. I think Lisp is such a great tool for software engineering and application development, but it is so underused because the knowledge pool is so small and the best tools are locked away behind a pricey tag, it seems. If/when those two, LW and Franz, are gone, Lisp will be seen even more as an academic exercise rather than a useful practical tool.

I don't know, perhaps I am wrong; just thinking out loud.


> If LW and Franz are going to survive and not go the same way as Symbolics, Sun & Co, they should probably rethink their strategy and license their stuff free for GPL/non-commercial use, similar to what Qt and some other companies do.

I'd think there were something like 30 commercial implementations of Common Lisp. LispWorks and Franz are still alive. None of the others are. Personally, I'm happy that they exist and fear what you propose would kill them very quickly. There are also a few in-house implementations "alive".

> and the best tools are locked away behind a pricey tag

If there were a business opportunity, someone else could pick it up.

LispWorks reused bits and pieces of CMUCL. But the stuff they've added provided real value: robust ports to different platforms, an extensive implementation and a portable GUI layer.

If someone sees a business, they could easily layer something on top of SBCL or provide other improvements.

Scieneer tried that with CMUCL by adding concurrent threading for multi-core machines.

Clozure CL was alive while the expertise of the old hackers was around. Once they were gone it was difficult to keep it ported to new platforms and to fix hairy bugs. They tried to have a business model with an open-sourced variant of Macintosh Common Lisp.

We'll see dev tools financed by big companies like Microsoft, Oracle, Apple, Google, ...

Then there are a bunch of companies trying to provide tools (IntelliJ, ...) or alternative languages (Scala, ...).

The specialized niche languages tend to have capable implementations with large price tags. See for example the commercial Smalltalk implementations of Cincom or SICStus Prolog ( https://sicstus.sics.se/order4.html ).

Other financing models tend to be fragile... or depend on other sources... like research/academic funding.


> LispWorks and Franz are still alive.

Yes, but what makes us believe they won't meet the same fate as all the others?

> fear what you propose would kill them very quickly

Why? Is their technology that bad? I don't think so. I think they need to let people use their stuff, show the masses the good things about Lisp and their tools, and let the masses learn how to use those tools.

> If someone sees a business, they could easily layer something on top of SBCL or provide other improvements.

It is not that easy. Layering something on top is a lot of work :). It is software, everything is possible, but who has the manpower and time? People do layer stuff on top, though. We do see people making cool and interesting stuff, but it is relatively few enlightened persons. We don't see a mass movement; perhaps the masses need to see a "killer app" or something, like when JavaScript got Chrome and went from verboten on every computer to the No. 1 language (more or less)?

I think the business is made by giving the technology to people and letting them use it. Once it is in use by individuals and people realize the power of their tools, I believe it will see more use in businesses too. As I said, I think it is a "vicious circle", as they say here in Sweden.

> Once they were gone it was difficult to keep it ported to new platforms and to fix hairy bugs.

Yes, that is a normal thing. Once the experts are gone, the platform is dead. To survive, a platform needs to attract new people and bring them up to expertise level. But every platform also has to adapt, as well as be well documented. I wanted to make an extension for SBCL, but I don't see any writing anywhere on how to proceed, so I have to look through the source code. It is not impossible, but it is more tedious than it perhaps should be. The best thing about Emacs is actually the documentation, and the openness, I think. I don't know for sure, but it seems so.

> The specialized niche languages tend to have capable implementations with large price tags. See for example the commercial Smalltalk implementations of Cincom or SICStus Prolog

Is anyone still using Smalltalk? :) Yes, of course, I agree. What you describe is how the business used to be done, but I don't think history speaks in its favor. I don't know, all this is just my speculation of course, but I think that is an obsolete view of the business. It seems that all those specialized companies that target big biz are sooner or later out of that biz. What would it mean for Franz to lose Boeing? Probably quite a lot. I don't know. I am not sure Oracle is doing as great as they did in the past either. There is a lot of inertia too; big customers can't easily switch. I mean, Cobol is still there, but that does not mean Cobol as a platform is doing well.

> Other financing models tend to be fragile... or depend on other sources

I didn't mean they should switch to some other form of doing business and financing. I just mean they should let their tools go free for the masses, for people making open source code for non-commercial use. For businesses they should still sell and charge, of course. I don't know, perhaps I am wrong; but which hobbyist like me will pay LW 400€ for the basic license? Or whatever the price is now. They won't make money out of hobbyists, students, indie devs and the like, so they might as well let those people use the thing for free for non-commercial and open source use, like other companies do. Perhaps if they have good tooling, GUI builders and so on, people will use it to create some interesting stuff, more people would learn how to use their software, and that, I think, would play in their favor in the long run.

By the way, worth mentioning is that Microsoft, Oracle, Apple and Google give away their tooling basically for free, because they have realized that they want people to learn and use their tools. I don't know, perhaps there is no money in traditional tooling like GUI builders at all in the world of web technologies.

Don't get me wrong, I don't wish them anything bad, but I think that the Lisp community is so small and fragile that better tools would perhaps help. I think it is a shame, because Lisp is a nice language for rapid prototyping and development, with lots of potential that humanity is perhaps slowly losing; but now I am perhaps a bit too dramatic and cheesy :).


> Yes, but what makes us believe they won't meet the same fate as all the others?

Switching to an open source model would mean that only 1% of the users would pay anything. To make that viable you need a good business model for a much larger or a different market. The business model is vastly different with a vastly different product.

Just open sourcing the thing is an easy way to kill an existing company, which now has a small, but paying customer base.


No, no; that wasn't what I meant; I clarified in the last comment: release things for free, for already non-paying customers, for non-commercial use only, Qt style. I find it hard to believe Boeing would count in that group :).


How do you enforce that? Experience shows that most companies ignore that code is only for non-commercial use, and it's hard to enforce against small companies.

I also doubt that all customers are like Boeing.


The issue with console hardware is that some applications use old code paths that directly call into the most basic framebuffer Symbolics Lisp Machines had, and that is not supported on X11 - thus said software works neither on OpenGenera, nor on UX400, UX1200 and NXP1000 physical Lisp Machines (all of which lack console hardware and use X11).

The only exception is, AFAIK, the MacIvory, which has a special subsystem that emulates console hardware on top of Macintosh Toolbox calls done over Sun RPC.


The MacIvory has the option to use a NuBus color graphics card directly from Lisp: a NuVista board.


Yes, but that's used by the COLOR package, whose interfaces are, IIRC, mostly available on the X11 client as well (minus acceleration).

The full TV package was redirected only on the MacIvory, not on Unix-based setups; notes from OpenGenera suggest that the plan was to fix the application packages to use the new interfaces instead.


Thank you!


I'm glad Rainer recommended his video, because I learned Concordia purely by word of mouth and exploration. There's a paper by Janet H. Walker, the principal on Document Examiner/Concordia, from a hypertext conference (https://dl.acm.org/doi/10.1145/317426.317448), but it's more about DE.


Agreed. It's easier to learn how to do practical programming with CLOS from Keene's book. AMOP is more useful for those who want to know more about how things work under the hood, or who maybe want to build their own implementation of CLOS or something similar, and easier to understand after you've absorbed Keene.


The book "Object-Oriented Programming: The CLOS Perspective" is a better intro than just diving into AMOP. It's essentially a collection of papers, but it really helps contextualize a lot of the early design thinking, gives a gentle introduction to the MOP, compares CLOS with other languages including C++, and showcases some application uses back then.


One of the core points about metaobject protocols is that while they matter for language design, a good language, to me, also exposes the object system to users of the language and makes it flexible for them too.

From a review (https://www.adamtornhill.com/reviews/amop.htm),

> The metaobject protocol behaves like an interface towards the language itself, available for the programmer to incrementally customize. Just as we can specialize and extend the behavior of the classes we define in our own applications, we can now extend the language itself using the very same mechanisms.

One of the points that has rather surprised me is that we devs have not more broadly explored what we could do with our metaobject protocols. We haven't had a boom in metaprogramming, we haven't seen a large-scale return of AOP; we've been letting objects stay dumb, unextended objects for a long time now.
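
For anyone who hasn't seen what that looks like in practice, here is a minimal sketch in Common Lisp (using the closer-mop portability library; the class names are mine) of extending the language with its own mechanisms: a metaclass that counts how many instances of its classes get created, defined with ordinary defclass/defmethod forms:

    (ql:quickload "closer-mop")

    ;; A metaclass: a class whose instances are themselves classes.
    (defclass counted-class (standard-class)
      ((instance-count :initform 0 :accessor instance-count)))

    ;; The MOP requires us to state that our metaclass may subclass
    ;; STANDARD-CLASS.
    (defmethod closer-mop:validate-superclass
        ((class counted-class) (super standard-class))
      t)

    ;; Hook into instance creation with a plain :after method.
    (defmethod make-instance :after ((class counted-class) &rest initargs)
      (declare (ignore initargs))
      (incf (instance-count class)))

    (defclass widget ()
      ()
      (:metaclass counted-class))

    ;; (make-instance 'widget)
    ;; (instance-count (find-class 'widget)) => 1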

The one sizable exception I have on my radar is React's Higher-Order Components, where components/classes were wrapped/composed in other classes. Slices of object behavior were spliced into place.

Now that's been replaced with hooks, which invert the relationship: the function up front composes all the behavior it might want as hooks that it calls.

I don't know enough about how Rust macros are made & used. My vague impression is that they are very scarcely considered. Maybe I just missed the discussions, but I'd expect there to be lots of blogging about this topic if metaprogramming were really as fertile a field as one would expect.


> One of the points that has rather surprised me is that we devs have not more broadly explored what we could do with our metaobject protocols. We haven't had a boom in metaprogramming, we haven't seen a large-scale return of AOP; we've been letting objects stay dumb, unextended objects for a long time now.

The other day I showed some newer devs how they could replace a good 10+ lines of for/index/loop/conditional code, whose purpose was to find the next or previous element following an element that matched a condition, with wrap-around semantics, using a zip and a filter. Good old "functional" approach. "That's cool," they said, and then got rid of me as quickly as they could. They had been close to pushing their changes. And indeed did.
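
Not their code, but the idea translates to something like this Common Lisp sketch (names are mine): zip the list with a rotation of itself, then filter for the pair whose first element matches; the wrap-around falls out of the rotation for free.

    ;; Find the element following the first one that satisfies
    ;; PREDICATE, wrapping around at the end of the list.
    (defun next-after (predicate list)
      (let* ((rotated (append (rest list) (list (first list))))
             (pairs (mapcar #'cons list rotated)))   ; the "zip"
        (cdr (find-if predicate pairs :key #'car)))) ; the "filter"

    ;; (next-after #'evenp '(1 2 3 4 5)) => 3
    ;; (next-after #'evenp '(1 3 4))     => 1   ; wraps around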

It used to be the case that a sizable number of my peers were interested in furthering their craft. Those that "just ship it, it works, ok?" were tolerated. Now it seems that enthusiast peers interested in "exploration" and "discovery" are few and far between. I'm surrounded by people that do their hours and clock out tickets.

So no, I am no longer surprised by this lack of meta programming interest. Or any other “higher level” linguistic pursuits.


You might find this interesting:

Metaobject Protocols for Julia https://drops.dagstuhl.de/opus/volltexte/2022/16759/pdf/OASI...


> We haven't had a boom in metaprogramming, we haven't seen a large-scale return of AOP; we've been letting objects stay dumb, unextended objects for a long time now.

As a user of the Java, .NET and C++ ecosystems, I beg to differ: those subjects are quite present.


Agreed - another good resource is the original Flavors paper. Many of the parts are there, like multiple inheritance and method combination, but without the MOP and multiple dispatch.

https://www.softwarepreservation.org/projects/LISP/MIT/nnnfl...


The trick with online platforms is to edit messages, replacing the content with a random string (this script supports ".", a random string, or a fixed string of your choosing).

Most web apps will keep a copy of messages you delete, but they usually do not save a history of every modification.
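
The replacement step itself is simple; a hypothetical sketch in Common Lisp (not the script mentioned above):

    ;; Build a random alphanumeric string to overwrite a message with.
    (defun random-replacement (n)
      (let ((alphabet "abcdefghijklmnopqrstuvwxyz0123456789"))
        (map-into (make-string n)
                  (lambda () (char alphabet (random (length alphabet)))))))

    ;; (random-replacement 16) => "q7c0znys2m4hw9tb" (for example)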


Unfortunately, most "web-scale" apps I've worked on are basically immutable (aside from retention policies). You just keep appending forever because updates are too expensive. So more than likely your comment history will exist on Reddit's servers, but they use a clever "GROUP BY" semantic on read to only return the most recent version.
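
A toy model of that pattern (names hypothetical): every edit appends a new row, and the read path keeps only the latest version per message id, which is what the GROUP BY amounts to.

    (defvar *rows* '())  ; each row: (id version content), append-only

    (defun append-edit (id version content)
      (push (list id version content) *rows*))

    ;; "GROUP BY id" on read: walk oldest-to-newest so later versions
    ;; overwrite earlier ones in the table.
    (defun latest-versions ()
      (let ((latest (make-hash-table)))
        (dolist (row (reverse *rows*))
          (setf (gethash (first row) latest) row))
        (loop for row being the hash-values of latest collect row)))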


We are at least raising the cost of them storing and sorting through versions then.


1. Shredding does this edit. 2. I think this is a myth and they still keep the original message.


This is the only reason I haven't tried to use Guix. I love the concept, I want to use it, but it does not run on FreeBSD. And I would not invest time in a packaging system that works on my workstation but not on my servers.

It is sad to see so much Linux-only software, especially when there is no technical reason not to support *BSD.


> when there is no technical reason not to support *BSD.

Guix' packages include glibc in their dependency chain, so unless you can use glibc on BSD, there is your technical reason. For example GNU Hurd is supported in addition to Linux.


I feel this. My love for BSD almost matches my love for Nix, it's unfortunate I have to choose one or the other.

I'm aware Nix is closely integrated with systemd, so availability on BSD will probably never happen (though perhaps you could use the Darwin code as a starting point?), but I'd be curious to know what in particular keeps Guix from being ported.


Wait, it's integrated with systemd? What is a package management system doing with the init process? I now remember that Guix is a daemon, something that made me really wary at the time; but a systemd dependency is something else entirely.


NixOS can be validly viewed as a very fancy systemd configuration toolkit. If you think about it, ultimately an instance of a modern general-purpose OS is structured as a set of services which are described for, and orchestrated by, well, something. In the case of NixOS (not Nix in general!) that something is systemd (which also happens to be used by all other major Linux distros); in the case of GuixSD that something is GNU Shepherd (which also happens to be used by GNU Hurd).


Nix is also generally run as a daemon. It doesn't have a dependency on systemd and it does run on macOS. I think it also runs on FreeBSD, in principle.

The parent poster was probably conflating NixOS and Nix. NixOS is a way of using nixpkgs to build/configure and install/upgrade an operating system based on Linux and systemd. But you are not obligated to use systemd just because you're using nixpkgs.

NixOS uses systemd for essentially the same reason other Linux distributions do: most user environments are now built with the assumption that systemd underlies them if Linux does as well. In principle you could fork nixpkgs and make all the necessary changes to remove the systemd assumption - managing a NixOS system and hacking on nixpkgs are basically the same skillset, so it's not like managing an Ubuntu system, where you would have to become familiar with how Ubuntu gets built. With NixOS, the command is the same whether you're building an entirely new distribution based on a custom fork or whether you're just downloading builds of the latest security upgrades. Aside from the actual energy it takes to maintain a second system manager, you would only have to wait for the builds to complete rather than learning how to build everything. But so far no one who has tried that has got enough traction - the cost-benefit just isn't there.

Honestly, I think this is part of the reason people are offended by NixOS in particular using systemd: it feels like it should be possible for you to remove the systemd dependency, but you haven't done it.


Nix isn't, but NixOS is, if I'm not mistaken.


Sorry, yes, I was mixing up Nix and NixOS :)


The most surprising aspect of REDUCE is the way the callback can be called depending on both the length of the list and the presence or absence of an initial value.
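
For reference, the cases in question, as the CLHS specifies them:

    (reduce #'+ '())                     ; => 0, function called with no arguments
    (reduce #'+ '(5))                    ; => 5, the element is returned, function never called
    (reduce #'+ '() :initial-value 10)   ; => 10, initial value returned, function never called
    (reduce #'+ '(5) :initial-value 10)  ; => 15, function called with 10 and 5
    (reduce #'+ '(1 2 3 4))              ; => 10, function called pairwise, left to right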


Author here. I double checked the Hyperspec [1] and you're quite right. I'll fix the article tomorrow, thank you for noticing!

[1] http://www.lispworks.com/documentation/lw60/CLHS/Body/f_redu...


I don't know if it will fix your font size problem, but this is what I have in my userContent.css for Firefox:

    @-moz-document url-prefix(https://news.ycombinator.com) {
      body {
        background-color: #d4d4cb;
        margin-top: 0;
      }
    
      #hnmain {
        max-width: 960px;
      }
    
      .title {
        font-size: 1.5rem !important;
      }
    
      body, td, input, textarea,
      .default, .admin, .subtext, .yclinks, .pagetop, .comment, .hnname {
        font-size: 1rem !important;
      }
    
      .comhead {
        font-size: 0.8rem !important;
      }
    }

Using rem units means it respects the font size defined in my Firefox configuration. The rest is mostly about avoiding ultra wide content.


I don't know about Outlook and others, but you can still use any client you want, both for IMAP and SMTP access, using application passwords[1]. I have been doing that for years, both for Gmail and Google Workspace, without any issue.

[1] https://support.google.com/accounts/answer/185833?hl=en


Is there an option for the folks in charge of security that disables app passwords?


Thanks, that looks like a good solution.


My understanding is that the budget required to build a game perceived as "modern" is higher and higher each year, meaning that studios need to target a very large market to be profitable. This is, for example, why we don't see complex RTS games and arena shooters anymore: their market is too small to pay back the investment. The 50+ year old video game market is tiny, and you'd need very different marketing (compared to what works with the usual 15-25yo segment) to target it.

It is true that lots of indie games find success with smaller budgets. But I suspect that lots of indie developers are not 50+ years old. Game development is mostly a craft, and unsurprisingly indie game developers prefer to work on projects similar to what they themselves enjoy.


I'm not sure the market is as tiny as you think. 4 out of 10 adults over 50 play, on average, 12 hours per week. From the article it seems that many are casual gamers who use games to relax, and that women are the most frequent players. I suspect you will also find a large portion of 30+ gamers playing the same types of games.

To me it seems like a large enough market to target. Candy Crush has nailed its target group (women 35+) and is making lots of money.


    The 50+ year old video game market is tiny
I know this wasn't your main point, but this is very much not the case (source: have worked in games for 13 years).


There are still plenty of RTS games coming out, and probably also arena shooters, depending on what your requirements for inclusion are. Maybe you don't see them because they are drowned out by other stuff, but they are there. (For instance, AOE4 is recent, and Unreal Tournament is releasing soon, I think.)


A programming language is made of a long list of trade-offs, meaning that you could invent hundreds of new ones without really repeating yourself.

Please go and invent new languages! Yes, most of them will be ill-designed and will not go anywhere, but this is how you make progress.

As always the Internet will scream at you that you are wasting your time, or that you are somehow fragmenting the ecosystem, but this is the usual case of strangers feeling that they should have a say in what you work on. Ignore it.

Personally I'd love to see two languages:

- A modern Lisp. Built-in concurrency (green threads, of course), gradual typing, a native compiler. And more and more I feel like it should be a Lisp-1 (please don't cancel me). It should be the perfect language for exploratory programming and of course symbolic programming.

- An improved Erlang. Proper records and strings, modern tooling, a modern standard library. And please make it decently fast. It should be the one obvious choice for writing server software.

Hopefully I will someday get the time to work on them.


I’m curious, what works better than Twitter for this kind of use case? Mastodon is better than Twitter for me, but it still is very slow.


I'm not on Mastodon. So that works?

Actually, this site is usually the best, and Facebook (the latter might reflect my audience).

Reddit subs tend to censor any attempt at self-promotion, which you can't blame them for, I guess. And StackExchange is the absolute worst. The level of asshole-ness there has to be seen to be believed.


Is it really asshole-ness?

YC/Reddit/SE being good/bad/worse for self-promotion seems to be 100% in line with the intended use of these sites.


I'm hardly the first person to single out SE / SO for trashing.

