Hacker News | hnlmorg's comments

I owned one of these lights before it went viral and it was a nightmare to install. The thing doesn’t screw into the ceiling like every other light fixture does. Instead you install a hook and dangle the damn thing off the hook. Which means the plastic surround never sits flush with the ceiling.

Even when you do finally get it flush after several painful iterations of hanging it, gravity stretches the cord causing the base to come slightly away from the ceiling again.

If you’re OCD like me, it made the light a horrible reminder of that OCD. So in the end I gave up on the light.

Pity because it’s a really cool looking light.


Never seen a light fixture that screws into the ceiling.

Some fixtures screw into an electrical box attached to the studs in the ceiling. The IKEA SIMRISHAMN pendant does that, for example.

Looking at the Ikea US website, the Simrishamn and PS 2014 seem to have similar solutions: a plate that screws into an electrical box and provides a hook (the lowest common international denominator).

What do you do if you want to move a ceiling light a bit to the side? Do you install an entire new electrical box?


If those electrical boxes are anything like the ones in the video I shared, then they’re trivial to move.

You don’t actually need the box though. In fact they weren’t even available in my previous two homes. It’s really more a convenience thing than anything.

But again, this is assuming we are talking about the same thing (region differences and all).


In the US, NEC and most local codes (which are often based on some version of the NEC) require that connections be made inside a box. This is largely because connections are the most likely place for an electrical fire to start and the box helps contain it.

I can’t speak for where you are, but it’s the norm in the UK. E.g. around 4:25 in this video https://youtu.be/WZizlnLfLks?si=LjRI1EWIHhn6Ktgx

When I bought the Ikea light, it was just hook and no way to fix the plastic surround to the ceiling.

Ikea might have updated the light since then though. As I said before, I got the light when it was new, long before it went viral, and ikea might have tweaked the design since.


That doesn’t mean everyone is familiar with it. For example, I’ve been around since internet slang first developed a life of its own, and yet I wasn’t immediately familiar with SJW either.

By the time you commented you could have at least searched for the acronym or asked AI.

I wasn’t the one who asked.

But even if I were, you’re not accounting for the cumulative benefit of saving others from having to research the same acronym.


Let’s get real, they can search. HN doesn’t have a repo of acronyms and this isn’t a technical document where you need to spell out the acronym on first use.

It doesn’t matter if it’s a technical acronym or not.

You’re making a lot of effort here to claim that people should already know this when the evidence here (of people asking what it means) demonstrates that it’s not universal.

For the record, I don’t actually mind people not spelling it out on first use if the acronym is guessable from the context of the comment (which, ironically, a lot of technical acronyms are). But in the case of the OP, you wouldn’t know what SJW was unless you already knew what a SJW was. So it’s not an unreasonable request from the GP. And frankly, the comments criticising them for asking are really unfair. They have just as much right to ask as you do to say “it’s common”, and the OP chose to use that acronym in the first place. Let’s all just be nice rather than moaning that someone didn’t memorise some specific piece of tribalism before coming to HN.


In 2025, most online users have learned how to look things up using... the internet.

Of course, but if everyone does it, it is very inconvenient to read and in some cases leaves unnecessary space for misunderstanding. Usually, acronyms are followed by the full wording the first time they are mentioned.

> Usually, acronyms are followed by the full wording the first time they are mentioned.

I'm sure they did at the time this (SJW) acronym got popular. That was maybe 10-15 years ago.


SJW wasn’t a popular acronym 10 years (let alone 15) ago in the online communities I hung around in. ;)


Your questions are already answered in the article:

1. The items were on display in his bedroom.

2. The quantities were so small that they were deemed safe to eat.

This sounds like more of a case of the border force wanting to raise awareness than of any actual danger being presented.


The article only said that his solicitor (lawyer?) described the quantities as being so small they were safe enough to eat.

I read some more about it (Guardian) https://www.theguardian.com/australia-news/2025/apr/11/scien... and entirely agree with you that the border force over-reacted, and could have spent the money and resources more effectively than by pursuing this.

Also, via Guardian, this attitude is demeaning and depressing:

>"At a sentencing hearing in March, the lawyer described Lidden as a “science nerd” who committed the offences out of pure naivety. “It was a manifestation of self-soothing retreating into collection; it could have been anything but in this case he latched on to the collection of the periodic table.”"


I’m going through exactly this joy with a client right now.

“We need SQL injection rules in the WAF”

“But we don’t have an SQL database”

“But we need to protect against the possibility of partnering with another company that needs to use the same datasets and wants to import them into a SQL database”

In fairness, these people are just trying to do their job too. They get told by NIST (et al) and cloud service providers that a WAF is best practice. So it’s no wonder they’d trust these snake oil salesmen over the developers who are asking not to do something “security” related.


If they want to do their job well, how about adding some thinking into the mix, for good measure? It would also be good if they actually knew what they were talking about before trying to tell the engineers what to do.

> If they want to do their job well, how about adding some thinking into the mix, for good measure?

That’s what the conversation I shared is demonstrating ;)

> It would also be good if they actually knew what they were talking about before trying to tell the engineers what to do.

Often the people enforcing the rules aren’t supposed to be security specialists. Because you’ll have your SMEs (subject matter experts) and your stakeholders. The stakeholders will typically be project managers or senior management (for example) who have different skill sets and priorities to the SMEs.

The problem is that when it comes to security, it’s a complicated field where caution is better than a lack of caution. So if a particular project does call for enhanced security practices, it becomes a ripe field for snake oil salesmen.

Or to put it another way: no company would get sued for following security theatre but they are held accountable if there is a breach due to not following security best practices.

So often it doesn’t matter how logical and sensible the counter-argument is, it’s automatically a losing argument.


They don't want to do their job well. They want to look like they're doing their job well, to people who don't know how to do the job and whose metrics are completely divorced from actual merit.

I’ve just installed Zed based on your recommendation and I’m already impressed.

It’s fast, the interface is distraction free and it already has support for all the languages I use regularly. Even Terraform support, which is notoriously hard to get right, is better than the current “best” in VSCode.

Thanks for the recommendation


Glad you like it! It’s a proper native app (no Electron) and super responsive. I truly enjoy using it. And yes, the language support (via Treesitter and LSPs) is fantastic.

“The year of the Linux desktop” has always been a stupid statement because it never defines what the success criteria are.

For example, we now have first class games support via Proton. First class application support via Electron and other web technologies. Linux used in schools via Chromebooks. Etc

Linux was never going to be a Windows-killer but I’m constantly amazed at just how easy it is to use vanilla GNU Linux in a variety of previously closed domains, and how Linux has taken over as the de facto base for many commercial systems too (phones, tablets, Chromebooks, smart TVs, set top boxes, etc.).

There’s also plenty of OEMs that support and even ship Linux systems. And that would have been unthinkable to anyone who lived through the 90s and saw how MS penalised OEMs and retailers for shipping non-MS OSs.

So at what stage do people say “Linux desktop has picked up”?


The Linux kernel is a beast of an engine at the heart of all sorts of things, from small to large.

But the "desktop" itself refers to the GNU Linux userspace, which has plenty to criticize it for (with that said, I personally find Windows to be worse on many counts). Desktop OSs are a generation behind mobile OSs, and they have a really hard time making that jump, with OSX possibly being the closest to it.

They have a terribly insecure "security" model: compare the number of vulnerabilities per user for a desktop OS vs mobile, especially considering that something like the Linux desktop is barely targeted compared to the billions of Android users. Your user account runs your applications. This worked in the age of huge servers with lots of terminal users connected, where the processes running for (and as) the user were readily inspectable, due to their low number and being directly started by the user. But with modern applications we have tens of thousands of threads and processes running simultaneously. The processes are run by me (and thus can do everything I can), but not directly for me. The sane thing to do would be to run them in a sandbox, which is basically what Android does: it runs them as generated "system" users and has a well-defined IPC architecture to cut holes only where necessary.


The year of Linux desktop already happened in 2006.

That's when I switched to it full-time on my desktop and never looked back. It's the only success criteria I care about :)


Do people still use desktops?

To answer your question.


"Desktop Linux" includes laptops, so yes.

I agree with your overall point but the following comment is unnecessary:

> People working on "Linux" phones as anything more than a diversion (why not play Factorio instead?) are wasting their time.

People are free to spend their free time however they want. Some people find building things, whether it’s furniture or software, more enjoyable than playing computer games or watching TV.


You’re not looking very hard then because there are loads of new games released both with VR support and also exclusively for devices like the Oculus Quest.

What we’ve seen less of is AAA games bolt on VR support as an afterthought - and the reason for that is because it’s almost always a terrible way to play a game that was originally designed to be played with a keyboard and mouse, or traditional game controller.


Literally the first line of the README explains why:

> This folder contains the programs found in the March 1975 3rd printing of David Ahl's 101 BASIC Computer Games, published by Digital Equipment Corp.


Pipelining is just syntactic sugar for nested function calls.

If you need to handle an unhappy path in a way that isn’t optimal for nested function calls then you shouldn’t be nesting your function calls. Pipelining doesn’t magically make things easier nor harder in that regard.

But if a particular sequence of function calls does suit nesting, then pipelining makes the code much more readable because you’re not mixing right-to-left syntax (function nests) with left-to-right syntax (i.e. your typical language syntax).


I think they are talking about nested loops, not nested function calls.

Nested loops aren’t pipelining. Some of the examples make heavy use of lambdas, so they do have nested loops happening as well, but in those examples the pipelining logic is still the nesting of the lambda functions.

Crudely put, in C-like languages, pipelining is just a way of turning

  fn(fn(fn()))
Where the first function call is in the inner, right-most, parentheses,

into this:

  fn | fn | fn
…which can be easily read sequentially from left-to-right.
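To make that concrete, here’s a minimal Rust sketch (the `Widget` type and field names are my own invention, not from the article): the chained form and an explicit loop produce exactly the same result, because the chaining is only sugar over function calls.

```rust
// Hypothetical Widget type, invented purely for illustration.
pub struct Widget {
    pub alive: bool,
    pub id: u32,
}

// Pipelined form: reads left-to-right, each adapter feeding the next.
pub fn ids_pipelined(data: &[Widget]) -> Vec<u32> {
    data.iter()
        .filter(|w| w.alive)
        .map(|w| w.id)
        .collect()
}

// The same logic as an explicit loop: identical behaviour, noisier shape.
pub fn ids_looped(data: &[Widget]) -> Vec<u32> {
    let mut out = Vec::new();
    for w in data {
        if w.alive {
            out.push(w.id);
        }
    }
    out
}
```

Both functions return the same Vec; the pipeline just spares you from naming the intermediate state.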

Pipelining replaces several consecutive loops with a single loop, doing more complex processing.

Pipelining doesn’t do anything with iterating. It’s entirely about linking nested functions.

What you’re looking at is loops defined inside lambda functions. Pipelining makes it much easier to use anonymous functions and lambdas. But it doesn’t magically solve the problem of complex loops.


It kind of is something related to loops if the language supports object iterator interfaces (they bridge OOP to classical constructs like for/foreach). Or maybe even generators.

It does not solve it magically, but it does give the programmer options to coalesce different paradigms into one single working implementation.


Sure, but again, you’re talking about using either lambda functions or object methods.

The crux of the “magic” with pipelining is chaining functions. How those functions are composed and what they do is a separate topic entirely.


It's only magic if you don't understand how it works.

To me it is awkward to describe but simple to understand. Lucky me I have no intention of describing it.


> It's only magic if you don't understand how it works.

That’s why I used scare quotes around the term ;)

> To me it is awkward to describe but simple to understand.

It’s not awkward to describe though. It’s literally just syntactic sugar for chaining functions.

It’s probably one of the easiest programming concepts to describe.

From our conversation, I’m not sure you do understand it, because you keep bringing other tangential topics into the fold. Granted, I don’t think the article does a great job of explaining what pipelining is, but then I think its point was more to demonstrate cool syntactic tricks you can pull off when writing functions as a pipeline.

edit: just realised you aren't the same person who wrote the original comment claiming pipelining was about iteration. Apologies for getting you mixed together.


Pipelining, as discussed in the linked article, is about iteration. Just look at the code example at the beginning of the article:

  data.iter()
    .filter(|w| w.alive)
    .map(|w| w.id)
    .collect()
is one loop, as opposed to

  collect(map(filter(iter(data), |w| w.alive), |w| w.id))
which is three loops.

Did you notice four letters 'i', 't', 'e' and 'r' in the code, followed by two round brackets? They mean "iterator".


If you're going to be snarky then at least get your facts right.

`iter` is a method of `data`. And do you know what a method is? It's a function attached to an object. A FUNCTION. Pipelining is just syntactic sugar around chaining functions.

You even proved my point when you quoted the article:

    collect(map(filter(iter(data), |w| w.alive), |w| w.id))
Literally the only thing changing is the syntax of the code. You've got all of the same functions being called, with the same parameters and in the same order.

The article itself makes no mention of this affecting how the code is executed either. Instead, it talks about code readability.

In fact the article further proves my point when it says:

> You can, of course, just assign the result of every filter and map call to a helper variable, and I will (begrudgingly) acknowledge that that works, and is significantly better than trying to do absurd levels of nesting.

What it means by this is something like the following:

  list = iter(data)
  list = map(list, |w| w.toWingding())
  list = filter(list, |w| w.alive)
  list = map(list, |w| w.id)
  result = collect(list)
While I do have some experience in this field (having written a pipeline-orientated programming language from scratch), I'll cite some other sources too, so it's not just my word against yours:

+ Wikipedia: https://en.wikipedia.org/wiki/Pipeline_(computing) (no mention of iteration, just chaining functions and processes)

+ JavaScript proposal: https://www.geeksforgeeks.org/javascript-pipeline-operator/ (it's very clear how pipelining works in this guide)

+ Pipeline macros in LISP: https://blog.fugue88.ws/archives/2022-03/Pipelines-in-Lisp (again, literally just talking about cleaner syntax for nested functions)

The reason the article focuses on map/reduce type functions is because it's a common idiom for nesting commands. In fact you'll be familiar with this in Bash:

    cat largefile.txt | sort | uniq --count
(before you argue about "useless use of `cat`" and other optimisations that could be made, this is just an example to demonstrate my point).

In here, each command is a process but analogous to a function in general-purpose programming languages like Rust, LISP, JavaScript, etc. Those UNIX processes might internally loop through the contents of STDIN as a LF-delimited list, but that happens transparently to the pipeline. Bash, when piping each command to the next, doesn't know how each process will internally operate. And likewise, in the general-purpose programming language world, pipelines in LISP, Rust, JavaScript (et al) don't know nor care how each function behaves internally with its passed parameters, just so long as the output data type is compatible with the input type of the next function -- and if it isn't, then that's an error in the code (i.e. a compile-time error in Rust or a runtime error in JavaScript).

So to summarise, pipelining has nothing to do with iteration. It's just syntactic sugar to make nested functions easier to read. And if the examples seem to focus on map/reduce, it's just because that's a common set of functions you'd want to chain and which are particularly ugly to read in nested form. I.e. they're an example of functions called in a pipeline, not the reason pipelines exist, nor proof that pipelines themselves have any internal logic around iteration.
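As a sketch of that claim (the `Pipe` trait below is hypothetical, written just for this comment; it is not a std API): you can pipeline plain scalar functions with no iteration anywhere in sight, because the chaining mechanism only threads a value through a series of functions.

```rust
// A tiny, hypothetical pipe helper: `x.pipe(f).pipe(g)` is just
// sugar for `g(f(x))`. No loops, no iterators involved.
pub trait Pipe: Sized {
    fn pipe<T>(self, f: impl FnOnce(Self) -> T) -> T {
        f(self)
    }
}
impl<T> Pipe for T {}

pub fn double(x: i32) -> i32 { x * 2 }
pub fn increment(x: i32) -> i32 { x + 1 }
```

`5i32.pipe(double).pipe(increment)` reads left-to-right and is exactly equivalent to the nested `increment(double(5))`.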


Yeah, you are right, it's about syntactic sugar; I didn't read the article beyond the first two examples.

Pipelines are about iteration, of course. And they do have internal logic around iteration.

  cat largefile.txt | sort | uniq --count
is an excellent example. While cat and uniq process their input line by line, sort has to buffer it and allocate additional structures.


The pipeline isn’t doing any of that though. The commands are. The pipeline is just connecting the output of one command to the input of the other.

Iteration is about looping, and if the pipeline in the above example were some kind of secret sauce for iteration, then the commands above would fork multiple times, but they don’t.

