Hacker News | deerpig's comments

The ideal line length for text layout is based on the physiology of the human eye… At normal reading distance the arc of the visual field is only a few inches – about the width of a well-designed column of text, or about 12 words per line. Research shows that reading slows and retention rates fall as line length begins to exceed the ideal width, because the reader then needs to use the muscles of the eye and neck to track from the end of one line to the beginning of the next line. If the eye must traverse great distances on the page, the reader is easily lost and must hunt for the beginning of the next line. Quantitative studies show that moderate line lengths significantly increase the legibility of text.

Web Style Guide: Basic Design Principles for Creating Web Sites, Patrick J. Lynch and Sarah Horton, 2nd edition, page 97.


I disagree: perseverance is not the same as goal-seeking for goal-seeking's sake, or at least it doesn't have to be.

Twenty years ago I set a thirty-year goal and literally bet my life on it. It wasn't until two years ago that I was reasonably sure it had a chance of success. I had to give up, and continue to give up, basically everything. No one funds very long-term research, so I had to learn to live on less than $200 a month: no healthcare, not being able to buy a new pair of glasses, only being able to buy clothes once a year or so. But I found a university that believed in my work, gave me an office and a flat on campus outside of Phnom Penh, and the freedom to follow the project wherever it went. No salary, but it's been enough.

Perseverance is what keeps you sane and gets you up in the morning. It gets you through whatever is thrown at you. I have not had a happy or easy life, but it has been a life full of purpose. I have seen and done things that most people couldn't imagine, things that have many times been too weird to be believable even as fiction.

Perseverance is a skill you practice every day for the rest of your life. But there are many small compensations. You find kindness and beauty and from time to time pleasure in unexpected places along the way.

With the university's help we will go to press with the first volume later this year, with a volume every year or so until it's finished on or about 2030. No one would willingly choose such a life; you fall into such things, and I was one of the stupid ones who took on such a ridiculously ambitious challenge. With a little luck I will live to see it through to the end.

So for some, purpose and perseverance trump happiness, comfort, and security. Though I admit I am likely an edge case.

I must say though, my wife and I sure as shit would like to live on a little more than $200 a month. :)


I think this is a definitional disagreement. If what you mean by perseverance is the skill you practice to keep yourself going (while staying aligned with the overall direction), then yes, I am all for it. But often goal-seeking for goal-seeking’s sake is imposed on others in the name of virtue, perseverance, etc. The old advice of “do something, don’t just stand there” is the example I am talking about.

But if you have found what you want to do and have set a reasonably long goal (30 years is dedication), then yes, humans are imperfect and there will be bits and pieces of motivation still missing. But not many are fortunate enough to have this goal arise from within.

I like the “shower thoughts” test. Are your goals what you think about in the shower? If yes, then that is probably what you want.

Another story along the same lines is “if you are trying to start a startup, get discouraged by someone, and consider not doing it, you probably weren’t meant to do it in the first place.” (Paul Graham, I think.)

You really can’t not have desires. You can have desires that you don’t want, or invisible, preconscious desires, but strictly speaking one person can’t be more desirous (or, loosely, more passionate) than another. It’s like saying an AI is half-invested in its utility function; it just doesn’t work like that.


> I have seen and done things that most people couldn't imagine, things that have many times been too weird to be believable even as fiction.

What? I'm really curious now. What is the project? What was the goal? Don't leave us hanging!


Curious too. A 30-year project with one printed volume a year sounds like an encyclopedia.

@deerpig, you could set up a Patreon or some other sort of crowdfunding. If a few people interested in your project gave you a few bucks here and there, it could have a big impact on your income.


I did some digging. He's the founder of the Chenla Institute, Center for Distributed Civilization. Also launching/launched Chenla FabLab.

His comment history suggests he's old-school, been around the block (China, Japan, Cambodia), has done work for SGI, and has an "Infinite Reality" machine at work (which is described as a "graphics supercomputer").

https://en.wikipedia.org/wiki/InfiniteReality


Can you share details on what you're working on? Or where we can follow the upcoming announcements about your work?


Also, it's better to persevere in your own mania... at least then the wound is self-inflicted. When you have to persevere for what society considers important, and it leads to nothing, the grief is heavy.


Why did you set this goal? Could you have picked any random 30 year goal and been as happy or was this something that naturally spoke to you? Are you still excited by it?


You can live on more than $200 a month. Give back and receive, bro, it's really easy. Come to my Nidhogg tournament and I'll give you the game.


Physical Review D is not a lightweight journal. You don't get published there unless you're being taken seriously by the scientific community.


And "stochastic superoptimization" sounds so much cooler than their original term: "super-duper random improvement."


Yahoo, when it was a single page with a couple of dozen links. I think that was before there was a browser for Windows -- I was using the OmniWeb browser on a NeXT Color workstation. I still miss the original Wired and later Wired News.


So it's sort of the opposite of Field of Dreams, which often results in vaporware. Just because people come doesn't mean they know what they want, or know how to express a problem or need that could be solved and turned into a viable product. It's a big risk to gather people with a human-interest story in order to sell them something you haven't made, or really even formulated, yet. Is your story compelling enough to keep interest long enough for you to build and launch a product? How will you find the time to keep them engaged while building the product? You may well luck out, and if you do, that's great for you. But it doesn't make it a good idea.


It's in elisp? Of course it is. It's emacs!


Great music. I want one.


There is a deep rabbit hole to go down once you start watching shiny videos of industrial machines. Once you've watched one, your recommendations will be full of them, including gems such as automatic boiled-sausage unwrappers and slicers: https://www.youtube.com/watch?v=ZNRfVfm8eFw



Site is down for me....


Try again. It just worked for me; I'm writing this two minutes after your post.


I have a slightly newer InfiniteReality R10000 sitting beside my desk here in Phnom Penh. A professor at Berkeley gave it to us, and somehow we managed to get it to this side of the world. It uses enough power to run an apartment block, and yes, it is noisy. I will eventually gut it and install a six-node Linux cluster inside.

I used these boxes in Hong Kong in the early 90s, then later in Japan in the late 90s, for a number of projects, and I was still using an Indy as my main desktop until 2001.

SGI created hardware that almost took your breath away; you knew you were seeing a future that not many people back then had the privilege to see in person. To me, having the box sitting next to me every day, with the "InfiniteReality" label on top, reminds me of those days when anything seemed possible and all of it was magical. I miss that sense of wonder and infinite possibilities...


> To me, having the box sitting next to me every day, with the "InfiniteReality" label on top, reminds me of those days when anything seemed possible and all of it was magical. I miss that sense of wonder and infinite possibilities...

In the last 3-5 years, there has been a clear revival of the cyberpunk subculture online. Many related hobbyist websites have appeared, a lot of new cyberpunk-inspired independent art, music, and games are being made, new communities are forming, etc.

Themes include a general nostalgia for the 80s, especially vintage computers, and also the early-90s, pre-Web-1.0 era.

The reason is clear to see: the lost future that never comes...


It’s coming. Birthing something like that requires patience. And it always takes longer than you think.

It will come slowly at first, and then all at once.


>The future is already here — it's just not very evenly distributed.

Nobody is denying it; indeed, we see many related developments.

However, it's not only about "the lost future that never comes"; I think it's also that people are increasingly alienated from the current state of computing. Primarily, it's about the lost mentality of the future.

Cyberpunk promised a future in which computing is the disruptive technology. Since 2006, ever-increasing clock speeds have come to a halt. Since 2013, the general performance of Intel processors has remained roughly constant. PC sales keep declining. No major breakthrough in practical operating systems has been made beyond Unix (http://herpolhode.com/rob/utah2000.pdf).

Cyberpunk promised a future in which megacorps and governments become increasingly oppressive to the population through technology; that has occurred as predicted. Cyberpunk also promised that "cyberspace" would be the electronic frontier where technologists reclaim liberty... Today, the damn web browser that runs all the crap written in Electron and JavaScript is hardly the "frontier", and neither are Facebook, Twitter, and the endless timeline of buzz, which make up 90% of Internet activity. Also, from 2000 to 2013, decentralization was killed, not built. More importantly, today we don't even know how to waste time on the Internet anymore (https://news.ycombinator.com/item?id=17068138). The electronic frontier has arguably materialized to some extent, but progress has been slow; even basic tech like HTTPS was only widely deployed after Snowden.

A future where anything seemed possible and all of it was magical is definitely gone. But a new generation of developers, armed with decentralization, P2P, cryptography, and trustless systems, whether they succeed or not, could bring the Internet back to its cyberpunk ideals, revive the dream, and set history moving forward again.


Links? I loved Neuromancer, Snow Crash and others of the time.


Check out Daniel Suarez's Daemon and Freedom. Not saying more, as I don't want to spoil anything.


More aesthetics-oriented: http://reddit.com/r/outrun/


The recent game EXAPUNKS is definitely like this, and worth playing, although it's assembly language programming puzzles so might be a sort of busman's holiday for a lot of HN readers.


Another example is Neocities, the old 90s GeoCities personal-webpage culture brought to life again.


Are you talking about vaporwave?


That's just one face of the movement.


Please don't gut rare vintage hardware to install commodity x86 parts.


10000% agree. Either sell it or even better, donate it to a computer history museum somewhere.


Yes, please don't do it; no primitive Intel-based system has ever come close to any SGI hardware in terms of elegance. It would be sacrilege.


I wonder if it's still possible to put together hardware that's a few years ahead of current high-end products.

Like, can you arrange, say, ten flagship graphics cards for realtime rendering? Do we have game engines that can scale to that number?


I don't think most software can scale to 10 GPUs out of the box. AFAIK, it would be hard to even find a motherboard that would fit them. However, a company could conceivably buy a workstation with 256 GB of RAM, two 32-core Threadripper CPUs, and four Nvidia 2080 Ti cards. That would definitely put you a few years ahead of the average "consumer PC" or next-generation console.

Sidenote: I've read that John Carmack and id Software liked to develop on workstations that were "ahead of the curve" that way. It gave them an edge, in that they were able to develop future games for hardware that didn't yet exist, but knowing that consumer PCs would eventually catch up.

I think what made these SGI computers really amazing is that, at the time, there was no such thing as accelerated 3D graphics in the consumer market (or much real-time 3D, for that matter). They also had a cool Unix operating system with a UI that was way ahead of anything you could get on a consumer PC. I can also imagine that it was a much more comfortable development environment than, say, MS-DOS, which didn't even have multitasking.


That's a good point, but it undersells them. My favorite thing about these machines was the single system image, which removed bottlenecks and complexity for developers at the same time. The PCs were using slow or redundant buses to connect high-speed components. SGI removed the redundancies, used fast interconnects (GB/s), made them low-latency (microseconds vs. milliseconds), and NUMA-enabled them. The last part meant that sending data to other nodes didn't require middleware like MPI: you might just do a load or store instruction, as on a single node. The hardware took care of communication via cache coherency. You did have to design for good locality to minimize moving data across nodes, though.
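
To make the load/store point concrete, here is a minimal sketch of what "no middleware" looks like from the programmer's side. It assumes Linux with libnuma (link with -lnuma), which is purely illustrative and not how the SGI boxes were actually programmed; the node number is made up.

    /* Sketch: on a cache-coherent NUMA machine, remote memory is still just
     * memory -- a plain store replaces an explicit MPI send/receive.
     * Assumes Linux + libnuma; node 1 is an arbitrary example. */
    #include <numa.h>
    #include <stdio.h>
    #include <stdlib.h>

    int main(void) {
        if (numa_available() < 0) {
            fprintf(stderr, "no NUMA support on this machine\n");
            return 1;
        }

        /* Allocate a buffer whose pages live on node 1... */
        size_t n = 1 << 20;
        double *remote = numa_alloc_onnode(n * sizeof(double), 1);
        if (!remote) { perror("numa_alloc_onnode"); return 1; }

        /* ...then write to it from wherever this thread happens to run.
         * No message passing: the interconnect and cache-coherency hardware
         * move the cache lines. Locality still matters for performance. */
        for (size_t i = 0; i < n; i++)
            remote[i] = (double)i;

        printf("remote[42] = %f\n", remote[42]);
        numa_free(remote, n * sizeof(double));
        return 0;
    }

The "communication" is just an ordinary store; the hardware does the rest, which is why locality, rather than plumbing, becomes the thing you tune.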

Add the other features like reliability (especially hot-swapping), serviceability, and security (Trusted IRIX), and you had some incredible machines. I always wanted inexpensive hardware with hot-swap, RAID, and something like NUMAlink connecting it. I never quite got that. One company did make a NUMA system for AMD and Intel:

https://www.numascale.com/Scale-up-products/


I guess that's the difference between a workstation, designed for performance and versatility before cost, and a PC, which is made to be affordable first. When the PC industry started, it was very much about repurposing whatever low-cost CPUs and off-the-shelf components were available, and finding ways of packaging this into a semi-usable machine for less than $1000. Things have changed quite a bit since, but much of that do-it-cheap, rushed-to-market type of compromise is still with us.


Sure, the PCs that started the PC industry — things like the Apple I and the MITS Altair — were indeed "about repurposing whatever low-cost CPUs and off-the-shelf components were available, and finding ways of packaging this into a semi-usable machine for less than $1000." But, long before 1993, most CPUs and components used in PCs were being produced specifically for PCs, with uses in things like lab equipment, industrial control, and workstations a rather smaller secondary market.


The company I work for builds machines like that for film post-production. I’ve been writing software for one with 8 Titan Xp GPUs, and with its disk arrays and GPU expansion chassis it’s about the size of an Onyx, and it draws considerably more power :-)

Unfortunately I wouldn’t say it feels like the future, more like a normal CentOS Linux desktop.

You’ll struggle to find a PC whose BIOS can handle much more than that, too.

We used to build clusters for the same thing in the past; that was largely standard supercomputing stuff, but very similar to how the InfiniteReality machines were used. I believe our software once ran on Onyx machines in the dim & distant past.

So in short I wouldn’t say having loads of GPUs is enough to make it feel futuristic.


I would approach it like this:

1) find some graphics problems which people say are not possible on any near-term hardware

2) study the algorithms and identify low level calculations which, if you could do orders of magnitude more of them, would allow you to solve the problem.

3) get a bunch of FPGAs and try to design a machine which can (very slowly) run that architecture

4) once you’ve got it working, slowly replace the FPGAs with ASICs

5) build a box with 16-64 of everything.

I would avoid polygons, since current architectures are all extremely good at filling polygons. SDFs and raytracing are where you may find the “not on current gen” problems.
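
For anyone who hasn't played with SDFs, the core loop is tiny. Here is a purely illustrative sphere-tracing sketch in plain C (CPU-side, no particular hardware assumed): the scene is a signed distance function, and each ray steps forward by the distance the SDF reports until it reaches the surface.

    /* Minimal sphere-tracing sketch: the scene is a signed distance function
     * (a unit sphere at the origin) and each ray steps forward by the
     * distance the SDF reports.  Purely illustrative CPU code. */
    #include <math.h>
    #include <stdio.h>

    typedef struct { float x, y, z; } vec3;

    static float length3(vec3 v) { return sqrtf(v.x*v.x + v.y*v.y + v.z*v.z); }

    /* SDF of a sphere of radius 1 centered at the origin. */
    static float scene_sdf(vec3 p) { return length3(p) - 1.0f; }

    /* March a ray from `origin` along normalized `dir`; return the hit
     * distance, or -1 if nothing was hit within max_dist. */
    static float raymarch(vec3 origin, vec3 dir, float max_dist) {
        float t = 0.0f;
        for (int i = 0; i < 128 && t < max_dist; i++) {
            vec3 p = { origin.x + dir.x*t, origin.y + dir.y*t, origin.z + dir.z*t };
            float d = scene_sdf(p);
            if (d < 1e-4f) return t;   /* close enough: surface hit */
            t += d;                    /* safe step: the SDF bounds free space */
        }
        return -1.0f;
    }

    int main(void) {
        vec3 eye = { 0, 0, -3 };
        vec3 dir = { 0, 0, 1 };
        printf("hit at t = %f\n", raymarch(eye, dir, 100.0f));  /* expect ~2.0 */
        return 0;
    }

The appeal for exotic hardware is that the inner loop is just distance evaluations, a very different workload from filling triangles.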


Take a look at the game Claybook. Fully raytraced visuals at 60fps against a dynamically updated SDF scene representation, complete with fluid sim for the main game mechanics. All on current hardware, including stuff most people consider “slow” these days (Xbox One).


> SDFs and raytracing

An easy one would be: have each GPU raytrace a (say) 320x240 scene, each offset by a fraction of a pixel[0] or a multiple of a screen from the others, then have a final GPU stitch them together into a full-res video.

0: If you do this with 60x1080 resolution, you might be able to replace the final GPU with a dumb hardware multiplexer, though that would make compositing painful at best.
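
As a rough sketch of the bookkeeping for the fraction-of-a-pixel variant (nothing GPU-specific here; render_frame is a hypothetical stand-in for whatever each device would actually run): each renderer gets a distinct half-pixel offset, and the "stitch" is just interleaving the low-res frames into one double-resolution image.

    /* Four renderers on a 2x2 sub-pixel grid, interleaved into a 2Wx2H image.
     * render_frame() is a made-up placeholder, not any real API. */
    #include <stdlib.h>
    #include <string.h>

    #define W 320
    #define H 240
    #define N_GPUS 4   /* 2x2 sub-pixel grid, purely for illustration */

    /* Hypothetical per-device renderer: fills a W*H greyscale buffer for a
     * camera jittered by (dx, dy) fractions of a pixel. */
    static void render_frame(int gpu, float dx, float dy, float *out) {
        (void)gpu; (void)dx; (void)dy;
        memset(out, 0, W * H * sizeof(float));   /* placeholder image */
    }

    int main(void) {
        float *frames = malloc((size_t)N_GPUS * W * H * sizeof(float));
        float *full   = malloc((size_t)(2 * W) * (2 * H) * sizeof(float));

        /* Each renderer gets a distinct half-pixel offset. */
        for (int g = 0; g < N_GPUS; g++) {
            int gx = g % 2, gy = g / 2;
            render_frame(g, gx * 0.5f, gy * 0.5f, frames + (size_t)g * W * H);
        }

        /* "Stitch": a sample offset by half a pixel lands on the odd
         * rows/columns of the full-resolution image. */
        for (int g = 0; g < N_GPUS; g++) {
            int gx = g % 2, gy = g / 2;
            for (int y = 0; y < H; y++)
                for (int x = 0; x < W; x++)
                    full[(size_t)(2 * y + gy) * (2 * W) + (2 * x + gx)] =
                        frames[(size_t)g * W * H + (size_t)y * W + x];
        }

        free(frames);
        free(full);
        return 0;
    }

Averaging the frames instead of interleaving them would give antialiasing at the original resolution; either way the final stage is pure data movement, which is why a dumb multiplexer is even thinkable.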


That’s literally what we used to do for our cluster colour grading systems for films.

We had hardware that would merge DVI from up to 8 GPUs, in separate nodes, and produce a single image.


Step 0) is: be Intel or AMD. Nobody else has the money to do what you're talking about.


As others have said, the whole point of these machines is that they were NOT simply creative combinations of current high-end commodity products. The whole system architecture, from the CPU, memory, I/O, to the graphics hardware, was custom designed. That's something you really don't see very much of these days.


Not in PCs, but I wonder if the iPhone counts as a largely custom-designed computer. They do custom silicon for a lot of the system and get much higher performance than commodity phones as a result. Perhaps it's the spiritual successor to these high-end workstations, in an odd way.


I think Linus mentioned something along those lines: how the work on using Linux for supercomputers became the foundation of Linux for phones. It's a point Tanenbaum makes in Modern Operating Systems: what is old is new, and what is new is soon to be old; as the bottlenecks move, ideas that were abandoned get resurrected.


And adding to the magic, SGI machines were something most people would never get their hands on. A maxed-out graphics workstation today is still just an "ordinary PC", only more expensive. A (big) upgrade, but not something with magic dust.


You could, but it's unlikely to impress as much; some fundamental transitions only happen once. I think in this case the big advance was realtime 3D graphics with lighting and texture mapping. Perhaps a really fancy VR setup might have a similar effect, although to me it's less obvious that VR will be something every computer has, the way it seemed obvious in the early 90s that 3D graphics was something every computer would get much better at.


Not quite 10, but here’s a blog post on building a 7-GPU machine for 3D path-traced rendering: http://tomglimps.com/7_gpu_workstation-1000_octanebench/


Supposedly you could play video games on one of Nvidia's more recent data center products:

https://www.nvidia.com/en-us/data-center/hgx/

It shows up to the host computer as one really big GPU. Of course, you're going to get worse performance than from a single Titan V, since a single card can already handle any game and there's inevitably going to be latency added by doing the work over NVLink/NVSwitch. Those massive GPU products are targeted at offline rendering or machine-learning applications, not so much realtime simulation.


Do you have contact information?

