For me, Brian Westley is the greatest code poet of all. His poem was a winner in the 1990 round of the International Obfuscated C Code Contest under "Best layout - poetic exchange between lovers":
Half-Life 2 ran perfectly under WINE. At the time I assumed that it was a win for WINE, but with hindsight (and typing this out makes me feel so naive!) was HL2 optimized for WINE in order to make WINE more successful? Of course it must have been!
It’s a shame the connotations are negative because this ironic comment otherwise works quite well: This large wooden horse is such an extravagant gift, it has to have some subversive purpose, right?!
It ran fine as in not crashing, but you were limited to the DX8 (or maybe DX9a) feature set, which cut out many visual effects, and there were significant performance issues: WINE's reliance on translating DirectX to OpenGL, no offloading of CPU graphics "command lists" (or whatever they're called) to a dedicated thread, and the disjointed state of Linux graphics at the time... It took until about 2013 for wine-staging to run HL2 properly, with multi-core rendering and all the bells and whistles, but performance was still inferior.
I think Linux graphics were only good when paired with the right version of Red Hat and NVIDIA drivers on a supported workstation dedicated to running proprietary 3D/VFX software packages as an alternative to the aging SGI workstations. Every other use case was pretty rough... until about 2017, when things began to change massively, and finally now, when you can actually get a better experience than freaking Windows in most use cases.
Honestly I feel like anything beyond 5 megapixels per frame is pushing beyond reasonable expectations with 35mm. This is certainly the case with any kind of available-light or high-speed work in the silver-halide process, the area where I figure most people are going to be using this device. Lab work in C41 and E6 is definitely possible at home but must account for a single-digit percentage of the home analogue market.
A 4000 DPI scan of 135 gives you 21 megapixels. So 36MP with a good lens will easily resolve just as much detail. There is not 60-70MP of information in a 4000 DPI scan, period.
For most films, anything beyond 4000 DPI is just going to help resolve the grain particles or dye-cloud shapes. You'd have to be shooting slow, fine-grained B&W with the best lenses to need more.
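For anyone who wants to check the arithmetic, here's a quick back-of-the-envelope sketch in plain Python (the frame size and DPI values are just the ones discussed above):

    # Megapixels captured when scanning a 36x24mm (135 format) frame.
    MM_PER_INCH = 25.4

    def scan_megapixels(width_mm, height_mm, dpi):
        px_w = width_mm / MM_PER_INCH * dpi
        px_h = height_mm / MM_PER_INCH * dpi
        return px_w * px_h / 1e6

    for dpi in (2000, 4000, 8000):
        print(f"{dpi} DPI -> {scan_megapixels(36, 24, dpi):.1f} MP")
    # 2000 DPI ->  5.4 MP
    # 4000 DPI -> 21.4 MP
    # 8000 DPI -> 85.7 MP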
"…beyond 5 megapixels per frame is pushing beyond reasonable expectations with 35mm."
Well, as I mentioned elsewhere, old-fashioned Kodachrome resolves ~100 lines/mm, some newer color emulsions are considerably higher, and of course B&W ones have even higher resolutions.
Given that a 35mm frame is 36x24mm, even Kodachrome achieves 8.64 megapixels. OK, let's allow for an overgenerous Kell factor of, say, 0.8; this figure drops to ~6.9 megapixels. Given the ready availability of emulsions with higher resolutions, especially the best B&W ones, a figure well in excess of 5 megapixels is realizable in practice.
Of course, that doesn't take into account the image chain as a whole (lenses, displays, compression, etc.), which would reduce the effective resolution. That said, these days the typical image chain can easily sustain much more than 5 megapixels of throughput before bandwidth-limiting, so the effective Kell derating factor can be kept quite small.
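Spelling that calculation out (treating "lines/mm" as resolvable pixels per mm, as the figures above do):

    # Kodachrome at ~100 resolvable lines/mm over a 36x24mm frame,
    # then derated by a Kell factor of 0.8 as above.
    lines_per_mm = 100
    raw_mp = (36 * lines_per_mm) * (24 * lines_per_mm) / 1e6   # 8.64 MP
    kell = 0.8
    effective_mp = raw_mp * kell                               # ~6.9 MP
    print(f"raw: {raw_mp:.2f} MP, after Kell factor: {effective_mp:.2f} MP")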
I think I see what you mean. It’s the difference between having an image showing the shape and texture of each film grain, and an image which looks like what I saw in the camera and which isn’t going to be any sharper. The former has value but the latter was always good enough for me and, surprisingly, rather low in resolution compared to subsequent DSLRs and mirrorless cameras I bought in the 2010s.
Ilford Delta 400 pushed two stops to 1600 ASA in a 1970s Asahi Pentax SP1000 was always going to produce… artistic results, requiring as much imagination as acuity to appreciate the subject. (Read: see past the blur.)
For me, strictly immutable and side-effect free programming was a forcing function* to make me really decompose problems into isolated parts. Only when I could do that could I truly say I understood the problem (and its distinct sub-problems) in full.
In that sense, Erlang isn't really different from any other functional programming language, except that it also ships with a lot of actor / message-passing concurrency dogma, which solves a different and equally delightful set of problems (low-bug concurrency) at the same time.
So it’s a double hit of two great ideas in one paradigm.
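A minimal sketch of the shape of that idea, in Python rather than Erlang (obviously nothing like OTP, just an illustration: the actor owns its state, and the outside world can only send it messages, never mutate it directly):

    import threading, queue

    def counter_actor(inbox):
        # All state is local to the actor; no locks needed, because the
        # only way in is via the mailbox.
        count = 0
        while True:
            msg, reply_to = inbox.get()
            if msg == "incr":
                count += 1
            elif msg == "get":
                reply_to.put(count)
            elif msg == "stop":
                return

    inbox = queue.Queue()
    threading.Thread(target=counter_actor, args=(inbox,), daemon=True).start()

    inbox.put(("incr", None))
    inbox.put(("incr", None))
    reply = queue.Queue()
    inbox.put(("get", reply))
    print(reply.get())  # 2
    inbox.put(("stop", None))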
Joey et al.'s work ports the SPICE astronavigation library to the on-board ARM Cortex-M0, giving you a complete orrery in a classic F91W "Terrorist" watch. It is fantastic!
I really love being able to get an estimate of when and where The Moon will rise, or where Saturn is right now. Timekeeping and astronomy are two of the oldest forms of science we have and I love being in constant touch with them via the newest science we have: computers! (The source is all open and available for you to hack on, including a nifty emulator.)
It’s interesting how those watches are both objectively expensive for something nobody really needs and at $1,000 or less, dirt cheap by the standard of expensive watches.
The astronomy face is superior as it calculates the altitude and azimuth of the selected object based on your programmed location and, of course, the current time:
What would be even better would be to acknowledge that altitude is somewhat moot when all these objects are near the ecliptic plane (unsurprisingly, Jupiter at Jovian noon is roughly where the Sun was at lunchtime!) and instead cycle through the azimuths of each object in the sky, in the order in which they are visible.
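For the curious, the conversion itself is compact. Here's a rough Python sketch of the textbook RA/Dec-to-alt/az math (I'm not claiming this is how the watch firmware does it; it ignores refraction and similar corrections, and the Saturn coordinates in the example are made up):

    import math
    from datetime import datetime, timezone

    def greenwich_sidereal_hours(t):
        # Approximate Greenwich mean sidereal time, in hours,
        # from days elapsed since the J2000 epoch.
        j2000 = datetime(2000, 1, 1, 12, tzinfo=timezone.utc)
        days = (t - j2000).total_seconds() / 86400.0
        return (18.697374558 + 24.06570982441908 * days) % 24.0

    def alt_az(ra_hours, dec_deg, lat_deg, lon_deg, t):
        # Hour angle of the object, then the standard spherical-triangle
        # conversion to altitude and azimuth (measured from north).
        lst = (greenwich_sidereal_hours(t) + lon_deg / 15.0) % 24.0
        ha = math.radians((lst - ra_hours) * 15.0)
        dec, lat = math.radians(dec_deg), math.radians(lat_deg)
        alt = math.asin(math.sin(dec) * math.sin(lat)
                        + math.cos(dec) * math.cos(lat) * math.cos(ha))
        az = math.atan2(-math.sin(ha) * math.cos(dec),
                        math.sin(dec) * math.cos(lat)
                        - math.cos(dec) * math.sin(lat) * math.cos(ha))
        return math.degrees(alt), math.degrees(az) % 360.0

    # e.g. a made-up RA/Dec for Saturn, as seen from London right now:
    print(alt_az(23.3, -6.0, 51.5, -0.1, datetime.now(timezone.utc)))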
The CGW-50 Cosmo Phase is impressive on that front for displaying realtime planet positions back in 1989; now, of course, it's just another watch face to choose from on an Apple Watch.
Still I bet the Casio works offline longer :p
I'll have to revisit the Sensor Watch; I'd love to hit a button for sunrise and moonrise.
Iosevka is a beautiful font indeed. The condensed look of Myna was inspired by Iosevka. I saw it once in a coding demo and decided to make it condensed. The predecessor of Myna (called Hera, available on my profile) was just a customised version of Source Code Pro (and is non-condensed, just like Source Code Pro).
Besides being a neat font in its own right, Iosevka allows for custom builds with different settings, selection from a bunch of glyph variants, and custom ligature choices. It's pretty incredible.
Thank you for the detail that Iosevka is a form of the name "Joseph." I've used this font for years and it never clicked for me, nor did the correct pronunciation, which it turns out was always listed in the readme.
I’ve always had a bit of a chip on my shoulder about HIBP’s switch to charging for domain searches. It felt a bit like those travel visa scalpers who charge 50 CURRENCY_UNIT to file an otherwise gratis form on your behalf.
Law enforcement should provide this kind of service as a public good. They don’t, but if you do instead, I don’t think it’s cool to unilaterally privatize the service and turn it into a commercial one.
I voted with my feet but this post feels like a good enough place to soapbox a bit!
Nano is quite a venerable piece of software with the initial implementation shipping as pico, the text editor for the pine mail client, back in 1992. Tens of thousands of students at a few universities will have been introduced to it as their very first email client.
The pine authors fell foul of the Debian Free Software Guidelines and, as well as nano, a clone of the mail client itself lives on to this day as alpine. I use it every so often for a spot of nostalgia.
This is exactly my path to it, and even though I know how to "eat flaming death" vi and used emacs for a while, nano is still my default for "edit that config file quickly".
Same. My first internet access was a BSD shell account back in 1993; I had Pine and Pico on there. Coming from a world of MS-DOS BBSes (I ran one myself!) it wasn't that hard to take QEdit skills and move that over to using Pine and Pico-- it was quite comfortable.
I still tend to muscle memory my way through using Nano when I need to do quick file edits on Linux.
https://iamkate.com/data/12-bit-rainbow/
I had completely overlooked that it was for this power-usage visualization.