Hacker News | mschaef's comments

> I'm curious what the CX-83D87 and Weiteks look like.

The Weiteks were memory-mapped. (At least those built for x86 machines.)

This essentially increased bandwidth by using the address bus as a source of floating-point instructions. It was really a very cool idea, although I don't know what the performance realities were when using one.

http://www.bitsavers.org/components/weitek/dataSheets/WTL-31...


This is nuts, in the best way.

> The operand fields of a WTL 3167 address have been specifically designed so that a WTL 3167 address can be given as either the source or the destination to a REP MOVSD instruction.

> Single-precision vector arithmetic is accomplished by applying the 80386 block move instruction REP MOVSD to a WTL 3167 address involving arithmetic instead of loading or storing.


haha - took me a while to figure out that's Mauro Bonomi's signature

iirc the 3167 was a single-clocked, full barrel-shift MAC (multiply-accumulate) pipeline with a bunch (64?) of registers, so the FPU could be driven with a RISC-style opcode on every address-bus clock (given the right driver on the CPU) ... the core registers were enough to run inner loops (think LINPACK) very fast, with some housekeeping on context switch of course

this window sat between full PCB minicomputer FPUs made from TTL and the decoupling of microcomputer internal clocks & cache from address bus rates ...

Weitek tried to convert their FPU base into an integrated FPU/CPU play during the RISC wars, but lost


> Then for many years it was standard for software to have help files, and it seemed anachronistic for Emacs to loudly proclaim it is self-documenting.

Emacs' notion of self-documentation refers to something slightly different from the fact that it has online help files. The help facilities can query the Lisp runtime for things like functions and keybindings. These update dynamically as the system is reconfigured. The result isn't quite as cleanly presented as an online help document, but it has the benefit of being deeply integrated with how the system is actually configured to behave at the moment. Very cool, and very much dependent on the open-source nature of Emacs.


> My first idea was to get the source cleaned up a bit and compile it again to 32 bit with Visual Basic 4, but I couldn't figure it out, it required some 3rd party libraries that I just couldn't get a hold of.

This was super common for VB apps. The original architecture of VB was, loosely speaking, a GUI container for various pluggable controls connected to a BASIC runtime. The number of controls that came in the box varied depending on how fancy a version of VB you bought, and you could plug in additional third-party controls in the form of "VBXs" - Visual Basic eXtensions. Even though VBXs were designed mainly for GUI controls, they were the easiest extension point for VB and got used extensively for all sorts of integrations until OLE Automation became more prevalent. (Roughly contemporaneous with the switch to 32-bit.)


He's been blogging continuously for close to twenty years - he was one of the original wave of Microsoft bloggers (along with Larry Osterman, Michael Kaplan, and several others I can't remember).

It is very much an engineer's engineering blog, and written by someone deeply in the trenches.


That derives logically from the way Commodore implemented disks. If you bought a 1540 or 1541 (or any other Commodore drive) for a C-64 or VIC-20, it had an onboard 6502 to run the disk drive. The interaction between the computer and the disk drive was somewhat similar in concept to fetching a file from a network server.

This could be useful to save on costs in computer labs... my grade school used multiplexer boxes to share a single 1541 across four C-64s.


It was always awkward to do low-level disk stuff by basically "remoting" into the drive to execute code.

  OPEN 1,8,15,"N:NEWDISK,01":CLOSE 1
was always a weird way to format a floppy...


Knowing what I know now, I'd have appreciated it much more than I did at the time. (Also, fixing the link rate on the C64 would've been nice too.)


I had it done about eight years ago... there's nothing quite like waking up after a fifteen-minute procedure and seeing better than you have without correction since you were five.

> It's nothing short of a man-made miracle but I have to say it's also very uncomfortable and stressful for the patient.

I think I must be strange, either in my reaction to the stress or the way I chose to manage it during my surgery.

I explicitly wanted to be somewhat awake during the procedure to see what was going on (how many chances do you get to see your lens emulsified from the inside?)... and I remember having short conversations with the surgeon during the procedure. (She'd been concerned about zonular laxity, and we discussed during the procedure that she didn't see evidence of it.)

This is not something I'd want to reproduce if I didn't have to, nor would I suggest it as a general approach, but given that it was necessary for me, it was amazing to see it first hand.

My second procedure (second eye) was a little more stressful than the first, but for me all the material stress (which was significant) was in the run up and anticipation.


> The Mac Desktop is vastly inferior to the Linux world

Asking out of curiosity, why is this? What's the functionality you miss on Mac?


Most of it is there, but you need a crap-load of third-party extensions, and some even cost money.

Like proper alt-tab, better keyboard configuration, a classic taskbar, and so on. Finder is the worst file manager I have ever used.

You can manage but the defaults are really bad for power users.

Honestly Apple just needs to let me install a proper Desktop Environment like KDE on it. The unix base is decent, just give me more freedom.


To be fair, KDE is also pretty wonky out of the box (basic stuff like turning on numlock at boot is unnecessarily buggy or confusing).

You usually also need a bunch of extensions. And 50% of them are broken for various reasons if you try to use KDE's built-in extension system.


The one I have always missed is proper focus-follows-mouse support. The mac desktop always feels really clunky without that when working with multiple windows.


FWIW, this is now possible albeit with a third party app: https://github.com/sbmpost/AutoRaise


Proper focus-follows-mouse does not auto-raise the window when it gives it focus. I see that that app does offer "don't raise the window" (which I think is an improvement over the last time I researched this some years back), but only under an "experimental feature" flag that relies on undocumented private macOS APIs that might go away in any future macOS version...


Personally, most of my problems with macOS (and Apple's operating systems in general) would be fixed if it were faster. The OS is full of lengthy animations that aren't necessary, such as when switching between desktops.


Looks like on Windows it's possible to disable all animations, including the desktop-switch animation, but you have to press two buttons to switch.


Some of it is size, some of it is the fact that the camera is a second device, and some of it is workflow.

I tried a Sony RX100 (1" sensor) when they first came out, optimistic about the possibility of using it for 'general purpose' photography. After all, it's small enough.

The problem was, it's a second device to carry around and keep charged. Then once you capture the image, it's largely stuck on the device until you find a way to offload your images. I briefly experimented with cables that would let me do things like transfer images from the RX100 to my (Android at the time) mobile phone, for archiving and sending to family and friends. That turned the whole thing into the sort of science fair project that I didn't have time for as the parent of a very young child. (Although in fairness, I can't think of a single time in my life when I'd have had the patience, kids or not.)

This is why, for all the arguments you can make against them as cameras, I've come to be very thankful for the amount of effort that Apple and others have put into getting appealing images out of devices I always carry around anyway. I can take a set of pictures, edit them, have them automatically archived to cloud storage, and send them to whoever I want... all with a single device.

This still leaves the "real" camera workflow as an option when there's the need for higher image quality and the time (or money to hire a photographer) to take advantage of what a DSLR or the like can do.

(When I compare what I can do with my iPhone to what my parents had available to them (a 110-format camera and 35mm Nikons), I like the tradeoffs a lot better. The image quality available now is definitely better than the 110. Some of those 35mm exposures are probably better quality than what I can get out of an iPhone, but they're all stuck in albums and slides, and nobody ever looks at them.)


> Then once you capture the image, it's largely stuck on the device until you find a way to offload your images. I briefly experimented with cables that would let me do things like transfer images from the RX100 to my (Android at the time) mobile phone, for archiving and sending to family and friends. That turned the whole thing into the sort of science fair project that I didn't have time for as the parent of a very young child. (Although in fairness, I can't think of a single time in my life when I'd have had the patience, kids or not.)

Most modern cameras now have a WiFi-based photo transfer system that works pretty well. It's not instantaneous, but it is quick enough to copy the photo you want to share with a friend or partner while you finish a meal or drink your coffee.


This is true, but switching to that mode is frustrating, and you often have to use AWFUL mobile-OS software to get the images. And my DSLR shoots at something like 25 fps, and each raw file is 80 MB. That is NOT fast to send over wifi.

Waiting until I can plug the 2TB memory card into my Mac and use a huge screen to review all the photos is far more efficient, even if it has much higher startup latency.

Honestly, this is a good reason to choose the iPhone Pro over the Air or the standard model: the 10 Gbps USB port. Plug the Nikon into the phone for cloud upload. This would be the fastest path of all. Most people focus only on the iPhone's USB bandwidth for downloading from the phone.


I haven't had too much trouble with the Canon App, but YMMV.


The RX100 has had wifi transfer since the 3rd gen.

I understand the "second device to carry around" point, but it isn't a real issue for baby pics you might take at home. A ridiculous number of times I have no idea where I last put my phone anyway, and sometimes I have to make it ring from KDE Connect on my laptop, so it's not like a smartphone is necessarily readily available at all times either.

I also know a number of people who don't leave home with their smartphone anyway for short errands, since they have an Apple Watch. That leaves one pocket available for those who would prefer to carry a camera.


> The RX100 has had wifi transfer since the 3rd gen.

On an iPhone, I can take the picture and I'm immediately a button press away from a photo editor and then whoever I want to send it to.

(A camera that automatically tethered to a phone and dumped pictures into the phone's camera roll would mostly solve the workflow issues I'm mentioning here. Would not surprise me if this already exists.)

> I understand the "second device to carry around" but it isn't a real point for baby pics you might take at home.

Maybe. The camera still has to be charged, in mind, and at hand. (Then as soon as the kids leave the house, you're back to where you were, having to carry something around that you might not otherwise.)

> I also know a number of people who don't leave home with their smartphone anyway

I see that... different people have different sorts of relationships with personal electronics. For me, it wound up being that I'd carry a cell phone and that was about it. Even in the pre-smartphone days, when I might have carried a PDA, I either wouldn't or couldn't.


I have a couple of DSLRs and a large-sensor compact, and I wholly get your point. The image quality on even an older DSLR is better, mainly due to the physics of the optics - there's nothing like a high-quality lens dumping a bunch of light on a large sensor.

However... it's really hard to overstate the workflow and convenience advantages of shooting with a phone. (Particularly as a parent, and even more so when I was a new parent of a small child.) The phone has the twin benefits of 1) being present almost always and 2) being immediately able to process and transmit an image to the people you might want to see it. For the 99% case, that's far more useful than even a very significant improvement in image quality. For the 1% where it matters, I can and do either hire a professional (with better equipment than my own) or make a production of dragging out my DSLR and all that it entails. This is like so many other cases where inarguable technical excellence of a sort gives way to convenience and cost. IOW, "better" is not just about image quality.


> It feels like the industry quickly moved beyond the reach of the "hobbyist". There were no more "clever tricks" to be employed

It happened in a matter of a few years. The Apple II was built as a machine capable of running Breakout in software. Woz picked the 6502 (originally for the Apple I) because he could afford it.

It wasn't that long after that Commodore released the C64. They chose the 6502 because they'd bought its fab (MOS Technology) to protect their calculator business (and then used it to build custom video and audio chips). From there, we were off to the races with respect to larger and larger engineering requirements.

Oddly, I wrote a bit about it a few days ago (in the context of John Gruber's recent discussion on the Apple and Commodore microcomputers): https://mschaef.com/c64


The Commodore machine contemporaneous with the Apple II was the PET.

    Apple I - July 1976
    Commodore PET - January 1977
    Apple II - June 1977
    C64 - January 1982
(Dates from Wikipedia)

All four used the 6502.


Apple made the II series for a long time. It was contemporaneous with the PET, but stuck around long enough to be relevant through the C64 and 128 (to the extent the 128 was relevant at all.)

