Great post. There is definitely something hard to articulate but absolutely life-changing about how my mental state changed after moving from the US to Taiwan. The complete lack of worry about all the things mentioned in this post (physical safety, personal property, transportation, etc.) lifts a burden I never knew was there until it was gone. Here it's completely normal for people to leave their phones or wallets unattended on a cafe table to claim a seat. And I don't worry about whether my front door was locked or about fumbling with cash in public.
For me, I never _consciously_ worried about these things much in the US, but in the back of my mind they were always there, every time I went out. I didn't realize the tension that built up until I left.
The first two responses to you were a solid yes and a solid no. I think that just demonstrates how the concept does not apply to application versioning.
It most certainly applies; the previous poster gave good examples: config file syntax and semantics, DB compatibility, and so on.
On the UI front it really depends; you need to make decisions and let users know, to avoid surprises.
At Sun, the process was that every possible interface that something or someone could depend on was acknowledged to be an interface. Is GUI pixel positioning an interface? Well yes, clearly it is, since someone could write tools that rely on those positions.
But do you, as the author, want to assure users that you won't break this except in major releases? Maybe not. So interfaces were classified along two axes, public/private and stable/unstable (it was a bit more complex, but that's the gist). Most applications classified their GUI elements as an unstable interface. Then you document that, so consumers of the application know they shouldn't rely on GUI elements not moving.
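To make the idea concrete, here's a purely hypothetical sketch (the decorator, names, and enum values are made up for illustration, not Sun's actual mechanism) of how an application might tag its interfaces along those two axes so the classifications can be pulled into the docs:

    # Purely hypothetical sketch: a made-up "classify" decorator to
    # illustrate the public/private x stable/unstable idea described
    # above. This is not Sun's actual tooling or terminology.
    from dataclasses import dataclass
    from enum import Enum

    class Audience(Enum):
        PUBLIC = "public"      # external consumers may depend on it
        PRIVATE = "private"    # internal use only

    class Stability(Enum):
        STABLE = "stable"      # changes only in a major release
        UNSTABLE = "unstable"  # may change in any release

    @dataclass(frozen=True)
    class Classification:
        audience: Audience
        stability: Stability

    def classify(audience, stability):
        """Attach a classification to a function so docs can list it."""
        def wrap(fn):
            fn.classification = Classification(audience, stability)
            return fn
        return wrap

    @classify(Audience.PUBLIC, Stability.STABLE)
    def export_report(path):
        """Scriptable output: safe to depend on between major releases."""

    @classify(Audience.PUBLIC, Stability.UNSTABLE)
    def window_layout():
        """GUI pixel positions: observable, but explicitly not a promise."""

A consumer can then check what a given entry point actually promises before scripting against it.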
I was specifically trying to contrast programming interfaces with the large universe of applications that don't expose any. You bring up an interesting point about external dependencies such as a DB; however, I would say the vast majority of the applications I'm referring to don't rely on users to set those up.
I noticed pre-COVID that businesses tend to make internal decisions, such as spending and hiring, that, even when not directly related to my department, ultimately affect the compensation they're able to offer me. Now, with the pandemic closing a number of businesses, the problem of bad business decisions seems even worse.
It's simply a matter of trust. If an employer is not providing the absolute maximum hypothetical compensation, then we can all agree it is a truly disgusting and untrustworthy employer. I don't have exact numbers for my employer yet.
First of all, I'd like to figure out how I can identify an employer making business decisions I am unaware of. Generally, I wouldn't mind what businesses do internally, but when it affects my salary, it's not acceptable.
Secondly, if one has figured out that a business is taking private actions one doesn't like, how should one handle that? I would need to quit that job the next day, but from a personal-finance perspective I can't easily afford to leave one job and land another from one day to the next.
I'm looking forward to your experiences and ideas.
Everyone derides Apple for not listening to the demands of users and then, when Apple finally does give in they... continue to deride them. I'm just happy with +magsafe -touchbar. Even if the price is Apple saving face and pretending this was their idea the whole time, so be it.
Apple wanted to be "bold" and forced a lot of their users into a pipeline that didn't last. And now they are listening to the demands of their users and backtracking.
Users that stuck with them are now being punished for their brand loyalty.
Apple: "If you want to connect to HDMI, you have to buy a dongle now. Also TOUCHBAR! Woot."
User: "Ugh...fine"
Apple: "So... now that you own that dongle that you didn't need, but we decided to make you need. Well, we decided not to make you need it anymore! It's built in. Also, no more touchbar! See! We listen!"
User: "Ugh......."
I'm not here to harp on Apple, but I understand why some people are a little put off by their decisions.
I like MagSafe, and the card reader could be useful. But I think Apple should have stuck with USB-C. Someone has to push the standard. USB-C can handle anything (mouse, keyboard, printer, screen, camera, network, etc.). There was a time when the printer, mouse, keyboard, and network all had different cables: it was a mess, and none of those cables were interchangeable. USB-C changes that, and it's also smaller and nicer than the USB-A connector.
Just the other day, I had to buy an HDMI cable for my screen. That could have been USB-C. But screen makers and graphics card makers are not going to change overnight, and hardly anyone is strong enough to push the standard, either. If anyone can do it, it's Apple. This is, in my opinion, a step backward.
I agree and would love to have everything USB-C. But I'm skeptical that will happen, for one practical reason: apparently not every USB-C cable is created equal, and not every USB-C port is created equal either. (Remember that Nintendo device with a USB-C port that turned out not to comply with the spec?)
If every cable looks the same, but one carries 240W of juice and no data while another does the opposite, maybe we'd be worse off…
> (Remember that Nintendo device with a USB-C port that turned out not to comply with the spec?)
Someone needs to be the pedant and say this every time it comes up: it wasn't the Nintendo device that didn't respect the spec; it was knock-off docks and chargers that didn't respect the spec and damaged the Nintendo device [0]. I happen to think that Nintendo's engineers should have designed against some shenanigans by third parties (I do with my USB-C devices), but they weren't strictly wrong.
I spent more time than I'd like to admit trying to find the perfect off-brand USB-C dongle that would work with my Switch (so I can plug it into TVs when I travel, without needing to bring the whole dock). I am about 80% confident that the dongle I eventually got won't brick my Switch, as long as I remember to plug in the power first and then the Switch (or was it the other way around?).
Oh yeah, and also, unrelated to the above, I now inspect the tiny grey-on-black voltage information whenever I'm about to plug in a USB-C cable to provide power.
Oh yeah, and also, each USB-C cable has its own special purpose, since it turns out they aren't actually interchangeable.
USB-C somehow managed to hit the worst intersection: universally pluggable without being a universal standard.
I really don't understand the anger over USB-C, even now. I got a single-cable dock that handles all my peripherals. It's not like they made that useless; it would still work, and it's still nice to have a single cable that drives everything. Even if I got a new MBP I'd continue to use USB-C, because it's a better UX.
Hell, you can now buy monitors that work only over USB-C. It's clearly the way everything is going; Apple just moved a bit too fast.
That dongle is $69, but you can try buying the off-brand one, where you're not sure whether it works, not sure it doesn't upload your data to China, and not sure it doesn't have malware. If you're hit, good luck explaining your choice of brand to your insurance. So basically, it's $69.
So? Next to a few thousand dollars of laptop, monitor, and peripherals what's another $70? Especially since that hub will be compatible with every laptop I will own for the next few decades, regardless of the brand.
> I got a single cable dock that handles all my peripherals.
That you have to drag with you if you want those ports away from your desk. Plug into a TV or projector and you need a dongle. No thanks. Ports are fine by me and I don't care if that adds 0.5mm or some stupid aesthetics nonsense.
I don't understand this dongle stuff. You just buy a USB-C to HDMI cable and plug that in. I don't see how it's any different than buying an HDMI cable.
I'm a photographer, so the SD card slot is nice for me personally, but it really doesn't make sense to build it into the laptop. It must be used by a tiny percentage of people.
Most new TVs/displays can do Thunderbolt, so there's your USB-C connector replacing HDMI. HDMI can be useful for longer runs because of signal integrity, and there you'd want an HDMI-to-USB-C cable/adapter that's an active part.
As for SD cards, the majority of people don't use them (even customers for the pro line). It's a concession to an important niche, but having that niche use a USB-C adapter built into the SD card (maybe not possible?), a separate SD card reader dongle, or the camera itself as the SD card reader all seem like better choices than forcing all laptops to have an SD reader.
What's the compelling story for why an SD card slot or HDMI port strictly needs to be built in for all customers? The story for HDMI is the stronger of the two, because HDMI is much more common day-to-day and USB-C as a display technology hasn't yet permeated standalone TVs and projectors.
Generally I agree, but projectors specifically have almost always required dongles with Macs, and the solution was usually to keep one attached to the projector cable or tethered nearby. Mini-DVI, Mini-DisplayPort, and even HDMI weren't standard on many office projectors. It's a hassle, but not one specific to USB-C or a recent phenomenon.
My LG 5K monitor acts as my hub and power source, so I'm pretty close to "plug in 1 thing". And from there I have a wired Apple keyboard that connects to the monitor (yes, with a dongle). But the keyboard has 2 old USB-A ports, which double as thumb drive connectors.
As it stands, I still have an analog audio out because I've procrastinated on getting a USB digital in/out, and I also have an Ethernet <-> Thunderbolt adapter because I only got about 400 Mb/s going via the monitor versus 950+ going direct.
3 small connectors all on 1 side isn't that bad.
That being said, I welcome the HDMI addition mostly for older conference rooms which aren't equipped with Apple TV / Webex Proximity / etc.
Yeah, having "one cable to rule them all" is great. I can see why having an HDMI port would be nice when you're traveling or using shared projectors, but at this point, if you don't have a USB-C to HDMI adapter...
100% this. I think it's a good lesson that sometimes there are multiple decisions/outcomes which are good enough, and there is no one perfect choice. Some people will be happy if you go one way and others prefer the other way. But waffling back and forth between them instead of sticking to one coherent strategy can be worse than committing to either of the options.
My problem with USB-C is it's too dainty. Since it became the de facto charging interface for laptops, I can't tell you how many chargers and laptops we've had to either replace or get repaired under warranty because the physical connector isn't up to the task.
USB-C seems to be fine plugged in perfectly straight with no strain at all, but use it in anything less than ideal conditions and it wears out surprisingly fast. The connectors tend to be longer than they are wide, which is always bad for robustness, and they have zero built-in strain relief.
I think it performs well and they got the reversible design right, but it's just a little too wimpy for the way people really use their devices.
I'm not saying we need to go back to screws. But compare USB-C to the previous standard power connector Lenovo used, which was shaped like a USB Type-A port but a bit chunkier: we never had issues with those ports wearing out or getting damaged. They lasted the life of the device just fine. But I have T14s with USB-C charging that need new chargers or charging ports after a few months.
Obviously YMMV but when you manage a device fleet you start to see more clearly how "average users" treat things and how designs hold up.
I hadn't felt this way until recently, when using a USB-C port with the cable plugged in while in bed. It felt like I had to be careful moving around or I'd hit the cable and damage my port.
Nah, this absolutely looks like the result of Jony Ive leaving. The head of design quits and a long-term trend under his leadership suddenly reverses? That makes more sense than any sinister explanation.
The two aren't mutually exclusive. His leaving may be why they changed course. However, that is not why they are touting these reverts as new features/innovation.
Marketing releases talking about "innovation" are equivalent to protestations of innocence before a criminal court; you should ignore them, because everyone is motivated to claim the same thing regardless of whether it's true. That doesn't mean it's always true or always false; sometimes real innovation occurs, but taken alone the claim provides no useful information.
Yeah, I think Jony Ive jumped the shark with the touchbar and eliminating ports. I dunno why they ditched MagSafe. The Worse parody of latter day Ive [1] was pretty spot on.
I really like this MacBook Pro but at $2000 I may wait for the next Air. I really don't need the Pro features, HDMI, SD card, that many GPUs, ..., and my 2017 Air is just fine for now.
> Evans Hankey, vice president of Industrial Design, and Alan Dye, vice president of Human Interface Design, will report to Jeff Williams, Apple’s chief operating officer
When is the last time you saw Apple make everyone happy?
If you think this is bad, you should have been around in the 90s, after Jobs had left, for the icky product line that followed the Mac IIfx. No one liked the Power Macs, and PCs were so incredibly cheap that only art departments had Macs.
But even after Jobs returned, remember how everyone lost their shit when he removed the floppy drive from the iMac?
Maybe the iPhone 3 was the last Apple product everyone universally celebrated.
(But that notch tho. It really tweaks my obsessive nature. I'm still on an iPhone 6 so I haven't encountered the notch anywhere yet.)
I think it's good for people to not think in such absolute terms. People are happy this product exists, but it's also annoying that it took so long. I'd rather have that type of response than the standard "fanboys" vs "haters"
Who is this "everyone"? Taking a look at the announcement thread here on HN, the response to these machines has been extremely positive.
I can't really find any criticism of the machines themselves. Negative comments are directed at Apple for pre-existing customer-hostile behavior like their repair policies.
Well, they never gave us our money back for the keyboards whose keys fell off, or for the dGPU that overheats and throttles the CPU even when the CPU isn't doing any useful work.
> Everyone derides Apple for not listening to the demands of users and then, when Apple finally does give in they... continue to deride them.
I explain such behavior to myself by keeping in mind that the internet is many people with many opinions. If one sees an inconsistent response to a topic, one is likely seeing disjoint groups of people that separately have consistent opinions, rather than a cohesive group that can't make up its mind. I find it easy to miss that when I'm not paying attention to who said what.
They switched whole-hog to USB-C on their computers to push industry adoption and were derided for it, with people bitching and moaning about how they needed dongles for "everything." What was left out of the conversation: they dropped the proprietary magsafe connector, opened the door to charging their laptops off third party chargers, 12v/airline adapters, etc. They also opened the door to having one cable connecting your laptop to power, display, input devices, storage, and networking by means of a dock. But the teenage masses on reddit screamed "DONGLEEEEESSSS" and spewed shitty memes. People who evoke Nazi language to describe their smug choice in computer hardware, that's a crowd we should listen to...
Apple stopped including USB power bricks because everyone on the planet is drowning in them by now, and there are many choices, like multi-port AC adapters and ones with different combinations of USB-A and USB-C, meaning you probably want to pick your own adapter anyway. But people screamed blue bloody murder. And then, not long after Apple did it, Samsung quietly followed suit and nobody said a fucking word.
Apple dropped floppy drives and the world screamed about how stupid they were. At the time software had long since stopped coming on floppy, USB thumb drives were out, ethernet was everywhere and wifi was going mainstream. Apple helped put the nail in the coffin, pushing many people to USB thumb drives, network file shares, wifi, etc.
Apple dropped optical drives, same thing. Software was coming digitally; I worked for a company that offered digital downloads in 1999, and we were a dinky little niche company. By the time Apple stopped including optical drives on their laptops, CDs were largely dead for software, everyone was using MP3s, etc. Dropping the drive freed up room for things like bigger batteries.
Apple pushed hard on digital display interconnects when much of the PC world was still chugging along on VGA, which worked poorly with the fixed-pixel flat panels that were rapidly becoming the norm, and PC hardware companies just couldn't seem to figure out how the fuck to communicate "here are the resolutions and refresh rates I can do," something Macs had been doing for a decade. Kids these days don't remember the pains of having to do geometric adjustments on a digital display for a supposedly digital signal.
When they dropped the headphone jack from their phones, people screamed blue bloody murder. A pair of pretty decent wireless headphones with 8 hours of battery life was not hard to find for $30ish; now they're commonplace for cheaper, and damn near every car comes with Bluetooth audio in its stereo, even work vans. And if you want hardwired headphones, a dongle is a $5-20 affair, with Apple's audio adapter scoring very good marks in audio DAC tests...
This hardly makes a case at all. Only one section in the middle is actually dedicated to why syntax highlighting is bad, and it boils down to a baseless assertion about "biasing" the developer's mind toward syntax. I would counter that helping developers think about how code will be interpreted or compiled, rather than as bytes of text, brings them closer to the semantic meaning, not further from it.
There's also a comparison to syntax highlighting in fiction, which is absurd for many reasons, a simple one being that the _goal_ of fiction is often to obscure meaning for the sake of ambiguity. I'm offended not by the thesis but by the presumptuous nature of the argument.
That is patently false. You can divide Americans into any number of segments they are born into and reliably predict that segment's success. For example, white people earn more than black people; children of rich parents do better in school than children of poor parents; lawyers who graduated in 2000 are more successful than ones who graduated in 2008. Forces outside their control determined this, and their "conditions" are functions of it. Your statement would imply that the people in these segments just happened to make worse choices, unrelated to the circumstances of their segments. That would be quite the coincidence.
Yes, as with your lawyer example, there's plenty of evidence that outcomes for people doing basically the same work vary a lot across times and places.
Maximally-small-government / libertarian people would like to pretend that everything is about individual character, because that would justify their preferred public policies, but if you think about it even a little it should be obvious that internal and external factors both matter.
I remember in a previous discussion here Walter was objecting even to the use of standard economic terminology around things like what wealth-creation means (dating back to Adam Smith's The Wealth of Nations in 1776), so I'd take his views here with a grain of salt.