
Ideology has trumped engineering, and as hackers, you shouldn't tolerate it.

Frankly, all of this has been obvious all along to any competent engineer, since the moment Apple introduced Lightning. They described it as a serial bus and talked about how it gave more flexibility. If you think about it for two seconds, it's obviously better to run protocols over a serial bus than to run 30 individual signal lines, with dedicated lines for analog and dedicated lines for digital, in a world where people want HDMI adapters for a connector that originally carried FireWire signals, from a time before HDMI was even common.
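To make the point concrete, here's a minimal sketch (Python, with entirely made-up channel names and framing; this is not the actual Lightning protocol) of why a software-framed serial link is more flexible than dedicating a pin to each signal:

    # Illustration only: one serial link, many logical channels framed in
    # software, instead of one physical pin per signal. Channel IDs are invented.
    import struct

    CHANNELS = {"control": 1, "audio": 2, "video": 3}

    def frame(channel: str, payload: bytes) -> bytes:
        # [channel id: 1 byte][payload length: 2 bytes][payload]
        return struct.pack(">BH", CHANNELS[channel], len(payload)) + payload

    # The same physical pins can carry whatever the two endpoints agree on:
    wire_bytes = frame("control", b"negotiate-av-out") + frame("video", b"\x00" * 64)

Adding a new signal type becomes a software change, not a new pin on the connector.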

But this is Apple, so the REAL reality distortion field kicked in-- a tsunami of press acting as if Apple was ripping people off with a $30 adapter, hundreds of mindless conspiracy theories from Apple bashers on Hacker News about how this is to have more control over people and how this once again proves that "open" (defined as google, not actually anything to do with openness) is better than "closed" (defined as Apple, you know the company with the most popular open source operating system in the world?).

It's one thing to not know enough engineering for this to have been obvious to you, it's quite another to spread lies and engineering ignorance as a result of your ideological hatred of Apple. And the latter is basically all I saw on HN about this format. (Which is one of the reasons I write HN off as worthless for months at a time.)




What you're saying is true from an engineering standpoint (serial vs parallel), but has to be placed in the customer's context.

In this specific case the quality is bad, operation is unreliable, and the price is high. Consumer devices accept HDMI as input. Serial to parallel video (Lightning to HDMI) is tough without some heavy-duty hardware -- hence the exorbitant cost of these adapters.

The SoC design introduces a massive amount of complexity. This has yielded unreliable operation. And it introduces that complexity at a point of physical vulnerability -- people don't treat adapters like tiny fragile computers. They treat them like, well, adapters.

End-to-end serial communications would be nice, but that's not the world we live in.

Lightning isn't that much smaller than HDMI or Micro-HDMI. Reversibility is a very minor feature, and not worth the price being paid.

And that's not a $30 adapter. It's a $50 adapter. Did you think it was $30? That was the old one -- parallel to parallel.


After thinking a little bit about it, I think this approach does make some sense and allows for more flexibility in the future. Keep in mind Lightning was likely designed to last well over a decade and will be used in many different devices.

Now since the adapter is an SoC and its OS is booted from the device, every device essentially has full control over how it wants to output HDMI, without having to change the adapter or the port. Right now this is accomplished using this quirky H.264 encode/decode workaround, but this is first-gen, and it doesn't have to stay that way. Future iDevices might load a different OS onto the SoC and output lossless 1080p using the exact same adapter! And without breaking older devices.
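Here's a rough sketch of that idea (Python; every name below is hypothetical, and nothing here describes Apple's actual implementation), just to show the shape of "the host boots the adapter, then streams whatever that firmware understands":

    # Hypothetical sketch only; class and function names are invented.
    class AVAdapterSoC:
        """Stands in for the small computer inside the adapter."""
        def __init__(self):
            self.firmware = None

        def boot(self, firmware_blob: bytes):
            # The host pushes firmware on every plug-in, so behaviour can
            # change with a software update instead of a new connector.
            self.firmware = firmware_blob

        def handle(self, packet: bytes):
            # Today's firmware: decode an H.264 stream and drive HDMI out.
            # A future firmware could implement a different scheme entirely.
            pass

    def on_adapter_attached(adapter: AVAdapterSoC, firmware_blob: bytes, encoded_frames):
        adapter.boot(firmware_blob)
        for frame in encoded_frames:   # frames already encoded on the host
            adapter.handle(frame)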

It frees Apple from having to define a fixed method of transmitting HDMI over Lightning now, that is then set in stone for the next 10 years, and has to be supported by every future device.

It also frees them from having unnecessary pins, which might become useless in the future but have to be carried over to every new device (a.k.a. the 30-pin connector). And knowing Apple, probably THE top priority of Lightning was to have a slick, easy-to-plug-in-and-out, user-friendly connector, which Lightning admittedly does way better than any MicroUSB standard.

Because in essence, the only thing that is fixed about Lightning is the physical shape and pins, so they focused on getting that aspect right and future-proof. How the data is transmitted can be changed on a per-device basis.


No amount of software can increase the bandwidth of a fixed-rate serial transceiver.


The problem isn't even serial-to-parallel - HDMI is serial based - the problem is that Apple apparently designed Lightning with insufficient bandwidth for uncompressed video, then kludged around it. Then Apple fanboys went on and on about how much more elegant it is than MHL, which has much cheaper HDMI adapters and better video quality because all the MHL adapters have to do is convert one uncompressed serial video data format to another.
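For a sense of scale, here's the back-of-the-envelope arithmetic (assumed figures: 24-bit RGB, active pixels only, and USB 2.0's 480 Mbit/s signaling rate as the ceiling, since that's roughly the speed the link is reported to run at):

    # Rough bandwidth check -- assumed figures, not official specs.
    width, height, bpp, fps = 1920, 1080, 24, 60
    raw_bps = width * height * bpp * fps   # ~2.99 Gbit/s, ignoring blanking intervals
    usb2_bps = 480e6                       # USB 2.0 signaling rate; real payload is lower
    print(raw_bps / 1e9, usb2_bps / 1e9)   # 2.99 vs 0.48 Gbit/s
    print(raw_bps / usb2_bps)              # ~6x short -- hence the compression kludge

An uncompressed 1080p60 stream simply doesn't fit, no matter what the adapter's software does.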

I mean, technically speaking Samsung or any of the other manufacturers could've done the same trick as Apple using plain old micro-USB OTG 2.0 with no special hardware support in their phones, no special connectors... but the reviewer community would call them out on it because it's ugly and user hostile, if their engineers even let them get that far.


Thanks for the correction -- it is a bandwidth issue. Lightning has less capability than what it replaces. But hey -- it's reversible!

Good trade? No.


I strongly disagree that reversibility is a small feature. Plugging a charger into the new iOS devices is as effortless as plugging in a headphone jack.

Non-symmetrical connectors are an affront to usability.


I appreciate reversibility once per day.

In 2074 days of owning an iPhone and 1065 days of owning an iPad I have never used or wanted an HDMI output.

I'd say they made the right tradeoff.


I agree. I don't even have a device with the connector (yet?), but it seems like a major advantage.

Who are all these people popping out of the woodwork wanting a wired connection from their phone to their TV? I'm sure some people do this sometimes, but so many? Why would you even do that? Perhaps this is uncharitable, but it makes me think that most of the people complaining here have never done it, never will, probably never even thought about doing it before, but are now outraged at the thought that the connector is not 100% perfect for this one uncommon use-case.


The only two use cases I can think of are playing a video you recorded on your phone at a family gathering, or playing Netflix on your TV without needing to hook up a Roku or similar.

Edit: third use case, hooking this up to a monitor to turn your smartphone into a desktop computer.


I have a dock-to-HDMI adapter. I use it in hotels to watch movies on the big-screen TV. Beats the crap on cable TV every time.


Thanks, that's the first use case that I could actually see using myself. Don't think it's quite enough to get me to go buy a cable, but I can see it being handy for that.


Probably the same people who want VGA, since that's what the corporate world uses.


It's especially strange because if you want to do that, you can AirPlay the video to the TV and not have to deal with a cable from your phone to your TV.


Asymmetric connectors aren't bad in themselves; it's when you have asymmetric connectors in rectangular plugs that it becomes a problem. I never tried to put FireWire 400 in backward, but USB is awful.


"And it introduces that complexity at a point of physical vulnerability -- people don't treat adapter like tiny fragile computers. They treat them like, well, adapters."

I don't understand this. What makes these things any more fragile than a regular adapter? They are, as far as I understand it, compact, fully solid-state, and about as strong as any consumer electronics of that size would be.


> defined as Apple, you know the company with the most popular open source operating system in the world?

I'm not sure which OS you're talking about...perhaps you could point me to the source code of either OS X or iOS? Certain core components of OS X are open source, but Darwin isn't OS X.

As someone who makes his living from writing Objective-C code, I don't have any ideological objection to Apple. But I think you shouldn't accuse people of spreading "lies and engineering ignorance" when you seem to be claiming something that's patently untrue.


Perhaps you didn't experience the PR storm around the time OS X was initially released: it was heavily geared towards nerds and the literal phrase "open source" was extensively employed in their marketing material.

This aside, you're basically trying to write off nirvana's (IMHO excellent) rant using a minor technicality, one of the common features of the discussions here that tends to make my skin crawl.


you're basically trying to write off nirvana's (IMHO excellent) rant using a minor technicality

He's not trying to write anything off and it is a pretty big technicality. nirvana should have omitted that ideological jab to begin with as it was unnecessary and shows his bias for Apple/against Google. Google is more "open source" than Apple in any way that matters, considering Android is currently on devices their competitors (Amazon) are selling. Apple is mostly responsible for WebKit which is commendable and useful but as far as practical considerations go, nobody gives two shits about Darwin.

But yes, engineering should be the focus and people assume the worst with Apple.


>But yes, engineering should be the focus and people assume the worst with Apple.

Some people sometimes assume the worst of Apple, or Microsoft, or Google, or [insert name of company here]. One of the things that causes strong anti-Apple sentiment is the rabid fanboys (i.e. postings like nirvana's). They paint Apple as patron saints -- and when reality hits (like it did for me with antenna-gate), users are annoyed because of the unrealistic expectations, but also because of the RDF created by fanboys. For the record, I think Apple has shown the phone industry a thing or two about engineering excellent products while maintaining a strong focus and making excellent compromises. I just wish the rabid fanboys would shut up, or present a balanced view... it would make Apple a lot easier to respect.


I would tend to agree wrt HN nitpicking except in this case nirvana is presenting himself as the paragon of objectivity setting the story straight against the unwashed rabble of knee-jerk Apple haters, when in fact he is nowhere near objective when it comes to Apple, and shooting ignorant fish in a barrel is not good enough to validate his points. He needs to be held to a higher standard.


The technicality being that Darwin seizes to be an open source operating system once it is shipped with closed source components. Why would this be true?


OK here we go, and why not, after all it's Sunday and I've got nothing better to do.. right?

The central tenet was that discourse here is regularly devoid of sound engineering because it tends to be blinded by mindless cultural perceptions of the companies involved in whatever happens to be under discussion. In the case of Apple, the expectation is that their products are flawless, and if not, there will be hell to pay on blogs and in comment sections everywhere.

Whether Darwin is or isn't open source doesn't freaking matter; it was heavily marketed as such back in the sands of time, and even if it weren't, that doesn't invalidate the central point of the rant: that just because this device has an Apple logo, every popular discussion surrounding it turns into mindless diatribe as a result of the non-engineering-centric expectations people place on Apple's products, and every engineering-centric party (i.e. hackers) must deal with the whining polluting engineering-centric forums for days every time it happens.

In effect, the complaint is that commenting resembles the squabble of a throng of uninformed consumers rather than the judicious discourse of a forum of engineers.


While I agree that there is precious little interesting technical discussion on HN, engineering is an ideological discipline, just like anything else.


I don't agree that engineering is necessarily and solely a cargo cult discipline. We have math, we have observable and repeatable experiments.


The technicality being that Darwin seizes [sic] to be an open source operating system once it is shipped with closed source components.

GNU/Linux is therefore cough nVidia cough completely, 100% open. Thanks for clearing that up. Darwin sux!1!

/sarc


"...Darwin isn't OS X".

Ok, but Darwin is the operating system.

I remember back in the early days – from Mach on black hardware through Openstep on 4 different architectures – the folks from NeXT were always very careful to use the phrase "system software" when referring to the whole thing and only using the phrase "operating system" when referring to the layer that supports basic functions like scheduling tasks, invoking user-space code, and controlling peripherals.

This is one of the things I appreciated about them back then: they were respectful of the nomenclature actually used in computer science.

Now I realize that the phrase "operating system" commonly receives slight colloquial abuse to refer to everything inside the shrinkwrap, but I think the formal meaning hasn't completely died yet, so nirvana should be allowed to use it properly if he so desires.


Darwin is the operating system. Trying to point out that the whole system stack is considered not open source because the windowing system isn't open has nothing to do with it. Does my Ubuntu system become not open source when I use the binary nvidia drivers?


So what does the Lightning Digital AV Adapter do today that could not have been done using Micro USB->MHL? Looking at the reviews, the Apple solution is a) pricey, b) over-engineered and c) performs worse than MHL. And you are talking about Apple haters creating a reality distortion field?

One way you could defend it is to promise features that can be programmed into the adapter firmware, but if today it does 720p poorly, I see no reason to believe something much more useful/better will come later. I am paying the $50 today, not in the future.


>Ideology has trumped engineering, and as hackers, you shouldn't tolerate it.

Exactly. It turned out Apple's Lightning is actually inferior to a standard MicroUSB + MHL.


> The Samsung Galaxy S III and Galaxy Note II use a connector that is similar to the original 5-pin MHL-HDMI adapter/dongle, but it uses 11-pins in order to achieve a few functional improvements over the 5-pin design

standard, you say?


In what way do you believe (Lightning) is inferior?

and in what way do you believe (MicroUSB + MHL) is 'standard'?

Serious questions.


Both Micro USB and MHL seem to be industry standards (ref Wikipedia). A lot of vendors seem to be putting out cheaper Micro USB to MHL adapters -- so even if (Micro USB->MHL) isn't itself a standard, it is at least built on two standards, and proprietary licensing/approval seems to be unnecessary for vendors.

Also, most MHL adapters seem to do 1080p -- the Lightning one seems to do 720p badly, if we are to believe Apple Store reviewers.


The sad thing is that Lightning is even inferior to the Apple 30-pin connector for this use. You get better video out on an iPhone 4S than on a brand-new iPhone 5, and the adapter is $10 less and has a pass-through so you can charge at the same time.


This seems like the kind of solution you would bodge together if someone gave you the already designed Lightning connector and said "now make it support VGA and HDMI". A completely crazy hack. Bashing Apple over it seems completely reasonable.


It's a result worthy of pointy haired bosses.


I have always felt this way, I just don't feel a need to make a noise about it.

Be wary of confirmation bias and sample set bias (you only hear the worthless noise from those who are speaking it) when reading sites like HN/Reddit/etc.

It's a lot easier to hit the upvote button than it is to type a comment. Not all of HN's constituents are whiny blowhards.


I agree 100%. Software is eating the world, so I'm not sure why everyone is so against this.

Maybe it's a case of I didn't complain when they came for my DEC Alpha server with green screen, nor when they took away my Token Ring network, but I will not stand for only using one flimsy cable for all my devices. Come on, this is the tech industry, what did you think was going to happen?

What I see Apple's done here is future-proofed the connector. Ok, so it doesn't output 1080p today, but I see no reason why it couldn't tomorrow. Devise a new protocol, download an update to all the iDevices, which in turn upgrades all the adapters out there, and everything's golden. Once this (admittedly painful) transition is complete, I see no reason for Apple to have to endure another one. By the time it's outdated, I'm sure everything will be wireless.
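Purely as an illustration of that upgrade path (nothing below is Apple's actual mechanism; the firmware versions and capabilities are invented), the negotiation could be as simple as:

    # Hypothetical capability negotiation between device and adapter.
    FIRMWARE_IMAGES = {
        # version -> what that adapter firmware can do (invented examples)
        1: {"max_output": "720p",  "transport": "h264"},
        2: {"max_output": "1080p", "transport": "improved-codec"},
    }

    def pick_firmware(device_supports: set) -> int:
        # Load the newest firmware whose transport the device also speaks.
        for version in sorted(FIRMWARE_IMAGES, reverse=True):
            if FIRMWARE_IMAGES[version]["transport"] in device_supports:
                return version
        raise RuntimeError("no common transport")

    print(pick_firmware({"h264"}))                    # un-updated device -> firmware 1
    print(pick_firmware({"h264", "improved-codec"}))  # updated device -> firmware 2

Old devices keep working with the old firmware; updated devices load the new one -- same connector, same adapter.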

Perhaps everyone complaining about a $30 adapter shouldn't have purchased a $600 phone and instead stuck with a $20 Moto Razr.


>Software is eating the world, so I'm not sure why everyone is so against this.

Added latency and worse quality are pretty big things. I want to send pixels 1:1 to my display device, which this new technology doesn't allow.


I do see where you're coming from, and agree it sucks that they've gone backward in quality in this case. To my mind those are implementation details that Apple screwed up. It doesn't invalidate the basic idea though, which is to move the brains of the device into software so that the same connector can be used for a multitude of different functions, some of which don't even exist today.

Admittedly, I'm not privy to whatever design decisions the team that implemented the connector made, but I see no reason why it couldn't have the same fidelity a straight HDMI cable would have. If I had to guess, I'd say they said 'good enough, ship it' instead of continuing to refine it, since they knew they could always send down an update later.


The point here is that the electrical design of Lightning doesn't have the bandwidth for 1080p. It runs at USB 2 speeds.

Software can't magically make hardware do things. It can create an illusion (like the Lightning adapter does), but the hardware design has to actually support the capabilities you want to provide.


I agree with what you're saying, but I'm curious about Apple's open source operating system. Assuming you mean OS X, I'm having trouble finding it in its entirety as open source; obviously it's built on open source, but I don't think the entire OS is open sourced. I'm also having trouble believing it wouldn't have been used for a Linux desktop by now.

Sorry for replying to a small part of your comment, but I really do agree with the rest.


Can you explain a little more what you're getting at? I'm not very familiar with hardware and don't know anything about video. The adapter seems like a neat hack, but definitely a workaround for something. I can't really tell if you're defending it or not.

What's the win from an engineering standpoint here? And why is this an inevitable design (which you suggest if I understand you right)? What are some other options and what are the reasons those might not have been used?


Are you seriously defending Apple by saying a serial bus is the obvious solution? Apple, the company that has forced ridiculous proprietary ports that add zero value into its products while everyone else in the space was using the Universal Serial Bus? If this post was meant to be sarcastic then I will accept a well-deserved whoosh, but ideology trumped engineering at Apple years ago.


You do realize that it was Apple that first shipped computers with USB to replace legacy ports, right?


I think you are wrong about that. Apple shipped the iMac with USB 1.1 in 1998 [1]; it was the first Mac with a USB port. USB was developed starting in 1994 by a group of seven companies: Compaq, DEC, IBM, Intel, Microsoft, NEC and Nortel [2]. A later edition of Windows 95, released in 1997, had built-in support for USB devices [3]. The market share of the Mac at that time was about 4.6% [4]. So no, Apple wasn't the first to ship a computer with USB support, nor was it the reason USB went mainstream.

[1]http://www.everymac.com/systems/apple/imac/specs/imac_ab.htm...

[2]http://en.wikipedia.org/wiki/Universal_Serial_Bus#History

[3] http://en.wikipedia.org/wiki/Windows_95#Editions

[4] http://pctimeline.info/windows/win1997.htm


Wikipedia says:

“Few USB devices made it to market until USB 1.1, released in August 1998, which fixed problems identified in 1.0, mostly relating to hubs. 1.1 was the earliest revision to be widely adopted.”

The iMac G3 was released in August 1998. I didn’t say the iMac was the first computer to have USB ports, because it probably wasn’t quite the first (although, interestingly, no other computer comes up when you try to Google this); importantly, though, it only had USB ports, and killed off ADB, serial, parallel, and SCSI, forcing users to start buying USB peripherals. My family had to get a serial to USB adapter that still worked with OS X the last time I tried it with a GPS receiver (I just looked it up and it may have finally stopped working with Mountain Lion, nearly 15 years later). It was, what, about ten years after that that most PCs finally stopped including PS/2, serial, and parallel ports?

There’s some discussion here with people disputing that the iMac “jumpstarted” the USB market: http://skeptics.stackexchange.com/questions/2785/did-apple-j...

Someone complains that Apple promoted FireWire and was late to support USB 2.0 -- but FireWire came out first and was technically superior (although some devices do have weird compatibility issues). They also complain that Apple dragged its feet supporting USB 3.0 because it was trying to promote Thunderbolt, but I believe this was because Apple is using Intel chipsets (because Intel killed off Nvidia's chipsets) and Intel was doing exactly what this person accused Apple of doing.


USB was around long before Apple stopped providing PS/2.

If you buy a phone or tablet from anyone that isn't Apple you will very likely get a USB port. If you buy Apple you will not. Trying to argue that Apple is somehow looking to the future by providing a serial bus years after it was the norm is hard to comprehend.


Apple never provided PS/2; their solution was the (proprietary?) Apple Desktop Bus: http://en.wikipedia.org/wiki/Apple_Desktop_Bus

You may be thinking of the licensed Mac clones of the mid-90s: some of them did include PS/2 ports.



