No. And contrary to what others have said, more Germans know English than Dutch people do, on a day-to-day basis. Though you can get by without knowing Dutch in Amsterdam with a little creativity.
Every educated Dutch person knows English. Even more than that - they willingly switch from Dutch to English when an international person joins the conversation. I can't say I got the same impression of German people. They have a very monolithic culture, and it's not very open to foreigners.
So if you don't want to learn either Dutch or German, you're better off moving to Amsterdam than to Berlin.
This is not what I experienced. I live in Germany, but I visit the Netherlands quite often. I was even able to speak English with elderly people when I got lost in a small Dutch village.
In my experience, everyone in the Netherlands speaks English. I had trouble getting by in Berlin even though I know some basic German (and had to fall back on English when that wasn't enough). English alone does NOT cut it; you need at least some basic German (maybe 1 or 2 months of studying).
Some Dutch words look like English words, and a lot of Dutch words are similar to German. So if you're an English-speaking expat in Amsterdam, learn German instead of Dutch (Goethe Institut is on Herengracht). Then you'll understand a bit of Dutch (though you don't have to since everyone speaks English), and you'll be all set when you move to Berlin later.
If you run a successful business in Holland and pay yourself a salary out of it, your tax rate is over 70%! And on top of that, you have 20% VAT. And this isn't even going into the manifold fees and other types of taxes you are subjected to.
-------------------------------
This comment is not spam. It had a nice positive score until the socialists arrived. Of course, socialists being unable to deal with reality (or history!) must censor those who point out simple facts that undermine their ideology. And so this comment has been downvoted to obscurity (-15 at this moment) to make it less likely others will be made aware of the high tax rates in Holland.
This kind of dishonesty and censorship is why you can't have fruitful discussion on Hacker News. The site is overrun by ideologues and moderated by ideologues, and anything that doesn't strictly toe the line of the fascist modern ideology gets the boot.
Ok, just accept, though, that you're anti-intellectual in a very profound way.
When you can't tolerate people stating inconvenient facts, it's time to admit you're an intellectual coward.
Frankly, I think you're all shit. You've elected a guy who claims he has the right to kill any American with a drone strike, in America, without any due process.
And you're so proud of it! It's astounding. You should be ashamed. Profoundly ashamed.
But know this: it is your cowardice and intellectual dishonesty that caused the state of affairs we find ourselves in. Run away to Amsterdam if you like, and see how long it remains nice.
I looked at Amsterdam as a move and it was about the same as the UK and other European states, and as I have the maximum years in for the UK pension scheme, I was tempted to move to Holland to build up a second pension there.
The lack of VCTs, CGT exemption, and taper relief is an issue for smaller investors. Seems to me that the Dutch have a lot of German-style Mittelstand companies and the CGT/tax system is targeted at that.
Ok, so you don't pay 50% on a third of your salary for 10 years. I wouldn't exactly call that "an amazing tax break". You're still paying way too much in taxes.
Um, you do know how to calculate percentages, right? This effectively makes 30% of the base tax-free. You do not get taxed at 52% on all the rest; the tiered tax rates still apply to the remaining 70%, on top of the 30% exemption.
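A rough back-of-the-envelope sketch in Python of how the ruling interacts with the brackets. The bracket thresholds and rates below are placeholders for illustration only, not the actual Dutch brackets for any given year, so check the real numbers before drawing conclusions:

    # Back-of-the-envelope: effective tax rate with and without the 30% ruling.
    # Bracket thresholds/rates are PLACEHOLDERS for illustration only.
    BRACKETS = [               # (upper bound of bracket, marginal rate)
        (20_000, 0.37),
        (56_000, 0.42),
        (float("inf"), 0.52),
    ]

    def income_tax(taxable):
        """Progressive tax over the illustrative brackets above."""
        tax, lower = 0.0, 0.0
        for upper, rate in BRACKETS:
            if taxable <= lower:
                break
            tax += (min(taxable, upper) - lower) * rate
            lower = upper
        return tax

    def effective_rate(gross, ruling_30pct):
        # Under the 30% ruling, 30% of gross is paid out tax-free;
        # only the remaining 70% goes through the brackets.
        taxable = gross * 0.70 if ruling_30pct else gross
        return income_tax(taxable) / gross

    gross = 100_000
    print(f"without ruling:  {effective_rate(gross, False):.1%}")  # ~45% with these brackets
    print(f"with 30% ruling: {effective_rate(gross, True):.1%}")   # ~30% with these brackets

Even with made-up brackets, the point stands: the ruling pushes the effective rate well below the 52% top marginal rate, nowhere near 70%.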
Tax rates in continental Europe can indeed be steep compared to the UK and the US. In the Netherlands the top rate is 'only' 52%, not 70%. Most people, especially company owners, pay much less.
Let's not make up numbers... Income tax is high here (not 70%), but you know, you get what you pay for. If you're American, the 30% rule kicks in as long as you are here on a Dutch-American treaty visa or sponsored by a company. Any American can start a company here and get a 1-year visa. That's a pretty low barrier to entry compared to most countries. Corporate tax is also incredibly low, with a ton of benefits for the first 3 years you are a B.V. (Dutch corp).
You don't have to be American to qualify for the 30% rule. You have to be an expat who qualifies as a Knowledge Worker and was recruited to the Netherlands for that purpose (if you were already living in the Netherlands when you got the job, you don't qualify). You can avail of the 30% rule for 10 years.
Those aren't made up numbers. I actually did the math based on my income. How about you stop being a dishonest coward who characterizes your opponent, rather than making an argument, hmm?
You don't get what you pay for, of course, if you bothered to actually add up what you really do pay, and what you really do get, but then that is the essential swindle isn't it? You and all the rest of the thieves like to pretend like it's better to pay government $20,000 a year for services the free market would happily deliver for $20. And then you act as if those of us who point out we're getting ripped off are "greedy", or in your case-- dishonest.
Well, if you have to lie to make an argument, that tells me even you know your argument is the suck.
Would you be interested in posting the numbers you used for your calculations? I appreciate it is confidential information, but I find it very difficult to believe your tax rate is 70%.
If you are in fact paying 70% of your income in taxes, I can recommend an accountant who can advise on a more efficient tax strategy.
I would prefer it if you didn't reply and call me a socialist, or complain that I am dishonest - I have no horse in this race, I am merely pointing out that if you are in fact paying 70% tax, you could save a lot of money by talking to an accountant who specialises in tax efficiency.
Why all the hostility? People are asking you where you got your numbers, which is reasonable considering your comment provided two percentages and nothing else. Your unwillingness to back up your opinions with facts that we can check and verify marks you as an intellectual coward as well. Stop calling the commenters names and engage in the conversation, please.
I'd also like to know where your numbers are coming from, since I'm in the middle of deciding where to found my first company and any information that strays away from what is popular tends to get my attention.
Milton Friedman did a study of healthcare in the USA, and found that the government "help" had driven up the cost 26-fold and driven down availability. Or put another way, the reason people could get bankrupted by an illness is socialized medicine, and it's also the reason they might not be able to find care when they need it. Contrary to what Obama says, socialism doesn't work, and if you think you'll get better healthcare in the NL than in the USA, you'll have a very rude awakening... but of course you won't be alive to post your mea culpas here on HN. Frankly, even assuming this shows a profound ignorance of both history and economics.
Let's add a little balance: health care in the US can be excellent and yes, better than in the Netherlands. (I'm Dutch, by the way, and have friends who have flown to the US specifically for health-related issues. There are stories in the media here about communities saving up for sick kids so they can get help in the US. The US is at the forefront of healthcare.)
However, this excellent care is only available to people who can afford it. In the Netherlands they'll help anyone, and they won't ask you for your credit card. No one goes bankrupt because they got ill and needed an operation. The downside is that they can't help everyone right away, and you may need to wait some time before it's your turn. I had a nice talk about this with a surgeon for about two hours while he and his team were inserting four big metal tubes into my eye, draining all the fluid before pumping it back up again with air and freezing the tissue back together again from the inside. At the end of the year I paid about 350 bucks and got my eyesight back. The 350 bucks included all my checkup visits.
Milton Friedman is so full of corporate interest, you can probably make two senators out of him.
Seriously, explain how health care in any other 'socialist country' is way cheaper than in the US if government investment drives prices up. I dare you.
A little thought experiment... some clever dudes manage to reverse engineer enough to "jailbreak" this, and then they put a little OS image on it and start hacking away... then a bunch of people say "hey, this is really cool! It's like Apple made a little raspberry pi for us!" but then even more say "but it's so limited, it doesn't have USB out, etc, etc."
And then dozens more just go on and on about how Apple "crippled" the device by not giving it USB and how this "proves" Apple just wants "control" and why did they have to jailbreak it anyway?
In the process of getting their panties all bundled up they never realize they're bitching about an adapter not being a general computing platform.
They're also proving Apple right-- it's engineered to solve a specific problem and provide specific functionality. Even if it were jailbroken from the factory, people would be complaining and demanding that it does other things... than what it was designed to do.
You're extrapolating everything into a very silly future-tense strawman that won't ever exist. Nobody will be complaining that Apple didn't put USB in a proprietary SoC intended for video transfer.
"And then dozens more just go on and on about how Apple "crippled""
Exactly. "Dozens". Apple isn't building products for "dozens". They are building for the people who literally stream into the Apple retail store (non techies who clamor for their products) at all times of the day when other stores sit idle. I can walk into the Apple store near me (and it's not in Manhattan but in a suburban mall) and while the other upscale shops are idle, say, Tuesday at 10am the Apple store is quite busy.
Ideology has trumped engineering, and as hackers, you shouldn't tolerate it.
Frankly, all of this has been obvious all along to any competent engineer, since the moment Apple introduced Lightning. They described it as a serial bus and talked about how it gave more flexibility. If you think about it for 2 seconds, it's obviously better to run protocols over a serial bus than to run 30 lines of individual signals, with dedicated lines for analog and dedicated lines for digital, in a world where people want HDMI adapters for a connector that originally had FireWire signals on it, from a time before HDMI was even common.
But this is Apple, so the REAL reality distortion field kicked in-- a tsunami of press acting as if Apple was ripping people off with a $30 adapter, hundreds of mindless conspiracy theories from Apple bashers on Hacker News about how this is to have more control over people and how this once again proves that "open" (defined as google, not actually anything to do with openness) is better than "closed" (defined as Apple, you know the company with the most popular open source operating system in the world?).
It's one thing to not know enough engineering for this to have been obvious to you, it's quite another to spread lies and engineering ignorance as a result of your ideological hatred of Apple. And the latter is basically all I saw on HN about this format. (Which is one of the reasons I write HN off as worthless for months at a time.)
What you're saying is true from an engineering standpoint (serial vs parallel), but has to be placed in the customer's context.
In this specific case the quality is bad, operation is unreliable, and the price is high. Consumer devices accept HDMI as input. Serial to parallel video (Lightning to HDMI) is tough without some heavy-duty hardware -- hence the exorbitant cost of these adapters.
The SoC design introduces a massive amount of complexity. This has yielded unreliable operation. And it introduces that complexity at a point of physical vulnerability -- people don't treat adapters like tiny fragile computers. They treat them like, well, adapters.
End-to-end serial communications would be nice, but that's not the world we live in.
Lightning isn't that much smaller than HDMI or Micro-HDMI. Reversibility is a very minor feature, and not worth the price being paid.
And that's not a $30 adapter. It's a $50 adapter. Did you think it was $30? That was the old one -- parallel to parallel.
After thinking a little bit about it, I think this approach does make some sense and allows for more flexibility in the future. Keep in mind Lightning was likely designed to last well over a decade and will be used in many different devices.
Now, since the adapter is a SoC and its OS is booted from the device, what that means is that every device has essentially full control over how it wants to output HDMI, without having to change the adapter or the port. Right now this is accomplished using this quirky h.264 encode/decode workaround, but this is first-gen, and it doesn't have to stay that way. Future iDevices might load a different OS onto the SoC and output lossless 1080p using the exact same adapter! And without breaking older devices.
It frees Apple from having to define a fixed method of transmitting HDMI over Lightning now, that is then set in stone for the next 10 years, and has to be supported by every future device.
It also frees them from having unnecessary pins, which might become useless in the future but would have to carry over to every new device (a.k.a. the 30-pin connector). And knowing Apple, probably THE top priority of Lightning was to have a slick, easy-to-plug-in-and-out, user-friendly connector, which Lightning admittedly does way better than any Micro-USB standard.
Because in essence, the only thing that is fixed about Lightning is the physical shape and pins, so they focused on getting that aspect right and future-proof. How the data is transmitted can be changed on a per-device basis.
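Purely as a hypothetical illustration of that "the brains live in the firmware" idea (none of these names correspond to Apple's actual protocol; they're invented to show why the physical connector doesn't pin down the video format):

    # Hypothetical sketch: an adapter that runs whatever firmware the host loads.
    # Invented names, not Apple's real protocol.

    class Adapter:
        """A dumb SoC: it does whatever the firmware the host pushed tells it to."""
        def __init__(self):
            self.decoder = None

        def boot(self, firmware):
            # The host device pushes the decoding logic at plug-in time.
            self.decoder = firmware

        def output_hdmi(self, stream):
            if self.decoder is None:
                raise RuntimeError("no firmware loaded")
            return self.decoder(stream)

    # Today's device might ship firmware that decodes a compressed stream...
    gen1_firmware = lambda stream: "decoded compressed stream: " + stream
    # ...while a future device could push different firmware over the same pins.
    gen2_firmware = lambda stream: "passed through lossless stream: " + stream

    adapter = Adapter()
    adapter.boot(gen1_firmware)
    print(adapter.output_hdmi("frame data"))

    adapter.boot(gen2_firmware)   # same adapter, same connector, new behaviour
    print(adapter.output_hdmi("frame data"))

Same connector and same adapter hardware, but the behaviour is defined by whatever the host loads onto it -- which is the flexibility argument in a nutshell.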
The problem isn't even serial-to-parallel - HDMI is serial based - the problem is that Apple apparently designed Lightning with insufficient bandwidth for uncompressed video, then kludged around it. Then Apple fanboys went on and on about how much more elegant it is than MHL, which has much cheaper HDMI adapters and better video quality because all the MHL adapters have to do is convert one uncompressed serial video data format to another.
I mean, technically speaking Samsung or any of the other manufacturers could've done the same trick as Apple using plain old micro-USB OTG 2.0 with no special hardware support in their phones, no special connectors... but the reviewer community would call them out on it because it's ugly and user hostile, if their engineers even let them get that far.
I strongly disagree that reversibility is a small feature. Plugging the charger into a new iOS device is as effortless as plugging in a headphone jack.
Non-symmetrical connectors are an affront to usability.
I agree. I don't even have a device with the connector (yet?), but it seems like a major advantage.
Who are all these people popping out of the woodwork wanting a wired connection from their phone to their TV? I'm sure some people do this sometimes, but so many? Why would you even do that? Perhaps this is uncharitable, but it makes me think that most of the people complaining here have never done it, never will, probably never even thought about doing it before, but are now outraged at the thought that the connector is not 100% perfect for this one uncommon use-case.
The only two use cases I can think of are playing a video you recorded on your phone at a family gathering, or playing Netflix on your TV without needing to hook up a Roku or similar.
Edit: third use case, hooking this up to a monitor to turn your smartphone into a desktop computer.
Thanks, that's the first use case that I could actually see using myself. Don't think it's quite enough to get me to go buy a cable, but I can see it being handy for that.
It's especially strange because if you want to do that, you can Airplay the video to the TV, and not have to deal with a cable from your phone to your TV.
Asymmetric connectors weren't bad, it's when you have asymmetric connectors in rectangular plugs that it becomes a problem. I never tried to put FireWire 400 in backward, but USB is awful.
"And it introduces that complexity at a point of physical vulnerability -- people don't treat adapter like tiny fragile computers. They treat them like, well, adapters."
I don't understand this. What makes these things any more fragile than a regular adapter? They are, as far as I understand it, compact, fully solid-state, and about as strong as any consumer electronics of that size would be.
> defined as Apple, you know the company with the most popular open source operating system in the world?
I'm not sure which OS you're talking about...perhaps you could point me to the source code of either OS X or iOS? Certain core components of OS X are open source, but Darwin isn't OS X.
As someone who makes his living from writing Objective-C code, I don't have any ideological objection to Apple. But I think you shouldn't accuse people of spreading "lies and engineering ignorance" when you seem to be claiming something that's patently untrue.
Perhaps you didn't experience the PR storm around the time OS X was initially released: it was heavily geared towards nerds and the literal phrase "open source" was extensively employed in their marketing material.
This aside, you're basically trying to write off nirvana's (IMHO excellent) rant using a minor technicality, one of the common features of the discussions here that tends to make my skin crawl.
you're basically trying to write off nirvana's (IMHO excellent) rant using a minor technicality
He's not trying to write anything off and it is a pretty big technicality. nirvana should have omitted that ideological jab to begin with as it was unnecessary and shows his bias for Apple/against Google. Google is more "open source" than Apple in any way that matters, considering Android is currently on devices their competitors (Amazon) are selling. Apple is mostly responsible for WebKit which is commendable and useful but as far as practical considerations go, nobody gives two shits about Darwin.
But yes, engineering should be the focus and people assume the worst with Apple.
>But yes, engineering should be the focus and people assume the worst with Apple.
Some people sometimes assume the worst of Apple, or Microsoft or Google, or [insert name of company here]. One of the things that can cause strong anti-Apple sentiment are the rabid fanboys (ie. postings like Nirvana's). They paint Apple to be patron saints - and when reality hits (like it did for me with antenna-gate), users are annoyed because of the unrealistic expectations, but also because of the RDF created by Fanboys.
For the record, I think Apple have shown the phone industry a thing or two about engineering excellent products while maintaining a strong focus and excellent compromises. I just wish the rabid Fanboys would shut up, or present a balanced view ... it would make Apple a lot easier to respect.
I would tend to agree wrt HN nitpicking except in this case nirvana is presenting himself as the paragon of objectivity setting the story straight against the unwashed rabble of knee-jerk Apple haters, when in fact he is nowhere near objective when it comes to Apple, and shooting ignorant fish in a barrel is not good enough to validate his points. He needs to be held to a higher standard.
The technicality being that Darwin ceases to be an open source operating system once it is shipped with closed source components. Why would this be true?
OK here we go, and why not, after all it's Sunday and I've got nothing better to do.. right?
The central tenet was that discourse here is regularly devoid of sound engineering because it tends to be blinded by mindless cultural perceptions of the companies involved in whatever happens to be under discussion. In the case of Apple the expectation is their products are flawless and if not then all hell will be paid on blogs and comment sections everywhere.
Whether or not Darwin is or isn't open source doesn't freaking matter, it was heavily marketed as such back in the sands of time and even if this wasn't the case it doesn't invalidate the central point made in the rant - that just because this device has an Apple logo every popular discussion surrounding it turns to mindless diatribe as a result of non-engineering centric expectations people place on their products, and every engineering-centric party (i.e. hackers) must deal with the whining polluting engineering-centric forums for days every time it happens.
In effect, the complaint is that commenting resembles the squabble of a throng of uninformed consumers rather than the judicious discourse of a forum of engineers.
I remember back in the early days – from Mach on black hardware through Openstep on 4 different architectures – the folks from NeXT were always very careful to use the phrase "system software" when referring to the whole thing and only using the phrase "operating system" when referring to the layer that supports basic functions like scheduling tasks, invoking user-space code, and controlling peripherals.
This is one of the things I appreciated about them back then, as they were respectful of the nomenclature actually used in computer science.
Now I realize that the phrase "operating system" commonly receives slight colloquial abuse to refer to everything inside the shrinkwrap, but I think the formal meaning hasn't completely died yet, so nirvana should be allowed to use it properly if he so desires.
Darwin is the operating system. Trying to point out that the whole system stack is considered not open source because the windowing system isn't open has nothing to do with it. Does my Ubuntu system become not open source when I use the binary nvidia drivers?
So what does Lightning Digital AV Adapter do today that could not have been done using Micro USB->MHL? Looking at the reviews the Apple solution is a) pricey b) over engineered and c) performs worse than MHL. And you are talking about Apple haters creating a reality distortion field?
One way you could defend it is to promise features that can be programmed into the adapter firmware but if today it does 720p poorly I see no reason to believe something much more useful/better will come later. I am paying the $50 today, not in the future.
> The Samsung Galaxy S III and Galaxy Note II use a connector that is similar to the original 5-pin MHL-HDMI adapter/dongle, but it uses 11-pins in order to achieve a few functional improvements over the 5-pin design
Both Micro USB and MHL seem to be industry standards - ref Wikipedia. A lot of vendors seem to be putting out cheaper Micro USB to MHL adapters - so even if (Micro USB->MHL) may not be a standard, it is at least built on two standards, and proprietary licensing / approval seems to be unnecessary for vendors.
Also most MHL adapters seem to do 1080p - the Lightning one seems to do 720p badly if we are to believe Apple store reviewers.
The sad thing is that Lightning is even inferior to the Apple 30-pin connector for this use. You get better video out on an iPhone 4S than on a brand new iPhone 5, and the connector is $10 less and has a pass-through so you can charge it at the same time.
This seems like the kind of solution you would bodge together if someone gave you the already designed Lightning connector and said "now make it support VGA and HDMI". A completely crazy hack. Bashing Apple over it seems completely reasonable.
I have always felt this way, I just don't feel a need to make a noise about it.
Be wary of confirmation bias and sample set bias (you only hear the worthless noise from those who are speaking it) when reading sites like HN/Reddit/etc.
It's a lot easier to hit the upvote button than it is to type a comment. Not all of HN's constituents are whiny blowhards.
I agree 100%. Software is eating the world, so I'm not sure why everyone is so against this.
Maybe it's a case of I didn't complain when they came for my DEC Alpha server with green screen, nor when they took away my Token Ring network, but I will not stand for only using one flimsy cable for all my devices. Come on, this is the tech industry, what did you think was going to happen?
What I see Apple's done here is future-proofed the connector. Ok, so it doesn't output 1080p today, but I see no reason why it couldn't tomorrow. Devise a new protocol, download an update to all the iDevices which in turn upgrades all the adapters out there, and everything's golden. Once this (admittedly painful) transition is complete, I see no reason for Apple to have to endure another one. By the time it's outdated, I'm sure everything will be wireless.
Perhaps everyone complaining about a $30 adapter shouldn't have purchased a $600 phone and instead stuck with a $20 Moto Razr.
I do see where you're coming from, and agree it sucks that they've gone backward in quality in this case. To my mind those are implementation details that Apple screwed up. It doesn't invalidate the basic idea though, which is to move the brains of the device into software so that the same connector can be used for a multitude of different functions, some of which don't even exist today.
Admittedly, I'm not privy to whatever design decisions the team that implemented the connector made, but I see no reason why it couldn't have the same fidelity that a straight hdmi cable would have. If I guessed, I'd say that they said 'good enough, ship it' instead of continuing to refine it since they knew they could always send down an update later.
The point here is that the electrical design of Lightning doesn't have the bandwidth for 1080p. It runs at USB 2 speeds.
Software can't magically make hardware do things. It can create an illusion (like the Lightning adapter does), but the design has to support, for real, capabilities you want to properly provide.
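For a rough sense of the gap, here's a back-of-the-envelope calculation, assuming 24 bits per pixel and USB 2.0's nominal 480 Mbit/s signalling rate (and ignoring blanking intervals and protocol overhead):

    # Raw 1080p bandwidth vs. a USB 2.0-class link.
    def raw_video_mbps(width, height, fps, bits_per_pixel=24):
        return width * height * fps * bits_per_pixel / 1e6

    usb2_mbps = 480  # nominal USB 2.0 signalling rate, before overhead

    for fps in (30, 60):
        needed = raw_video_mbps(1920, 1080, fps)
        print(f"1080p{fps}: ~{needed:,.0f} Mbit/s needed vs {usb2_mbps} Mbit/s available")

Uncompressed 1080p30 alone works out to roughly 1,500 Mbit/s -- several times what such a link can carry -- which is why the adapter has to receive a compressed stream and decode it on-board rather than just passing raw video through.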
I agree with what you're saying, but I'm curious about Apple's open source operating system. Assuming you mean OS X, I'm having trouble finding it in its entirety as open source; obviously it's built on open source components, but I don't think the entire OS is open sourced. I'm also having trouble believing it wouldn't have been used for a Linux desktop by now.
Sorry for replying to a small part of your comment, but I really do agree with the rest.
Can you explain a little more what you're getting at? I'm not very familiar with hardware and don't know anything about video. The adapter seems like a neat hack, but definitely a workaround for something. I can't really tell if you're defending it or not.
What's the win from an engineering standpoint here? And why is this an inevitable design (which you suggest if I understand you right)? What are some other options and what are the reasons those might not have been used?
Are you seriously defending Apple by saying a serial bus is the obvious solution? Apple, the company that has forced ridiculous proprietary ports that add zero value into its products while everyone else in the space was using Universal Serial Bus? If this post was meant to be sarcastic then I will accept a well deserved whoosh, but ideology trumped engineering at Apple years ago.
I think you are wrong about that. Apple shipped the iMac with USB 1.1 starting in 1998 [1]. It was the first Mac with a USB port. USB was developed in 1994 by a group of seven companies: Compaq, DEC, IBM, Intel, Microsoft, NEC and Nortel. [2] Windows 95 OSR2.1, released in 1997, had built-in support for USB devices. [3] The market share of the Mac at that time was about 4.6% [4]. So no, the iMac wasn't the first computer to ship with USB support, nor was Apple the reason USB went mainstream.
“Few USB devices made it to market until USB 1.1, released in August 1998, which fixed problems identified in 1.0, mostly relating to hubs. 1.1 was the earliest revision to be widely adopted.”
The iMac G3 was released in August 1998. I didn’t say the iMac was the first computer to have USB ports, because it probably wasn’t quite the first (although, interestingly, no other computer comes up when you try to Google this); importantly, though, it only had USB ports, and killed off ADB, serial, parallel, and SCSI, forcing users to start buying USB peripherals. My family had to get a serial to USB adapter that still worked with OS X the last time I tried it with a GPS receiver (I just looked it up and it may have finally stopped working with Mountain Lion, nearly 15 years later). It was, what, about ten years after that that most PCs finally stopped including PS/2, serial, and parallel ports?
Someone complains that Apple promoted FireWire and was late to support USB 2.0, but FireWire came out first and was technically superior (although some devices do have weird compatibility issues), and that Apple dragged its feet supporting USB 3.0 because they were trying to promote Thunderbolt, but I believe this was because Apple is using Intel chipsets (because Intel killed off Nvidia’s chipsets) and Intel was doing exactly what this person accused Apple of doing.
USB was around long before Apple stopped providing PS/2.
If you buy a phone or tablet from anyone that isn't Apple you will very likely get a USB port. If you buy Apple you will not. Trying to argue that Apple is somehow looking to the future by providing a serial bus years after it was the norm is hard to comprehend.
Because Steve Mann is into wearable computers, and Google Glass is not a wearable computer project. Your wondering this is understandable, because Google has misled you. But Glass has no CPU; there's not enough power or space for one. The intelligence lives on Google servers, where the speech recognition is done, and everything else, and Glass is useless without a net connection (so you have to be in an area with wifi or have a smartphone handy to tether to).
Steve Mann has been working on head mounted displays, true, but he's been focusing on local horsepower wearable computers.
Ultimately, I don't think google glass is anything more than a PR project to remove the stigma of google as ripoff artists and to make it look like they're innovative.
Given current technology, glass on wifi should have about 20 minutes of battery life, maybe an hour. Which makes them pretty useless.
There's really quite a difference between a wearable computer (what Mann works on) and a bluetooth headset with integrated display (what glass is.)
> But Glass has no CPU; there's not enough power or space for one. The intelligence lives on Google servers, where the speech recognition is done, and everything else, and Glass is useless without a net connection (so you have to be in an area with wifi or have a smartphone handy to tether to).
I guess this CPU-less device talks to a Wifi connection via... magic.
For me, a CPU is something that executes instructions and has its own instruction set. It can even be built on a breadboard, and you could eventually invent your own instruction set.
Even low powered microcontrollers have a CPU (microcontrollers are small, low powered computers), and microcontrollers can come in many sizes[1]
Your point could be that it's just a head mounted display on top of an ASIC, which I doubt it is.
Right now I think Glass is just a display for smartphones and a way to use Google services, which I think is quite limited (you said it's useless without the net; I agree). Right now we don't even have the tech to run sophisticated speech recognition on a smartphone without a couple of servers crunching statistical formulas, so why would you think it would be different with a low powered device?
EDIT: Basically my last paragraph is saying that I agree with you, just without being as harsh in the comments. This could be the beginning of the wearable computer revolution, along with an iWatch.
Android has offline speech recognition (introduced in Jelly Bean). From my limited testing it works really well. So, I don't think an external server is as required as some people say.
It works insanely fast, transcribing what I say in near real time which feels like black magic compared to Siri on my iPhone that has to record an audio clip in its entirety, send it up to their servers, process it, then send a response back.
Glass could use the Android handset as a remote server--it shouldn't matter if the Android crunches voice on its own or with a data connection, all that matters is Glass getting a reply from the API call.
>Ultimately, I don't think google glass is anything more than a PR project to remove the stigma of google as ripoff artists and to make it look like they're innovative.
I disagree with everything about your comment, but especially this part. What on earth are you talking about? Labeling Google as non-innovative and Glass a mere "PR project" is shortsighted (oops, forgot about self-driving cars) and quite frankly, something I'd expect from an Engadget comment thread, not someone who managed to snag "nirvana" on HN.
I don't see how the details of the hardware platform have anything to do with the advice he could give on the effects on the viewers eyesight, optics, etc.
I agree with your post in its entirety, except for the bit on battery life, as I spoke to an engineer using Glass a few weeks ago and 5-6 hours of real-world use was the touted number (for a mix of Wi-Fi and Bluetooth use).
I still feel that is very poor though, for Glass to really be useful it should last a long working day, and ideally all of your average waking hours.
It remains a bluetooth headset with integrated display and camera. The phone and some remote servers do the real work.
Pure bullshit. If you seriously believe that Glass doesn't have a CPU or has under 20 minutes of battery life, your raging Apple partisanship is showing through even more than normal.
I'm willing to bet that you haven't used Glass, but don't let that stop you from desperately trying to portray it as vaporware, a PR stunt, or some kind of scheme masterminded by Satan himself on EVERY HN thread you can plausibly cram that nonsense into.
Between your recent posts about Glass and taligent's constant downplaying of Google's maps and autonomous cars, I have to wonder what makes you two work so hard at spewing blind hate for them here.
I like your comment, but would it really be so difficult for the next generation of Glass to tether to a smartphone in your pocket that did all the computing and only used Google's cloud services for supplemental input when available? I.e. the voice recognition gets better when you've got a good data connection. The phone could be larger than average because you'd rarely need to pull it out of your pocket.
I don't see why you think there's such a large difference between local horsepower and cloud computing. They complement each other, and the ultimate technology will be a mixture of both. User interaction is the much more interesting and difficult problem.
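A tiny sketch of that local-first, cloud-supplemental idea (entirely hypothetical -- the function names and the confidence score are made up for illustration and have nothing to do with Glass's or Android's real APIs):

    # Hypothetical local-first recognition with cloud fallback. Invented names.
    def recognize_locally(audio):
        """Pretend on-device recognizer: fast, works offline, less accurate."""
        return "turn left at the next street", 0.72   # (text, confidence)

    def recognize_in_cloud(audio):
        """Pretend server-side recognizer: needs a connection, more accurate."""
        return "turn left at the next intersection"

    def recognize(audio, have_connection, threshold=0.85):
        text, confidence = recognize_locally(audio)
        # Only pay the network round-trip when the local guess is shaky
        # and a data connection happens to be available.
        if confidence < threshold and have_connection:
            return recognize_in_cloud(audio)
        return text

    print(recognize(b"...", have_connection=False))  # falls back to the local guess
    print(recognize(b"...", have_connection=True))   # upgrades to the cloud result

The point is just that "local" and "cloud" aren't an either/or choice; the device can use whichever is available and good enough at the moment.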
Does google glass not use the wearer's smartphone?
Regardless of whether the computation is done locally or not, he has been working on "augmented reality" using glasses, a great asset for google he may even prove to be.
For the record, this guy has been wearing actual computers.
Google Glass is an accessory: essentially a Bluetooth headset, display and camera built into glasses. The intelligence lives on the servers, and Glass needs a Bluetooth or wifi connection to talk to the net.
I think Google's engaging in a bit of a PR swindle by making people think Google Glass is like an iPhone. It isn't; it needs an iPhone or Android phone to connect to the net.
Consequently it can't replace a smartphone.
I'm also pretty dubious about the battery time it will get, even without having to run a local CPU.
When reading comments by nirvana you need to realise that anything done by Apple is good, and anything done by anyone not Apple (but especially MS, Google, and Samsung) is stupid, or evil, or crooked, or dumb.
"Google is trying to swindle people with dishonest PR stunts" translates into "Google is doing the normal pr stunts that every company attempt; there are problems with most pr."
In 1998, researchers with 1,000 subjects found a 93% confidence of predicting whether a comment was made by nirvana or not, just from reading the post, based on phrasing such as "the real reality distortion field".
Yes of course, technology will never progress beyond what we have today, we won't get better CPUs or batteries... ever!! /sarcasm. No, seriously, you should check out the Osborne 1.
This isn't an answer, and in fact, supports the opposite of your point :)
It would have been just as easy to use micro-usb 3.0, which had plenty of bandwidth, and do the exact same pin or format conversions.
Standardized serial buses that are less complicated already existed. What exactly do you think USB 3.0 is?
If the only reason for Lightning was "a general purpose next generation serial format", then it is definitely a horrible idea.
Of course, none of this (except DRM concerns) answers why they'd not bitstream the format and convert it to HDMI signaling instead of doing the weird crap they did here.
The 30 pin dock had 30 pins so it could put video out directly and things like that.
Lightning is a SERIAL FORMAT with 8 pins. So it streams audio and video out in an encoded form.
The AV adapter needs to take that audio and video and turn it into a standardized AV format for the AV plugs.
Now, rather than a lot of odd incompatibilities because Apple added new features to new devices that older docks don't support, we have a common communication format in Lightning that should be much more robust going forward.
Apple can add whatever protocols it needs over the serial connection to support future tech, rather than the old way of redefining what some of those 30 pins meant from period to period -- remember, the 30-pin connector started out in a time when FireWire was taking up some of those pins!
People like to ascribe nefarious purposes to Apple or claim Apple is "ripping them off" because a small computer that does digital AV conversion costs $30... and they don't realize that the 30-pin connector didn't do any conversion of formats; it just brought the signals out to a standard connector. This one actually has to do work, which is why it has a SoC on it.
People don't realize that the 30 pin connector didn't do any conversion of formats because they don't care. They want a connector, and what Apple delivered is a power-sipping SOC that actively throws information away, given the artifacts seen, and then upscales to 1080p because they didn't even have enough throughput to get lossy compressed 1080p through the bus.
So yes, you get your $30 worth of components, but it's horribly inferior to a $5 adapter you can get for every other connector on the planet. It's a rip-off all right.
A rip-off in the sense that it's crappy output, not in that they are overcharging. I've seen mention that it's likely making no money, or actually losing money, selling this cable.
I really don't get the logic. As if the cable were the thing that you would absolutely want to keep going from one version of the iPhone to the other. If a new technology comes out (let's say 4K output), you'd have to upgrade memory and CPU to handle that anyway, along with a new TV or monitor, and pretty much everything along the chain.
Hypothetically being able to keep the cable really doesn't seem like the problem anyway.
Which would go absolutely nowhere to explain why my Pioneer car stereo head unit, which used to allow remote control of my iPhone's music collection, and a Pandora app, no longer works with the iPhone 5 - "iPod Out" would be one of the first protocols you would expect to be supported, even 'out of the box', but, no it's not, rendering an $800 head unit a lot less useful.
Perhaps Apple should think of such things first, before "future proofing" things.
Despite your defense of them, and whatever merit it may have... this change has the benefit of netting Apple a healthy profit, and rendering billions of dollars of accessories obsolete.
Steam's support on OS X is the suck. I don't know what the experience is like on Linux, but given there is less competition from better stores on Linux I wouldn't be surprised to see Steam be a big hit on Linux.
As for me, I've been a Steam user for years, but given that I haven't been able to play Team Fortress for 5 months now, despite playing it for years, in the future I'll use the Mac App Store.
My experience using Steam on linux has been pretty good. The only thing is that the trailers/videos do not work because I run on 64-bit (there is a workaround for this).
As for the games, a lot of hit and miss. Team Fortress works great! However, a lot of other games require workarounds even though they have official linux support. Luckily, the Steam Linux community is quite strong and these workarounds were pretty easy to find.
I really wish Ubuntu and other Linux distros could make an app store (or software center) of similar quality. It's really a joy to use.
What do you find problematic on the Mac version? My office has weekly TF2 games and a bunch of us use Steam on OSX. The only gripe I have is that it doesn't remember my password, but other than that it seems pretty much identical to the Windows counterpart.
Personally, I'm impressed by how horribly implemented Steam's OS X gamepad support is. That takes skill to do, when other apps Just Work™.
Also, Steam.app used to suck up heaps of CPU just idling in the background, often enough to make the fan audible on laptops. That's improved lately, but still isn't perfect.
Not to mention the frequent crashes when you quit. I guess it's quitting either way, but still.