This article is dead on, all the way to the end. Takes a few minutes to read, but its observations on the market position of the device, all the way to the individual/social ramifications and the relevance vis-à-vis Meta, are as sharp as anything I’ve read about this yet.
If it seems like he’s gushing, he is. He’s speaking as someone who has seen lots of these devices and, in his view, this hardware and software outclasses anything else available.
If that bothers you but you push past it, I think you’ll find that whether you like this device or not, it could reflect or represent changes in how we incorporate technology in our lives that we might not yet fully appreciate.
Completely agree about MKBHD. My point was strictly that Stratechery is trash.
I cannot argue with Apple’s ambition, but as has been hashed over in these and many other comments, I am skeptical of the adoption story.
The [i]VP is a snazzy product without a market. It remains to be seen whether it will create the market in the way GPT-3 did or whether it will be another flop like Glass, Oculus, and every other attempt in the space.
I used the word "qualitatively" because you're right — most of the concepts we see in Apple Vision Pro aren't unfamiliar, but the implementation/execution is radically different.
For example, the Quest has a passthrough feature, but as of Monday it seems incredibly quaint. From reports, the Apple Vision Pro completely sells the "mixed reality" effect.
"My first moment with Vision Pro seeing the physical room viewed through the headset's display in passthrough, I looked down at my own hands and it felt as if I was looking at them directly. This was a powerful moment, more powerful than any previous 'first' I'd experienced in VR. I feel the need to reiterate. I was looking at my own hands reconstructed by a headset's sensors and it felt as if I was looking at them directly."https://www.uploadvr.com/apple-vision-pro/
It's a bit dated really. His era is when VC was king and everything was about how this or that tech was truly disruptive or had a deeper moat than you think.
I've never been able to get anything from the whole west coast "startup intelligence" blogosphere, and now it seems increasingly irrelevant and archaic.
No, it’s a tactic to disarm criticism by pretending to pre-address the obvious “don’t do that” response. “I know my behavior is bad but I am openly acknowledging it so you should trust me and look beyond it” is remarkable in its ability to disarm critics. To put it another way, “they let you do it. You can do anything. Grab them by the..”
That's certainly an observation (and one I agree with), but it's a stretch to call it "sharp".
Everyone and everyone's mother has already made the claim that VR is dystopian. Snow Crash was written in 1992. For literally decades before it was even implemented, VR has been the quintessential symbol for the irony of technology bringing us together while keeping us apart.
And to be clear, I'm not even criticizing the product. I think the Vision Pro could be very cool. I'm criticizing the idea that this blog said anything insightful.
As if the skyrocketing rate of anxiety, depression and self-harm that started with the release of the iPhone and Facebook weren't bad enough, now we're going to mediate human relationships even more immersively through silicon and megacorps. Mass alienation is rat poison for civilization.
Yes - that mental picture of a divorced dad watching old family photos come to life, while sitting alone on his couch - is just gutting. Like a literal punch in the gut.
But having said that, is "loneliness" the correct (or only) takeaway on that? I mean, there are people toying with the idea of uploading emails and documents of the deceased to an LLM so relatives can still "talk" to them.
I don't have a word for what that kind of world looks like.
While I think the "divorced dad" thing is one use case, I don't see it as the primary one. Granted, we're a ways from this reality, but, especially if Apple releases standalone cameras or builds recording for these moments into iPhones/iPads, it seems to me that the use case for this feature is not to replace human connection, but to have a facsimile of it when circumstances prevent you from being able to do something in the real world.
I can very much see a future where that technology is a useful tool in helping people through grief and loss, or for experiencing parts of the world that you otherwise can't (because it's prohibitively expensive, or you have disabilities that prevent access).
Examples:
- Replaying interactions with a beloved pet who has died. Reliving taking your dog to the dog park or playing with your cat.
- Replaying interactions with parents/grandparents/friends who have died, or have been altered by a debilitating illness such as ALS or Alzheimer's
- Reliving memories of taking a trip somewhere that you couldn't easily go to again (different country, national park, etc)
- Spatial experiences of beautiful parts of the earth that you've never been to
In the current world, we rely on our own imperfect memories and on inadequate still photos and 2D videos. This type of thing could end up a game-changer in 5-10 years.
Given how good Apple is at crafting images, I can't believe no one caught how sad and pitiful the scene can look. Surely they could have made the dad a grandfather and made it look much less bleak?
I really worry that if AR/VR like Apple Vision takes off, we are going to find ourselves even more alone, isolated, and depressed than ever, separated from the physical humanity we all evolved to live in. Putting a screen between everything is the theme of so many dystopian movies I don't even know where to start.
I mean, I get it. I really, really do. But, man, is this healthy, really?
I’ve kept in touch with some childhood friends digitally. And I’ve worked while in coffee shops and offices.
I have much more authentic interactions with my friends over Discord than I do with strangers or pseudo-strangers in public. In public, the interaction is intentionally inauthentic (which feels emotionally isolating to me, at least); the person wants to be nice to me to get a better tip, I want to be nice to my boss to get a better review. The online interactions with my friends are somewhat hampered by our inability to read body language over a mostly audio medium. The former will require deep societal changes to make into honest interactions. The latter can be fixed by Apple & co using cameras and gyroscopes, which they are already pretty good at. I’m much more optimistic about the latter.
I have much more real and honest interaction meeting people physically whom I don't know from childhood. Online it's all pretentious character acting.
I think this is a generational and medium thing. We clearly have different definitions of "Online".
I'm guessing your online communication is done via something that uses pictures and videos like Instagram or Snapchat?
My main communication to my friends has been text-based chat for over 20 years, and I do have real and honest communication with them. When we meet physically, we continue where we left off online. Even if people have changed hairstyles, gained/lost weight etc. I wouldn't know, we don't share pictures or videos of each other regularly.
I think the other comment identified a mismatch there; I should have been more clear—by “online” I meant in video games and over Discord. Which is to say, individual conversations. If by online you mean something like Instagram or Facebook or other platforms broadcast out to the world, yeah, that can be a bit pretentious and fake.
It's not healthy. It's not healthy because as devices and software improve, people who are in isolation use them as a further form of isolation. It's easier to escape into technology than it is to go out and interact with real humans. Life is messy and hard, but it's entirely rewarding and what we need. But everything we've come to know in the past three decades has taken us in a different direction. The technology has its value when used for specific purposes. But when it becomes the all-consuming thing that people use to attempt to solve those deeply troubling problems, that's when we have an issue.
So I think part of this only exacerbates escapism. Getting back to true reality is going to require a fundamental jolt to humanity.
I agree and disagree with this at the same time. Does true reality mean tossing ourselves out in the plains of Africa with a stick fighting off wild animals? I mean ever since that point human intelligence has been on a divergent path from animal intelligence. We create fields and towns and societies that subject us to new realities. Why is one particularly different from another?
At the same time individual divergent realities means society stops existing. What will society look like if we have our own individual universes we can occupy? If you want to take this to an extreme, one could imagine a matrix we voluntarily hook ourselves up to where the machines are left to care for the physical world.
I think there's a distinction between the physical and the digital. One might say it's electrons at the end of the day, but if you think about how we came into this world and how we've operated for millennia, putting a veil over our eyes to deny the real world is not part of that. It came much later.
Look, I'm an engineer like everyone else here. I've grown up on the internet, I've played game consoles since I was a child. I've lost myself to endless hours watching TV, movies, roaming the internet and building software until the odd hours. I'm also closing in on 40, so what I value now is not the same as what I thought was important 20 years ago. Living in some sort of matrix was really appealing back then, but as my mortality becomes more apparent I reflect more on our existence and reason for being. In my opinion it's not to defer all of our faculties to a screen strapped to our faces. How far we are from the truth is crazy. But hey, we'll keep going in that direction, and like all technology there will be the pendulum of extremes and moderation. I will likely not use one of these devices for a long time, and only when there's a compelling reason to do so and for limited periods of time. I think that's how all technology should be used: in limited capacity, for the betterment of our lives. I do not want to stare at a screen for hours on end anymore, and spatial computing is basically taking us further in that direction.
We really don't see "reality" we see a filtered and limited version of it optimized by evolution to keep us alive on the Serengeti. How useful that actually is in a world that has quickly advanced beyond that point is now the new question.
For example, if we had long term space stations that people were born on, had babies on, and aged and died on, their idea of reality and yours would be significantly different. We define reality by the environment and society around us. Reality doesn't have a fixed value.
This is a separate warning from the LLM/Humanlike agent crowd too. Yuval Noah Harari warns that the battle of the past was attention, the battle of the future will be intimacy.
I don't agree with his comment, but he does address it:
> In other words, there is actually a reason to hope that Meta might win: it seems like we could all do with more connectedness, and less isolation with incredible immersive experiences to dull the pain of loneliness. One wonders, though, if Meta is in fact fighting Apple not just on hardware, but on the overall trend of society; to put it another way, bullishness about the Vision Pro may in fact be a function of being bearish about our capability to meaningfully connect.
Apple has managed to combine AR and VR, by simply re-creating the AR inside the VR.
The idea is simple but profound.
However, for a company that prides itself on "shared" experiences, the VR is for "lonesome" experiences in some cases (watching a movie, viewing vacation photos, etc.), but bang on for others (especially at work).
This will spur the industry, just like the iPhone did, and hopefully we will have a lot of use cases. To list a few:
1. Watching your family from afar, by linking two VR devices in real time.
2. Incredible engineering and design advances, with the combination of real time sharing, AR and VR merging.
3. A whole new way to experience content. Honestly, the idea of an unlimited screen size is appealing.
4. Incredible potential for gaming.
Anyone with little kids who like to play outside who also happens to own an Oculus already "knew" this (maybe it's just my family): my kids' favorite game on the Oculus is to spray paint the "play area" in the combined B&W 'outside' rendering. They've always demanded to be able to play Beat Saber but also to be able to "see the room", so they can see their friends when they play. They don't even care that it's crappy, low-quality B&W; actually, that's kind of a charming feature!
I think what's stopped other companies is that it was too damn hard; not that it wasn't the obviously superior way of doing things. (Obviously superior: render AR onto VR.)
Not just too hard, but also maybe too expensive. Vision is using 12 cameras, LiDAR, and a lot of processing power to deliver it. People thought it was crazy when Apple started trying to get 6+ cameras on a single phone and using 3+ at once for single photos/videos (before they started to emulate them after they achieved that). (In some respects, the early critiques of, for instance, the iPhone 13 in retrospect start to look like hints of some of Vision's early planning/testing stages.)
Apple seems to have prioritized AR in a way that Meta didn't, and that led them to exploring all the costs of doing it right. (You can argue that Microsoft was also paying attention, as their Mixed Reality-focused playbook resembles what Apple seems to be playing at. Microsoft just saw the costs as a reason to focus first on Enterprise customers. Apple seems to have the gumption to try to make this a consumer-accessible play from Day One, including involving Disney to signal that. That's a bet that will be interesting to watch play out.)
I'll say one of the best moments in VR for me was in the lobby of Star Trek: Bridge Crew, just nerding out with a complete stranger while waiting for others to join... I think we even sat around and chatted until we had the whole crew.
Does the Quest 2 do it at all? I was under the impression that it only has low-ish resolution IR (or possibly monochrome) cameras; I don't think it ever pretended to be an AR/MR device. The cameras were for controller detection, and passthrough mode was only added because it could be (and that was better than nothing).
It's monochrome and grainy. Passthrough half-works. It's great to have /something/ but it falls far short of really usable. You can drink from your water bottle or look around to find where the dog is, but that's about it. God forbid you walk more than a few feet outside your guardian (the boundary you draw) or for more than a minute; if you do, it can disconnect you from your game, kill the running app, and/or lose your guardian boundaries even once you walk back.
If I'm playing VR with friends it really sucks if you need to get/refill your drink, go to the bathroom, or take the dog out. I wouldn't try taking the dog out with the headset on, but the other two I would (because taking off the headset almost guarantees you will be disconnected, and Meta's party-type/game-joining features are absolute shit), and even then it's hit or miss.
So... what's the killer app for this thing? Best I can tell it's a status symbol for a particular type of computer nerd.
But blather about the Apple "Vision" has replaced blather about GPT-based chatbots on HN, so I am thankful for that.
But I am not being facetious when I say that I don't see any application for the technology that justifies the $3500 price tag. But I'm more than willing to be educated.
Is BMW going to use it to train assembly line workers?
Is it going to be used to assist people living in memory care by recognizing faces of people in their field of view and displaying identity annotations?
Are construction workers supposed to wear it so they can view 3D plans overlaid on top of actual construction? Maybe its cameras can create a 3D model of existing construction to beam back to the site architect to get input about how to proceed?
These all seem plausible, but none of them are consumer apps. Is Apple now not selling direct to consumers? I just don't get the business around these things.
The power in the headset makes it sound like it replaces your laptop.
So for $3,500 I get a new laptop plus a 100 inch screen. To buy those separately would be far more than $3,500.
In my case I plan to buy it on day one because I want it for working. I'm tired of having to choose between enjoying the nice weather outside with a tiny 15-inch laptop screen and being inside at my desk with my three 32" screens.
With Apple Vision, I can have my three 32" screens outside. Or on the road. Or in the office when I drop in for a day. Or anywhere else I may go. It will be a huge boost in productivity and definitely worth the price.
That is the killer feature for me -- portability of my most productive setup.
We've been working on fixing those issues since 1985. In the past decade we've gone from 20% of people getting nauseous wearing HMDs to 17% of people getting nauseous.
Is it just because it's an iProduct that it will magically not have the same problems?
There would be private units within Apple testing this headset for hours and days. And now that it's announced, this testing will be broadened to the more public areas of their facilities.
They're gunning directly for work/productivity use, because of or despite what they've learned about how people feel using it. Either they think people will use it for a couple of hours with occasional breaks (lunch, etc.), or the nausea and so on are less of an issue.
This is a massive investment/bet, so presumably they know what they're doing.
The devil is in the details. If you can only mirror your laptop screen, it's not that useful, unless you can run Vision-native apps next to it. Then I only have to mirror my laptop for laptop-only stuff and can run things like Safari and Notes and Mail and all the other stuff separately.
If I can set up vision as a 100 inch 8K external monitor, then that becomes much more interesting, because with that much screen real estate I don't need multiple monitors.
They showed in the demos mixing your desktop screen with native visionOS apps so I'm pretty sure this will be possible. It will be really cool if Apple allows a "Coherence"-type mode (a la Parallels) to pull macOS windows themselves into your space and not just a 2D pane that mirrors your desktop. Even if I can only create some virtual monitors in 3D I'm sold. I'm writing this on a MacBook running 3 external monitors right now and I bought a duplicate setup for my parents house when I had to stay there for about a month for some medical treatments for my dog (closest place that did the procedure) because I didn't want to lose my productivity (also wanted to make it easier to visit them regularly without worrying about work productivity). Being able to just grab my laptop and my Vision Pro would be amazing and would open up travel to way more places.
It better do more than project monitors. I’ll use it if the whole space around me is one big monitor for overlapping app windows. I want to throw my open apps in a pile to my right, shuffle through them to find what I need, push them all to the side to work on something else, and probably do the virtual equivalent of those meme conspiracy pins-and-string research boards.
Can confirm. I watched some of their developer videos for how to design apps, and it showed multiple safari windows side by side (though never more than two, equidistant from each other - no actual indication yet that we can overlap them). And it looks like we can push them farther away - though that will resize the apps, not just make them smaller.
Could actually be exactly what I've been hoping for since the first real VR stuff came out 10 years ago.
It has the M2. Apple says it works with select iPad apps in addition to its own App Store down the line. Apple just recently made FCP & Logic available on the iPad. I think the direction is clear.
Even if multiple virtual monitors isn't supported today, it's the sort of thing which could be entirely solvable in software.
From the way it's described, visionOS seems more equivalent to an iPad Pro than a MacBook. Many people don't need anything more than a powerful iPad in order to be productive. So the built-in computer would be sufficient on its own. (Or at the very least, sufficient for many secondary screen tasks such as email, calls, notifications, monitoring, etc.)
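To make the "solvable in software" point a bit more concrete: visionOS apps are ordinary SwiftUI apps, and each WindowGroup an app declares can be opened as its own independently placed, resizable window in the Shared Space. A minimal sketch under that assumption (the app and view names here are hypothetical):

```swift
import SwiftUI

@main
struct WorkspaceApp: App {
    var body: some Scene {
        // Each WindowGroup can be opened as its own repositionable window in the
        // Shared Space (e.g. via the openWindow environment action).
        WindowGroup(id: "notes") {
            NotesView()
        }
        .defaultSize(width: 600, height: 800)

        WindowGroup(id: "reference") {
            ReferenceView()
        }
        .defaultSize(width: 1200, height: 800)
    }
}

// Placeholder content so the sketch is self-contained.
struct NotesView: View { var body: some View { Text("Notes") } }
struct ReferenceView: View { var body: some View { Text("Reference material") } }
```

That's not the same as pulling arbitrary overlapping macOS windows into the space, but it does suggest the multi-window plumbing already exists at the OS level.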
I think you get two 1.5" screens with half the resolution of my 34" monitor.
If your 3 32 inch screens are running collectively at 2.1k bits, they must be pretty old. Why not just keep them? They're fully depreciated.
And it runs an entirely new OS, so I have to buy apps for it again. Or cope with VisionSafari being slightly different than desktop safari and wait for Adobe to support it.
When I want to go outside, I just take my laptop outside. It's much easier to see in the full light. Plus, I can change the distance of things I focus on, which is supposedly good for vision.
But I was asking about the "killer app." What can you do with this thing that you can't do with a laptop, phone or desktop computer?
Have a multi-screen workspace at the office, in the study, on the couch, outside, in a hotel, on the plane. Without having to cart a laptop and two extra displays everywhere.
I'd like to see a tag on every Vision Pro commenter noting whether they typically use multiple displays or not. Might help understand from where they're approaching the discussion.
If Apple can bend the OS to cover more and more of the average work needs, they'll make in-roads fast.
Apple positioning this for “Work” addresses that. It also conveniently diminishes concerns about things like the battery (if you’re at a desk, plugging in is no problem) and controls (if the gestures bother you, use a keyboard and trackpad).
The work scenario could just mean using it for meetings and collaboration, or it could just mean working with large virtual screens. In any case, if you’re working from your bedroom or from your drab office, a virtual environment could be, I dunno, refreshing?
What if we could /all/ work in a beautiful space with lots of natural material surfaces and huge windows with views of beautiful forests? Sure, it’s all fake, but maybe fake nature is better than real offices?
“Work” also gives you reason to pick it up every day, unlike my neglected Oculus and Vive. If it’s useful for work and easy to slap on, I might bother to keep it charged and handy on my desk.
And then it won’t really matter what I’m wearing when I get on a Teams call…
I have a VR headset, and it lives in a cardboard box. The reason is that the only thing I'd actually use it for, immersive media consumption, it does poorly. By the time you have a virtual movie theater, the resolution remaining for the actual movie might as well be a VHS. If this is as good as it sounds like it is, that problem is solved.
I won't be getting one for $3500, but if I was single or a higher earner I probably would. I'm sure they'll sell as many as they can make.
The killer feature is going to be triggered when cameras that can capture video that this device can put you into come out. I'll happily pay this price and more to be able to record moments in my life and relive them later.
>Apple "Vision" has replaced blather about GPT-based chatbots on HN
Heh, this is where you're making a mistake... GPT agents + generated worlds + Apple Vision will be the blather in another year where people go into simulated realities to check out.
To be honest, I'm waiting for someone to train a backrooms-style environment based on NeRF to make horror games with these sorts of XR experiences.
As much as I hate all the bluster about ChatGPT, an application like this could actually work. NeRF models are already fairly capable of generating indoor environments like this, and backrooms-style spaces lean on repetitive structure and uncanniness, so having them be generated on the fly helps lean into the experience.
Imagine DeepDream (2015), but inside apple vision someday.
Heading toward whole new frontiers of traumatizing people.
Some general use cases:
- training/education: say giving you tips and correcting you while cooking with checklist recipes and concurrent demos of the current step. Works for painting, fixing instrument finger positions, workout tips, STEM, or anything where you would want to use your hands instead of holding something. There was a keynote section about the watch helping with golf swings and the like.
- Streaming: there were apparently some terrific sports/concerts front-row-seat demos. Some people would be willing to plunk down $1000 for courtside subscriptions instead of $1000/ticket seats. TikTok, but far more intimate, in-the-room-with-them experiences.
- decorating: there was a demo where a wall became a portal to the Jurassic period. Imagine the decorating possibility of an aquarium in your living room and a planetarium on your ceiling. Virtual pets cavorting and visitors just popping into your living room via a portal. Create a literal dream home on the cheap.
- escape/relaxation: live in a noisy, bright urban center? Just cut all the light and noise out with your private getaway helmet. If you are paying $2000 a month for a so-so living space, it would be a good add-on.
Personally I value having a great bed and a great car because treating yourself makes it worthwhile. Having a dream environment the times I am alone? Having a theatre experience stuck on the toilet? I am looking forward to when the hardware can do all of this at once.
The idea of virtual displays seems practical: right now if you want another monitor (or to make your primary one larger) you need the desk space + the money to add that on, which virtual monitors solve.
However, how practical that really ends up being depends on how the OS is positioned and the kind of thing it can run. If it ends up like macOS, yeah, I can see that being very useful, but if it ends up like iPadOS (or god forbid iOS), well, I can just say anecdotally, after owning an iPad for however-many years, this is the first time I've even considered using multiple monitors for that kind of thing.
I probably won’t buy Apple Vision for a while but will eagerly follow it. Like the author, I’m mainly excited about using this for work/productivity.
I’ve tried out working in VR using Oculus and Immersed. I’ve really liked the experience, but the blurry text was a deal breaker. From the reviews so far, that might not be a problem anymore.
I have already imagined myself, Tony Stark style, viewing complex systems as a hologram, going up and down in levels of abstraction at the flick of my wrist, having notes pop up, selecting the pieces of info I need, and viewing everything holistically.
It's interesting that Magic Leap took $3.5 billion in investment, and had a very similar vision and almost identical demos. But they failed to deliver.
I wonder what the R&D cost for the Vision Pro is. 10x Magic Leap's investment, $35 billion over the last 5 years? Seems plausible...
Their total R&D expense is about €100B in the same time period.
Could be hard to break it down. It’s clear that much of what we have already — Sidecar, Continuity, all the ML subject recognition, spatial audio, image processing… is likely spun out of the development of this thing, or at least co-developed.
This is the space program that brought us Tang, so to speak.
Even if this thing flops magnificently, the whole Apple ecosystem will have been enriched by its development. From that perspective, it seems like a pretty low-risk high-reward effort for Apple, whatever it might have cost all told.
> Sidecar, Continuity, all the ML subject recognition, spatial audio, image processing
I also wonder if the real time OS stuff is a byproduct of the Apple Car project. As soon as they mentioned the R2 chip I immediately was asking myself "Did this come from the car project or will this make its way to the car project". Even if they never ship a car the tech they have to build/discover/etc in the process can help other product lines. Just like throwing Lidar into the phones/iPad Pro or all the ML/AR stuff helped this headset.
Don't forget all the "neural" branding for chips - this product wouldn't yet be viable without M-class (fanless) chipsets.
So yeah, it's not just hard to pin down, but a lot of the work that led to this is foundational for other Apple products as well (either as core like Mac Ms or features like Continuity) so it's seamless with other products.
There's no way this has cost $35bn. Tech companies label a lot of stuff as R&D, it's almost more likely that the €100B value includes all the macOS/iOS/etc development as well.
I expect until the last year or so this has been well under $1bn per year (based on 1000 people at $300k/yr, $300m). This will have ramped up as they begin figuring out mass production, but I'd still be surprised if this is more than single-digit billions for the last year, as it's not like they're reserving enough manufacturing or supplier capacity to make 100m of these devices yet.
I think if you start to include the cost of developing visionOS on top of iOS, and the cost of all the custom silicon Apple have in this, then you can approach that figure.
They have the advantage that they had a lot of this in place for their portables. Magic Leap did not and had to start from scratch.
I suppose it's hard to judge how much of the R&D cost for Apple silicon, their cameras, LiDAR and iOS should be counted as costs for the Vision Pro.
If you include the development of iOS and the A/M series chips then sure I guess it gets close, but it seems obvious to me they would have been doing those things anyway.
I guess it depends whether you want an all-in cost to compare to Magic Leap, in which case you could argue that most of the history of Apple has set them up for this – if you want to make a great VR headset, just make the best phone ever first – if you want to make the best phone ever, just make a great personal computer first, just acquire NeXT first, etc. I don't know where you draw the line.
More useful I think is to say what did it cost the companies above what they were already doing. For Magic Leap it's the whole business, for Apple it's not much, for Microsoft it's somewhere in the middle, for Meta it's maybe closer to Magic Leap.
Yeah, exactly, I was aiming for a more direct comparison to Magic Leap. Effectively, how much would it have cost them to actually achieve this level of functionality and polish, based on the "all in" cost for Apple.
I'm not sure if it's possible to draw that comparison. For example, Magic Leap could have used Android (I don't know, maybe they did?) and saved almost all of that iOS R&D cost. Even they were working on incremental costs over what already exists.
The issue is double-dipping/synergy of those costs. Magic Leap could have used Android, but they had no say in Android's development or its API road map (or its processor road map or its GPU road map or its "neural engine" road map…). Apple benefits from the synergies of all of that development effort in ways that are really hard to quantify. Magic Leap using Google as a vendor to skip some of those steps also loses a lot of the double/triple-used investment that comes from even just having a seat at the road map table.
The "neural engine" road map alone is probably one of the biggest and most obvious examples of something where relying on Google as a vendor wouldn't have worked, but Apple could do it and make a large long-term investment in it. Google has a lot of incentives to keep ML models in the Cloud as much as possible. Apple made an early choice to focus on privacy and decided that it would be nice to always have as much of, for example, Siri running directly and strictly on device as possible. The home assistant trend, and overtly trying to be "the privacy-first option" during it, gave Apple plenty of reason to invest in on-device hardware. The fact that on-device hardware is also extremely necessary for AR vision support if you were planning a headset (or for car vision needs, if those rumors are also true) is a leverageable bonus of the hardware already being deployed. Smart moves a decade ago that paid out once already and then were already in the wings, ready and waiting for the next projects/products.
(Meanwhile, how uncommon are "NPUs", discrete or SoC, in the general Android ecosystem still? Google had different incentives and different priorities and what pressure could a team like Magic Leap, especially if trying to "stealth" some of their tech, have on an ecosystem like that? Android doesn't care if Magic Leap is happy. Android can't predict where Magic Leap wants to go.)
How do you quantify all the money that went into "we'd like as much of Siri as possible to run locally on devices, for at least user privacy if no other reason, wink" that naturally wound up on M2 chips and we assume influenced the R1 chip design and in turn doubles as a lot of visual processing power available for use to the Vision Pro? (How do you double/triple count it, because those existing use cases of running parts of Siri locally and dictation and other models locally also carry over from the iPhone/iPad/Mac into the Vision Pro?)
As was pointed out elsewhere, we've seen some of the R&D show up in other places already. ARKit has been around for a while. All of the Apple Silicon has been in everything for a few years now (except the new R1 chip, but I'm sure that builds on their M1/M2 work).
Face unlocking tech on the iPhone applies to eye tracking (and eye unlocking). Battery tech applies across devices. All the new cameras on the last few years' of iPhones. And so on.
So it's totally reasonable that total all in is in the 10s of billions.
Unless someone can confirm otherwise, I assume all the demo videos are composited "mock-ups" and not shot through a Vision device. They are incredibly similar to all the demo videos from Magic Leap.
My bet would be completely the opposite. No matter how good these devices get, they lack input bandwidth. My keyboard beats all VR, and it seems to easily beat Apple's VR too. The headset is clearly targeted at static environments and entertainment. The apps are like iPad apps, looking cute, but when people want to do serious work fast, they use a laptop because it has a keyboard. It is overengineered. There are a lot of gimmicks in the device, like the 3D avatars, which imho are worse than simple Zoom chats. We are humans, we like faces. And the (mandatorily positive) reviews from people who used it are not hinting at this new era of computing.
I see this as an early-era Tesla car, but without the long term prospect. A lot of people will buy it because it is expensive. It's something that happens with most apple products
Your keyboard works just fine in VR. What’s the problem?
I think a real display replacement (in terms of shrinking down your monitor/TV to something you can trivially carry around with you) seems like a real use case.
It doesn't exactly work as a screen replacement because it is strictly personal. Good for using on airplanes and maybe when alone, but like other VR glasses, it doesn't fit well in other circumstances.
Not even necessarily "later". They showed that the visionOS already supports existing Continuity and SharePlay functionality in iOS/iPadOS applications, which are features already built for sharing views of the same apps across devices and/or between people. It's somewhat limited which applications support those APIs today, but it is still an interesting amount more than "nothing".
(ETA: That seemed a subtle theme across the WWDC keynote to me, as increased SharePlay support in first-party apps and select third-party partners was many of the things they demoed across the iOS, iPadOS, Car Play, and macOS updates.)
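For a sense of what adopting those APIs looks like on the app side: SharePlay support is built on the GroupActivities framework, where an app declares an activity and activating it during a FaceTime call gives every participant's app a group session to sync against. A minimal sketch (the activity name and identifier below are made up):

```swift
import GroupActivities

// A shared activity that other participants in a FaceTime/SharePlay session can join.
struct WatchTogetherActivity: GroupActivity {
    static let activityIdentifier = "com.example.watch-together"  // hypothetical identifier

    var metadata: GroupActivityMetadata {
        var meta = GroupActivityMetadata()
        meta.title = "Watch Together"
        meta.type = .watchTogether
        return meta
    }
}

func startSharedViewing() async {
    let activity = WatchTogetherActivity()
    switch await activity.prepareForActivation() {
    case .activationPreferred:
        // Starts the activity in the current group session; each participant's
        // app then receives a GroupSession it can use to keep playback in sync.
        _ = try? await activity.activate()
    default:
        break  // sharing disabled or the user cancelled
    }
}
```

The interesting part for visionOS is that apps which already do this for iOS/iPadOS presumably get the shared-experience behavior in the headset with little extra work.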
> It really is one of the best product names in Apple history: Vision is a description of a product, it is an aspiration for a use case, and it is a critique on the sort of society we are building, behind Apple’s leadership more than anyone else.
I agree that “Vision” is a good product name, but “Vision Pro” just feels intensely pedestrian. Why is it called “Pro?” Is there a non-Pro version?
"Pro", in AppleSpeak, means "the expensive one". They've actually done this before; the MacBook Pro (first Intel Mac) launched some time before the MacBook (first consumer-oriented Intel Mac). This feels like fairly clear signalling that there'll be a cheaper more consumer-oriented one along later.
Apple's problem is that the "pro" version is not just twice as expensive as the "consumer" version needs to be in order to be a market success, it's ten times more expensive than the consumer version needs to be.
edit: the fanatics are really out with the downvotes today
To go 10x lower we're at $339. That's not happening; this thing has essentially a full-blown MacBook baked into it. Apple has never been interested in cheaping out, and in VR that's Meta's job - their Quest headsets are literally using the cheapest possible ways of getting to their price point, and it shows.
The consumer one will likely settle somewhere closer to $999 - matching the price of the cheapest MacBook Air. And it'll be a success at that price point, because by the time we get to that, people will understand that it's not a Quest with a mobile phone processor slapped inside, it's a laptop-grade platform.
> To go 10x lower we're at $339. That's not happening; this thing has essentially a full-blown MacBook baked into it.
The hardware could be a loss-leader if they intend to maintain their 30% commission on software purchases. Do we know if that's how it will work?
(edit: but boy howdy, somebody would have to sell an awful lot of apps...)
I guess we could debate what the exact multiple is by which this item is overpriced, but on a message board where people will leap to the defense of the $1000 Apple monitor stand (true story) I think I'll take a pass. The main point, I think, is that when the price right out of the gate is this exorbitant, a major tension exists between increasing user adoption and lowering prices via economies of scale.
People buy non-pro iPhones for $800 - 20% less than the Pro; they are willing to buy MacBook Airs for >$1000 or so, more than half the price of a Pro; if this is marketed as both a TV and computer monitor replacement, many will be willing to spend quadruple digits (or $45 a month for 24 months) even on next year's budget version.
The Day One Disney+ and Disney interest in the keynote to me reads like Apple are hinting that they have a much more aggressive consumer timeline than "a few years down the road". I'm very curious what Apple is thinking/betting here.
I think that's definitely true. To get to that point, Apple's multifaceted problem seems to be knocking a huge amount off the price of the thing while convincing a significant number of users to buy and actually use it (instead of putting it on a shelf after a few uses as so many Oculus users have done over the years), and also to convince a few excellent software developers to create worthwhile stuff for it.
They can throw a ton of money at that problem, but I wonder how effective that can possibly be.
Vision Pro. Vision Classic. Vision Air. Vision S. (expensive, everyday, luxury slim, budget.) They can also make variants of those 4 using Nano and Max modifiers.
I think you're right. There's a few opinions potentially clouded by "I'm jealous because I want this but can't afford it."
If they were making a Quest 2 replacement, fine, talk about $500 price points. If they can deliver a MacBook Pro replacement, then $2k+ is the ballpark.
"People buy it" could mean a lot of things and I'm not sure what you mean. I don't think anyone is arguing that literally no people will buy it. "It works and people buy it" is a phrase that could be used to describe the Magic Leap One.
They probably plan on releasing a non-pro version later. Importantly, adding "Pro" also has the psychological effect of better justifying the higher price.
I found it odd that they started out with Pro. I think “Apple Vision” would have sufficed. Leaves room for the future to add “Pro” when they go to 8K/Retina or whatever. Plus just saying “Apple Vision” is kinda nice.
When Apple wants a mainstream product, they'll release one at a mainstream (for them) price. Right now they can't sell a non-pro for probably less than $2k, so why not just sell the pro at 3.5k and use the insights from that to create a real mainstream (but high end) non-pro with better chips/software etc next year?
Besides cheaper hardware, what could they remove for a non-Pro version? Maybe make it always tethered, and the power brick is +$150. Cutting the display quality kills their pitch. Reducing the headband hits viability also. Maybe they remove the front-facing display? And the ability to record?
It seems lazy to me: do a Google search for "Vision Pro", excluding results from the past couple of days, and it seems there were already a ton of products and services with the same name.
Perfect time to bring back the "i", though they already have "iSight". "iBorg"?
Apple became a victim of their own success in relation to the “i”. It is now associated with Apple, but it is too pedestrian to be registered as a trademark.
I forgot that Apple was having an event yesterday, and when I saw “Apple Vision Pro“ in headlines everywhere, my first thought was that they were opening an eyeglass store…
I don't mean projecting the outside world inside the headset so the user can see "through" it, that's a standard feature even on cheap headsets. I mean the display on the front of the Vision Pro which shows the users eyes "through" the headset so other people can see them, which is an awful lot of hardware to dedicate to a feature many users don't need.
If you look at that Simula VR page, there is a video showing a person working with the device at a coffee shop and being handed a coffee by a waitress. I don't know what that waitress would think about the interaction; in particular, she might not be so clear on whether the person wearing the headset is paying attention or not. Visible eyes on the outside of the device could make a big difference. I don't think you'd need a particularly high-quality display for the outside; something cartoony might be good enough or even better than a realistic display.
I imagine when it launches, the 2024 or 2025 "Vision" will be equivalent to the Vision Pro announced yesterday, and the new Vision Pro will probably have 8K displays or something ridiculous.
I agree there is a lot of low-hanging fruit which could differentiate consumer from Pro for the next generations.
The Pro could increase the FOV (some mentioned black borders at the edge of vision), increase refresh rate and brightness (some mentioned adjusting to room brightness when removed), reduce the weight and thickness with titanium (some mentioned unbalanced weight), have more private sound (I understand it is pretty noisy to the neighbors, so you would currently need AirPods Pro for plane/bed use), improve the cooling system (some mentioned it is warm), get some sort of med-tech sensors (control with thoughts, detect stress, etc), get rear sensors (the AirPods demo of being passed by the bike seems like it needs some sort of backup camera/360 video record option), get cellular features, get an Mx Pro chip, and increase the front screen size. After that I expect outdoor waterproof Ultra models. (The scenes of people twitching on couches compared to the iPod dancers flinging themselves around could not be more stark.)
Many reviewers also mentioned getting their hair messed up making it unwanted in professional scenarios where sharp looks are important which is something to solve for a true pro model. I imagine corporate will want all sorts of usage data and screen sharing.
It won’t have a very high price, for one. Other than that, it won’t have screen resolutions as high as this, nor have as many cameras and mics as the Pro version. It may not have some features in the software either.
It helps to think about the “SE” models of iPhone and Apple Watch in relation to this. And it will have its own target customer base for which it makes more sense than the Pro.
At a complete guess, the non-pro model will arrive in around 2025. It will have the same M2 chip as the current one, and less or lower quality sensors/cameras, and maybe a cheaper and/or non-replaceable strap (I'd lean towards the former as selling extra straps will be something they'll want to profit on).
By the time it comes out the Pro model will move to the M3 or M4. Storage will also be lower on the non-pro model, and you won't get the same level of sound quality (which for 99% of people won't matter, as they'll have AirPods, being existing Apple users).
Apple has historically launched pro products, and then when it's ready for a performance and/or feature bump it has passed those old features down to the non-pro models. Think TouchID and FaceID, for example.
They could remove the front-facing display and the camera/ability to shoot 3D content. Both strike me as non-essential for a base device geared around content consumption and basic work. The front shield then becomes a plain piece, and quickly differentiates Vision from Vision Pro at a glance.
If they swap metal for plastic, the cheaper version would become lighter, and that would be a selling point. They can't go with lower-resolution screens. A poorer-quality headband would make the thing uncomfortable to wear.
It could be the other way: Vision Pro will advance faster (field of view, compute power, some new feature) and Vision will become a lower price point full of rad hand-me-downs.
Pro is consistent with other product lines that Apple produces: iPhone Pro, MacBook Pro, etc.
They could change the convention, but weren’t y’all just complaining about how naming conventions for things like Windows, Intel, Nvidia, etc are not consistent?
It's like adding "tactical" to the title of a product that is black on Amazon. You'll be able to charge more money for the same thing. It also signals you might eventually launch a lower-end model.
Agreed. If you pop into any dollar/pound type store the shelves are filled with cheap cables and car phone holders that use the 'i' name. It's become synonymous with cheap junk now.
> This system was, at the beginning, dismissed by most high-end camera users: sure, a mirrorless system allowed for a simpler and smaller design, but there was no way a screen could ever compare to actually looking through the lens of the camera like you could with a reflex mirror. Fast forward to today, though, and nearly every camera on the market, including professional ones, are mirrorless: not only did those tiny screens get a lot better, brighter, and faster, but they also brought many advantages of their own, including the ability to see exactly what a photo would look like before you took it.
Even in 2023, EVFs aren't as nice to look through as OVFs from the 80s though. I'm always surprised when I pick up my old film Nikon after shooting digital for a while.
No matter what you do, real life will always look better than a screen, especially if said screen sits 2cm from your eyeball
My first impulse when I come across a memory or photo in my camera roll on my iPhone is to send it to my wife. Photos are really rarely a personal experience. That's a major reason why most famous paintings find their way from personal collections into museums, so we can all enjoy them together.
It is that human nature that makes me believe that any device like this that doesn't have real-time interactions baked-in, no matter how cool they are, will always default to media consumption because media consumption is the main use case where you don't necessarily want to have a shared experience.
This device seems absolutely amazing and likely much more in the same vertical as home theater than personal computing. FWIW, I also don't buy into the work aspect of this, either. Partially because most people aren't in the Apple ecosystem for productivity apps, but mostly because as another person stated, nobody is going to use a digital avatar in a work setting. A static image or even just being on camera wearing the headset would feel much more normal. Sure, I could see it being great for certain applications like email but not enough of a catch-all to be the only device you use in a work setting.
All these "relive your memory" videos are incredibly awkward, because of how downright sad it is to strap this on your head during your kid's birthday just to relive it on repeat like Tom Cruise in Minority Report.
The real usecase is porn, but they're too tactful to say it.
Yeah, if this thing catches on, I think this will be my "old man gets left behind by technology" moment. Everything I see from the marketing of this just looks like a soulless dystopia. I hate it.
If the future is strapping a phone to your face and never leaving the house, I'm ready to be left in the past.
I am fairly cautious about tech in my own life but this still seems like an extreme Black Mirror episode take. The battery doesn’t even last long enough to have you never leave your house. It seems like something I might use two hours of my day maybe. I’m not sure how that makes it worse than any other media device or what it is about it that has people thinking they’ll never leave home again and feel completely socially isolated. I’m just considering the case of using it at home though and not walking around with it expecting people to interact with me normally.
Jokes aside, assisted living homes are exactly the place where XR is finding adoption - entertainment and healthcare for people who can’t readily leave.
Is that much worse than becoming enraged by 24-hour news cycles on daytime TV? I feel like reliving family memories and virtual travel/experiences could be quite positive for older people.
I played around a lot with VR180 when it came out. The experience is incredibly, almost uncomfortably intimate for personal videos. I felt so awkward watching demos of other people's "blow out the birthday candles" moments. However, the fact that it was uncomfortable means that the technology itself is very good; otherwise it couldn't produce such an emotional experience.
On the tech side, I'm just guessing, but it looks like Apple has an even better version of VR180. A 6DoF version of VR180 seems entirely plausible for Apple to pull off with NeRFs and would be even more incredible.
Again, I agree that it’s a bit weird for personal memories, both on the recording side (possibly awkward to wear goggles in those situations) and even watching personal memories.
However, I’d expect Apple to make recording spatial videos possible with iPhone/iPad, which at least fixes the awkward recording issue.
Even with that possibility, I think Apple hurt themselves using this “personal memory spatial video” example.
For me, the far better use cases are for entertainment. Professional, live (and recorded) spatial video will be huge. Everyone can have front row, court side, or even birds-eye views of all forms of in-person entertainment. Sports, plays, comedy, concerts, orchestras. The experience of watching it is so intimate experientially I think it will be amazing. Looks like the tech to make it happen is finally here. Imagine them owning “the App Store” for spatial video pay-per-view…
If you go to any concert or other public event, half of the audience will be recording on their smartphone instead of watching. I do not see any reason why an iPhone of tomorrow won’t have a 3D camera to record and share any important moments the same way as we do it now. There will be a time when we watch dashcam and drone videos of car crashes, natural disasters, and soldiers dying on the battlefield while sitting on a sofa in those shiny glasses.
This is where the 3D aspect is cool. The same way that we look at old newsreels and feel sort of chronologically alienated by how "retro" they look - simply because of the medium - the 2D recordings we are making nowadays will similarly look quaint to people that have become used to 3D. So, Vision Pro has the opportunity to set the tone for the next generation of fundamental media experience.
People keep bringing this up, but are we really trying to compare a small rectangle that fits in your pocket and takes 3 button clicks to take a picture with (double power + volume button) to a full on VR headset strapped to your head?
The apple fanaticism on this site is really something special, I see no universe in which an idiotic product like this is touted as even somewhat desirable if it didn't have the Apple logo plastered on top of it (with the associated ludicrous price tag)
Will my comment be clearer if I rephrase it in simpler words? Vision Pro can record, but it is not a recording device for most use cases. Something else will come, either from Apple or from third-party vendors, probably in the form of a 3D camera in a smartphone. And when such devices come to the market, people are going to use them the same way they did before (see Instagram and TikTok for examples). That’s it.
I’m not an Apple fan or insider, but what would you do if you were Apple and you do not have, or do not want to disclose the existence of, more suitable recording devices? You would probably use the same glasses for the demo, right?
Nobody is strapping this headset on for their kid’s birthday party. There are incredible concert videos from professional crews, and that’s the direction this will go until the tech can be (if ever) made unobtrusive.
The primary purpose of this headset is not a recording device even if it can record, so I do not really see a point in your comment. Most of content creation will happen by other means.
I was responding to your comment; both it and the parent comment asserted that people would be using this device to record live concerts or their kid’s birthday party (which we, apparently, agree is silly).
Seems like hyperbole. IIRC, at the recent concert I went to, people used their phones for a few photos/videos to post on social media, basically a few clips to show they were there, and then mostly enjoyed the performance.
> All these "relive your memory" videos are incredibly awkward, because of how downright sad it is to strap this on your head during your kid's birthday just to relive it on repeat like Tom Cruise in Minority Report.
I mean, quite obviously the recording tech will be a standalone device (and maybe even an iPhone which are ubiquitous at these events already) someday. The Vision headset has all the hardware already so why not let it capture videos? Just set it on the counter or something...
This has become one of those meme-y dismissals that I see in all the discussions but it's so transitory I can't imagine it possibly mattering.
If you read the article, you'll see him mention this as well, and also call out that it uses some "Apple Immersive Video Format" (which is something I didn't hear in the keynote), so this is the clear intention.
I was thinking the same thing. Aren’t iPhones getting lidar? One will just record on an iPhone. Why is a person recording using a headset more dystopian than using an iPhone or camcorder? Camcorders were way more bulky and isolating than this headset will be.
Is it that different from using one of those gigantic camcorders from the 80’s where you had to put one eye up to a viewer stalk? People used those, but it’s not like they had it out the entire event (I assume, based on how short the videos I’ve seen taken with them usually are).
It is slightly different in a better way. You can interact, engage and otherwise participate in events while recording using the headset more than you can with a camcorder or phone. I'm guessing, but obviously don't know, that Apple lets people "see" your eyes while you use the headset because it makes it more natural for others to interact with you while you are using the headset.
Even the clip of recording the kids felt awkward, as the kids did not acknowledge the presence of the parent at all. It felt like they were Photoshopped into an empty movie.
Recording using an iPhone would be an upgrade, as your kids can now much better see your facial expressions and you would be able to turn around to make a "selfie" recording showing both you and the kids in the movie.
If you have more than one person recording the same event with an iPhone, uploading them all into one place would enable pretty precise photogrammetry to render the scene in detailed 3D.
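Apple already ships an object-capture photogrammetry API in RealityKit on macOS, aimed at single objects rather than whole scenes, but it hints at the shape such a pipeline could take. A rough sketch (the folder paths, detail level, and the idea of feeding it frames pulled from several phones are my assumptions, not anything Apple has announced):

    import Foundation
    import RealityKit  // Object Capture photogrammetry, macOS 12+

    // Hypothetical: assume frames from everyone's phone videos have already
    // been extracted into a single folder of still images.
    func buildModel(from inputFolder: URL, to outputModel: URL) async throws {
        let session = try PhotogrammetrySession(
            input: inputFolder,
            configuration: PhotogrammetrySession.Configuration()
        )

        // Ask for a medium-detail USDZ model as the output.
        try session.process(requests: [
            .modelFile(url: outputModel, detail: .medium)
        ])

        // Watch the async output stream for progress and completion.
        for try await output in session.outputs {
            switch output {
            case .requestProgress(_, let fraction):
                print("Progress: \(Int(fraction * 100))%")
            case .requestComplete(_, let result):
                print("Finished:", result)
            case .processingComplete:
                return
            default:
                break
            }
        }
    }

Whether something like this could ever be stretched from "turntable shots of one object" to "a whole birthday party from three phones" is exactly the open question.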
That’s exactly the sort of complicated-behind-the-scenes technology that Apple is great at wrapping in a simple UI and clever name.
Once Apple has a 3D playback device, adding another camera bump at the bottom of the phone won't be terribly hard. I just wonder if it'll be a pair of regular lenses or fisheye lenses.
Leaving personal desires out of it, the relationship of porn and tech goes back at least to VHS vs Betamax, probably further. It’s not unreasonable to believe that does not stop with XR headsets.
Sure, but what's the relationship post Blu-ray? Serving video in general has been important, but not pornography in particular. It's ridiculous to say porn is the real use case. If you have some articles about how porn in particular has driven cutting-edge tech, please do share. My understanding is that it's been more military, video games, and space that have driven cutting-edge tech (along with serving ads, of course).
Considering how hostile the whole financial industry is to that market segment I kinda seriously doubt that. eBay and PayPal propped each other up, Amazon normalized online shopping. It’s a little odd to me that you’re so set on putting porn at the center of everything in some strange Disney/marvel retconning attempt.
Anecdotally, everyone I know with a VR headset (all of about 5 people) is quite open that porn was one reason for the purchase.
The claim isn’t that porn drives technical advances the way space does, but that porn drives consumer adoption of new tech. If a new product category does X, Y, and Z, some percentage of people will buy it. If it does X, Y, Z, and porn, it’s going to sell to a bigger audience and probably make people more forgiving of shortcomings (ha) doing X, Y, and/or Z.
> but that porn drives consumer adoption of new tech
sure. where's the evidence? anything that can serve video can serve porn. where's the evidence that porn in particular is somehow unique? YouTube, for example, has more active users than all porn sites combined, yet porn is not even available on YouTube.
being able to play videos in XR will be a real use case, I think, but this idea that porn in particular is the real use case is probably wrong, imho. this is not to say that people won't use it for that.
The most dramatic case is older — VHS literally won because of porn.
And there was video porn long before YouTube was founded. The claim is that porn drove people to get computers capable of showing video, to buy internet connections fast enough for video, etc. It is not that video sites must serve porn, but that the grassroots interest in porn created the market conditions that made YouTube possible.
It's hard to substantiate, because how would you? Other than just having lived through that time and observed (VHS was before my time, but the DVD market was similar).
Maybe that era has stopped and horny people are no longer seeking out the newest best way to consume porn, and technology is now pure. I’m just not seeing it, but maybe that’s my cultural bubble.
why are we discussing 50-year-old tech. if porn is so amazing a use case, surely there's something in the past 10 years you can point to? also, I looked and did not see anything substantiating your statement that VHS beat Betamax because of porn.
the reasons I remember and corroborated online were longer playback time, cheaper playback devices, and more content, which was the result of cheaper production costs.
I don't know how in the world you got "porn is an amazing use case". I just said it influences purchasing decisions in ways that prepare markets for more mass-market scenarios.
I also don't think you looked super hard since a Google search for "betamax vhs porn" turns up nothing but arguments for (and in some cases, against) the proposition.
Apple is famously anti-porn, and enforces this policy in their App Store. What are the chances that apps/content for this device won't be as locked-down or more locked-down as the App Store?
It's not projection; there are well-known cases where porn specifically enabled technologies like VHS to reach mass adoption, even over technologically superior tech.
Like it or not, doesn't matter if on extreme left or right or anywhere in between, people en masse like it. That they often don't admit it publicly is another topic. Can't beat a billion years of evolution baked into the very foundation of each of us.
Yes, but this isn't VHS. When's the last time porn was the major driving factor? You're talking about the '70s and '80s, which might as well be an eternity ago as far as tech is concerned.
People thinking porn is going to be what drives adoption of this are insane. It'll most likely be productivity and vertical integration.
Or, here's a wild thought: the whole device will get smaller... And standalone cameras will exist. So it won't be awkward, as it will just appear to be glasses on your head, which we've not found awkward in the last several hundred years.
> So it won't be awkward, as it will just appear to be glasses on your head, which we've not found awkward in the last several hundred years.
That's what Google Glass tried, and it flopped anyway. People also found a camera on glasses (like the Google Glass version) to be highly creepy. Apple emphasized that the front of this headset will clearly indicate to the people around you when you're taking a photo or video.
We are in a much different world than when Google Glass showed up. People taking pictures, videos, and streams is now much more commonplace. There was nearly no external indicator that Google Glass was taking a picture.
I also don't foresee this device being used outside much at first.
Google Glass also flopped for many other reasons: it was uncomfortable, awkward, and quickly abandoned, among others.
That controversy is an interesting thing to reflect on. I remember it seemed very concerning at the time, but now I feel that much the bigger privacy concern is not being recorded in third person in public by strangers, but platform owners recording what I see and do with the headset. Tracking my eyes to know what I pay attention to. Etc.
Google Glass wasn't given a chance.
They axed it as soon as people started complaining about being nervous around people wearing it, rather than change anything.
I think the key will be if they can make it so the next generation of iPhones can record videos for Apple Vision. If they cannot do that, I think they're really goofing up.
It feels like a missed opportunity that the current generation doesn't already do that. All the talk is that Apple has been exploring VR behind the scenes for a while, and the devices all have 2+ lenses.
It would have been quite cool for them to announce that all your iPhone 14 videos were actually recorded in 3D and ready for Apple Vision all along.
I think 3D photos will catch on, but you'll take the photos with your phone, like you do now. This device will really be for viewing them later.
(iPhone 15 or 16 will include the ability to "...take spectacular 3D photos that are incredible when viewed with a Vision Pro!!!" -- sorry, that quote is me imagining an Apple presenter at a future iPhone event.)
Porn and games are often what drive new tech forward.
So it does seem like a VR/AR device fully focused on those use cases could be good for pushing the tech forward. Won't be Apple that does that, though.
Side note, it's so interesting that GPUs were pulled forward by gaming, and then GPUs ended up becoming a key enabler in AI.
The games part I get. For the right game, made explicitly for VR, it does improve how immersive it feels.
The video part I don't understand at all. I've never watched a VR video that made me go "wow, I've gotta watch all videos in VR now." I found it annoying: the scale is weird, the headset is annoying, and there's less freedom to view what's going on than with a traditional TV.
I expect they'll announce that the phone will be able to record 3D video at some point. But they didn't want to announce that yesterday because it would give away future iPhone plans. They should probably have left it out altogether.
It's weird that he said that, when a video or photo serves the exact same purpose, to view and relive past memories, just in 2D. The world has been doing it for a century.
3D video would capture depth, so it's not really the same thing as 360-degree video. Think moving your head and seeing in 3D from that new perspective and feeling the depth, vs. just being able to look in all directions (in 2D).
> I’ll be honest: what this looked like to me was a divorced dad, alone at home with his Vision Pro, perhaps because his wife was irritated at the extent to which he got lost in his own virtual experience. That certainly puts a different spin on Apple’s proud declaration that the Vision Pro is “The Most Advanced Personal Electronics Device Ever”.
> Indeed, this, even more than the iPhone, is the true personal computer. Yes, there are affordances like mixed reality and EyeSight to interact with those around you, but at the end of the day the Vision Pro is a solitary experience.
Absolutely. I can imagine myself using a Vision Pro in my home office. I can imagine myself using a Vision Pro while on a plane. So I can imagine using it either where I'm already physically alone or where I want to be physically isolated from other humans nearby.
I can also imagine myself using a Vision Pro to connect with other humans who are physically distant (the FaceTime demo).
What I cannot ever imagine is having the Vision Pro on my head while interacting with other people in the room (the dad watching his kids demo), nor can I even imagine using them as part of a shared entertainment experience. e.g., I cannot imagine my wife and I each having a pair of these and using them to watch a movie "together".
But, watching a movie with other people when we are all physically distant from each other? Yes, that I can imagine.
I can see myself buying a pair of these. I'm fascinated by the technology and by the ability to use them in my home office.
I don't really see them taking the place of my home theater, but maybe... 95% of the time I use it alone because my wife prefers to read a book.
> I am completely serious when I say that I would pay the NBA thousands of dollars to get a season pass to watch games captured in this way. Yes, that’s a crazy statement to make, but courtside seats cost that much or more, and that 10-second clip was shockingly close to the real thing.
Personally, I think that watching a game while being virtually part of the crowd would leave me feeling lonely and isolated. It's one thing to watch a game alone, seeing the other people cheering. It seems like an altogether different experience being part of the crowd while not really being part of the crowd.
My wife and I usually watch different shows/movies so I could see myself using a headset like this for a lot of content consumption. Rest of the use cases I agree with.
On the NBA games, I reckon it would be a matter of deciding whether being able to get a courtside experience for a far cheaper price counted for more than the isolation. And being able to get a drink from the fridge. And not having to deal with driving and parking. And for sports events which are otherwise sold out or on the other side of the world. I can see this being a huge thing if the price and functionality are right.
Show me you've never used non-tactile responses for interfaces, without telling me you've never used them.
I love the VR experience of Gran Turismo for driving. It is truly immersive in ways that are hard to communicate, and it makes me glad I have the VR. Even with that, the game is not nearly as fun without a force-feedback steering wheel. Not even close. To the point that a moderate screen with the steering wheel is a better experience than the VR without it.
To the point that the idea that I will do elaborate hand/arm gestures to interact with something I can't feel just doesn't seem like something that will fly.
Do I like that it has the ability to work without a controller? I mean, sure? But I also have no problem using controllers for gaming, and I don't see that going away anytime soon for a lot of games.
Not with VR controllers. They didn't even show a single VR game. It seems unlikely that a company would make a VR game for a headset that has no VR controllers, since the game would have to be playable without them. On smartphones and tablets, games also default to imprecise touch controls, which are a poor solution.
Apple is wise to leave controllers up to the accessory market to develop and evolve. Let those companies create the ecosystem for this singular device that might have limited opportunity to create accessories outside of head straps and battery packs.
Right, I didn't mean this to mean that it won't work. I meant it more that the gestures are almost certainly going to get very limited use. Much like gestures for your phone. Neat, and can be rehearsed for good demos.
So, the idea that you will have a mobile workspace that can go with you where ever you want it is unlikely to actually work. Is it easier than taking a monitor? I mean, sure. Probably not as easy as taking a laptop, though. Especially with how long the battery lasts. (And the laptop almost certainly fits more easily in a bag.)
I think Apple is making a statement regarding control of Vision in that _there is no one great solution_ for fine control using a peripheral device, and any attempt to do so could go outdated or obsolete with a different peripheral device. So why try to build one at all?
Many VR companies/devices have come up with their own takes on controllers, only to have them updated/replaced within subsequent models.
Apple is making the right choice by focusing heavily on non-peripheral control.
This makes me realize that the gestures it is being demonstrated with are going to be hilarious if they are misidentified. I'm /assuming/ that the thing will not pick up gestures between you and someone else. Such that if I pick up a paper with a similar hand shape as their "pinch and move things" gesture, it will realize it wasn't the same.
Or will this be akin to how Siri does a shit job understanding anything that is not mechanical in speech? Will be absolutely hilarious if it has a hard time recognizing non-light-skinned hands for the gestures. Really hope they don't make that mistake.
Hopefully the troubles with Siri are understood well enough inside Apple that they won't make that mistake, because Siri on my HomePods is truly awful about thinking I'm talking to it when I'm not.
Thankfully many of the people in the demo videos were people of colour, so I’m fairly confident Apple has gotten that bit right and hopefully their gesture detecting cameras have IR or dot-pattern emitters to work in the dark as well.
I would cite the growing unease with Siri as evidence that they almost certainly don't have this done well.
IR dot-patterns will be its own problem. And I hope you never want to curl up on that couch to watch a movie with a blanket. :D
That said, I am certainly not trying to say they definitely got it wrong. I share high hopes that this will work. Not enough that I will be an early adopter, though.
> IR dot-patterns will be its own problem. And I hope you never want to curl up on that couch to watch a movie with a blanket. :D
Speaking of which, notice how nobody who was relying on finger gestures (rather than a keyboard) was using menus, or doing anything mouse-like. Just scrolling and clicking.
You point at things by looking at them, which is very precise according to people who have tried it, like MKBHD. Clicking and dragging are done by pinching your fingers. So it works quite similarly to a touchscreen or a mouse.
Yes, apparently it is surprisingly precise. There might be an uncanny valley when the system is perhaps a little too good at predicting what your next move is. Are the goggles monitoring your facial expressions, looking not just for eye movement but also wrinkled noses, scrunched-up foreheads, etc.? Will it be able to function as (for example) a lie detector?
"The company was fairly mum about how it planned to make those cameras and its format more widely available, but I am completely serious when I say that I would pay the NBA thousands of dollars to get a season pass to watch games captured in this way. Yes, that’s a crazy statement to make, but courtside seats cost that much or more, and that 10-second clip was shockingly close to the real thing."
The author was shown a 10-second clip and is ready to hand over thousands of dollars.
A few paragraphs later he acknowledges that it was just a 10-second clip, but there was also another clip, so a total of 20 seconds - surely enough to write an opinion piece of thousands of words.
The reality-shaping power of Apple's demo maestros is truly admirable.
I'm an XR believer too and I'm very happy that Apple is entering the market with a high-end device that's not for gaming. At the same time, we have to recognize that the history of this form factor is littered with amazing 10-second demos that failed to deliver actual user value, from recent failures like Magic Leap going all the way back to the late 1980s and Jaron Lanier's VPL Research.
Thank you, I feel like I'm taking crazy pills or something. I have no idea how much of the footage in the long advertisements is actual in-headset recording vs. highly editorialized, made-up, aspirational CG. Additionally, I really don't think all the people here singing the praises of Apple's currently nonexistent headset (not available for actual customer use until at least 2024 or so) have ever used a headset. Headsets give me:
* Headaches
* Nausea
* Eye Strain
* Neck Pain
* Acne (from sitting on the same parts of your face all the time)
Etc. Also, reading text is incredibly difficult and exasperating, and that is this thing's whole value prop, since it's targeting office work. Until anyone can actually try this and confirm that these are all non-issues, all the speculation is a weird cultic worship of Apple.
Just in case people forget, here are some misleading advertisements for Google Glass that didn't quite live up to the hype[0][1]. And some misleading advertisements for HoloLens that didn't quite live up to the hype[2]. Full disclaimer: I have a Valve Index but haven't tried HoloLens or Google Glass, but the fact that they're not ubiquitous today says something. It's especially funny that the HoloLens looks very similar to Apple Vision, and yet Apple makes these ridiculous claims about innovating where no one else has. I swear I must be taking crazy pills or something.
If it’s as good as they say, then the market will show it. It’s not worth worrying about whether it will succeed or not; that’s Apple’s concern alone. We can all just enjoy the ride in the meantime.
This is why I don't understand all the doomerism. If Apple is totally bullshitting here, we're going to find out extremely quickly, which will undoubtedly harm their reputation in the space when it comes to the next generations.
If they weren't confident it works almost as magically as it's described, it would not have been shown.
Skepticism is not doomerism— and in fact should be praised.
Multiple companies have tried and failed. Their marketing demos looked just as compelling.
Sure, we will find out. Eventually. In over a year when it launches.
But until then, why should we trust companies whose best interest is not to give us the truth on what their product is but rather the best possible vision for what it will be?
So far every single iteration on VR/AR/XR has been extremely disappointing once actually used, but with extremely high fidelity marketing demos. This might certainly be different, but until it’s proven that it is, why should people believe it? “Fool me once” and all that very much applies.
>Skepticism is not doomerism— and in fact should be praised.
Assuming something is going to fail, like many are across Hacker News since the announcement dropped (perhaps not in this thread), is doomerism. There's a difference between that and expressing concern over Apple's claims.
>Their marketing demos looked just as compelling.
I could not disagree more. I could not have been less interested in the Oculus or Google Glass from their advertisements. But an AR experience (as opposed to VR) that integrates with my Mac and iPhone? It has my attention.
>why should we trust companies whose best interest is not to give us the truth on what their product is but rather the best possible vision for what it will be
Because Apple has delivered high quality products the vast majority of the time (butterfly keyboards and App Store limitations notwithstanding). Why would we assume they're suddenly dropping the ball on this, when they're on stage calling it the future of computing?
> This might certainly be different, but until it’s proven that it is, why should people believe it?
It all comes down to the company producing it. Apple hasn't made a device like this before, and comparing Apple's products to Meta's and Google's is a terrible metric to go by. It's like eating gas station sushi, getting sick, then going to an upscale sushi bar and saying, "Ohhh, I've had a terrible experience with sushi in the past."
Even if you aren't willing to give the benefit of the doubt to a company that consistently produces high-quality products and is building a brand-new product line, express that very healthy skepticism by trying it out at an Apple Store before purchasing. At that price tag, I sure am.
When you say something IS going to fail, that is doomerism, and it isn't hard to find. Also, expressing skepticism on some forum is nothing praiseworthy; only reviews matter, or just trying it yourself at the Apple Store once it's available.
One of these... Yes let me scroll down and copy and paste a comment just for you. If you are actually interested, you put in the effort yourself, otherwise we both move on.
My assumption is that Apple’s big innovation on a familiar looking form factor is addressing the issues you describe.
Apple generally succeeds with mass market friendly ergonomics. I understand skepticism about unreleased technology, but I personally wouldn’t make the assumption Apple has botched the ergonomics on this release. It seems like exactly the kind of thing they would have focused on reaching a high level of quality with.
Most of the nausea issues in VR stem from the disconnect between moving in virtual space without moving in the real world, and Apple seems to be focusing on applications which maintain a 1:1 mapping between real and virtual movement. Games are where that's most likely to be an issue and they're very much not focusing on actual VR gaming, only playing non-VR games in VR on a virtual screen.
I think it's actually more reflective of the fact that Apple tends to be pretty thorough in product development, and anything launched usually covers an awful lot of the problem domain. That said, their 1.0 launch of any product is usually an 80% complete product by Apple's own standard; for most other device makers it would count as a 220% complete device at 1.0. So I expect the first version will be a directional indicator that addresses more than twice the gap of other products but is itself not "done" by their aspirational standard. They then iterate, at their respective cadence per product, towards some converged state that reflects a "done," then hit an iterative improvement cycle. It seems to be in the range of 4-5 generations.
Because this pattern has played out again and again, it's fairly obvious. The first part is the high bar for releasing at all, and the willingness to let a product die before launching something less than 220% better than the industry standard despite tons of investment (Apple Car?). The second is the progressive improvement over the first several iterations that closes the remaining gaps as their technology improves.
To me, this feels fairly objective and rational and not particularly cultish. It’s the benefit of an earned reputation and a clear and adhered to product development strategy.
Amen. Love, hate, or don’t care (me), Apple does have an earned reputation for not delivering flops to market. While I have about as much (personal) interest in Vision as I have in anything Apple-stamped past the release of the iPhone 6, it’s certainly not crazy-pilled to think they will deliver a quality product on day one, with the caveat that there will probably be some iteration before we get to Vision 2.0 where the limitations get smoothed out.
And also, to chime in on the most common argument for "VR/AR for work": the meetings. I actually DON'T need to see anyone's face or avatar in an online meeting. I don't WANT to see them, and I'm honestly tired of Zoom trying to shove speaker thumbnails in my face and constantly switching to full screen. I don't need that at all FOR WORK. For work I either want to see the shared screen pixel-perfectly (so no floating pseudo-screens in VR) or not see the Zoom/Meet interface at all, keeping it in the background and just listening while I do work in the foreground.
VR people advertising to me "real-life-like meetings with a meeting room, whiteboard, and full-body avatars" are either clueless or delusional, because that stuff actually has negative value FOR WORK. Sure, when you have something like an online party (yay... fun...), then seeing others can be entertaining. But not for actual work.
You might not be taking crazy pills - but your list of complaints is a very individual one. Headsets may give you personally headaches, nausea, eye strain, neck pain, and acne. But they don't for me, and they don't for any of my friends, or many of my colleagues. And I get motion sickness trying to play many video games.
The only thing stopping me from working for extended periods inside one of the headsets I currently own is the fact that the resolution they run at is too low for me to put enough text on the screen. I can absolutely believe that a headset with a higher resolution would make that use case viable for me.
It seems more likely to me that the reviewers who are hopeful about the product after their 30 minutes of use of the headset (some of whom are indeed very experienced users of AR/VR hardware) simply don't struggle with the individual issues that you have.
To add to your list, I was at a friend's and used their headset for about ten minutes of gaming, and the parts of my head that touched the headset were drenched in sweat, but that may just be a peculiarity of my physiology.
We also shouldn't assume that Apple will magically fix all of these issues either. These are well known problems with VR, and Apple has yet to prove they've solved them. And until these problems are solved, VR will struggle to move beyond a novelty.
> Acne (from sitting on the same parts of your face all the time)
This might actually end up being the hardest bit to solve. VR has to create a dark room for your face, so removing facial contact is practically impossible.
There are very few things that normies will accept, if the compromise is more acne.
> The reality-shaping power of Apple's demo maestros is truly admirable.
It really isn't just the demos that wow us, and it isn't just the high-end device. The device is packed with tech, for sure. But 80% of the story here is Apple's strong design choices:
"Here's how you will interact with the world. Here's how other people fit in. This is how UI is going to display. We rewrote our entire UI stack on top of a physically based renderer so developers can comfortably recompile their millions of existing apps. Here's how we process-isolated the gaze tracking so no third-party app will be able to data-mine it for personalised Ads"
They didn't just build a headset and call it a day. They didn't show useless, aspirational demos (though during the keynote, Disney certainly showed a whole lot of rubbish). They developed the UI paradigms and human interface guidelines, and put huge thought towards accessibility. Those of us impressed with this are not wowed by just hardware; we're amazed by the amount and strength of design thought put into every aspect of your interaction with this computer. There's so much _opinion_ about what this computer should be, and how we should use it. That's the exciting thing.
Their opinions may not pan out in the future. But I'm so glad _someone_ is having strong opinions about human computer interaction in the AR/VR space, because Meta and the rest certainly don't
Pretty sure if I have my headset on and I'm in the middle of the room with my kids, I'm not a "hip father" doing cool shit with my kids. Consider if this was someone on a phone in the kitchen discussing something. We wouldn't think, "how progressive this person is to have work conversations while mostly ignoring his kids."
Granted, as a parent, the scene of the airline person upset that there is a kid on the flight is... off putting. To the extreme. Especially when the response is "ugh, thank goodness I can turn off all visual senses around me." SMH
And I am struggling to see any UI design/consideration that was new in this clip. Do you have a rundown on exactly what was amazing?
Oh I totally agree. The scenes with parents/kids were so off, and felt emotionally jarring
> And I am struggling to see any UI design/consideration that was new in this clip. Do you have a rundown on exactly what was amazing?
The State of the Union video has more detail. All of SwiftUI and UIKit now sits on top of their material-based renderer. You can now render a Metal Shader on a SwiftUI view in one line of code. They have consistent and detailed paradigms for how menus, toolbars, productivity apps etc should work on this platform (likely from porting all of their own). 3D models fit seamlessly into the UI view hierarchy. They have detailed design and interaction constraints for all new types of spatial windows, as well as legacy apps. All of this even goes back to their Objective-C APIs, which are updated appropriately
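To make the shader point concrete, here is roughly what that one-liner looks like. This is a sketch using the SwiftUI ShaderLibrary API Apple announced the same week for iOS 17/macOS 14; I am assuming the same spelling carries over to the headset, and the "wave" Metal function is a hypothetical one you would define yourself in a .metal file:

    import SwiftUI

    struct ShadedCard: View {
        var body: some View {
            Image(systemName: "globe")
                .font(.system(size: 120))
                // One modifier applies a custom [[stitchable]] Metal function
                // named "wave" to the view's contents, passing a uniform.
                .colorEffect(
                    ShaderLibrary.wave(.float(Date.now.timeIntervalSinceReferenceDate))
                )
        }
    }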
Then they've implemented things like process isolation for eye-gaze tracking. Simply so apps can't read your gaze and abuse that data. No one else makes decisions like this in a v1 product. It probably required a whole team of people to get right, considering that gaze is the primary interaction method a user will be using with your app
It's hard to explain just how much depth of thought has been put into this platform. They have addressed a lot of hard questions with some very good design, and above all, they are opinionated about their platform. In every corner and detail there is a strong design choice about the right way to design and implement that interaction. That opinionated design is why people get so excited, and it can be so jarring when they get it wrong
The "material-based renderer" sounds... like fluff? Most any system should have it so that rendering a menu is a single line of code. Same for toolbars and the rest. Those are typically simple registrations of ("name", "description", handler). Anything more than that, and you are doing a tailored menu/toolbar and are straying on purpose.
That leads, then, to the spatial windows and APIs that go with them. And... quite frankly, I have zero faith that anybody can pull off a V1 for that API. Expect churn and capability growth.
For the process isolation on eye-gaze tracking... assuming you can get "hover" events for things you are gazing at, then I fail to see how they can keep you from reading the gaze. I expect they will try to lock down abuses of that, but as soon as you enable coding against what a user is looking at, developers will find a way to leak that.
And if they expect that the primary way I will interact with an application is by looking at it.... that feels very very sad, all told. I don't want to just look at things, I want to manipulate them. And for that, I am almost certainly going to want some haptic/tactile interaction. Such that this won't live on its own.
Happy to be proven wrong, of course. And, for what it is, this does look impressive. I just don't buy the marketing spin on it, at the moment. Way way way too "dreams work, bro!"
The gaze tracking is process isolated. Only when you tap is that location ever communicated to the application process — so they had to build a UI API that supports configurable hover behaviour where that hover behaviour is never readable by developers
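A sketch of what that contract looks like from the app's side, assuming the standard SwiftUI hover-effect modifier is the mechanism (my guess, not something Apple has spelled out beyond the keynote quote further down this thread):

    import SwiftUI

    struct PrivateGazeButton: View {
        var body: some View {
            Button("Play") {
                // This closure is the first point where the interaction reaches
                // app code, equivalent to a mouse click or a tap.
                print("User confirmed with a pinch")
            }
            // The gaze highlight is rendered by the system, out of process;
            // the app never learns where the user looked before tapping.
            .hoverEffect(.highlight)
        }
    }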
They've thought through the accessibility features, how VoiceOver works with the headset, dynamic type sizes, etc.
Designing this sort of consistent API across a whole platform, where it's not only easy to use, but really delightful, is the bit that's exciting to a lot of people
I have collected a bunch of A/VR headsets and developed for them. None of them think deeply about these things, so they just haven't been that interesting. They are more "we supply the hardware, you supply the opinion about how it should be used." I really want the other side of that story
The Disney part of the keynote was all "dreams work." It was complete nonsense with no clear direction. Apple's stuff, while it might be wrong, is at least opinionated and clear
There is a lot of marketing sheen, but there's a clear design opinion about human computer interaction that comes through. That's the exciting bit that I think people react to, sometimes without realising
I didn't actually watch the Disney part. I saw the dad "working" in what seemed like a kitchen? Can't remember. The shallow interaction with the kids, though, gave me cringes.
Then there is the juxtaposition of "this lets you present as if you're there" with "of course you have to fly places to be with people, so when you do, zone out like a champ."
Of course, zoning out better not need you to interact much, as hearing on a plane is tough, and pinching your neighbor is not smiled upon.
I will mostly have to take your word for it that hover actions are not observable by the application. It will make games... more amusing than usual, with characters having no memory of being looked at.
And not showing any extra peripherals is a big part of "dreams work." Physical feedback is huge, and a big part of why controllers are needed. Even just the vibration of the standard PS5 controllers go a long way. For driving games, a haptic wheel is more immersive than the vr.
> I will mostly have to take your word for it that hover actions are not observable by the application. It will make games... more amusing than usual, with characters having no memory of being looked at.
It's described in the Keynote. This is the exact quote:
"For example, where you look is very personal. It can give away something about what you're thinking. In Apple Vision Pro, where you look stays private. Eye input is isolated to a separate background process, so apps and websites can't see where you are looking. Only when you tap your fingers do results get communicated, just like a mouse click or tap on other Apple devices."
I can't speak to whether extra controllers are a good idea for Apple. From the impressions I have watched and read from VR enthusiasts (e.g, Norm from Tested) they have very high praise for the input and UI. I agree with you, however. I don't think typical VR games will be big on this device
> It’s obvious how a 10 second clip scales to 10 minutes or 10 hours or 10 days.
A 10 second clip that will have been tuned to perfection by the very best editors for the demo is one thing. A live, un/rough-edited capture of a game broadcast in real time is quite different.
I've seen some truly amazing sports coverage to show off my OLED TV when I bought it. I still have the USB drive the TV came with and it does indeed look gorgeous. You know what doesn't look quite so good? Live broadcasts of those same sports. They are pixelated with some macroblocking and judder and posterization as the cameras quickly pan around the field and cut randomly.
It still looks pretty good, but nowhere near as good as the minute-long demo video LG used to show off how good their TV can look.
Obviously I've not used a Vision Pro so my comment isn't about the Vision Pro specifically, just that it is quite easy to make things look very good in a highly tuned demo for something vs real world broadcast quality.
Is "an unrestricted view of the game from close up" the only appeal to court-side seats? I think it's perhaps the least interesting aspect, since you can achieve this for free at almost any non-NBA level basketball game.
Or perhaps the appeal of court-side seats is the atmosphere, experience and social signaling, none of which are provided by these glasses.
Our desire for "cheap" court-side seats is understandable from an evolutionary sense. But it's almost like these glasses (and VR in general) are designed to provide the "feeling" of evolutionary benefit while carefully and completely removing any actual benefit.
> Or perhaps the appeal of court-side seats is the atmosphere, experience and social signaling, none of which are provided by these glasses.
The author's entire point was that the demo did recreate the atmosphere and experience he had while courtside at an NBA game. Maybe it doesn't provide the social clout, but that's not the point for tons who wish they could sit courtside.
> Maybe it doesn't provide the social clout, that's not the point for tons who wish they could sit courtside
There is no maybe about it. Tell me another reason why people would want courtside seats that wouldn't also apply to a high school game (where people could sit courtside for free but absolutely don't).
* You can't do the math from a baseline of "not watching the game at all". If this creates 20% of the experience but a $1000 TV creates a different 20% of the experience (watching with other people next to you, for example), you haven't gained any value over the cheaper device.
* Unless you're regularly buying NBA tickets, this isn't going to "pay for itself". Solar panels can (in theory) pay for themselves because they cut back on a bill you'd have either way. A product like this can only pay for itself if it causes you to spend less money on NBA tickets than you otherwise would have.
It makes no sense to sit courtside; the spectator should be able to be wherever they want: in the air, in the nosebleeds, on the court itself, or under the court.
Having a courtside seat is a physical constraint that does not need to be respected in VR.
Indeed, but even in this case, the spectator could still be at any location outside the 3D area being filmed (in this case, the game itself) for the complete experience, or inside it for an incomplete presentation (with potential blind spots).
I think you could make the same claim about TV broadcasts, but pro sports seems to be doing just fine, some claim that live sports is what is keeping cable alive these days.
Live sports and being at the event is always going to be a special thing that people want to pay for; you can't heckle the opposing team through a headset. If people want to pay even more to get a coach's-eye view of the game, I can't imagine ticket sales would be impacted one iota. As it is, watching a game on TV is a better viewing experience than being there live for most sports. American football, for example: the field is large, and your seat is static, so it can't follow the action. It's often played in cold, rainy conditions. I go to a game or two a year just for the excitement of it all, but come a dreary, rainy late-November day, I am happy to sit on the couch and enjoy the surround sound. All major US sports make most of their money off TV broadcast rights. This will be no different.
I'm more a pro hockey fan (go Canes!) and have paid thousands of dollars a year in tickets. I would still go to several games because that is an experience in itself (tailgating, etc). But I would much more gladly pay for a VisionPro than a Bally Sports subscription to watch the rest of the games at home. If they can get the local announcers to do the play-by-play on the VisionPro, I'd be throwing them my money just for this feature alone.
It may look and sound the same, but no one is accidentally spilling beer in merriment or telling their friends they were there.
It builds the brand. Watching the game on tv is already a far superior way to actually see what’s happening. No different from any other live event, it’s a social experience.
> Watching the game on tv is already a far superior way to actually see what’s happening.
Yes and no.
Knowing the entire state of the game? Sure, TV's pretty good.
But watching on TV doesn't give the strong impression of the sheer physicality of the game that sitting a row or two away from the court does. I used to sit baseline for Warriors games and it sure left an impression.
The people who are paying for those seats will still pay for those seats. The people paying for the headset experience would’ve never paid for the seats. Different audience.
In the UK football broadcasting is regulated. The traditional top-flight league games kick off at 3pm on Saturday. None of those matches are allowed to be broadcast to maintain gate revenues for the clubs. Very roughly speaking only 3 matches on Sunday and one or two on weekday evenings are broadcast. Cup matches etc are allowed to be broadcast too.
All these rules only govern what they can broadcast to UK citizens though. If you have a globally popular sport/product like the Premier League you can broadcast the Saturday 3pm matches to them, as they're unlikely to be able to attend a match anyway.
All that aside, I actually think cycling could be a wonderful sport to view in this format. Great scenery shots from helicopters and close-up peloton shots captured from motorbikes. Also, strap into your turbo trainer and ride along for an hour too!
Unfortunately, any case where the camera moves is not going to work for a very large portion of the population. I don't know what percentage of people get motion sickness in VR, but they get it even from their own slow movement; imagine the motion from a bouncing camera on a motorcycle.
Really? We're talking about a transformation from a fixed and time-bound asset, the live experience of being physically present, into an entirely digital experience which can be replayed.
Doesn't this change the potential customer pool from people within traveling distance at the right time with enough money, into a global market?
Consider this: sell the live courtside seat for the same $1,000+; open up sales of the virtual experience at 20% of that and sell it to anyone you like; and, the next day, make the same no-longer-live experience available to pass-holders at $x/season and on demand for $50.
Haven't you just transformed your $1000 seat into some hugely increased figure?
I wouldn't know, but doesn't the point still hold if I can sell (let's say) 1000 tickets at movie ticket prices, which I wouldn't otherwise have been able to sell? Call it $15 * 1000 people ... ?
I think the made up hypothetical figures distracted from the main idea, I was questioning the position that NBA would lose value when it seems like technology like this would open up a new audience of consumers who might now be willing to spend some money that they wouldn't otherwise have spent.
There's a limited number of seats in any stadium, and the demand today is far bigger than the supply. Besides tickets being very hard to find, a lot of fans with money live far away from the cities where the games are played, and some of them (lots of them, actually) live in other countries.
People don't go to the game solely to watch the game. They go because of the atmosphere, the status of being able to afford the tickets, and the excitement. The VR experience even if a lot more immersive than watching on TV, would never be the same, people would still buy seats as they always did.
The worst part will be blackouts for NBA games on this.
I had NBA TV for one season, but because I live in the Portland area, I cannot watch Portland Trail Blazers games. It was ridiculous, so I did not subscribe this year.
I wear glasses daily. When I go on scuba or ski trips I wear heavy eyewear in tough conditions for multiple hours on end over several days. The form factor is not the problem, and it’s never been. It’s always been about the quality of the experience.
I'm on the fence about XR, and I for sure won't buy a "Pro"-grade device. But regarding your point about reality shaping: sometimes it's just obvious. When I saw the presentation for the iPhone, I knew it was the first phone I really, really wanted; everything about it was compelling. I guess this is what some XR enthusiasts feel about this device.
MKBHD mentioned the same use case in his video and how he would pay good money to NBA for it. And he used the headset for a good 30 minutes. Maybe it's a valid use case? Have you personally tried the headset to flat out dismiss it?
I'm not dismissing it. I'm actually considering shelling out the money for this Apple headset. I've just become wary of exuberant initial reports based on 10-second exposures to completely scripted demo apps.
These journalists who get this launch-day experience were selected for their history of writing nice things about Apple. They're being shown the device in ideal conditions under the watchful eye of Apple's world-leading PR team, and they've been primed for years to expect something amazing from the Next Big Apple Product. There isn't a tremendous amount of objectivity in these circumstances.
Apple benefits greatly from the fact that "journalists" are some of their most devoted fans. That gets them free coverage that's wildly disproportionate in both volume and positivity. Not to say they don't make good products - I have to admit that they do - but they escape a lot of harsh scrutiny that other companies have to deal with. Terminally online techies are another core demographic, resulting in a similar skew on social media sites like ... <looks around> ... um, Reddit.
Newton, Power Mac Cube, iMac G4 "Sunflower", and solid gold first-gen Apple Watch come to mind as Apple hardware products that failed to live up to the hype.
On the software side there are more misses, but also the stakes are lower.
It's a great track record for thirty years, of course. All the other big tech companies have graveyards full of half-assed product launches.
Except for the Apple Watch I had all these products. They were some of the best of their time and even hold up today in terms of design and usability.
Your definition of success seems extraordinarily high if these products were failures. Maybe measured by units sold. But each of them stands in nearly every major design museum (like MoMA) and in the history books, and they were clear stepping stones to the Mac mini (the Cube, and the Sunflower minus the display) and to the iPod/iPhone (the Newton). So, bottom line, they were a clear success for Apple's enormous brand value.
IIRC Jobs purchased NeXT when he took the Apple CEO job, to get a new OS base. As NeXT was his company, I'm pretty sure it was a decision based on bias and urgency and not because NextStep was the best there was. But yeah it's an interesting legacy for sure :)
> IIRC Jobs purchased NeXT when he took the Apple CEO job
No, Apple bought NeXT after the failure of Copland to birth a replacement for the creaky and leaky MacOS (BeOS was the big alternative, but Apple thought they were asking for too much).
And NeXT proceeded to take over: Apple bought NeXT in February 1997 keeping Jobs back as an advisor, Jobs staged a boardroom coup to remove Amelio in July, and was then named interim CEO.
Following that he started cutting into the existing product lines and placing NeXT people (Tevanian, Forstall), promoting people he was interested in (Ive), or hiring them from outside (Cook). Basically reshaping the company.
More specifically, Steve Jobs founded NeXT after Apple pushed him out in 1985.
A decade or so later, Apple was on the tail end of a long, slow, downward slide. The team wasn't happy with the current state of their Mac operating system, and bought out NeXT to use their software as the basis of its replacement (Mac OS X).
Jobs, as CEO of NeXT, came back to Apple as a consultant, but was running the company again within a matter of months.
NextStep was easily the best there was. Nothing else was remotely suitable. The only contender people like to fantasize about is BeOS, which was nice (I used it as a daily driver for a year), but a toy compared to NextStep and OSX.
Other than the watch, all the products you mentioned are from before 2002. Anything in the past 20 years that has been a huge miss in delivering up to expectations?
Also how is the 1st gen watch a failure? It sold millions immediately, and was a huge commercial success. It pretty much started a gold rush for digital watches again.
I think it's fair to say the new "Apple" (last 15 years or so), has been pretty good with exceeding expectations and breaking through barriers that other companies just couldn't.
In what sense did the iMac G4 fail to live up to the hype? That was my first Mac, I still have it. Thought it was an incredible computer for the time--the iMac + OS X 10.1 Puma was absolutely magical coming from Windows 98 on a beige Dell.
The design still looks incredible too, 20+ years later.
I will grant that the Newton failed. In the Apple hardware category, I'd also add the iPod Hi-Fi, the butterfly keyboard, and the touch bar.
That said, Apple's failures are rare and their multi-decade track record of delivering on hype is unsurpassed.
The sales of iMac G4 failed so badly that there was a three-month period in 2004 when it was simply unavailable. It was discontinued without a successor on store shelves.
It was too expensive to manufacture and Apple wasn't sure they could sell it, so they just didn't make any. Hard to believe that could happen to a Mac model today.
What was a failure about that? It looked good and worked well.
> solid gold first-gen Apple Watch
In what way did the gold watch fail? It was the first gen watch, the same hardware as the rest, just made of gold for rich people. It didn’t fail any more than any other color did.
Both failed badly to live up to Apple's sales expectations.
The Sunflower iMac was discontinued even before its successor shipped.
The Apple Watch Edition was supposed to grow Apple into a luxury brand and expand its margins massively (you can find many interviews with Jony Ive from 2015 where he explains this thinking). This strategy was a dud.
You need to seriously read what you’re actually replying to instead of what you think you are because you keep bringing up that the AW is not a failure when nobody said it was.
> Newton, Power Mac Cube, iMac G4 "Sunflower", and solid gold first-gen Apple Watch come to mind as Apple hardware products that failed to live up to the hype.
No one seriously thought that the Apple Watch was going to be an iPhone size hit. It’s a complete straw man argument.
It was a pet project for Ive. Do you really think Apple didn't know its target market well enough to realize it wasn't going to sell millions of $10K watches?
It's a perfectly acceptable comment now. No one in their right mind thought Apple had realistic expectations of selling tens of millions of slow, first-generation $10K watches. It was a straw-man argument that I really didn't think people took seriously.
> No one in their right mind thought Apple had realistic expectations of selling tens of millions of slow, first-generation $10K watches
This is not and has never been the bar for "failure". Stop pretending it is just because it makes your argument easier.
The comment you replied to states:
> The Apple Watch Edition was supposed to grow Apple into a luxury brand and expand its margins massively (you can find many interviews with Jony Ive from 2015 where he explains this thinking). This strategy was a dud.
Show how this specific thing is untrue, not your own definition of failure. Was that not Apple's play with the AWE? Was it not a failure, almost immediately discontinued? What exact part of that statement is false?
This is the real strawman and your projection is plainly obvious.
There was no world where a few $10K Apple Watches were going to "expand Apple's margins" meaningfully compared to the number of iPhones Apple sells. Apple knew this. Anyone who knows anything about finance or simple math knows this.
The AWE strategy failed, yet now the AW (like the iPad) defines the category it's in. AV could be similar. Apple doesn't know which use case will take off, but it has to get the device out there to find out. Leading with the best hardware they have right now lets developers go wild.
The Apple Watch is not a failure by any objective measure. It’s very profitable and by far the biggest in the industry. Everything else you name was pre-iPhone.
More recently, the TouchBar and 3D Touch are both pretty massive market failures for Apple despite being engineered to perfection.
I think the killer was that in the Platforms State of the Union, they didn't show anybody wearing the thing, even while saying "and I can send it to the device and look at it there". Almost like they were embarrassed by it, or something...
The Quest has live VR NBA games, and it's not selling a ton of $500 headsets for that use case, so I somehow doubt that it's going to move a lot of $3500 headsets.
> Five games will feature celebrity broadcasters and be shown in 180-degree immersive VR, and WNBA games, NBA G League games and NBA 2K League games will be available to watch as well.
There's a big difference between just five games and all the games in a season.
I could totally see it moving $3500 headsets if it were actually all games. Also 180° is good enough for a lot of things but for this you'd want 360°.
Eh, obviously, all else being equal, 360 would be better, but it's a lot easier to make 3D camera rigs and streams at 180 degrees.
People also probably don't realize just how much bandwidth this takes. You need enough to stream 4K to both eyes, along with enough margin wherever the head can turn so that there isn't a delay when you move your head. A good portion of the world doesn't have fast enough internet for that.
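For a rough sense of scale, a back-of-envelope calculation. Every number here is an assumption for illustration, not anything Apple or a broadcaster has published:

    // Back-of-envelope stream size for a stereoscopic 4K-per-eye feed.
    let width = 3840.0, height = 2160.0   // per-eye resolution (assumed)
    let eyes = 2.0
    let fps = 60.0
    let bitsPerPixel = 12.0               // 8-bit 4:2:0 before compression (assumed)
    let compressionRatio = 100.0          // HEVC-class compression (assumed)

    let rawBitsPerSecond = width * height * eyes * fps * bitsPerPixel
    let compressedMbps = rawBitsPerSecond / compressionRatio / 1_000_000

    print("Raw: \(rawBitsPerSecond / 1e9) Gbit/s, compressed: ~\(Int(compressedMbps)) Mbit/s")
    // Roughly 12 Gbit/s raw and on the order of 120 Mbit/s compressed, before
    // adding any extra field of view as margin for head turns.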
The author was shown far more than a 10 second clip. He along with John Gruber got to do a full hands on demo. They talk more about it in their (paid) Dithering podcast.
<< The reality-shaping power of Apple's demo maestros is truly admirable.
One of the things we dissected in our MBA class was Apple's iPhone reveal and how carefully choreographed[1] it was (I am being very generous, given my anti-Apple bias). You could argue that Apple was simply lying, since the product was not ready, but looking back, they were able to deliver on the promise of that reveal.
Oddly, it makes me somewhat hopeful that this product will work as well. I am not ready to shell out that cash on it just yet, but I will be looking at the SDK as soon as they actually release it. Heavens protect me; I want to play with that toy.
Not least because we have been doing the courtside view of sports events for what... nearly a decade now? And it's not very compelling compared to proper coverage with camera angles.
> It’s going to seem pretty weird when dad is wearing a headset as his daughter blows out birthday candles; perhaps this problem will be fixed by a separate line of standalone cameras that capture photos in the Apple Immersive Video Format, which is another way to say that this is a bit of a chicken-and-egg problem.
It makes me think about the article's earlier reference to the iPad's launch:
> Jobs went on to list a number of things he thought the iPad might be better at, including web browsing, email, viewing photos, watching videos, listening to music, playing games, and reading eBooks.
You know what else the iPad is good at? Perhaps having a wide enough frame, while in landscape mode, that multiple back-side cameras can be far enough apart to capture as good a sense of depth as the Vision does? And, potentially, being a preview device with good enough eye tracking (perhaps via multiple front-side cameras) to at least mimic the 3D effect when previewing, before you put the Vision on for the full experience?
Dad holding up this kind of "iPad Vision," or putting an iPad Vision that's capturing the memories on an (automatically swiveling and face-tracking) "TRiPod," is far superior to putting on the goggles!
(Apple, please please hire me as your product naming visionary. I have many more punny names where these come from.)
But I can see why Apple would sequence the releases so that Vision comes first; the iPad Vision and TRiPod and accoutrements are likely to be expensive and perhaps even clunky at first, more akin to a pro camera rig than anything previously released, and would need to be justified by a "magical" viewing experience.
Make no mistake, though - Vision is just the beginning of an ecosystem that will be built around these experiences, and one that I think can complement our in-person interactions in a profoundly social and human way.
Lord. Watch any of the D conferences and hear Walt, Steve, and Kara talk about things. Steve's Apple is just matter-of-fact simple, no heady logic - just bare sense. Listen to how many times he says "you want to do this, how we want to do that" - everything at Apple was about perfecting what we innately wanted to do but just didn't know how to. If you hear anything other than "we make the best things for the things we want to do every day," please reply here.
There was also a moment where he spoke directly to not making a PDA, because the phone was something we already had, already understood how to use, and implicitly knew would be amazing if we saw it done right.
The PDA is VR/AR, glasses are the phone*
*one could argue glasses are not even the goal and to that I'm not sold
I doubt this tech will be the size and weight of regular eyeglasses anytime soon. Probably 2050 at the earliest. It would require sub-nanometer lithography.
In general I disagree with your assertion. As long as these end up being cheaper than a couple of 4K monitors, which I imagine will happen by the end of the decade, it would be worth it for that alone. And that's to say nothing of the onboard compute.
This is a really good in-depth review. The fact that the Vision is recreating everything digitally is amazing and creepy at the same time. I am in awe of the technology, but the last part of this article also highlights what will certainly be one of its major drawbacks and weaknesses: how isolating it is (something previous VR headsets also suffer from).
While the AR aspect of it helps somewhat, I did also note during the presentation the ridiculous clip of the father wearing the headset at his child's birthday party, and especially how all the example uses, outside of a professional setting, were for individual, lonely users. Both exciting and terrifying at the same time.
Now, I am a lowly programmer and VR enthusiast, so my opinion does not count. But how do big companies allow Apple to get away with such easy wins!?
Apple said : "We are going to throw infinite money at a pilot product that's basically a tech demo, and convince the market that this tech is for-real-for-real."
Meanwhile, Facebook refused to create a balls-to-the-wall VR headset for years and still ended up with a conservative $1,500 device even though they knew no one would purchase it. To be fair, Carmack had been pushing for an all-out device but got sidelined by Zuck's push towards cheap and scale. (I might have mis-remembered.)
Apple's strategy is dead obvious. Of course. When you want people to change how they have consumed information for decades, you HAVE to blow their socks off. Samsung understood this with the $2,000 Fold devices. The first computers cost tens of thousands of 2023 dollars. Car manufacturers understand this with halo cars.
Facebook had a 7-8 year head start, and it never struck them to lead with a "halo VR device"? Meta has among the best ML teams in the world, who could produce "Apple quality" face renders with ease. Sadly, the entire group was held back by the limitations of the kind of compute you can fit into a $400 device.
premature optimization is the root of all evil -Knuth
Amen Donald, Amen.
____________
What makes Apple great:
All big companies have competent employees. Competent employees come together in dysfunctional ways. 100 good ideas produce 1 terrible one, and the final result looks worse than any of the initial solutions.
Apple somehow manages to preserve their obvious wisdoms & the brilliance of the best ideas across their development (product and software) process. The constant erosion caused by committees, meetings & standard communication friction does not atrophy the seed that was the initial idea.
It seems like faint praise. But, despite sounding easy, Apple seems to be the only company able to do this on a consistent basis with physical electronics. (Video games, cars, and construction projects achieve this in their fields, but consumer electronics seems to completely lack it, barring Apple.)
Apple seems to have two modes it does quite well: create tomorrow, and refine today (as it pertains to technology/personal computing).
Some percentage of the company seems focused on an extremely polished, loooonngg-term, second market mover advantage with bleeding-edge tech, and some percentage seems focused on continuous improvement of what exists today.
Apple does a great job of working on cutting-edge technology that is generally applicable, watching other companies implement that technology, and then leapfrogging them. At the same time, they improve their current offerings, and the "future focused groups" learn from what the "current focused groups" are doing. I guess that's why you see fits and starts from Apple.
Has Apple ever introduced anything totally new? Has Apple really "invented" anything, product-wise? I think they're just masters of second market mover advantage.
"In 1979, a 24-year-old Steve Jobs struck a deal with the executives at Xerox: Jobs would allow the investment wing of Xerox to buy shares in Apple at a pre-IPO price, and in exchange Apple’s team of engineers would get to take an “open-kimono” tour of Xerox PARC. Over two visits, Apple engineers were given the ability to look under the hood of PARC’s most impressive innovations."
If that is all you see, you are missing most of the iceberg. The App Store, AirPods, the Apple Watch, a whole host of their own apps, iPhoto, and so much more. All very well executed and generating tremendous revenue. Is each one the best? Not necessarily, but as a whole ecosystem? Probably.
Apple has changed or introduced dominant paradigms that shaped the entire computing industry several times, and has changed how most of the world accesses and uses technology.
> Carmack had been pushing for an all-out device but got sidelined by Zuck's push towards cheap and scale
Is that true? From this article[1], it sounds like he had been pushing for cheap headsets, not expensive ones:
> There he talked about his internal efforts to push for the development of a "super cheap, super lightweight" Meta VR headset that could come in at "$250 and 250 grams." Instead, Meta has put its recent VR hardware efforts behind the heavily overdesigned and expensive $1,500 Quest Pro. "We're not building that [cheap, light] headset today, but I keep trying," Carmack said with some exasperation during the keynote.
Well, as an Oculus customer I can say that Apple's vision is far, far from Meta's, beyond what, with all due respect, Carmack thinks about Meta's failures.
Apple includes a complete vision of an operating system on a new device, while Meta's vision is much narrower: run VR apps on a VR device (which runs Android). In Apple's vision you will run the same - again, the same! - apps, but with a new experience in a new form factor. On top of that, you will run the obvious headset-specific apps. I imagine they will launch GarageBand at some point and let you play a piano or guitar in VR. Microsoft tried to bring a new operating system to multiple form factors (e.g. the tablet), and it turned out to be a messy UX. I really don't think many people understand Apple's edge.
Also, I think Meta's app strategy is completely catastrophic: I don't see a significant number of apps in their ecosystem. When Nintendo, Microsoft (Xbox), and Sony launch a new device, they have a good list of games in the pipeline.
I can also add that I contacted several people at Meta about a specific VR initiative and nobody answered...
I was excited to see Carmack working on VR, but in retrospect he may have been exactly the wrong guy. Romero knew what 3D shooters needed to be, Carmack knew how to make that happen within the constraints of the day.
With Carmack’s guidance, Facebook made an eminently buildable VR headset, a perfect device for the constraints that limited VR at the time. I bet they could have satisfied a ton of demand if they’d shown anything that got people excited enough to actually have demand.
My hot take from a random person spending time in the trenches: Meta leadership is focused on a _meta_ problem, namely that they do not own a major platform and are constrained from further growth compared to Apple and Google, who do own major platforms. They are perfectly happy to be the Android to whatever the next iOS is. From that perspective, a halo device is a risky bet because it doesn't let you onboard a lot of people quickly and start building a platform.
The problem with Meta is far deeper than a few corporate heads or what they control - the customer just straight up does not trust them. Privacy is something they've never really been good with (remember, they were at the center of the Cambridge Analytica scandal), and whenever they've been asked to do any form of moderation, Meta's name gets dragged through the mud (regardless of whether you think they screwed up or not, this is about how it affected their PR).
It doesn't matter how much money they dump into VR - unless they either brand it fully away from Facebook or do some serious PR repair, their VR efforts won't work out. Privacy is a big thing, and Facebook/Meta is the one name even most average users will recognize as "do not trust" when it comes to privacy-related matters.
Note that this only really applies to the Facebook and Meta brand names; because Meta spent a lot of time promoting the rename of Facebook to Meta, the new name inherited all that distrust. It doesn't apply to Instagram or WhatsApp.
That’s a great way of putting it. It’s not necessarily that any given path is _bad_, but some of those paths aren’t consumer/privacy/business/whatever-friendly.
I think Meta and Amazon stand alone here with a different problem than the rest, which is, in terms of people, that they both have sociopathic leadership. Technically, neither has an "in-hand/hands-on" platform either: no desktop OS or truly viable productivity platform. Amazon doesn't even use its own AWS WorkMail. Those two things combined make the erraticism seem perfectly "normal".
I think Google’s problems are survivable if they get rid of Pichai and learn to focus.
This exactly. In my view, Meta was hyper-focused on having their own App Store. They missed the boat on having one for smartphones and tablets, so VR was their only reasonable choice.
Everything else about the platform was secondary: just get a cheap device into people's homes as fast as possible.
The Oculus Quest is cheap enough to become a dust gatherer, with many families never planning to purchase the 2.0 product (see also: the Nintendo Wii).
I started to reply "why the hell does a social network need an App Store?" -- then I remembered FarmVille and a few of my friends becoming zombies...
What a time that was...
It's an interesting way of looking at things. I think, like the article mentions, Meta has been more focused on creating Facebook VR than on pushing the tech forward. "Oculus" became a recognizable name even to those of us who weren't following the space.
The Oculus hardware seems well built (my kid has a Rift S), but it's clunky. I can't count the number of times we've had to reset the boundaries/floor level. It's immersive enough when it's working, but (and I understand it's years-old, mid-level tech) I'd never expect to do real work in it.
Microsoft, on the third hand, has taken a hard pivot towards industry with HoloLens - which is at the same price point as the Vision Pro, but with almost 7 years of experience gained in the space at this point.
The “Facebook Platform” days of FarmVille were the closest they came to building something resembling a platform. Unfortunately for them they optimized against their own long-term interests in exchange for quarterly ad revenue. And they spent it all on chasing Zuckerberg pipe dreams.
I think only Apple could have released such a headset and not been laughed off the stage.
And beyond that, we're talking about a hardware platform that only exists for Apple. Meta isn't in a position to spin up a fab and outperform Qualcomm. What are the alternatives?
I think you're grossly underestimating how easy this win is.
Plus it will take 5-7 years for Android (Google) or Microsoft to get the rest of the industry to come together for creating / replicating a device of this calibre or maybe not at all.
> To be fair, Carmack had been pushing for an all-out device but got sidelined by Zuck's push towards cheap and scale.
You have that completely backwards.
> "I've always been clear that I'm all about the cost-effective mass-market headsets being the most important thing for us and for the adoption of VR," Carmack said. "And Quest Pro is definitely not that..."
I'm not sure it's fair to say that Facebook had a 7-8 year head start, because Apple has been developing every single component that goes into the VR headset for well over 10 years. They design their own cameras, displays, microprocessors, lenses, speakers, sensors, etc. Apple has hundreds of engineers working on each of these components for the various devices they sell. They've shipped a billion-plus of their own custom processors and have a world-class chip design team to make custom chips for specific products. Meta has nowhere near the level of hardware design expertise that Apple has. They've also been slowly laying a lot of this groundwork for years now with ARKit and LiDAR on the iPhone / iPad Pro, moving Macs to Apple Silicon, and supporting iPad apps on it (which allows them to bootstrap the Vision Pro app store).
Even if Apple could source the displays, cameras, sensors, etc., the one thing they won't find available is the processor(s). Apple is shipping an M2 chip, which they also use for desktop computing, plus a custom new chip to handle sensor information. I read that the actual parts cost of the Vision Pro is $1,500, and that's with Apple having huge contracts with their suppliers for other components and getting a discount. Add in R&D costs and, even at $3,500, Apple is likely not making much on this device. If Meta were to make a similar device, I think it would cost $5k, or they would have to sell it at a major loss.
I'd be curious to know more about what has been done with ARKit since launch. I know it's a thing... I just don't know if it has been used for any major purpose or at any major scale and, by extension, if it is forming the groundwork for anything.
There are some really cool apps, but likely not ones that you and I would use on a daily basis. Most of it is things like seeing furniture in your living room in AR, or products on Amazon, etc. There are also some cool ones for scanning objects, generating blueprints of a house, and more. Even if the apps themselves aren't that useful right now, Apple has been working on the software and hardware to do this for years and iterating on it. So when they finally launch the Vision Pro they're not starting from scratch, and a lot of the code has been tested in the real world.
Every year at WWDC they've been announcing more and more improvements to ARKit. I've never seen any "killer apps" that use it but now it feels pretty clear it was being invested in for Vision.
According to the Lex Fridman podcast interview with Carmack, Carmack has been pushing for lower-spec, cheap VR, probably on a phone, IIRC. Something people can afford and use day to day.
Then, through better use of hardware resources, push more performance out of VR/AR apps.
He believes in optimization more than ordinary programmers do. Doom runs on modern conference phones and smart appliances.
I think Apple might stumble over the same things as Meta and other AR/VR headset companies. The big issue for consumers is product differentiation. Put simply: How is the experience of using this product any different than another product that they already use?
Fundamentally, these companies have to show how their product is a) unique and b) useful in a way that differentiates the headset from using a laptop or a tablet or a smartphone. I think all of them are great at marketing point a but seriously struggle with point b. Getting widespread adoption, in any way, requires moving beyond novelty factor and into serious impact on how people work or go about their lives. Apple seems to be really focusing on domestic, 'casual' applications, or 'knowledge work' style arrangements, like meetings or co-operative creative work. I don't think people will find much benefit in that outside of their normal desktop workstations. AR in warehouse or construction or more labour intensive situations or (unfortunately) military applications is actually where I think something like these headsets could be more successful. Cheaper devices do have an advantage in that respect, so long as the tech matches the user need. To summarize my point, I think we're still a good decade or more away from the point when these headsets reach anything near moderate adoption. At some point the cheap devices, the amazing tech, and the clear use-case will converge; but I don't think we're there yet.
I was considering buying a new studio display for my Mac Studio. I have a single 4K display that is fine most of the time, but for serious work, I want more. That purchase is now on hold until I’ve tried Vision Pro.
For just this application alone, Vision Pro is interesting. As well as a desk, my office has a sofa, and a round conference table. I like to be able to move from one to another for different activities because it’s better for my body. Being able to do this and still access my Mac, or have computing surfaces floating nearby would be a huge benefit for me.
I also have a living room with a TV in it. For me, TV isn’t a social activity - it’s just a way to unwind or watch something of cultural significance. I would love to be able to remove the TV and arrange the room in a more social way. I’d also like to be able to watch movies etc. in other rooms.
I’m considering having roommates, and being able to sometimes watch large screen content in my bedroom feels like it would take pressure off the shared spaces, and allow me to downsize my personal use of space more comfortably.
If it works as well as is claimed, it’s practically a no-brainer for me to buy even at $3500, just based on the added freedom it would give in my own home.
I might be overthinking it, but you also might be mistaking yourself for the general consumer when you're really a rather select consumer. I don't doubt that the product appeals to you and to a certain demographic, but will that transfer to a general popularity? I'm rather skeptical but open to be proven wrong.
Personally, I think the "movie screen in a headset" is not the play. It's cool and novel at $3.5k, but it will only remain that if that's all it does. AR and overlaid graphics are where I see the tech really coming into maturity and establishing its market share. Apple could be the company that owns that, or they could be a stepping stone.
I’m not mistaking myself for the general consumer. I’m aware that it’s only one day after the keynote, and very few people are already thinking this far into how it might affect their lives, and I know this is because I have industry experience going back decades.
That said, this doesn’t matter. If the use cases I’m thinking of are actually served by the device, then that knowledge will spread organically, and people will buy it.
I agree that movie screen in a headset is not the play, that’s why it wasn’t my first example. My point is that if just a couple of the use cases are real, it will enable people to use their space differently, and that will be about as sticky as a product gets.
> Apple seems to be the only company that's able to do this on a consistent basis with physical electronics. (video games, cars, construction projects achieve this in their field, but consumer electronics seems to completely lack this, barring Apple)
It's funny watching the interactions between tech reviewers and (usually Shenzhen-based) consumer-electronics startups — think earbuds, "retro consoles", etc.
Almost inevitably, the reviewer gradually develops domain knowledge from reviewing dozens of entrants into the space, which allows them to say exactly what would make for a good device. Then one of the startups invites the reviewer to collaborate on a product design with them. The model that the reviewer had design input on becomes wildly successful, because it actually does some things right. And then, as soon as the reviewer is gone, everything the company makes goes right back to having the same obvious flaws. As if they're constitutionally incapable of learning what made their one good product succeed in the market and just replicating it.
Meta doesn't have the marketing chops, consumer product mindshare, or retail channel that Apple does, so I'm not sure Apple's approach makes sense for them.
And it remains to be seen if Apple's approach will work (though I suspect it will, over time... as you suggest, the premium device will set the market and later Apple will make cheaper devices available that make different tradeoffs.)
I'm not convinced it is or will be an "easy win", because the entire space is still so fundamentally flawed. Everyone I know with a VR headset barely ever uses it (similar to the smart speaker, which Apple also failed at). Not because of the quality of the hardware or software (Meta's stuff is actually pretty impressive now), but because it's awkward and it's strapped to your face. I don't see how this really addresses that aspect.
Apple can surprise you with how successful the things they make can be, but I can't think of a product they have now that wasn't already in a space with tons of successes. Phones, tablets, watches, earbuds - all were already fairly popular. Tablets are maybe the one they really brought into common use, but the others already existed; Apple just "perfected" them. This is... mostly a failed space with very little penetration. It will be much harder.
This "Apple DNA" provides both advantages and disadvantages. The Apple Vision has a shell of metal & glass. On a product where weight is super important. How many grams could be shaved if they used plastic instead?
Aluminum is about 3x the weight of ABS, but is significantly stronger. It’s not at all obvious that any weight could be ‘shaved’ while still making the device as durable.
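A rough panel-bending comparison with textbook material values backs that up (a deliberately simplified model, not a claim about Apple's actual design):

    import Foundation

    // Textbook ballpark values; the flat-panel bending model is deliberately crude.
    let eAluminum = 69.0    // Young's modulus, GPa
    let eABS      = 2.3     // GPa
    let rhoAluminum = 2.70  // density, g/cm^3
    let rhoABS      = 1.05  // g/cm^3

    // For a flat panel of equal bending stiffness, thickness scales as (E1/E2)^(1/3).
    let thicknessRatio = pow(eAluminum / eABS, 1.0 / 3.0)    // ~3.1x thicker in ABS
    let massRatio = (rhoABS * thicknessRatio) / rhoAluminum  // ABS mass vs. aluminum mass
    print("ABS panel: \(thicknessRatio)x thicker, \(massRatio)x the mass")

By this crude measure an equally stiff ABS shell ends up slightly heavier, not lighter, just bulkier.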
We do already have an example of Apple using aluminium for a wearable device that's traditionally made of plastic: the AirPods Max. It's well known for being significantly heavier than the competition; they weren't able to reduce the volume of material enough to offset the heavier metal.
Has anyone ever thought of Facebook as a pioneer in anything? They've made some smart purchases and stole a few ideas. I feel like their entire existence is based on right place, right time.
Maybe it's because Facebook's "move fast" culture, enforced by their performance review processes, isn't conducive to making a "halo" product. Apple's more authoritarian culture has its downsides, I'm sure, but one upside is that engineers don't have to take detours twice a year to produce intermediate results that will survive selfish criticism from their more short-term-focused peers.
Quest 2 is the best-selling headset to date. This Apple device will have an even smaller addressable market at the $3500 price. Meta can add more pixels faster than Apple can drop their price by 2/3. Really, Apple does a lot of things well, but their market supremacy is primarily due to marketing. They are incredibly well-positioned as a necessary luxury brand. They are "cool" while Android and Microsoft are not.
Yes, it's all marketing. People have been saying that for 20 years. It couldn't possibly have anything to do with having hardware that is literally years ahead of the competition, tight integration between products, a better user interface, etc.
Vertical integration/monopolism is part of it too. The money machine feeds everything else. They sell very high end hardware for uses that are beyond the needs of 99% of their market.
I literally had a dream the other night that I still had my android phone from 5 years ago because it had an IR blaster I could use to control my TV. That's probably the last time a phone from any maker had a distinguishing feature worth paying for.
Apple has the entire top of the market. The amount of money transacted on iOS is probably 90% of the mobile market. Been that way at every company I've ever worked for. They mandate use of their app store, take a cut of app purchases, in-app purchases, require you build and develop on their workstations and pay a fee to install software on your own phone. They have a long history of monopolism or attempted monopolism when it comes to things like music, podcast and ebooks as well. They have >60% of the podcast listening audience as well.
Their podcast app accepts a URL to any podcast feed anywhere on the web. Apple never stores the actual audio the way Spotify and Google do. They've also had, since 2005, a documented API to their podcast directory that requires no API key and that other podcast apps can and do use.
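For what it's worth, the directory lookup really is keyless; here's a quick, purely illustrative fetch against the public search endpoint (the query term and the lazy synchronous fetch are just for demonstration):

    import Foundation

    // Query Apple's public podcast directory; no API key involved.
    let url = URL(string: "https://itunes.apple.com/search?media=podcast&term=dithering")!

    // Synchronous fetch for brevity -- fine for a throwaway script, not for app code.
    if let data = try? Data(contentsOf: url),
       let json = String(data: data, encoding: .utf8) {
        print(json.prefix(500))   // raw JSON including feed URLs, artwork, etc.
    }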
As far as music goes, when the music industry wanted Apple to license its DRM to other players, Jobs posted "Thoughts on Music" to the front page of Apple.com, where he tried to convince the record labels to license all music DRM-free to all of the sellers.
Apple never had a monopoly on ebooks and was sued for letting the book sellers set their own prices.
The amount transacted on mobile is closer to 60/40 in Apple’s favor.
Thank you. It’s amazing how people think Apple is somehow mind controlling everyone to buy their products, and mind controlling them to have huge customer satisfaction ratings and resale value. Maybe, just maybe, the products are actually good? Nah, Occam’s razor says they have a Reality Distortion Field.
It's not _all_ marketing. They make good products. They have managed to convince consumers that it's both necessary and worthwhile to pay extra for spit and polish. Like buying a Rolex instead of a Timex.
> Quest 2 is the best-selling headset to date. This Apple device will have an even smaller addressable market at the $3500 price.
I don't think they address the same market at all. Apple will sell their headset to people who already own an iPhone, an iPad or a Mac. This is a market on its own and it's not small. I wouldn't be surprised if 99% of Apple consumers never heard about the Quest 2 and don't care about VR headsets in general, but might still be interested in trying that new Apple device.
One thing which ties this all together is that Apple has a full-blown legitimate OS that can run professional software that already exists. Throw it into a headset, driving up the price significantly, but potentially make a computer replacement.
The only other company in the world that has an extensive OS ecosystem like that is Microsoft.
Everyone else never really had a chance to make a new class of device.
> One thing which ties this all together is that Apple has a full-blown legitimate OS that can run professional software that already exists. Throw it into a headset, driving up the price significantly, but potentially make a computer replacement.
There’s no reason that the iPhone or iPad couldn’t be a computer replacement. The issue is that Apple locks down devices preventing them from being fully utilized. I’m expecting the Vision to be just as locked down as iOS. What I would like to see is Valve take this on with a device as hackable as the Steam Deck.
It's a long way off for corporations to buy their developers a headset like what you're describing. As far as work goes, if you cannot get a corporate office to buy them, it's never going to take off.
> It's a long way off for corporations to buy their developers a headset like what you're describing. As far as work goes, if you cannot get a corporate office to buy them, it's never going to take off.
GDP is driven by consumer spending, business spending, and government spending, in that order. B2B is a nice niche and a person or a company can become wealthy in it, but if you want to be fortune one, the consumer _has_ to be your primary focus.
> I think VR has a future, but I don’t think it has a broad workplace future. Medicine? Absolutely! Gaming? Sure! Office work? Ehhh
I actually think office work is more practical than medicine in VR. While I agree that office work in VR is less than ideal, I think medicine is even less ideal. How does a doctor diagnose and treat a broken foot in VR..? How does a person accomplish physical therapy in VR? How is an X-ray taken in VR..? How does the doctor take vitals for a physical in VR..?
I don’t think VR will be a component of patient encounters, at least not directly. My thinking on VR in medicine is more AR than VR if we’re to differentiate, and I see it being used mainly in scenarios where visual inspection is more difficult or currently limited.
For example, giving diagnosticians and clinicians a much more economically viable COTS solution to reviewing minutely detailed scans of internal organs. Being able to use imaging data to go “inside” a patient to practice an uncommon, experimental, or otherwise high risk operation for instance.
That’s fair, I’m not trying to be argumentative, normally I would agree with you on the AR applications of medicine. However, given the debut of ChatGPT this year, I really do wonder if AGI will be doing the kinds of analysis you’re talking about before AR is at the level of being able to do it with a human.
The Index is a neat piece of hardware, but it needed a refresh even before the release of Vision. Now that Apple has planted a flag in the ground, Valve has the opportunity to make a cheaper, open, mixed reality headset, and own modern computing in the 2030s.
VR "for the masses" has always been hobbled by the extreme prices, extreme hardware demands and incompatibility. Even very early tech like VRML died because of that.
I suspect Apple VR will lead to a short and shallow VR boom, followed by it going the way of Apple TV, 3D TV and (not quite comparable, so with reservations) AR glasses.
It's because Apple is a (top) hardware company (and a software one) and they have years of experience at it. Meta is a software company; they could become a hardware company, but that doesn't happen in just a few years.
To be fair, Oculus was a $2 billion acquisition by Facebook in 2014. That should have brought them the talent needed to make hardware devices, despite being a software company at heart.
That's not enough. Apple built its headset on top of a 15-year-old mobile OS that has been ported across devices, custom-built silicon, custom-built cameras, screens, etc. Of all the Big Tech companies, Apple is the only one that does custom hardware at scale.
I seriously doubt people are going to walk into a bar and get punched with Vision Pro on their heads. Apple had years to look at everything everyone else did wrong.
No, I didn't mean it would be used exactly the same and people getting punched was not the reason Google Glass failed. It failed because it was pricey and no one saw a need to be walking around with a mixed reality device.
I can't see Apple Vision doing anything better than me just using a normal kvm input. Hand gestures are incredibly overrated and so is seeing things in VR. It works to make games more immersive but that's about it.
A halo device is not meant to sell, it is meant to impress. Apple doesn't want to sell this headset at all. They want everyone to wish for a headset by the time they're able to sell one for $1,000.
This is what long term planning looks like. You need to prime your audience to buy something, before they actually want to go out and buy it.
A few hundred million wasted on the development of this device is pocket change for Apple.
> This is what long term planning looks like. You need to prime your audience to buy something, before they actually want to go out and buy it.
When the Apple Watch was first unveiled, my opinion was something along the lines of, "What, I'm so lazy that I'll spend $400 to avoid pulling my phone out of my pocket?"
When the AirPods were released, I probably said something like, "What, I'm supposed to hate a little wire so much that I'll spend $200 to eliminate it and now I have another thing to charge? Hah!"
But with both of these, after enough exposure and seeing people I know use them consistently, I caved. And yeah, no way I'll go back to wired ear buds when I'm running or walking the dog. No way I'll ditch this watch now.
100%. I thought the same thing with my Apple Watch and I resisted spending what I considered to be an unreasonable amount on earbuds as I didn’t care (or so I thought) about having wireless earbuds. But eventually so many colleagues had them and talked to me about how great they were that I caved and bought a pair. Wow was I wrong about how wireless earbuds was something I didn’t need. Then I started evangelizing to people about them, and the cycle continued.
I agree; at first the watch was not really that interesting. Now, I can see it eating Garmin and Wahoo for athletes. Why buy a separate device for tracking bike rides or runs when you can just use the Apple Watch? The Vision Pro will be the same thing. Give it 5 years.
You're supporting the idea behind "A lot of times, people don’t know what they want until you show it to them." Except, it sounds like the barrier is even higher for some people.
Very true. And I’m an iOS/macOS/tvOS developer who has all the toys, yet even I had trouble coming to grips with paying a premium for AirPods over a good pair of wired buds.
First-generation Apple products are never for the masses. If you want to see what Vision will be in a few years, look at their other product trajectories.
Lots and lots of people were waiting for the iPhone 3G model and skipped the first one.
Considering it's the only model with the cell tech in the name (there's no iPhone 5G), I think even Apple knew the first model was a ground-laying device, not the "pave the world" model the subsequent versions would become. They weren't even committed to the App Store yet.
While that is true, I would also say it was a little bit special. There are many reasons why, but I imagine one is, people loved their iPods. When iPhone came out, you now had both your phone and your iPod in a single device. I do think without iTunes and iTunes integration, iPhone may have been a flop.
Jobs' stated goal for the first-gen iPhone was to have 1% market share in the phone market in the first year. They barely made it. It definitely was a raging success.
The first iPhone was $500 when the average phone was $200. It was an upmarket device that eventually got cultural acceptance and came down in price when produced at mass-market scale.
It's the Tesla S of headsets. It creates the market. We already know that Apple has cheaper tech in the works. There are manufacturers working on "lesser" versions that don't have the "amazing 3D knitting for the one-piece headband..." that will be more affordable down-market versions. But you need something that blows people's hair back first.
What is amazing to me is that no matter how much a company makes, you can't change its DNA. Facebook can never build a good physical product, no matter how much money they throw at it. Google can never make a good piece of software, though they can have very strong backend engineering for it, and so on.
That is why people talk about culture a whole lot and why it is so important. I wonder if the culture is set and can never be changed though. Is it an immutable property of companies based on initialization?
> Google can never make a good piece of software, though they can have very strong backend engineering for it, and so on.
I think it’s more accurate to say that Google struggles to make application layer software. Most of their successful products like search, YouTube, Maps and Gmail are successful because of infrastructure/data layer advantages despite unexceptional UX/FE.
Chrome is a notable exception.
> I wonder if the culture is set and can never be changed though. Is it an immutable property of companies based on initialization?
I suspect there’s a feedback loop where the people who drove the initial product success end up running the organisation and their priorities dominate.
I’d love to read a study that investigated this by looking at how org charts on LinkedIn evolve over time.
I like Gmail. It started out good, and hasn't yet collapsed under all the features they've been throwing at it. Although recently I've been getting these brief freezes when I'm typing ... the end may be near.
I think too much is made about the "personal vs social" binning of Meta vs Apple.
It's true off the bat, but it's more useful as a hook to hang philosophy about our isolated selves on than as a concrete projection of how this is going to develop.
To put a sharp edge on that, I will go so far as to say I think this is quite the wrong take, and I have receipts. Maybe.
The FaceTime bit about scanning your face, and presenting you sans mask in teleconferencing? Great, I look fine in FaceTime!
...the exercise left to the student is to infer that if you can appear in FaceTime, next you can appear in the space with me, outside that frame. Or your AI-backed agent can, which, you know, can also replicate your manner of speaking and voice, because Apple has endless hours of your speech, new and old, written and verbal, on tap, right?
You and proxy-you are going to be in shared space with other wearers... soon.
And then, well, we have something world-altering to deal with, because our monkey-minds fully accept such figures as "real" and our memories filter out the virtuality of our interactions, even where our analytic mind does not.
The world of "Her" and Joe's "girl friend" in Blade Runner etc are all getting really close. See the clickbait about someone "marrying" their AI boyfriend yesterday?
Anyway I digress, but, yeah, I think Apple is quietly collecting cards for shared virtual presence, and as per usual will take their time to make it wow on disclosure.
Am I the only one that is surprised they went with the "all in the glasses" design? (Minus the exterior battery and cable.) I was expecting an extremely lightweight pair of glasses that wirelessly connects with a stationary device that provides most of the computing power for the VR/AR. That device could even be the new generation of iPhones...
The design as-is looks better than 95% of other headsets, but still strikes me as a bit clunky.
The tech isn’t there for “just glasses”. You need motorized optics, sensors to measure your eye movements, and a dozen cameras to capture your surroundings, hand movements, etc. Also, image processing in a separate device might increase the AR pass-through latency, which allegedly they got down to 12 ms.
Seems like they needed a plethora of cameras and sensors up in the glasses to watch your eyes, hand gestures, and the environment. This is too big to fit in a lightweight pair of glasses.
Add that the R1 chip almost certainly needs to be right next to the cameras and the display to get that 12ms response time, and about the only thing you could move off the device is the CPU and SSD... at which point, probably, why bother?
Right, but that R1 chip is going to be doing work constantly, and can't be moved without increasing the latency (decreasing quality and increasing issues with vertigo); so you have to work on thermal regulation within the headset anyway. So comparing the whole package of "R1 in the headset, M2 somewhere else" vs "R1 and M2 both in the headset", is the slight reduction in heat to be diffused from having the M2 elsewhere worth all the hassle of having to figure out where to put it? It sounds like Apple's engineers have gone with "no". Time will tell.
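For a sense of why, here's a hand-wavy latency budget for the off-device case (every figure below is an assumption, not a measurement):

    // Hypothetical passthrough path if frame processing left the headset; all figures assumed.
    let sensorReadoutMs = 4.0   // camera exposure + readout
    let encodeMs        = 3.0   // compress frames for the wireless link
    let wirelessRTTMs   = 4.0   // local Wi-Fi round trip on a good day
    let decodeComposeMs = 3.0   // decompress and composite back on the headset
    let totalMs = sensorReadoutMs + encodeMs + wirelessRTTMs + decodeComposeMs
    print("Off-device passthrough: ~\(totalMs) ms before display scan-out")  // already past 12 ms

Even under friendly assumptions the budget is blown before the display ever scans out, which is presumably why the R1 stays on the head.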
But if it's "extremely lightweight", where would you put the battery to power a high bandwidth transmitter/receiver? I can't imagine anything approaching the size of a pair of Wayfarers having space for the battery or electronics (not to mention all the optics and sensors!)
When Ben Thompson writes something that you could also interpret as a massive paid ad, that says something really interesting about the product he’s writing about.
I understand that I have a pretty large bias in the opposite direction that Mr. Thompson claims. That is, I think VR is stupid and I don't understand the use case for it, at least in the incarnations we've seen so far. I also remember the failed Google Glass, and think, "If people wouldn't wear that comparatively less-cumbersome and less-stupid-looking thing, why the hell would they wear this one?".
So my immediate reaction was also to mistrust the motives of the author. However, I can recognize this bias in myself and be OK with being wrong. My track record is not exactly pristine, after all. I felt pretty much the same way about the iPad.
While you don't understand the use cases, I (and probably many others) do understand and appreciate them. And that's OK! No one is forced to buy this device, just like no one is forced to buy any new innovative technology.
To me, Apple Vision has great implications for work, gaming, and other entertainment. The fact that I can use it sitting in any comfortable position and interact with the screen the way I want is a good enough selling point. Gone are the days when I have to hold this heavy slab in my hands and type awkwardly on a small keyboard.
For simple web browsing and reading ebooks, I don't need to sit at my desk (back pain) and stare at my statically placed monitor (neck pain).
It’s definitely a theme of our times that we have all come to believe that any opinions contrary to our own must be for corrupt motives. I’m ready for that part to be over.
The "relive your memories" is maybe less dystopian than commenters are thinking because you can presumably also view footage from other 360 cameras, like action cams used during sports, or simply placed more discretely in a room (yes, porn is the obvious use case that Apple won't say).
I'm definitely not going to be keeping my hands in the air to type on a virtual keyboard. I can tell you right now that won't work and your arms will get too tired. Must be only for minor tasks. But I can't afford the thing anyways so doesn't matter.
Also, hand movements lack tactile feedback. This makes them pretty useless for a lot of tasks.
Even with my Quest 2, I notice that the lack of push-back in the hand controllers during motion, for things like flying games and FPSes, makes them very difficult to use. You almost always need an attachment to make it feel better.
Anyone here who was there yesterday and used it: what's it like looking "through" it at the world? How does focusing on near and far work? Are the cameras adjusting focus based on where you are looking?
I was not there and have only read a few media outlets' first-hand 30-minute demo summaries. None answered your questions about focus.
One did say - I think either from CNET or from creative solutions, which you can find on YouTube - that the passthrough cameras were the highest quality they had ever seen. And they are experts. It was CNET, now that I remember. He talked about looking at his wristwatch and being able to make out the text on it.
I'm viewing this release from Apple as, essentially, a public DevKit for more slick versions of the same kit in the future.
Something with the described functionality, but a form factor closer to Google Glass, would 100% be game-changing.
i.e., you develop your AR apps on this, and then they come out with something much more user-friendly in a couple of versions' time.
I'm just not convinced that most people want to walk around with this thing on their face for any extended period of time, regardless of how amazing the tech is (and it is amazing!)
Seems like most of the use cases he's raving about are novelty VR experiences (e.g. immersive sports viewing), and the benefit of the AR is simply that you can exit/browse these experiences without feeling like you're inside a VR void. I guess it would be nice to have infinite monitors for productivity when I'm doing work, if it's seamless, but I don't see the killer general use case for AR yet.
As usual, what matters is the user experience. Technology is only a conduit for it. The difference between Apple and many other tech companies is that they prioritize the former over the latter, and when the latter is available it exists mainly to amplify the former.
This problem is most obvious in video games, where eye-watering graphics are prevalent but the gameplay sucks. Once it gets to 8K per eye with a decent FoV, it will truly look pretty close to reality.
My biggest issue with the Vision Pro is the extension of the App Store model to a new platform, where we could have started from scratch and struck a new deal on the percentages.
Unfortunately, Apple has the resources to seed the most prominent developers with any help they need, and once the ball gets rolling there's no going back.
It will be interesting to see how this plays out.
I think the point that many are missing is that Apple doesn’t announce half-baked products at nearly the rate that Google and company does.
The fact that Apple took this bet isn’t indicative of the current product, it’s indicative of the future. They’re correct in almost every product bet they make. If you don’t see it, you’re probably not seeing the long game.
Throughout all this, I'm struck by the epic credibility of Apple. Any other company would have seen a much larger share price slump after making an announcement like this, but Apple managed to pull it off with only a slight dip. This is truly a super power and may very well be enough for them to achieve breakaway network effects.
I'm just still surprised that Apple made a heavy-duty massive VR/AR product instead of something more like Google Glass - something lightweight and fashionable as a peripheral for your phone just for showing notifications and turn-by-turn directions and other kind of status updates -- an Apple Watch for your face.
Apple saying they are doing AR does not mean anything by itself. If you use the headset as a big "projected computer screen", I doubt people really care whether the actual surroundings are visible or whether that makes any difference.
This is a really good article, although I don't fully agree with the conclusion. Thinking that Meta's Quest is a social headset whereas Apple's is a solitary one ignores the fact that social interactions and integrations can and will be developed for the Vision Pro.
If it's comfortable enough to wear for prolonged periods, it might be superior to most common multi-monitor setups that users have these days. Nausea would be induced by a lack of visual references and latency issues; AR with good responsiveness should address that. Eye fatigue and headaches might be another issue. Too early to tell, but this might not be much of a problem. Otherwise, it looks all good.
In the Platforms State of the Union talk yesterday they actually showed how developing Vision apps will work, and it can be done through the device just by looking at your Mac. It will spawn your Mac screen and the Vision app you're working on in the air. It looked pretty slick.
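If the SDK follows the standard SwiftUI structure shown in the sessions, a minimal windowed app would look something like this (illustrative only: the app name and view contents are made up, and nothing here goes beyond what plain SwiftUI already provides):

    import SwiftUI

    @main
    struct HelloVisionApp: App {
        var body: some Scene {
            WindowGroup {
                VStack(spacing: 20) {
                    Text("Hello, Vision Pro")
                        .font(.largeTitle)
                    // On the headset this is activated by looking at it and pinching.
                    Button("Say hi") {
                        print("Button activated")
                    }
                }
                .padding()
            }
        }
    }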
A number of people have written blog posts about doing their work in VR with other headsets, including development. Unless Apple's walled garden gets in the way, then it'll definitely be possible in the Vision too.
I agree with the critique about recording videos using the headset — what an odd experience that will be. But I also assume a future iPhone will be able to record these 3d videos and photos.
Funny, if one combines the (anti)social projections of this and related devices with the (anti)social projections of the new generative-model stuff...
so you buy a "skin-fixing" filter and your skin does not look that old and crappy anymore (to you - and also to all your registered friends if you buy the Pro version); hey lets attach a "mustache-adding" filter to that particular aunt (haha) ; f*, lets generate a new face for my partner - ah i always wanted a blondie..
I note that nobody in the Apple presentation was using finger flicks to delete and rearrange windows, which I think were kind-of-iconic UI gestures in MR.
They're going to have developer previews available in certain cities around the world. You'll be able to take your app and run it on the hardware to see how it works outside of a simulator.
> the ability of the Vision Pro to take “pictures” — memories, really — of moments in time and render them in a way that feels incredibly intimate and vivid.
This reminds me of the following two scenes from Brainstorm (1983 film):
Is Stratechery still doing high-quality content? I remember reading it a couple of years ago and really enjoying it, but it's a tough business, producing consistently interesting content.
Honestly curious -- it's a challenge for most podcasts / content producers.
> Apple, meanwhile, isn’t even bothering with presence: even its Facetime integration was with an avatar in a window, leaning into the fact you are apart, whereas Meta wants you to feel like you are together.
> 2hrs battery, on a flight can you even carry those...
Most flights have power sockets now, don't they? You're sitting down in a confined space; it seems fine to plug it in.
And even if you can't plug it in, I'd guarantee it'll be below the 100Wh restriction that almost all airlines use; there aren't really any consumer electronics batteries above this, precisely because you can't travel with them. They would also be huge and heavy.
They showed it, but didn't really focus on it. They also mentioned pairing with their magic keyboards and other peripherals.
I experimented with a keyboard/mouse in an Oculus headset and having physical keys under your fingers is much better. Only problem was that I couldn't see the keyboard inside the headset which was a bummer.
According to MKBHD’s video, you just look at the key you want on the on-screen keyboard and then click it by touching your thumb and index finger. They also mentioned in the keynote you can connect a Bluetooth keyboard and type on that.
For the first few minutes I thought HoloLens was fun too, with ET. If you have to look at a lot of letters, you are going to get tired of it fast, no matter how good it is. It's just not a great input method for typing compared to a real keyboard. If they are so good at hand tracking, it'd be better if they put a virtual keyboard on a surface near you that you can type on (I bet they'll have that option eventually).
His video had lots of harsh criticism of many features of this headset that probably cost Apple $100MM each to build. Several parts of the demo were met with Brownlee's condescension, but one part even seemed to earn his condemnation: the part which implicitly required a dad weirdly wearing this at his child's birthday celebration so that it could be replayed in the future.
Videogaming seems like such a small market for Apple to chase individually. They make almost as much revenue on AirPods every year as Sony has ever sold in PS5s.
I feel like I’ve read this comment verbatim in at least one other thread in the last day or so…am I in the vr simulation already?
Isn't Apple one of, if not the, largest video game retailers because of the App Store? A quick search suggests they made more money on video games than Activision Blizzard, Sony, and Microsoft combined in 2019. It's still a big piece of the money pie. And with this headset they control both the hardware and the software platform, so they get to double-dip on profits.
Apple makes a lot of money from games on iOS, but nobody would say the iPhone is a gaming device.
I expect the headset to be the same: it will have the GPU power to support games, third-party devs will make games, Apple will make money from distributing them, and it will still not be considered a gaming platform.
I disagree, I consider the iPhone to be a gaming device. By many metrics it might even be the most popular gaming device ever. It's not only a gaming device, but it can be more than one thing. That said, I suspect that the price tag on this generation of the Vision is probably gonna shut out most casual users; maybe we will see marquee, platform-selling games on it in later generations but I have a hard time imagining many people buying it for that until they become more affordable. I suspect that this will be a hard sell to the crowd that has $3K to spend on a gaming device (which granted, lots of people who play PC games spend that much), so it may be a chicken-and-egg problem.
That said, having used Hololens 2 and Magic Leap I saw the potential for AR to be something really cool. I'm certainly rooting for Apple to finally deliver something polished and compelling, both for professional and consumer users.
I suppose it's a specific thing. I'd say "gaming device" means heavily optimized for gaming, including tradeoffs that make it less attractive for other scenarios. Consoles, Nintendo Switch, wired mice with a billion buttons and 1000DPI, etc.. those are gaming devices to me.
Are they talking about input devices? Not only the headset doesn't need one for basic operations, but it's also compatible with keyboards, mice/trackpads and game controllers. What's missing?
I mean precision superhuman input like mouse or dual analog sticks.
I can't for the life of me see myself using a headset with a mouse, but who knows.
That said, you cannot depend on inputs that aren't guaranteed, because the market drops from 250,000 down to, like, 1 for each input device (unless you want to debug all controllers with the headset). You need volume. ONLY the Quest 2 has volume, and, as things are going now, it's the only one that will for eternity.
Can we afford 10 million Vision Pros ($35 billion)?
I'm just waiting for the Quest 2 to get Linux... that's inevitable if VR is supposed to have any future.
I'm never touching a closed system like iOS or Android.
I find his gushing about the centrality of phones to be annoying.
Verizon, AT&T and T-Mobile choose not to serve the valley I live in, why should I reward them by buying an expensive plan? From time to time I've had a Tracfone but after a few years I wind up with 3000+ minutes because my average mobile call lasts under 2 minutes.
Practically I use an iPad to do almost everything people use phones for, even making telephone calls with Skype. I'm just waiting for them to shush me at the public library and point out the sign that says using a phone isn't allowed and tell them I'm not using a phone, I'm using a tablet. (My typical call is so short though they don't have time)
I constantly complain about "phonishness" invading the PC and other computing platforms, like Reddit trying to push you to use their mobile crapp. I still see the phone as a step backwards in computing, not a step forwards, particularly if you are forced to use that other mobile OS that has a trashcan as its logo.
Maybe it's a step back for you, but look around. The phone won. Tablets, laptops, etc. will never make a comeback to be nearly as important or popular as the smartphone. The only question is when a new thing will replace it. IMO that's not going to happen for a really long time.