Apple forced to make major cuts to Vision Pro headset production plans (ft.com)
87 points by macleginn on July 3, 2023 | 320 comments


I am very pessimistic about the future of VR beyond it being super nice for niche cases. But I think there’s something fundamentally good about:

1) Apple taking a stab at it, because it’s the kind of thing where UX really matters and Apple does UX well.

2) Apple not being trapped in the “if we can’t sell a billion of them then why even bother?” mindset that so many other companies get trapped in, often because they’re not bursting at the seams with cash, talent, and runway.


> Apple does UX well.

I hear people say this often enough to question my own sanity, because I don't find Apple's UX to be very good at all. I find it confusing and it makes things difficult to figure out.


Could you provide an example of the sort of thing you’re talking about? Because I’m the type of person the other commenter was talking about (I’ve been in the Apple ecosystem for a very long time), and I feel I’m too close to it to have a reasonably objective viewpoint on it one way or the other.


Sure. iPhones and the Apple desktop OS. In both of them, I am lost as to how to accomplish anything but the most common stuff. The design of the OSes is such that it's not obvious how to do anything, and they provide very little guidance. You have to already know how to do what it is you're trying to do.

This is even worse on Apple smartphones, because a lot of stuff is done using mystery gestures that you have to already have memorized.

Skeuomorphism used to provide a small amount of relief, but now that it's gone out of fashion, even those clues are gone.

In short, I find Apple's user interfaces to be very opaque. In fairness to Apple, this is also true of Android, Windows, and Gnome. But in the case of Android, it feels intuitive to me simply because I've internalized how to do the stuff that it gives no hint as to how to do.


I’m actually curious about specifics here for macOS aside from things like hidden scroll bars, mainly because I find it to be one of the most discoverable. If you go into the ‘help’ menu in any application, you get a search box that searches all the menus of the application; if a shortcut exists for the action, it’s listed alongside it. Admittedly Qt and KDE can do this too, but only for Qt apps, while on macOS it’s every app that actually provides menus. Nearly every application handles files and actions the same way, and takes the same common shortcuts for things like preferences. If I want to see shortcuts for system actions, they’re all listed in the keyboard control panel, and arbitrary shortcuts for arbitrary applications can be added there. I can’t name another system that feels as integrated and discoverable in that sense, except possibly KDE or GNOME running ONLY apps developed specifically for that toolkit.

What system(s) do you find more discoverable, and how? I’m honestly curious, because I love checking out new systems or window managers, and it’s bugged me for a while not to have some of this elsewhere.


How do you show hidden folders in Finder? You literally have to memorize a keyboard shortcut. How do you switch apps in full-screen mode? You need to know trackpad gestures or keyboard shortcuts. Every post-2004 addition to macOS seems to rely on trackpad gestures, to the point that using the operating system with a mouse is a negative.

The search bar in the menu is fine if there is a menu to search. IIRC, or at least the last time I used it, the screenshot tool had no menus and it was extremely unclear how to save a screenshot as a file in the location you actually want.


The screenshot tool is pretty tightly coupled to shortcuts, that’s certainly true. I actually didn’t know there was a keyboard shortcut for hidden files and folders; it can be toggled through Finder’s preferences, but again, yeah, that’s not as discoverable as it should be. Switching apps in full-screen mode would be done using the Dock. I never do, but it works just fine and it’s the way you switch apps with a mouse; cmd+tab also works, with the system set to switch to the space of the selected application, which is the default.

To your point about trackpads though, I agree. I can’t stand a mouse on macOS, it loses too much convenience, especially with BTT available.

That said, is there any system today that you find more discoverable? I mean, SerenityOS actually fits that bill, and so do classics like older macOS or Windows 98, but anything modern?


> How do you switch apps in full screen mode, you need to know trackpad gestures or keyboard shortcuts.

The Dock also works, although you have to move the pointer downwards to show it. In a lot of ways, clicking on icons is macOS’s foundational mode and everything else is superfluous. I, as a power user, sometimes forget that.

> IIRC or at least last time I used it the screenshot tool had no menus and was extremely unclear as to how to save a screenshot as a file in the location you actually want.

The screenshot tool, by the way, is also an app under Applications → Utilities. If you use that, you’ll get a floating palette with different options, including for different save locations. Those options then also get used by the power-user ⇧⌘n shortcuts.

(I don’t necessarily disagree with you; there is too much invisible navigation.)


That screenshot utility is also accessible with ⇧⌘5, which might be invisible, but just letting you know.

As for where to save that screenshot, that is available in the Options section of the floating menu.
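As a Terminal-side alternative to the floating palette, the screenshot save location is also exposed as a `defaults` preference. The `com.apple.screencapture location` key is long-standing, though exact behavior can vary between macOS versions; a sketch:

```shell
# Point screenshots at a folder of your choosing (default is the Desktop)
mkdir -p ~/Screenshots
defaults write com.apple.screencapture location ~/Screenshots

# Restart the UI server so the new setting is picked up
killall SystemUIServer
```

After this, ⇧⌘3/⇧⌘4/⇧⌘5 captures land in the chosen folder instead of on the Desktop.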


And that screenshot tool does not have anything in the global menu. That was my point.


Or hold down the option key while right clicking things or in certain dialog boxes. I guess that's not horrible if you internalize that fact but it's not discoverable.


> How do you show hidden folders in Finder?

You don't. They are hidden. If you are supposed to see them they shouldn't be hidden.

I understand that other operating systems make a different choice about this, but that doesn't make their choice the correct one.


⇧⌘.
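For anyone who prefers a persistent setting over the toggle shortcut, the same preference can be flipped from Terminal. This uses the long-standing `AppleShowAllFiles` Finder key; behavior can vary slightly between macOS versions:

```shell
# Show hidden files in Finder persistently (equivalent to toggling ⇧⌘.)
defaults write com.apple.finder AppleShowAllFiles -bool true
killall Finder   # relaunch Finder so the setting takes effect

# To hide them again:
# defaults write com.apple.finder AppleShowAllFiles -bool false && killall Finder
```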


I’ll give you a very specific example. When turning on the flashlight, I have to hold the flashlight icon for a specific period of time. If I hold too long, I get the dimmer option. If I don’t hold long enough, it doesn’t turn on. The iPhone expects what feels to me like a zen-like stoner’s calm press of the button. This is absolutely infuriating to me. Every time. Just turn the damn light on when I press the button.

Did you know that you can hold the spacebar to move your cursor around on iOS? Without my friend telling me about that feature, I also found text selection in iOS to be deeply frustrating as compared to Android


My Dad is going through ALS and keeps accidentally turning on his flashlight. And is then unable to turn it off. So many UI choices in iOS just seem poorly thought out.


> I’ll give you a very specific example. When turning on the flashlight, I have to hold the flashlight icon for a specific period of time. If I hold too long, I get the dimmer option. If I don’t hold long enough, it doesn’t turn on. The iphone expects what feels to me like a zen-like stoner’s calm press of the button. This is absolutely infuriating to me. Every time. Just turn the damn light on when I press the button.

I just tested this on my phone (albeit an iPhone 8, so a somewhat older model, perhaps without as many features). A short tap is sufficient to turn it on and off. A longer tap brings up the intensity slider. I was unable to tap the icon quickly enough to not register the tap and toggle the flashlight.

Upon retesting a couple times, I did notice that on the first tap the icon is illuminated but it did take a split second more for the actual flashlight to turn on. But waiting the split second was sufficient instead of further tapping.

I’m using the menu that is pulled up while the phone is unlocked (by dragging from the bottom of the screen, on my model), for reference.


The iOS UI has to determine intent from touch input, so as more gesture-based controls have been added, the OS needs to figure out what you're trying to do. Tap. Tap and drag. Scroll. Slide. Whatever. If your intent is one action, but your inputs don't match the actual requirement for what you want to do, you get bizarre behavior.

For the flashlight in particular, the button inside the control center can do a few different things—including closing the control center entirely—if you fumble the tap even slightly with an upward push of the thumb. With the Lock Screen, a sliding motion of any kind will just not turn the light on if your slide doesn't begin and end inside of the UI element that you're entirely obscuring the view of with your enormous fingers. You can accidentally open the Lock Screen customizer. It's even possible to get haptic feedback from the lockscreen flashlight button without actually turning it on.

While I don't have any of these problems, I am familiar with them and have observed others struggle. There are accessibility settings that are designed to help (repeated input filters and such) but they all slow the UI down and somehow make it more confusing because the phone is just more likely to do nothing rather than the wrong thing.


How do you AirPlay something on macOS Monterey?

On a multi monitor setup, why is it impossible to drag a window over to a monitor where another window is fullscreened?

How do you select a specific resolution or refresh rate when you're plugged into an older meeting room projector?


You're looking for specifics from people that don't necessarily have them. Not because their reasoning isn't valid, but a lot of us don't use the ecosystem frequently enough for them to be in the foreground of our minds.

I had to use an iPhone for a few weeks when mine broke. That was enough to convince me to never touch any of their products again. I'd love to tell you more about it, but it's been 4 years and I just don't remember all of the details. The main thing I remember is that it was anything but intuitive to use and it was an entirely frustrating experience. Easily the worst phone experience I've ever had.


This makes sense. The few times I’ve tried to help someone with Android, I had to go very slowly through every single step. The two look pretty similar from afar but are quite different. I can still make my way around Windows because I’ve used it before.

My 70-80+ year old parents can use iOS and iPadOS mostly fine but get lots of things wrong on their Mac, so it seems to be a net improvement.


The macOS 'application centric' window system model is an excellent example IMHO.

Alt-Tab (erm... Cmd-Tab) switching through applications instead of windows (and then having a separate hotkey to tab through windows, but only within one application), combined with the fact that there can be UI applications that just have a menu bar but no open windows, is just incredibly bizarre. Even though I've now been a primary Mac user for more than a decade, it still feels incredibly clunky - at least there's this gesture-controlled 'exploded-desktop-view' as a workaround.

...and when it comes to the new stuff:

- hiding the scroll bar, what a completely non-sensical decision from a UX point of view, I now need to wiggle the touchpad to see where I am in a document, and how much of the document I'm currently viewing (yes I know that this can be disabled, and that's the first thing I do on a new Mac, along with inverting the scrolling direction)

- arranging buttons vertically in popup dialogs on macOS devices with landscape display orientation (arguably it makes sense on an iPhone with portrait display orientation, but in either case the vertical arrangement makes it a lot more likely to accidentally hit the wrong button)

- the new settings panel is just a massive step back from the old one (which wasn't all that great either)

- don't even get me started on iOS, if you don't know the 'magic gestures' nothing makes sense (same shit on Android though)


I love the application centric model; I find the “window” centric model to be incredibly messy.

I’m assuming you came to macOS from some years using Windows?


Basically Amiga => Windows (since ca 1998) => Mac + Windows + Linux (since ca 2010)

...but I was also exposed to 90's Macs somewhat. I remember that my switch from the Amiga UI to Windows was fairly smooth, but I never got quite used to the Mac UI (not that it matters all that much though because I mostly switched back to the command line anyway, that's the only way to keep sane when switching regularly between Mac, Windows and Linux).


Me too. If I had to keep a window open in — say — Photoshop all the time just so I didn't have to wait ten seconds every time I open something with it, it'd drive me crazy.

Visual Studio on Windows does drive me crazy for exactly this reason, especially as each instance can only have one top-level document (solution) open at a time.


I grew up on Windows and Linux, but I've been a mostly daily Mac user for more than a decade now.

I think the app- vs window-centric model really comes down to user preference. I think I could get used to windows again, but I had Chrome, Audacity, and Windows Explorer open last night with maybe five windows and was alt-tabbing to all the wrong places. My brain just makes sense of “get to application, then find the window” better than a list of windows I have to sift through across disparate apps.

I agree with the scroll bars and settings getting worse. Buttons are either way, but - like app vs window switching - could be remedied with a setting to change the behavior.


By now Linux (Gnome, sure) has copied apple and hidden the scrollbar. But Apple does it better, so they still lead in UI.


That's another, even bigger problem: Apple is so highly regarded by designers (which was even justified in the past, when Apple did actual UX research) that they just blindly copy even the bad stuff.


As a daily Mac user at work (against my will), there is much to complain about in the Mac's UX. The big problem here is that there's no way anyone is objective, given that most UX becomes "good" after repeated use.

I have many examples, but I'll just pick one. When you have multiple instances of an app, the dock adds a "dot" by the app icon. This dot is present whether I have 2 instances or 10. Oftentimes I forget instances there (which can open for many reasons, including private browsing) because it's not easy to tell that I have 3 instead of my usual 2 instances.

Additionally, switching instances is unnecessarily slow and requires more clicks than the older Windows taskbar (which is hopefully coming back, as MSFT imitated Apple's terrible UX). Maybe you love this, but I objectively lose productivity with this UX decision.


> When having multiple instances of an app, the doc adds a "dot" by the app icon.

That's simply not the case.

The dock adds a dot to the application icon if the application is running.

If you have multiple instances of the application running, there will be multiple icons (each with a dot) in the dock. This is fairly difficult to do; the only way I know how is by starting the application again from the command line. Well, that, or when you're running an application you are developing from Xcode.

So not 100% sure what you mean with "multiple instances", but almost certainly not actual multiple instances. I am guessing you mean an application with multiple documents open.

So basically, what you regard as "bad UX" is simply you trying to map assumptions from one environment onto a different environment, and finding that your assumptions don't reflect reality.

And yes, the application will be running (showing a dot) whenever you have 1 or more documents open in the app.

To switch between documents of an application, you can either go to the application's "Window" menu and choose one from there, or go to the dock icon and choose one from there.


If the UX was intuitive, it would, well, be intuitive and I wouldn't need all these paragraphs.

Anyway, I'm using Chrome and that's what happens.


This may be a case of the mental model not matching the technical model. Under macOS, windows aren’t program instances, they’re just windows — they’re all hosted by the same single parent program instance. It’s technically possible to open a true second instance of a program using the Terminal, which spawns a second Dock icon, but few people do this on a regular basis.

This is why open windows are represented on the second level, e.g. listed in the context menu that appears when you right click a dock icon. Programs and windows are not synonymous, and in fact windows belong to programs.

Macs have used this model since they first gained multitasking decades ago.


This is a good example of typical UX complaints which boil down to "this is not the thing I learned on." I personally don't think the early 2000s were the peak of computer UX, but I understand that most people don't like change.


The thing is that up to around the early 2000's, UX decisions were backed by actual research, not the aesthetic whims of an egocentric designer.


Yeah, right


How about iOS Settings? There are, counting now, 8 sections of settings before the ninth, which is “all third-party apps alphabetically” (that last one is mainly where you can grant/deny a short list of entitlements for each app, such as location, cell data, etc.).

Those 9 sections are all unlabeled for some insane reason (I suppose maybe this type of plain table view has never had a UI element for headings?), so even the intent of the designer is unknown. If you want a certain setting, you just have to keep scrolling. There is Search in there, though the search is mediocre and it’s still hard to find things.

Oh, and Settings has a junk drawer too, called General, which seems to mostly mean “someone was forced to pull 15 or so items into a subfolder”; it also uses the “random groups without headings” method of organization.

All of this seems low-effort, as if the designers didn’t learn even basic lessons from 40 years of GUI design (or as if the ones calling the shots are fine art majors more concerned with words like “uncluttered” and “elegant” than “usability” and “discoverable”).


Like how are the "maximize window" and "minimize window" buttons even supposed to work? (That traffic light on top of your windows)

Or why don't we have Copy and Paste keys on our keyboards. Or Undo and Redo keys?

Or why is the USB drive in the back of an Apple monitor?

Or why is everything so opinionated?

Or that mouse that could only be charged with the cable plugged into the bottom so you couldn't use it?

Or that in earlier versions of iOS, you had to upgrade the OS through iTunes (wtf?) ...


For this one

> Or why don't we have Copy and Paste keys on our keyboards. Or Undo and Redo keys?

Copy: Cmd+c

Paste: Cmd+v

Undo: Cmd+z

Redo: Cmd+Shift+z, or maybe Cmd+y if the devs do Windows stuff too


It’s getting harder and harder to see the good UX under a layer of bugs and minor issues Apple can’t be bothered to fix (mainly talking about macOS)


My dad got an iPhone despite us being an Android family because word on the street was that they're easy to use.

For the life of him, he can't look at a text message while in a phone call and then get back to the call screen.

He unknowingly mutes notifications for contacts and has no chance of figuring out how to unmute them.

He accidentally calls his contacts all the time.


I think it was about 10 years ago: a friend got a message on her iPhone while she was driving and asked me to reply. I had a Samsung back then. I couldn't find a way to reply. I had to ask her to guide me through the UI, and I finally managed it.


One example of poor UX for me: with iOS 16 they’ve butchered “Do Not Disturb”. While I appreciate the added user control of Focus tabs, it has made DnD a much more annoying feature to use and access. Instead of a simple one-handed swipe up and press, I need to swipe up, click into the menu, and stretch my thumb while balancing my phone to enter DnD one-handed. For someone like me, whose friends always seem to message at inopportune times and send 50 half-sentences, it’s a rather common feature I enjoyed before that is now much worse to use.

Also, Apple just killed some of my app notifs with the update, and despite all of my settings being correct, some apps like Instagram just don’t work.


The Mac's window system was good on the original 1984 Macintosh, with its very small screen. It was worse than Windows' for the Macs of the '90s, whose screen sizes were comparable to those of PCs. It was clear to me that a single global menu stuck at the top of the screen was a bad choice. I never bought a Mac because of that.

20 years later, given an Android phone and an iPhone, they were equally difficult (or easy) to use. Furthermore, after having used Android for years, finding stuff on an iPhone was definitely difficult. The other way around is probably true too.


A personal example for me is that on the iPhone, the default view shows me 24 icons, which is way too many to be useful. It's so useless, in fact, that most tech-savvy users simply swipe up and use the search feature. But there's no way to know that you can swipe up to get to the search feature, leaving most non-tech-savvy users to visually inspect pages of icons, one at a time, until they find the app they are looking for.


I like the default view. I don’t find 24 icons overwhelming or confusing. The most frequently used ones, I remember where they are spatially.


Window management on OSX.


If you have been using Apple products for decades, of course you are used to it - except for some complaints between major releases. If you just come from any other corner, oh well...


I'm pretty sure they still force auto arrange of the icons on the home screen.

It's probably invisible to most Apple users but it's a bit puzzling to anybody coming from anywhere else.


The problem is you aren't their intended audience.

I too thought Apple UIs were confoundingly bad, but one day I decided to take a step back and approach it as any ordinary man would. I pretended I knew nothing and cared nothing about computers.

You know what happened? It was intuitive. It was consistent. It was harmonious. My mind was blown.

Apple UIs are made for the ordinary man, not for tech nerds and professionals like us. Forget alt-tabbing and window managing, the vast majority of people on the planet don't know and don't care about computers; and it's those people Apple caters to.


I’d suggest you didn’t look at it from the perspective of an “ordinary man”, but instead, you dropped your longstanding assumptions from using Windows.


Practically speaking that's the same thing, normal people don't know and don't care about Windows vs. MacOS vs. Linux. It's a computer, and they hate it.

Apple at least designs their UIs to accommodate how normal people use computers, which is why their stuff doesn't mesh well with tech nerds and professionals without a mindset change. We are not normal people.


You don't think the more parsimonious stance is that you have different preferences?


> 2) Apple not being trapped in the “if we can’t sell a billion of them then why even bother?” mindset that so many other companies get trapped in, often because they’re not bursting at the seams with cash, talent, and runway.

I feel like the iPhone mini line lasting only two years is a strong counterpoint to this. There's a small hope for an iPhone 15 mini, but with no 14 mini, it's a long shot. It accounted for 6% of all iPhone sales in its first quarter, after a delayed and rocky launch. [1] There is a small but somewhat vocal minority that wants smaller screens, which Linus Tech Tips' sub-channel Mac Address covered recently. [2]

[1] https://www.tomsguide.com/news/iphone-12-mini-sales-a-disast... [2] https://youtu.be/BjhiYa0KsSM


There was so little difference between the 13 and 14 low-end models that there was really no purpose for a “14 mini”. If you want a mini phone, you can buy the 13.

Apple also doesn’t update the SE every year.


A smaller iPhone isn’t exactly an exploratory venture into a new market. And loud minorities don’t make a market.


Loud minorities can form a market, if they are willing to put their money where their mouth is. But too often they don’t (cf. the “brown manual diesel wagon” from the car world. Enthusiasts say they want one, but when a manufacturer puts one out, they don’t buy enough of them).


“Revealed preference”


Maybe. But as we’ve seen (https://www.imore.com/iphone/lackluster-iphone-14-plus-sales...), the Minis did outpace the non-Pro 14s. Obviously, as the 14s sit on the market longer, they will outsell the minis; it’s essentially a self-fulfilling prophecy of being available and the latest SKU. But the velocity was not there, so expect Apple to react accordingly. I’d like for them to settle on set sizes. Hands aren’t evolving every year; they need to stop screwing around.

I use an iPhone 12 mini that I ordered on launch day and love it. Finally a worthy successor to the 5S/SE1.


Why does Apple need a new mini every year, or even every two years? The 13 mini bought today will probably still do everything that 90% of customers need it to do for another 5 years. Even the SE, which is 3 years old now, can probably do most things for another 5 years.

Also, the 12 mini was released 6 months after the 2020 SE, and the 2020 SE was released ~4 years after the previous SE. It stands to reason that many people who wanted a smaller phone jumped on the 2020 SE and then had no reason to buy a 12 mini.

Also, the 12 mini battery life sucked. The 13 mini is perfectly usable though.


> 2) Apple not being trapped in the “if we can’t sell a billion of them then why even bother?” mindset that so many other companies get trapped in

It seems to be the opposite with folding phones. A lot of Android manufacturers releasing them, despite low sales numbers, while Apple apparently thinks: if we can’t sell a billion of them then why even bother?


Apple is at the “the technology isn’t quite there yet” phase. When they enter the market, you know you’ll probably get something semi-reliable.


I'm typing this on a foldable phone right now, which is plenty reliable. One of the enduring myths about Apple's late entry into product sectors is the idea that they're waiting to "get it right". The first iPhone was a joke, it took them until the 3GS to get it right.


> One of the enduring myths about Apple's late entry into product sectors is the idea that they're waiting to "get it right". The first iPhone was a joke, it took them until the 3GS to get it right.

I don't agree that this is a myth.

If you look at the first generation iPhone in a vacuum, it was limited, perhaps (but I think there are plenty of arguments that the first gen iPhone was pushing envelopes - name one other phone that had a fully functioning web browser at the time, and pinch/zoom felt magical), but nitpicks about functionality aside, I don't think this framing makes sense for other reasons.

A product line is not just its initial release. It is a process, an organizational discipline, a manufacturing and logistics pipeline, an ongoing upgrade cycle, an ongoing engagement with customers, an understanding of the market, etc.

When the first generation phone was released, plans for the 3GS were already well under way. The 1st gen phone was a necessary step to get you that 3GS. When operating at Apple's scale, at least a decade of roadmap if not more was already planned in depth. The 3GS was not a reaction to the success of the 1, it is the real product Apple envisioned and planned to ship before we even knew the iPhone would exist.

If the 3GS got it right, so did the gen 1, in a certain sense.

And as an aside, I have to say I really liked my 1st gen iPhone compared to everything else I'd had up to that point, and I was a gadget junkie who had owned quite a few of the other "hot" phones at the time.


I'm not looking at the iPhone in a vacuum. I'm also taking into consideration other things Apple has inexplicably delayed for multiple years:

- support for mouse input

- a native file manager app (2017!)

- pressure-sensitive pens and screens (~ 4 years after Samsung)

- Multiple apps on same screen (2019!)

None of these are things that take years to get right. Apple commentators simply get lazy and recycle this iPod-era talking point.


> a native file manager app (2017!)

What would you do with one on a device where every app is heavily sandboxed and there's no common file system?

> pressure-sensitive pens and screens (~ 4 years after Samsung)

When they did release this, no competition was even close to the precision and fidelity.

When Apple releases something later than competition it's usually (but not always) justified and results in a better execution


> What would you do with one on a device where every app is heavily sandboxed and there's no common file system?

Access each app's sandbox to move/copy/exfiltrate the data contained there, for whatever reason the user desires. Not much different from the way file managers are used on traditional desktop systems.


> Access each app's sandbox to move/copy/exfiltrate the data contained there, for whatever reason the user desires.

That breaks the security promise that apps don't have direct access to other apps' data.

There's a reason you can't even access photos without an explicit user prompt.

> Not much different from the way file managers are used on traditional desktop systems

Traditional desktop systems never had heavily sandboxed apps with no outside access


> That breaks the security promise that apps don't have direct access to other apps' data.

No it doesn't. Giving the user the ability to get their hands on their own data is not the same as giving other apps direct access to it.

> Traditional desktop systems never had heavily sandboxed apps with no outside access

That's completely orthogonal.


> Giving the user the ability to get their hands on their own data

Does the user have the need for that on an iPhone?

I don't think a single app on my phone has any files I'd ever need access to

> That's completely orthogonal.

Of course it's not orthogonal


> Of course it's not orthogonal

You questioned whether there was a use case for file managers when apps do not share a "common file system" and are "heavily sandboxed", and stipulated that providing one would break the security promise of not giving apps "direct access to other apps' data".

Whether desktop systems have traditionally sandboxed their apps to the same degree is, in fact, orthogonal. It's a separate question entirely.

> Does the user have the need for that on an iPhone?

Sorry, but when I spot somebody moving the goalposts (and failing to acknowledge when one they staked out earlier was satisfied), my approach is not to indulge them by continuing to respond as if they're just like anyone else having a discussion in good faith. Instead, I say, "you are moving the goalposts; you are not asking in good faith". This is monumentally helpful in making economical use of my time.


Regarding looking at the iPhone in a vacuum, I was referring to the iPhone gen 1 vs. the iPhone product line as a whole. The point was that the claims don't make sense unless you look at just the first generation of the product, which was clearly not Apple's vision.

Regarding other features, I don't understand the connection you're trying to draw here, or how they serve as counterpoints. All these anecdotes point to is where Apple has chosen to focus resources and in what order. Having worked on a product team, there are dozens of "obvious" things that get cut from every release. I don't see this as some enormous indictment of the product.

Mouse input on what? iPhone? Is this a thing?

A native file manager is a feature that applies to a small subset of users. Do some people want it? Sure! Does a lack of it somehow imply Apple is failing? I think that's hard to argue. Anecdotally, as a highly technical user, I rarely if ever touch that app. I don't think most of my friends/family know it exists.

Multiple apps on the same screen ... on the only tablet in the market worth considering. Again, this is a gripe about roadmap order and not an effective argument against the core claim.

And core to the claim is the fact that Apple is often late to the party.

> None of these are things that take years to get right. Apple commentators simply get lazy and recycle this iPod-era talking point.

I don't think anyone is making the claim that these things take years to get right. More interesting than this would be to examine what they chose to ship instead.

If you don't believe the iPhone, iPad, Apple Watch, MacBook Air and AirPods are market-defining products, we'll have to just agree to disagree. It's not as if the iPod was the last product to resemble the "Apple does it better" pattern.


> When operating at Apple's scale, at least a decade of roadmap if not more was already planned in depth.

I highly doubt that considering that web app / native app store debacle.

Also, was there any real reason not to support 3G from the beginning? A lot of “dumber” phones had it, so it seems more likely that they just made a mistake.


> I highly doubt that considering that web app / native app store debacle.

You mean the same Apple where Jobs said that no one would want to watch video on a tiny screen less than two years before the video iPod?

Apple didn’t go from idea to a fully featured SDK and an App Store in nine months without already planning to do so.


I'm referring primarily to the hardware side of this from a roadmap perspective.

I don't think early politics about how apps get shipped plays too much into the core planning for the device, e.g. form factor, manufacturing, user interactions, structure of built-in apps, etc. are all pretty much orthogonal to the app distribution model.

But even then, there were clearly two camps, which means there was a disagreement about a specific path, but not a lack of forward thinking. And then they course corrected. Something that many companies simply do not do.


Yeah absolutely. I just don’t think that consumer tech companies which make 10 year plans (instead of a broad high level vision) and try to stick to them tend to be that successful.

> are all pretty much orthogonal to the app distribution model.

I’m not sure it’s only the distribution but the entire existence of native third party apps that they weren’t sure about.


I was including 1st party vs. 3rd party in my mind when I mentioned distribution. Admittedly that word is doing some heavy lifting.


I'm pretty sure the whole web app thing was just a smoke screen (aka lie) because the app store wasn't ready yet.


IMO it's only a debacle for developers who don't want to learn the native platform or don't think they can do it.


Not sure what happened to the original reply I typed.

The recent Pixel foldable shows the way to totally screw up a foldable screen. I’m sure you’re totally happy with yours, but all the foldables I’ve seen have distortion at the fold, and it’s definitely a weak point that will eventually degrade. Not for me.

I was actually around when the first iPhone came about. As a tech enthusiast it was incredibly disruptive. Everyone was still shipping devices with styluses, and this device with a capacitive screen came along and blew everyone’s socks off. I don’t think anyone can rewrite that history.


It’s not like there was anything that similar to the first iPhone. But yeah it was arguably closer to a tech demo and a toy than a real device. Apple and Jobs himself weren’t even exactly sure which way they wanted to go with it.


This is a good counterpoint to what was posed above.

That being said, I feel like this is almost the obverse: they have a product that sells well, so they don't need to follow each trend that comes out. They can pick and choose which way they run with things.


Not to speak of the discontinued iPhone mini.


The iPhone 13 mini is still available for purchase:

https://store.apple.com/xc/product/IPHONE13_MAIN

Any source that they stopped making it?


I meant in the sense that the hardware isn’t getting updated anymore. I hope they don’t stop making it, but I don’t know if they are actually still making it or only selling existing stock. It sells rather poorly. See https://forums.macrumors.com/threads/iphone-mini-15-coming.2....


VR with extremely light eyewear and resolution so high that it’s indistinguishable from real life is essentially being able to pick your reality. I don’t think there’s any doubt that would be far more than a niche thing.


It’s not clear we will reach that before we will reach direct brain interfaces.


I agree about VR, but I'm really excited for advancements in AR!

I'm a really avid mountain biker, road cyclist and snowboarder, and I'd love some basic telemetry displayed in the goggles or glasses I'm already wearing. Nothing kills a ride faster than having to stop to pull out your phone to see if you missed a turn or to figure out which unmarked trail you're supposed to take. (Not to mention all of the information we currently have to look down at our cycling computers to see: HR, watts, cadence, etc.)


As a trail runner, same - I really look forward to a future where decent glasses can show route details/programmable workout info/etc. and I can leave the watch at home. Less crap to carry + more info readily available would be awesome!

I know that's not the problem Apple's solving here, but I'm hopeful that anything they can do to push AR will filter down quickly.


Absolutely! I feel like VR is still in its Napster phase.


Seems like existing helmet HUD technology should be sufficient for some basic telemetry.


I agree. My guess is that they are at a fork in the production ramp. They need to choose between high production / high risk and low production / low risk.

If they are 100% certain this is the final form factor they would take one path. If they think we need to get this out in the world and see how it is useful they would take the other.


Even if they only build 100,000 units, it should be enough to build hype and get developers onboard for their more affordable mass market 2nd gen device coming in 2025.

For context, Oculus sold only 56,334 of their DK1 devkits.


The Oculus Quest series has sold around 20 million units so far, and there’s still not a lot of hype.


I’m curious what the usage figures are like: everyone I know who bought a VR rig basically stopped using it, and if the WSJ was right earlier this year, the best MAU was a third of total sales:

https://www.wsj.com/articles/tiktok-parent-bytedance-battles...

I suspect that gets back to the space / isolation problem: if you don’t live alone, there’s probably just not enough time spent using VR to be worth the purchase price to most people. That doesn’t get better until they’re cheap enough that the entire household can have their own, or until people find professional reasons to own one.


Oculus were nobodies more or less back then and didn't have apple's supply chain expertise, and even still their devkits were only like $300 USD 10 years ago.

I feel like the major cuts are because of a lack of hype. I don't think anybody knows whether there is a market for this, so nobody is sure how much money to put into developing for it.

To me this feels like the first Apple product in a long time without a compelling use case or a solid idea of who the potential customer is.


I don't think they will have problems selling a few hundred thousand units at $3500. The first testers report that the displays are indeed extraordinary and that using them as desktop monitor replacements is feasible. That alone is a great use case.


> using them as desktop monitor replacements is feasible. That alone is a great use case.

For some, maybe, but not for everybody. I certainly would never use them for this, no matter how good they were at it.


Again, the issue is development dollars. It's great that it has high-resolution displays; the question is whether your company wants to dump hundreds of thousands of development dollars into making something for the Vision Pro when it's not clear there's a consumer market for it, or who the target market is.

Sure, Apple could sell 100,000 of anything. The question is whether you're going to be able to make any money developing for Apple's $3500 VR headset. The fact that these can be used as desktop replacements doesn't make them worth it as desktop replacements. $3500 is an "out there" price with no ecosystem-level market, in my opinion.


“Out there” for who?

Apple has customers who will buy multiple $15,000 configuration Mac Pros without batting an eye. That’s the price of the tool and for what they’re working on (perhaps a $250,000,000 movie) it’s justified.

The Vision Pro, as announced, clearly isn’t much of a consumer product. The price is too high. But that’s not surprising, it does have ‘Pro’ in the name.

When the Vision or Vision Air come out in a few years for $1000-$1500, that will be the (higher end) consumer move. And by then software will exist.

For now it seems more for executives. Maybe travelers. On The Talk Show Gruber and Panzarino were talking about how useful it would be for working on secret designs or private financial information or HIPAA information in public. You could be in an airline seat or at Starbucks or whatever and work on the information without anyone else being able to see it, unlike a laptop screen.

And it’s not hard to imagine how it might be very useful to architects or certain other businesses that might be willing to spend the money for an expensive tool to buy expensive software if it really helped them out.

I don’t think Apple will have problems selling them, even at $3500. Is it the next iPod? No.

But right now, they have got to be 100% constrained on those insane displays. Nothing else in the headset seems anywhere near as hard to manufacture.


> The Vision Pro, as announced, clearly isn’t much of a consumer product.

Someone ought to tell Apple then; I see eight videos on https://www.apple.com/apple-vision-pro/ and seven of them show home use.

Of course, it's possible the product targeting will change. I certainly agree that "it's for business" is how the execs responsible for HoloLens and Google Glass have kept them on life support to delay pulling the plug.


<< The Vision Pro, as announced, clearly isn’t much of a consumer product. The price is too high. But that’s not surprising,

I would absolutely buy this argument were it not for Apple's own portrayal of it during the unveiling, which showed a person watching movies, not commonly a professional role (as in, it is intended as a consumer product).

That is not to say that I think it is a bad idea. I am not an Apple fanboy, and I want to get my hands on it if I can convince my wife it is not a waste.


Apple recently released Logic Pro and Final Cut for the iPad. It wouldn’t surprise me if that work was done in part to support those apps on the headset at release.

It may have looked like that person in the demo was watching a movie, but maybe they were making a movie…


Who will be uploading gigabytes of video to a headset to edit movies with a heavy device on their face, instead of to a workstation where you have actual monitors and don't have to keep something heavy on you? Apple already sells machines that are good and comfortable for doing those things. Doing them in VR doesn't automatically make them better.


I think video editing would be done by having it connected wirelessly to your Mac with its storage and apps, and to use this as your display and input device.


Who? People who record something and want to edit & post ASAP.


A MacBook or iPad will be more portable if you want something you can take along wherever to do that.


If you want a really big workspace, you are going to have to also haul your large monitors with you.


If you are doing quick edits you don't need a really big workspace. If you don't want to haul things around you probably don't want to also haul around a vision pro carrying case.


> you don’t need

To be clear, I don’t think any of this will be driven by need, especially at first. It’s going to be something people want to use. If the virtual screens are just barely usable, they are going to sell a bunch of these things to people who want to work on virtual screens. I can’t see myself ever doing it, but I know there’s some demand for it.


The person wasn't exactly living in subsidised housing and on food stamps.

People who have the money WILL buy these as early adopters, but it's not a mass-market product.

Like the comment above said, the real game changer will be the Vision and Vision Air or whatever they're called. Either the tech gets cheaper or Apple will pare down the sensors and features to a point that satisfies the applications devs have built for the Pro and shown that they are viable.

The pass-through eye display for example is just 100% Apple flexing, not something a "normal" user will ever need. That's a few hundred off the price.


I have a 3500 EUR MacBook Pro only because my company bought it for me for work. There is no way in hell that I would spend that much on a MacBook, and I don't think my company would buy me a 4000 EUR Vision Pro.

I have a gaming PC that I built for 2000 EUR for entertainment and some work. 4000 EUR is so far out there that I cannot imagine who is buying it with their own money.


People don’t seem to understand how much, comparatively, early adopters have paid for Apple hardware.

My Apple //e setup cost around $3500 in 1986.

My Mac LCII setup was $4K in 1992.

There are enough people in the world who will drop $3500 on the Vision Pro without blinking for Apple to sell all it can make.


I think you are underestimating how much disposable income some people have.

People buy 2000€ of skiing equipment just to go skiing a few times a year. They buy mountain bikes or road bikes for 5000€. They buy brake upgrades for their sports car for 7000€.

I think Apple will easily find a few people willing to spend 4000€ on a toy.

Whether the market will be big enough for 3rd party developers to be sustainable is another question.


I’m a software developer. They’re not going to buy me one.

Top designers for Ferrari? Famous architects working on buildings costs tens of millions? Other expensive people like that I can see it.


Only if there is content being created for people who have an Apple Vision Pro. If there aren't enough consumers, I don't think that will be true.


Agreed that the price of the tool can be justified by the value it helps create. I'm always bemused to know that most giant movies are not made with Apple tech, though? Render farms with custom Linux toolchains seem to be the norm, more than the tools that we so often see in marketing.

Or has that changed? I'd be lying if I said I keep up to date on this stuff.


The Apple stuff is for the artists or color graders or editors or whatever. That’s my understanding. The rendering farms are Linux.


I think it has largely been the artists in non-movie firms, though? Prosumer and journalistic endeavors, at large. Wacom and similar setups dominate the asset-creation industry.


I was specifically thinking of the movie industry, as I know they’ll spend outrageous sums on computers because the tab is nothing to them and they need the best.

You’re right, Apple stuff is certainly used in tons of other industries by people who either need them for a specific reason or just prefer them.


Right, I could tell your quote was about the movie industry. I don't know how accurate that actually is, though. Seems every time I've seen a deep dive into what tools are used by movie editing groups, it has not been Mac.

Do you have quotes showing otherwise? I wouldn't be shocked to be wrong here, mind you.


Apple's Final Cut Pro used to hold 60% market share.

Movies are not just rendered on rendering farms. There's a ton of other stuff that gets done. Rendering the final movies is the smallest, most boring and trivial part of the whole process.


> Apple has customers who will buy multiple $15,000 configuration Mac Pros without batting an eye. That’s the price of the tool and for what they’re working on (perhaps a $250,000,000 movie) it’s justified.

Exactly. And Apple showed exactly zero use cases for "vision pro" that would be useful to these people.


> For now it seems more for executives

For me it seems more like a devkit targeted at developers who are supposed to figure out what can you even do with it.


Apple has been smart to present it as a desktop replacement (as opposed to some AR/VR/metaverse device), which is something they can deliver that will let users bring their existing apps to the new medium without adaptation.

Ideally, yes, apps should update themselves and tailor the experience. But the main focus so far has been pretty conventional app-like experiences, which happen to be hovering in space. Whereas most headsets have tried to create entirely new ecosystems from nothing, and that would have been a huge mountain to climb.


I don't think presenting it as a desktop replacement was "smart" so much as "the only possible way to justify the asking price", but I don't think it's going to work out that way, personally. Having tried many VR headsets, I can't think of any I'd want to wear all day.

They can sell 100k of them and crow about being sold out, but I don't think even Apple has figured out who this is for; that's what I got from their presentation on it. The only answer they have so far is "people who will spend $3500 on a VR headset", which isn't a use case.

For the most part we got marketing-level "watch people emote joy as they do things with this device", and all the use cases sucked.


>Apple has been smart to present it as a desktop replacement

It means that now you are competing against desktops, which have been iterated upon for decades and already offer a lot of value. Instead of standing out with apps that are only possible in VR, people will weigh whether they would rather use the app outside of VR.


Imho this is the foible of many technical people in that they want to advertise unique use cases.

But the problem is:

1. The people who aren’t already engrossed in the field, don’t have a good view on how to bridge between their current world view and the new one.

2. The people who are already in the space don’t need to be sold on unique cases.

Very few post-jobs-return Apple products show dramatic new use cases even if the product then goes on to enable it, and even if Apple themselves have clearly thought of it.

Their marketing is: this is how you take what you’re already doing into this space. Unique VR experiences only matter to a fringe set of users. The every day mundane stuff is what matters to the rest.

Take the ability to run iPad apps on it natively. VR enthusiasts will scoff at it. The real trick, though, is that it means you aren’t having to switch devices to do a mundane task, which means more time on each device. That’s what appeals to the bigger market, and it has been proven time and time again, because it’s not making them do contortions to use it.

Another issue is thinking that the demographic for sales has to be the demographic for ads. People will reply and say: well the price isn’t for the lay person. To which I’d say, who cares? They’re not the early adopter but they’re still the demographic for who the people buying this will be developing apps and content for.


>Unique VR experiences only matter to a fringe set of users

I disagree with this. Why would someone buy a headset instead of using an iPad or desktop? If there is nothing unique to VR, why should people put a heavy thing on their head for about the same experience?

>Take the ability to run iPad apps on it natively. VR enthusiasts will scoff at it

No, where have you seen this? Everyone likes the ability to run these apps, but my point is that these apps are not a draw. Most people find it more convenient to use these apps on their phones or tablets.

>To which I’d say, who cares?

Developers care. Most big developers don't care about devices unless they have a large amount of users. Their time is better spent on devices / platforms which have hundreds of millions of users.


> Why would someone buy a headset instead of using an iPad or desktop? If there is nothing unique to VR, why should people put a heavy thing on their head for about the same experience?

Because the experience can still be enhanced by the form factor and be compelling. Watching a movie isn't unique, but watching it on a 100ft display while trapped in a plane is compelling. The entire pitch is progressive experiences, and has been for every product they've shown since the iPhone: take what you're used to doing and make an experience that progressively scales to the form factor.

> No, where have you seen this?

Countless posts here on HN and in the virtual reality community (like /r/virtualreality) bemoan the device as a glorified iPad and hate the number of 2D windows shown.

> Most big developers don't care about devices unless they have a large amount of users

Big developers will take the risk, and are already doing so. Apple showed off multiple developers like Unity, Disney and more making apps. ( more here https://www.apple.com/newsroom/2023/06/developer-tools-to-cr... )

Part of that is that visionOS allows for progressive experiences, which is not something other HMDs allow for. Developers aren't building an app for visionOS; they're adding to their existing codebases for iOS. Their investment therefore isn't in a niche new platform, but in the entire ecosystem.

There's a huge first mover advantage in software on these platforms as has been shown by the iPhone where people were going ga-ga over fart apps, and beer drinking apps. Those people made bank because they delivered fun knick-knacks before the market got saturated.

Even looking at other HMDs, it's often the big players and the indie players that move first. The middle of the spectrum are the ones who move last when the ecosystem is there. The Oculus Quest launched with a Star Wars game available.


To me it comes down to which Apple can more reliably deliver, that will see regular use. I have a hard time knowing what unique VR experience would keep people coming back day after day. But we know for a fact people use screens for desktop-like concerns for many hours a day. And we have lots of experience developing those experiences.


They’ve been slowly shipping pieces of the ecosystem for a while. The iPhone has had AR for years, and tvOS has moved to 3D icons already (you can move your finger on the remote touchpad to see the parallax and material reflections).

Also, the iPad has had lidar for a while, which you can use to 3D-scan stuff when building out AR apps.

I hope someone will find more compelling use cases for the hardware, but it will definitely have an ecosystem on day one.


I think 3rd party developers are largely irrelevant for its initial success. It’s like the original iPhone: if the hardware and built-in apps are compelling, then people will buy it, which will eventually create an ecosystem.

That’s IMO why VR has largely failed: out of the box there hasn’t been a compelling reason to get one, and there isn’t enough software to keep people interested.


I really can't imagine built-in apps being enough to carry a VR/AR headset specifically. If it doesn't have third party games, it'll sink.


As someone with a VR headset that supports all the games, those headsets are already sunk. Right now, gaming is all that exists in the VR world, because all current headsets are horrendous for any task other than gaming. I own a Quest Pro and have attempted to use it for work, but the poor resolution and horrible passthrough make it quite uncomfortable to use. For games, it’s great.

I’ll be ordering a Vision Pro primarily to use as a display for my Mac, but also for development purposes. It is definitely a niche product at launch, but I think it’ll grow into the consumer space over time - it is really only niche because of the price tag.

Games will come, but gaming is the “easy” part of VR because you control the entire environment. AR interactions are what will make this into a viable computing platform that actually moves beyond the current 2D paradigm.


The iPhone had some very clear and straightforward use cases which were appealing to some extent to every person who had a phone and used the internet.

To me VR seems closer to where PCs were in the 80s at this point.


Being able to work or watch a movie with great sound and effectively a 100” screen while on a long airplane/train/bus ride seems rather compelling, IMO. Not worth $3,500 alone, but it doesn’t need to solve every problem for every person, just be worth using for enough people to get things rolling.

I doubt anyone is going to be using one of these in a spin class any time soon, but there are plenty of situations where a laptop/phone/tablet doesn’t really work well.


And people (including my parents) paid close to $3000 for a full Apple // setup back then - in 1986 dollars…


If a company is risk averse or doesn’t have the resources to make a big commitment, they don’t have to. Their iOS app will probably work in the headset with fairly small changes. If they see big demand, then they have the option of developing something more tailored to the headset.


Is it feasible or is it something they'll do day after day for one year up to the point that they'll store their monitor away in the basement?

By the way, how do you show something inside the VR headset to somebody sitting near you? Do you buy two headsets, one for you and one for guests? One for each family member?


If I was a developer, I’d adopt this platform wholeheartedly.

Anyone who is willing to pay $3500 for this device is also likely to pay a lot for apps and games. Way better than extracting pennies out of Android users.


If these headsets end up collecting dust on a shelf after a few weeks, as most headsets seem to, then it doesn't matter how wealthy the owners are. They won't buy apps for a device they no longer use.


Apple tends to be good at getting technology into people's hands and then iterating based on the use cases that emerge. The iPhone without being able to write apps for it seemed like a really bad idea at the time, but they were able to iterate quickly and get feedback that let them launch the app store more effectively.

Their Vision Pro just needs to get into enough companies' hands for people to start exploring what markets might exist. The direction Apple will take things if it turns out that most of the demand is in the medical field will be a lot different than if the demand turns out to be in education. By growing where the demand is initially, they can lower the cost of entry, which will make it possible to go into other fields where there is value but lower margins.


> Apple tends to be good at getting technology into people's hands and then iterating based on the use cases that emerge. The iPhone without being able to write apps for it seemed like a really bad idea at the time, but they were able to iterate quickly and get feedback that let them launch the app store more effectively.

The iPod is an even better example: the first three or so versions didn't hit the mark, but with the 4th generation they finally had the right feature set to sell them like hotcakes.


It sounds like you agree apple has no idea what the use case is or who its for.


>> To me this feels like the first apple product in a long time without a compelling use case and no solid idea of who the potential customer is.

This is exactly how the Apple Watch started! Today it's the most popular wearable on the market by a huge margin.


> Oculus were nobodies more or less back then and didn't have apple's supply chain expertise, and even still their devkits were only like $300 USD 10 years ago.

Those also required a high-end PC costing thousands of dollars and lacked key technologies like eye-tracking. There simply won’t be bargains here for many years: VR is less demanding than AR, but even there the low-end approach hasn’t seen popular adoption, because your brain’s threshold for something seeming real is quite high and has failure modes like nausea and vomiting.

The Vision Pro is Apple’s take on that problem trying to see if they can sacrifice on pricing but actually hit a threshold where the quality is high enough to work for most people. It’s going to be interesting to see if the productivity applications they demoed are going to be as good as they looked – if that experiment doesn’t work at this price, it doesn’t bode well for the rest of the field.


Man you really hate this thing!

Apple has the luxury of being patient, and the benefit of a developer ecosystem and bottomless funds. This release may or may not be huge (I don’t think it will), but the one after, or the one after that, probably will be.

I agree Apple doesn’t know the use cases or customers that will be big for gen 3 in a few years. So what? They didn’t know that for the Watch or iPhone either. The market evolves, as markets do, and the developers experimented, and later-generation hardware evolved to support how people used earlier generations.

I can understand skepticism, but I really don’t understand adamance at this point.


I don't think it's hate, it's simply skepticism. I adopted VR early on with the Vive and have since stopped using it completely, and I'm similarly skeptical that even Apple can pull a VR device off and make it more than a novelty. Still, we'll see if they manage it.


[flagged]


Please lay off the ad hominems. And read the other posts that led me to characterize their view as hate.


Please take your own advice. I've reviewed his four comments in this discussion and none of them contain emotional language strong enough to characterize as indicative of hate.


> the benefit of a developer ecosystem

Apple hates 3rd party developers, the ecosystem exists because there's money to be made off Apple users. If there's no users of an Apple device there will be no developers - and frankly, there's no incentive to even hack around on their devices.


I would have said they loved 3rd party developers.

Apple likes to build platforms and let 3rd party developers figure out the application layer. After a third party finds a big enough market Apple will crush them and take it. It's essentially just outsourced R&D where Apple doesn't pay for the bad ideas.


They love 3rd party developers who play by Apple’s rules and just ignore everyone else because they can.


You've totally imagined my emotion, and somehow taken 4 posts in a discussion as "adamance", as if having an opinion of this device and talking about it on HN is outrageous.

You're outrageous. It seems to me you're the one outraged that someone would do anything less than praise it breathlessly.


I can think of so many use cases if it were good. If it were good, you could replace all Displays with it. Games, Movies, Remote Meetings etc. Oh and Porn. So much porn.

The problem is the price tag. It's 10x too expensive for mass adoption and 5x too expensive for developers to bother working for it.


Not useful for movies. Netflix and chill is not about the media, it’s about the people. Many console games as well. Most individual gaming is done on smartphones at this point. Same with porn. In remote meetings you want to see the other person’s eyes clearly (it helps identify body language).

If it were a smartphone killer, maybe you could eat into that market, but smartphones are commodity products at this point. You can’t compete in that space at $3.5k. Even if it were good, it would still be a niche product for enthusiasts. It needs a low price and a mass-market killer app.


> Most individual gaming is done on smartphones

Perhaps, but only because most gaming in general is done on smartphones.

I reckon about 80-90% of the games I own on PlayStation are either single-player or online multiplayer, which is the same as single-player in what relates to the matter at hand.

PC gaming is even less social.


Games - will be available on a cheaper Oculus or Valve headset you can give to your kids.

Movies - same thing, also not worth the headset price.

Remote Meetings - why do I want to wear a headset that prevents video of me from being recorded? Their own presentation showed exactly that: people who didn't wear the headset had video of themselves, and people wearing the headset just had an avatar card. It was the opposite of helpful. This headset makes remote meetings worse.

Porn - it's an Apple device; any porn will just be web-based, because otherwise Apple won't allow it.

None of these use cases are $3500 compelling. There's no target market and no compelling use case. It's a brand new helicopter-skidoo hybrid, and as impressive as it sounds, it's not clear why it exists yet.


Apple sells a $6999 monitor. The Vision Pro seems like a very niche, very expensive portable monitor.


So you agree with me that there are many use cases for an advanced AR/VR device, it's just

> The problem is the price tag. It's 10x too expensive for mass adoption and 5x too expensive for developers to bother working for it.


No, I don't agree with you. Everything you said was either a bad use case already answered by existing VR tech or made actively worse by the headset.


> If it were good, you could replace all Displays with it

Except when you want to show the display to multiple people at the same time.


I think 5x is closer for mass adoption. Oculus is selling their new ones starting at $500, and Apple could easily match that tech and more. They'd probably have to give up gimmicks like the front eye screen, though, and they'd have to target gamers, a group they completely left out of the presentation.

Apple's name behind it, and advertising it as a work device as well as a gaming device, would let them steal a lot of the market.

And I say that as someone who doesn't currently own any Apple products.



TLS gives a "no cypher overlap" error.


This is likely a result of Cloudflare DNS redirecting you, since archive.today does not have its DNS set up in a way that Cloudflare likes.


Holy hell this drove me crazy the other day. Firefox wouldn't work, but my backup, Chrome, did. But a fresh Firefox profile also worked.

So I figured I must've left a bad user-agent string or something in my about:config. But after lots of trial and error with FF settings, curl/dig, with VPN, without VPN... it turns out it was because I was using Cloudflare DNS. I'd forgotten I'd switched a while back when I was getting dropped packets to quad9.

My best assessment is for whatever reason Cloudflare's DNS gives a different A record pointing to a non-TLS (or broken TLS) redirect, so Chrome worked because that's allowed by default. A fresh FF profile also worked because it defaults to DoH thus bypassing the problem completely. My VPN worked because it has its own DNS. But because my daily driver FF profile is set to use system DNS with forced TLS, it'd hit the broken redirect it got from Cloudflare and die.

So as usual, it was DNS.


Correct - archive.is doesn't like not getting EDNS from Cloudflare (https://jarv.is/notes/cloudflare-dns-archive-is-blocked/), so Cloudflare sends you to 1.1.1.7 to indicate a problem.


Just to be clear: there's no Cloudflare special case here. We're not sending you to 1.1.1.7. We just send whatever the archive.is auth servers decide to send to us. They are auth after all. If we returned 1.1.1.7 it's because they did.
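The resolution-level behavior described in this sub-thread boils down to a single sentinel check. Here's a toy sketch of it (1.1.1.7 is the junk address from the linked post; the "normal" answer below is a made-up RFC 5737 documentation address, not archive.today's real IP):

```python
# Toy model of the behavior described above: archive.today's authoritative
# servers answer with the junk address 1.1.1.7 when they don't like the
# resolver (e.g. Cloudflare, which withholds EDNS client-subnet data).
SENTINEL = "1.1.1.7"

def looks_blocked(answers):
    """True if the resolver handed back the sentinel junk address."""
    return SENTINEL in answers

print(looks_blocked(["1.1.1.7"]))      # True  (answer seen via Cloudflare DNS)
print(looks_blocked(["203.0.113.9"]))  # False (a normal, made-up answer)
```

A client that ends up connecting to 1.1.1.7 gets a broken TLS handshake, which matches the "no cypher overlap" symptom upthread.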


So archive.is is upset Cloudflare isn't forwarding the EDNS data, even though the feature's RFC itself states:

> If we were just beginning to design this mechanism, and not documenting existing protocol, it is unlikely that we would have done things exactly this way.

--/--

> We recommend that the feature be turned off by default in all nameserver software, and that operators only enable it explicitly in those circumstances where it provides a clear benefit for their clients. We also encourage the deployment of means to allow users to make use of the opt-out provided. Finally, we recommend that others avoid techniques that may introduce additional metadata in future work, as it may damage user trust.

Seems archive.is is in the wrong here, which is a little surprising. Don't meet your heroes I guess. But then I also can't see their rebuttal because Twitter is currently a dumpster fire.


get firefox


Firefox is the only browser that gives that error message. The same error yields a different message on Chrome.

I switched back to Firefox Developer Edition in 2017-ish.


I've been seeing the no-cipher error intermittently on Firefox as well.

The site is working for me at the moment.


From the beginning, this product was a very classically Apple strategy of 'let's just take the Next Gen technology, integrate it well, and beat everyone else to market.'

Similar to the iPhone 'winning' capacitive touch (or the iPod winning with Toshiba's mini HDD's), Apple plans to 'win' on SOTA hand tracking, camera-based full-color passthrough, face tracking, etc.

These are all things that Meta (and the rest) have talked about, but have not yet executed on.

It's not surprising (given this strategy) that the first runs are at low yield; the $3tn question is whether Tim Cook can fix the problem fast enough (he was largely chosen as CEO because of this exact skill).

My guess is that this follows a 1/2 scale iPad trajectory - People like it, the early days are choppy but no one denigrates the product's quality, the product defines the category (and potentially becomes the category like iPad did), then over time the category becomes a utility product (e.g., iPad cash registers driving a lot of sales) rather than a 'new' iPhone.


The Vision Pro looks like a lot of fun but I'm not sure what the use case is.

Unlike the iPhone, this is clearly an indoor device, not an AR system to navigate around town.

It seems a much more immersive desktop experience, but are you really going to be more productive using AR than a 4k monitor or two? Maybe so, but I wouldn't take that for granted until people start actually using these for work.

It does seem like it has a lot to offer for gaming. That isn't Apple's traditional market, but it could be the direction they're going in.


It seems they are really against VR gaming. No controllers, nothing shown in the announcement video. They apparently really want to focus on their immersive AR desktop. But who wants to pay $3500+ for ... floating screens?


That alone is not mass market, but I'd imagine there are applications in industrial settings, especially when you have robotics operating on a factory floor. I imagine walking through a factory and seeing realtime analytics data, and even having the ability to make on-the-fly settings changes on the working robots, could be quite interesting.


Meta is kind of the Xerox PARC of VR, in that they're doing a lot of great research but utterly failing to productize it.

Think for example of their work on virtual avatars [0]. That research is FOUR years old but looks about 1000 times better than Apple's "Personas" from visionOS. But Apple will actually ship.

[0]: https://www.wired.com/story/facebook-oculus-codec-avatars-vr...


Lol if that's what their researchers made, how did we end up with disembodied legless wii avatars?


The hardware they have in production is underpowered and wouldn't be able to render those high-fidelity avatars. That is also why their version of the metaverse looks like it does. They want to be independent of the "legacy" platforms, so their device has to do all of the compute while also being affordable for wide adoption.


Because no one told them they could add a zero to the price of the headset. haha


They also had prototypes with a reverse passthrough screen. They didn't ship it likely because they are focused on low cost.


I don't know, at this point I would still count Apple as only talking about it, too? Thing isn't released, after all.

And you'd probably be surprised to know just how capable other offerings are, too. I'm a broken record, but the PS5 VR is quite impressive. Yes, you have to have a cable to hook up to the PS5, but not having a battery is an odd benefit. Not to mention the general cost difference. And my understanding is that it isn't that far ahead of the Meta offerings.


People (admittedly access-dependent friendly people [0]) have worn it and say it roughly lines up with the demos. There is no such analogue for Meta, Nreal, etc.

[0] https://www.youtube.com/watch?v=OFvXuyITwBI


I'm interested, but I'm not seeing anything in that video that really sounds as magically new as it is being reported. Eye tracking, in particular, is already doable with the existing options. For what it is, it works well. Still isn't nearly as good as using a controller.

This was also the first time I read about your FaceTime usage being an odd 3d rendered version of you.


Full color pass through and good hand tracking are not available on any headset even near the mainstream.


The passthrough on the PS5 is nice; I'm assuming the cameras are not color, though? That, or there is some other reason not to have color passthrough? Curious whether anyone has discussed that.

The tracking of my hands, though, is one that just feels misguided. The controllers provide haptic feedback that is needed for many games. To the point that I would view hand gestures as completely irrelevant?

Is the eye tracking much better than the PS5? Again a thing that feels neat and is nice to see. But then I quickly turned off all controls based on it, as I would rather the indirect options with my hands. In almost all cases.


> classically Apple strategy of 'let's just take the Next Gen technology, integrate it well, and beat everyone else to market.'

If that were true, they’d long have an iPhone with a foldable screen, and would have shipped AR/VR glasses years ago.

I don’t think that ever was their strategy. IMO, they figure out what a good product needs and build it as soon as technology allows.

That always requires integrating technology well, and often also requires using new tech, but that’s just a consequence of wanting to build the best possible.

Beating everyone else to market certainly isn’t (high) on their list (“less space than a nomad”, “my Nokia could do that years before iPhone”, etc.)


Yes, they only beat people to market in the sense that once they release a product, everyone else realizes that's what the product should be and creates variations on it—so in hindsight it looks like Apple's take is the "first."


Which is weird, because with the Vision Pro there is not really any "first"; they are incrementally doing a lot of things a bit better. My stupid guess is they probably worked on this for so long that they figured they had to ship something and see what sticks. It's cool that they do, of course.

But the use cases displayed in the demo video are just weird, not what I expected Apple to come up with. You don't buy a ski mask to take 3D photos of your kids, or to watch 2D FaceTime windows of conference call participants, even if they are floating in mid-air instead of on your phone.


I have a lot of the same skepticism, but then I was also skeptical about the Apple Watch, and I'm wearing one now! That original Apple Watch was RIGHT at the edge of being a usable product. It was so slow and had so many connection issues.

My guess/hope about the 3D videos is that they'll bring 3D capture to the iPhone. For conference calls, I can sort of see the value—I might actually prefer being an avatar so I can sit on the couch and have plenty of room to open multiple documents and Slack threads.


> If that were true, they’d long have an iPhone with a foldable screen, and would have shipped AR/VR glasses years ago.

I think you’re really confusing my point here.

There were plenty of touch screen devices before the iPhone, but only a few niche devices had capacitive touch screens.

Apple doesn’t throw too-early technologies against the wall (folding phones still have visible creases when opened, for example).


The article includes an (entirely made-up, speculative) growth projection, which says it will grow very differently from the iPad.



The fact that not a single person wore the Vision Pro or showed off anything other than prerecorded/rendered videos during the keynote should have given everyone a clue about how far along in production it was.

Everything about it gave off the vibe of marketing teams overhyping a product and engineering left to pick up the pieces afterwards, which is very unlike how Apple usually works.


Journalists were allowed to wear the headset during closed-door sessions after the keynote. It exists and it works. By not allowing anyone to be photographed wearing the headset, Apple appears to be doing some kind of preemptive damage control about the fact that headsets look dorky no matter how expensive they are.


They literally had hands on demos for journalists after the event...


Yet not a single one of them was allowed to even take a picture of themselves wearing one. Why was that?


Speculating, but Apple wanted to make sure that the only people photographed using the headset were models and actors who are very attractive. That way whenever a news outlet runs a story about it, they have to use the picture of the attractive/cool people using it, and people make an association that the headset makes people attractive/cool.


Surely there are attractive/cool looking journalists?


Telling journalists that their young colleague is allowed to have their picture taken but they cannot because they're too old and frumpy doesn't sound like a good way to get positive coverage for your product. Easier to forbid pictures of anybody using it than to make a policy of judging journalists by their physical appearance.


https://youtu.be/Df_2BBTvJ2o

Video of a journalist using it.


Why is that significant?

This seems like exactly what Apple would do.


The passthrough face requires a face-scan step that wasn't ready yet. For avatars viewed in a virtual mirror, I think I read they gave you someone else's avatar to control.

If the passthrough eyes were available, clickbait journalism would also probably do things like intentionally crossing their eyes and getting a goofy look on it, then writing an article about how the headset will make you look extra goofy.


With no reverse passthrough


I don't understand why grossly misinformed comments like this always make it to the top in these threads.

Yes, the headset exists and it works. People wore it after the keynote and wrote about the experience.


because the overall vibe of HN isn't about sharing knowledge, it's about showing off by sounding the most confident or cynical or just negative.

most threads are of extremely low value. it is sad, it's significantly worse than even just five years ago.


“Exists and works” is not Apple’s normal bar for a product launch. They routinely ship 100M units in a sales cycle.


The existence and functionality of iPhone at launch:

> They had AT&T, the iPhone’s wireless carrier, bring in a portable cell tower, so they knew reception would be strong.
>
> Then, with Jobs’s approval, they preprogrammed the phone’s display to always show five bars of signal strength regardless of its true strength.
>
> The chances of the radio’s crashing during the few minutes that Jobs would use it to make a call were small, but the chances of its crashing at some point during the 90-minute presentation were high. [1]

This mobile data hack was among many other problems with the product at the announcement of the iPhone.

iPhone sold ~1.4 million units in the first year. But arguably, the Vision Pro is easily 10x more complex of a product than iPhone was. It is also a much bigger leap in human computer interaction than a touchscreen.

I think these sales projection numbers seem reasonable for what the tech is, and if anything the system’s stability at demo was way further along than iPhone.

[1] https://archive.is/Bo5f5


My theory is they set a bunch of unattainable goals for the team. The deal was they would ship the product when these milestones were met.

Once the goals were met they had to ship the product despite uncertainty on how exactly it would be used.

They could tell it could be used but have no insight if it will be used.


This is not true for the iPod, iPhone, iPad, or Watch.


Based on the articles I've read about this product, it might be the most advanced consumer tech we've ever seen. It would be a surprise if production were smooth sailing.


It seems created for a high-growth part of the cycle, not a recession. It might fail based on that timing?


Good thing for them we're[1] not in a recession, then[2]?

More seriously, if anybody launches a platform that can't weather a recession, it wasn't going to make it long-term anyway. Windows 3 launched around the 1990 recession, IBM PC launched into the 1982 recession, iPhone launched during the 2007/2008 recession. Platforms take time to take shape. A recession occurring early in a platform's life might even be beneficial, as it will make management more forgiving of "missed" sales targets in the early years.

1 - The USA

2 - Seriously, we're not in a recession and haven't been for 3 years. The economy grew at this rate most of the last decade and there were no complaints about us being in a recession. Money costs more now is the major difference.


Apple was just valued at over 3T. NVIDIA spiked. Infrastructure Bill spending is coming through as I understand it.

We're in a post-Covid-slump growth phase.


Growth, or bubble? I don't foresee a world where Apple paves its way to a 5 trillion dollar valuation. Both of their highest-margin products (iPhone and App Store) are under threat and require immediate re-arranging. They're an easy bet because "big company strong", but have so many mounting issues to deal with on the global stage. It's extremely unlikely this period of growth lasts into the future.

Nvidia is, as you said, a spike. One driven by AI but will probably be irrelevant in a few years. They're still great at engineering and CUDA will almost certainly continue to dominate, but the AI boom won't last forever. Once ASICs hit the scene it will be game over for GPGPU vendors, like it was with crypto in 2021. They'll still have gaming and industry markets to cater to, but their spiking days are numbered.


I thought a valuation of a company on a stock market was a reflection of its potential future profits.

So with these recent influx of capital to me that's the market saying that it believes Nvidia and Apple are going to bring in profits in the future so here's money now to grow in order to capture those profits that come in later.

I'm probably misunderstanding aspects here though.


>I'm probably misunderstanding aspects here though.

Yes, it's not as simple as "the company will make profits, therefore it's worth a lot".

Apple's expected forward p/e is ~28 over the next five years, which means if you believe that and buy now, you can expect an average return of 3.5%. How much you expect future earnings to grow determines if buying now is a good deal or not.
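The ~3.5% figure above is just the inverse of the P/E (the earnings yield), assuming flat earnings and no change in the multiple; a quick sanity check:

```python
# Earnings yield: what you earn per dollar paid, if earnings stay flat
# and the multiple doesn't change.
pe = 28
earnings_yield = 1 / pe
print(f"{earnings_yield:.1%}")  # 3.6%, close to the ~3.5% quoted above
```

Any expected earnings growth gets layered on top of that base yield, which is why the growth assumption decides whether the price is a good deal.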


That's the spherical-cow view of how markets work, and it's a meaningful part of what actually happens, but there's also a good amount of hype / news cycle / non-"discounted value of future cash flows" activity going on in practice.


No one should underestimate Apple. Have you seen the inflated phone prices outside of the USA? Guess how much the iPhone 15 will cost... They'll make more money than ever, and within 3 generations the Vision Pro will as well.


How is it so much better than other passthrough VR headsets?


Its fidelity is considerably higher, with a 4K+ MicroOLED panel for each eye, which eliminates screen-door effect, aliasing, etc., making use cases that involve reading a lot of text in it more feasible. Its passthrough is also extremely low-latency at 12 ms, and it’s built with a full M2 SoC, which is a good deal beefier than anything integrated into other standalone headsets.


I just want to point out that despite this being one of the highest-resolution VR screens on the planet right now, the pixel density would have to double once again to achieve Retina resolution. This is why Apple doesn't say "Retina" when talking about the Vision Pro.

People who think they'll be designing and coding on Vision Pro virtual screens all day might be disappointed.
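One rough way to quantify the "would have to double" claim is pixels per degree. The inputs below are public estimates, not official specs: roughly 3,660 horizontal pixels per eye, roughly a 100-degree horizontal field of view, and ~60 ppd as the usual "retinal" (one arcminute per pixel) threshold:

```python
# Rough angular-resolution estimate; all inputs are public estimates,
# not official Apple specs.
horizontal_pixels = 3660  # per-eye horizontal resolution (estimate)
fov_degrees = 100         # horizontal field of view (estimate)
retina_ppd = 60           # ~1 arcminute/pixel, common "Retina" threshold

ppd = horizontal_pixels / fov_degrees
print(f"{ppd:.0f} ppd vs ~{retina_ppd} ppd target")  # 37 ppd vs ~60 ppd target
print(f"shortfall: ~{retina_ppd / ppd:.1f}x")        # shortfall: ~1.6x
```

Depending on the FOV you assume, the shortfall lands somewhere between roughly 1.5x and 2x, which is consistent with the "double" ballpark above.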


Certainly text won’t be as comfortable to read as on a high DPI display, but it’ll at least be usable, which is a notable step up from where things stand now. I own a Quest 2 and though it’s fine for gaming I can’t imagine using it for work in any capacity at all.


Yes, the resolution should be roughly comparable to working on a typical 20-something inch HD PC screen from normal desk distance.

EDIT: Since the screen pixels are aligned to your eyes (head, more specifically) and not to the environment, the rasterization of the text will shift around as you move (even slightly). That makes it slightly blurrier than a screen of similar resolution, because you can't employ "pixel snapping" and other tricks (as seen on Windows) to make low-res text look better.


Never going to be able to justify the current price but I’m still really interested in playing with it to see if I think they hit my personal bar.

I’m perfectly comfortable spending 8+ hours in a day writing code on a non-4K screen (widescreen with the pixel density of a 27in 1440p) and based on what I’ve used in the past could probably handle anything as good as a 27in 1080p monitor with the right fonts.

That said, I do find the screen door effect on my Valve Index rather distracting and can’t use it for longer than an hour or two without risking a headache, especially if I am reading any text (tried watching a movie with subtitles and oh boy).

So I don’t think retina resolution is necessary for work (at least for me personally) but it does seem like having the screens so close and completely covering vision does increase sensitivity to picture quality so I may be wrong about that.


But in about 3 more generations, all of these features will be standard in $500 headsets. Just making a device more expensive to make it better doesn't lead to more sales.

What's the killer app for this device which other devices on the market don't support today?


Though I’d love to see the price of these specs to come down that quickly I’m not optimistic they will. Those MicroOLED panels in particular are extremely difficult to manufacture with terrible yield.

I don’t know that a killer app for it exists, but with its built-in developer base it’s inheriting from iOS thanks to using the same APIs (a dev base that’s notoriously fast to pick up new features, at that), I believe the chances of a third party dev finding a killer app are significantly higher than with prior AR platforms.


Maybe, but by then Apple will have had the market to themselves for some time.

Also worth noting this is not a “headset” but a computer (“headtop”?) in a different format. Think of it not as buying a headset for your gaming rig but as a laptop replacement.


The killer app is the brand. People trust and furiously consume the brand because of mental conditioning.


Depends what kind of passthrough you’re talking about.

If you’re talking about additive displays like Xreal, Magic Leap, or HoloLens, then the Vision Pro has a much higher field of view for the content and the ability to display things that are darker than the real world, in exchange for looking at a screen of the world instead of directly at it. Those trade-offs are subjective, so I won’t focus on them because the tech is so different.

If you compare to other VR headsets with cameras feeding video to the screen, there’s a lot:

1. Clarity is higher. If you look at the footage from their product video which they said is shot from device, it’s a lot more stable and clear than anything up until you get to the $6k Varjo XR-3.

2. Latency is lower apparently. Most other headsets have more of a delay between photon in to photon out based on people who’ve tried it.

3. Better reprojection of the image. If you look at videos of pass through from the Quest Pro, as you move, things warp in the view. From videos from the vision pro, it doesn’t seem to have any of that. (Obviously I’m just taking each company at its word that videos are honest)

Supposedly the Quest 3 is significantly better passthrough than the Quest Pro but very few people have tried it and said anything in depth about it, so it’s too early to judge.


I quite frankly don't understand what makes this the most advanced consumer tech ever.

What about mRNA vaccines? Centimetre-accuracy differential GPS navigation systems that fit the palm of your hand, like the uBlox f9p? 3D printers like the Ultimaker S5 that can feed carbon fiber thread into the print? High-end cameras like the Sony A1?

What is the defining feature(s) that makes a new VR headset so much more advanced?


It really pushes the envelope on many fronts - thermals, displays, materials, sensor integration, specialised hardware… I would never expect any rollout like that to go as planned.


All of them are basically iterations on current consumer devices, with the main advantages coming mostly from:

1. Apple being willing to attach a mind-boggling price tag to it. You can engineer much more if the price tag is 10x that of the industry leader (Quest 2).

2. Apple taking big shortcuts with the external battery, which gives a lot of extra power, space, and thermal leeway.

I’m not saying that they didn’t make something great - it does seem great. And given it’s Apple, they’ll likely have great polish and attention to detail, that make a HUGE difference.

But it’s not the best thing since sliced bread. It’s not a technological breakthrough. It’s “just” a new generation of consumer electronic device that’s better than the previous one.


If you replace "display" with "image sensor", the Sony A1 pushes the envelope on all those fronts as well.

And of course Apple is buying the displays for their headset from Sony, so who is doing the envelope pushing there?


The Apple Vision Pro is much more advanced than a random 2023 Intel laptop or any Android device.

You mentioned some cool tech, but the examples are too scattered across unrelated categories for a fair and meaningful comparison.


I don’t know that I agree with the wording of the parent post, but I understand the sentiment. It does include a lot of hardware that is relatively cutting edge, and the insanely low latency on the system is reflected in that.

While calling it “the most advanced” might be a bit of a stretch, the sophistication of the hardware is something I wouldn’t have previously thought possible. Some of your counterexamples fall short of that complexity. The actual mRNA vaccines are pretty technologically simple, the complexity of differential GPS is in its algorithms rather than hardware, and is that camera really groundbreaking (genuine question)?


> The actual mRNA vaccines are pretty technologically simple,

Apple Vision is even more technologically simple. You get the screen, attach some sensors to it, sprinkle cameras, add a CPU, battery, and lenses, and you’re done.

> the complexity in differential gps is in its algorithms rather than hardware,

I could somehow understand your ignorance when it comes to mRNA - it’s outside of a typical software engineer’s field of expertise. But GPS?

Yeah, the complexity is just in algorithms. Building a satellite that can survive decades in space, building and attaching an extremely precise clock to it, some solar panels and antennas, thrusters, and a few other minor details, then getting that onto a rocket and injecting it into a very precise orbit: that’s the easy part! And especially doing that in the 1970s!

All those things are easy compared to leet code! /s


That said, the Vision Pro will also push the envelope when it comes to software, or so it seems.


Well, now - Financial Times doesn’t give away much for free, do they.



The Bypass Paywalls plugin works well with FT [0].

[0] https://github.com/iamadamdev/bypass-paywalls-chrome


If you're on iOS then I find that this shortcut works pretty well - simply open the paywalled link, share and then "Unpaywall".

https://www.icloud.com/shortcuts/71648f5ad34f4d8f972718e5f36...


nomen est omen ("the name is a sign")


For all the people saying that this is dead on arrival or will never gain traction because it's too expensive, too bulky, too whatever: this is going to succeed. The first Watch seriously sucked, but now it has all kinds of sensors and an always-on display. A lot of those unique components (like the stereoscopic display) will get mass-produced, which will bring down cost. Refinement will improve capabilities.

This particular headset? It is too expensive, too bulky, etc. But, I can tell you that in time any serious developer will have this. The only thing that will make me give up my ultra-wide monitor is this: an ultra-wide screen monitor that I can take anywhere.


Consider the alternative: most people don't care about this device... how many non-video-game-playing people bought a VR headset until now?


This is not VR.


The Watch is a poor comparison.

Is this the next iPhone or not? That is the question.


Neither the iPad nor the Watch was the next iPhone. No one seriously thinks Apple will sell 80 million+ headsets a year.


It doesn't need to be the next iPhone to be a great business. Famously, AirPods alone would be a Fortune 200 company (probably higher now).


This assumes that an independent accessory vendor could somehow worm its way into Apple's multi-billion dollar marketing campaigns. Airpods sell primarily due to the value of being a 1st-party Apple accessory.


First of all, I disagree with the premise—plenty of Android users buy AirPods. But putting that aside, why does the success of AirPods invalidate the point that Vision Pro could be successful without being as big as the iPhone?


AirPods are an existing product category (headphones) that everybody with a phone already uses. Couple that with Apple products being a status symbol, advertised everywhere, and it's not hard to see how AirPods could succeed even at a $200-ish price point.

As for AVP, it is a brand new category where the use cases are yet to be identified. It costs 2 months' rent, it has no killer apps that people are going to line up for, and you'll look weird wearing it. It's a completely different proposition.


The history of Apple is a history of "no one in their right mind would pay a premium for THAT." A $500 unsubsidized phone? A $350 watch to get notifications? Stupid-looking $200 earbuds that will fall out of your ears?

AirPods got so much hate in 2016, it can be hard to remember. They were far from a sure thing.


Apple Watch can only be said to be a moderate success at best, considering Apple themselves have never disclosed how many they've sold.

As for unsubsidized phones, the US is the only country that sold them on contract, and the iPhone didn't take off until Apple had AT&T cut the price to $199 to match other flagships. The Nokia N95, for example, retailed at $730 and sold millions in 2007.


According to industry analysts, Apple sells over 40 million watches per year and is by far the top smartwatch seller, taking over half the revenue of the entire market. Given how Apple protects its margins, that adds up. I'd sure love to have that kind of "moderate success."

Regarding the iPhone, I'm not sure what your point is, but my point is: people mocked the iPhone for having a ridiculously high price at launch despite having fewer features than other high-end phones. It's a direct analog to the Vision Pro reception, because this happens literally every time Apple launches in a new category.


Apple hasn’t disclosed unit sales for any of its products in years.

But the Watch is a moderate success compared to what? What other consumer electronics product from any other company is selling in the estimated volume, and with the margins, of the Watch?


Not the “Apple products are status symbols” canard.

How is something a “status symbol” if it has a 60% market share in the case of phones?

Any teenager working at McDonald's can buy an iPhone on a 2-year contract or get one “free” with a contract from one of the low-end MVNOs.


> How is something a “status symbol” if it has a 60% market share in the case of phones?

Perhaps the majority of humans alive today don't live in a market where that's true?


I think the limiting factor of this device is apps/experiences.

For this kind of money and discomfort you require mind blowing 3D experiences for it to be worth it. The problem with that is that those are impossibly hard/expensive to build by 3rd party developers.

The risk is that you'll end up with endless 2D apps projected into 3D, or poorly made 3D apps similar to today's AR apps with shitty graphics.


For this specific article, it’s 100% speculation.

I’m not saying it isn’t true, but rather there is no evidence to support it. Ars Technica is also reporting on this topic this morning, and they are also missing evidence-backed sources. It’s all Wall Street analyst speculation.

Read with several large, coarse, grains of salt.


I read many positive posts written by users who went to the conference. I'd like to try it out but I also do not think I'm going to wear a headset a few hours everyday. I wonder if I can rent it for a few weeks from my company...


While I'm sure Apple would prefer a full scale production run, limited availability for a new product often can create a demand frenzy that is overall beneficial as production is able to ramp up.


It may create demand for the hardware, but it will diminish demand by developers to make apps for it, knowing the audience will be small. And Apple has an ecosystem to bootstrap here. If they fail, the whole platform will die.

I'll be curious how they navigate the difficulties of getting this category to self-sustainability.


It’s good, however, that it’ll be able to run macOS and iOS apps on virtual flat panels.

I haven’t looked into the SDK and what it offers, but I read it’s not fully room-scale immersive like some gaming headsets are.


I'm not sure what the demand frenzy is for an unproven $3500 VR headset with no killer app during an economic slowdown.


I'll be getting one to use for coding. One person does not a demand frenzy make, but having 'infinite' monitor space is a killer app to me.


What does Apple's virtual monitor provide over the N existing virtual monitor solutions that you've waited this long? Honest question as this tech has been around for ages.


As a person who wanted VR monitors, I backed the original Oculus Dev Kit 1 a decade ago, and it was terrible for that purpose. So I got the DK2, and the Vive, and the Index afterwards, trying each.

The Index pretty much has it -- it's doable. I can read comfortably in there, or watch a youtube video and not feel like I'm getting some grainy, crappy version, but it's just barely there. It's very heavy and hot, though, so I almost never choose to use it.

The claimed resolution of the Vision Pro is vastly higher than the Index's. I forget what the numbers were exactly, but they claimed to be able to virtualize "multiple 4k monitors" or something, and the per-eye resolutions _actually supported that_.

The next thing is: it's gotta be reasonably light, balanced, and cool, so I can wear it for an hour and not get intense neck pain or nausea. I'm skeptical but, hey, we gotta keep trying.

If it hits those points: it'll be a great virtual monitor and a lot more.


There are other high resolution headsets on the market right now at comparable or less cost than the vision pro.


Apple’s headset claims to be >4K per eye, which is what the “8k” headsets of today offer.

But they don’t have as compelling latency, hand tracking, or app development toolkits


What do you mean by latency? From what I can find, the Vision Pro is 90Hz, which is the minimum for state-of-the-art headsets, with 120Hz being targeted by higher-end devices. There are lots of hand-tracking solutions for headsets that don’t have it built in, but for a lot of interactions you want more than one button anyway. I’d rather have my pick of engines and SDKs through the OpenXR standard than be stuck with whatever Apple cooks up.


Such as?


Pimax 8k has been around for years: https://pimax.com/product/vision-8k-x/


The above comment doesn't deserve to be downvoted, it's a legitimate question. I myself was excited at the idea of VR to replace monitors, until I realized that making use of all that theoretical space would require me to crane and twist my neck at extreme angles for extended amounts of time. Upon reflection, it seems strictly worse than just making use of your OS's built-in support for virtual workspaces, where switching between them takes only a simple keypress rather than contorting my body.


I could see a combination of eye tracking + alt-tab working for VR.


The eye tracking sounds like a killer feature (I can imagine how it could enable reading with touchless, perfect, infinite scrolling). Sadly, from my brief read of the docs, it seems like Apple doesn’t allow apps to access what people are looking at (“to preserve privacy”) though.
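For what it's worth, a minimal SwiftUI sketch (my own hypothetical example, not taken from Apple's docs) of how interaction is supposed to work without gaze access: you mark views as interactive, the system draws the gaze highlight out-of-process, and the app only ever receives the confirmed pinch:

```swift
import SwiftUI

// Hypothetical sketch: visionOS apps never receive raw gaze data.
// You mark a view as interactive; the system privately highlights it
// when the user looks at it, and the app only hears about the
// confirmed look-and-pinch, never where the eyes were pointing.
struct ReaderControls: View {
    @State private var page = 1

    var body: some View {
        Button("Next page") {
            page += 1  // fires on the pinch gesture, not on gaze alone
        }
        .hoverEffect(.highlight)  // gaze highlight rendered by the system
    }
}
```

So touchless scrolling driven directly by gaze position isn't something a third-party app can implement; the closest you get is system-mediated hover effects like the above.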


Higher quality screens at a lower weight-on-your-head than anything else on the market.


1. There is only so big before you have to crane your neck, which existing screens can reach.

2. My existing monitor is already really light on my neck.

I don't see any benefits.


1. Not having to sit at your desk where your monitor is. You can switch to sitting on a couch or in a recliner or walking on a treadmill or on your porch.

2. Taking it with you easily. You have the same large screen(s) with you in the hotel room or at your in-laws’ or at the cafe (people might look at you weird, but this is a massive flex regardless).


Maybe there are no benefits for you. I’m going to get one since I have 450 sq ft total and nowhere to put a monitor.


It's all about the branded polishing cloth, obviously: https://www.apple.com/shop/product/MM6F3AM/A/polishing-cloth


$3k is a lot of money and I already bought 4k monitors, desk, etc..

macOS seems to have pretty spotty monitor-settings stability; I'm regularly having to fix a setting that was forgotten for no reason.

Activating PIP mode on my 4k monitor causes macOS to shit itself as it rearranges everything, which is odd because there is no perceptible change to macOS's understanding of my monitors. This means my 4k monitor is somehow reporting PIP as a change in monitors (basically snitching on me) when I'm PIPing in an unrelated device.

Ubuntu, Debian, Garuda don't have this behavior. They remember settings better and don't shit themselves when PIP gets activated.

For reasons like this, I'm skeptical that Apple Vision Pro will do a great job at the monitor replacement angle.

However, I'll follow that closely and probably wait for a gen 2.


Having something on my face all day and looking like a minor burn victim for an hour after isn’t worth all the space in the universe.


If it's that uncomfortable, then I won't be buying one either. I hope they've worked out how to avoid that, ahem, result.


They think it'll be another goldrush and want to be the one with the killer app before the novelty wears off and the shovelware takes over.


"Beneficial"? Are you a scalper? Only scalpers benefit from that shit...


Is there a developer program that gets you early access to a headset?



The headset is the early access product.


How would anything else be the early access product?


In the past Apple has sent out early access dev kits that were quite different from the final consumer product.

When they switched from PowerPC to x86 in 2005, the dev kit was a Pentium 4 motherboard in a Power Mac G5 case:

https://en.wikipedia.org/wiki/Developer_Transition_Kit

Apple never used any Pentium CPUs in Macs except for this loaner model. The first Intel Mac desktop was an iMac, so completely different form factor.


it really is amazing to see so many people with such strong opinions about a device they've never used, entering a market that doesn't really yet exist, in a year.

why does every thread about an Apple product end up just as a competition to see who can have the least informed but most negative take on whatever it is?


Times like this are when people can only speculate. There’s too little information for anything else; the discussion is necessarily speculative. Some will be optimistic and others pessimistic, and the only thing informing the disparity of those opinions is the authors’ personal perspectives and biases. You’re frustrated by something that is only natural.


Maybe VP 2.0 already shows enough gains that this is a smart choice?

Obviously this is just one of many cases.


Because of supply chain issues


okay - if it's a successful product then production can be scaled up.

paywall and I'm not really interested beyond the headline anyway.


You didn't miss anything.


Talk about brain dead. Here is a product but it can't be mass produced.


This makes me worry they might cancel the solid gold Vision Pro Edition next.


You may have to go to third party accessories for that https://caviar.global/catalog/virtual-reality


WTAF! I mean, I know there is a market for everything, but a gold-plated Vision Pro… and a Rolex embedded in a smartphone case…

To each their own.


I obviously don't know anyone who would buy this, but my impression of the product category is that it's for rich people who can't cope with owning the same Vision Pro as any chump with $3500 sitting around. How is it supposed to be a status symbol if it doesn't cost $40k like their watch?

Same with a Rolex-embedded phone case: having all the money in the world won't buy you a fancier iPhone than anybody else's, so if you're intent on turning everything you own into an excessive status symbol, you have to take your device that already has a built-in clock and add a less accurate but handmade and more expensive watch to it.

I have a hard time imagining who exactly would be impressed by that, but what do I know?


I got to try one (several big companies got them in advance). Absolutely brilliant.

I was worried my Unity Game dev contracts were dead for good, but I'm back now baby.

Apple saved my consulting biz with their Unity partnership!


Have you also used any Meta Quest devices and are you allowed/willing to say anything about how the experience compares between them?


Apple Vision Pro's screen is extremely, extremely high quality. It's what VR should have been to begin with.

The only thing that sucks is no controllers


why would your unity gamedev consulting be dead without this?

anything happening in this space?


Probably referring to the announcement of partnership with unity [1].

[1] https://www.uploadvr.com/apple-vision-pro-supports-unity/


At least in my little corner of that world, Unreal has been gaining mindshare and market share pretty aggressively for the last five years. This is forcing me to start thinking about Unity again.


Economic downturn led to an unofficial freezing of contracts.


What kind of software, speaking as an independent developer, is going to work well with a market of only some 100k's of users globally? Yikes...


What kind of software would you sell to corporations or rich early adopters?

We'll see in Q1/2024 when the Vision Pro comes out for real.


apple enterprise is an even thinner slice of that


All you need is something like the Avatar 3 production buying 1000 of these so they can show the previz live for everyone on the greenscreen set.


idk how that's relevant


The Vision Pro claims it can scan a face/person and use the digitized likeness as an avatar.

No need for an “Avatar 3” — it is fast and relatively cheap compared to the hardware that movie / video game productions already use.


LOL. This comment rhymes with Steve Ballmer talking about the iPhone circa 2008.

If the market likes the headset, there’s no reason it has to be limited to 100k units. Figure out how to make an average office employee 10% more effective and it pays for itself quickly.


Well the article does mention "high net worth users" so maybe it's time to bring back the notorious I Am Rich app[0]

[0]https://en.m.wikipedia.org/wiki/I_Am_Rich


100k rich users though


Compared with around 1 billion people with credit cards loaded into an iPhone, and 100 million MacBook users,

Vision Pro apps would need to return 1000x-10000x more per user.

Leaving only longer term options for devs


I don't think so - solidifying yourself as the first / best app in the new ecosystem can be absolutely worth it. Until Apple takes your app and includes it as native functionality, like it did to the journaling apps come iOS 17.



