I am very pessimistic about the future of VR beyond it being super nice for niche cases. But I think there’s something fundamentally good about:

1) Apple taking a stab at it, because it’s the kind of thing where UX really matters and Apple does UX well.

2) Apple not being trapped in the “if we can’t sell a billion of them then why even bother?” mindset that so many other companies get trapped in, often because they’re not bursting at the seams with cash, talent, and runway.



> Apple does UX well.

I hear people say this often enough to question my own sanity, because I don't find Apple's UX to be very good at all. I find it confusing and it makes things difficult to figure out.


Could you provide an example of the sort of thing you’re talking about? Because I’m the type of person the other commenter was talking about (I’ve been in the Apple ecosystem for a very long time), and I feel I’m too close to it to have a reasonably objective viewpoint on it one way or the other.


Sure. iPhones and the Apple desktop OS. In both of them, I am lost as to how to accomplish anything but the most common stuff. The design of the OSes is such that it's not obvious how to do anything, and they provide very little guidance. You have to already know how to do what it is you're trying to do.

This is even worse on Apple smartphones, because a lot of stuff is done using mystery gestures that you have to already have memorized.

Skeuomorphism used to provide a small amount of relief, but now that it's gone out of fashion, even those clues are gone.

In short, I find Apple's user interfaces to be very opaque. In fairness to Apple, this is also true of Android, Windows, and Gnome. But in the case of Android, it feels intuitive to me simply because I've internalized how to do the stuff that it gives no hint as to how to do.


I’m actually curious about specifics here for macOS, aside from things like hidden scroll bars. Mainly because I find it to be one of the most discoverable. If you go into the ‘help’ menu in any application, you get a search box that searches all the menus of the application, and if a shortcut exists for the action, it’s listed alongside it. Admittedly Qt and KDE can do this too, but only for Qt apps, while on macOS it’s every app that actually provides menus. Nearly every application handles files and actions the same way and takes the same common shortcuts for things like preferences. If I want to see shortcuts for system actions, they’re all listed in the keyboard control panel, and arbitrary shortcuts for arbitrary applications can be added there. I can’t name another system that feels as integrated and discoverable in that sense, except possibly KDE or Gnome running ONLY apps developed specifically for that toolkit.

What system(s) do you find more discoverable, and how? I’m honestly curious because I love checking out new systems and window managers, and it’s bugged me not to have some of this elsewhere for a while.
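
For what it's worth, the reason macOS can show a shortcut next to every searchable menu command is structural: in AppKit, the shortcut is a property of the menu item itself. A minimal sketch (the "Export as PDF…" item, its placeholder action, and the ⇧⌘E binding are all made-up examples, not any real app's menu):

  import AppKit

  // Build a menu item whose shortcut is part of the item itself.
  let fileMenu = NSMenu(title: "File")
  let export = NSMenuItem(title: "Export as PDF…",
                          action: #selector(NSDocument.save(_:)), // placeholder action
                          keyEquivalent: "e")
  export.keyEquivalentModifierMask = [.command, .shift] // rendered as ⇧⌘E in the menu
  fileMenu.addItem(export)
  // Help-menu search walks these same NSMenu objects, which is why the
  // shortcut it shows can never drift out of sync with the menu.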


How do you show hidden folders in Finder? You literally have to memorize a keyboard shortcut. How do you switch apps in full-screen mode? You need to know trackpad gestures or keyboard shortcuts. Every post-2004 addition to macOS seems to rely on trackpad gestures, to the point that using the operating system with a mouse is a negative.

The search bar in the menu is fine if there is a menu to search. IIRC, or at least the last time I used it, the screenshot tool had no menus and was extremely unclear about how to save a screenshot as a file in the location you actually want.


The screenshot tool is pretty tightly coupled to shortcuts, that’s certainly true. I actually didn’t know there was a keyboard shortcut for hidden files and folders; it can be toggled through the preference menu for Finder, but again, yeah, that’s not as discoverable as it should be. Switching apps in full-screen mode would be using the Dock. I never do, but it works just fine and it’s the way you switch apps with a mouse; also, cmd+tab works with the system set to follow the space of the selected application, which is set by default.

To your point about trackpads though, I agree. I can’t stand a mouse on macOS, it loses too much convenience, especially with BTT available.

That said, is there any system today that you find more discoverable? I mean, SerenityOS actually fits that bill, and classics like older macOS or Windows 98, but anything modern?


> How do you switch apps in full screen mode, you need to know trackpad gestures or keyboard shortcuts.

The Dock also works, although you have to move the pointer downwards to show it. In a lot of ways, clicking on icons is macOS's foundational mode and everything else is superfluous. I, as a power user, forget that sometimes.

> IIRC or at least last time I used it the screenshot tool had no menus and was extremely unclear as to how to save a screenshot as a file in the location you actually want.

The screenshot tool, btw, is also an app under Applications → Utilities. If you use that, you’ll get a floating palette with different options, including for different save locations. Those options then also get used by the power-user ⇧⌘n shortcuts.

(I don’t necessarily disagree with you, there is too much invisible navigation.)


That screenshot utility is also accessible with ⇧⌘5, which might be invisible, but just letting you know.

As for where to save that screenshot, that is available in the Options section of the floating menu.


And that screenshot tool does not have anything in the global menu. That was my point.


Or hold down the option key while right-clicking things or in certain dialog boxes. I guess that's not horrible if you internalize that fact, but it's not discoverable.


> How do you show hidden folders in Finder?

You don't. They are hidden. If you are supposed to see them they shouldn't be hidden.

I understand that other operating systems make a different choice about this, but that doesn't mean their choice is the correct one.


⇧⌘.
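
If you'd rather not memorize the shortcut, the same switch is just a Finder preference under the hood. A rough Swift sketch (the com.apple.finder domain and AppleShowAllFiles key are long-standing but undocumented defaults, so treat them as assumptions; Finder has to be relaunched, e.g. via killall Finder, to pick up the change):

  import CoreFoundation

  let domain = "com.apple.finder" as CFString
  let key = "AppleShowAllFiles" as CFString
  // Read the current value (hidden files are off if the key is unset).
  let showing = (CFPreferencesCopyAppValue(key, domain) as? Bool) ?? false
  // Flip it and write it back; relaunch Finder to apply.
  CFPreferencesSetAppValue(key, showing ? kCFBooleanFalse : kCFBooleanTrue, domain)
  _ = CFPreferencesAppSynchronize(domain)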


I’ll give you a very specific example. When turning on the flashlight, I have to hold the flashlight icon for a specific period of time. If I hold too long, I get the dimmer option. If I don’t hold long enough, it doesn’t turn on. The iPhone expects what feels to me like a zen-like stoner’s calm press of the button. This is absolutely infuriating to me. Every time. Just turn the damn light on when I press the button.

Did you know that you can hold the spacebar to move your cursor around on iOS? Until my friend told me about that feature, I found text selection in iOS to be deeply frustrating compared to Android.


My Dad is going through ALS and keeps accidentally turning on his flashlight. And is then unable to turn it off. So many UI choices in iOS just seem poorly thought out.


> I’ll give you a very specific example. When turning on the flashlight, I have to hold the flashlight icon for a specific period of time. If I hold too long, I get the dimmer option. If I don’t hold long enough, it doesn’t turn on. The iphone expects what feels to me like a zen-like stoner’s calm press of the button. This is absolutely infuriating to me. Every time. Just turn the damn light on when I press the button.

I just tested this on my phone (albeit an iPhone 8, so a somewhat older model, perhaps without as many features). A short tap is sufficient to turn it on and off. A longer tap brings up the intensity slider. I was unable to tap the icon quickly enough so as not to register the tap and toggle the flashlight.

Upon retesting a couple times, I did notice that on the first tap the icon is illuminated but it did take a split second more for the actual flashlight to turn on. But waiting the split second was sufficient instead of further tapping.

I’m using the menu that is pulled up while the phone is unlocked (by dragging from the bottom of the screen, on my model), for reference.


The iOS UI has to determine intent from touch input, so as more gesture-based controls have been added, the OS needs to figure out what you're trying to do. Tap. Tap and drag. Scroll. Slide. Whatever. If your intent is one action, but your inputs don't match the actual requirement for what you want to do, you get bizarre behavior.
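
To make the timing problem concrete, here is a minimal UIKit sketch (the FlashlightButton class and the 0.5s threshold are invented for illustration, not Apple's actual control-center code) of how a tap and a long press get disambiguated: the tap is deliberately held back until the system has ruled out a long press, and that waiting window is exactly the dead zone people fumble into.

  import UIKit

  class FlashlightButton: UIView {
      override init(frame: CGRect) {
          super.init(frame: frame)
          let tap = UITapGestureRecognizer(target: self, action: #selector(toggleLight))
          let hold = UILongPressGestureRecognizer(target: self, action: #selector(showDimmer))
          hold.minimumPressDuration = 0.5 // held longer than this counts as a long press
          tap.require(toFail: hold)       // the tap only fires once a long press is ruled out
          addGestureRecognizer(tap)
          addGestureRecognizer(hold)
      }
      required init?(coder: NSCoder) { fatalError("not used in this sketch") }
      @objc private func toggleLight() { /* toggle the flashlight */ }
      @objc private func showDimmer() { /* present the intensity slider */ }
  }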

For the flashlight in particular, the button inside the control center can do a few different things—including closing the control center entirely—if you fumble the tap even slightly with an upward push of the thumb. With the Lock Screen, a sliding motion of any kind will just not turn the light on if your slide doesn't begin and end inside of the UI element that you're entirely obscuring the view of with your enormous fingers. You can accidentally open the Lock Screen customizer. It's even possible to get haptic feedback from the lockscreen flashlight button without actually turning it on.

While I don't have any of these problems, I am familiar with them and have observed others struggle. There are accessibility settings that are designed to help (repeated input filters and such) but they all slow the UI down and somehow make it more confusing because the phone is just more likely to do nothing rather than the wrong thing.


How do you AirPlay something on macOS Monterey?

On a multi monitor setup, why is it impossible to drag a window over to a monitor where another window is fullscreened?

How do you select a specific resolution or refresh rate when you're plugged into an older meeting room projector?


You're looking for specifics from people that don't necessarily have them. Not because their reasoning isn't valid, but a lot of us don't use the ecosystem frequently enough for them to be in the foreground of our minds.

I had to use an iPhone for a few weeks when mine broke. That was enough to convince me to never touch any of their products again. I'd love to tell you more about it, but it's been 4 years and I just don't remember all of the details. The main thing I remember is that it was anything but intuitive to use, and it was an entirely frustrating experience. Easily the worst phone experience I've ever had.


This makes sense. The few times I’ve tried to help someone with Android, I had to go very slowly through every single step. The two look pretty similar from afar but are quite different. I can still make my way around Windows because I’ve used it before.

My 70-80+ year old parents can use iOS and iPadOS mostly fine but get lots of things wrong on their Mac, so it seems to be a net improvement.


The macOS 'application centric' window system model is an excellent example IMHO.

Alt-Tab (erm... Cmd-Tab) switching through applications instead of windows (and then having a separate hotkey to tab through windows, but only within one application), combined with the fact that there can be UI applications that just have a menu bar but no open windows, is just incredibly bizarre (and even though I've been a primary Mac user for more than a decade, it still feels incredibly clunky - at least there's the gesture-controlled 'exploded-desktop-view' as a workaround).

...and when it comes to the new stuff:

- hiding the scroll bar: what a completely nonsensical decision from a UX point of view. I now need to wiggle the touchpad to see where I am in a document and how much of the document I'm currently viewing (yes, I know that this can be disabled, and that's the first thing I do on a new Mac, along with inverting the scrolling direction)

- arranging buttons vertically in popup dialogs on macOS devices with landscape display orientation (arguably it makes sense on an iPhone with portrait display orientation, but in either case the vertical arrangement makes it a lot more likely to accidentally hit the wrong button)

- the new settings panel is just a massive step back from the old one (which wasn't all that great either)

- don't even get me started on iOS, if you don't know the 'magic gestures' nothing makes sense (same shit on Android though)


I love the application centric model; I find the “window” centric model to be incredibly messy.

I’m assuming you came to macOS from some years using Windows?


Basically Amiga => Windows (since ca 1998) => Mac + Windows + Linux (since ca 2010)

...but I was also exposed to 90's Macs somewhat. I remember that my switch from the Amiga UI to Windows was fairly smooth, but I never got quite used to the Mac UI (not that it matters all that much, though, because I mostly switched back to the command line anyway; that's the only way to stay sane when switching regularly between Mac, Windows, and Linux).


Me too. If I had to keep a window open in — say — Photoshop all the time just so I didn't have to wait ten seconds every time I open something with it, it'd drive me crazy.

Visual Studio on Windows does drive me crazy for exactly this reason, especially as each instance can only have one top-level document (solution) open at a time.


I grew up on Windows and Linux, but I've been a mostly daily Mac user for more than a decade now.

I think the app- vs. window-centric model really comes down to user preference - I think I could get used to window-centric switching again, but I had Chrome, Audacity, and Windows Explorer open last night with maybe five windows and was alt-tabbing to all the wrong places. My brain just makes sense of “get to the application, then find the window” better than a list of windows I have to sift through across disparate apps.

I agree with the scroll bars and settings getting worse. The buttons could go either way, but - like app vs. window switching - it could be remedied with a setting to change the behavior.


By now Linux (Gnome, sure) has copied Apple and hidden the scrollbar. But Apple does it better, so they still lead in UI.


That's another, even bigger problem: Apple is so highly regarded by designers (which was justified in the past, when Apple did actual UX research) that they just blindly copy even the bad stuff.


As a daily Mac user at work (against my will), there is much to complain about in the Mac's UX. The big problem here is that there's no way anyone is objective, given that most UX becomes "good" after repeated use.

I have many examples, but I'll just pick one. When you have multiple instances of an app, the dock adds a "dot" by the app icon. This dot is present whether I have 2 instances or 10. Oftentimes I forget instances there (which can be opened for many reasons, private browsing for instance) because it's not easy to tell that I have 3 instead of my usual 2.

Additionally, switching instances is unnecessarily slow and requires more clicks than the older Windows taskbar (which will hopefully come back, now that MSFT has imitated Apple's terrible UX). Maybe you love this, but I objectively lose productivity with this UX decision.


> When having multiple instances of an app, the doc adds a "dot" by the app icon.

That's simply not the case.

The dock adds a dot to the application icon if the application is running.

If you have multiple instances of the application running, there will be multiple icons (each with a dot) in the dock. This is fairly difficult to do; the only way I know how is by starting the application again from the command line. Well, or when you're running an application you are developing from Xcode.

So I'm not 100% sure what you mean by "multiple instances", but it's almost certainly not actual multiple instances. I am guessing you mean an application with multiple documents open.

So basically, what you regard as "bad UX" is simply you trying to map assumptions from one environment onto a different environment, and finding that your assumptions don't reflect reality.

And yes, the application will be running (showing a dot) whenever you have 1 or more documents open in the app.

To switch between documents of an application, you can either go to the application's "Window" menu and choose one from there, or go to the dock icon and choose one from there.
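
(For completeness, the "actual second instance" case mentioned above can also be triggered programmatically rather than from Terminal with 'open -n'. A minimal sketch using NSWorkspace on macOS 10.15+, with TextEdit as an arbitrary example app:)

  import AppKit

  let url = URL(fileURLWithPath: "/System/Applications/TextEdit.app")
  let config = NSWorkspace.OpenConfiguration()
  config.createsNewApplicationInstance = true // second process, second Dock icon
  NSWorkspace.shared.openApplication(at: url, configuration: config) { _, error in
      if let error = error { print("launch failed: \(error)") }
  }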


If the UX was intuitive, it would, well, be intuitive and I wouldn't need all these paragraphs.

Anyway, I'm using Chrome and that's what happens.


This may be a case of the mental model not matching the technical model. Under macOS, windows aren’t program instances, they’re just windows — they’re all hosted by the same single parent program instance. It’s technically possible to open a true second instance of a program using the Terminal, which spawns a second Dock icon, but few people do this on a regular basis.

This is why open windows are represented on the second level, e.g. listed in the context menu that appears when you right click a dock icon. Programs and windows are not synonymous, and in fact windows belong to programs.

Macs have used this model since they first gained multitasking decades ago.
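
For the technically curious, the ownership model is visible from public AppKit API; a minimal sketch you could run inside any running Mac app:

  import AppKit

  // One process, one application object, N windows. The Dock's dot
  // tracks the process; the windows are just children of it.
  let app = NSApplication.shared
  print("instance:", Bundle.main.bundleIdentifier ?? "unknown")
  for window in app.windows {
      print("owned window:", window.title)
  }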


This is a good example of typical UX complaints which boil down to "this is not the thing I learned on." I personally don't think the early 2000s were the peak of computer UX, but I understand that most people don't like change.


The thing is that up to around the early 2000s, UX decisions were backed by actual research, not the aesthetic whims of an egocentric designer.


Yeah, right


How about iOS Settings? There are, counting now, 8 sections of settings before the ninth, which is “all third-party apps alphabetically” (that last one is mainly where you can grant/deny a short list of entitlements for those apps, such as location, cell data, etc.).

Those 9 sections are all unlabeled for some insane reason (I suppose maybe this type of plain table view thing has never had a UI element to make headings?), so even the intent of the designer is unknown. If you want a certain setting, you just have to keep scrolling. There is Search in here, though the search is mediocre and it’s still hard to find things.

Oh, and Settings has a junk drawer too, called General, which seems to mostly mean “someone was forced to pull 15 or so items into a subfolder”, and it also uses the “random groups without headings” method of organization.

All of this seems low-effort, and like the designers didn’t learn even basic lessons from 40 years of GUI design (or like the ones calling the shots are fine art majors more concerned with words like “uncluttered” and “elegant” than “usability” and “discoverability”).


Like how are the "maximize window" and "minimize window" buttons even supposed to work? (That traffic light on top of your windows)

Or why don't we have Copy and Paste keys on our keyboards. Or Undo and Redo keys?

Or why is the USB drive in the back of an Apple monitor?

Or why is everything so opinionated?

Or that mouse that could only be charged with the cable plugged into the bottom so you couldn't use it?

Or in earlier versions of OSX, you had to upgrade the OS by starting iTunes (wtf?) ...


For this one

> Or why don't we have Copy and Paste keys on our keyboards. Or Undo and Redo keys?

Copy: Cmd+c

Paste: Cmd+v

Undo: Cmd+z

Redo: Cmd+Shift+z, or maybe Cmd+y if the devs do Windows stuff too


It’s getting harder and harder to see the good UX under a layer of bugs and minor issues Apple can’t be bothered to fix (mainly talking about macOS)


My dad got an iPhone despite us being an Android family because word on the street was that they're easy to use.

For the life of him, he can't look at a text message while in a phone call and then get back to the call menu.

He unknowingly mutes notifications for contacts and has no chance of figuring out how to unmute them.

He accidentally calls his contacts all the time.


I think it was about 10 years ago: a friend got a message on her iPhone while she was driving and asked me to reply. I had a Samsung back then. I couldn't find a way to reply; I had to ask her to guide me through the UI before I finally managed it.


One example of poor UX for me: with iOS 16 they’ve butchered “Do Not Disturb”. While I appreciate the added user control of Focus tabs, it has made DnD a much more annoying feature to use and access. Instead of a simple one-handed swipe up and press, I need to swipe up, click into the menu, and stretch my thumb while balancing my phone to enter DnD one-handed. For someone like me, whose friends always seem to message at inopportune times and send 50 half-sentences, it's a rather common feature I enjoyed before that is now much worse to use.

Also, Apple just killed some of my app notifs with the update, and despite all of my settings being proper, some apps like Instagram just don't work.


The Mac's windowing system was good on the original 1984 Macintosh, with its very small screen. It was worse than Windows for the Macs of the 90s, whose screens were comparable in size to those of PCs. It was clear to me that a single global menu stuck at the top of the screen was a bad choice. I never bought a Mac because of that.

20 years later, given an Android phone and an iPhone, they were equally difficult (or easy) to use. Furthermore, after having used Android for years, finding stuff on an iPhone was definitely difficult. The other way around is probably true too.


A personal example for me is that on the iPhone, the default view shows me 24 icons, which is way too many to be useful. It's so useless, in fact, that most tech-savvy users simply swipe up and use the search feature. But there's no way to know that you can swipe up to get to the search feature, leaving most non-tech-savvy users to visually inspect pages of icons, one at a time, until they find the app they are looking for.


I like the default view. I don’t find 24 icons overwhelming or confusing. The most frequently used ones, I remember where they are spatially.


Window management on OSX.


If you have been using Apple products for decades, of course you are used to it - except for some complaints between major releases. If you just come from any other corner, oh well...


I'm pretty sure they still force auto-arrangement of the icons on the home screen.

It's probably invisible to most Apple users but it's a bit puzzling to anybody coming from anywhere else.


The problem is you aren't their intended audience.

I too thought Apple UIs were confoundingly bad, but one day I decided to take a step back and approach it as any ordinary man would. I pretended I knew nothing and cared nothing about computers.

You know what happened? It was intuitive. It was consistent. It was harmonious. My mind was blown.

Apple UIs are made for the ordinary man, not for tech nerds and professionals like us. Forget alt-tabbing and window managing, the vast majority of people on the planet don't know and don't care about computers; and it's those people Apple caters to.


I’d suggest you didn’t look at it from the perspective of an “ordinary man”, but instead, you dropped your longstanding assumptions from using Windows.


Practically speaking, that's the same thing: normal people don't know and don't care about Windows vs. macOS vs. Linux. It's a computer, and they hate it.

Apple at least designs their UIs to accommodate how normal people use computers, which is why their stuff doesn't mesh well with tech nerds and professionals without a mindset change. We are not normal people.


You don't think the more parsimonious stance is that you have different preferences?


> 2) Apple not being trapped in the “if we can’t sell a billion of them then why even bother?” mindset that so many other companies get trapped in, often because they’re not bursting at the seams with cash, talent, and runway.

I feel like the iPhone mini lasting only two years is a strong counterpoint to this. There's a small hope for an iPhone 15 mini, but with no 14 mini, it's a long shot. It accounted for 6% of all iPhone sales in its first quarter, after a delayed and rocky launch.[1] There is a small but somewhat vocal minority that wants smaller screens, which Linus Tech Tips' sub-channel Mac Address covered recently. [2]

[1] https://www.tomsguide.com/news/iphone-12-mini-sales-a-disast... [2] https://youtu.be/BjhiYa0KsSM


There was so little difference between the 13 and 14 low-end models that there was really no purpose for a “14 mini”. If you want a mini phone, you can buy the 13.

Apple also doesn’t update the SE every year.


A smaller iPhone isn’t exactly an exploratory venture into a new market. And loud minorities don’t make a market.


Loud minorities can form a market, if they are willing to put their money where their mouth is. But too often they don’t (cf. the “brown manual diesel wagon” from the car world. Enthusiasts say they want one, but when a manufacturer puts one out, they don’t buy enough of them).


“Revealed preference”


Maybe. But as we’ve seen (https://www.imore.com/iphone/lackluster-iphone-14-plus-sales...), the minis did outpace the non-Pro 14s. Obviously, as the 14s sit on the market, they will eventually outsell the minis; that's essentially a self-fulfilling prophecy of being available as the latest SKU. But the velocity was not there, so expect Apple to react accordingly. I’d like for them to settle on set sizes. Hands aren't evolving every year; they need to stop screwing around.

I use an iPhone 12 mini that I ordered on launch day and love it. Finally a worthy successor to the 5S/SE1.


Why does Apple need a new mini every year, or even every 2 years? The 13 mini bought today will probably still do everything that 90% of customers need it to do for another 5 years. Even the SE, which is 3 years old now, can probably do most things for another 5 years.

Also, the 12 mini was released 6 months after the 2020 SE, and the 2020 SE was released ~4 years after the previous SE. It stands to reason that many people who wanted a smaller phone jumped on the 2020 SE, and then had no reason to buy a 12 mini.

Also, the 12 mini battery life sucked. The 13 mini is perfectly usable though.


> 2) Apple not being trapped in the “if we can’t sell a billion of them then why even bother?” mindset that so many other companies get trapped in

It seems to be the opposite with folding phones. A lot of Android manufacturers are releasing them despite low sales numbers, while Apple apparently thinks: if we can’t sell a billion of them, then why even bother?


Apple is at the “the technology isn’t quite there yet” phase. When they enter the market, you know you’ll probably get something semi-reliable.


I'm typing this on a foldable phone right now, which is plenty reliable. One of the enduring myths about Apple's late entry into product sectors is the idea that they're waiting to "get it right". The first iPhone was a joke, it took them until the 3GS to get it right.


> One of the enduring myths about Apple's late entry into product sectors is the idea that they're waiting to "get it right". The first iPhone was a joke, it took them until the 3GS to get it right.

I don't agree that this is a myth.

If you look at the first generation iPhone in a vacuum, it was limited, perhaps (but I think there are plenty of arguments that the first gen iPhone was pushing envelopes - name one other phone that had a fully functioning web browser at the time, and pinch/zoom felt magical), but nitpicks about functionality aside, I don't think this framing makes sense for other reasons.

A product line is not just its initial release. It is a process, an organizational discipline, a manufacturing and logistics pipeline, an ongoing upgrade cycle, an ongoing engagement with customers, an understanding of the market, etc.

When the first generation phone was released, plans for the 3GS were already well under way. The 1st gen phone was a necessary step to get to that 3GS. When operating at Apple's scale, at least a decade of roadmap, if not more, was already planned in depth. The 3GS was not a reaction to the success of the 1; it was the real product Apple envisioned and planned to ship before we even knew the iPhone would exist.

If the 3GS got it right, so did the gen 1, in a certain sense.

And as an aside, I have to say I really liked my 1st gen iPhone compared to everything else I'd had up to that point, and I was a gadget junkie who had owned quite a few of the other "hot" phones at the time.


I'm not looking at the iPhone in a vacuum. I'm also taking into consideration other things Apple has inexplicably delayed for multiple years:

- support for mouse input

- a native file manager app (2017!)

- pressure-sensitive pens and screens (~ 4 years after Samsung)

- Multiple apps on same screen (2019!)

None of these are things that take years to get right. Apple commentators simply get lazy and recycle this iPod-era talking point.


> a native file manager app (2017!)

What would you do with one on a device where every app is heavily sandboxed and there's no common file system?
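
To put "heavily sandboxed" in concrete terms, here is a minimal Swift sketch of what a third-party app can actually reach through the stock FileManager API: its own container and nothing else (the path in the comment is illustrative).

  import Foundation

  // Each app gets its own container; the UUID below differs per app,
  // and other apps' containers are simply not readable from here.
  let docs = FileManager.default.urls(for: .documentDirectory,
                                      in: .userDomainMask)[0]
  // e.g. .../Containers/Data/Application/<UUID>/Documents
  let contents = try? FileManager.default.contentsOfDirectory(
      at: docs, includingPropertiesForKeys: nil)
  print(contents ?? [])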

> pressure-sensitive pens and screens (~ 4 years after Samsung)

When they did release this, no competition was even close to the precision and fidelity.

When Apple releases something later than the competition, it's usually (but not always) justified and results in a better execution.


> What would you do with one on a device where every app is heavily sandboxed and there's no common file system?

Access each app's sandbox to move/copy/exfiltrate the data contained there, for whatever reason the user desires. Not much different from the way file managers are used on traditional desktop systems.


> Access each app's sandbox to move/copy/exfiltrate the data contained there, for whatever reason the user desires.

That breaks the security promise that apps don't have direct access to other apps' data.

There's a reason you can't even access photos without an explicit user prompt.

> Not much different from the way file managers are used on traditional desktop systems

Traditional desktop systems never had heavily sandboxed apps with no outside access


> That breaks the security promise that apps don't have direct access to other apps' data.

No it doesn't. Giving the user the ability to get their hands on their own data is not the same as giving other apps direct access to it.

> Traditional desktop systems never had heavily sandboxed apps with no outside access

That's completely orthogonal.


> Giving the user the ability to get their hands on their own data

Does the user have the need for that on an iPhone?

I don't think a single app on my phone has any files I'd ever need access to

> That's completely orthogonal.

Of course it's not orthogonal


> Of course it's not orthogonal

You questioned whether there was a use case for file managers when apps do not share a "common file system" and are "heavily sandboxed", and stipulated that providing one would break the security promise of not giving apps "direct access to other apps' data".

Whether desktop systems have traditionally sandboxed their apps to the same degree is, in fact, orthogonal. It's a separate question entirely.

> Does the user have the need for that on an iPhone?

Sorry, but when I spot somebody moving the goalposts (and failing to acknowledge when one they staked out earlier was satisfied), my approach is not to indulge them by continuing to respond as if they're just like anyone else having a discussion in good faith. Instead, I say, "you are moving the goalposts; you are not asking in good faith". This is monumentally helpful in making economical use of my time.


Regarding looking at the iPhone in a vacuum, I was referring to the iPhone gen 1 vs. the iPhone product line as a whole. The point was that the claims don't make sense unless you look at just the first generation of the product, which was clearly not Apple's vision.

Regarding other features, I don't understand the connection you're trying to draw here, or how they serve as counterpoints. All these anecdotes point to is where Apple has chosen to focus resources and in what order. Having worked on a product team, there are dozens of "obvious" things that get cut from every release. I don't see this as some enormous indictment of the product.

Mouse input on what? iPhone? Is this a thing?

A native file manager is a feature that applies to a small subset of users. Do some people want it? Sure! Does a lack of it somehow imply Apple is failing? I think that's hard to argue. Anecdotally, as a highly technical user, I rarely if ever touch that app. I don't think most of my friends/family know it exists.

Multiple apps on the same screen ... on the only tablet in the market worth considering. Again, this is a gripe about roadmap order and not an effective argument against the core claim.

And core to the claim is the fact that Apple is often late to the party.

> None of these are things that take years to get right. Apple commentators simply get lazy and recycle this iPod-era talking point.

I don't think anyone is making the claim that these things take years to get right. More interesting than this would be to examine what they chose to ship instead.

If you don't believe the iPhone, iPad, Apple Watch, MacBook Air and AirPods are market-defining products, we'll have to just agree to disagree. It's not as if the iPod was the last product to resemble the "Apple does it better" pattern.


> When operating at Apple's scale, at least a decade of roadmap if not more was already planned in depth.

I highly doubt that, considering the web app / native app store debacle.

Also, was there any real reason not to support 3G from the beginning? A lot of “dumber” phones had it, so it seems more likely that they just made a mistake.


> I highly doubt that considering that web app / native app store debacle.

You mean the same Apple where Jobs said that no one would want to watch video on a tiny screen less than two years before the video iPod?

Apple didn’t go from idea to a fully featured SDK and an App Store in nine months without already planning to do so.


I'm referring primarily to the hardware side of this from a roadmap perspective.

I don't think early politics about how apps get shipped plays too much into the core planning for the device, e.g. form factor, manufacturing, user interactions, structure of built-in apps, etc. are all pretty much orthogonal to the app distribution model.

But even then, there were clearly two camps, which means there was a disagreement about a specific path, but not a lack of forward thinking. And then they course-corrected. Something that many companies simply do not do.


Yeah absolutely. I just don’t think that consumer tech companies which make 10 year plans (instead of a broad high level vision) and try to stick to them tend to be that successful.

> are all pretty much orthogonal to the app distribution model.

I’m not sure it was only the distribution; it was the entire existence of native third-party apps that they weren’t sure about.


I was including 1st party vs. 3rd party in my mind when I mentioned distribution. Admittedly that word is doing some heavy lifting.


I'm pretty sure the whole web app thing was just a smoke screen (aka lie) because the app store wasn't ready yet.


IMO it's only a debacle for developers who don't want to learn the native platform or don't think they can do it.


Not sure what happened to the original reply I typed.

The recent Pixel foldable shows the way to totally screw up a foldable screen. I’m sure you’re totally happy with yours, but all the foldables I’ve seen have distortion at the fold, and it’s definitely a weak point that will eventually degrade. Not for me.

I was actually around when the first iPhone came out. As a tech enthusiast, it was incredibly disruptive. Everyone was still shipping devices with styluses, and this device with a capacitive screen came along and blew everyone’s socks off. I don’t think anyone can rewrite that history.


It’s not like there was anything that similar to the first iPhone. But yeah it was arguably closer to a tech demo and a toy than a real device. Apple and Jobs himself weren’t even exactly sure which way they wanted to go with it.


This is a good counterpoint to what was posed above.

That being said, I feel like this is almost the obverse: they have a product that sells well, so they don't need to follow each trend that comes out, and can pick and choose which way they run with things.


Not to speak of the discontinued iPhone mini.


The iPhone 13 mini is still available for purchase:

https://store.apple.com/xc/product/IPHONE13_MAIN

Any source that they stopped making it?


I meant in the sense that the hardware isn’t getting updated anymore. I hope they don’t stop making it, but I don’t know if they are actually still making it or only selling existing stock. It sells rather poorly. See https://forums.macrumors.com/threads/iphone-mini-15-coming.2....


VR with extremely light eyewear coupled with resolution so high that it's indistinguishable from real life is essentially being able to pick your reality. I don’t think there’s any doubt that that would be far more than a niche thing.


It’s not clear we will reach that before we will reach direct brain interfaces.


I agree about VR, but I'm really excited for advancements in AR!

I'm a really avid mountain biker, road cyclist and snowboarder, and I'd love some basic telemetry displayed in the goggles or glasses I'm already wearing. Nothing kills a ride faster than having to stop to pull out your phone to see if you missed a turn or to figure out which unmarked trail you're supposed to take. (Not to mention all of the information we currently have to look down at our cycling computers to see: HR, watts, cadence, etc.)


As a trail runner, same - I really look forward to a future where decent glasses can show route details/programmable workout info/etc. and I can leave the watch at home. Less crap to carry + more info readily available would be awesome!

I know that's not the problem Apple's solving here, but I'm hopeful that anything they can do to push AR will filter down quickly.


Absolutely! I feel like VR is still in its Napster phase.


Seems like existing helmet HUD technology should be sufficient for some basic telemetry.


I agree. My guess is that they are at a fork in the production ramp. They need to choose between high production / high risk and low production / low risk.

If they are 100% certain this is the final form factor, they would take one path. If they think “we need to get this out in the world and see how it is useful”, they would take the other.



