
The iOS UI has to infer intent from touch input, and as more gesture-based controls have been added, the OS has more possibilities to choose between. Tap. Tap and drag. Scroll. Slide. Whatever. If your intent is one action but your input doesn't match the exact requirements for it, you get bizarre behavior.
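The disambiguation problem is roughly this: the same finger contact could be the start of several gestures, and the OS has to pick one from distance and timing. Here's a toy sketch of that idea in Python — the thresholds and category names are my own illustrative guesses, not Apple's actual recognizer logic:

```python
import math
from dataclasses import dataclass

@dataclass
class TouchSample:
    """One point in a touch's trajectory."""
    x: float
    y: float
    t: float  # seconds since the touch began

def classify_gesture(samples, move_threshold=10.0, tap_max_duration=0.3):
    """Classify a finished touch as 'tap', 'long_press', or 'drag'.

    Thresholds are made up for illustration. The point is that small
    differences in movement or timing flip the result into a different
    category -- which is exactly how a fumbled tap becomes something else.
    """
    start, end = samples[0], samples[-1]
    distance = math.hypot(end.x - start.x, end.y - start.y)
    duration = end.t - start.t
    if distance < move_threshold:
        return "tap" if duration <= tap_max_duration else "long_press"
    return "drag"
```

A tap with a slight upward push of the thumb crosses the movement threshold and gets classified as a drag instead — the "wrong thing" rather than nothing.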

For the flashlight in particular, the button inside Control Center can do a few different things, including closing Control Center entirely, if you fumble the tap even slightly with an upward push of the thumb. On the Lock Screen, a sliding motion of any kind just won't turn the light on if your slide doesn't begin and end inside the UI element, which your finger is entirely obscuring anyway. You can accidentally open the Lock Screen customizer instead. It's even possible to get haptic feedback from the Lock Screen flashlight button without actually turning it on.
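The begin-and-end-inside requirement is basically a hit test applied to both endpoints of the gesture. A minimal sketch of that rule, assuming a simple rectangular bounds check (names and the rule itself are hypothetical, not Apple's implementation):

```python
def point_in_rect(px, py, rect):
    """True if the point (px, py) lies inside rect = (x, y, width, height)."""
    x, y, w, h = rect
    return x <= px <= x + w and y <= py <= y + h

def slide_activates(start, end, button_rect):
    """Hypothetical activation rule: the gesture only fires if BOTH the
    touch-down and touch-up points land inside the button's bounds.
    Miss either endpoint and the phone does nothing at all."""
    return point_in_rect(*start, button_rect) and point_in_rect(*end, button_rect)
```

Since your finger covers the button while sliding, you can't see whether the end point is still inside the rectangle — which is why the failure feels random.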

While I don't have any of these problems myself, I'm familiar with them and have watched others struggle. There are accessibility settings designed to help (repeated-input filters and the like), but they all slow the UI down and somehow make it more confusing, because the phone becomes more likely to do nothing rather than the wrong thing.



