Vision Pro: what was I made for?

Vision Pro: you can’t multitask with your eyes the way you can with two hands; you have to look at each control serially and tap with one hand. More like a telegraph than a keyboard.

Eye tracking works better for fighter pilots because they also have physical controls like weapons buttons; the eyes are only used to pick targets.

A MacBook’s trackpad and keyboard can control native Vision Pro windows, so you can actually get work done, at least until you get tired of it on your face.

The virtual keyboard only supports pecking with the index fingers.

The aesthetic is much cleaner than Meta’s geeky look. Less Wii, more Bauhaus. Less social, more productive.

You can practice your presentation on a virtual stage at the actual venue, with the slides on a big screen behind you.

Panoramic photos are curved around you, like a peek into the real world.

AR pomodoro timer:
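A rough sketch of the idea, assuming a plain SwiftUI countdown in a small visionOS window (the 25-minute block, view name, and layout are my own guesses, not anything that ships):

```swift
import SwiftUI

// Sketch of an AR pomodoro timer: a bare-bones countdown that could sit
// in a small window pinned next to your workspace.
struct PomodoroTimerView: View {
    @State private var secondsRemaining = 25 * 60   // assumed 25-minute work block
    @State private var isRunning = false

    // Fires once per second while the view is on screen.
    private let tick = Timer.publish(every: 1, on: .main, in: .common).autoconnect()

    var body: some View {
        VStack(spacing: 12) {
            Text(String(format: "%02d:%02d", secondsRemaining / 60, secondsRemaining % 60))
                .font(.system(size: 56, weight: .semibold, design: .rounded))

            Button(isRunning ? "Pause" : "Start") {
                isRunning.toggle()
            }
        }
        .padding(32)
        .onReceive(tick) { _ in
            guard isRunning, secondsRemaining > 0 else { return }
            secondsRemaining -= 1
        }
    }
}
```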

Personas show your hands as you gesture. The angle of your head matches who you’re looking at.

Spatial audio matches the virtual environment you pick, and when you move a person’s window to the side or back, the sound matches.
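Apple doesn’t say how visionOS does this internally, but the effect is ordinary positional audio; a minimal sketch of the same idea with AVAudioEngine, assuming a mono file named "voice.caf" and a listener at the origin (names and positions are illustrative):

```swift
import AVFoundation

// Sketch of positional audio: place a voice a couple of meters to the left,
// roughly where a window dragged off to the side would sit.
func playVoiceToTheLeft() throws {
    let engine = AVAudioEngine()
    let environment = AVAudioEnvironmentNode()   // applies 3D spatialization
    let player = AVAudioPlayerNode()

    engine.attach(environment)
    engine.attach(player)

    // Assumed mono source file; the environment node only spatializes mono input.
    let url = Bundle.main.url(forResource: "voice", withExtension: "caf")!
    let file = try AVAudioFile(forReading: url)

    engine.connect(player, to: environment, format: file.processingFormat)
    engine.connect(environment, to: engine.mainMixerNode,
                   format: engine.mainMixerNode.outputFormat(forBus: 0))

    environment.listenerPosition = AVAudio3DPoint(x: 0, y: 0, z: 0)
    player.position = AVAudio3DPoint(x: -2, y: 0, z: -1)
    player.renderingAlgorithm = .HRTFHQ   // binaural rendering for headphones

    player.scheduleFile(file, at: nil)
    try engine.start()
    player.play()
}
```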

Showing 5 screens at once in a semicircle around you isn’t useful for me, though it would be for air traffic control, stock traders, etc.

The simulated eyes are lenticular, so they look like they’re actually on your face rather than bugging out of a screen. But that sacrifices brightness and clarity.

MKBHD: the simulated eyes are not as visible if you have dark skin.

Personas explain Apple’s fixation on Memoji; they were working toward these 3D sims.

Optic ID is probably iris biometrics.

Lots of little niceties. Setting the pupil distance is motorized, not manual. People ghost into view when they’re looking at you.

Apple seems to have solved anchoring virtual objects in physical space; reviews describe it as rock-steady.
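Reviews don’t reveal how the tracking works internally; the developer-facing side of it is RealityKit’s anchoring, and a minimal sketch of pinning a cube to a fixed spot in the room might look like this (position and names are made up for illustration):

```swift
import RealityKit

// Sketch: pin a small cube about a meter in front of the scene origin so it
// stays locked to that spot in the room as you walk around.
func makeAnchoredCube() -> AnchorEntity {
    // World anchor at a fixed position relative to the scene origin;
    // a real app would place it from a tap or a detected surface instead.
    let anchor = AnchorEntity(world: SIMD3<Float>(0, 1.2, -1.0))

    let cube = ModelEntity(
        mesh: .generateBox(size: 0.1),
        materials: [SimpleMaterial(color: .white, isMetallic: false)]
    )
    anchor.addChild(cube)
    return anchor   // add to a RealityView's content to display it
}
```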