A couple of weeks ago, at the end of July, I booked a slot to try out the Apple Vision Pro.
It has been available for months in the USA, and might already be in the ‘trough of disillusionment’ there – but I wanted to give it a try nonetheless.
I sat on a custom wood and leather bench in the Apple Store Covent Garden that probably cost more than a small family car, as a custom machine scanned my glasses to select the custom lenses that would be fitted to the headset.
I chatted to the personable, partially-scripted Apple employee who would be my guide for the demo.
Eventually the device showed up on a custom tray perfectly 10mm smaller than the custom sliding shelf mounted in the custom wood and leather bench.

And… I got the demo?
It was impressive technically, but the experience – which seemed to be framed as one of ‘experiencing content’ – left me nonplussed.
I’m probably an atypical punter, but the bits I enjoyed the most were the playful calibration processes, where I had to look at coloured dots and pinch my fingers, accompanied by satisfying little touches of motion graphics and haptics.
That is, the stuff where the spatial embodiment *was* the experience was the most fun for me…
Apple certainly have gone to great pains to try and distinguish the Vision Pro from AR and VR – making sure it’s referenced throughout as ‘spatial computing’ – but there’s very little experience of space, in a kinaesthetic sense.
It’s definitely conceived of as ‘spatial-so-long-as-you-stay-put-on-the-sofa computing’ rather than something kinetic, embodied.
The fine-grained gesture recognition is an incredible technical achievement – but this too serves to reduce the embodied experience.
At the end of the demo, the Apple employee seemed to be noticeably crestfallen that I hadn’t gasped or flinched at the usual moments through the immersive videos of sport, pop music performance and wildlife.
He asked me what I would imagine using the Vision Pro for – and I said, in the nicest possible way, that I probably couldn’t imagine using it – but I could imagine interesting uses teamed with something like Shapr3D and the Apple Pencil on my iPad.
He looked a little sheepish and said that probably wasn’t going to happen, but that soon, with software updates, I could use the Vision Pro as an extended display. OK – that’s… great?
But I came away imagining more.
I happened to run into an old friend and colleague from BERG in the street near the Apple Store and we started to chat about the experience I’d just had.
I unloaded a little bit on them, and started to talk about the disappointing lack of embodied experiences.
We talked about the constraint of staying put on the sofa – rather than wandering around with the attendant dangers.
But we’ve been thinking about ‘stationary’ embodiment since Dourish, the Sony EyeToy and the Wii, over 20 years ago.
It doesn’t seem like that much of a leap to apply some of those thoughts to this new level of resolution and responsiveness that the Vision Pro presents.
With all that as a preamble – here are some crappy sketches and first (half-formed) thoughts I wanted to put down here.

Vision Pro STL Printer Sim
The first thing that came to mind in talking to my old colleague in the street was to take some of the beautiful realistically-embedded-in-space-with-gorgeous-shadows windows that just act like standard 2D pixel containers in the Vision Pro interface and turn them into ‘shelves’ or platens that you could have 3D virtual objects atop.
One idea was to extend my wish for some kind of Shapr3D experience into being able to “previsualise” the things I’m making in the real world. The app already does a great job of this with its AR features, but how about having a bit of fun with it, and rendering the object on the Vision Pro via a super-fast, impossibly capable (simulated) 3D printer – which of course, because it’s simulated, can print in any material…

Once my designed object had been “printed” in the material of my choosing, super-fast (and without any of the annoying things that can happen when you actually try to 3D print something…) I could of course change my scale in relation to it to examine details, place it in beautiful inaccessible immersive surroundings, apply impossible physics to it, etc etc. Fun!
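(If I were to sketch that platen idea in code rather than biro, it might look something like the visionOS fragment below – a volumetric window holding a flat slab entity, with a made-up USDZ asset called “printedPart” standing in for the freshly ‘printed’ object. Very much a rough sketch under those assumptions, not a real app.)

```swift
import SwiftUI
import RealityKit

// A minimal sketch of a virtual 'platen': a volumetric window containing a thin
// slab, with a hypothetical bundled USDZ model ("printedPart") resting on top.
@main
struct PlatenApp: App {
    var body: some Scene {
        WindowGroup {
            PlatenView()
        }
        .windowStyle(.volumetric)   // a 3D volume rather than a flat 2D window
        .defaultSize(width: 0.4, height: 0.3, depth: 0.4, in: .meters)
    }
}

struct PlatenView: View {
    var body: some View {
        RealityView { content in
            // The 'shelf' itself: a thin grey slab for objects to sit on.
            let platen = ModelEntity(
                mesh: .generateBox(width: 0.35, height: 0.01, depth: 0.35),
                materials: [SimpleMaterial(color: .gray, isMetallic: false)]
            )
            content.add(platen)

            // The 'printed' object: a hypothetical USDZ asset placed atop the slab.
            if let part = try? await Entity(named: "printedPart", in: .main) {
                part.position = [0, 0.005, 0]
                content.add(part)
            }
        }
    }
}
```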
Vision Pro Pottery
Extending the idea of the virtual platen – could I use my iPad in combination with the Vision Pro as a cross-over real/virtual creative surface in my field of view? Rather than have a robot 3D printer do the work for me, could I use my hands and sculpt something on it?
Could I move the iPad up and down or side to side to extrude or lathe sculpted shapes in space in front of me?
Could it spin and become a potter’s wheel, with the fine-grained hand detection of the Vision Pro picking up the slightest changes to give fine control over what I’m shaping?
Is Patrick Swayze over my shoulder?
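(For what it’s worth, the geometry underneath that lathe idea is simple enough to sketch – revolve a 2D profile of radius/height samples around a vertical axis. The Swift below is just a toy illustration of that maths, with a made-up ProfilePoint type and a crude pot-shaped profile; the hard part would be feeding it live hand or iPad motion.)

```swift
import Foundation

// A toy sketch of the 'virtual lathe': revolve a 2D profile (radius, height)
// around the vertical axis, producing a ring of 3D points per profile sample –
// the sort of geometry a pottery interaction might regenerate as your hands
// (or the iPad) nudge the profile in and out.
struct ProfilePoint {
    var radius: Float   // distance from the axis of rotation
    var height: Float   // position along the axis
}

func lathe(profile: [ProfilePoint], segments: Int = 32) -> [SIMD3<Float>] {
    var vertices: [SIMD3<Float>] = []
    for point in profile {
        for s in 0..<segments {
            let angle = Double(s) / Double(segments) * 2 * .pi
            vertices.append(SIMD3(point.radius * Float(cos(angle)),
                                  point.height,
                                  point.radius * Float(sin(angle))))
        }
    }
    return vertices
}

// e.g. a crude pot silhouette: narrow foot, wide belly, narrower neck.
let pot = lathe(profile: [
    ProfilePoint(radius: 0.04, height: 0.00),
    ProfilePoint(radius: 0.09, height: 0.08),
    ProfilePoint(radius: 0.06, height: 0.16),
])
```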

Maybe it’s something much more throw-away and playful – like using the iPad as an extremely expensive version of a deformed wire coat-hanger to create streams of beautiful, iridescent bubbles as you drag it through the air – but perhaps capturing rare butterflies or fairies in them as you while away the hours atop Machu Picchu, or somewhere similar where it would be frowned upon to spill washing-up liquid so frivolously…

Of course, this interaction owes more than a little debt to a previous iPad project I saw get made first-hand, namely BERG’s iPad Light-painting.

Although my only real involvement in that project was as a photographic model…

Pencils, Pads, Platforms, Pots, Platens, Plinths
Perhaps there is a more general, sober, useful little pattern in these sketches – of horizontal virtual/real crossover ‘plates’ for making, examining and swapping between embodied creation with pencil/iPad and spatial examination and play with the Vision Pro.
I could imagine pinching something from the vertical display windows in the Vision Pro to place onto my iPad (or even my watch?) in order to keep it, edit it, change something about it – before casting it back into the simulated spatial reality of the Vision Pro.
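(A hand-wavy sketch of what the payload side of that might look like, using the standard SwiftUI drag-and-drop transfer APIs – DesignSnippet and its fields are entirely made up, and whether anything like this could actually hop between a Vision Pro and an iPad is exactly the speculative bit.)

```swift
import SwiftUI
import CoreTransferable
import UniformTypeIdentifiers

// A made-up, lightweight payload representing a design you might pinch across
// surfaces: encodable, so it can travel through the standard transfer machinery.
struct DesignSnippet: Codable, Transferable {
    var name: String
    var usdzData: Data   // the geometry itself, however it ends up serialised

    static var transferRepresentation: some TransferRepresentation {
        CodableRepresentation(contentType: .json)
    }
}

struct SnippetShelf: View {
    @State private var kept: [DesignSnippet] = []

    var body: some View {
        VStack {
            // Anything already on the shelf can be dragged back out…
            ForEach(kept, id: \.name) { snippet in
                Text(snippet.name)
                    .draggable(snippet)
            }
            // …and new things can be dropped in to keep, edit, and cast back later.
            Text("Drop a design here")
                .dropDestination(for: DesignSnippet.self) { items, _ in
                    kept.append(contentsOf: items)
                    return true
                }
        }
    }
}
```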

Perhaps it allows for a relationship between two realms that feels more embodied and ‘real’ without having to leave the sofa.
Perhaps it also allows for less ‘real’ but more fun stuff to happen in the world of the Vision Pro (which in the demo seems doggedly anchored on the verisimilitude of ‘real’ experience – sport, travel, family, pop concerts).
Perhaps my Apple Watch can be more of a Ben 10 supercontroller – changing into a dynamic UI for the environment I’m entering, much like it changes automatically when I go swimming with it and dive under…
Anyway – it was very much worth doing the demo, and I’d recommend it, if only for some quick stretching (and sketching) of the mindlegs.

All in all I wish the Vision Pro was just *weirder*.
Back when it came out in the US in February I did some more sketches in reaction to that thought… I can’t wait to see something like a bonkers Gondry video created just for the Vision Pro…


Until then…