Apple Vision Pro (M2, 512 GB, 2024)

Apple described the original Apple Vision Pro as “a revolutionary spatial computer that seamlessly blends digital content with the physical world, while allowing users to stay present and connected to others.” While other companies at the time were producing “Augmented Reality” and “Virtual Reality” headsets and glasses, Apple chose to forgo the AR/VR descriptions entirely and use the term “Spatial Computing.”

Apple did not invent the term or concept of spatial computing. The term “Spatial Computing,” in the context used by Apple Vision Pro, is attributed to MIT researcher Simon Greenwold and is the title of a paper he wrote in 2003. Greenwold’s paper defined the term as “human interaction with a machine in which the machine retains and manipulates referents to real objects and spaces.” He added, “Ideally, these real objects and spaces have prior significance to the user.”

Apple does, however, claim that it created the world’s first spatial operating system, visionOS. Apple Vision Pro works with visionOS to allow users to “interact with digital content in a way that feels like it is physically present in their space.” A FastCompany article explains the differences among AR, VR, and spatial computing by noting that the Vision Pro has:

“12 cameras and five sensors that help the device know everything from the level of light in a physical space to where objects are in relation to each other, to where your eyes are looking, and how your hands move… In spatial computing, you can interact with those virtual objects by simply using your hands in the physical space in front of you.”

By contrast, in virtual reality “you are completely immersed in a virtual world and can see none of the real world around you,” while augmented reality “displays virtual elements on top of the real world.” The three terms are related because spatial computing uses elements from both AR and VR.

Apple described the “breakthrough design” of the Vision Pro as featuring “an ultra-high-resolution display system that packs 23 million pixels across two displays, and custom Apple silicon in a unique dual-chip design to ensure every experience feels like it’s taking place in front of the user’s eyes in real time.” Mike Rockwell, Apple’s Vice President of the Technology Development Group, said that “through a tight integration of hardware and software, we designed a standalone spatial computer in a compact wearable form factor that is the most advanced personal electronics device ever.”

The Apple Vision Pro “can transform any space into a personal movie theater with a screen that feels 100 feet wide.” Internally, the seamless display is accomplished by delivering “more pixels than a 4K display” to each eye. (A 4K display holds 3840 x 2160, or about 8.3 million pixels; half of the Vision Pro’s 23 million pixels is roughly 11.5 million per eye.)

To add to the visual realism, the Apple Vision Pro also includes a new Spatial Audio system built around what Apple called “audio pods.” Apple describes the sound system:

“Dual-driver audio pods positioned next to each ear deliver personalized sound while letting you hear what’s around you. Spatial Audio makes sounds feel like they’re coming from your surroundings. Audio ray tracing analyzes your room’s acoustic properties to adapt and match sound to your space.”

I have observed that first-time Vision Pro users are often surprised by the audio experience delivered by the audio pods and ask if others around them can hear the audio. (Others in the room can faintly hear the audio at a low volume level, even if the Vision Pro user has the volume at maximum.)
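
Apple’s audio ray tracing is a system-level capability rather than a public API, but the underlying idea of sounds positioned in 3D space around the listener can be sketched with AVFoundation’s long-standing positional audio nodes. A minimal sketch follows; the file name is hypothetical, and this illustrates generic spatialized playback, not how visionOS implements it:

    import AVFoundation

    // A rough sketch of positional audio with AVAudioEngine: a mono source
    // placed to the listener's right and slightly ahead. "chime.caf" is a
    // hypothetical file; 3D spatialization applies to mono sources.
    func playPositionalSound() throws {
        let engine = AVAudioEngine()
        let environment = AVAudioEnvironmentNode()
        let player = AVAudioPlayerNode()

        engine.attach(environment)
        engine.attach(player)

        let file = try AVAudioFile(forReading: URL(fileURLWithPath: "chime.caf"))
        engine.connect(player, to: environment, format: file.processingFormat)
        engine.connect(environment, to: engine.mainMixerNode, format: nil)

        player.renderingAlgorithm = .HRTFHQ  // binaural rendering for headphones
        player.position = AVAudio3DPoint(x: 1, y: 0, z: -2)

        try engine.start()
        player.scheduleFile(file, at: nil)
        player.play()
    }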

The Apple Vision Pro is also packed with cameras and sensors that all work together to deliver the overall experience, including:

  • 2 high‑resolution main cameras
  • 6 world‑facing tracking cameras
  • 4 internal eye‑tracking cameras
  • TrueDepth camera
  • LiDAR Scanner
  • 4 inertial measurement units (IMUs)
  • Flicker sensor
  • Ambient light sensor

Apple described the sensor functionality: “high-resolution cameras transmit over one billion pixels per second to the displays so you can see the world around you clearly. The system also helps deliver precise head and hand tracking and real‑time 3D mapping, all while understanding your hand gestures from a wide range of positions.” Similar to an augmented reality experience, Vision Pro users see the world through live “passthrough” video, and not through a transparent lens.
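
For custom interactions beyond the system’s built-in gestures, visionOS exposes the hand-tracking data these cameras produce through ARKit. The sketch below assumes the ARKitSession and HandTrackingProvider APIs from visionOS 1 and simply logs the position of each index fingertip:

    import ARKit

    // A minimal sketch of reading hand-tracking data on visionOS. The system
    // handles the standard pinch gestures itself; raw joint data is only
    // needed for custom gestures. Requires user permission and a running
    // immersive space.
    func trackHands() async throws {
        let session = ARKitSession()
        let handTracking = HandTrackingProvider()
        try await session.run([handTracking])

        for await update in handTracking.anchorUpdates {
            let anchor = update.anchor
            if let indexTip = anchor.handSkeleton?.joint(.indexFingerTip) {
                // Transform of the fingertip joint relative to the hand anchor.
                print(anchor.chirality, indexTip.anchorFromJointTransform)
            }
        }
    }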

The original Apple Vision Pro was powered by two chips. Apple’s M2 chip provided an 8‑core CPU with 4 performance cores and 4 efficiency cores, a 10‑core GPU, a 16‑core Neural Engine, and 16 GB of unified memory. The dedicated Apple R1 chip processed input from the cameras and sensors to achieve 12‑millisecond photon‑to‑photon latency, aided by 256 GB/s of memory bandwidth.

In addition to the sensor cameras, the Apple Vision Pro could capture Spatial photos and video using the company’s first stereoscopic 3D main camera system. The 18 mm cameras used a ƒ/2.00 aperture and could capture 6.5 stereo megapixels. When the Apple Vision Pro was released, the iPhone 15 Pro and iPhone 15 Pro Max could also capture Spatial video using two of their cameras; all iPhone 16 models added the same capability later in 2024 (single-camera iPhones cannot capture Spatial video).

Inputs built into the Apple Vision Pro included hand, eye, and voice. In addition, supported input accessories included keyboards, trackpads, game controllers, Bluetooth mice, and other third-party accessories such as the Logitech Muse pen (not released until 2025).

The Apple Vision Pro used a battery pack that delivered “up to 2 hours of general use” or up to 2.5 hours of video watching. However, the device could also be used while the battery pack’s USB-C port was plugged into power, charging the battery during use.

The Apple Vision Pro shipped with many accessories and custom-sized parts compared to Apple’s other devices. The following accessories were included with each Apple Vision Pro:

  • Light Seal
  • Light Seal Cushions (2 sizes)
  • Solo Knit Band
  • Dual Loop Band
  • Battery pack
  • Cover
  • 30W USB-C Power Adapter
  • USB-C Charge Cable

Several of the parts and accessories that shipped with the Apple Vision Pro were impressive design innovations on their own, even if they were not often mentioned in reviews—or even by Apple. Some examples from my perspective included:

Light Seal—The light seal came in multiple sizes that were matched to the user through a custom app that scanned a user’s face to calculate the appropriate size. The light seal attached magnetically to the main body of the Apple Vision Pro.

Light Seal Cushions—The light seal cushion was also sized for the user and attached with magnets to the light seal to provide a custom fit so light would not “leak” into the space around the eyes.

Dual Loop Band and Solo Knit Band—The two bands that shipped each represented impressive engineering and design, fitting the 22.9-ounce (1.43-pound) device to the head and providing relative comfort and support during use. The Dual Loop Band provided a two-strap system that supported the device around the back and over the top of the head with adjustable Velcro closures. The Solo Knit Band was a single thicker band that was “3D knitted as a single piece to create a unique rib structure that provides cushioning, breathability, and stretch. It has an easy-to-reach Fit Dial to let you adjust Apple Vision Pro to your head and enables microadjustments during use.” I personally prefer the Solo Knit Band.

Further, the Solo Knit Band was noted by journalists and reviewers as looking fashionable, especially compared to the utilitarian straps provided by other AR/VR headsets. One 9to5Mac author noted, “I just think the Solo Knit Band looks cooler, and comfort just hasn’t been an issue for me.”

Cover—Even the lowly knit cover was an impressive piece of design in my opinion. The cover itself had knit edges, but allowed the Apple Vision Pro device to be effortlessly lowered into the accessory with a perfect fit that fully protected the glass front. Tabs on the edges also allowed it to be easily removed.

ZEISS Optical Inserts—For those of us who require vision correction and do not wear contact lenses, Prescription ZEISS Optical Inserts were available, custom-made to an exact prescription. The inserts snapped in easily with magnets and were “recognized” by an Apple Vision Pro after being selected in the user’s account settings.

Although this entry is not intended as a review of the Apple Vision Pro, as a user I can attest that the device is extremely difficult to describe to someone who has not used it first-hand. In my experience, the device and visionOS functioned seamlessly from the original visionOS through visionOS 2. In my Apple-user-experience lifetime (since the early 1980s), I have never encountered a more mature operating system for a brand-new device, especially one with so many brand-new user interface elements.

After a lifetime of keyboard typing, mouse clicking, and most recently touch-based interfaces, the Apple Vision Pro required a user to make the leap to looking at virtual interface elements (through eye tracking) and interacting through hand gestures (pinches, pulls, and two-hand pinch/pull motions). Having coached about 50 first-time users through using the Apple Vision Pro as of this writing, I have observed that every user was able to understand these UI paradigms within the first 5–10 minutes of using the device (most adapted more quickly).
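
Notably, developers get this look-and-pinch interaction essentially for free: visionOS translates an eye-tracked gaze plus a pinch into an ordinary tap, so standard SwiftUI controls just work. A minimal sketch (the app and button names here are mine):

    import SwiftUI

    // On visionOS, looking at the button and pinching triggers its action;
    // no gesture-handling code is required.
    @main
    struct GazeDemoApp: App {
        var body: some Scene {
            WindowGroup {
                Button("Pinch while looking at me") {
                    print("Selected via eye tracking + pinch")
                }
            }
        }
    }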

Finally, I wrote a series of education-focused articles about my first impressions of the Apple Vision Pro after the device was first released. They are available on a separate blog at Blogger.

Sources: Simon Greenwold (2003), FastCompany (2024), Apple (product page, Newsroom, Tech Specs, Solo Knit Band, gestures and controls), 9to5Mac

iPhone X (Silver, 2017)

The iPhone X was introduced ten years after the original iPhone and was described by Apple as “the future of the smartphone.” The “X” in its name, pronounced “ten,” was a nod to Mac OS X, which also used the Roman numeral X and marked a major milestone in the evolution of the Mac operating system.

The iPhone X was announced on September 12, 2017, at the same time as the lower-cost iPhone 8, Apple’s base iPhone at the time. Somewhat curiously, Apple skipped an “iPhone 9” entirely and, after the X-series models, returned to standard numerals in subsequent iPhone names.

The iPhone X introduced many firsts, including:

  • It was the first iPhone to use “a gorgeous all-glass design with a beautiful 5.8-inch Super Retina display,” removing the Home button and replacing it with a swipe-up from the bottom to unlock.
  • The iPhone X was the first iPhone with an “all-screen” display. It used the “first OLED panel that rises to the standards of iPhone…for a more natural, paper-like viewing experience.”
  • The iPhone X was the first to use Face ID to unlock, authenticate, and make payments (a minimal authentication sketch follows this list). This technology was enabled by a “TrueDepth camera” that was “made up of a dot projector, infrared camera and flood illuminator…powered by A11 Bionic to accurately map and recognize a face.”
  • The TrueDepth camera also allowed the iPhone X to bring “emoji to life in a fun new way with Animoji.” The camera “captures and analyzes over 50 different facial muscle movements, then animates those expressions in a dozen different Animoji, including a panda, unicorn and robot.”
  • The iPhone X was the first iPhone to offer wireless charging using the Qi standard. “The glass back design enables a world-class wireless charging solution.”
  • This iPhone introduced a “notch” design at the top-center to allow the display to stretch “edge-to-edge” and allow a place for the front camera system. The design choice was polarizing. The Verge wrote that “There’s a mix of surprise, sarcasm, and intrigue that Apple has chosen to go with a screen layout that leads to design compromises,” and added the oft-repeated speculation that “Steve Jobs would have never let that happen.”
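
As referenced in the Face ID item above, third-party apps reached Face ID through the same LocalAuthentication framework previously used for Touch ID; the biometrics policy simply maps to whichever sensor the device offers. A minimal sketch:

    import LocalAuthentication

    // On iPhone X, the biometrics policy below is backed by Face ID.
    func authenticate() {
        let context = LAContext()
        var error: NSError?
        guard context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                                        error: &error) else {
            print("Biometrics unavailable: \(error?.localizedDescription ?? "unknown")")
            return
        }
        context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                               localizedReason: "Unlock your data") { success, evalError in
            print(success ? "Authenticated" : "Failed: \(String(describing: evalError))")
        }
    }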

The iPhone X was available in two colors, silver and space gray, and offered 64GB and 256GB storage options. This example is silver. The sides of the phone were described as “surgical-grade stainless steel [that] seamlessly wraps around and reinforces iPhone X.”

The Super Retina HD display was 5.8 inches diagonal at 2436 x 1125 resolution (458 ppi). The device measured 5.65 inches (143.6 mm) high x 2.79 inches (70.9 mm) wide x 0.30 inch (7.7 mm) deep, and weighed 6.14 ounces (174 grams). Its A11 Bionic chip included a Neural Engine that enabled on-device machine learning.
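
Apple never documented exactly which calls ran on the Neural Engine, but Core ML-backed frameworks such as Vision are representative of the on-device machine learning the A11 enabled. A small illustrative sketch:

    import Vision
    import CoreGraphics

    // Counts faces in an image using the Vision framework (available since
    // iOS 11, the iPhone X era). Whether this runs on the Neural Engine is
    // decided by the system, not the app.
    func countFaces(in cgImage: CGImage) throws -> Int {
        let request = VNDetectFaceRectanglesRequest()
        let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
        try handler.perform([request])
        return request.results?.count ?? 0
    }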

The iPhone X camera system featured a 6‑element lens with 12 Megapixel wide-angle and telephoto cameras. Portrait mode on the iPhone X introduced Portrait Lighting (listed as a “beta” feature in specifications). Other camera features included panorama (up to 63MP), autofocus, tap to focus, auto HDR (photos), auto image stabilization, burst mode, and geotagging. It could record video at 4K (24, 30, or 60fps), 1080p HD (30 or 60fps), or 720p HD (30fps) with features including optical image stabilization, slo‑mo video (1080p at 120 or 240 fps), cinematic video stabilization (1080p and 720p), and continuous autofocus. The front TrueDepth camera offered 7 Megapixel resolution, portrait mode, Portrait Lighting (beta), Animoji, and recorded video at 1080p HD.

The iPhone X included six sensors: Face ID, barometer, 3-axis gyro, accelerometer, proximity sensor, and ambient light sensor.

Like previous iPhone models, the iPhone X included a set of custom wallpapers, two of which were featured on the product’s packaging and prominently in advertisements. 9to5Mac reported that Spanish artist Ana Montiel created the art that inspired the iPhone X wallpaper set:

“‘Fields’ is the title of Montiel’s series of paintings and exhibit that explore ‘altered states of consciousness as vehicles to go beyond the easily perceived.’ The original digital paintings were transferred to canvas and museum quality prints, and the styling came to life this past fall when Apple introduced the iPhone X with three new live wallpapers…”

The wallpaper that most closely matched one of Montiel’s original works appeared on the Space Gray iPhone X packaging: “FIELDS 9 : Tactile Irreality” (2017), an archival pigment print measuring 100 x 70 cm. I am honored to own one of Montiel’s original prints. The iPhone X version of FIELDS 9 is cropped to fit the iPhone screen’s aspect ratio and flipped upside-down from the original, presumably so the time and date display optimally on the iPhone. I have opted to hang my print in its original orientation.

Sources: Apple (Newsroom, Tech Specs), The Verge, 9to5Mac, Ana Montiel

iPhone XR (Yellow, 2018)

Apple’s press release for the iPhone XR led with, “Featuring A12 Bionic Chip, 6.1-Inch Liquid Retina Display, Aluminum and Glass Design in Six Beautiful Finishes, Face ID and Advanced Camera System.”

The iPhone XR, pronounced “ten-R,” was released along with the iPhone XS and XS Max. According to MacRumors, the XR shared hardware with the XS models, but certain features were removed or downgraded to reduce the price of the XR.

The six available colors included (PRODUCT)RED, Yellow, White, Coral, Black, and Blue. The iPhone XR was available in 64GB, 128GB, and 256GB capacities. It measured 5.94 inches (150.9 mm) high x 2.98 inches (75.7 mm) wide x 0.33 inch (8.3 mm) deep, and weighed 6.84 ounces (194 grams).

The Liquid Retina HD display measured 6.1 inches (diagonal) with 1792 x 828-pixel resolution (at 326 ppi). The iPhone XR was powered by the A12 Bionic chip with a second-generation Neural Engine.

The primary back camera was 12 Megapixels with up to 5x digital zoom. Its features included Portrait mode with advanced bokeh and Depth Control, Portrait Lighting (Natural, Studio, Contour), optical image stabilization, panorama (up to 63MP), autofocus, and Smart HDR. It could record video at up to 4K (at 24 fps, 30 fps, or 60 fps). The front TrueDepth camera was 7 Megapixels with features including Portrait mode with advanced bokeh and Depth Control, Portrait Lighting (Natural, Studio, Contour, Stage, Stage Mono, High-Key Mono), and support for Animoji and Memoji.
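
Portrait mode’s bokeh and Depth Control rest on per-pixel depth data, which third-party apps can request through AVFoundation. A minimal sketch of opting in (session setup and the capture delegate are omitted):

    import AVFoundation

    // Enables depth data delivery on a configured photo output; a captured
    // AVCapturePhoto will then carry the depthData behind Portrait effects.
    func enableDepthCapture(session: AVCaptureSession,
                            photoOutput: AVCapturePhotoOutput) -> AVCapturePhotoSettings {
        session.beginConfiguration()
        if photoOutput.isDepthDataDeliverySupported {
            photoOutput.isDepthDataDeliveryEnabled = true
        }
        session.commitConfiguration()

        let settings = AVCapturePhotoSettings()
        settings.isDepthDataDeliveryEnabled = photoOutput.isDepthDataDeliveryEnabled
        return settings
    }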

Its six sensors included Face ID, barometer, 3‑axis gyro, accelerometer, proximity sensor, and ambient light sensor. Although the iPhone XR launched with iOS 12, this example shipped with iOS 14 and included a USB-C to Lightning cable.

This example is yellow and includes Apple’s iPhone XR Clear Case.

Sources: Apple (Newsroom, Tech Specs), MacRumors

iPhone XR ((PRODUCT)RED, 2018)

Apple’s press release for the iPhone XR led with, “Featuring A12 Bionic Chip, 6.1-Inch Liquid Retina Display, Aluminum and Glass Design in Six Beautiful Finishes, Face ID and Advanced Camera System.”

The iPhone XR, pronounced “ten-R,” was released along with the iPhone XS and XS Max. According to MacRumors, the XR shared hardware with the XS models, but certain features were removed or downgraded to reduce the price of the XR.

The six available colors included (PRODUCT)RED, Yellow, White, Coral, Black, and Blue. The iPhone XR was available in 64GB, 128GB, and 256GB capacities. It measured 5.94 inches (150.9 mm) high x 2.98 inches (75.7 mm) wide x 0.33 inch (8.3 mm) deep, and weighed 6.84 ounces (194 grams). This is a (PRODUCT)RED model with 64GB of storage.

The Liquid Retina HD display measured 6.1 inches (diagonal) with 1792 x 828-pixel resolution (at 326ppi). The iPhone XR was powered by the A12 Bionic chip with a second-generation Neural Engine.

The primary back camera was 12 Megapixels with up to 5x digital zoom. Its features included Portrait mode with advanced bokeh and Depth Control, Portrait Lighting (Natural, Studio, Contour), optical image stabilization, panorama (up to 63MP), autofocus, and Smart HDR. It could record video at up to 4K (at 24 fps, 30 fps, or 60 fps). The front TrueDepth camera was 7 Megapixels with features including Portrait mode with advanced bokeh and Depth Control, Portrait Lighting (Natural, Studio, Contour, Stage, Stage Mono, High-Key Mono), and support for Animoji and Memoji.

Its six sensors included Face ID, barometer, 3‑axis gyro, accelerometer, proximity sensor, and ambient light sensor. Although the iPhone XR launched with iOS 12, this example shipped with iOS 14 and included a USB-C to Lightning cable.

Sources: Apple (Newsroom, Tech Specs), MacRumors

iPhone 14 Pro (Deep Purple, 2022)

The iPhone 14 Pro was announced on September 7, 2022; began pre-orders on Friday, September 9, 2022; and was available beginning Friday, September 16, 2022. Apple’s website led with the following description of the iPhone 14 Pro:

“A magical new way to interact with iPhone. Groundbreaking safety features designed to save lives. An innovative 48MP camera for mind-blowing detail. All powered by the ultimate smartphone chip.”

The primary new technologies used in the iPhone 14 Pro included: “Always-On display, the first-ever 48MP camera on iPhone, Crash Detection, Emergency SOS via satellite, and an innovative new way to receive notifications and activities with the Dynamic Island.”

The four colors available at release were deep purple, silver, gold, and space black. The iPhone 14 Pro had a 6.1-inch “Super Retina XDR display with ProMotion” and, for the first time on an iPhone, an Always-On display that dropped the refresh rate as low as 1Hz alongside other power-efficient technologies. In practice, the Always-On display faded to a dim version of the wallpaper and showed the time and up to four widgets (one widget above the time and up to three below it). Other “Live Activities” showed in the bottom two-thirds of the Lock screen, including alerts and play/pause controls for media.
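
Those Lock screen widgets (and their dimmed Always-On versions) are ordinary WidgetKit widgets that opt into the accessory families added in iOS 16. A minimal sketch, with all of the type names mine:

    import WidgetKit
    import SwiftUI

    struct ClockEntry: TimelineEntry { let date: Date }

    // A trivial timeline provider that always shows the current time.
    struct ClockProvider: TimelineProvider {
        func placeholder(in context: Context) -> ClockEntry { ClockEntry(date: .now) }
        func getSnapshot(in context: Context, completion: @escaping (ClockEntry) -> Void) {
            completion(ClockEntry(date: .now))
        }
        func getTimeline(in context: Context, completion: @escaping (Timeline<ClockEntry>) -> Void) {
            completion(Timeline(entries: [ClockEntry(date: .now)], policy: .never))
        }
    }

    struct LockScreenWidget: Widget {
        var body: some WidgetConfiguration {
            StaticConfiguration(kind: "LockScreenWidget", provider: ClockProvider()) { entry in
                Text(entry.date, style: .time)
            }
            // The accessory families appear on the iOS 16 Lock screen and are
            // dimmed on the iPhone 14 Pro's Always-On display.
            .supportedFamilies([.accessoryCircular, .accessoryRectangular, .accessoryInline])
        }
    }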

The iPhone 14 Pro also delivered “the highest outdoor peak brightness in a smartphone: up to 2000 nits, which is twice as bright as iPhone 13 Pro.”

The Dynamic Island was also introduced in the iPhone 14 Pro. The design of this iPhone removed the “notch” that had been used since the iPhone X and moved the functions slightly lower into a pill shape. Apple described the Dynamic Island system as one “that blends the line between hardware and software, adapting in real time to show important alerts, notifications, and activities. With the introduction of the Dynamic Island, the TrueDepth camera has been redesigned to take up less of the display area.”

Apple continued, “Without impeding content on the screen, the Dynamic Island maintains an active state to allow users easier access to controls with a simple tap-and-hold. Ongoing background activities like Maps, Music, or a timer remain visible and interactive, and third-party apps in iOS 16 that provide information like sports scores and ride-sharing with Live Activities can take advantage of the Dynamic Island.”
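
The third-party hook Apple refers to is the ActivityKit framework, released with iOS 16.1. Below is a minimal sketch of starting a Live Activity; the ride-sharing types are hypothetical, and presenting the activity in the Dynamic Island additionally requires an ActivityConfiguration in a widget extension:

    import ActivityKit

    // Hypothetical attributes for a ride-sharing Live Activity.
    struct RideAttributes: ActivityAttributes {
        struct ContentState: Codable, Hashable {
            var driverETA: Date
        }
        var pickupLocation: String
    }

    // Starts the Live Activity using the original iOS 16.1 API.
    func startRideActivity() throws {
        let attributes = RideAttributes(pickupLocation: "Main St.")
        let state = RideAttributes.ContentState(driverETA: .now.addingTimeInterval(5 * 60))
        _ = try Activity<RideAttributes>.request(attributes: attributes, contentState: state)
    }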

The iPhone 14 Pro camera system added a 2x camera (in addition to the 0.5x, 1x, and 3x options on the iPhone 13 Pro). The iPhone 14 Pro also offered a new “48MP Main camera with a quad-pixel sensor that adapts to the photo being captured, and features second-generation sensor-shift optical image stabilization.”
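
By default the quad-pixel sensor bins down to 12MP captures; apps opt into the full 48MP output through the maxPhotoDimensions API that arrived alongside iOS 16. A sketch, assuming an already configured capture pipeline:

    import AVFoundation

    // Requests the largest photo dimensions the active format supports
    // (8064 x 6048 on the iPhone 14 Pro's Main camera in 48MP formats).
    func enableFullResolution(device: AVCaptureDevice,
                              photoOutput: AVCapturePhotoOutput) -> AVCapturePhotoSettings {
        if let largest = device.activeFormat.supportedMaxPhotoDimensions
            .max(by: { $0.width < $1.width }) {
            photoOutput.maxPhotoDimensions = largest
        }
        let settings = AVCapturePhotoSettings()
        settings.maxPhotoDimensions = photoOutput.maxPhotoDimensions
        return settings
    }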

Other new camera features included a front TrueDepth camera with an ƒ/1.9 aperture for better low-light photos and video, adaptive True Tone flash with an array of nine LEDs, and Action mode for “incredibly smooth-looking video that adjusts to significant shakes, motion, and vibrations, even when video is being captured.”

All iPhone 14 models added Crash Detection that used a variety of built-in sensors (dual-core accelerometer, gyroscope, barometer, GPS, and microphone) to “detect a severe car crash and automatically dial emergency services when a user is unconscious or unable to reach their iPhone.” Additionally, Emergency SOS via satellite was added, “which combines custom components…to allow antennas to connect directly to a satellite, enabling messaging with emergency services when outside of cellular or Wi-Fi coverage.”
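
Crash Detection itself is not exposed as an API, but the kind of high-rate motion sampling it builds on is available to any app through Core Motion. A sketch that flags a sudden acceleration spike; the 4g threshold is an arbitrary stand-in (Apple’s dedicated dual-core accelerometer measures up to 256g and feeds on-device models):

    import CoreMotion

    // Illustrative only: samples the accelerometer at 100 Hz and logs any
    // reading whose magnitude exceeds an arbitrary 4g threshold.
    let motionManager = CMMotionManager()

    func startMonitoring() {
        guard motionManager.isAccelerometerAvailable else { return }
        motionManager.accelerometerUpdateInterval = 1.0 / 100.0
        motionManager.startAccelerometerUpdates(to: .main) { data, _ in
            guard let a = data?.acceleration else { return }
            let magnitude = (a.x * a.x + a.y * a.y + a.z * a.z).squareRoot()
            if magnitude > 4.0 {
                print("High-g event: \(magnitude)g")
            }
        }
    }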

The iPhone 14 Pro models were powered by the A16 Bionic chip, which included two high-performance cores and four high-efficiency cores, an accelerated 5-core GPU with 50% more memory bandwidth, and a new 16-core Neural Engine capable of nearly 17 trillion operations per second.

Many of the new features of the iPhone 14 Pro were enabled by iOS 16, released along with the entire iPhone 14 line.

This iPhone 14 Pro example is Deep Purple.

Sources: Apple (iPhone 14 Pro, Newsroom)