Among the 12 press releases for the “base” Apple Watch models (the original Apple Watch through the Series 11), 7 mention health and/or fitness features in the headline or subhead. The Apple Watch Series 11 press release leads with “groundbreaking health insights” and specifies hypertension notifications and sleep score as its primary new features, along with battery improvements and a more scratch-resistant display.
Stan Ng, Apple’s vice president of Apple Watch and Health Product Marketing, said in the press release that “Apple Watch is the world’s most popular watch, using advanced sensing capabilities to empower millions of people around the world to better understand their health simply by wearing it, while also serving as a fitness coach, message center, mobile wallet, and beautiful timepiece.” He added that the “Apple Watch Series 11 is an indispensable companion that supports users’ health, fitness, safety, and connectivity throughout the day and night.”
Apple described the durability of the screen in more detail, noting that it is “Made from a unique Ion-X (ion-exchanged strengthened) glass…treated with a breakthrough Apple-designed ceramic coating that bonds to the glass at an atomic level through a physical vapor deposition process, significantly hardening the surface.”
The Apple Watch Series 11 was released with watchOS 26, which uses the Liquid Glass design available across Apple’s devices as of late 2025. New features included:
2 new watch faces (Flow and Exactograph)
A new one-handed wrist flick gesture to easily dismiss notifications
Live Translation in Messages with Apple Intelligence
A Watch version of the Notes app (finally!!)
The Aluminum case options for Apple Watch Series 11 were available in Rose Gold, Silver, Jet Black, and a new Space Gray option. Titanium case options included Gold, Natural, and Slate.
The 46 mm model measured 46 mm x 39 mm and was 9.7 mm thick. The aluminum/GPS model weighed 37.8 grams.
The screen was 416 x 496 pixels at 326 pixels per inch, with an Always-On Retina display using wide-angle OLEDs. It offered a peak brightness of up to 2,000 nits and a minimum brightness of 1 nit.
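As an illustrative sanity check (my own arithmetic, not an Apple specification), the physical display area implied by the stated resolution and pixel density lines up with the 46 mm x 39 mm case dimensions:

```python
# Illustrative sanity check: physical display size implied by the
# 46 mm Series 11 resolution (416 x 496 px) and density (326 ppi).
MM_PER_INCH = 25.4

width_px, height_px, ppi = 416, 496, 326
width_mm = width_px / ppi * MM_PER_INCH    # ~32.4 mm
height_mm = height_px / ppi * MM_PER_INCH  # ~38.6 mm

print(f"{width_mm:.1f} mm x {height_mm:.1f} mm")
```

The computed active area (roughly 32 mm x 39 mm) fits comfortably within the 46 mm x 39 mm case once bezels are accounted for.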
The Series 11 had an impressive list of sensors:
Electrical heart sensor
Third‑generation optical heart sensor
Blood oxygen sensor
Temperature sensor
Compass
Always‑on altimeter
High‑g accelerometer
High dynamic range gyroscope
Ambient light sensor
Depth gauge to 6 meters
Water temperature sensor
Connectivity options included 2.4GHz and 5GHz Wi-Fi networks with Wi-Fi 4 (802.11n), Bluetooth 5.3, and satellite positioning via GPS (L1), GLONASS, Galileo, QZSS, and BeiDou. It also used a second-generation Ultra Wideband chip.
This example is in the Aluminum Space Gray option in the 46 mm size. It shipped with a black M/L Sport Band.
On March 4, 2025, when Apple introduced the iPad Air with the M3 chip, they also announced the iPad Generation 11 with an A16 chip. The entire press release for this iPad device release amounted to one paragraph:
“Apple today also updated iPad with double the starting storage and the A16 chip, bringing even more value to customers. The A16 chip provides a jump in performance for everyday tasks and experiences in iPadOS, while still providing all-day battery life. Compared to the previous generation, the updated iPad with A16 is nearly 30 percent faster. In fact, compared to iPad with A13 Bionic, users will see up to a 50 percent improvement in overall performance, and A16 makes the updated iPad up to 6x faster than the best-selling Android tablet.”
This iPad update was released soon after the announcement of Apple Intelligence. While Apple was touting Apple Intelligence and updating its devices to take advantage of the new features, the iPad Generation 11 received a chip that does not support Apple Intelligence. Although this updated iPad is faster than the previous Generation 10 model, it falls short of the A17 Pro chip that Apple Intelligence requires at minimum. Apple provided no explanation for this omission.
One notable upgrade on the A16 iPad was storage. The iPad Generation 11 doubled its base storage from 64GB to 128GB and was also available in 256GB and 512GB configurations.
Except for the A16 chip and upgraded storage, this iPad was nearly identical to the iPad Generation 10.
The iPad Generation 11 measured 9.79 x 7.07 inches, and was 0.28 inch thick. It weighed 1.05 pounds (Wi-Fi model). It had stereo speakers, a front camera and microphone centered on the landscape (long) side to enhance the FaceTime experience, and included both USB-C and a Smart Connector.
Its Liquid Retina display was 2360 x 1640 pixels at 264 ppi and delivered up to 500 nits brightness. It supported the Apple Pencil (USB-C) and also the Apple Pencil Generation 1 (although clumsily since the original Apple Pencil used a Lightning connector for charging).
The back camera was 12MP with digital zoom up to 5x supporting Smart HDR 4, geotagging, auto image stabilization, and burst mode. It could capture video up to 4K. The similar front camera was also 12MP and supported Center Stage in Landscape mode. It supported Smart HDR 4, but only could record 1080p HD video.
The iPad Generation 11 had five sensors: Touch ID, a 3-axis gyro, an accelerometer, a barometer, and an ambient light sensor.
Apple described the original Apple Vision Pro as “a revolutionary spatial computer that seamlessly blends digital content with the physical world, while allowing users to stay present and connected to others.” While other companies at the time were producing “Augmented Reality” and “Virtual Reality” headsets and glasses, Apple chose to forgo the AR/VR descriptions entirely and use the term “Spatial Computing.”
Apple did not invent the term or concept of spatial computing. The term “Spatial Computing,” in the context used by Apple Vision Pro, is attributed to MIT researcher Simon Greenwold and is the title of a paper he wrote in 2003. Greenwold’s paper defined the term as “human interaction with a machine in which the machine retains and manipulates referents to real objects and spaces.” He added, “Ideally, these real objects and spaces have prior significance to the user.”
Apple does, however, claim that they created the world’s first spatial operating system, visionOS. Apple Vision Pro works with visionOS to let “users interact with digital content in a way that feels like it is physically present in their space.” A FastCompany article explains the differences among AR, VR, and spatial computing by noting that the Vision Pro has:
“12 cameras and five sensors that help the device know everything from the level of light in a physical space to where objects are in relation to each other, to where your eyes are looking, and how your hands move… In spatial computing, you can interact with those virtual objects by simply using your hands in the physical space in front of you.”
By contrast, in virtual reality “you are completely immersed in a virtual world and can see none of the real world around you,” while augmented reality “displays virtual elements on top of the real world.” The three terms are related because spatial computing uses elements from both AR and VR.
Apple described the “breakthrough design” of the Vision Pro as featuring “an ultra-high-resolution display system that packs 23 million pixels across two displays, and custom Apple silicon in a unique dual-chip design to ensure every experience feels like it’s taking place in front of the user’s eyes in real time.” Mike Rockwell, Apple’s Vice President of the Technology Development Group said that “through a tight integration of hardware and software, we designed a standalone spatial computer in a compact wearable form factor that is the most advanced personal electronics device ever.”
The Apple Vision Pro “can transform any space into a personal movie theater with a screen that feels 100 feet wide.” Internally, the seamless display is accomplished by delivering “more pixels than a 4K display” to each eye.
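As a back-of-the-envelope check (my own arithmetic, based only on the 23-million-pixel figure Apple stated), the per-eye pixel count does indeed exceed a standard 4K UHD panel:

```python
# Back-of-the-envelope check of Apple's "more pixels than a 4K display"
# per-eye claim, using the stated 23 million total pixels.
total_pixels = 23_000_000
per_eye = total_pixels // 2   # 11,500,000 pixels per eye

uhd_4k = 3840 * 2160          # 4K UHD panel: 8,294,400 pixels

print(per_eye > uhd_4k)       # each eye has ~39% more pixels than 4K UHD
```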
Adding to the visual realism, the Apple Vision Pro also includes a new Spatial Audio system built around what Apple calls “audio pods.” Apple describes the sound system:
“Dual-driver audio pods positioned next to each ear deliver personalized sound while letting you hear what’s around you. Spatial Audio makes sounds feel like they’re coming from your surroundings. Audio ray tracing analyzes your room’s acoustic properties to adapt and match sound to your space.” I have observed that first-time Vision Pro users are often surprised by the audio experience delivered by the audio pods and ask if others around them can hear the audio. (Others in the room can faintly hear the audio at a low volume level, even if the Vision Pro user has the volume at maximum.)
The Apple Vision Pro is also packed with cameras and sensors that all work together to deliver the overall experience, including:
2 high‑resolution main cameras
6 world‑facing tracking cameras
4 internal eye‑tracking cameras
TrueDepth camera
LiDAR Scanner
4 inertial measurement units (IMUs)
Flicker sensor
Ambient light sensor
Apple described the sensor functionality: “high-resolution cameras transmit over one billion pixels per second to the displays so you can see the world around you clearly. The system also helps deliver precise head and hand tracking and real‑time 3D mapping, all while understanding your hand gestures from a wide range of positions.” Similar to an augmented reality experience, Vision Pro users see the world through live “passthrough” video, and not through a transparent lens.
The original Apple Vision Pro was powered by two chips. Apple’s M2 chip provided an 8‑core CPU with 4 performance cores and 4 efficiency cores, a 10‑core GPU, a 16‑core Neural Engine, and 16 GB unified memory. The Apple R1 chip allowed 12‑millisecond photon‑to‑photon latency using 256 GB/s memory bandwidth.
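To put the R1 figures in perspective, here is a rough, illustrative calculation. The 90 Hz refresh rate used for comparison is my assumption for illustration; Apple’s stated figures are only the 12 ms latency and 256 GB/s bandwidth:

```python
# Rough perspective on the R1 chip figures. The 90 Hz refresh rate used
# for comparison is an assumption, not part of Apple's stated specs.
latency_s = 0.012                 # 12 ms photon-to-photon latency
bandwidth = 256e9                 # 256 GB/s memory bandwidth

bytes_per_window = bandwidth * latency_s   # ~3.07e9 bytes (~3 GB) per window
frame_time_s = 1 / 90                      # ~11.1 ms per frame at 90 Hz

# The stated latency is barely longer than a single 90 Hz frame:
print(latency_s / frame_time_s)            # ~1.08 frames
```

In other words, under this assumption the sensor-to-display pipeline completes in about the time it takes to draw a single frame.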
In addition to the sensor cameras, the Apple Vision Pro could capture Spatial photos and video using the company’s first stereoscopic 3D main camera system. The 18 mm cameras used a ƒ/2.00 aperture and could capture 6.5 stereo megapixels. At the Apple Vision Pro’s release, the iPhone 15 Pro and iPhone 15 Pro Max could also capture Spatial video using two of their cameras, and all iPhone 16 models later added the capability (single-camera iPhones cannot capture Spatial video).
Inputs built into the Apple Vision Pro included hand, eye, and voice. Supported input accessories included keyboards, trackpads, game controllers, and Bluetooth mice, as well as third-party accessories such as the Logitech Muse pen (not released until 2025).
The Apple Vision Pro used a battery pack that delivered “up to 2 hours of general use” or up to 2.5 hours while watching videos. However, the device could also be used with the USB-C port plugged into power while charging the battery.
The Apple Vision Pro shipped with many accessories and custom-sized parts compared to Apple’s other devices. The following accessories were included with each Apple Vision Pro:
Light Seal
Light Seal Cushions (2 sizes)
Solo Knit Band
Dual Loop Band
Battery pack
Cover
30W USB-C Power Adapter
USB-C Charge Cable
Several of the parts and accessories that shipped with the Apple Vision Pro were impressive design innovations on their own, even if they were not often mentioned in reviews—or even by Apple. Some examples from my perspective included:
Light Seal—The light seal came in multiple sizes that were matched to the user through a custom app that scanned a user’s face to calculate the appropriate size. The light seal attached magnetically to the main body of the Apple Vision Pro.
Light Seal Cushions—The light seal cushion was also sized for the user and attached with magnets to the light seal to provide a custom fit so light would not “leak” into the space around the eyes.
Dual Loop Band and Solo Knit Band—The two bands that shipped each represented impressive engineering and design to fit the 22.9 ounce (1.43 pounds) device to the head and provide relative comfort and support during use. The Dual Loop Band provided a 2-strap system that supported the device around the back and over the top of the head with adjustable velcro closures. The Solo Knit Band was a single thicker band that was “3D knitted as a single piece to create a unique rib structure that provides cushioning, breathability, and stretch. It has an easy-to-reach Fit Dial to let you adjust Apple Vision Pro to your head and enables microadjustments during use.” I personally prefer the Solo Knit Band.
Further, the Solo Knit Band was noted by journalists and reviewers as looking fashionable, especially compared to the utilitarian straps provided by other AR/VR headsets. One 9to5Mac author noted, “I just think the Solo Knit Band looks cooler, and comfort just hasn’t been an issue for me.”
Cover—Even the lowly knit cover was an impressive piece of design in my opinion. The cover itself had knit edges, but allowed the Apple Vision Pro device to be effortlessly lowered into the accessory with a perfect fit that fully protected the glass front. Tabs on the edges also allowed it to be easily removed.
ZEISS Optical Inserts—For those of us who require vision correction and do not wear contact lenses, Prescription ZEISS Optical Inserts were available to be custom-made to an exact prescription. The inserts easily snapped in with magnets and were “recognized” by an Apple Vision Pro device by selecting the user’s account settings.
Although this entry is not intended as a review of the Apple Vision Pro, as a user I can attest that the device is extremely difficult to describe to someone who has not used it first-hand. In my experience, the device and visionOS functioned seamlessly from the original visionOS through visionOS 2. In my Apple-user-experience lifetime (since the early 1980s), I have never experienced a more mature operating system for a brand-new device—especially one with so many brand-new user interface elements.
After a lifetime of keyboard typing, mouse clicking, and most recently touch-based interfaces, the Apple Vision Pro required users to make the leap to looking at virtual interface elements (through eye tracking) and interacting through hand gestures (pinches, pulls, and two-hand pinch/pull motions). Having coached about 50 first-time users through using the Apple Vision Pro as of this writing, I have observed that every user was able to understand these UI paradigms within the first 5–10 minutes of using the device (most adapted more quickly).
Finally, I wrote a series of education-focused articles about my first impressions of the Apple Vision Pro after the device was first released. They are available on a separate blog at Blogger:
The iPad 2 represented a major update to the original iPad by allowing the iPad to begin its move from a content-consumption device to a content-creation device, mostly due to the addition of front and back cameras. Apple’s press release led with its subhead, “All New Design is Thinner, Lighter & Faster with FaceTime, Smart Covers & 10 Hour Battery.”
Like the original iPad, the iPad 2 was described as a “magical device for browsing the web, reading and sending email, enjoying photos, watching videos, listening to music, playing games, reading ebooks and much more.” The iPad 2 added “two cameras, a front-facing VGA camera for FaceTime and Photo Booth, and a rear-facing camera that captures 720p HD video, bringing the innovative FaceTime feature to iPad users for the first time.”
The iPad 2 had a silver aluminum back and was available with a white or black front. This example is white.
The iPad 2 had a 9.7-inch glossy LED backlit display (1024×768 at 132 ppi) and could run both iPhone and iPad-specific apps. It shipped with the A5 processor with storage options including 16, 32, or 64 GB. In addition to its front and rear cameras, it had 802.11a/b/g/n Wi-Fi support, an accelerometer, a three-axis gyroscope, an ambient light sensor, digital compass, a speaker and a built-in microphone. The iPad 2 was 33% thinner than the original iPad and weighed 1.33 pounds.
The iPad 2 was also released with the Smart Cover. The Smart Cover used magnets to attach and, when closed, automatically put the iPad 2 into Sleep mode, and would wake the iPad when opened.
On September 12, 2023, Apple released an updated Apple Watch Ultra model, the Apple Watch Ultra 2. Apple described the upgrade:
“Apple’s most rugged and capable watch is now even better with performance updates, a new double tap gesture, and carbon neutral options.”
The Apple Watch Ultra 2 was similar to the original Apple Watch Ultra, but added a “powerful new S9 SiP” processor, 64GB of storage, a brighter display (3,000 nits), and other enhancements including “expanded altitude range, on-device Siri, Precision Finding for iPhone, and advanced capabilities for water adventures.” It also included a 4-core Neural Engine that could “process machine learning tasks up to twice as fast as the original Apple Watch Ultra.”
One year after the release of the Apple Watch Ultra 2, Apple introduced a black titanium option. The new color was released along with watchOS 11 on September 9, 2024:
“Apple today introduced Apple Watch Ultra 2 in a striking new black titanium finish, enhanced with features in watchOS 11 that make the most rugged and capable Apple Watch even better.”
Apple described the color and manufacturing process:
“The new black titanium finish for Apple Watch Ultra 2 is achieved with a custom blasting process, and the diamond-like carbon physical vapor deposition coating over the grade 5 titanium makes it scratch-resistant and durable. The back crystal is made from a matching, dark zirconia.”
The new black color was also made available on the titanium hardware and other band materials:
“To complement the new black finish, the popular Trail Loop, Alpine Loop, and Ocean Band have all been updated to offer a black hardware option in addition to natural titanium. Both black and natural finishes of Apple Watch Ultra 2 are made from 95 percent recycled titanium, and are carbon neutral with any Titanium Milanese Loop, Trail Loop, or Alpine Loop.”
This Apple Watch Ultra 2 shipped with the new Black Trail Loop with black titanium hardware that matched the black titanium case of the watch.
The Apple Watch Ultra 2 was 49 mm high, 44 mm wide, and 14.4 mm deep. The display was 410 x 502 pixels (326 pixels per inch) with an always-on Retina LTPO2 OLED display in a flat sapphire crystal. The natural titanium version weighed 61.4 grams, and the black titanium version weighed 61.8 grams. The Apple Watch Ultra 2 contained 10 sensors, including an electrical heart sensor, optical heart sensor, temperature sensor, depth gauge (±1m accuracy), water temperature sensor, compass, always-on altimeter, high-G accelerometer, high dynamic range gyroscope, and an ambient light sensor.
The Apple Watch Ultra 2 also advanced Apple’s carbon neutral initiative “to be carbon neutral across its entire business, manufacturing supply chain, and product life cycle by 2030.”
The iPad Generation 6 was considered Apple’s “base” iPad when it was released on March 27, 2018. It was offered in Silver, Gold, and Space Gray. It was available in 32GB and 128GB configurations with Wi-Fi-only or with Wi-Fi+Cellular capabilities. This example is a 32GB Space Gray Wi-Fi-only model.
This iPad was announced in Chicago at an education-focused event at Lane Tech High School. An Apple Press Release stated:
“The new 9.7-inch iPad and Apple Pencil give users the ability to be even more creative and productive, from sketching ideas and jotting down handwritten notes to marking up screenshots. The new iPad is more versatile and capable than ever, features a large Retina display, the A10 Fusion chip and advanced sensors that help deliver immersive augmented reality, and provides unmatched portability, ease of use and all-day battery life.”
The iPad Generation 6 used a 9.7-inch LED-backlit Multi-Touch Retina display at 2048 x 1536-pixel resolution (264 ppi). It measured 9.4 inches (240 mm) x 6.6 inches (169.5 mm), was 0.29 inch (7.5 mm) thick, and weighed 1.03 pounds (469 g). It was powered by the A10 Fusion chip.
The back camera was 8 megapixels with features such as autofocus, Panorama (up to 43 megapixels), and HDR. The front FaceTime HD camera was 1.2 megapixels.
This iPad had five sensors: a 3-axis gyro, accelerometer, barometer, ambient light sensor, and the Touch ID fingerprint identity sensor built into its Home button.
This was the first base-model iPad to support the Apple Pencil and the Logitech Crayon, and it originally shipped with iOS 12.