Since the Apple Vision Pro isn’t set to launch until 2024, there’s a lot we don’t know about it. Apple gave us a rough overview of the hardware inside, but didn’t go into detail about the various components or specs.

What the headset weighs is unknown, for example, and Apple has not provided a full breakdown of the cameras and sensors inside. For those who are interested, we've compiled the limited information we do know about the headset's hardware.


  • There is a 3D laminated glass screen attached to a curved aluminum alloy frame.
  • There is a thermal system that quietly moves air through the Vision Pro to sustain performance and keep heat down.


  • There are two custom micro-OLED displays that deliver “more pixels than a 4K TV” to each eye (23 million pixels total). Apple says each display is about the size of a postage stamp.
  • There’s a three-element lens that makes the screen appear to be “everywhere you look.”
  • An external “EyeSight” display shows your eyes to people nearby when you’re in a less immersive augmented reality mode, or lets them know that you’re fully immersed and unaware of your surroundings. The screen also displays a recording indicator when you’re capturing video with the camera.
  • There’s a 90Hz refresh rate, with a special 96Hz refresh rate available for 24fps video.
  • For those who wear glasses, Zeiss Optical Inserts magnetically attach prescription lenses to the lenses inside the headset.
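As a quick sanity check on the display claims above, the arithmetic works out. This sketch assumes "4K TV" means a standard 3840×2160 UHD panel, which the article does not specify:

```python
# Rough sanity check on the display figures quoted above.
# Assumption: "4K TV" refers to a 3840x2160 UHD panel.
total_pixels = 23_000_000      # "23 million total" across both micro-OLED displays
per_eye = total_pixels // 2    # pixels per eye
four_k_tv = 3840 * 2160        # pixels in a UHD 4K panel (8,294,400)

assert per_eye > four_k_tv     # "more pixels than a 4K TV" per eye holds

# Why a dedicated 96Hz mode exists for 24fps film: 96 is an even
# multiple of 24, so each film frame is shown for exactly 4 refreshes,
# while 90Hz would require uneven frame pacing (judder).
assert 96 % 24 == 0 and 90 % 24 != 0
print(per_eye, four_k_tv, 96 // 24)
```

The 96Hz mode mirrors what some TVs do with 120Hz panels for film content: pick a refresh rate that divides evenly by 24.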


  • The headset has 12 cameras and five sensors for tracking hand gestures and mapping the environment. There have been rumors that the headset will also track leg movements, but Apple has not confirmed this. Two of the cameras send over one billion pixels per second to the displays to clearly depict the world around you, while the remaining cameras handle head tracking, hand tracking, and real-time 3D mapping. Infrared flood illuminators improve hand tracking in low-light conditions.
  • There is a camera that can take 3D photos and videos.
  • LiDAR depth sensors can determine the size and location of objects in the room around you, and they can even scan your face to create a digital avatar for use in FaceTime.
  • Inside the headset are four infrared cameras and LED illuminators that shine invisible light patterns onto each eye. The system uses iris scanning for authentication and for precise eye tracking, so the headset knows where you’re looking for navigation purposes.

Straps and bands

  • The audio straps on each side house speakers that support Spatial Audio, described by Apple as dual-driver audio pods positioned next to each ear. They can analyze the acoustic properties of a room to adapt the sound to fit the space. Six microphones are included as well.
  • There is a detachable braided headband that has a convenient adjustment dial.
  • The magnetic light seal, which comes in several shapes and sizes, secures the headset to the face and blocks light.


  • The Digital Crown at the top right of the headset brings up Home View when pressed and controls the immersion level when turned. Immersion applies when using “Environments,” virtual reality scenes where you can do things like watch TV. You can opt for a fully virtual environment with a larger-than-life screen, or keep the show you’re watching superimposed over your real room to minimize immersion.
  • The button on the top left can be used to capture spatial videos and spatial photos.


  • There is a single braided power cable that attaches magnetically to the left side of the headset. The cable can be connected to a power source or to an external battery for use on the go.
  • The headset is powered by the external battery, so there is no need to be tethered to a wall outlet while using it. The battery is designed to fit in a pocket.


Apple Vision Pro chips

  • There are two chips inside the ‌Apple Vision Pro‌. The M2, the same chip found in the Mac lineup, is the main processor: it runs visionOS, executes computer vision algorithms, and renders graphical content.
  • A second chip, the R1, processes all the input from the cameras, sensors, and microphones. It can stream new images to the displays within 12 milliseconds, which Apple says delivers a virtually lag-free view of the world.
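To put that 12-millisecond figure in context, it can be compared against the duration of a single refresh at the displays' 90Hz rate. The interpretation that the latency spans the full sensor-to-display pipeline is an assumption; Apple doesn't break the figure down:

```python
# Comparing the R1's quoted 12 ms latency to one display refresh at 90Hz.
# Assumption: the 12 ms covers the whole camera-to-display pipeline.
frame_time_ms = 1000 / 90   # duration of one refresh at 90Hz (~11.1 ms)
r1_latency_ms = 12          # "streams new images ... within 12 milliseconds"

print(round(frame_time_ms, 1))
# The passthrough pipeline completes in roughly one frame's worth of time,
# which is consistent with Apple describing the view as effectively lag-free.
assert r1_latency_ms < 2 * frame_time_ms
```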

As we get closer to launch, we’ll likely learn more about the headset’s capabilities and the components Apple is using, but we may have to wait until it’s available for a full teardown and a peek inside.
