Apple rolled up its sleeves and jumped into a whole new world with the Vision Pro mixed reality headset, which it introduced on June 5. While the device is not expected to be released until next year, whether Apple succeeds will, of course, depend primarily on developers and then on users.
Today, Apple released the first developer tools that will make it possible to build applications and games unlike anything users have seen before. The release includes the software development kit (SDK) for the visionOS operating system, along with updated versions of Xcode, the Simulator, and Reality Composer Pro.
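Apple has not shared sample apps alongside this announcement, but visionOS apps use the same SwiftUI app life cycle as Apple's other platforms. A minimal sketch of what a "hello world" visionOS app could look like (the app and view names here are illustrative, not from Apple):

```swift
import SwiftUI

// Illustrative names: a minimal visionOS app declared with the
// standard SwiftUI app life cycle, as on iOS and macOS.
@main
struct HelloVisionApp: App {
    var body: some Scene {
        // On visionOS, a WindowGroup is presented as a floating
        // window in the user's space.
        WindowGroup {
            ContentView()
        }
    }
}

struct ContentView: View {
    var body: some View {
        Text("Hello, visionOS!")
            .font(.largeTitle)
            .padding()
    }
}
```

Such a project can be built and run in the updated Xcode against the visionOS Simulator, which is how the screenshots below were produced.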
With the release of this first kit, previously unseen details about the Vision Pro began to emerge:
With the release of the SDK and developer tools, developers also began to roll up their sleeves. The first images from the simulator, showing on a computer screen what we will see through Vision Pro, were also shared.
These screenshots revealed the basics of the ecosystem Apple has built for Vision Pro, and new details emerged beyond what we saw in the promotional video.
First, let’s see what pure visionOS looks like:
Steve Moser from MacRumors and Ian Zelbo from 9to5Mac shared detailed looks at the visionOS environment in the simulator. When users put on Vision Pro at home, this is roughly what they will see:
Of course, the background can also be changed in the simulator to preview how the applications will look in different environments:
Currently, only the kitchen, museum, and living room environments can be used, while 13 more environments sit in the operating system waiting to be activated:
- Mount Haleakala
- Yosemite National Park
- Sky
- Spring Light
- Joshua Tree National Park
- Lake Vrangla
- Mount Hood
- Summer Light
- Autumn Light
- Moon
- Beach
- Snow
- Winter Light
In addition to the appearance of the operating system, some of its features began to surface:
The simulator environment and the look of the base operating system were not the only things the SDK revealed. Some features that developers will be able to take advantage of in Vision Pro, which were not shown in the promotional video, also came to light.
One of them is a feature called “Visual Search”. Broadly similar to Visual Look Up on iPhones and iPads, it will allow Vision Pro to detect and recognize objects and text in the user’s surroundings.
In addition, thanks to this feature, real-world text can be copied and pasted into any running application. These texts can also be translated in real time into 17 different languages. We can think of it as being like the translation feature in Google’s Lens app.
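Apple has not said how Vision Pro’s text recognition works internally, but on its other platforms comparable functionality is exposed through the Vision framework. A hedged sketch using that existing API (the function name is illustrative):

```swift
import CoreGraphics
import Vision

// Illustrative sketch: recognize text in an image using Apple's
// Vision framework, the same kind of pipeline that could feed
// copy/paste or live translation of real-world text.
func recognizeText(in cgImage: CGImage) {
    let request = VNRecognizeTextRequest { request, _ in
        guard let observations = request.results as? [VNRecognizedTextObservation] else { return }
        // Each observation carries ranked candidate strings.
        for observation in observations {
            if let best = observation.topCandidates(1).first {
                print(best.string) // text that could then be copied or translated
            }
        }
    }
    request.recognitionLevel = .accurate

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}
```

Whether visionOS uses this exact framework under the hood is an assumption; the sketch only shows how such recognition is done on iOS and macOS today.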
We can expect more as-yet-unknown features to emerge as developers spend more time with visionOS and the SDK.