Apple has released the first developer tools for Vision Pro. With them, the visionOS development environment and some features that were not shown at the device's introduction have begun to surface.
Apple is diving into a whole new world with the Vision Pro mixed-reality headset, which it introduced on June 5. The device is expected to go on sale next year, and whether Apple succeeds will depend first on developers and then on users.
Today, Apple released the first tools that will let developers build apps and games unlike anything users have seen before: the visionOS software development kit (SDK), an updated Xcode, the Simulator, and Reality Composer Pro.
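As a rough illustration (not from the report itself), getting started with the new SDK looks much like ordinary SwiftUI development: a minimal sketch, assuming the standard `@main` / `WindowGroup` entry point, would open a single floating window in the Simulator's simulated room. The app and view names here are made up for the example.

```swift
import SwiftUI

// Minimal visionOS app sketch: with the visionOS SDK selected in Xcode,
// a plain SwiftUI WindowGroup appears as a floating window inside the
// Simulator's virtual environment (kitchen, museum, living room, etc.).
@main
struct HelloVisionApp: App {
    var body: some Scene {
        WindowGroup {
            Text("Hello, visionOS")
                .font(.largeTitle)
                .padding()
        }
    }
}
```

Running this in the Simulator is how the screenshots described below were produced: the window is rendered against one of the selectable background environments.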
With the release of this first kit, previously unseen details about Vision Pro started to emerge:
With the SDK and developer tools available, developers have rolled up their sleeves as well. The first images from the Simulator, showing on a computer screen what we will see through Vision Pro, have also been shared.
These screenshots revealed the basics of the ecosystem Apple has built for Vision Pro, and new details emerged on top of what we saw in the promotional video.
First, let’s see what stock visionOS looks like:
Steve Moser of MacRumors and Ian Zelbo of 9to5Mac shared detailed looks at the visionOS environment in the Simulator. When users put on Vision Pro at home, this is roughly what they will see:
Of course, the background in the Simulator can also be changed to preview how apps will look in different environments:
Currently, only the kitchen, museum, and living room environments can be used, while 13 more environments sit in the operating system waiting to be activated:
- Mount Haleakala
- Yosemite National Park
- Sky
- Spring Light
- Joshua Tree National Park
- Lake Vrangla
- Mount Hood
- Summer Light
- Autumn Light
- Moon
- Beach
- Snow
- Winter Light
Beyond the operating system’s appearance, some features have also begun to show themselves:
The Simulator environments and the look of the base operating system were not all the SDK revealed. Some Vision Pro features that developers can tap into, and that were not shown in the promotional video, also came to light.
One of them was a feature called “Visual Search.” Essentially similar to Visual Look Up on iPhone and iPad, it will let Vision Pro detect and recognize objects.
In addition, thanks to this feature, real-world text can be copied and pasted into any running application. That text can also be translated in real time across 17 different languages. We can think of it like the translation feature in Google’s Lens app.
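The report does not say which API backs Visual Search, but Apple’s existing Vision framework already exposes text recognition of this kind, so a hedged sketch of the general shape might look like the following. The function name and the `cgImage` input (standing in for a camera frame) are assumptions for illustration.

```swift
import Vision

// Sketch: recognizing real-world text with Apple's Vision framework,
// the kind of capability a feature like Visual Search would build on.
// `cgImage` is a hypothetical stand-in for a captured camera frame.
func recognizeText(in cgImage: CGImage) throws -> [String] {
    let request = VNRecognizeTextRequest()
    request.recognitionLevel = .accurate   // favor accuracy over speed

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try handler.perform([request])

    // Take the top candidate string from each detected text region.
    return request.results?.compactMap {
        $0.topCandidates(1).first?.string
    } ?? []
}
```

Strings recovered this way could then be handed to the system pasteboard or a translation service, which matches the copy-paste and live-translation behavior described above.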
We can expect more as-yet-unknown features to emerge as developers spend more time with visionOS and the SDK.