EasyAR realizes spatial computing: Sightplus AR to release its most powerful EasyAR engine yet
Recently, the Augmented Reality (AR) open platform Sightplus AR announced the release of EasyAR SDK 3.0 and EasyAR SDK 4.0 Beta on its official WeChat account. The new EasyAR SDK engine launches with more than a dozen new features, including AR cloud-based computing and surface tracking, and will bring new AR experiences such as shared AR, persistence, occlusion, and collision.
AR cloud-based spatial computing makes the AR experience more realistic
According to the official introduction, EasyAR SDK 4.0 Beta is an upgraded version of EasyAR SDK 3.0. Built on the 3.0 foundation, it adds Sightplus AR's newest AR Cloud capabilities. Developers can experience new AR effects such as multi-user AR interaction, persistent AR experiences, collision, and occlusion, all enabled by EasyAR's latest spatial computing capabilities. EasyAR aims to change the long-standing situation in which AR content could neither persist nor integrate with the physical world. It also fills the gap left by Google ARCore's Cloud Anchors, which are unavailable in China.
In EasyAR 4.0 Beta, the AR cloud capabilities comprise Sparse Spatial Map, Dense Spatial Map, and Motion Tracking.
Sparse Spatial Map generates point cloud maps for real-time localization while scanning a physical space. Developers can efficiently create applications grounded in real-world space, such as AR manuals and AR navigation. Virtual content deployed on a point cloud map also persists in real space, connecting the virtual and physical worlds, and multi-user sharing is built on the same capability.
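To make the persistence idea concrete, here is a minimal sketch in Java of how content saved in map coordinates can be restored in a later session, assuming the engine reports a map-to-world transform after relocalization. All names, the transform convention, and the workflow here are illustrative assumptions, not the EasyAR API.

```java
// A minimal sketch of persistence on a sparse spatial map, assuming the SDK
// reports a map-to-world transform after relocalization. All class and
// method names here are hypothetical, not the EasyAR API.
import java.util.HashMap;
import java.util.Map;

public class SparseMapPersistenceSketch {
    /** Multiplies two 4x4 column-major rigid transforms stored as flat arrays. */
    static float[] multiply(float[] a, float[] b) {
        float[] out = new float[16];
        for (int col = 0; col < 4; col++)
            for (int row = 0; row < 4; row++) {
                float sum = 0;
                for (int k = 0; k < 4; k++) sum += a[k * 4 + row] * b[col * 4 + k];
                out[col * 4 + row] = sum;
            }
        return out;
    }

    static float[] identity() {
        float[] m = new float[16];
        m[0] = m[5] = m[10] = m[15] = 1;
        return m;
    }

    // Virtual content is saved in *map* coordinates, so it survives app restarts.
    static final Map<String, float[]> savedAnchors = new HashMap<>();

    public static void main(String[] args) {
        // Session 1: the user places content; we persist its pose in map space.
        savedAnchors.put("ar-manual-step-1", identity());

        // Session 2: after the device relocalizes against the same sparse map,
        // the engine supplies mapToWorld; restore content in world coordinates.
        float[] mapToWorld = identity(); // placeholder: would come from tracking
        float[] worldPose = multiply(mapToWorld, savedAnchors.get("ar-manual-step-1"));
        System.out.println("Restored anchor, world-space x offset: " + worldPose[12]);
    }
}
```

Because every client that relocalizes against the same map recovers the same map-to-world relationship, the same mechanism also explains how multi-user sharing can work.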
To achieve a more realistic AR experience in which virtual content collides and interacts with the physical world, EasyAR SDK 4.0 Beta also supports Dense Spatial Map, which reconstructs the environment in real time. This enables effects such as collision and occlusion without requiring a ToF camera.
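The sketch below illustrates why a dense mesh is what makes collision possible: once the environment is a triangle mesh, a virtual object's motion can be tested against it with ordinary ray-triangle intersection. The single hard-coded triangle stands in for reconstructed geometry; this shows the general technique, not EasyAR's internals.

```java
// A minimal sketch of collision against a reconstructed mesh using the
// standard Moller-Trumbore ray-triangle intersection test. The triangle
// below is a stand-in for what Dense Spatial Map would reconstruct.
public class MeshCollisionSketch {
    /** Returns the ray parameter t of the hit, or -1 if the ray misses. */
    static float rayTriangle(float[] o, float[] d, float[] v0, float[] v1, float[] v2) {
        float[] e1 = sub(v1, v0), e2 = sub(v2, v0);
        float[] p = cross(d, e2);
        float det = dot(e1, p);
        if (Math.abs(det) < 1e-7f) return -1;          // ray parallel to triangle
        float inv = 1f / det;
        float[] t = sub(o, v0);
        float u = dot(t, p) * inv;
        if (u < 0 || u > 1) return -1;
        float[] q = cross(t, e1);
        float v = dot(d, q) * inv;
        if (v < 0 || u + v > 1) return -1;
        float dist = dot(e2, q) * inv;
        return dist >= 0 ? dist : -1;
    }

    static float[] sub(float[] a, float[] b) { return new float[]{a[0]-b[0], a[1]-b[1], a[2]-b[2]}; }
    static float dot(float[] a, float[] b) { return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]; }
    static float[] cross(float[] a, float[] b) {
        return new float[]{a[1]*b[2]-a[2]*b[1], a[2]*b[0]-a[0]*b[2], a[0]*b[1]-a[1]*b[0]};
    }

    public static void main(String[] args) {
        // One reconstructed floor triangle; a virtual ball falling straight down.
        float[] v0 = {0, 0, 0}, v1 = {1, 0, 0}, v2 = {0, 0, 1};
        float[] origin = {0.2f, 2f, 0.2f}, dir = {0, -1, 0};
        float t = rayTriangle(origin, dir, v0, v1, v2);
        System.out.println(t >= 0 ? "Ball hits floor after " + t + " m" : "No collision");
    }
}
```

Occlusion follows the same logic: if reconstructed geometry lies between the camera and the virtual object, the object's pixels are hidden.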
Furthermore, Motion Tracking uses multi-sensor fusion to solve for position and attitude, reducing the drift caused by camera motion and making virtual objects more stable in space. It also provides relocalization, so positioning can be recovered after tracking is lost. Apps that use motion tracking work independently on both iOS and Android: they require neither ARKit on iOS nor, for Android users, installing ARCore services through the Google services framework.
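As an illustration of the multi-sensor fusion principle (not EasyAR's proprietary implementation), the sketch below shows a complementary filter blending a gyroscope, which is smooth but drifts, with an accelerometer, which is noisy but drift-free, to keep an orientation estimate stable.

```java
// A minimal sketch of the sensor fusion idea behind motion tracking:
// a complementary filter blends a drifting gyroscope with a noisy but
// unbiased accelerometer to keep a pitch estimate stable over time.
public class SensorFusionSketch {
    public static void main(String[] args) {
        float dt = 0.01f;            // 100 Hz sensor rate
        float alpha = 0.98f;         // trust gyro short-term, accel long-term
        float pitch = 0f;            // fused pitch estimate, radians

        for (int step = 0; step < 500; step++) {
            // Simulated readings: the gyro has a constant bias (causes drift);
            // the accelerometer-derived pitch is unbiased but noisy around 0.
            float gyroRate = 0.02f;                               // rad/s, pure bias
            float accelPitch = (float) (0.005 * Math.sin(step));  // noise around 0

            // A gyro-only estimate would drift without bound; the accel term
            // continuously pulls the fused estimate back toward the true value.
            pitch = alpha * (pitch + gyroRate * dt) + (1 - alpha) * accelPitch;
        }
        System.out.printf("Fused pitch after 5 s: %.4f rad (gyro-only: %.4f)%n",
                pitch, 0.02f * 0.01f * 500);
    }
}
```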
Open-source Unity plugin and support for Android AR glasses
For a long time, many developers have built applications with Unity. To make their work easier and to support the open source community, EasyAR has decided to open-source its Unity3D plugin. The plugin includes an efficient Unity rendering pipeline, integration with Unity's built-in Video Player, an Android IL2CPP compiler interface, support for ARM64 compilation, and more. The plugin API is consistent with the other language bindings.
With the accelerated development of AR smart glasses this year, many AR glasses have been deployed in business (B-end) scenarios. EasyAR has recalibrated its image recognition and tracking parameters to adapt to Android-based glasses, so developers can build AR glasses applications with the EasyAR SDK. According to the announcement, the EasyAR SDK currently supports the Action One and EPSON BT-350 AR glasses; more devices, such as Microsoft HoloLens, will be supported in later versions.
Further, EasyAR SDK 3.0 adds Surface Tracking, designed specifically for lightweight AR interactive games, AR short-video capture, and product placement display. This function computes and tracks feature points on any surface. By eliminating the time needed to search for planes, it achieves surface fitting and pose tracking at higher speed, improving the user experience.
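To illustrate what "surface fitting" means here, the sketch below fits a local plane to a handful of tracked 3D feature points by least squares, so content could be attached without a separate plane-finding phase. This is a textbook illustration of the idea, not the SDK's internal algorithm.

```java
// A minimal sketch of surface fitting: given 3D feature points detected on
// an arbitrary surface, fit a local plane z = ax + by + c by least squares.
public class SurfaceFitSketch {
    /** Solves the 3x3 normal equations with Cramer's rule; returns {a, b, c}. */
    static double[] fitPlane(double[][] pts) {
        double sxx = 0, sxy = 0, sx = 0, syy = 0, sy = 0, n = pts.length;
        double sxz = 0, syz = 0, sz = 0;
        for (double[] p : pts) {
            sxx += p[0] * p[0]; sxy += p[0] * p[1]; sx += p[0];
            syy += p[1] * p[1]; sy += p[1];
            sxz += p[0] * p[2]; syz += p[1] * p[2]; sz += p[2];
        }
        double[][] m = {{sxx, sxy, sx}, {sxy, syy, sy}, {sx, sy, n}};
        double[] r = {sxz, syz, sz};
        double det = det3(m);
        double[] out = new double[3];
        for (int i = 0; i < 3; i++) {
            // Replace column i with the right-hand side (Cramer's rule).
            double[][] mi = {m[0].clone(), m[1].clone(), m[2].clone()};
            for (int row = 0; row < 3; row++) mi[row][i] = r[row];
            out[i] = det3(mi) / det;
        }
        return out;
    }

    static double det3(double[][] m) {
        return m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
             - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
             + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]);
    }

    public static void main(String[] args) {
        // Feature points lying on the plane z = 0.5x - 0.2y + 1.
        double[][] pts = {{0, 0, 1.0}, {1, 0, 1.5}, {0, 1, 0.8}, {1, 1, 1.3}, {2, 1, 2.8}};
        double[] abc = fitPlane(pts);
        System.out.printf("Fitted surface: z = %.2fx + %.2fy + %.2f%n",
                abc[0], abc[1], abc[2]);
    }
}
```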
In addition, EasyAR 3.0 also supports external camera access. This function performs AR recognition and tracking on image frames received from an external camera, so the AR display no longer depends on a phone's built-in cameras. As long as the device can detect the external camera and obtain its video stream, the stream can be converted into image frames and fed to the EasyAR SDK for use in an AR application. This helps EasyAR developers build applications for AR/VR/MR glasses, drones, and USB devices.
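The workflow described above amounts to a capture loop that decodes frames and pushes them into the engine. The sketch below shows that shape; `FrameSink` and `ExternalCamera` are hypothetical stand-ins for the SDK's actual input interfaces, which this article does not detail.

```java
// A minimal sketch of the external-camera workflow: any source that can
// deliver video frames (USB camera, drone, glasses) converts them into
// image frames and pushes them into the engine. The interface and class
// names are hypothetical, not the EasyAR API.
public class ExternalCameraSketch {
    /** What the AR engine conceptually needs: pixels plus format and timestamp. */
    interface FrameSink {
        void push(byte[] pixels, int width, int height, double timestampSec);
    }

    /** Stand-in for any device that exposes a video stream. */
    static class ExternalCamera {
        byte[] nextFrame() { return new byte[640 * 480]; } // e.g. a grayscale frame
    }

    public static void main(String[] args) {
        ExternalCamera camera = new ExternalCamera();
        FrameSink engineInput = (pixels, w, h, ts) ->
                System.out.printf("engine received %dx%d frame at t=%.3fs%n", w, h, ts);

        // The app's capture loop: decode the stream, feed frames to the engine.
        for (int i = 0; i < 3; i++) {
            byte[] frame = camera.nextFrame();
            engineInput.push(frame, 640, 480, i / 30.0);
        }
    }
}
```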
Support for external algorithms
EasyAR's new API lets developers plug in algorithms beyond the EasyAR SDK's built-in ones (image recognition and tracking, object recognition and tracking, etc.), such as face recognition and gesture recognition, providing more flexible extensibility. EasyAR 3.0 brings further improvements as well: the new SDK supports multi-threaded rendering and rendering APIs beyond GLES2, which addresses Apple's deprecation of OpenGL, and redundant functions such as QR code scanning will be removed to achieve a smaller package size.
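One plausible shape for such extensibility is a shared per-frame interface that built-in trackers and third-party algorithms both implement, so they run side by side on the same camera frames. The sketch below shows that pattern; the interface and class names are hypothetical illustrations, not the actual EasyAR API.

```java
// A minimal sketch of plugging an external algorithm into a frame pipeline:
// a built-in tracker and a custom detector (e.g., face recognition) share
// one per-frame interface. Names are hypothetical, not the EasyAR API.
import java.util.List;

public class ExternalAlgorithmSketch {
    interface FrameProcessor {
        void process(byte[] frame);                 // called once per camera frame
    }

    static class BuiltInImageTracker implements FrameProcessor {
        public void process(byte[] frame) { System.out.println("image tracking"); }
    }

    static class MyFaceDetector implements FrameProcessor {  // third-party algorithm
        public void process(byte[] frame) { System.out.println("face detection"); }
    }

    public static void main(String[] args) {
        List<FrameProcessor> pipeline =
                List.of(new BuiltInImageTracker(), new MyFaceDetector());
        byte[] frame = new byte[640 * 480];         // one camera frame
        for (FrameProcessor p : pipeline) p.process(frame);
    }
}
```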
Of course, EasyAR SDK 3.0 and EasyAR SDK 4.0 Beta may hold other features worth discussing that have not yet been fully revealed. EasyAR developers won't have to wait long to find out: the official release is only about a month away.