This summer, Christoph Haag and I had the pleasure of taking part in Google Summer of Code (GSoC) as mentors for xrdesktop, the open source project bringing the Linux desktop to VR on SteamVR and Monado. Our students, Remco and Manas, were both able to finish their projects and submit merge requests. Kudos!

In XR, users want to input text with methods like controllers or hand tracking, without having to walk to their desk and use a physical keyboard. The OpenXR API currently does not expose any API for text input; it is the application's responsibility to implement a text input method. Remco implemented such a virtual keyboard for xrdesktop, with support for emojis and 56 languages. Supporting non-English languages and emojis required adding Unicode support to our desktop input synthesis library, libinputsynth. The advanced glyph support was achieved by using the Pango font rendering library. Remco specified a JSON format for describing keyboard layouts and an importer script that uses the Unicode CLDR Project definitions to achieve internationalization. Our current design focus was a simple keyboard with mode switches, as seen in the GNOME Shell on-screen keyboard or on mobile phones. Since arbitrary keyboards can be specified in our internal JSON format, one could also replicate a 100% desktop keyboard with numpad, or even keyboards that are not feasible in a physical form, like a Japanese keyboard with several hundred characters. With Remco's work, xrdesktop will be able to provide a virtual keyboard internally, independent of the XR runtime it runs on. We will also be able to expose this keyboard as a stand-alone OpenXR application, so it will be usable outside of xrdesktop. Future work on this project includes implementing input methods for Chinese and other Asian languages, as well as prediction with suggestions, as known from mobile phones and other on-screen keyboards. More details can be found in Remco's blog post.

Manas implemented loading and rendering of glTF models, which included improving our renderer and adding rendering techniques like normal mapping. This work enables xrdesktop to use assets like VR controller models and 3D environments. The glTF loader is based on gthree, the GObject port of three.js. We are able to parse and render glTF 2.0, as demonstrated by Blender's Suzanne model loaded from glTF. Future work will make the loader and renderer compatible with the full glTF Sample Model repository from Khronos and provide an asset pipeline from Blender to xrdesktop, with which we will be able to use the controller assets from the W3C Immersive Web WebXR asset repository in an upcoming release.

As our tools of choice, C/GObject and Vulkan were used in both projects, maintaining our goal of keeping xrdesktop low level and thus providing a performant XR experience. Contributions like these support our vision, with projects like xrdesktop and Monado, of providing a fully open source XR stack that enables complete control and independence for end users and product builders alike.
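The post mentions an internal JSON format for describing keyboard layouts with mode switches, but does not show it. As a purely hypothetical sketch (all field names here are invented, not xrdesktop's actual schema), such a layout description could look like this:

```json
{
  "name": "en-basic (hypothetical example)",
  "rows": [
    ["q", "w", "e", "r", "t", "y", "u", "i", "o", "p"],
    ["a", "s", "d", "f", "g", "h", "j", "k", "l"],
    [{ "action": "shift" }, "z", "x", "c", "v", "b", "n", "m",
     { "action": "backspace" }],
    [{ "action": "mode", "target": "symbols" },
     { "label": "space", "keysym": "space", "width": 5 },
     { "action": "enter" }]
  ]
}
```

A declarative format like this is what makes the arbitrary layouts described above possible: a full keyboard with numpad, or a many-hundred-key Japanese layout, is just more rows and keys in the same schema.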