Develop XR with Oracle Cloud, Database on HoloLens

This is the first in a series on developing XR applications and experiences with Oracle. Specifically, I’ll show apps built using the following:

  • Oracle database and cloud technologies
  • HoloLens 2 (Microsoft’s mixed reality headset)
  • MRTK (Mixed Reality Toolkit) APIs, version 2.7.2
  • Unity platform (v2021.1.20f), leading software for creating and running interactive 3D content in real time

Throughout this blog, I will refer to the corresponding workshop video found at https://youtu.be/MBaQ8ohI80E.

Extended Reality (XR) and HoloLens

XR (Extended Reality) is the umbrella term for VR, AR, and MR. The fourth evolution, the metaverse (omniverse, etc.), essentially refers to the eventual and inevitable integration of XR into everyday life, similar to today’s smartphones.

While the concepts described can be applied, to one extent or another, across different flavors of XR and hardware, the focus here is on what will likely be the most common everyday use of XR in the future: the eventual prevalence of XR eyewear. HoloLens is used for development and demonstration here because it is currently the most advanced technology toward that end. HoloLens presents holographic images and spatialized sound to the wearer/user, who can interact with them via hands, speech, and eye gaze. There is much more to HoloLens than this basic definition, of course, and much more to this space in general.

XR HoloLens applications are developed using MRTK, which provides a wide range of cross-device, cross-platform APIs. Support varies to some degree depending on the nature and capabilities of the device, but it is in line with Apple ARKit, Google ARCore, etc. It is also forward-looking, so it should apply to future devices (glasses, etc.) from Apple and other vendors.
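For example, a global speech command such as “come to me” (used later in the demos) can be handled with a small MRTK script along the lines of the sketch below. This is a minimal sketch, not the workshop’s actual code: the class name and distance value are illustrative, and the keyword must also be registered in the MRTK Speech Commands profile.

    using Microsoft.MixedReality.Toolkit;
    using Microsoft.MixedReality.Toolkit.Input;
    using UnityEngine;

    // Sketch: listens globally for the "come to me" speech command (MRTK 2.x)
    // and pulls this GameObject toward the user. The keyword must also be
    // added to the MRTK Speech Commands profile.
    public class ComeToMeHandler : MonoBehaviour, IMixedRealitySpeechHandler
    {
        [SerializeField] private float distanceFromHead = 0.5f; // meters in front of the user

        private void OnEnable() =>
            CoreServices.InputSystem?.RegisterHandler<IMixedRealitySpeechHandler>(this);

        private void OnDisable() =>
            CoreServices.InputSystem?.UnregisterHandler<IMixedRealitySpeechHandler>(this);

        public void OnSpeechKeywordRecognized(SpeechEventData eventData)
        {
            if (eventData.Command.Keyword.Equals("come to me", System.StringComparison.OrdinalIgnoreCase))
            {
                // Place the object a short distance in front of the user's head (main camera).
                Transform head = Camera.main.transform;
                transform.position = head.position + head.forward * distanceFromHead;
            }
        }
    }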

The Workshop

I’ll start the series by showing you the XR version of the popular Oracle LiveLabs workshop “Simplify Microservices with Converged Oracle Database.” It illustrates a number of different areas of modern application development and DevSecOps, including Kubernetes, microservices and related data patterns, spatial data, maps, AI/ML/OML, observability (in particular, tracing), basic graph usage, etc.

Future installments of this series will continue to provide examples and explain XR enabling of Oracle Database functions such as Graph, Internet of Things, Event Mesh, Sagas, ML, Unified Observability, Chaos Testing, and more. Additionally, future installments will address industry use cases as well as Oracle AI cloud offerings such as computer vision, speech recognition, and text semantics that work in tandem with the database. At the same time, I will show more aspects and use cases of XR / MR with these services.

Video 1: “GrabDish” (online store / food delivery) front end

GrabDish screenshot 1

The microservices workshop and its XR version use the food delivery app ‘GrabDish’ to demonstrate the concepts mentioned above.

The video and this blog are split into two demos. For each, I’ll list the Oracle technologies used, followed by the MRTK/HoloLens features used. Unity is the development platform that brings the two together (Unreal Engine is an alternative).

  • Food options (sushi, burgers, pizza) are presented in 3D. These images are retrieved from the cloud object store via the Oracle database backend.
  • Users can select any of the objects by hand, by voice/speech, or by eye gaze (simply by looking at them).
  • Users can then bring an object closer by making a hand gesture or by saying “come to me” (as in the speech-handler sketch above). Once the object is close, it can be rotated simply by gazing at one side of it or the other, or it can be grabbed and moved by hand.
  • Once the food is selected, a suggestive-sell drink pairing is generated via AI/ML/OML (in the original workshop this is a food and wine pairing; in the XR version it is a tea pairing for the selected sushi).
  • The term “dwell” refers to a prolonged eye gaze at a particular spot. This becomes evident when the user dwells on the tea and, as a result, hears the tea’s name spoken. This audio is also retrieved via the database (a minimal dwell sketch follows this list).
  • The selected items of the order are placed in an MRTK control called a Dock. This is a very useful construct (it serves as both a container and an organizing mechanism) that allows a number of items to be placed on it and resized uniformly to fit.
  • The order button, part of a volumetric user interface, is then selected to place the order. The order is inserted into the Oracle database in JSON format, and the corresponding inventory is decremented using relational SQL (see the order-insert sketch after this list). This shows two of the many data types supported by the converged Oracle database.
  • Another data model supported in the database is spatial. This is demonstrated once the order is placed: the restaurant and delivery addresses (in this case, around Rittenhouse Square in Philadelphia) are sent to a cloud-based Oracle spatial service, powered by an Oracle database, where the addresses are geocoded and the routing API computes the driving route between them, returning the result as GeoJSON. This is then fed to a map API (Google, Bing Maps, Mapbox, etc.), where it is plotted on a 3D map. A car then follows this route between the points on the 3D map in HoloLens (see the routing sketch after this list). Again, the map can be manipulated by hand, speech, and eye gaze.
  • These data types in the Oracle database can be accessed from any number of languages and also via REST endpoints.
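The dwell-to-hear-audio behavior described above can be approximated with a small focus-timer script like the following. This is a minimal sketch, assuming the spoken clip has already been retrieved from the database elsewhere; it uses MRTK focus events plus a timer rather than the toolkit’s DwellHandler, and the field names and threshold are illustrative.

    using Microsoft.MixedReality.Toolkit.Input;
    using UnityEngine;

    // Sketch: plays an audio clip (e.g., the tea's spoken name) after the user's
    // gaze has dwelled on this object long enough. Approximates dwell with MRTK
    // focus events plus a timer; names and threshold are illustrative.
    [RequireComponent(typeof(AudioSource))]
    public class DwellNarration : MonoBehaviour, IMixedRealityFocusHandler
    {
        [SerializeField] private float dwellSeconds = 1.5f; // how long the gaze must linger

        private AudioSource audioSource;
        private float focusTime;
        private bool hasFocus;
        private bool played;

        private void Awake() => audioSource = GetComponent<AudioSource>();

        public void OnFocusEnter(FocusEventData eventData) { hasFocus = true; focusTime = 0f; }

        public void OnFocusExit(FocusEventData eventData) { hasFocus = false; played = false; }

        private void Update()
        {
            if (!hasFocus || played) return;
            focusTime += Time.deltaTime;
            if (focusTime >= dwellSeconds)
            {
                audioSource.Play(); // the spoken name of the item
                played = true;
            }
        }
    }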
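On the backend, inserting the order as JSON and decrementing the inventory with relational SQL in the same converged database might look roughly like the ODP.NET sketch below. The table and column names (orders, order_doc, inventory, inventory_count, item_id) and the connection string are assumptions for illustration, not the workshop’s actual schema.

    using Oracle.ManagedDataAccess.Client;

    // Sketch: store the order as a JSON document and decrement inventory with
    // plain relational SQL in the same Oracle database. Schema names are hypothetical.
    public static class OrderRepository
    {
        public static void PlaceOrder(string connectString, string orderJson, string itemId)
        {
            using (var conn = new OracleConnection(connectString))
            {
                conn.Open();
                using (var tx = conn.BeginTransaction())
                {
                    // 1) Insert the order document into a JSON column.
                    using (var insert = conn.CreateCommand())
                    {
                        insert.CommandText = "INSERT INTO orders (order_doc) VALUES (:doc)";
                        insert.Parameters.Add(new OracleParameter("doc", orderJson));
                        insert.ExecuteNonQuery();
                    }

                    // 2) Decrement the inventory for the ordered item with relational SQL.
                    using (var update = conn.CreateCommand())
                    {
                        update.CommandText =
                            "UPDATE inventory SET inventory_count = inventory_count - 1 " +
                            "WHERE item_id = :itemId";
                        update.Parameters.Add(new OracleParameter("itemId", itemId));
                        update.ExecuteNonQuery();
                    }

                    tx.Commit();
                }
            }
        }
    }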
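Fetching the driving route as GeoJSON from a routing REST endpoint and handing it to a map layer could look like the Unity coroutine below. This is a sketch under assumptions: the endpoint URL and the MapPlotter/PlotRoute abstraction are placeholders, not the actual Oracle Spatial or map SDK wiring used in the workshop.

    using System.Collections;
    using UnityEngine;
    using UnityEngine.Networking;

    // Sketch: request a driving route between two addresses from a routing REST
    // endpoint that returns GeoJSON, then hand the result to a map component.
    // The URL and the MapPlotter abstraction are hypothetical placeholders.
    public class RouteFetcher : MonoBehaviour
    {
        [SerializeField] private string routingEndpoint =
            "https://example.com/spatial/routing?from={0}&to={1}"; // placeholder URL

        public IEnumerator FetchRoute(string fromAddress, string toAddress, MapPlotter map)
        {
            string url = string.Format(routingEndpoint,
                UnityWebRequest.EscapeURL(fromAddress), UnityWebRequest.EscapeURL(toAddress));

            using (UnityWebRequest request = UnityWebRequest.Get(url))
            {
                yield return request.SendWebRequest();

                if (request.result != UnityWebRequest.Result.Success)
                {
                    Debug.LogWarning($"Routing request failed: {request.error}");
                    yield break;
                }

                // The response body is GeoJSON describing the driving route.
                string geoJson = request.downloadHandler.text;
                map.PlotRoute(geoJson); // hypothetical map component that draws the route and animates the car
            }
        }
    }

    // Minimal stand-in for whatever map SDK (Bing Maps, Mapbox, etc.) is used.
    public abstract class MapPlotter : MonoBehaviour
    {
        public abstract void PlotRoute(string geoJson);
    }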

Video 2: “GrabDish” DevOps (Kubernetes, Health Probes, Tracing/OpenTelemetry, etc.)


  • A 3D visual representation of a live Kubernetes cluster and its related resources, including namespaces, deployments, pods (and the services that front them), metrics, etc., is rendered in a relational and proportional manner (see the cluster sketch after this list).
  • The labels for these resources reorient themselves so that they always face the user. Because HoloLens continuously maps the spatial mesh of the environment, the positions of objects remain constant, even across restarts. This can of course be extended to remote scenarios and GPS, and the mesh can be shared in digital-twin and remote use cases (e.g., advanced remote assistance).
  • Each pod has a menu with various options/actions, including the ability to select and view the logs for that pod. The logs can be read either by scrolling by hand or simply by reading them, as eye tracking is used to scroll and advance the page.
  • The original workshop is opened in a browser in order to set the health status reported by the order service to ‘down’. This causes the Kubernetes health probes to restart the service. It is reflected in the XR app by the pod turning red and by an audible alert about the health status (see the spatial-audio sketch after this list). The sound is also spatialized: its source is the pod object, so if the pod is across the room, the sound comes from across the room.
  • Visual tracing is then shown by placing an order. OpenTracing/OpenTelemetry is used to trace the flow of information through the system. The traces are translated and mapped onto the visual representation of the corresponding Kubernetes and database objects. Specific order and trace identifiers are tagged for identification and tracking, and these can be seen visually flowing through the node graph (see the tracing sketch after this list). This of course has further potential when combined with graph analytics.
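A minimal sketch of how live cluster state might be turned into holograms is shown below: it queries the Kubernetes REST API for the pods in one namespace and spawns a prefab per pod. In practice a backend relay and proper authentication would be used; the API server URL, token, namespace, and prefab here are placeholders, not the workshop’s actual setup.

    using System;
    using System.Collections;
    using UnityEngine;
    using UnityEngine.Networking;

    // Sketch: poll the Kubernetes API for pods in one namespace and spawn a
    // hologram (prefab) per pod, laid out in a simple row. The API server URL,
    // token, namespace, and prefab are placeholders.
    public class PodVisualizer : MonoBehaviour
    {
        [SerializeField] private string apiServer = "https://my-cluster:6443";   // placeholder
        [SerializeField] private string bearerToken = "<service-account-token>"; // placeholder
        [SerializeField] private string podNamespace = "default";                // placeholder
        [SerializeField] private GameObject podPrefab;

        [Serializable] private class PodList { public PodItem[] items; }
        [Serializable] private class PodItem { public Metadata metadata; public Status status; }
        [Serializable] private class Metadata { public string name; }
        [Serializable] private class Status { public string phase; }

        private IEnumerator Start()
        {
            string url = $"{apiServer}/api/v1/namespaces/{podNamespace}/pods";
            using (UnityWebRequest request = UnityWebRequest.Get(url))
            {
                request.SetRequestHeader("Authorization", "Bearer " + bearerToken);
                yield return request.SendWebRequest();
                if (request.result != UnityWebRequest.Result.Success) yield break;

                PodList pods = JsonUtility.FromJson<PodList>(request.downloadHandler.text);
                for (int i = 0; i < pods.items.Length; i++)
                {
                    // A real scene would size and position pods relative to their
                    // deployments, services, etc.; a simple row is used here.
                    GameObject pod = Instantiate(podPrefab, transform);
                    pod.transform.localPosition = new Vector3(i * 0.3f, 0f, 0f);
                    pod.name = $"{pods.items[i].metadata.name} ({pods.items[i].status.phase})";
                }
            }
        }
    }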
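The red-pod health alert with spatialized sound can be sketched as follows: when the reported health goes down, the pod hologram’s material turns red and a fully spatialized AudioSource attached to that same object plays the alert, so the sound appears to come from wherever the pod is in the room. The component and method names are illustrative.

    using UnityEngine;

    // Sketch: turn a pod hologram red and play a spatialized alert when the pod
    // becomes unhealthy. Attach to the pod object; names are illustrative.
    [RequireComponent(typeof(Renderer), typeof(AudioSource))]
    public class PodHealthIndicator : MonoBehaviour
    {
        [SerializeField] private AudioClip healthDownClip; // e.g. a spoken "order service is down" alert

        private Renderer podRenderer;
        private AudioSource audioSource;

        private void Awake()
        {
            podRenderer = GetComponent<Renderer>();
            audioSource = GetComponent<AudioSource>();
            audioSource.spatialBlend = 1f; // fully 3D: the alert is heard from the pod's location
            audioSource.clip = healthDownClip;
        }

        // Called by whatever component polls the pod's status (e.g. the visualizer above).
        public void SetHealthy(bool healthy)
        {
            podRenderer.material.color = healthy ? Color.green : Color.red;
            if (!healthy && !audioSource.isPlaying)
            {
                audioSource.Play();
            }
        }
    }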
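On the service side, tagging spans with the order identifier so the trace can be picked out and visualized could look roughly like the sketch below, using System.Diagnostics.ActivitySource (the API that OpenTelemetry .NET builds on). The source name, span name, and tag key are assumptions, not the workshop’s actual instrumentation.

    using System.Diagnostics;

    // Sketch: create a span for order placement and tag it with the order id so
    // the trace can be identified and followed downstream (e.g., in the XR app).
    // The source name, span name, and tag key are illustrative.
    public static class OrderTracing
    {
        private static readonly ActivitySource Source = new ActivitySource("grabdish.order");

        public static void PlaceOrderTraced(string orderId)
        {
            using (Activity activity = Source.StartActivity("placeOrder"))
            {
                activity?.SetTag("order.id", orderId);

                // ... call the order service / database here ...
            }
        }
    }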

Additional ideas

Please see my other blogs for more information about the Oracle converged database, as well as various topics on microservices, monitoring, transaction processing, etc.

Also, feel free to contact me with any questions or suggestions for new blogs and videos as I am very open to suggestions. Thanks for reading and watching.

Content credits go to the wonderful Ruirui Hou for the Chinese voice, Hiromu Kato for the Japanese voice, Chaosmonger Studio for the car model, Altaer-lite for the sushi model, and Gian Marco Rizzo for the hamburger model.

