Augmented Reality (AR) and Virtual Reality (VR) are gaining more and more traction. Developers and marketers are coming up with endless ways to bring what was once limited to the computer world into our everyday reality.
The IKEA Place application allows users to try furniture in their own house, apartment, or room. It also has a very cool object-search feature that helps shoppers find a piece of furniture they might have seen in someone else’s house on ikea.com.
Retailers across the world are exploring AR, offering various “try before you buy” applications both inside the store and outside of it.
Remember Snapchat or Pokémon GO? The possibilities are endless…
Now, Sitecore developers and architects, marketers and content authors: imagine if you had an AR application and could control from Sitecore what shows up in it and where. You would also be able to personalize the AR experience.
The idea behind this experiment is simple: I wanted to figure out whether it is possible to make an Augmented Reality experience Sitecore-driven. In the end, I was able to store and retrieve trained markers and models, as well as define what should happen when a marker is recognized.
In the screenshot below you can see the result of this experiment: my marker image was recognized, and a video loaded in its place.
If you would like to try this example yourself, you’ll need:
- Sitecore 9.1.1 or any other version of Sitecore
- Unity version 2018.3.12f1
- Wikitude (AR SDK)
- Visual Studio 2017
Markers
I used Wikitude Studio, Wikitude's online tool, to create the markers. Trained 2D markers have the .wtc extension, which Sitecore has no issues storing and serving.
In Wikitude Studio I uploaded two target images. Both received three stars, which means they have enough contrast and are suitable to be used as markers.
The icon with a little "w" on it allows you to request that the trained marker be sent to you by email.
Sitecore
In Sitecore, I created the following structure for the two-target marker:
The "Business" item represents a marker with two targets. The "Marker" field points to the marker file in the Media Library.
The child items have the following fields:
- Target Name - for determining which target in the marker was recognized
- Video - a video to load in the place of the marker target in AR
- Click Url - the URL to open once the video has finished playing and becomes clickable
- Model - a 3D model to load (support for 3D markers is still work in progress)
To make this data available to Unity and the mobile application, I created a web service that produces the following output:
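The original output isn't reproduced here, but based on the fields the Unity script reads later (MarkerUrl, Targets, TargetName, VideoUrl), the JSON would be shaped roughly like this. All property values below are illustrative placeholders, not the actual data:

```json
{
  "MarkerUrl": "http://ar.sitecore/-/media/markers/twotargets.wtc",
  "Targets": [
    {
      "TargetName": "business1",
      "VideoUrl": "http://ar.sitecore/-/media/videos/business1.mp4",
      "ClickUrl": "http://ar.sitecore/landing-page",
      "ModelUrl": ""
    }
  ]
}
```

The top-level object carries the marker (.wtc) URL, and each child item from the Sitecore tree becomes an entry in the Targets array.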
Now to Unity...
Unity
In Unity, I created a new project and imported Wikitude SDK.
Let's take a closer look at the configuration of these GameObjects.
ImageTracker
In the ImageTracker's Target Collection field you'll see the value Wikitude/twotargets.wtc. It has to be set even though the actual target collection file comes from Sitecore.
At the bottom of the ImageTracker properties window, notice the "Load from Sitecore" script.
This script was created in the Unity Assets folder.
Double-click the script asset file to open it in Visual Studio (assuming you already have it installed). Here is the code for this script asset:
using System.Collections;
using System.IO;
using System.Linq;
using System.Net;
using UnityEngine;
using UnityEngine.Video;
using Wikitude;

public class LoadFromSitecore : MonoBehaviour
{
    private MarkerModelReference Marker;
    private GameObject Trackable;

    // Start is called before the first frame update
    void Start()
    {
        // Fetch the marker definition JSON from the Sitecore service.
        var serviceUrl = "http://ar.sitecore/api/sitecore/services/markers";
        HttpWebRequest request = (HttpWebRequest)WebRequest.Create(serviceUrl);
        HttpWebResponse response = (HttpWebResponse)request.GetResponse();
        StreamReader reader = new StreamReader(response.GetResponseStream());
        string jsonResponse = reader.ReadToEnd();

        Marker = JsonUtility.FromJson<MarkerModelReference>(jsonResponse);
        // "MarkerTarget" is the element type of Marker.Targets (the original
        // type argument was lost in formatting).
        Marker.Targets = JsonHelper.GetObjectArray<MarkerTarget>(jsonResponse, "Targets");
        Debug.Log(Marker.Targets.Count());

        // Build the tracker and trackable at runtime.
        GameObject trackerObjectNew = new GameObject("ImageTracker");
        ImageTracker imageTrackerNew = trackerObjectNew.AddComponent<ImageTracker>();

        Trackable = new GameObject("ImageTrackable");
        ImageTrackable imageTrackable = Trackable.AddComponent<ImageTrackable>();
        imageTrackable.transform.SetParent(imageTrackerNew.transform, false);
        imageTrackable.OnImageRecognized.AddListener(OnImageRecognized);
        imageTrackable.OnImageLost.AddListener(OnImageLost);

        // Point the tracker at the .wtc file served by Sitecore.
        imageTrackerNew.TargetSourceType = TargetSourceType.TargetCollectionResource;
        imageTrackerNew.TargetCollectionResource = new TargetCollectionResource();
        imageTrackerNew.TargetCollectionResource.UseCustomURL = true;
        imageTrackerNew.TargetCollectionResource.TargetPath = Marker.MarkerUrl;

        imageTrackerNew.MaximumNumberOfConcurrentTrackableTargets =
            Marker.Targets.Count() > 5 ? 5 : Marker.Targets.Count();
        Debug.Log(imageTrackerNew.TargetCollectionResource.TargetPath);
    }

    public void OnImageLost(ImageTarget recognizedTarget)
    {
        Debug.Log("OnImageLost");
        Debug.Log(recognizedTarget.Drawable.gameObject);
        Destroy(recognizedTarget.Drawable.gameObject);
    }

    public void OnImageRecognized(ImageTarget recognizedTarget)
    {
        Debug.Log("OnImageRecognized");
        if (Marker == null || Marker.Targets == null)
        {
            return;
        }

        Debug.Log(recognizedTarget.Name);
        var videoPlane = GameObject.CreatePrimitive(PrimitiveType.Plane);
        videoPlane.transform.SetParent(Trackable.transform, false);

        // Set the new augmentation to be a child of the Drawable and position
        // it relative to the Drawable by using localPosition.
        videoPlane.transform.parent = recognizedTarget.Drawable.transform;
        videoPlane.transform.localPosition = Vector3.zero;
        videoPlane.transform.localScale = new Vector3(-1f, -1f, -1f);
        videoPlane.name = recognizedTarget.Name;

        // Look up the Sitecore-defined target that matches the recognized image.
        var target = Marker.Targets.FirstOrDefault(t => t.TargetName == recognizedTarget.Name);
        if (target == null || string.IsNullOrWhiteSpace(target.VideoUrl))
        {
            Debug.Log("target.VideoUrl is empty");
            return;
        }

        StartCoroutine(PlayVideo(target.VideoUrl, videoPlane));
    }

    IEnumerator PlayVideo(string url, GameObject videoPlane)
    {
        Debug.Log("PlayVideo: " + url);

        // Add a VideoPlayer to the GameObject and disable Play On Awake.
        var videoPlayer = gameObject.AddComponent<VideoPlayer>();
        videoPlayer.playOnAwake = false;
        videoPlayer.source = VideoSource.Url;
        videoPlayer.loopPointReached += EndReached;
        videoPlayer.url = url;
        videoPlayer.name = videoPlane.name;
        videoPlayer.Prepare();

        // Wait until the video is prepared.
        while (!videoPlayer.isPrepared)
        {
            Debug.Log("Preparing Video");
            yield return null;
        }
        Debug.Log("Done Preparing Video");

        // Assign the texture from the video to the plane's material.
        var renderer = videoPlane.GetComponent<Renderer>();
        renderer.material.mainTexture = videoPlayer.texture;

        // Scale the video plane to match the video aspect ratio. The plane is
        // flipped because the video texture is mirrored on the horizontal.
        var videoWidth = videoPlayer.width;
        var videoHeight = videoPlayer.height;
        if (videoWidth > 0 && videoHeight > 0)
        {
            float aspect = videoHeight / (float)videoWidth;
            videoPlane.transform.localScale = new Vector3(0.1f, 0.1f, 0.1f * aspect);
        }

        videoPlayer.Play();
    }

    void EndReached(VideoPlayer videoPlayer)
    {
        videoPlayer.Stop();
        videoPlayer.SendMessageUpwards("VideoEnded", videoPlayer.name, SendMessageOptions.DontRequireReceiver);
        Debug.Log("video ended");
    }

    void VideoEnded(string videoName)
    {
        Debug.Log("VideoEnded");
    }
}
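One note on deserialization: JsonUtility only populates [Serializable] classes with public fields. The DTO classes aren't shown in the original post, but a minimal sketch consistent with the fields the script uses could look like this (the class name MarkerModelReference comes from the script; MarkerTarget and the exact field list are assumptions):

```csharp
using System;

// Assumed DTOs matching the JSON returned by the Sitecore service.
[Serializable]
public class MarkerModelReference
{
    public string MarkerUrl;       // URL of the .wtc target collection in the Media Library
    public MarkerTarget[] Targets; // one entry per child item under the marker
}

[Serializable]
public class MarkerTarget
{
    public string TargetName; // matched against ImageTarget.Name on recognition
    public string VideoUrl;   // video to play in place of the recognized target
    public string ClickUrl;   // opened when the finished video is tapped
    public string ModelUrl;   // 3D model (work in progress)
}
```

Field names here must match the JSON property names exactly, since JsonUtility maps by name.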
To be able to use a panel as the Drawable in my ImageTracker/Trackable, the panel had to be converted into a Prefab.
This Prefab is referenced in Trackable.
The next item to inspect is the "Sitecore" GameObject. In the properties window I have the following: