NASA Mars Rover Image Viewer App
This article demonstrates how to quickly build a simple spatial panel app using the newly released Meta Horizon OS UI Set. The app shows a list of photos taken by the Mars rover Curiosity, provided by NASA's Open API. When the user selects one of the images, the app displays the selected photo on a large curved panoramic window.
What is Meta Horizon OS UI Set?
Meta Horizon OS UI Set is a collection of UI components built with Unity canvas UI and Interaction SDK components, based on the Meta Quest Design System. With these components you can quickly build common UI patterns for spatial apps with consistent input modality support, visual quality, and interaction behavior. You no longer have to spend time configuring proper input support for hand tracking and controllers.
Where to find the UI Set
The UI Set is available in the latest Meta XR Interaction SDK Essentials package (v69), which can be downloaded from the Unity Asset Store. It is recommended to install the Meta XR Interaction SDK package, which automatically installs the Essentials package as a dependency.
Documentation
You can find comprehensive documentation on Meta’s Horizon developer website and learn about the details of the UI components and Theme Manager.
Figma design resources
With the Meta Horizon OS UI Set Figma file, you can easily sketch out ideas for spatial app experiences.
https://www.figma.com/community/file/1425877250001997196
Location of the UI Set in the package
The UI Set folder is located at Packages > Meta XR Interaction SDK Essentials > Runtime > Sample > Objects > UISet
Example scenes
Due to Unity’s restrictions, example scenes located in the Packages folder cannot be opened directly. To access and modify these scenes, simply drag and drop them into the Assets folder. This will enable you to open and edit the scenes as needed.
The UISet example scene shows the entire library of UI components. You can also try changing the theme using the ThemeManager's inspector.
The UISetPatterns example scene shows widely used UI patterns built with the UI components, such as menus, a simple settings page, and some of the common navigation layout patterns found on Quest.
Building a simple app using existing UI pattern – NASA Mars Rover Image Viewer
By leveraging the provided UI patterns, you can easily build common app experiences. In this example, I chose the ContentUIExample-HorizonOS1.prefab pattern to make a simple photo feed and viewer using NASA's Mars Rover API.
As you can see, it is constructed from many nested Horizontal Layout Group and Scroll View components. From this pattern example, I removed the search bar and content tile buttons that I didn't need. Since I wanted to dynamically populate the content area with image tile buttons, I removed all objects under the main content area's Scroll View.
I modified TextTileButton_IconAndLabel_Regular.prefab to meet my app's requirements.
NASA Open API
You can easily sign up for NASA's Open APIs and get an API key through this website: https://api.nasa.gov/
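With a key in hand, a request for Curiosity's photos on a given Earth date looks like this (replace YOUR_NASA_API_KEY with your own key; this is the same URL the script below composes):

```
https://api.nasa.gov/mars-photos/api/v1/rovers/curiosity/photos?earth_date=2022-10-02&api_key=YOUR_NASA_API_KEY
```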
Script for retrieving images through API
I prepared a simple script that retrieves JSON data from NASA's Mars Rover API. After parsing each photo's image URL, it assigns the downloaded texture to the tile button's Image component. Since the tile buttons are Unity UI Toggle buttons, I added listeners so that when one of the image buttons is pressed, the selected image is displayed on a separate window.
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.Networking;
using UnityEngine.UI;
using TMPro;

public class MarsRoverImageFetcher : MonoBehaviour
{
    public string marsRoverApiUrl = "https://api.nasa.gov/mars-photos/api/v1/rovers/"; // Base URL for the Mars Rover API
    public string roverName = "curiosity"; // Default rover name, can be changed
    public string apiKey = "YOUR_NASA_API_KEY"; // NASA API key
    public GameObject tileButtonPrefab; // Prefab for the tile button
    public Transform contentPanel; // Content panel of the ScrollView
    public Image TargetImageDisplay; // Target Image component that displays the selected image

    void Start()
    {
        // Optionally, load initial photos from the Curiosity rover on a specific date
        StartCoroutine(GetMarsRoverImages("curiosity", "2022-10-02"));
    }

    public void OnRoverButtonClick(string roverName)
    {
        this.roverName = roverName;
        // Fetch rover images for a specific Earth date
        StartCoroutine(GetMarsRoverImages(roverName, "2022-10-02"));
    }

    IEnumerator GetMarsRoverImages(string roverName, string earthDate)
    {
        string url = $"{marsRoverApiUrl}{roverName}/photos?earth_date={earthDate}&api_key={apiKey}";
        UnityWebRequest request = UnityWebRequest.Get(url);
        yield return request.SendWebRequest();

        if (request.result != UnityWebRequest.Result.Success)
        {
            Debug.LogError(request.error);
        }
        else
        {
            // Clear existing tile buttons from the content panel
            foreach (Transform child in contentPanel)
            {
                Destroy(child.gameObject);
            }

            // Parse the JSON response
            string jsonResponse = request.downloadHandler.text;
            MarsRoverApiResponse response = JsonUtility.FromJson<MarsRoverApiResponse>(jsonResponse);

            // Instantiate a tile button for each photo
            foreach (Photo photo in response.photos)
            {
                GameObject button = Instantiate(tileButtonPrefab, contentPanel);
                TextMeshProUGUI buttonText = button.GetComponentInChildren<TextMeshProUGUI>();
                if (buttonText != null)
                {
                    buttonText.text = $"{photo.rover.name} - {photo.camera.full_name}";
                }

                // Load and set the rover image
                if (!string.IsNullOrEmpty(photo.img_src))
                {
                    StartCoroutine(LoadImage(photo.img_src, button));
                }

                // Add a toggle event listener
                Toggle toggleComponent = button.GetComponent<Toggle>();
                toggleComponent.onValueChanged.AddListener((isOn) => OnToggleValueChanged(isOn, button));
            }
        }
    }

    IEnumerator LoadImage(string imageUrl, GameObject button)
    {
        UnityWebRequest request = UnityWebRequestTexture.GetTexture(imageUrl);
        yield return request.SendWebRequest();

        if (request.result != UnityWebRequest.Result.Success)
        {
            Debug.LogError(request.error);
        }
        else
        {
            // Wrap the downloaded texture in a Sprite for the UI Image component
            Texture2D texture = ((DownloadHandlerTexture)request.downloadHandler).texture;
            Image buttonImage = button.GetComponentInChildren<Image>();
            buttonImage.sprite = Sprite.Create(texture, new Rect(0, 0, texture.width, texture.height), new Vector2(0.5f, 0.5f));
        }
    }

    void OnToggleValueChanged(bool isOn, GameObject toggleObj)
    {
        // Get the Image component from the selected Toggle
        Image toggleImage = toggleObj.GetComponentInChildren<Image>();

        // If the Toggle is selected, assign its sprite to TargetImageDisplay
        if (isOn && toggleImage != null && TargetImageDisplay != null)
        {
            TargetImageDisplay.sprite = toggleImage.sprite;
            Debug.Log("Selected image assigned to TargetImageDisplay.");
        }
    }
}

[System.Serializable]
public class MarsRoverApiResponse
{
    public List<Photo> photos;
}

[System.Serializable]
public class Photo
{
    public string img_src;     // URL to the image
    public Rover rover;        // Rover information
    public RoverCamera camera; // Camera information (the field name must stay "camera" to match the JSON)
}

[System.Serializable]
public class Rover
{
    public string name; // Name of the rover (e.g., "Curiosity")
}

// Named RoverCamera to avoid shadowing UnityEngine.Camera
[System.Serializable]
public class RoverCamera
{
    public string full_name; // Full name of the camera (e.g., "Mast Camera")
}
To display the selected photo, I added another window to the scene. Since I wanted a large curved cinematic screen for photo display, I grabbed the curved video player panel that exists in the ComprehensiveRigExample scene. I removed everything inside it because I only use it to display an image, which simply means assigning a sprite to its Image component.
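Note that a Unity UI Image component takes a Sprite rather than a raw texture, so "assigning a texture" is really a one-line conversion. A minimal sketch, assuming a panelImage field pointing at whatever Image component remains on the stripped-down panel (the component and field names here are illustrative):

```csharp
using UnityEngine;
using UnityEngine.UI;

public class CurvedPanelDisplay : MonoBehaviour
{
    public Image panelImage; // Image component on the curved panel (assign in the inspector)

    // Wrap a downloaded Texture2D in a Sprite and show it on the panel
    public void ShowTexture(Texture2D texture)
    {
        panelImage.sprite = Sprite.Create(
            texture,
            new Rect(0, 0, texture.width, texture.height),
            new Vector2(0.5f, 0.5f)); // pivot at the sprite's center
    }
}
```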
One small addition was adding colliders to both windows' title bar areas to make them grabbable and movable with the Interaction SDK's Hand Grab Interactor and Ray Grab Interactor. You can easily add these interactions by right-clicking and using the Quick Actions menu under Interaction SDK.
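If you prefer to add the collider itself from code rather than by hand, a sketch like the one below works; the grab interactables are still set up through the Quick Actions menu as described above, and the thin depth value is an assumption you may need to tune for your canvas scale:

```csharp
using UnityEngine;

// Attach to a title bar object that has a RectTransform.
// Adds a thin BoxCollider sized to the bar so hand and ray
// grab interactions have something to hit.
public class TitleBarCollider : MonoBehaviour
{
    void Awake()
    {
        Rect rect = GetComponent<RectTransform>().rect;
        BoxCollider box = gameObject.AddComponent<BoxCollider>();
        box.size = new Vector3(rect.width, rect.height, 0.01f); // depth is a guess; tune as needed
    }
}
```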
Using Meta XR Simulator for design & development iterations
With Meta XR Simulator, you can quickly test your app in the context of simulated environments. It supports all available input modalities, such as hands and controllers, which allows you to test out interactions.
Here is an example of using Meta XR Simulator to test my app's functionality and interactions: how the image data retrieved from the NASA API populates the instantiated tile buttons, how pressing an image tile button displays the selected image on the large curved panel, and so on.
Result
The app is running on the device in passthrough Mixed Reality. As you can see, by using UI Set components, all input modalities are supported by default, including direct hand interactions (poke, scroll) and indirect interactions with hand rays or controller rays. All UI components provide proper visual feedback for different interaction states such as hover and select; for example, you can see the buttons scale on hand poke interactions.
This app also shows an example of theme switching between the default Quest Design System dark and light themes using the Theme Manager.
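Theme switching here was driven through the Theme Manager's inspector. If you want to trigger it from your own UI without guessing at the ThemeManager's scripting API (see Meta's documentation for the actual component), one simple pattern is to expose UnityEvents and wire the Theme Manager's methods up in the inspector; a minimal sketch, with hypothetical event names:

```csharp
using UnityEngine;
using UnityEngine.Events;

public class ThemeToggle : MonoBehaviour
{
    // Wire these to the Theme Manager in the inspector,
    // pointing at whatever methods apply the dark/light theme.
    public UnityEvent onDarkSelected;
    public UnityEvent onLightSelected;

    // Hook this up to a Toggle's onValueChanged(bool)
    public void SetDark(bool isDark)
    {
        if (isDark) onDarkSelected.Invoke();
        else onLightSelected.Invoke();
    }
}
```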
Multitasking Support
With the experimental Seamless Multitasking feature (available from Horizon OS v69) enabled, you can interact with the app alongside other system apps. It supports natural input switching with the select action: a pinch gesture with hands or a button press on a controller.