Experiential Design | Task 3 : Project MVP Prototype

3 June 2025 - 6 July 2025 | (Week 7 - Week 11)
Chan Xiang Lam | 0358400 
Experiential Design | Bachelor of Design (Honours) in Creative Media 
Task 3: Project MVP Prototype

TABLE OF CONTENTS
    1. Instructions
    2. Final Submission
    3. Reflection

INSTRUCTIONS

TASK 3
INSTRUCTIONS:

Develop a working prototype that shows the core features of your approved Task 2 proposal. The goal is to test and showcase key interactions, not to create a polished final product. Include:
  • A functional demo (app, AR, website, or physical model)
  • Highlight what works, what doesn’t, and any technical or creative limitations
  • Propose or apply solutions to overcome those issues
REQUIREMENTS:
  • Figma screen design prototype
  • Working MVP showing core app features
SUBMISSION:
  • Video walkthrough & presentation
  • Reflective post on your E-portfolio

Task 3 Progress: MVP Prototype Development 

For Task 3, we focused on developing a functioning MVP based on our approved proposal for the AR WonderWords project, an educational AR experience combining 3D animal visuals, multilingual support, and interactive learning.

Fig.3.1 Task 2- Proposal

Fig.3.2 MoodBoard of AR WonderWords

Figma Design Process

In our Figma prototype, we introduced an Onboarding Page and a structured Home Page to improve the initial user experience. We also made some adjustments to the Mini Quizzes section, focusing on better clarity and interaction for users.

Below are the links to our Figma prototype and Figma design file:

Fig.3.3 Figma design file

Fig.3.4 Figma prototype

Initial Feature Planning

We revisited our Task 2 proposal to determine the key features that should be included in our MVP. Our objective was to select elements that clearly reflect the core concept of our project — an engaging and educational AR experience designed for children.       

1. Image Target Recognition + AR Activation

  • This is the entry point of the experience. Scanning a real-world card triggers the 3D animal model and opens the learning interface (a simple sketch of this trigger logic is shown after this list).
  • It immediately grabs the child’s attention and creates a smooth “scan-to-learn” experience.
2. Animal Sound Playback
  • Auditory learning is key for children. The sound button allows users to hear the animal’s real roar.
  • This creates a multi-sensory experience and makes the AR content more immersive.
3. Translation Button (Language Learning)
  • One of our goals is to teach children animal names in another language (e.g., Malay).
  • The translation button plays the translated word and displays it visually, supporting bilingual learning.
4. Fun Fact Button
  • This feature reveals a short, interesting fact about the animal.
  • It encourages curiosity and adds an extra layer of learning beyond just vocabulary.
5. Play Game Button (Transition to Quiz)
  • After learning, users are guided to take a mini quiz to reinforce what they’ve just learned.
  • This creates a learning loop: Discover -> Understand -> Practice.
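
To give a clearer picture of point 1, here is a minimal sketch of how the “scan-to-learn” trigger could be wired up in Unity. It assumes the image target’s “On Target Found” / “On Target Lost” events are connected to these methods in the Inspector; the class and field names are placeholders rather than our actual script.

using UnityEngine;

// Hypothetical helper: shows the 3D animal and the learning UI when the
// image target is detected, and hides them again when tracking is lost.
// Wire OnTargetFound/OnTargetLost to the image target's tracking events
// in the Inspector (e.g. the default observer event handler).
public class AnimalTargetActivator : MonoBehaviour
{
    [SerializeField] private GameObject animalModel;   // 3D animal model (e.g. the tiger)
    [SerializeField] private GameObject learningPanel; // sound, translation and fun fact buttons

    public void OnTargetFound()
    {
        animalModel.SetActive(true);
        learningPanel.SetActive(true);
    }

    public void OnTargetLost()
    {
        animalModel.SetActive(false);
        learningPanel.SetActive(false);
    }
}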

Team Collaboration & Division of Work

To ensure smooth and efficient development, we divided our tasks based on functional modules and areas of expertise.

  • Natalie was in charge of designing and implementing the AR scene, which included the onboarding experience, the main menu (HomeScene), and the AR learning interface.
  • Xiang Lam (me) took full responsibility for developing the quiz section, which covered all quiz scenes (QuizScene, QuizScene2, QuizScene3) and the final completion page (CompleteScene), ensuring a seamless transition from learning to testing.
Unity Process (Quiz Section)

  1. QuizScene 1 – Spelling Quiz

In this scene, users are asked to complete the spelling of an animal name by filling in missing letters using on-screen buttons. I added three TextMeshProUGUI slots to display the letters as they’re filled.

Fig.3.5 Spelling Quiz 

The SpellingQuiz.cs script tracks the user’s input, checks the completed word, and provides sound feedback. When the correct answer is entered, a pop sound plays and the “Next” button becomes visible.

Fig.3.6 SpellingQuiz.cs script
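
To show roughly how this works, below is a simplified sketch of the spelling-quiz logic, not the exact contents of SpellingQuiz.cs; the field names and the sample letters are assumptions.

using TMPro;
using UnityEngine;

// Simplified sketch: letter buttons fill the empty slots in order; once all
// slots are filled, the word is checked, a pop sound plays on a correct
// answer and the Next button appears. Names are placeholders.
public class SpellingQuiz : MonoBehaviour
{
    [SerializeField] private TextMeshProUGUI[] letterSlots;  // the three empty slots
    [SerializeField] private string correctLetters = "GER";  // missing letters, in order (assumed)
    [SerializeField] private AudioSource popSound;
    [SerializeField] private GameObject nextButton;

    private int filledCount;

    // Wired to each on-screen letter button via its onClick event.
    public void OnLetterPressed(string letter)
    {
        if (filledCount >= letterSlots.Length) return;

        letterSlots[filledCount].text = letter;
        filledCount++;

        if (filledCount == letterSlots.Length)
            CheckAnswer();
    }

    private void CheckAnswer()
    {
        string entered = "";
        foreach (var slot in letterSlots) entered += slot.text;

        if (entered == correctLetters)
        {
            popSound.Play();
            nextButton.SetActive(true);
        }
        else
        {
            // Wrong combination: clear the slots so the user can try again.
            foreach (var slot in letterSlots) slot.text = "";
            filledCount = 0;
        }
    }
}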

  2. QuizScene 2 – Sound Quiz

In the Sound Quiz, users are challenged to identify the animal by listening to its sound and selecting the correct animal name from multiple choices. When the speaker icon is clicked, the tiger's roar is played using an AudioSource component, helping users recognize the animal through auditory cues before making a selection.

Fig.3.7 Sound Quiz 

I implemented this using the AnimalQuizManager.cs script, which handles playing the animal sound when the speaker button is clicked, checking the selected answer, and triggering audio feedback. The “Next” button only appears after the correct answer is selected, ensuring users complete the interaction before progressing.

Fig.3.8 AnimalQuizManager.cs script
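
For reference, here is a rough sketch of this flow; it is an illustration of the approach rather than the actual AnimalQuizManager.cs, and the identifiers are assumptions.

using UnityEngine;

// Rough sketch of the sound-quiz flow: the speaker button replays the animal
// sound, each answer button reports its label, and the Next button is only
// revealed after the correct answer. Identifiers are placeholders.
public class AnimalQuizManager : MonoBehaviour
{
    [SerializeField] private AudioSource animalSound;   // e.g. the tiger's roar
    [SerializeField] private AudioSource correctSound;  // feedback for a right answer
    [SerializeField] private AudioSource wrongSound;    // feedback for a wrong answer
    [SerializeField] private string correctAnswer = "Tiger";
    [SerializeField] private GameObject nextButton;

    // Hooked to the speaker icon's onClick event.
    public void PlayAnimalSound()
    {
        animalSound.Play();
    }

    // Each answer button passes its own label, e.g. OnAnswerSelected("Lion").
    public void OnAnswerSelected(string answer)
    {
        if (answer == correctAnswer)
        {
            correctSound.Play();
            nextButton.SetActive(true);
        }
        else
        {
            wrongSound.Play();
        }
    }
}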

 3. QuizScene 3 – Word Matching Quiz

In the final quiz scene, users are asked to match English animal names on the left with the corresponding animal images on the right. To assist them, clicking on an English word plays the animal's sound, providing an auditory hint. 

Once all matches are correctly completed, the “Next” button becomes active. Clicking it brings the user to the Complete screen, marking the end of the quiz experience.

Fig.3.9 Word Matching Quiz

The interaction is managed by the WordMatchManager.cs script, which handles:
  • Detecting correct pairs
  • Playing confirmation sounds for each correct match
  • Activating the “Next” button once all matches are correctly completed
Fig.3.10 WordMatchManager.cs script
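
A condensed sketch of how this matching logic could look is shown below; the real WordMatchManager.cs is more detailed, and the ID-based matching used here is an assumption.

using UnityEngine;

// Condensed sketch: tapping an English word plays its sound as a hint and
// remembers the selection; tapping an animal image then checks the pair.
// The Next button unlocks once every pair has been matched.
public class WordMatchManager : MonoBehaviour
{
    [SerializeField] private int totalPairs = 3;
    [SerializeField] private AudioSource matchSound;   // confirmation sound per correct match
    [SerializeField] private GameObject nextButton;

    private string selectedWordId;  // e.g. "tiger", set when a word is tapped
    private int matchedPairs;

    // Called when an English word is tapped (e.g. from a small per-button script).
    public void OnWordSelected(string animalId, AudioSource hintSound)
    {
        selectedWordId = animalId;
        hintSound.Play();           // auditory hint for the selected word
    }

    // Called when an animal image is tapped with the matching animal id.
    public void OnImageSelected(string animalId)
    {
        if (selectedWordId == animalId)
        {
            matchedPairs++;
            matchSound.Play();

            if (matchedPairs == totalPairs)
                nextButton.SetActive(true);
        }

        selectedWordId = null;      // reset the selection for the next attempt
    }
}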

 4. CompleteScene – Final Page

The Complete screen displays a congratulatory message once all quizzes are successfully completed. It features two buttons, including:
  • Return to Home – brings the user back to the main menu
The layout is kept simple and clear, providing a satisfying conclusion to the quiz journey.

Fig.3.11 CompleteScene
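
The navigation itself only needs a scene load; a minimal sketch of the Return to Home behaviour (assuming the main menu scene is named HomeScene, as mentioned earlier) could look like this:

using UnityEngine;
using UnityEngine.SceneManagement;

// Minimal sketch of the Complete screen's navigation: the Return to Home
// button loads the main menu scene. The class name is a placeholder and
// "HomeScene" is the scene name mentioned earlier in this post.
public class CompleteScreen : MonoBehaviour
{
    // Wired to the Return to Home button's onClick event.
    public void ReturnToHome()
    {
        SceneManager.LoadScene("HomeScene");
    }
}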

Task 3 Final Submission
Fig.3.12 Video Presentation 

Click here to view our video walkthrough:

Fig.3.13 Video Walkthrough


REFLECTION

Working on this task allowed me to translate our design proposal into a functional MVP, focusing on user interaction and educational engagement. Through building features like the AR learning scenes, sound-based quizzes, and the word-matching challenge, I gained hands-on experience in combining UI elements with interactive logic in Unity.

One of the biggest challenges was ensuring smooth communication between UI buttons and audio feedback. I also had to find creative solutions for managing quiz progression and user input feedback. For example, triggering the “Next” button only after correct answers added a logical and rewarding flow.

Overall, this task helped me strengthen my skills in prototyping, user experience design, and problem-solving in Unity, while also learning the importance of team collaboration and feature integration for a cohesive experience.
