Capstone – ProjectAR by Bruce

The year 2018 marks the dawn of mobile Augmented Reality (AR). With new technologies like ARKit and ARCore widely available to developers and users, our phones can see the world like never before. Phones know where they are and can track themselves spatially, which allows users to interact with objects in a virtual environment through the screen as if they existed in the real world. This opens up new possibilities for interactive experiences. The problem, however, is that the screen has a limited size. In virtual reality, users can simply turn their heads to see interactions happening around them, which is not possible in mobile AR. ProjectAR seeks improvements in this direction.

The experimental interactive game “ProjectAR” explores extended immersion with the help of projection mapping. The player sees a virtual world through the phone, which works like a magnifying glass, while also watching the floor, where a ceiling-mounted projector reveals targets to interact with. The game itself is simple: the player needs to find as many sprites as possible in a pond of fog. Sprites inside the screen won’t move, since they are trying to hide from the player, but sprites outside of the screen may move, leaving a trace on the floor. After figuring out where the sprites are, the player can blow into the phone to dispel the fog, reveal the sprites underneath, and tap to collect them.

ProjectAR aims to explore what works and what doesn’t in mobile AR and to tackle some of its weaknesses using existing technologies. The initial idea for this combined platform was a spatial Twitter visualization; during user testing, however, it did not trigger much interaction. A game pushes the interaction and tracking capabilities to the extreme and encourages participation.

ProjectAR can be set up as an interactive installation using one ARCore-enabled mobile phone, one computer, and one projector.

Project video:

Presentation Slides and project files:

RUI – Final Documentation by Bruce

Main Menu

Filtered Category

Detailed view

Print Layout



Online Demo

IMA is a unique major, and every IMA student is unique. As graduation approaches, every senior is preparing a portfolio to show their best work to the world. However, the portfolio examples found on the Internet usually focus on one specific area.

For IMA students like me, who are not sure what to do but are capable of a few different kinds of jobs, portfolio design faces challenges similar to those of full-service advertising agencies that offer a wide variety of work. This portfolio is also a React remake of my previous Vue.js version, built to compare the strengths and weaknesses of both frameworks.

Capstone – Progress Update 2 by Bruce

During this period, I sent descriptions and storyboards to a few friends outside of China for some remote user testing and feedback.

One game designer suggested that since my main focus was the combination of AR and projection mapping, the content itself isn’t that important. But the subject I had picked for this new combined medium wasn’t very participatory; in other words, the content I was making wasn’t interesting and attractive enough. We talked for a few hours and figured out a new game that could reuse most of the interactions I had already programmed while further emphasizing the new possibilities introduced by this combination.

The new game is still based on the same setup: one ARCore-enabled mobile phone, one computer, and one projector. The main goal is to catch the little glowing sprites running around under the fog. The fog is displayed on the phone and also leaves a white glow in the physical room through the projection. The sprites leave a golden trace as they run around. Users need to blow into the phone to puff the fog away. An uncovered sprite will freeze for a second, and the user has to get close enough and tap to capture it before it hides back into the fog. The sprites move randomly after idling for a while, so the user has to pay attention to the projected floor. Users compete with each other on score.
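The blow interaction can be approximated as sustained loudness on the microphone. Below is a minimal sketch of that detection logic, written in TypeScript for illustration only (the game itself runs in Unity, and the threshold and frame-count values are assumptions, not tuned numbers):

```typescript
// Detect a "blow" as sustained microphone loudness above a threshold.
// THRESHOLD and HOLD_FRAMES are illustrative values, not tuned ones.
const THRESHOLD = 0.3;  // RMS amplitude, in the 0..1 range
const HOLD_FRAMES = 5;  // consecutive loud frames required

function rms(samples: number[]): number {
  const sum = samples.reduce((acc, s) => acc + s * s, 0);
  return Math.sqrt(sum / samples.length);
}

function makeBlowDetector() {
  let loudFrames = 0;
  // Returns true once loudness has been sustained long enough.
  return (samples: number[]): boolean => {
    loudFrames = rms(samples) > THRESHOLD ? loudFrames + 1 : 0;
    return loudFrames >= HOLD_FRAMES;
  };
}
```

In the game loop, the detector would be fed one buffer of microphone samples per frame, and the fog would start dispersing once it returns true; requiring several consecutive loud frames avoids triggering on short noises like taps.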

I could say this testing session changed almost everything about my project, but I feel better working on the new version. It does mean I need to work extra hard to finish in time for the presentation. One thing I learned is that it is really helpful to test with people outside of my own circle who have different backgrounds.

RUI – Final Competitor Research & User Story by Bruce

A portfolio is a personal thing. When you show your portfolio to others, it should speak just like you, not like anyone else. IMA students are highly diverse in skill sets, which makes their portfolios harder to design.

  1. R/GA:
    R/GA is a full-service advertising company. Like us, they provide services in many different fields. Having to serve many kinds of clients, their work page has comprehensive features to help viewers choose and filter for their needs. Each project case study has a short write-up, a slideshow of images or videos, social sharing tools, and related work.
    For my own needs, filtering is worth borrowing, but R/GA’s version is far too complicated.
  2. Shirley Huang:
    Shirley is a former IMA student whose portfolio website has been recommended by professors as a good example. As someone with a strong visual design background, her website carries her strong personal style. The works are categorized by type and then further tagged. Since her works are usually interactive, hovering the mouse over a project reveals preview footage.
    The issue is that there is no way to prioritize or control what to show to a specific group of viewers.

User Story:

  1. As a user, I can see a few featured projects.
  2. As a user, I can see a list of projects in order of importance.
  3. As a user, I can see the title of each project and a big thumbnail.
  4. As a user, I can get a URL made specifically for me so the results are tailored to my needs (academic, job search, client).
  5. As a user, I can click into a project to see details: images, videos, descriptions, and even demos.
  6. As a user, I can go to related projects by clicking on a category or a specific tag.
  7. As a user, I can view the website with no issues on my phone.

RUI – Final Project Idea

For the final project, I’m going to rebuild my personal portfolio website using React, as a comparison to my previous version, which was made in Vue. Front-end developers can never reach an agreement with each other on such topics. The only way for me to better understand the design differences between frameworks is to actually use them, so I can pick the right tool for the right job.


Each IMA student is different, yet we share one core trait: being cross-disciplinary. If you Google portfolio designs, almost all of them focus on one area. We need a better UI that can filter works by area, show the wide range of areas we work in, and let us manually select a few pieces for a certain employer by giving them a special hidden category or tag.
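The hidden-category idea could work by carrying a tag in the URL and filtering projects against it. A minimal sketch in TypeScript, assuming a hypothetical `for` query parameter and a simple `Project` shape (neither is from the actual site):

```typescript
// Filter portfolio projects by a hidden tag carried in the URL, e.g.
// https://example.com/?for=job-search. The "for" parameter name and the
// Project fields are illustrative assumptions.
interface Project {
  title: string;
  tags: string[];
  priority: number; // lower number = more important
}

function projectsForVisitor(projects: Project[], url: string): Project[] {
  const tag = new URL(url).searchParams.get("for");
  // No tag in the URL: fall back to showing everything.
  const visible = tag ? projects.filter(p => p.tags.includes(tag)) : projects;
  return [...visible].sort((a, b) => a.priority - b.priority);
}
```

A link like `/?for=job-search` handed to a recruiter would then show only the projects tagged for job applications, while a bare URL falls back to the full prioritized list.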

Capstone – Progress Update by Bruce

Progress so far:

  1. Networked AR
    Basically, everything happening on the phone is now sent to the computer in real-time.
  2. Twitter API
    The fundamental groundwork for accessing the platform.
  3. Initial Modelling
  4. Moved everything to the prototyping area and got a dedicated workstation for running the installation
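The real-time phone-to-computer link needs an agreed-on message format. Here is a minimal sketch of a JSON pose message with encode/decode helpers, written in TypeScript for illustration (the field names are assumptions; the actual Unity project may structure its messages differently):

```typescript
// A minimal message format for streaming the phone's pose to the computer.
// Field names are illustrative, not taken from the actual project.
interface PoseMessage {
  t: number;                                  // timestamp in milliseconds
  position: [number, number, number];         // x, y, z in meters
  rotation: [number, number, number, number]; // quaternion x, y, z, w
}

function encodePose(msg: PoseMessage): string {
  return JSON.stringify(msg);
}

function decodePose(raw: string): PoseMessage {
  const msg = JSON.parse(raw) as PoseMessage;
  // Reject malformed messages early instead of corrupting game state.
  if (msg.position.length !== 3 || msg.rotation.length !== 4) {
    throw new Error("malformed pose message");
  }
  return msg;
}
```

Validating on the receiving side matters here because a dropped or truncated network packet should be discarded rather than applied to the projection.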

In Progress:

  1. Unity game logic
    Staging for the animations, stories, and user interactions.
  2. Projector installation on Thursday with Facilities
    In the process of evaluating.

Next steps:
  1. Solve the wifi connection issue with IT (ticket filed)
    One AP in the room is dead.
  2. Tracking correction algorithm
    A few options to try.

RUI – Midterm by Bruce

Code Demo

This is my contact book. It focuses on basic features, but it has full-text search combined with filters.

Users won’t need to input precise information, just whatever they would use to describe someone, and searching for that text will bring the contact up.

Other planned designs: the user’s interactions with each contact will be recorded and used to generate the “recent” section, as well as to sort the entire list in that order. Users will also be able to see the list in the order they added the contacts. Users will be able to override the default random color or upload an avatar for a contact, to help them memorize and quickly locate someone.
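The search-by-description idea can be sketched as a case-insensitive substring match across every stored field, optionally combined with a filter predicate. The `Contact` shape below is an assumption based on the write-up, not the app's actual data model:

```typescript
// Full-text search across every contact field, combined with a filter.
// The Contact fields are assumed for illustration.
interface Contact {
  name: string;
  phone: string;
  email: string;
  notes: string;   // free-form text describing the person
  country: string;
}

function searchContacts(
  contacts: Contact[],
  query: string,
  filter?: (c: Contact) => boolean
): Contact[] {
  const q = query.toLowerCase();
  return contacts.filter(c => {
    // Concatenate all fields so any remembered detail can match.
    const haystack = [c.name, c.phone, c.email, c.notes, c.country]
      .join(" ")
      .toLowerCase();
    return haystack.includes(q) && (filter ? filter(c) : true);
  });
}
```

A linear substring scan like this is enough for a personal contact book; a larger dataset would want a proper index, but the point is that a fuzzy memory like "met at the AR meetup" is a valid query.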

Capstone – User Testing by Bruce

The milestone for the two-week progress is to have the basic networking set up and tested with the projector. So far, the networking has been done and tested, but the capstone prototyping space wasn’t allocated in time for the installation. So instead of testing the prototype I made, I asked users to try out a few mobile AR demos. I picked these because not a lot of people have experience with mobile AR, specifically ARCore, since it’s only available on very few devices.

The interaction model of the AR platform consists of spatial movement and tapping on the screen. This is consistent with almost all ARCore apps, including my proposed project.

Since it’s something new to most people, I asked them the questions below; here are some of the answers I found expected, interesting, or completely surprising.

  1. What do you think about it?
    “It’s cool.” This is the most common answer I got.
    In fact, the feedback treats it more as a trick than as something really useful; the limited use cases and availability are going to be issues. Still, my proposed networking feature sounded like something fun to them.
  2. What do you think works well?
    It technically works really well. Some of the testers had tried Project Tango, and the fact that mobile AR doesn’t need additional sensors amazes them. The spatial tracking is fairly quick and accurate, and the ability to pin something spatially to a textured surface is really cool.
  3. What doesn’t?
    One might think it works the moment the app opens, but it actually requires the user to move around a bit. When told to move, users usually just swing the arm, which is not enough for the tracking to recognize surfaces. This is expected, since it is like exploring the space using only one eye: only by moving can you tell the distance between objects. A good way to deal with it is to guide the user through some movements (by utilizing storytelling, for example) without explicitly telling the user to move.

    One of the demos is a Tilt-Brush-like app. One tester reported that it’s much harder to use, since the input is touching and drawing on a 2D screen while the drawn object actually lives in 3D space. The lack of stereo vision adds to the confusion. Users need visual clues to know what they are interacting with.

    One of the testers had seen me attaching objects to places like the whiteboard or the ceiling and wanted to try that too. The point cloud indicated the area was well tracked, but he couldn’t actually attach anything to it. I had to instruct him to move a bit spatially to let the phone understand the surface it was facing. This is also something I need to be aware of when designing interactions around this feature.

  4. How do you find the tracking?
    So far, so good. There are dropouts, but tracking recovers fairly quickly. Still, it’s really annoying when it happens, and users have no clue how to resume the tracking.

RUI – Week 5 Address Book App Competitor Analysis by Bruce

  1. Google Android Contacts:
    This is the contact app that comes with Pixel / Nexus phones and can also be downloaded from the Google Play Store. Its design assumes you are a Google user with all your contacts in your Google Account. The user interface is fairly simple: a list of contacts, each entry starting with the contact’s avatar, divided into sections by the first character of the contact’s name. The search bar is triggered by the search button on the top navbar; contact search can also be triggered from the system-wide search bar. The app lets the user make groups and clean up clutter, and by sliding on the far right side of the screen, the user can quickly jump between initials. The details page lists everything clearly, including call history and chat history with that contact.
    It’s simple, easy to use, and handles multiple languages fairly well. However, it lacks filtering options: the initial-letter jumping is imprecise and lacks feedback, and the app can’t accomplish tasks like filtering contacts by country or sorting by other criteria. Google is a search company, but search can’t satisfy all needs.
  2. iOS Contacts:
    Fairly similar to Google’s Contacts app, it does the basics, listing all of the user’s contacts by name only, separated by initials. It also has the quick initial-jumping feature, with a slightly clearer presentation. The search bar is at the top of the screen.
    Similarly, it lacks effective filtering and sorting.
  3. People (Windows 10)
    Another contact app that comes with the operating system. Similar in design to Google Contacts, it shows each contact with an avatar, name, email, and phone number. It has a search bar at the very top of the screen and a filtering option in the side column, though the filtering button does nothing more than select which account to display or hide contacts without a number. The integration of contacts across multiple apps is a plus. It has poor multi-language support, lumping all non-English contacts together.
  4. WeChat’s Contacts
    Not a typical contact app, but one that a lot of people use. While the main chat layout displays contacts in order of the last message, with the option to pin conversations to the top, the contact page is no different from the others: alphabetical order. The top navbar is the same in both layouts: a search and an add button. The search feature is a Swiss Army knife, able to search almost everything in WeChat, even chat logs. Each contact is displayed like in Google Contacts, an avatar followed by the name, divided by initials, with the iPhone-style initial navigation bar on the right side. The top section displays a few options (New Friends, Group Chats, Tags, and Official Accounts), separating them from individual contacts. The tag feature allows the user to tag any contact with any desired text label.
    There is no way to sort, and there is no separate filtering layout, but filtering can be accomplished through search, which covers tags, countries, provinces, cities, WeChat IDs, and remarks.

User Story:

  1. As a user, I can see a few contacts I interact with the most, followed by all other contacts in alphabetical order.
  2. By clicking the search button, I can open an interface for searching and filtering.
  3. I can search using anything stored in the data, though name matches are prioritized.
  4. The filtering feature provides different interfaces for different kinds of data.
  5. There is a sort button on the navbar, letting the user choose between ascending and descending, first added and last added, and most contacted.
  6. It supports English and Chinese names; Chinese names are clustered by Pinyin together with English names.
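Clustering Chinese names alongside English ones can be done by sorting on a Pinyin key. A minimal sketch of that idea follows; the tiny character table is a stand-in for a real Pinyin conversion library, which the actual app would need:

```typescript
// Sort mixed English/Chinese names alphabetically by clustering Chinese
// characters under their Pinyin reading. This three-entry table is only a
// stand-in for a full Pinyin conversion library.
const PINYIN: Record<string, string> = { "张": "zhang", "李": "li", "王": "wang" };

function sortKey(name: string): string {
  return name
    .split("")
    .map(ch => PINYIN[ch] ?? ch.toLowerCase())
    .join("");
}

function sortContacts(names: string[]): string[] {
  return [...names].sort((a, b) => sortKey(a).localeCompare(sortKey(b)));
}
```

With this key, “李四” sorts under “L” and “张三” under “Z”, interleaved with English names instead of being lumped into a separate non-English bucket the way the Windows People app does.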