Hackathon Spotlight: Experimenting with Augmented Reality
Thursday, May 25, 2017

Some hackathon projects require more equipment than just a laptop. At Computation’s recent 24-hour collaborative event, Huy Le, Lisa Belk, and Ryan Chen built a multiplayer augmented reality application for use with a mobile phone or an untethered Microsoft HoloLens headset. The team’s hackathon experience demonstrates the event’s value for generating new ideas relevant to Lawrence Livermore’s missions. “Hackathons encourage great collaborations, novel approaches to solving complex problems, and technical risk taking without the pressures of program deliverables,” says Belk.

Whereas virtual reality technology provides a simulated environment, augmented reality superimposes generated images over real-world views (such as those captured by a smartphone’s camera). Le had wanted to create a networked multiplayer experience, so he registered his project with an open invitation to other participants. Belk signed on to explore unfamiliar technology that piqued her interest. She states, “I enjoy working on user interfaces because they allow me to connect with the customer. Working on an augmented reality hackathon project sounded like a great way to extend the interface concepts and tools in ways we have not typically done at the Laboratory.” Chen sought a project that would use his skills in animation and game development. “I was surprised to see that someone else was interested in augmented reality, so I introduced myself and joined the project,” he says.

Figure 1. (from left) Ryan Chen, Lisa Belk, and Huy Le plan their project along the hackathon’s 24-hour timeline. (Photo by Randy Wong.)

Managing time and priorities during the 24-hour period is often a challenge for hackathon participants. Within the first hour of the event, the team decided on a goal and approach. “Above all else, we wanted to finish a multiplayer shared experience that actually worked,” says Le. “We separated the project into clearly defined smaller goals so that we could gauge progress and eventually accomplish our number-one requirement.”

Chen’s work developing interactive tools and visualizations for the Global Security Directorate inspired an emergency response focus for the hackathon project. Using the scenario of an indoor radiological dispersion device exposure, the team’s proof-of-concept application allows the mobile/headset operator to view radiation dose rate data via quick response (QR) code image targets overlaid on a spatial map of the affected room. Simultaneously, another user can interact with the simulation from a command center (laptop) perspective. The combination of portable technology and real-time data transmission is intended to allow response teams to practice monitoring simulated exposure levels in a classroom setting.
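The article does not include the team’s code, but the shared-state idea it describes can be sketched in Python (the team actually worked in C# with Unity and UNET). In this hypothetical sketch, the names `SensorReading`, `SharedScenario`, and the subscription mechanism are illustrative assumptions: an operator scanning a QR-tagged sensor updates one shared scenario state, and every connected view, headset or command center, sees the same update.

```python
from dataclasses import dataclass, field

@dataclass
class SensorReading:
    sensor_id: str    # ID decoded from the QR image target (hypothetical field)
    dose_rate: float  # simulated dose rate, e.g. in uSv/h (hypothetical units)

@dataclass
class SharedScenario:
    """One scenario state, mirrored to every connected view."""
    readings: dict = field(default_factory=dict)
    views: list = field(default_factory=list)  # callbacks for headset/command-center views

    def subscribe(self, view):
        self.views.append(view)

    def report(self, reading: SensorReading):
        # An operator scanning a QR sensor updates the shared state...
        self.readings[reading.sensor_id] = reading.dose_rate
        # ...and every subscribed view (headset or command center) is notified.
        for view in self.views:
            view(reading)

# Usage: a command-center "view" receives the operator's scan in real time.
log = []
scenario = SharedScenario()
scenario.subscribe(lambda r: log.append((r.sensor_id, r.dose_rate)))
scenario.report(SensorReading("QR-07", 42.0))
```

A publish/subscribe structure like this is one plausible way to keep the operator’s view and the command-center view consistent without either one polling the other.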

Figure 2. Ryan Chen (left) and Huy Le test augmented reality functionality with HoloLens headsets. (Photo by Randy Wong.)

Each member of the hackathon team brought a different perspective and background to the project. Le is a software developer for Livermore Computing, Chen is a data analyst and visualization technologist for Global Security Computing, and Belk is division leader for National Ignition Facility (NIF) Computing. Le’s prior experience with the HoloLens headset made him a natural fit for tasks involving headset–computer syncing and multiplayer functionality. Chen tackled the simulation model with game development and animation software. Belk kept the team on task while managing QR code responsibilities.

In addition to the HoloLens devices and smartphones, the project’s technology suite included Microsoft Visual Studio for C#, Unity (a game engine), Blender (open-source animation software), UNET for networking functionality, and Vuforia Target Manager. Tasks were varied: create a database of image-recognizable QR codes used as radiation sensors in the simulation, generate a static model of the conference room where the hackathon took place, add QR code images to the model and establish image recognition, simulate radiation dispersal behavior based on estimated environmental variables, connect the operator’s view of the room to the command center, and provide visual feedback for scanned QR codes via colorization.
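Two of these tasks, simulating dispersal and colorizing scanned sensors, can be illustrated with a minimal Python sketch. The article does not give the team’s model or thresholds, so the point-source inverse-square falloff and the color limits below are assumptions chosen for illustration only.

```python
# Hypothetical color thresholds (uSv/h); the article does not specify values.
THRESHOLDS = [(10.0, "green"), (100.0, "yellow")]

def dose_rate(source_strength: float, sensor_pos, source_pos) -> float:
    """Point-source inverse-square falloff: rate = S / r^2 (ignores shielding and scatter)."""
    r2 = sum((a - b) ** 2 for a, b in zip(sensor_pos, source_pos))
    return source_strength / max(r2, 1e-6)  # clamp to avoid division by zero at the source

def color_for(rate: float) -> str:
    """Map a dose rate to the visual feedback color for a scanned QR sensor."""
    for limit, color in THRESHOLDS:
        if rate < limit:
            return color
    return "red"

# A sensor 2 m from a source of strength 100: 100 / 4 = 25 uSv/h
rate = dose_rate(100.0, (2.0, 0.0, 0.0), (0.0, 0.0, 0.0))
```

In the team’s application the equivalent logic would run per frame in Unity, tinting each recognized QR target’s overlay; the sketch isolates only the rate-to-color mapping.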

To develop the three-dimensional model of the environment, Chen calculated the length and width of the room and combined his measurements with those for standard ceiling heights, doors, and windows. For networking, Le notes, “We had to make sure that the computer application kept track of every movement and rotation of the HoloLens operators, and that the dynamic mapping of the room could be sent over the network in order for the computer to see room updates.” Using a series of hand gestures interpreted by the headsets, the team executed commands such as setting up wireless network connectivity. “With this system, pinching, tapping, and scrolling motions take the place of a traditional keyboard and mouse,” explains Belk.
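Tracking “every movement and rotation” of each operator implies a steady stream of compact pose updates over the network. The team used UNET in C#; as a hedged illustration only, the Python sketch below packs one hypothetical pose message (an operator ID, a position vector, and a rotation quaternion) into a fixed-size binary record of the kind such a stream might carry. The wire format is an assumption, not the team’s protocol.

```python
import struct

# Hypothetical wire format: 1 id byte, then position (x, y, z) and rotation
# quaternion (x, y, z, w) as seven little-endian float32 values.
POSE_FORMAT = "<B7f"

def pack_pose(operator_id, position, rotation):
    return struct.pack(POSE_FORMAT, operator_id, *position, *rotation)

def unpack_pose(data):
    operator_id, px, py, pz, qx, qy, qz, qw = struct.unpack(POSE_FORMAT, data)
    return operator_id, (px, py, pz), (qx, qy, qz, qw)

msg = pack_pose(1, (1.5, 0.0, -2.25), (0.0, 0.0, 0.0, 1.0))
# 1 id byte + 7 * 4 float bytes = 29 bytes per update
```

At 29 bytes per update, even a 60 Hz pose stream per operator stays small, which is one reason fixed binary records are a common choice for this kind of real-time state synchronization.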

Figure 3. This interactive training concept for responding to an indoor radiological dispersal device (RDD) exposure uses the hackathon team’s augmented reality technology. Click to enlarge. (Rendering by Ryan Chen.)

Many Laboratory research areas can benefit from augmented reality. The team cites applications in emergency-responder training and decision making in situations where global positioning systems cannot be used. In addition, computer models overlaid on existing surfaces could enhance advanced manufacturing or stockpile stewardship. “The technology could be used to better visualize the path of the NIF laser beamlines as they traverse the laser bay, switchyard, and target bay. It could also assist with optimizing configurations of diagnostic assemblies on the NIF target chamber,” says Belk. “Our hackathon project reminded me that stepping outside our comfort zone can lead to great advances that ultimately serve our national interest, and that makes for a very rewarding 24-hour endeavor!”

After the hackathon concluded, the team agreed their experience was positive. They proved that a prototype can be successfully built in a short turnaround when everyone works together according to a plan. Le reflects, “At a hackathon, the difference between success and failure is a great team. At the end, you can either show that you learned something or show that you made something ambitious that worked. I’m proud that our team managed to do both.”

The Computation Directorate hosts three hackathons per year—spring, summer, fall—at the Laboratory’s High Performance Computing Innovation Center. Past hackathons are covered in News & Press. More information about how Livermore’s simulation technology serves disaster preparedness can be found in Science & Technology Review.