This February, the Bankhead Theater in downtown Livermore hosted another series of Science on Saturday events. These lectures, part of Lawrence Livermore National Laboratory’s (LLNL’s) educational outreach program, give local students and the public a glimpse of the Laboratory’s recent research. Continuing this year’s theme of Marvelous Machines, LLNL computer scientist Katie Lewis and Monte Vista High School (Danville, CA) teacher Rodger Johnson collaborated on a lecture called “The Evolution of Computing Technologies: From Following Instructions to Learning.”
LLNL’s 65-year computing history includes marvelous machines that predate personal computers and smartphones as well as Sierra, the Laboratory’s newest supercomputer. As computing demands have increased, computing architectures have evolved from serial to parallel processing, with the most powerful machines able to perform quadrillions of operations per second.
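The shift from serial to parallel processing boils down to splitting one big job into independent pieces that run at the same time. A minimal sketch of that decomposition, using a Python thread pool as a stand-in for a supercomputer's many processors (illustrative only, not LLNL code):

```python
from concurrent.futures import ThreadPoolExecutor

def partial_sum(chunk):
    """Each worker independently sums the squares of its slice."""
    return sum(x * x for x in chunk)

def parallel_sum_of_squares(data, workers=4):
    # Serial processing would walk the whole list one element at a time;
    # here the list is split into chunks that workers handle concurrently.
    size = max(1, len(data) // workers)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(partial_sum, chunks))

print(parallel_sum_of_squares(list(range(1000))))  # 332833500
```

Real supercomputers apply the same divide-and-conquer idea across many thousands of processors rather than a handful of threads.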
Scientific computing contributes to major advances in technology. “Scientists integrate experimental observations and computer modeling,” Lewis explained, showing a three-dimensional aerodynamics animation as an example. The computer simulation of air flowing over, under, and around an airplane wing was derived from physics equations. To simulate a process at higher resolution (and thus higher accuracy), scientists use supercomputers that can handle the resulting complex equations.
Figure 1. LLNL’s newest supercomputer, Sierra, can process 125 petaflops. “It’s difficult to convey how huge this number is,” said Lewis. For example, this number of seconds is almost as long as the Earth’s age. (Photo by Joanna Albala/LLNL Science Education Program.)
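The scale of the caption's comparison can be checked with back-of-the-envelope arithmetic (taking Earth's age as roughly 4.5 billion years; the figures below are illustrative):

```python
PETAFLOP = 1e15                          # 10**15 operations per second
sierra_ops_per_second = 125 * PETAFLOP   # 1.25e17 operations each second

SECONDS_PER_YEAR = 365.25 * 24 * 3600
# Counting off one second for each operation Sierra performs in a
# single second would take billions of years:
years_to_count = sierra_ops_per_second / SECONDS_PER_YEAR
print(f"{years_to_count:.2e} years")     # ~3.96e9, close to Earth's ~4.5e9-year age
```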
Are Computers Getting Smarter?
Lewis introduced the concept of machine learning, in which a computer improves its performance of a task by recognizing patterns and other connections among data. Machine learning can be used to tackle diverse scientific problems such as understanding Earth’s climate, fighting cancer, and discovering how the universe works.
The presenters used text messaging to demonstrate a basic form of machine learning. First, Lewis sent Johnson a message: How are you doing? Johnson began typing—Okay—then completed his response by choosing only words suggested by the messaging application. These suggestions were based on his past texting behavior—words and phrases he commonly uses—as well as usage by the application’s aggregate users. As their exchange progressed, Johnson’s responses quickly became nonsensical. Although the smartphone exhibited the ability to learn from user behavior, its performance did not consider Lewis’ half of the conversation.
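The word-suggestion behavior in the demonstration can be approximated with a simple bigram frequency model: a hypothetical sketch of how an app might rank next words from a user's past messages (real keyboards use far more sophisticated models):

```python
from collections import Counter, defaultdict

def train(history):
    """Count which word follows which in the user's past messages."""
    model = defaultdict(Counter)
    for message in history:
        words = message.lower().split()
        for current, nxt in zip(words, words[1:]):
            model[current][nxt] += 1
    return model

def suggest(model, word, k=3):
    """Suggest the k words the user most often typed after `word`."""
    return [w for w, _ in model[word.lower()].most_common(k)]

past_texts = [
    "i am doing okay",
    "i am doing great thanks",
    "i am running late",
]
model = train(past_texts)
print(suggest(model, "am"))   # ['doing', 'running']
```

Because the model knows only the sender's own history, chaining its suggestions drifts toward nonsense, just as Johnson's replies did.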
Lewis continued with an explanation of neural networks. The human brain contains almost 100 billion interconnected neurons that send and receive signals. An artificial neural network is modeled after the brain. “With artificial neural networks, computers can explore many paths simultaneously and adapt to change, just like the brain does,” stated Lewis.
To illustrate this point, Lewis showed how traditional if/then code constructs sentences using a combination of certain words. A neural network approaches the problem by making connections between words and weighing the strength of those connections. When words are added, the neural network adapts by creating more connections based on this new information.
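The contrast Lewis drew can be sketched with a toy "network" of weighted word-to-word connections. This is an analogy for how a neural network strengthens connections and grows new ones, not a real neural network implementation:

```python
class WordNetwork:
    """Toy network of weighted word-to-word connections (an analogy
    for connection strengths, not an actual neural network)."""

    def __init__(self):
        self.weights = {}   # (word, next_word) -> connection strength

    def observe(self, sentence):
        words = sentence.lower().split()
        for a, b in zip(words, words[1:]):
            # Seen pairs get stronger; unseen pairs create brand-new
            # connections, so adding words makes the network adapt.
            self.weights[(a, b)] = self.weights.get((a, b), 0.0) + 1.0

    def strongest_next(self, word):
        candidates = {b: w for (a, b), w in self.weights.items() if a == word}
        return max(candidates, key=candidates.get) if candidates else None

net = WordNetwork()
net.observe("the cat sat")
net.observe("the cat ran")
net.observe("the cat sat")
print(net.strongest_next("cat"))   # sat
```

Unlike fixed if/then rules, nothing here is hand-coded: the strengths, and the connections themselves, come entirely from the observed data.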
For the second demonstration, a student dribbled a basketball but could not explain the physics involved. “You solve problems because of your experience, not necessarily because you know how it works,” Lewis noted. “Machines also learn by practicing.” Similarly, Google’s search engine strengthens the connections driving its auto-complete suggestions by learning from users’ search inputs over time.
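A frequency-ranked autocomplete is a simplified stand-in for the behavior described above (this sketch is not Google's actual algorithm):

```python
from collections import Counter

class AutoComplete:
    """Toy autocomplete: each search 'strengthens' a query, and
    suggestions are ranked by that accumulated strength."""

    def __init__(self):
        self.counts = Counter()

    def record(self, query):
        self.counts[query.lower()] += 1   # practice strengthens the entry

    def suggest(self, prefix, k=3):
        matches = [(q, n) for q, n in self.counts.items()
                   if q.startswith(prefix.lower())]
        matches.sort(key=lambda pair: -pair[1])
        return [q for q, _ in matches[:k]]

ac = AutoComplete()
for q in ["weather today", "weather today", "weather radar", "web comics"]:
    ac.record(q)
print(ac.suggest("we"))   # ['weather today', 'weather radar', 'web comics']
```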
The presentation concluded with a video showing an artificial neural network in action. LLNL is evaluating IBM’s TrueNorth, a brain-inspired computing platform. The 16-chip system imitates the human brain with the equivalent of 16 million neurons and 4 billion neural connections. In the video, TrueNorth identified objects in images by determining the probability of a match.
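Picking the most probable label from a set of match scores, as in the video, can be illustrated with a softmax over hypothetical scores (this sketch says nothing about TrueNorth's actual spiking architecture):

```python
import math

def softmax(scores):
    """Convert raw scores into probabilities that sum to 1."""
    exps = [math.exp(s - max(scores)) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def classify(scores_by_label):
    """Return the label with the highest match probability."""
    labels = list(scores_by_label)
    probs = softmax([scores_by_label[lbl] for lbl in labels])
    best = max(range(len(labels)), key=lambda i: probs[i])
    return labels[best], probs[best]

# Hypothetical scores a network might assign to one image
label, p = classify({"car": 2.1, "bicycle": 0.3, "pedestrian": 1.0})
print(label, round(p, 2))   # car 0.67
```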
Figure 2. Lewis and Johnson demonstrated how smartphones learn from users’ texting habits. (Photo by Joanna Albala/LLNL Science Education Program.)
Reaching the Next Generation
As leader of LLNL’s Applications, Simulations, and Quality division, Lewis knows firsthand how computing helps scientists across the Laboratory. She finds great potential in machine learning for tackling challenges in new ways. “It’s an exciting time to be working in scientific computing,” said Lewis. “I want students to see that.”
Accordingly, Lewis was happy to contribute to Science on Saturday and its spinoff series, Science on Screen. “I was excited to talk about how computing is an ‘enabling technology’ for many different areas of science,” she noted, adding that Science on Screen reaches a community (Modesto, CA) beyond Livermore. “I like that our work at LLNL is highlighted through this program so students can learn about STEM as well as how they can participate in growing the field.”
The texting and basketball demonstrations during the lecture were designed for this target audience. Lewis explained, “Students can see how they use machine learning daily and that they learn in a much different way than how most computer programming is done.”