By Wende Whitman

To make machines that are smarter than Alexa, Siri and other robo-advisors, researchers are trying to develop a more sophisticated and powerful system of data processing. Artificial intelligence needs a memory type that can handle pushing around the terabytes of data that machine learning algorithms generate. And pushing big data can't take weeks; AI needs to crunch through task after task at lightning speed, while consuming very little energy. Lastly, if AI is to become commercial, materials need to be readily available so manufacturing them doesn't break the bank.

Kiumars Aryana, a fourth-year mechanical and aerospace engineering Ph.D. student in the University of Virginia School of Engineering and Applied Science, has made significant advances in all of these areas. Some of those advances were captured in his two latest research articles, about developing a new, more efficient type of memory and finding the best materials for thermal insulation in memory cells, both published in Nature Communications.

Aryana's work, as part of mechanical and aerospace engineering professor Patrick Hopkins' ExSiTE lab, could bring about the next leap in memory capability, not unlike the jump we saw between floppy disks (remember those?) and thumb drives. His work could also allow AI to make big upgrades in processing capability without taking down the power grid. Aryana is also being intentional about developing solutions that use everyday, available materials like silicon, which is already widely used commercially for electronic devices.

So, what do the building blocks of AI 2.0 look like? First, Aryana and other researchers hope to say goodbye to the von Neumann bottleneck – the separation between memory and processors that has been part of all mainstream computing architectures since pioneering computer scientist John von Neumann described it in 1945.
“If we intend to do the type of data processing needed by more advanced artificial intelligence, we need a better way to process and store information,” Aryana said. “The current transistor-based memory technology is no longer capable of satisfying our needs for faster and more power-efficient computation. One of the most promising candidates for next-level data processing is phase change memory, and that's what we're working on.

“Your computer CPU uses two types of storage memory: short-term ‘working' memory, what we know as RAM, and long-term ‘storage' memory, what we know as SSD or our hard drive,” Aryana said. “Forty percent of the energy needed for the computer is used transferring data back and forth between the processor and the memory. Phase change memory has the potential to combine all of this into a single processing and storage unit, called in-memory data processing, using significantly lower energy.

“In our field, reverse-engineering the brain and developing an in-memory processor that can perform highly parallel tasks such as language translation and pattern recognition with minimum energy and latency is considered the Holy Grail. However, we still have a way to go to realize this technology. If we can implement it, this would be a huge boost for future data-intensive technologies,” Aryana said.

Aryana explains that the human brain uses an unbelievably low amount of power for daily tasks – about the same amount of energy as a light bulb. In contrast, the world's biggest supercomputer, the closest comparison to the human brain, uses 15 megawatts, enough to power more than 7,000 homes.

“Before we scale up on computational capability, we need to scale down on power usage,” Aryana said.

For phase change memory, it all comes down to heat – managing the way heat is transferred within the memory cell by electrical pulses and then controlling where the heat goes.
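The gap between the brain and a supercomputer is easy to check with back-of-the-envelope arithmetic. The sketch below uses the article's 15-megawatt figure; the ~20-watt brain estimate and the ~2-kilowatt average household draw are common illustrative values assumed here, not numbers from Aryana's research.

```python
# Rough power comparison using the figures quoted above.
# BRAIN_POWER_W and HOUSEHOLD_POWER_W are assumed illustrative values.

BRAIN_POWER_W = 20            # human brain, roughly a light bulb
SUPERCOMPUTER_POWER_W = 15e6  # 15 megawatts, as quoted
HOUSEHOLD_POWER_W = 2000      # assumed average household draw

ratio = SUPERCOMPUTER_POWER_W / BRAIN_POWER_W
homes = SUPERCOMPUTER_POWER_W / HOUSEHOLD_POWER_W

print(f"~{ratio:,.0f}x the power of a brain")   # ~750,000x
print(f"enough for ~{homes:,.0f} homes")        # ~7,500 homes
```

Under these assumptions the supercomputer draws roughly 750,000 times the brain's power, and the household math lands just above the "more than 7,000 homes" figure in the article.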
Data is stored using a special type of material that can switch between crystalline and non-crystalline, or amorphous, states using heat. Bits, or data, are stored and erased by toggling between the states.

Until now, phase change memory has needed a lot of heat – that is, energy – to switch between a crystalline and an amorphous state, and that is the primary reason phase change memory hasn't become more commonplace. Aryana discovered a way to make the switch between states with a lot less heat – and energy – by engineering the interfaces between components to better conserve heat within the memory cells. By controlling the heat and isolating it within the memory cell, less energy is required to make the switch.

“Our work presents a new paradigm to manage thermal transport in memory cells by manipulating the interfacial thermal resistance between the phase change unit and the electrodes without incorporating additional insulating layers,” Aryana said.

The properties of phase change memory also allow computational tasks to be executed in the memory cells themselves, by taking advantage of the binary system naturally provided by the two material phases. With more control and less heat, phase change memory can be used to its fullest: faster processing capability in memory, because the data isn't shuttled around to different places to be processed, and no risk of burning up the memory when it's doing all the intense computational work.

“Solving the heat issue for phase change memory means memory processing components for AI can get super fast and powerful. The supercomputers can get smaller with tons more computing power and not use crazy amounts of electricity; this capability could be the genesis of neural networks needed for machines and robots that can lead to higher-level learning, computation and decision-making. At the very least, your computer and your devices are going to get a whole lot faster,” Aryana said.
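The toggling described above can be sketched as a toy model: a heat pulse switches the cell between a conductive crystalline phase (read as 1) and a resistive amorphous phase (read as 0). This is a minimal illustration of the general concept only; the class, pulse descriptions and state names are invented for this sketch, not device parameters from Aryana's papers.

```python
# Hypothetical sketch of a phase change memory (PCM) cell.
# A heat pulse toggles the material between two phases, which
# encode one bit; reading senses resistance without switching.

CRYSTALLINE, AMORPHOUS = "crystalline", "amorphous"

class PCMCell:
    def __init__(self):
        self.state = AMORPHOUS  # starts in the disordered phase, stores 0

    def set_bit(self):
        # A moderate, longer heat pulse anneals the material into
        # the ordered crystalline phase -> logic 1.
        self.state = CRYSTALLINE

    def reset_bit(self):
        # A short, intense pulse melts and rapidly quenches the
        # material back into the amorphous phase -> logic 0.
        self.state = AMORPHOUS

    def read(self):
        # Crystalline is low-resistance (1); amorphous is
        # high-resistance (0). Reading does not change the phase.
        return 1 if self.state == CRYSTALLINE else 0

cell = PCMCell()
cell.set_bit()
print(cell.read())   # 1
cell.reset_bit()
print(cell.read())   # 0
```

The heat-management advance in the article maps onto this picture as shrinking the energy cost of each `set_bit` and `reset_bit` call by keeping the pulse's heat confined to the cell.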
Aryana took another step toward AI 2.0 when he took a peek behind the door of nanofabrication – layering and designing materials at the atomic level – to find out the details of heat transfer, or thermal conductivity, in different materials used for memory construction. In his study, Aryana pinned down exactly which alloy has the lowest thermal conductivity. The answer? Silicon telluride. When used with phase change memory, this material can insulate the heat more efficiently and at smaller thicknesses.

In this work, he not only identified the lowest-thermal-conductivity material, an amorphous solid, but he also found out why heat is conducted so poorly in these materials. Scientists know that atomic weight – how heavy a material's atoms are – and atomic structure – how the atoms connect with each other in a material – work together to produce low thermal conductivity, but Aryana discovered that the degree of connectivity in the atomic structure plays the biggest role.

This might not seem like a huge announcement, but since heat decides the fate of faster processing with low energy consumption in phase change memory, the insulation material that controls the heat is the other decision-maker on the team. Together, they are the gatekeepers of phase change memory and the progress of AI.

Because of Aryana's work, the heat problem for phase change memory has been improved, and the best insulator has been discovered and analyzed. These discoveries have carried Aryana and other scientists a lot farther down the road toward building next-generation AI.

A very basic phase change memory is available for your computer now. It's a first attempt at combining short-term and long-term memory – RAM and SSD – and eliminating a lot of the data shuttling.
But before you shop on Black Friday and Cyber Monday, know that this nascent technology uses a great deal of energy, so it isn't yet able to reach its fullest potential.

Aryana's work is supported by industry partner Western Digital, a California-based company that intends to create a bridge between working and storage memory using this new phase change memory technology – and potentially combine them into one single memory – without using a lot of juice. Aryana thinks that with breakthroughs in power consumption on the horizon, Western Digital will bring the first applications of a more mature phase change memory to the commercial market in the next five to 10 years for computers, with cellphones and other devices not far behind.

“But the real power of a matured phase change memory technology is in adding capacity and speed to supercomputers for complicated simulations,” he said. “Right now, when scientists study complex challenges, they only study a small portion of the big picture, then piece together the results and form hypotheses. With supercomputers, they could study the whole thing. Supercomputers paired with phase change memory are starting to mimic human thinking.”

One pertinent example of how supercomputer simulations could help is in the field of medicine, exploring the effects of drugs or vaccines on the human body. Human physiology is extremely complex, with a multitude of systems and dependencies, but a supercomputer with the powerful processing units that phase change memory could potentially deliver could work out the possible outcomes using a simulation that captures whole-body behavior, instead of just looking for clues in smaller studies. Animal testing and clinical trials on people could eventually become obsolete, and having this capability in the triage arsenal would be powerful in pandemics.

As of this writing, Aryana's third Nature Communications paper in 2021 has been accepted for publication.