AI for Nanoscale Imaging of Contact in the Human-Machine Interface



Over the summer of 2020, I researched this topic as part of the Texas A&M University REU program. In collaboration with the INVENT lab, I used deep neural networks to analyze and classify apparent contact. The video above summarizes my findings.

While everyone knows what it feels like to touch a phone screen, the physics of this simple action is poorly understood. Human fingers can perceive the nanoscale topography of a surface, as well as the nanoscale interactions between the finger, the surface, and the space between the points of contact. A finger is not flat: it is covered in valleys and ridges we call fingerprints, and these ridges have finer ridges of their own down to the nanoscale. This roughness is critical to how humans perceive texture. To better understand the forces involved in touch and the tribology of human fingers, we need imaging that can accurately distinguish sweat, oil, air, and true contact area at the interface. Deep learning shows promise as a tool for gaining insight into this material interface.
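As a rough illustration of the classification idea, the sketch below runs small image patches of the finger-surface interface through a tiny neural network that assigns each patch to one of four classes: true contact, air, sweat, or oil. The architecture, patch size, and class labels here are illustrative assumptions, not the actual model or data from the project.

```python
import numpy as np

# Hypothetical classes for regions of the finger-surface interface.
CLASSES = ["contact", "air", "sweat", "oil"]

rng = np.random.default_rng(0)

# A tiny one-hidden-layer network: 8x8 patch (64 pixels) -> 16 hidden
# units -> 4 class scores. Weights are random; a real model would be
# trained on labeled interface images.
W1 = rng.normal(scale=0.1, size=(64, 16))
b1 = np.zeros(16)
W2 = rng.normal(scale=0.1, size=(16, 4))
b2 = np.zeros(4)

def softmax(z):
    # Subtract the row max for numerical stability before exponentiating.
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def classify(patches):
    """patches: (n, 8, 8) array of intensities in [0, 1].

    Returns an (n, 4) array of class probabilities, one row per patch.
    """
    x = patches.reshape(len(patches), 64)
    h = np.maximum(0.0, x @ W1 + b1)  # ReLU hidden layer
    return softmax(h @ W2 + b2)       # probabilities over the 4 classes

# Classify a batch of synthetic stand-in patches.
patches = rng.random((5, 8, 8))
probs = classify(patches)
labels = [CLASSES[i] for i in probs.argmax(axis=1)]
```

In the real project, the network was deeper and trained on imaging data, but the pipeline shape is the same: image regions in, per-class contact probabilities out.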