We are interested in the security and privacy of smart connected devices and of digital interactions. The overarching goal of our research group is to create secure machine learning models for existing and emerging technologies across a range of local and online applications. Our group builds methodologies that ensure user security and privacy in IoT devices and in online communications. We apply secure machine learning models in the areas of wearable devices, biometrics, attack-averse authentication, and side-channel attack formulation.
In the rapidly evolving realm of augmented reality (AR) and virtual reality (VR) systems, robust user authentication is vital due to the unique challenges posed by these immersive environments. Traditional authentication methods such as PINs, passwords, facial recognition, and fingerprints are proving inadequate and vulnerable to attack. This vulnerability is exacerbated by the nature of immersive environments, where users' attention is often diverted from external stimuli, making traditional authentication methods less reliable. Biometric methods also face challenges such as susceptibility to presentation attacks, in which adversaries attempt to deceive the system with fake biometric data, further underscoring the need for advanced authentication mechanisms tailored specifically to AR and VR systems. This project addresses these challenges by leveraging electroencephalography (EEG) signals, which measure the electrical activity of a user's brain. The objective is to develop user authentication methods that both secure and enhance human-computer interaction within AR/VR environments. The proposed effort also includes educational activities for undergraduate and graduate students, as well as activities for broadening participation in STEM fields. Through research, education, and outreach, the project seeks to shape the trajectory of emerging technology toward a more secure and equitable digital landscape.
The goal of this work is to develop novel user authentication algorithms tailored specifically to AR/VR systems. The idea is to harness EEG signals from the user's brain to build authentication algorithms that are not only secure but also lightweight and user-friendly. By exploring how users' brains respond to stimuli such as visual cues or auditory prompts, the research seeks to create authentication methods that integrate seamlessly into the AR/VR experience. Through comprehensive analysis, including investigation of attack models such as spoofing, the project aims to ensure robust and reliable authentication performance in real-world scenarios. Novel longitudinal studies of permanence and persistence will be conducted to enhance the authentication system's effectiveness.
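As a simplified illustration of the stimulus-response idea, the sketch below enrolls a user from a few EEG epochs by averaging band-power features and then verifies new epochs by cosine similarity against the stored template. The feature choice, frequency bands, class names, and threshold are all illustrative assumptions, not the project's actual algorithm.

```python
import numpy as np

def bandpower_features(eeg, fs=256, bands=((4, 8), (8, 13), (13, 30))):
    """Mean spectral power per channel in theta/alpha/beta bands.
    `eeg` has shape (channels, samples); bands are illustrative choices."""
    freqs = np.fft.rfftfreq(eeg.shape[1], d=1.0 / fs)
    psd = np.abs(np.fft.rfft(eeg, axis=1)) ** 2
    feats = [psd[:, (freqs >= lo) & (freqs < hi)].mean(axis=1) for lo, hi in bands]
    return np.concatenate(feats)

class EEGAuthenticator:
    """Hypothetical template-matching verifier: enroll from several epochs,
    then accept a new epoch if its features are close to the template."""
    def __init__(self, threshold=0.9):
        self.threshold = threshold
        self.template = None

    def enroll(self, epochs):
        # Average the feature vectors of the enrollment epochs.
        self.template = np.mean([bandpower_features(e) for e in epochs], axis=0)

    def verify(self, epoch):
        # Cosine similarity between the new epoch and the stored template.
        f = bandpower_features(epoch)
        sim = f @ self.template / (np.linalg.norm(f) * np.linalg.norm(self.template))
        return sim >= self.threshold
```

A production system would of course use richer features, a learned classifier, and liveness checks; this only shows the enroll/verify structure common to biometric pipelines.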
The virtual reality (VR) ecosystem has undergone a massive transformation from a mere game-playing environment to one that is applicable in almost every field and supports numerous human activities, including conferences, business meetings, therapeutic appointments, education, shopping, and many more. Such applications must maintain high security and user privacy without interrupting user activities in the virtual world. In this project, we are developing methods to unobtrusively and seamlessly authenticate users in VR/AR environments while preserving their privacy.
This project delves into the potential of dynamic behavioral biometrics to revolutionize the way we verify a user's identity. Instead of relying on traditional authentication methods, this project utilizes unique patterns of user behavior such as hand gestures, body movements, gait, and even brain signals (EEG). By adopting an adaptive, active verification approach, dynamic behavioral biometrics can significantly enhance the security of computing devices and protect against unauthorized access. The project represents a major step forward in the ongoing effort to develop more sophisticated and effective methods of securing access to sensitive information and to revolutionize the way we interact with technology.
Side-channel attacks exploit unintended side effects of a system's design or implementation, such as power consumption, electromagnetic radiation, timing, audio, or video, to extract sensitive information. In this project, we examine various types of side-channel attacks and develop countermeasures to protect against them. The developed side-channel attacks can also serve as attack test cases to verify a system's resilience against such attacks.
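To make the timing-channel idea concrete, the sketch below contrasts an early-exit byte comparison, whose running time (modeled here by the number of bytes examined) grows with the length of the matching prefix, with a constant-time comparison as the countermeasure. The function names are illustrative, not from any of our systems.

```python
import hmac

def leaky_compare(secret: bytes, guess: bytes):
    """Early-exit comparison. Returns (match, bytes_examined); the examined
    count stands in for running time, which leaks the matching-prefix length."""
    examined = 0
    for a, b in zip(secret, guess):
        examined += 1
        if a != b:                      # stops at the first mismatch -> timing leak
            return False, examined
    return len(secret) == len(guess), examined

def constant_time_compare(secret: bytes, guess: bytes) -> bool:
    """Countermeasure: hmac.compare_digest examines every byte, so timing does
    not depend on where (or whether) a mismatch occurs."""
    return hmac.compare_digest(secret, guess)
```

An attacker who can time `leaky_compare` can recover a secret one byte at a time by observing which guesses take longest; the constant-time version removes that signal.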
This project aims to develop a novel reinforcement learning (RL) model that enables interpretable multi-hop reasoning on knowledge graphs. The motivation is to enhance the comprehensibility of reasoning processes on complex knowledge graphs, which are becoming increasingly prevalent in domains such as healthcare, finance, and natural language processing.
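As a minimal illustration of why path-based reasoning is interpretable, the sketch below finds a multi-hop path through a toy knowledge graph and returns the sequence of relations as a human-readable explanation; an RL agent would instead learn a policy that chooses which edge to follow at each hop rather than searching exhaustively. The graph, entities, and relations are invented for illustration.

```python
from collections import deque

# Toy knowledge graph: entity -> list of (relation, entity) edges (hypothetical facts).
KG = {
    "Aspirin":  [("treats", "Headache"), ("interacts_with", "Warfarin")],
    "Headache": [("symptom_of", "Migraine")],
    "Warfarin": [("treats", "Thrombosis")],
}

def find_path(kg, source, target, max_hops=3):
    """Breadth-first multi-hop search. Returns the (relation, entity) path,
    which doubles as the explanation of how `target` was reached from `source`."""
    queue = deque([(source, [])])
    visited = {source}
    while queue:
        node, path = queue.popleft()
        if node == target:
            return path
        if len(path) >= max_hops:
            continue                    # bound the number of hops
        for rel, nxt in kg.get(node, []):
            if nxt not in visited:
                visited.add(nxt)
                queue.append((nxt, path + [(rel, nxt)]))
    return None                         # no path within the hop budget
```

The returned path, e.g. Aspirin treats Headache, Headache symptom_of Migraine, is exactly the kind of step-by-step trace that makes multi-hop reasoning auditable by a human.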
Computing Resources
Secure Sensing and Learning (SSL) Research Lab
The SSL Research Lab is affiliated with the Artificial Intelligence Lab and housed in the Engineering Education and Research Building (EERB) at the University of Wyoming. The lab houses several wearable devices for studying how users interact and behave with them. The wearable research devices at the SSL Research Lab include:
Motion Sensors - 12 wristband kits with MetaMotionS+ sensors.
Mixed Reality Systems - 2 Apple Vision Pro headsets, 2 Magic Leap 2 headsets, 1 HoloLens 2, 1 Oculus Quest 2 (64 GB), 1 HTC Vive Focus 3 headset, 1 Samsung Gear VR, 1 DESTEK V5 VR headset.
Hand-held Smart Wearables - 2 Samsung Galaxy S20 5G, 1 Apple iPhone 11 4G, 6 Apple iPhone SE (2nd generation), 5 Samsung Galaxy XCover Pro.
Brain Signals for Biometrics Analysis - 1 mBrainTrain Smarting Pro EEG system, 1 Wearable Sensing DSI-24 system, 1 Wearable Sensing DSI-Flex system, 1 CREMedical tEEG system, 1 Zeto Inc. EEG system, 1 Emotiv Epoc+ headset, 2 Emotiv EpocX headsets, 2 EMOTIV Insight 5 Channel Mobile Brainwear® headsets, 1 Muse S headset.
Side Channel Analysis - 2 Monsoon Power Monitors, 1 Fluke 117 True RMS Multimeter.
Video Tracking - 4 Azure Kinect DK systems, 2 Cannon P950 cameras.
Artificial Intelligence Lab
The Artificial Intelligence Lab is housed in the Engineering Education and Research Building (EERB) at the University of Wyoming. The AI Lab houses multiple HPC machines for data processing and analysis:
One Intel Core i9-10940X Deep Learning Workstation from SabrePC with 4 RTX 6000 GPUs.
One AMD Threadripper 3975WX (32-core) Deep Learning Workstation with 3 RTX A6000 GPUs (NVLinked).
One AMD Threadripper 3975WX (32-core) Deep Learning Workstation with 2 RTX A6000 GPUs (NVLinked).
Advanced Research Computing Center
The Advanced Research Computing Center (ARCC) at the University of Wyoming provides access to computational and storage resources: the Teton high-performance computing system and the petaLibrary storage system. The Teton cluster provides more than 15,000 CPU cores across more than 500 nodes with a total of more than 80 TB of memory. Regular nodes have 32-40 cores and 128 GB of RAM, high-memory nodes have up to 1 TB of RAM, and Knights Landing nodes provide 72 cores per machine with 400 GB of RAM. Approximately half of the nodes have local SSD storage of up to 7 TB. The GPUs available on the cluster include the NVIDIA P100 16 GB, V100 16 GB, V100 32 GB, K20, K20x, K40, K80, GTX Titan, and GTX Titan X. The petaLibrary storage system is connected to the UW network backbone at 40 Gbps (upgradeable to 80 Gbps), which in turn connects to the Internet2 research network at 100 Gbps. It is accessible through the SMB protocol from on-campus networks and via Globus from off-campus networks.
NCAR Wyoming Supercomputing Center
The NCAR Wyoming Supercomputing Center (NWSC) houses Cheyenne, a state-of-the-art 5.34-petaflop supercomputer, in a specially built computational facility near Cheyenne, Wyoming. Cheyenne is an HPE ICE XA cluster with 145,152 latest-generation Intel Xeon processor cores in 4,032 dual-socket nodes (36 cores/node) and 313 TB (terabytes) of total memory. Cheyenne's login nodes give users access to the GLADE shared-disk resource and the Campaign Storage system. The GLADE system (the Globally Accessible Data Environment) has a total usable capacity of 38 PB (petabytes) and a maximum bandwidth of 200 GBps (gigabytes per second) to Cheyenne.
Tools and Research Approach
We use methods and approaches drawn from probability theory, statistical learning, and game theory, among other fields.