Modern machine learning (ML) methods commonly rest on strong assumptions, such as: (1) access to data that adequately captures the application environment, and (2) a single agent's objective function to optimize, under the premise that the environment is isolated and unaffected by the outcome the ML system chooses. In this talk I will present methods with theoretical guarantees that are applicable in the absence of (1) and (2), along with corresponding fundamental lower bounds.
Join the Pratt School of Engineering at Duke University, the Triangle Women in STEM, and the Duke Women's Center at 6:00 pm ET, Tuesday, March 22 as we celebrate how women have thrived and made strides in the fields of STEM! This event will feature a panel of local women leaders, who will discuss how they became interested in STEM and thrived in their fields.
Advanced digital technologies rely on collecting and processing various types of sensitive data from their users. These data practices could expose users to a wide array of security and privacy risks. My research at the intersection of security, privacy, and human-computer interaction aims to help all people have safer interactions with digital technologies. In this talk, I will share quantitative and qualitative results on people's security and privacy preferences and attitudes toward technologies such as smart devices and remote communication tools.
You are cordially invited to attend the thesis presentations of two candidates for the Master of Arts in East Asian Studies.
Xin Bao will discuss "Narrating Chinese 996 Work Culture from Online and Offline Perspectives."
Mengyu Chen will present "Chollywooding and Pandering: The Present and Future of
Register to attend (via Zoom): https://duke.is/nfunj
Machine learning (ML) is widely used today, ranging from applications in medicine to those in autonomous driving. Across all these applications, various forms of sensitive information are shared with the ML model, such as private medical records or a user's location. In this talk, I will explain what forms of private information can be learned through interacting with an ML model. In particular, I will discuss when ML model parameters in cloud deployments are not confidential, and how this can be remediated.
As computer systems grow increasingly complex, performance optimizations can unintentionally introduce security vulnerabilities, which can lead to user information and data being compromised or stolen. Many processor optimizations focus on sharing or reusing hardware between different users or programs. This can enable various timing-based security attacks, in which the shared hardware components influence the timing of operations performed on the processor.