
Time: September 17, 2021 (Friday), 15:00 – 17:00

Venue: Room 1501, Science Building No. 1

Online: Tencent Meeting 176 898 233



 

Title: QDiff: Differential Testing of Quantum Software Stacks

Presenter: Jiyuan Wang (University of California, Los Angeles)

 

Abstract: Over the past few years, several quantum software stacks (QSS) have been developed in response to rapid hardware advances in quantum computing. A QSS includes a quantum programming language, an optimizing compiler that translates a quantum algorithm written in a high-level language into quantum gate instructions, a quantum simulator that emulates these instructions on a classical device, and a software controller that sends analog signals, derived from the quantum circuits, to very expensive quantum hardware. In comparison to traditional compilers and architecture simulators, QSSes are difficult to test due to the probabilistic nature of results, the lack of clear hardware specifications, and quantum programming complexity.
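For readers less familiar with these layers, the sketch below walks one small program through such a stack, assuming a pre-1.0 Qiskit installation with the Aer simulator available; the Bell-state circuit and backend name are illustrative choices, not examples taken from the talk.

```python
# Minimal sketch of one pass through a quantum software stack
# (high-level program -> optimizing compiler -> simulator).
from qiskit import QuantumCircuit, transpile, Aer

# High-level program: prepare a Bell state and measure both qubits.
qc = QuantumCircuit(2, 2)
qc.h(0)
qc.cx(0, 1)
qc.measure([0, 1], [0, 1])

# Optimizing compiler: lower the circuit for the target backend.
backend = Aer.get_backend("aer_simulator")
compiled = transpile(qc, backend, optimization_level=2)

# Simulator: emulate the gate instructions on a classical device.
counts = backend.run(compiled, shots=1024).result().get_counts()
print(counts)  # e.g. roughly {'00': 512, '11': 512}, up to sampling noise
```

The probabilistic counts returned at the last step are exactly what makes testing such a stack hard: two correct runs rarely produce identical outputs.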

This work devises a novel differential testing approach for QSSes, named QDiff, with three major innovations: (1) we generate input programs to be tested via semantics-preserving, source-to-source transformations that explore program variants; (2) we speed up differential testing by filtering out quantum circuits that are not worthwhile to execute on quantum hardware, based on static characteristics such as circuit depth, 2-qubit gate operations, gate error rates, and T1 relaxation time; (3) we design an extensible equivalence-checking mechanism built on distribution comparison functions such as the Kolmogorov–Smirnov test and cross entropy.
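As an illustration of the third ingredient, the sketch below compares the measurement distributions of two circuit variants with a two-sample Kolmogorov–Smirnov test, assuming Qiskit-style {bitstring: frequency} count dictionaries; the helper names and the 0.05 threshold are illustrative assumptions, not QDiff's exact configuration.

```python
# Sketch of distribution-based equivalence checking for two circuit variants.
from scipy.stats import ks_2samp

def counts_to_samples(counts):
    """Expand a {bitstring: frequency} dict into a flat list of integer outcomes."""
    samples = []
    for bitstring, freq in counts.items():
        samples.extend([int(bitstring, 2)] * freq)
    return samples

def ks_equivalent(counts_a, counts_b, alpha=0.05):
    """Treat two variants as equivalent if a two-sample KS test cannot
    distinguish their output distributions at significance level alpha."""
    _stat, p_value = ks_2samp(counts_to_samples(counts_a),
                              counts_to_samples(counts_b))
    return p_value >= alpha

# Example: counts from an original circuit and a semantics-preserving variant.
original = {"00": 520, "11": 504}
variant = {"00": 498, "11": 526}
print(ks_equivalent(original, variant))  # True: statistically indistinguishable
```

A divergence flagged by such a check points to a potential bug somewhere in the compiler, simulator, or controller being compared.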

 

Bio: I am a Ph.D. student in the Computer Science Department at the University of California, Los Angeles. I design testing and program synthesis methods for big data analytics, quantum computers, and FPGAs. I am a member of the SOLAR group, co-advised by Professor Miryung Kim and Professor Harry Xu.

 



Title: Accelerating Program Analyses in Datalog by Merging Library Facts

Presenter: Yifan Chen (Peking University)

 

Abstract: Static program analysis balances precision and scalability by tuning program abstractions. However, a finer abstraction does not necessarily lead to more precise results, yet it may reduce scalability. We propose a new technique, 4DM, to tune abstractions for program analyses in Datalog. 4DM merges values in a domain, allowing fine-grained sensitivity tuning, and uses a data-driven algorithm to automatically learn a merging strategy for a library from a training set of programs. Unlike existing approaches that rely on the properties of a particular analysis, our learning algorithm works for a wide range of Datalog analyses. Our evaluation results suggest that our technique achieves significant speedups with negligible precision loss, reaching a good balance.
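As a rough illustration of what merging library facts can look like, the sketch below rewrites Datalog-style allocation facts so that all allocation sites inside the same library class collapse into one abstract value before the analysis runs; the fact shapes and the per-class merge key are illustrative assumptions, not 4DM's learned strategy.

```python
# Sketch of merging library facts: coarsen the allocation-site domain
# for library code before handing facts to a Datalog points-to analysis.

def merge_key(alloc_site):
    """Map a library allocation site to its merged representative.
    Here: collapse all sites declared in the same java.* class."""
    cls, _line = alloc_site
    return ("LIB", cls) if cls.startswith("java.") else alloc_site

def merge_alloc_facts(alloc_facts):
    """Rewrite Alloc(var, site) facts, merging library sites."""
    return {(var, merge_key(site)) for var, site in alloc_facts}

# Example: three allocation sites inside java.util.ArrayList become a single
# abstract object, shrinking the relations the Datalog solver must compute,
# while the application-code site is left untouched.
facts = {
    ("v1", ("java.util.ArrayList", 101)),
    ("v2", ("java.util.ArrayList", 205)),
    ("v3", ("java.util.ArrayList", 310)),
    ("u", ("MyApp", 7)),
}
print(merge_alloc_facts(facts))
```

Choosing which values to merge is exactly the strategy that 4DM learns from a training set of programs rather than fixing it by hand as done here.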

 

Bio: Yifan Chen is a Ph.D. student in computer science at Peking University. His research interests are in program analysis, with a focus on data-driven approaches to optimizing analyses.