(08:30AM-09:30AM/ Monday, June 29/ Auditorium 301)


Sensors, imaging systems, and communication networks are under increasing pressure to accommodate ever larger and higher-dimensional data sets; ever faster capture, sampling, and processing rates; ever lower power consumption; communication over ever more difficult channels; and radically new sensing modalities. The foundation of today's digital data acquisition systems is the Shannon/Nyquist sampling theorem, which asserts that to avoid losing information when digitizing a signal or image, one must sample at least two times faster than the signal's bandwidth, at the so-called Nyquist rate. Unfortunately, the physical limitations of current sensing systems combined with inherently high Nyquist rates impose a performance brick wall on a large class of important and emerging applications.
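As a quick numerical illustration of the Nyquist criterion (a sketch of our own, not part of the talk): a 7 Hz tone sampled at 10 Hz, below its 14 Hz Nyquist rate, produces exactly the same samples as a (sign-flipped) 3 Hz tone and is therefore unrecoverable from those samples.

```python
import numpy as np

def alias_frequency(f_signal, f_sample):
    """Apparent frequency of a pure tone after uniform sampling,
    folded into the baseband [0, f_sample / 2]."""
    f = f_signal % f_sample
    return min(f, f_sample - f)

# Sampling a 7 Hz tone at 10 Hz yields samples identical to those of
# a sign-flipped 3 Hz tone: sin(2*pi*7*n/10) == -sin(2*pi*3*n/10).
t = np.arange(20) / 10.0
assert np.allclose(np.sin(2 * np.pi * 7 * t), -np.sin(2 * np.pi * 3 * t))

print(alias_frequency(7.0, 10.0))   # 3.0: aliased below the Nyquist rate
print(alias_frequency(7.0, 20.0))   # 7.0: preserved at/above the Nyquist rate
```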

This talk will overview some of the recent progress on compressive sensing, a new approach to data acquisition in which analog signals are digitized not via uniform sampling but via measurements using more general, even random, test functions. In stark contrast with conventional wisdom, the new theory asserts that one can combine "sub-Nyquist-rate sampling" with digital computational power for efficient and accurate signal acquisition. The implications of compressive sensing are promising for many applications and enable the design of new kinds of analog-to-digital converters; radio receivers, communication systems, and networks; cameras and imaging systems; and sensor networks.
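As an illustrative sketch of the idea (the dimensions and function names here are our own, and orthogonal matching pursuit is just one of many recovery algorithms): a sparse signal can typically be recovered from far fewer random measurements than its length.

```python
import numpy as np

rng = np.random.default_rng(0)

# A length-n signal with only k nonzero entries, observed through
# m << n random Gaussian measurements (the "test functions").
n, m, k = 128, 60, 4
x = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
x[support] = rng.normal(size=k)

Phi = rng.normal(size=(m, n)) / np.sqrt(m)
y = Phi @ x                      # sub-Nyquist measurement vector

def omp(Phi, y, k):
    """Orthogonal matching pursuit: greedily pick the column most
    correlated with the residual, then re-fit by least squares."""
    residual, idx, coef = y.copy(), [], np.zeros(0)
    for _ in range(k):
        idx.append(int(np.argmax(np.abs(Phi.T @ residual))))
        coef, *_ = np.linalg.lstsq(Phi[:, idx], y, rcond=None)
        residual = y - Phi[:, idx] @ coef
    x_hat = np.zeros(Phi.shape[1])
    x_hat[idx] = coef
    return x_hat

x_hat = omp(Phi, y, k)
print("relative error:", np.linalg.norm(x_hat - x) / np.linalg.norm(x))
```

Here 60 measurements stand in for 128 samples; the "digital computational power" is the greedy solver that exploits the signal's sparsity.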


[ Bio. ]


Richard G. Baraniuk is the Victor E. Cameron Professor in the Electrical and Computer Engineering Department at Rice University. His research interests lie in new theory, algorithms, and hardware for sensing and signal processing. His work on the Rice single-pixel compressive camera has been widely reported in the popular press and was selected by MIT Technology Review as a TR10 Top 10 Emerging Technology for 2007. He is a Fellow of the IEEE and has received national young investigator awards from the National Science Foundation and the Office of Naval Research, the Rosenbaum Fellowship from the Isaac Newton Institute of Cambridge University, the ECE Young Alumni Achievement Award from the University of Illinois, and the Wavelet Pioneer Award from SPIE.


For his teaching and education projects, he has received the George R. Brown Award for Superior Teaching at Rice three times, the C. Holmes MacDonald National Outstanding Teaching Award from Eta Kappa Nu, and the Internet Pioneer Award from the Berkman Center for Internet and Society at Harvard Law School. He was selected as one of Edutopia Magazine's Daring Dozen Education Innovators in 2007. Connexions (cnx.org), the non-profit open-access educational publishing project he launched, was a Tech Museum of Innovation Laureate in 2006.


(08:30AM-09:30AM/ Tuesday, June 30/ Auditorium 301)


Uniquely among engineering fields, information theory aims for, and almost demands, exactly optimal solutions to infinite-dimensional design problems. Such a high standard was set by Shannon in his original analysis of point-to-point communication problems. After almost 40 years of effort, meeting such a standard has proved to be far more difficult when extending Shannon's theory to networks. In this talk, we argue that much broader progress can be made when instead one seeks approximate solutions with a guarantee on the gap to optimality. Focusing on the practically important models of linear Gaussian channels and Gaussian sources, our approach consists of three steps: 1) simplify the model; 2) obtain an optimal solution for the simplified model; 3) translate the optimal scheme and outer bounds back to the original model. We illustrate this approach on five long-standing open problems: 1) relay networks; 2) interference channels; 3) distributed source coding; 4) multiple description; 5) joint source-channel broadcasting. We give examples where this approach yields: 1) achievable schemes which are arbitrarily better than existing schemes; 2) outer bounds which are arbitrarily tighter than existing bounds; 3) occasionally exact results. This approach also shows a surprising connection between Gaussian problems and network coding problems on wired networks.


[ Bio. ]


David Tse received the B.A.Sc. degree in systems design engineering from the University of Waterloo, Canada in 1989, and the M.S. and Ph.D. degrees in electrical engineering from the Massachusetts Institute of Technology in 1991 and 1994, respectively. From 1994 to 1995, he was a postdoctoral member of technical staff at AT&T Bell Laboratories. Since 1995, he has been with the Department of Electrical Engineering and Computer Sciences at the University of California, Berkeley, where he is currently a Professor. He received a 1967 NSERC 4-year graduate fellowship from the government of Canada in 1989, an NSF CAREER award in 1998, the Best Paper Awards at the Infocom 1998 and Infocom 2001 conferences, the Erlang Prize in 2000 from the INFORMS Applied Probability Society, the IEEE Communications and Information Theory Society Joint Paper Award in 2001, and the Information Theory Society Paper Award in 2003.


He has given plenary talks at international conferences such as ICASSP in 2006, MobiCom in 2007 and CISS in 2008. He was the Technical Program co-chair of the International Symposium on Information Theory in 2004, and was an Associate Editor of the IEEE Transactions on Information Theory from 2001 to 2003. He is a coauthor, with Pramod Viswanath, of the text "Fundamentals of Wireless Communication".


(08:30AM-09:30AM/ Wednesday, July 1/ Auditorium 301)


Constraints on the entropy function are sometimes referred to as the laws of information theory. For a long time, the submodular inequalities, or equivalently the nonnegativity of the Shannon information measures, were the only known constraints. Inequalities implied by the submodular inequalities are categorically referred to as Shannon-type inequalities. When the number of random variables is fixed, a Shannon-type inequality can in principle be verified by a linear program, as implemented in the software package ITIP.
A non-Shannon-type inequality is a constraint on the entropy function that is not implied by the submodular inequalities. In the late 1990s, the discovery of a few such inequalities revealed that Shannon-type inequalities alone do not constitute a complete set of constraints on the entropy function.
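As a concrete, self-contained illustration (a numerical sketch of our own, not the ITIP linear program itself): for any given joint distribution, the submodular inequality H(X,Z) + H(Y,Z) >= H(X,Y,Z) + H(Z), equivalently the nonnegativity of the conditional mutual information I(X;Y|Z), can be checked directly.

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability vector (zeros allowed)."""
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

# A random joint distribution p(x, y, z) over three binary variables.
rng = np.random.default_rng(1)
p = rng.random((2, 2, 2))
p /= p.sum()

H_XYZ = entropy(p.ravel())
H_XZ = entropy(p.sum(axis=1).ravel())    # marginalize out Y
H_YZ = entropy(p.sum(axis=0).ravel())    # marginalize out X
H_Z = entropy(p.sum(axis=(0, 1)))

# Submodularity of the entropy function, i.e. I(X;Y|Z) >= 0.
I_XY_given_Z = H_XZ + H_YZ - H_XYZ - H_Z
print(I_XY_given_Z >= -1e-12)   # True for every joint distribution
```

Verifying a general Shannon-type inequality means showing it holds for all entropy vectors satisfying such constraints, which is exactly the linear program ITIP solves.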
In the past decade, connections between the entropy function and a number of fields in information science, mathematics, and physics have been established. These fields include probability theory, network coding, combinatorics, group theory, Kolmogorov complexity, matrix theory, and quantum mechanics. This talk is an attempt to present a picture of the many facets of the entropy function.


[ Bio. ]


Raymond W. Yeung received the BS, MEng, and PhD degrees in electrical engineering from Cornell University in 1984, 1985, and 1988, respectively. He joined AT&T Bell Laboratories in 1988. He came to The Chinese University of Hong Kong (CUHK) in 1991 and has been with the department since then; he is currently a chair professor. He is the author of the book A First Course in Information Theory (Kluwer Academic/Plenum Publishers, 2002). His research interests are in information theory and network coding. He was a consultant on a Jet Propulsion Laboratory project to salvage the malfunctioning Galileo spacecraft.
He was a member of the Board of Governors of the IEEE Information Theory Society from 1999 to 2001. He has served on the committees of a number of information theory symposia and workshops.


He was the General Chair of the First Workshop on Network Coding, Theory, and Applications (NetCod 2005), a Technical Co-Chair of the 2006 IEEE International Symposium on Information Theory, and a Technical Co-Chair of the 2006 IEEE Information Theory Workshop, Chengdu. He has also served on the editorial boards of a number of academic journals. He was an Associate Editor for Shannon Theory of the IEEE Transactions on Information Theory from 2002 to 2005. He currently serves as an Editor-at-Large of Communications in Information and Systems, an Editor of Foundations and Trends in Communications and Information Theory, and an Editor of Foundations and Trends in Networking. He was a recipient of the Croucher Senior Research Fellowship for 2000/01, the Best Paper Award (Communication Theory) of the 2004 International Conference on Communications, Circuits and Systems, the 2005 IEEE Information Theory Society Paper Award, and the Friedrich Wilhelm Bessel Research Award from the Alexander von Humboldt Foundation in 2007. He is a Fellow of the IEEE and the Hong Kong Institution of Engineers.


(08:30AM-09:30AM/ Friday, July 3/ Auditorium 301)


Combinatorial arguments have played a crucial role in the investigation of several surprising phenomena in Information Theory. I will discuss some of these results focusing on a recent example, based on joint papers with Lubetzky and Stav, and with Hassidim and Weinstein, in which properties of graph powers, colorings of Cayley graphs, and the chromatic numbers of Kneser graphs are applied in the study of a broadcasting problem with side information.
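To ground the broadcasting problem with side information mentioned above (this toy instance is our own illustration, not taken from the cited papers): when two receivers each demand one bit and each already holds the bit the other wants, a single XOR'ed broadcast bit serves both, halving the naive two-bit transmission.

```python
# Broadcasting with side information: two receivers each demand one bit
# and already know the other's bit, so one XOR broadcast serves both.
def broadcast(x1, x2):
    return x1 ^ x2               # one transmitted bit instead of two

def decode(received, side_info):
    return received ^ side_info  # each receiver XORs in what it knows

for x1 in (0, 1):
    for x2 in (0, 1):
        r = broadcast(x1, x2)
        assert decode(r, x2) == x1   # receiver 1 knows x2, wants x1
        assert decode(r, x1) == x2   # receiver 2 knows x1, wants x2

print("one broadcast bit suffices for both receivers")
```

Determining the minimum broadcast length for general side-information patterns is where the graph-theoretic machinery of the talk comes in.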


[ Bio. ]


Noga Alon is the Baumritter Professor of Mathematics and Computer Science at Tel Aviv University, Israel. He received his Ph.D. in mathematics from the Hebrew University of Jerusalem in 1983 and has held visiting positions at various research institutes, including MIT, the Institute for Advanced Study in Princeton, the IBM Almaden Research Center, Bell Laboratories, Bellcore, and Microsoft Research.
He serves on the editorial boards of more than a dozen international technical journals and has given invited lectures at many conferences, including plenary addresses at the 1996 European Congress of Mathematics and the 2002 International Congress of Mathematicians. He has published more than four hundred research papers, mostly in combinatorics and theoretical computer science, and one book.


He has been a member of the Israel National Academy of Sciences since 1997 and of the Academia Europaea since 2008, and received the Erdős Prize in 1989, the Feher Prize in 1991, the Pólya Prize in 2000, the Bruno Memorial Award in 2001, the Landau Prize in 2005, the Gödel Prize in 2005, and the Israel Prize in 2008.