Graphical Models and Machine Learning

Practical graphical models such as Bayesian networks now reach sizes of thousands of variables, making both model construction and reasoning increasingly difficult. The goal of this research is to develop novel inference and learning algorithms for graphical models over large domains.
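To illustrate the kind of reasoning involved, the sketch below performs exact inference by enumeration in a toy two-variable Bayesian network (Rain → WetGrass). The network, its probabilities, and the variable names are all illustrative assumptions, not part of the lab's actual systems; real networks of thousands of variables require far more sophisticated algorithms than enumeration.

```python
# Toy sketch (illustrative, made-up probabilities): exact inference by
# enumeration in a two-node Bayesian network Rain -> WetGrass.

p_rain = 0.2                            # prior P(Rain = true)
p_wet_given = {True: 0.9, False: 0.1}   # P(WetGrass = true | Rain)

# Joint probabilities P(Rain = r, WetGrass = true) for both values of Rain
joint = {r: (p_rain if r else 1 - p_rain) * p_wet_given[r]
         for r in (True, False)}

# Posterior P(Rain = true | WetGrass = true), obtained by normalization
posterior = joint[True] / (joint[True] + joint[False])
print(round(posterior, 4))  # 0.6923
```

Enumeration is exponential in the number of variables, which is precisely why scalable inference algorithms are needed for networks of the size mentioned above.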

Decision making under uncertainty

Influence diagrams provide a natural framework for modeling the relationships among random variables, decisions, and preferences; they also provide principled methods for finding an optimal decision policy that maximizes the expected utility. This research aims to relax the restrictive assumptions behind influence diagrams and to develop novel algorithms and representations for solving decision-making problems under uncertainty.
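The core computation can be sketched on a minimal one-decision influence diagram: a chance node (Rain), a decision node (whether to take an umbrella), and a utility node. All names and numbers below are hypothetical, chosen only to show how a policy is selected by maximizing expected utility.

```python
# Toy sketch (hypothetical numbers): solving a one-decision influence
# diagram by choosing the action that maximizes expected utility.

p_rain = 0.3  # illustrative prior P(Rain = true)

# Utility table U(decision, rain) -- made-up values for illustration
utility = {
    ("take_umbrella", True): 70, ("take_umbrella", False): 20,
    ("leave_umbrella", True): 0, ("leave_umbrella", False): 100,
}

def expected_utility(decision):
    """Average the utility of a decision over the chance variable Rain."""
    return (p_rain * utility[(decision, True)]
            + (1 - p_rain) * utility[(decision, False)])

best = max(("take_umbrella", "leave_umbrella"), key=expected_utility)
print(best, expected_utility(best))  # leave_umbrella 70.0
```

With many interleaved decisions and observations, this maximization must be carried out over exponentially many policies, which motivates the search for better algorithms and representations.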


Crowdsourced reasoning
We have joined a multidisciplinary team and begun work on the development of a Collective Online Reasoning Engine (CORE). Crowdsourced reasoning systems have immense potential to improve clear thinking. In addition to possessing more diverse information and perspectives, and being able to divide labor on a complex problem, members of groups can critically evaluate each other's arguments and thus mitigate confirmation bias. Complex arguments, however, are much harder to aggregate and coordinate than the inputs that prediction markets and geopolitical forecasting tournaments have aggregated to extract the "wisdom of the crowd." To develop CORE, Dr. Yuan and the URL are contributing their expertise in Bayesian reasoning to a team that also includes social scientists, educational psychologists, a philosopher (with expertise in informal logic and argument mapping), and an industry partner that is a global leader in human-centered design.

Interdisciplinary research

We have a genuine interest in interdisciplinary research, as we believe a practical application typically presents unique challenges and requires the development of new and improved methods. The lab has several ongoing multidisciplinary projects in quantitative finance, crowdsourced analytics, and computational biology, with CORE being a prominent example. A common theme of these projects is the application of state-of-the-art machine learning and graphical modeling methods to discover useful information and knowledge from big data.