
Ariel Procaccia

I am an Assistant Professor in the Computer Science Department at Carnegie Mellon University, and an Affiliated Faculty in the Machine Learning Department. I am also a member of the theory group and the ACO program.

I usually work on problems at the interface of computer science and economics. I write papers, teach courses, serve on various editorial boards and program committees, and, best of all, advise wonderful students and postdocs.

Current Highlights


Spliddit: Provably Fair Solutions


RoboVote: AI-Driven Collective Decisions


Handbook of Computational Social Choice

News

I received a new grant from the Office of Naval Research, entitled “Robust Aggregation of Noisy Estimates”.

Abstract:

The goal of the proposed research is to develop a theoretically grounded, robust approach to the problem of aggregating the (objective) opinions of small groups of people.

For example, consider a team of engineers who are evaluating a slate of prototypes and must select one for further development. Each alternative has a true quality; in this case, the quality of a prototype can be measured by the number of units that would be sold if it were selected.

The participants (referred to as voters) express their opinions about the relative quality of alternatives, typically by ranking them. Crucially, these rankings are noisy; that is, the voters may not be perfectly informed about the true quality of each alternative. Indeed, in the foregoing example, the engineers can only make educated guesses regarding the quality of each prototype. The challenge is, therefore, to uncover the ground truth based on the noisy estimates provided by the voters.

Addressing this challenge is a key component of the PI’s long-term agenda of transforming the way people make group decisions. While AI researchers have long argued that the field of computational social choice can be applied to the design and implementation of multiagent systems, it is quickly becoming apparent that the field’s “killer app” is helping people, not software agents, make joint decisions. To this end, the PI has been leading the development of a not-for-profit social choice website called RoboVote. The website’s primary mission includes the aggregation of objective opinions, but existing solutions are not completely satisfactory.

The proposed research investigates a new approach to the design of robust aggregation methods, drawing on the notion of minimax estimators from statistics. In a nutshell, the main idea is to consider a collection of models that govern how random noise is generated, and to construct aggregation methods whose output is as close as possible to the ground truth with respect to the given collection of noise models and a given distance function.
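To make the minimax idea concrete, here is a toy sketch in Python. It is an illustration only, not the estimators the proposed research would construct: it assumes a small family of Mallows noise models and the Kendall tau distance (standard choices in this literature, though not ones the proposal commits to), and returns the ranking whose worst-case posterior-expected distance to the ground truth, over the assumed noise models, is smallest.

```python
# Toy illustration of minimax-style aggregation (assumed setup: Mallows
# noise models, Kendall tau distance, uniform prior over ground truths).
from itertools import permutations, combinations

def kendall_tau(r, s):
    """Number of pairs of alternatives on which rankings r and s disagree."""
    pos_r = {a: i for i, a in enumerate(r)}
    pos_s = {a: i for i, a in enumerate(s)}
    return sum(1 for a, b in combinations(r, 2)
               if (pos_r[a] - pos_r[b]) * (pos_s[a] - pos_s[b]) < 0)

def posterior(profile, phi, rankings):
    """Posterior over ground truths under a Mallows model with dispersion
    phi: Pr(truth | profile) is proportional to phi raised to the total
    Kendall tau distance from the truth to the observed votes."""
    weights = [phi ** sum(kendall_tau(v, t) for v in profile)
               for t in rankings]
    total = sum(weights)
    return [w / total for w in weights]

def minimax_rule(profile, phis, alternatives):
    """Return the ranking minimizing the worst-case, over the given noise
    models, posterior-expected distance to the ground truth."""
    rankings = list(permutations(alternatives))
    posteriors = [posterior(profile, phi, rankings) for phi in phis]
    def worst_case(r):
        return max(sum(p * kendall_tau(r, t)
                       for p, t in zip(post, rankings))
                   for post in posteriors)
    return min(rankings, key=worst_case)

# Three noisy votes over alternatives a, b, c; three candidate noise levels.
votes = [("a", "b", "c"), ("a", "c", "b"), ("b", "a", "c")]
print(minimax_rule(votes, phis=[0.3, 0.5, 0.8], alternatives="abc"))
```

Enumerating all rankings is feasible only for a handful of alternatives; computing such outputs efficiently is exactly what thrust 2 below is about.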

The thrust of the proposed research is threefold:

1. Characterizing minimax estimators in the social choice setting, and providing analytical bounds on their performance.

2. Designing algorithms to compute the output of minimax estimators on given instances.

3. Using real-world data both to guide the design of minimax estimators, by identifying collections of noise models that fit the data, and to test the performance of minimax estimators in practice.


Our paper “Which Is the Fairest (Rent Division) of Them All” has been selected to receive the best paper award at the 17th ACM Conference on Economics and Computation, out of 242 submissions. The paper is joint work with Kobi Gal and Moshik Mash of Ben Gurion University, and my postdoc Yair Zick.

Abstract: What is a fair way to assign rooms to several housemates, and divide the rent between them? This is not just a theoretical question: many people have used the Spliddit website to obtain envy-free solutions to rent division instances. But envy freeness, in and of itself, is insufficient to guarantee outcomes that people view as intuitive and acceptable. We therefore focus on solutions that optimize a criterion of social justice, subject to the envy freeness constraint, in order to pinpoint the “fairest” solutions. We develop a general algorithmic framework that enables the computation of such solutions in polynomial time. We then study the relations between natural optimization objectives, and identify the maximin solution, which maximizes the minimum utility subject to envy freeness, as the most attractive. We demonstrate, in theory and using experiments on real data from Spliddit, that the maximin solution gives rise to significant gains in terms of our optimization objectives. Finally, a user study with Spliddit users as subjects demonstrates that people find the maximin solution to be significantly fairer than arbitrary envy-free solutions; this user study is unprecedented in that it asks people about their real-world rent division instances. Based on these results, the maximin solution has been deployed on Spliddit since April 2015.
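For a sense of how such solutions can be computed, here is a minimal Python sketch, not the paper’s actual (more general) framework: assuming quasilinear utilities and a welfare-maximizing room assignment, for which envy-free prices are known to exist, the maximin solution reduces to a small linear program.

```python
# A minimal sketch, not Spliddit's implementation: maximize the minimum
# utility subject to envy-freeness, via a linear program. All names here
# are illustrative.
import numpy as np
from scipy.optimize import linear_sum_assignment, linprog

def maximin_rent_division(values, rent):
    """values[i][j] = housemate i's value for room j. Returns (room, prices),
    where room[i] is the room assigned to housemate i."""
    v = np.asarray(values, dtype=float)
    n = v.shape[0]
    # Envy-free prices exist only for welfare-maximizing assignments.
    _, room = linear_sum_assignment(-v)
    # LP variables: room prices p[0..n-1], plus t = the minimum utility.
    # Objective: maximize t (linprog minimizes, so use -t).
    c = np.zeros(n + 1)
    c[-1] = -1.0
    A_ub, b_ub = [], []
    for i in range(n):
        # Envy-freeness: v[i, room[i]] - p[room[i]] >= v[i, j] - p[j].
        for j in range(n):
            if j == room[i]:
                continue
            row = np.zeros(n + 1)
            row[room[i]], row[j] = 1.0, -1.0
            A_ub.append(row)
            b_ub.append(v[i, room[i]] - v[i, j])
        # Minimum-utility bound: t <= v[i, room[i]] - p[room[i]].
        row = np.zeros(n + 1)
        row[room[i]], row[-1] = 1.0, 1.0
        A_ub.append(row)
        b_ub.append(v[i, room[i]])
    # The prices must add up to the total rent.
    A_eq = [np.append(np.ones(n), 0.0)]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[rent],
                  bounds=[(None, None)] * (n + 1))
    return room, res.x[:n]

room, prices = maximin_rent_division(
    [[600, 300, 100], [400, 400, 200], [300, 300, 400]], rent=1000)
print(room, prices.round(2))
```

On this toy instance the LP ends up equalizing all three utilities, the kind of equitable behavior one would hope for from a maximin objective.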

Congratulations to Nisarg Shah, who has accepted a tenure-track faculty position in the Department of Computer Science at the University of Toronto! He will join U of T in fall 2017, after spending a year as a postdoc at Harvard’s Center for Research in Computation and Society. I have had the privilege of advising Nisarg since he joined our PhD program in fall 2011 (which is the same time I myself joined CMU); he is defending on June 28.

The Handbook of Computational Social Choice, which I co-edited with Felix Brandt, Vince Conitzer, Ulle Endriss, and Jérôme Lang, has been officially published.

Description: The rapidly growing field of computational social choice, at the intersection of computer science and economics, deals with the computational aspects of collective decision making. This handbook, written by thirty-six prominent members of the computational social choice community, covers the field comprehensively. Chapters devoted to each of the field’s major themes offer detailed introductions. Topics include voting theory (such as the computational complexity of winner determination and manipulation in elections), fair allocation (such as algorithms for dividing divisible and indivisible goods), coalition formation (such as matching and hedonic games), and many more. Graduate students, researchers, and professionals in computer science, economics, mathematics, political science, and philosophy will benefit from this accessible and self-contained book.

A PDF of the book is freely available online. To open the document, please enter the password cam1CSC (strangely enough, this password can be disseminated legally, so you are encouraged to shout it from the rooftops). The link is here:

http://www.cambridge.org/download_file/898428

Alternatively, the book can be purchased through the Cambridge University Press website (http://www.cambridge.org/9781107060432), Amazon, and other retailers.

Congratulations to Yair Zick for accepting a faculty position at the National University of Singapore! He will join NUS as Assistant Professor of Computer Science in the summer. Yair has been a postdoc at CMU since April 2014, jointly hosted by Anupam Datta and me. Before that he completed his PhD at Nanyang Technological University, Singapore, where he worked with Edith Elkind.

Congratulations to Nika Haghtalab for winning a 2016-2018 Microsoft Research PhD fellowship. Nika is co-advised by Avrim Blum and me.

Together with Yonathan Aumann, Steven Brams, and Jérôme Lang, I am organizing a Dagstuhl workshop on Fair Division. The workshop will take place on June 6-10, 2016.

New Scientist published a “one minute interview” with me about AI and fair division.

I received the IJCAI Computers and Thought Award at IJCAI’15 in Buenos Aires. A narrated video of my presentation at the conference is now available online.

I received a new NSF grant, entitled “Fair Division at Scale”.

Abstract:

The mathematical study of fair division dates back to the 1940s. Nowadays the literature encompasses provably fair solutions for a wide variety of problems, many of them relevant to society at large. But, to date, very few fair division methods have been implemented. Building on its rich history, the field of fair division is poised to impact society on a grander scale, for example, via Spliddit (www.spliddit.org), a fair division website created by the PI. The overarching theme of this proposal is that the advent of fairness at scale gives rise to novel algorithmic challenges, revolving around the design of methods that are practical, widely applicable, and provably fair. Solutions to these challenges call for a synthesis of ideas from theoretical computer science and microeconomic theory.

The evolution of Spliddit plays a key role in the project: Insights arising from the research will be implemented to improve Spliddit’s publicly accessible methods. Moreover, new applications will be added in order to further widen the website’s reach. The project also includes plans to enhance Spliddit’s educational content through the creation of instructive videos, thereby making the website a valuable learning and teaching resource. Finally, the project contains plans to make a societal impact that extends beyond Spliddit. In particular, the PI will work with California school districts to develop a fair method for allocating unused space.

On a technical level, the thrust of the project is twofold.

(i) Dividing indivisible goods: One of five applications currently available on Spliddit, the division of indivisible goods is a notoriously difficult problem from a fair division perspective. To obtain a provably fair method, Spliddit relies on the notion of the maximin share (MMS) guarantee. While exact MMS allocations may be infeasible, the project explores several notions of approximation, as well as the feasibility boundary of MMS allocations. Foundational algorithmic challenges lie at the heart of these research directions.

(ii) Sharing credit: Spliddit’s credit calculator fairly determines the contribution of each individual to a group project, using a method developed by de Clippel et al. Perhaps the method’s most compelling guarantee is impartiality: a player’s report does not affect her own share of the credit. Dividing credit for a scientific paper, with the goal of fairly ordering the authors by contribution, is an especially attractive potential application; but there is no guarantee that players will not be able to affect their position in the resulting ranking. The project aims to circumvent this obstacle via randomization and approximation.
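To illustrate the MMS notion itself, here is a brute-force Python sketch that computes one player’s maximin share on a tiny instance. It captures only the definition; the project’s algorithmic questions concern approximating MMS allocations efficiently, which this exponential-time enumeration does not attempt.

```python
# A player's maximin share: the best minimum-bundle value she could secure
# by partitioning the goods into n bundles herself and receiving the worst
# bundle. Brute force; tiny instances only.

def partitions(items, n):
    """Yield all ways to split `items` into n (possibly empty) bundles."""
    if not items:
        yield [[] for _ in range(n)]
        return
    first, rest = items[0], items[1:]
    for part in partitions(rest, n):
        for bundle in part:
            bundle.append(first)
            yield [list(b) for b in part]
            bundle.pop()

def maximin_share(values, n):
    """values[g] = the player's value for good g; n = number of players."""
    goods = list(range(len(values)))
    return max(min(sum(values[g] for g in bundle) for bundle in part)
               for part in partitions(goods, n))

# Three players, five goods, from one player's perspective:
print(maximin_share([8, 5, 5, 3, 2], n=3))  # {8} | {5,2} | {5,3} -> 7
```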

A new version of Spliddit, featuring two new apps and improvements to the old ones, has officially launched!
Press coverage (newest to oldest): The New York Times #2, New Scientist, Pittsburgh Magazine, GeekWire, NSF podcast, Fast Company, Pittsburgh Post-Gazette, Lifehacker, Slashdot, Gizmodo, Nautilus Magazine, The New York Times #1.

Congratulations to Nika Haghtalab for winning a 2015-2016 IBM PhD fellowship, with a statement entitled “Learning Theory for Societal Good”. Nika is co-advised by Avrim Blum and me.

I was named a Sloan Research Fellow, together with 125 other young researchers in chemistry, computer science, economics, mathematics, computational and evolutionary molecular biology, neuroscience, ocean sciences, and physics.

Congratulations to Junxing Wang for winning the best student paper award at EC-14, for our joint paper “Fair Enough: Guaranteeing Approximate Maximin Shares”.

The New York Times published a story about the rent division problem — they even implemented the algorithm of Francis Su, and illustrated Sperner’s lemma! The story includes a paragraph about Spliddit, and a rather inarticulate quote from me.

Congratulations to Nisarg Shah for winning a 2014-2015 Facebook PhD fellowship, with a statement entitled “Economic Foundations of Social Computing”. There are 11 winners overall, 3 of whom are from CMU!

Please check back later for more fascinating news!

Contact Information

  •    arielpro@cs.cmu.edu
  •    Computer Science Dept.
  •    Carnegie Mellon University
  •    Pittsburgh, PA 15213
  •    Gates-Hillman Center
  •    Office #9021