# Birkhoff-von Neumann decomposition with constraints

Decomposes a matrix into a weighted sum of binary-entry basis matrices satisfying user-imposed constraints. When the starting matrix is doubly stochastic and the basis matrices are required to be permutation matrices, this is the classical Birkhoff-von Neumann decomposition. Here we implement the algorithm identified in Budish, Che, Kojima, and Milgrom (2013). The constraints must form what they call a bihierarchy; the user is informed if the proposed constraint structure is not one.

Contents

- Installation
- Basic usage
- Mathematical background
- API
- Application

## Installation

Download `constrained_birkhoff_von_neumann.py`

## Basic usage

```python
import numpy as np
from constrained_birkhoff_von_neumann import constrained_birkhoff_von_neumann_decomposition

# Create a matrix whose entries are between 0 and 1, and a constraint structure.
# For example:
X = np.array([[.5, .2, .3], [.5, .5, 0], [.8, 0, .2], [.2, .3, .5]])
constraint_structure = {frozenset({(0, 0), (0, 1), (0, 2)}): (1, 3)}

# The first, and only, entry of this constraint structure requires that the
# (0, 0), (0, 1), and (0, 2) entries of each basis matrix sum to 1, 2, or 3.
constrained_birkhoff_von_neumann_decomposition(X, constraint_structure)
```
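For larger problems, the constraint dictionary can be built programmatically rather than by hand. The sketch below constructs row and column constraints for the example matrix; the column ceiling of 2 is an illustrative choice, not something the library requires:

```python
import numpy as np

# Each key is a frozenset of (row, column) cells; each value is a
# (floor, ceiling) pair bounding the sum of those cells in every basis matrix.
X = np.array([[.5, .2, .3], [.5, .5, 0], [.8, 0, .2], [.2, .3, .5]])
rows, cols = X.shape

constraint_structure = {}
for i in range(rows):
    # each row of every basis matrix must sum to exactly 1
    constraint_structure[frozenset((i, j) for j in range(cols))] = (1, 1)
for j in range(cols):
    # each column of every basis matrix may contain at most two 1s
    # (an illustrative quota)
    constraint_structure[frozenset((i, j) for i in range(rows))] = (0, 2)
```

This produces one constraint per row and one per column, in the same format as the hand-written dictionaries above.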

## Mathematical background

See Budish, Che, Kojima, and Milgrom (2013), "Designing Random Allocation Mechanisms: Theory and Applications", *American Economic Review* 103(2).
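In the classical special case, Birkhoff's theorem says any doubly stochastic matrix is a convex combination of permutation matrices. A 2-by-2 example, worked by hand rather than by the library:

```python
import numpy as np

# A doubly stochastic matrix (rows and columns each sum to 1).
X = np.array([[0.7, 0.3], [0.3, 0.7]])

I = np.eye(2)                       # identity permutation
P = np.array([[0., 1.], [1., 0.]])  # swap permutation

# X decomposes as 0.7 * I + 0.3 * P.
assert np.allclose(0.7 * I + 0.3 * P, X)
```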

## API

```python
>>> import numpy as np
>>> from constrained_birkhoff_von_neumann import constrained_birkhoff_von_neumann_decomposition
>>> X = np.array([[.5, .2, .3], [.5, .5, 0], [.8, 0, .2], [.2, .3, .5]])
>>> constraint_structure = {
...     frozenset({(0, 0), (0, 1), (0, 2)}): (1, 1),
...     frozenset({(1, 0), (1, 1), (1, 2)}): (1, 1),
...     frozenset({(2, 0), (2, 1), (2, 2)}): (1, 1),
...     frozenset({(3, 0), (3, 1), (3, 2)}): (1, 1),
...     frozenset({(0, 0), (1, 0), (2, 0), (3, 0)}): (1, 2),
...     frozenset({(0, 1), (1, 1), (2, 1), (3, 1)}): (1, 1),
...     frozenset({(0, 2), (1, 2), (2, 2), (3, 2)}): (1, 1),
...     frozenset({(0, 0), (1, 0)}): (1, 1)}
>>> constrained_birkhoff_von_neumann_decomposition(X, constraint_structure)
[[0.14285714285714288,
  0.3571428571428571,
  0.29999999999999999,
  0.057142857142857141,
  0.14285714285714282],
 [array([[ 0.,  1.,  0.],
         [ 1.,  0.,  0.],
         [ 1.,  0.,  0.],
         [ 0.,  0.,  1.]]),
  array([[ 1.,  0.,  0.],
         [ 0.,  1.,  0.],
         [ 1.,  0.,  0.],
         [ 0.,  0.,  1.]]),
  array([[ 0.,  0.,  1.],
         [ 1.,  0.,  0.],
         [ 1.,  0.,  0.],
         [ 0.,  1.,  0.]]),
  array([[ 0.,  1.,  0.],
         [ 1.,  0.,  0.],
         [ 0.,  0.,  1.],
         [ 1.,  0.,  0.]]),
  array([[ 1.,  0.,  0.],
         [ 0.,  1.,  0.],
         [ 0.,  0.,  1.],
         [ 1.,  0.,  0.]])],
 1.0,
 array([[ 0.5,  0.2,  0.3],
        [ 0.5,  0.5,  0. ],
        [ 0.8,  0. ,  0.2],
        [ 0.2,  0.3,  0.5]])]
```

The output of `constrained_birkhoff_von_neumann_decomposition(X, constraint_structure)` is a list. Its first entry is the distribution over basis matrices, its second entry is the list of basis matrices (in order), and its third and fourth entries are checks that the probabilities sum to one and that the expected basis matrix equals the target `X`.
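A returned decomposition can be sanity-checked by hand: the weights should form a probability distribution, and their weighted sum of basis matrices should recover `X`. The weights and matrices below are copied from the example output above:

```python
import numpy as np

weights = [0.14285714285714288, 0.3571428571428571, 0.3,
           0.057142857142857141, 0.14285714285714282]
basis = [np.array([[0, 1, 0], [1, 0, 0], [1, 0, 0], [0, 0, 1]]),
         np.array([[1, 0, 0], [0, 1, 0], [1, 0, 0], [0, 0, 1]]),
         np.array([[0, 0, 1], [1, 0, 0], [1, 0, 0], [0, 1, 0]]),
         np.array([[0, 1, 0], [1, 0, 0], [0, 0, 1], [1, 0, 0]]),
         np.array([[1, 0, 0], [0, 1, 0], [0, 0, 1], [1, 0, 0]])]
X = np.array([[.5, .2, .3], [.5, .5, 0], [.8, 0, .2], [.2, .3, .5]])

# The weights form a probability distribution ...
assert abs(sum(weights) - 1.0) < 1e-9
# ... and the expected basis matrix is the target X.
expected = sum(w * B for w, B in zip(weights, basis))
assert np.allclose(expected, X)
```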

## Application: probabilistic serial mechanism with constraints

There are `n` objects to be allocated among `m` agents. Each agent has a ranking of the objects. The probability of agent `i` winning object `j` is derived from the following process: agents simultaneously, and at the same speed, 'eat' the probability mass of their top-ranked object for which probability mass still remains. The resulting probabilities can be collected in an `m` times `n` matrix `X` whose entry in the `i`th row and `j`th column is the probability that agent `i` wins object `j`.

To implement the probabilities of `X`, one needs to find an appropriate probability distribution over allocations (full descriptions of who gets what). An allocation may be represented as a matrix with binary entries: a `1` in the `i`th row and `j`th column means that agent `i` wins object `j`, and a `0` means that agent `i` does not win object `j`. If `n = m`, such a matrix is a permutation matrix. Further, `X` will necessarily be doubly stochastic, and so the problem may be solved using the classical Birkhoff-von Neumann decomposition.

`constrained_birkhoff_von_neumann_decomposition` may be used when `n` is not equal to `m`, or when one wishes to impose additional constraints (e.g. each agent is allocated at least one object, no agent is allocated more than `k` objects, etc.).
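The eating process described above can be sketched in a few lines. This is an illustrative reimplementation under stated assumptions (unit eating speed, at most one unit of each object per agent), not the shipped `probabilistic_serial_mechanism`:

```python
def eating_sketch(R, m):
    """Each agent eats, at unit speed, the mass of its best-ranked object
    that still has mass and of which the agent holds less than one unit;
    eating stops when no agent has such an object."""
    eps = 1e-12
    agents = sorted(R)
    mass = [float(x) for x in m]
    probs = {a: [0.0] * len(m) for a in agents}

    def target(a):
        for obj in R[a]:
            if mass[obj] > eps and probs[a][obj] < 1 - eps:
                return obj
        return None

    while True:
        # which object each agent is currently eating, if any
        eating = {a: target(a) for a in agents if target(a) is not None}
        if not eating:
            return probs
        # advance time until some object runs out or some agent hits its cap
        dt = min(
            min(mass[obj] / sum(1 for t in eating.values() if t == obj),
                1.0 - probs[a][obj])
            for a, obj in eating.items())
        for a, obj in eating.items():
            probs[a][obj] += dt
            mass[obj] -= dt
```

On the example from the usage section, `eating_sketch({0: [0, 1, 2], 1: [2, 1, 0]}, [1, 1, 2])` reproduces the probabilities shown in the API output below: agents 0 and 1 eat objects 0 and 2 alone until time 1, then split object 1 equally, and agent 0 finishes on the remaining unit of object 2.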

To run the probabilistic serial mechanism, follow the instructions below.

### Installation

Download `probabilistic_serial_mechanism.py`

### Basic usage

```python
from probabilistic_serial_mechanism import probabilistic_serial_mechanism

# Create a dictionary R with agents as keys and rank order lists as values,
# and a list m of the probability masses for each object.
#
# A mass greater than one is used when there are multiple units of an object,
# of which each agent can be allocated at most one.
#
# For example:
R = {0: [0, 1, 2], 1: [2, 1, 0]}
m = [1, 1, 2]
probabilistic_serial_mechanism(R, m)
```

### Mathematical background

See Budish, Che, Kojima, and Milgrom (2013), "Designing Random Allocation Mechanisms: Theory and Applications", *American Economic Review* 103(2).

### API

```python
>>> from probabilistic_serial_mechanism import probabilistic_serial_mechanism
>>> R = {0: [0, 1, 2], 1: [2, 1, 0]}
>>> m = [1, 1, 2]
>>> probabilistic_serial_mechanism(R, m)
[{0: [1.0, 0.5, 1.0], 1: [0, 0.5, 1.0]},
 array([[ 1. ,  0.5,  1. ],
        [ 0. ,  0.5,  1. ]])]
```