# Game
Definition(game). A game is a formal representation of a situation in which a number of individuals interact in a setting of strategic interdependence.
- Strategic interdependence. Each individual's welfare depends not only on her own actions but also on the actions of the other individuals.
The four elements of a game:
- The players. Who is involved?
- The rules. Who moves when? What do they know when they move? What can they do?
- The outcomes. For each possible set of actions by the players, what is the outcome of the game?
- The payoffs. What are the players’ preferences (i.e., utility functions) over the possible outcomes?
# The Extensive Form
Definition (extensive form). The extensive form captures who moves when, what actions each player can take, what players know when they move, what the outcome is as a function of the actions taken by the players, and the players’ payoffs from each possible outcome.
# Graphical Form
Definition (game tree). A game tree depicts the structure of the game: like an actual tree, it has a unique connected path of branches from the initial node (also called the root) to each point in the tree.
Definition (information set). An information set is a subset of a particular player’s decision nodes. The interpretation is that when play has reached one of the decision nodes in the information set and it is that player’s turn to move, she does not know which of these nodes she is actually at.
- Partition of decision nodes (mutually exclusive and exhaustive).
- No information set contains both a node and its predecessor.
Natural restriction. At every node within a given information set, a player must have the same set of possible actions.
Definition (perfect recall). Loosely speaking, perfect recall means that a player does not forget what she once knew, including her own actions.
Definition (game of perfect information). A game is one of perfect information if each information set contains a single decision node.
Definition (common knowledge). It is a basic postulate of game theory that all players know the structure of the game, know that their rivals know it, know that their rivals know that they know it, and so on. In theoretical parlance, we say that the structure of the game is common knowledge.
# Mathematical form
Formally, a game in extensive form is specified by the collection $\Gamma_E = \{\mathcal{X}, \mathcal{A}, I, p(\cdot), \alpha(\cdot), \mathcal{H}, H(\cdot), \iota(\cdot), \rho(\cdot), u\}$: the set of nodes $\mathcal{X}$, the set of possible actions $\mathcal{A}$, the set of players $\{1, \dots, I\}$, the predecessor function $p(\cdot)$, the function $\alpha(\cdot)$ giving the action leading to each node, the collection of information sets $\mathcal{H}$, the function $H(\cdot)$ assigning nodes to information sets, the function $\iota(\cdot)$ assigning a player (or nature) to each information set, the probability function $\rho(\cdot)$ for nature's moves, and the payoff functions $u = (u_1(\cdot), \dots, u_I(\cdot))$ defined on terminal nodes.
# The Normal Form
Definition (strategy). Let $\mathcal{H}_i$ denote the collection of player $i$'s information sets, $\mathcal{A}$ the set of possible actions in the game, and $C(H) \subset \mathcal{A}$ the set of actions possible at information set $H$. A strategy for player $i$ is a function $s_i : \mathcal{H}_i \to \mathcal{A}$ such that $s_i(H) \in C(H)$ for all $H \in \mathcal{H}_i$.
Definition (normal form representation). For a game with $I$ players, the normal form representation $\Gamma_N$ specifies for each player $i$ a set of strategies $S_i$ (with $s_i \in S_i$) and a payoff function $u_i(s_1, \dots, s_I)$ giving the von Neumann–Morgenstern utility levels associated with the (possibly random) outcome arising from strategies $(s_1, \dots, s_I)$. Formally, we write $\Gamma_N = [I, \{S_i\}, \{u_i(\cdot)\}]$.
- Strategy profile: $s = (s_1, \dots, s_I) \in S_1 \times \dots \times S_I$.
- Payoff function: $u_i : S_1 \times \dots \times S_I \to \mathbb{R}$.
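As a concrete illustration (the game and encoding below are not from the source), a finite normal-form game can be stored as a dictionary mapping strategy profiles to payoff pairs — here the classic Prisoner's Dilemma:

```python
# Hypothetical encoding of a normal-form game Gamma_N = [I, {S_i}, {u_i}]:
# the Prisoner's Dilemma with strategies C (cooperate) and D (defect).
PLAYERS = [1, 2]
S1 = ["C", "D"]
S2 = ["C", "D"]

# u[(s1, s2)] = (payoff to player 1, payoff to player 2)
u = {
    ("C", "C"): (-1, -1),
    ("C", "D"): (-10, 0),
    ("D", "C"): (0, -10),
    ("D", "D"): (-5, -5),
}

def u1(s1, s2):
    """Payoff function of player 1."""
    return u[(s1, s2)][0]

def u2(s1, s2):
    """Payoff function of player 2."""
    return u[(s1, s2)][1]
```

The payoff numbers are the usual illustrative ones; any numbers preserving the same ordinal ranking define the same strategic situation.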
# Randomized Choices
Definition (pure strategy). A deterministic choice of a strategy $s_i \in S_i$.
Definition (mixed strategy). Given player $i$'s (finite) pure strategy set $S_i$, a mixed strategy for player $i$, $\sigma_i : S_i \to [0, 1]$, assigns to each pure strategy $s_i \in S_i$ a probability $\sigma_i(s_i) \ge 0$ that it will be played, where $\sum_{s_i \in S_i} \sigma_i(s_i) = 1$.
- Expected utility: payoffs from mixed strategies are evaluated as von Neumann–Morgenstern expected utilities, $u_i(\sigma_1, \dots, \sigma_I) = \sum_{(s_1, \dots, s_I)} \left[ \prod_{k=1}^{I} \sigma_k(s_k) \right] u_i(s_1, \dots, s_I)$.
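The expected-utility computation can be sketched directly (a minimal illustration, not from the source; Matching Pennies is used as the example game):

```python
# Sketch: vNM expected utility of a mixed-strategy profile in a
# two-player game. sigma_i maps each pure strategy to its probability.
def expected_utility(u_i, sigma1, sigma2):
    """E[u_i] = sum over (s1, s2) of sigma1(s1) * sigma2(s2) * u_i(s1, s2)."""
    return sum(p1 * p2 * u_i(s1, s2)
               for s1, p1 in sigma1.items()
               for s2, p2 in sigma2.items())

# Matching Pennies payoff for player 1: +1 if the pennies match, -1 otherwise.
def mp_u1(s1, s2):
    return 1 if s1 == s2 else -1

# Uniform randomization by both players yields expected utility 0.
ev = expected_utility(mp_u1, {"H": 0.5, "T": 0.5}, {"H": 0.5, "T": 0.5})
```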
# Dominant and Dominated Strategies
Definition (strictly dominant strategy). A strategy $s_i \in S_i$ is a strictly dominant strategy for player $i$ in game $\Gamma_N = [I, \{S_i\}, \{u_i(\cdot)\}]$ if for all $s_i' \neq s_i$, we have $u_i(s_i, s_{-i}) > u_i(s_i', s_{-i})$ for all $s_{-i} \in S_{-i}$.
Definition (strictly dominated). A strategy $s_i \in S_i$ is strictly dominated for player $i$ in game $\Gamma_N = [I, \{S_i\}, \{u_i(\cdot)\}]$ if there exists another strategy $s_i' \in S_i$ such that for all $s_{-i} \in S_{-i}$, $u_i(s_i', s_{-i}) > u_i(s_i, s_{-i})$.
In this case, we say that strategy $s_i'$ strictly dominates strategy $s_i$.
Definition (weakly dominated). A strategy $s_i \in S_i$ is weakly dominated in game $\Gamma_N = [I, \{S_i\}, \{u_i(\cdot)\}]$ if there exists another strategy $s_i' \in S_i$ such that $u_i(s_i', s_{-i}) \ge u_i(s_i, s_{-i})$ for all $s_{-i} \in S_{-i}$,
with strict inequality for some $s_{-i}$.
Definition (weakly dominant strategy). A strategy $s_i$ is a weakly dominant strategy for player $i$ in game $\Gamma_N = [I, \{S_i\}, \{u_i(\cdot)\}]$ if it weakly dominates every other strategy in $S_i$.
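The dominance check above can be sketched for a finite two-player game given as a payoff matrix (an illustrative example, not from the source; the matrix is the row player's payoffs in the Prisoner's Dilemma):

```python
# Sketch: strict dominance between two rows of the row player's payoff
# matrix A. Row r1 strictly dominates row r2 if it gives a strictly
# higher payoff against every column (every rival strategy s_{-i}).
def strictly_dominates(A, r1, r2):
    return all(a > b for a, b in zip(A[r1], A[r2]))

# Prisoner's Dilemma, row player's payoffs: rows = (C, D), columns = (C, D).
A = [[-1, -10],   # payoffs from playing C
     [0, -5]]     # payoffs from playing D
# D (row 1) strictly dominates C (row 0): 0 > -1 and -5 > -10.
```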
# Nash Equilibrium
Definition (best response). In game $\Gamma_N = [I, \{S_i\}, \{u_i(\cdot)\}]$, strategy $s_i$ is a best response for player $i$ to his rivals' strategies $s_{-i}$ if $u_i(s_i, s_{-i}) \ge u_i(s_i', s_{-i})$
for all $s_i' \in S_i$.
Definition (Nash equilibrium). A strategy profile $s = (s_1, \dots, s_I)$ constitutes a Nash equilibrium of game $\Gamma_N = [I, \{S_i\}, \{u_i(\cdot)\}]$ if for every $i = 1, \dots, I$, $u_i(s_i, s_{-i}) \ge u_i(s_i', s_{-i})$
for all $s_i' \in S_i$.
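Since each player must be best-responding to the others, pure-strategy equilibria of a finite two-player game can be found by brute force (a minimal sketch, not from the source; the Prisoner's Dilemma payoffs are the usual illustrative ones):

```python
# Sketch: enumerate all strategy profiles and keep those where each
# strategy is a best response to the other (mutual best responses).
def pure_nash_equilibria(S1, S2, u):
    eq = []
    for s1 in S1:
        for s2 in S2:
            br1 = all(u[(s1, s2)][0] >= u[(a, s2)][0] for a in S1)
            br2 = all(u[(s1, s2)][1] >= u[(s1, b)][1] for b in S2)
            if br1 and br2:
                eq.append((s1, s2))
    return eq

# Prisoner's Dilemma: the unique pure-strategy Nash equilibrium is (D, D).
pd = {("C", "C"): (-1, -1), ("C", "D"): (-10, 0),
      ("D", "C"): (0, -10), ("D", "D"): (-5, -5)}
```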
# Mixed Strategy Nash Equilibrium
Definition (mixed strategy Nash equilibrium). A mixed strategy profile $\sigma = (\sigma_1, \dots, \sigma_I)$ constitutes a Nash equilibrium of game $\Gamma_N = [I, \{\Delta(S_i)\}, \{u_i(\cdot)\}]$ if for every $i = 1, \dots, I$, $u_i(\sigma_i, \sigma_{-i}) \ge u_i(\sigma_i', \sigma_{-i})$
for all $\sigma_i' \in \Delta(S_i)$.
Proposition 8.D.1. Let $S_i^+ \subset S_i$ denote the set of pure strategies that player $i$ plays with positive probability in mixed strategy profile $\sigma = (\sigma_1, \dots, \sigma_I)$. Strategy profile $\sigma$ is a Nash equilibrium of game $\Gamma_N = [I, \{\Delta(S_i)\}, \{u_i(\cdot)\}]$ if and only if for all $i = 1, \dots, I$: (i) $u_i(s_i, \sigma_{-i}) = u_i(s_i', \sigma_{-i})$ for all $s_i, s_i' \in S_i^+$; and (ii) $u_i(s_i, \sigma_{-i}) \ge u_i(s_i', \sigma_{-i})$ for all $s_i \in S_i^+$ and all $s_i' \notin S_i^+$.
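The indifference condition can be checked numerically for Matching Pennies, where the unique equilibrium has each player mixing 50–50 (an illustrative sketch, not from the source):

```python
# Sketch: in a mixed equilibrium, every pure strategy played with
# positive probability must earn the same expected payoff against the
# rival's equilibrium mixture (the indifference characterization).
def expected_payoff(u1, s1, sigma2):
    """Expected payoff to player 1 from pure strategy s1 against sigma2."""
    return sum(p * u1(s1, s2) for s2, p in sigma2.items())

def mp_u1(s1, s2):
    return 1 if s1 == s2 else -1  # Matching Pennies: +1 on a match

sigma2 = {"H": 0.5, "T": 0.5}        # rival's equilibrium mixture
payoff_H = expected_payoff(mp_u1, "H", sigma2)
payoff_T = expected_payoff(mp_u1, "T", sigma2)
# Both pure strategies in the support earn the same expected payoff.
```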
# Existence of Nash Equilibrium
Proposition 8.D.2. Every game $\Gamma_N = [I, \{\Delta(S_i)\}, \{u_i(\cdot)\}]$ in which the sets $S_1, \dots, S_I$ have a finite number of elements has a mixed strategy Nash equilibrium.
Proposition 8.D.3. A Nash equilibrium exists in game $\Gamma_N = [I, \{S_i\}, \{u_i(\cdot)\}]$ if for all $i = 1, \dots, I$:
(i) $S_i$ is a nonempty, convex, and compact subset of some Euclidean space $\mathbb{R}^M$; and (ii) $u_i(s_1, \dots, s_I)$ is continuous in $(s_1, \dots, s_I)$ and quasiconcave in $s_i$.
The conditions of Propositions 8.D.2 and 8.D.3 are sufficient for existence, not necessary.