# Game

Definition(game). A game is a formal representation of a situation in which a number of individuals interact in a setting of strategic interdependence.

  • Strategic interdependence. Each individual’s welfare depends not only on her own actions but also on the actions of the other individuals.

A game has four elements:

  • The players. Who is involved?
  • The rules. Who moves when? What do they know when they move? What can they do?
  • The outcomes. For each possible set of actions by the players, what is the outcome of the game?
  • The payoffs. What are the players’ preferences (i.e., utility functions) over the possible outcomes?

# The Extensive Form

Definition (extensive form). The extensive form captures who moves when, what actions each player can take, what players know when they move, what the outcome is as a function of the actions taken by the players, and the players’ payoffs from each possible outcome.

# Graphical Form

Definition (game tree). Like an actual tree, it has a unique connected path of branches from the initial node (also called the root) to each point in the tree.

Definition (information set). An information set is a subset of a particular player’s decision nodes. The interpretation is that when play has reached one of the decision nodes in the information set and it is that player’s turn to move, she does not know which of these nodes she is actually at.

  • A player’s information sets partition her decision nodes (mutually exclusive and exhaustive).
  • No information set contains both a node and its predecessor.

Natural restriction. At every node within a given information set, a player must have the same set of possible actions.

Definition (perfect recall). Loosely speaking, perfect recall means that a player does not forget what she once knew, including her own actions.

Definition (game of perfect information). A game is one of perfect information if each information set contains a single decision node.

Definition (common knowledge). It is a basic postulate of game theory that all players know the structure of the game, know that their rivals know it, know that their rivals know that they know it, and so on. In theoretical parlance, we say that the structure of the game is common knowledge.

# Mathematical Form

Formally, a game in extensive form is specified by the collection

$$\Gamma_E = \{\mathcal{X}, A, I, p(\cdot), \alpha(\cdot), \mathcal{H}, H(\cdot), \iota(\cdot), \rho(\cdot), u\}.$$
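To make these components concrete, here is a minimal sketch, assuming a small two-stage game and freely chosen variable names (none of them come from the source), of how $\mathcal{X}$, $A$, $I$, $p(\cdot)$, $\alpha(\cdot)$, $\mathcal{H}$, $H(\cdot)$, $\iota(\cdot)$, and $u$ could be written down as plain data:

```python
# Hypothetical encoding of an extensive form Gamma_E: player 1 chooses L or R,
# then player 2 (who cannot observe that choice) chooses l or r.
nodes = ["x0", "x1", "x2", "t1", "t2", "t3", "t4"]   # X: decision and terminal nodes
actions = ["L", "R", "l", "r"]                        # A: set of possible actions
players = [1, 2]                                      # I: the players
predecessor = {"x1": "x0", "x2": "x0",                # p(.): immediate predecessor of each node
               "t1": "x1", "t2": "x1", "t3": "x2", "t4": "x2"}
action_leading_to = {"x1": "L", "x2": "R",            # alpha(.): action that reaches each node
                     "t1": "l", "t2": "r", "t3": "l", "t4": "r"}
info_sets = {"H1": ["x0"], "H2": ["x1", "x2"]}        # H and H(.): player 2 cannot tell x1 from x2
player_to_move = {"H1": 1, "H2": 2}                   # iota(.): who moves at each information set
payoffs = {"t1": (2, 1), "t2": (0, 0),                # u: payoff vector at each terminal node
           "t3": (1, 2), "t4": (3, 0)}
# rho(.) would assign probabilities to nature's moves; this example has no chance nodes.
```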

# The Normal Form

Definition (strategy). Let $\mathcal{H}_i$ denote the collection of player $i$'s information sets, $A$ the set of possible actions in the game, and $C(H) \subset A$ the set of actions possible at information set $H$. A strategy for player $i$ is a function $s_i : \mathcal{H}_i \to A$ such that $s_i(H) \in C(H)$ for all $H \in \mathcal{H}_i$.
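Continuing the same hypothetical encoding, a strategy is a complete contingent plan: one feasible action for every information set the player owns, even those that might never be reached. A minimal sketch (all names illustrative):

```python
# C(H): the actions available at each information set; H_2: player 2's information sets.
C = {"H1": ["L", "R"], "H2": ["l", "r"]}
H_2 = ["H2"]
s2 = {"H2": "l"}                              # a strategy s_2: one action for every H in H_2
assert all(s2[H] in C[H] for H in H_2)        # s_2(H) must lie in C(H)
```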

Definition (normal form representation). $\Gamma_N = [I, \{S_i\}, \{u_i(\cdot)\}]$, where $I$ is the number of players, $S_i$ is player $i$'s strategy set, and $u_i(\cdot)$ is her payoff function.

  • Strategy profile: $s = (s_1, s_2, \ldots, s_I) = (s_i, s_{-i})$, where $s \in S$ and $s_i \in S_i$.
  • Payoff function: $u_i(s)$ (see the worked sketch below).
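As a worked illustration (not from the source; the numbers below are the standard Prisoner's Dilemma payoffs, chosen freely), a two-player normal form can be stored as strategy sets plus a payoff function on profiles:

```python
# A normal form Gamma_N: strategy sets S_1, S_2 and payoffs on profiles s = (s_1, s_2).
S1 = ["Cooperate", "Defect"]
S2 = ["Cooperate", "Defect"]

payoffs = {                                   # maps each profile s to (u_1(s), u_2(s))
    ("Cooperate", "Cooperate"): (-1, -1),
    ("Cooperate", "Defect"):    (-10, 0),
    ("Defect",    "Cooperate"): (0, -10),
    ("Defect",    "Defect"):    (-5, -5),
}

def u(i, s):
    """Payoff u_i(s) for player i in {1, 2} at profile s."""
    return payoffs[s][i - 1]

print(u(1, ("Defect", "Cooperate")))          # 0
```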

# Randomized Choices

Definition (pure strategy). A deterministic choice $s_i(H)$ at each of player $i$'s information sets $H \in \mathcal{H}_i$.

Definition (mixed strategy). Given player $i$'s (finite) pure strategy set $S_i$, a mixed strategy for player $i$, $\sigma_i : S_i \to [0,1]$, assigns to each pure strategy $s_i \in S_i$ a probability $\sigma_i(s_i) \ge 0$ that it will be played, where $\sum_{s_i \in S_i} \sigma_i(s_i) = 1$.

  • Expected utility: von Neumann–Morgenstern type (see the sketch after this list).
    • $\sigma = (\sigma_i, \sigma_{-i})$
    • $u_i(\sigma) = E_\sigma[u_i(s)] = \sum_{s \in S} \left[\sigma_1(s_1)\,\sigma_2(s_2)\cdots\sigma_I(s_I)\right] u_i(s)$
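The expected-utility formula is just a sum over pure strategy profiles, each weighted by the product of the players' probabilities. A minimal sketch, reusing the illustrative Prisoner's Dilemma payoffs (player indices start at 0 here purely for convenience):

```python
from itertools import product

# u_i(sigma) = sum over profiles s of [sigma_1(s_1) * ... * sigma_I(s_I)] * u_i(s).
S = [["C", "D"], ["C", "D"]]                           # S_1, S_2
payoffs = {("C", "C"): (-1, -1), ("C", "D"): (-10, 0),
           ("D", "C"): (0, -10), ("D", "D"): (-5, -5)}

def expected_utility(i, sigma):
    """sigma[j] maps each pure strategy of player j to its probability."""
    total = 0.0
    for s in product(*S):                              # every pure profile s in S
        prob = 1.0
        for j, sj in enumerate(s):
            prob *= sigma[j][sj]                       # product of the sigma_j(s_j)
        total += prob * payoffs[s][i]
    return total

sigma = [{"C": 0.5, "D": 0.5}, {"C": 0.25, "D": 0.75}]
print(expected_utility(0, sigma))                      # -5.75 for player 1 (index 0)
```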

# Dominant and Dominated Strategies

Definition (strictly dominant strategy). A strategy $s_i \in S_i$ is a strictly dominant strategy for player $i$ in game $\Gamma_N = [I, \{S_i\}, \{u_i(\cdot)\}]$ if for all $s_i' \neq s_i$, we have

$$u_i(s_i, s_{-i}) > u_i(s_i', s_{-i})$$

for all $s_{-i} \in S_{-i}$.

Definition (strictly dominated strategy). A strategy $s_i \in S_i$ is strictly dominated for player $i$ in game $\Gamma_N = [I, \{S_i\}, \{u_i(\cdot)\}]$ if there exists another strategy $s_i' \in S_i$ such that for all $s_{-i} \in S_{-i}$,

$$u_i(s_i', s_{-i}) > u_i(s_i, s_{-i}).$$

In this case, we say that strategy $s_i'$ strictly dominates strategy $s_i$.

Definition (weakly dominated strategy). A strategy $s_i \in S_i$ is weakly dominated for player $i$ in game $\Gamma_N = [I, \{S_i\}, \{u_i(\cdot)\}]$ if there exists another strategy $s_i' \in S_i$ such that for all $s_{-i} \in S_{-i}$,

$$u_i(s_i', s_{-i}) \ge u_i(s_i, s_{-i}),$$

with strict inequality for some $s_{-i}$. In this case, we say that strategy $s_i'$ weakly dominates strategy $s_i$.

Definition (weakly dominant strategy). A strategy is a weakly dominant strategy for player $i$ in game $\Gamma_N = [I, \{S_i\}, \{u_i(\cdot)\}]$ if it weakly dominates every other strategy in $S_i$.
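Each of these dominance notions compares two of player $i$'s strategies against every combination of the rivals' pure strategies, so in a finite game they can be checked by enumeration. A minimal sketch, assuming the payoff-dictionary format used above (the helper `dominates` is illustrative, not from the source):

```python
from itertools import product

def dominates(i, a, b, S, payoffs, strict=True):
    """Does strategy a strictly (or weakly) dominate strategy b for player i?"""
    others = [Sj for j, Sj in enumerate(S) if j != i]
    diffs = []
    for s_minus_i in product(*others):                 # every rival profile s_{-i}
        s_a = list(s_minus_i); s_a.insert(i, a)        # profile (a, s_{-i})
        s_b = list(s_minus_i); s_b.insert(i, b)        # profile (b, s_{-i})
        diffs.append(payoffs[tuple(s_a)][i] - payoffs[tuple(s_b)][i])
    if strict:
        return all(d > 0 for d in diffs)               # u_i(a, .) > u_i(b, .) everywhere
    return all(d >= 0 for d in diffs) and any(d > 0 for d in diffs)

S = [["C", "D"], ["C", "D"]]
payoffs = {("C", "C"): (-1, -1), ("C", "D"): (-10, 0),
           ("D", "C"): (0, -10), ("D", "D"): (-5, -5)}
print(dominates(0, "D", "C", S, payoffs))              # True: "D" strictly dominates "C"
```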

# Nash Equilibrium

Definition (best response). In game $\Gamma_N = [I, \{S_i\}, \{u_i(\cdot)\}]$, strategy $s_i$ is a best response for player $i$ to his rivals' strategies $s_{-i}$ if

$$u_i(s_i, s_{-i}) \ge u_i(s_i', s_{-i})$$

for all $s_i' \in S_i$.

Definition (Nash equilibrium). A strategy profile $s = (s_1, \ldots, s_I)$ constitutes a Nash equilibrium of game $\Gamma_N = [I, \{S_i\}, \{u_i(\cdot)\}]$ if for every $i = 1, \ldots, I$,

$$u_i(s_i, s_{-i}) \ge u_i(s_i', s_{-i})$$

for all $s_i' \in S_i$.
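In a finite game the definition can be checked by brute force: a profile is a Nash equilibrium exactly when no player has a profitable unilateral deviation. A minimal sketch, assuming the same illustrative payoff-dictionary format (names not from the source):

```python
from itertools import product

def pure_nash_equilibria(S, payoffs):
    """All pure strategy profiles at which every player is best responding."""
    equilibria = []
    for s in product(*S):
        is_ne = True
        for i, Si in enumerate(S):
            for si_prime in Si:                        # any profitable deviation s_i'?
                deviation = list(s)
                deviation[i] = si_prime
                if payoffs[tuple(deviation)][i] > payoffs[s][i]:
                    is_ne = False
                    break
            if not is_ne:
                break
        if is_ne:
            equilibria.append(s)
    return equilibria

S = [["C", "D"], ["C", "D"]]
payoffs = {("C", "C"): (-1, -1), ("C", "D"): (-10, 0),
           ("D", "C"): (0, -10), ("D", "D"): (-5, -5)}
print(pure_nash_equilibria(S, payoffs))                # [('D', 'D')]
```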

# Mixed Strategy Nash Equilibrium

Definition (mixed strategy Nash equilibrium). A mixed strategy profile $\sigma = (\sigma_1, \ldots, \sigma_I)$ constitutes a Nash equilibrium of game $\Gamma_N = [I, \{\Delta(S_i)\}, \{u_i(\cdot)\}]$ if for every $i = 1, \ldots, I$,

$$u_i(\sigma_i, \sigma_{-i}) \ge u_i(\sigma_i', \sigma_{-i})$$

for all $\sigma_i' \in \Delta(S_i)$.

Proposition 8.D.1. Let $S_i^+ \subset S_i$ denote the set of pure strategies that player $i$ plays with positive probability in mixed strategy profile $\sigma = (\sigma_1, \ldots, \sigma_I)$. Strategy profile $\sigma = (\sigma_1, \ldots, \sigma_I)$ constitutes a Nash equilibrium of game $\Gamma_N = [I, \{\Delta(S_i)\}, \{u_i(\cdot)\}]$ if and only if for all $i = 1, \ldots, I$,

  1. $u_i(s_i, \sigma_{-i}) = u_i(s_i', \sigma_{-i})$ for all $s_i, s_i' \in S_i^+$;
  2. $u_i(s_i, \sigma_{-i}) \ge u_i(s_i', \sigma_{-i})$ for all $s_i \in S_i^+$ and all $s_i' \notin S_i^+$ (a numerical check appears in the sketch below).
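Both conditions can be verified numerically for a candidate profile. A minimal sketch for Matching Pennies with the candidate $\sigma = ((1/2, 1/2), (1/2, 1/2))$ (all names illustrative, not from the source):

```python
# Matching Pennies: player 1 wins if the coins match, player 2 wins if they differ.
S = [["H", "T"], ["H", "T"]]
payoffs = {("H", "H"): (1, -1), ("H", "T"): (-1, 1),
           ("T", "H"): (-1, 1), ("T", "T"): (1, -1)}
sigma = [{"H": 0.5, "T": 0.5}, {"H": 0.5, "T": 0.5}]

def payoff_vs_rival_mix(i, si):
    """u_i(s_i, sigma_{-i}) in this two-player game (the rival is player 1 - i)."""
    rival = 1 - i
    return sum(sigma[rival][sj] * payoffs[(si, sj) if i == 0 else (sj, si)][i]
               for sj in S[rival])

def satisfies_8D1(i):
    support = [si for si in S[i] if sigma[i][si] > 0]          # S_i^+
    vals = [payoff_vs_rival_mix(i, si) for si in support]
    indifferent = max(vals) - min(vals) < 1e-9                  # condition (1)
    best = max(payoff_vs_rival_mix(i, si) for si in S[i])
    no_better_outside = best <= max(vals) + 1e-9                # condition (2)
    return indifferent and no_better_outside

print(all(satisfies_8D1(i) for i in [0, 1]))                    # True: sigma is a Nash equilibrium
```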

# Existence of Nash Equilibrium

Proposition 8.D.2. Every game $\Gamma_N = [I, \{\Delta(S_i)\}, \{u_i(\cdot)\}]$ in which the sets $S_1, \ldots, S_I$ have a finite number of elements has a mixed strategy Nash equilibrium.

Proposition 8.D.3. A Nash equilibrium exists in game $\Gamma_N = [I, \{S_i\}, \{u_i(\cdot)\}]$ if for all $i = 1, \ldots, I$,

  1. $S_i$ is a nonempty, convex, and compact subset of some Euclidean space $\mathbb{R}^M$;
  2. $u_i(s_1, \ldots, s_I)$ is continuous in $(s_1, \ldots, s_I)$ and quasiconcave in $s_i$.

Propositions 8.D.2 and 8.D.3 give sufficient conditions for existence, not necessary ones.