
ISYE6189 - Deterministic Optimization and Stochastic Processes
Topic 9 - Week 9
Renewal Process
Learning Outcome
• LO4: Analyze given problems using the
concepts of Poisson process, renewal
process, or queuing theory
Sub topics
Renewal Theory
• Introduction
• Distribution of N(t)
• Limit Theorems and Their Applications
• Renewal Reward Processes
INTRODUCTION
Continuous-Time Markov Chain: exponential times between transitions
Poisson Process: counting process, iid exponential times between arrivals
  -- relax the exponential interarrival times -->
Renewal Process: counting process, iid times between arrivals
Counting Process
A stochastic process {N(t), t ≥ 0} is a counting process if N(t)
represents the total number of events that have occurred in [0, t].
Then {N(t), t ≥ 0} must satisfy:
• N(t) ≥ 0
• N(t) is an integer for all t
• If s < t, then N(s) ≤ N(t)
• For s < t, N(t) - N(s) is the number of events that occur in the interval (s, t].
Renewal Process
A counting process {N(t), t ≥ 0} is a renewal process if, for each n,
Xn is the time between the (n-1)st and nth arrivals and {Xn, n ≥ 1}
are independent with the same distribution F.
The time of the nth arrival is

    S_n = \sum_{i=1}^{n} X_i, \qquad n \ge 1,

with S0 = 0.
We can write

    N(t) = \max\{n : S_n \le t\},

and if μ = E[Xn], n ≥ 1, then the strong law of large numbers says that

    P\left\{ \frac{S_n}{n} \to \mu \text{ as } n \to \infty \right\} = 1.

Note: μ is a time interval, not a rate; 1/μ will be called the rate of the renewal process.
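As an added illustration (not part of the original slides), the sketch below simulates one path of a renewal process and evaluates N(t) = max{n : Sn ≤ t} directly from the partial sums; the uniform(30, 60) interarrival distribution is just an assumed example.

import numpy as np

rng = np.random.default_rng(0)

def renewal_count(t, draw_interarrival, n_max=100_000):
    """Return N(t) = max{n : S_n <= t} for one simulated path."""
    s, n = 0.0, 0
    while n < n_max:
        s += draw_interarrival()          # S_{n+1} = S_n + X_{n+1}
        if s > t:
            break
        n += 1
    return n

# Example: iid uniform(30, 60) interarrival times, so mu = 45
draw = lambda: rng.uniform(30, 60)
t = 10_000
print(renewal_count(t, draw), "renewals by time", t)
print("empirical rate:", renewal_count(t, draw) / t, "   1/mu =", 1 / 45)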
DISTRIBUTION OF N(T)
Fundamental Relationship

    N(t) \ge n \iff S_n \le t.

It follows that

    P\{N(t) = n\} = P\{N(t) \ge n\} - P\{N(t) \ge n+1\}
                  = P\{S_n \le t\} - P\{S_{n+1} \le t\} = F_n(t) - F_{n+1}(t),

where Fn(t) is the n-fold convolution of F with itself.
The mean value of N(t) is

    m(t) = E[N(t)] = \sum_{n=1}^{\infty} P\{N(t) \ge n\} = \sum_{n=1}^{\infty} P\{S_n \le t\} = \sum_{n=1}^{\infty} F_n(t).

Condition on the time of the first renewal to get the renewal equation:

    m(t) = F(t) + \int_0^t m(t - x) f(x)\, dx.
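As a rough added numerical check, the renewal equation can be discretized on a time grid and solved forward; here the interarrival distribution is assumed exponential with rate λ, for which the renewal function is known to be m(t) = λt.

import numpy as np

# Discretize m(t) = F(t) + integral_0^t m(t - x) f(x) dx with a simple
# rectangle rule, assuming exponential(lam) interarrival times.
lam, T, h = 1.5, 10.0, 0.001
ts = np.arange(0.0, T + h, h)
F = 1 - np.exp(-lam * ts)          # interarrival cdf
f = lam * np.exp(-lam * ts)        # interarrival density
m = np.zeros_like(ts)
for k in range(1, len(ts)):
    # integral ~ h * sum_{j=1}^{k} m(t_k - x_j) f(x_j)
    m[k] = F[k] + h * np.dot(m[k - 1::-1], f[1:k + 1])
print("numeric m(T):", m[-1], "   exact lam*T:", lam * T)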
Suppose that

    P\{X_n = i\} = p(1 - p)^{i-1}, \qquad i \ge 1.

That is, suppose that the interarrival distribution is geometric. Now S1 =
X1 may be interpreted as the number of trials necessary to get a single
success when each trial is independent and has a probability p of being a
success. Similarly, Sn may be interpreted as the number of trials
necessary to attain n successes, and hence has the negative binomial
distribution

    P\{S_n = k\} = \binom{k-1}{n-1} p^n (1 - p)^{k-n}, \qquad k \ge n.

Equivalently, since an event independently occurs with probability p at
each of the times 1, 2, . . ., N(t) is binomial with parameters ⌊t⌋ and p:

    P\{N(t) = n\} = \binom{\lfloor t \rfloor}{n} p^n (1 - p)^{\lfloor t \rfloor - n}.

If

    F(x) = 1 - e^{-\lambda x},

then Sn, being the sum of n independent exponentials with rate λ, will have a
gamma (n, λ) distribution. Consequently, the preceding identity gives

    m(t) = \sum_{n=1}^{\infty} P\{S_n \le t\}
         = \sum_{n=1}^{\infty} \int_0^t \lambda e^{-\lambda x} \frac{(\lambda x)^{n-1}}{(n-1)!}\, dx
         = \int_0^t \lambda e^{-\lambda x} e^{\lambda x}\, dx = \lambda t.
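A quick simulation (an addition, not from the slides) confirms the exponential case: with exponential(λ) interarrivals the renewal process is a Poisson process, so both the mean and the variance of N(t) should be close to λt.

import numpy as np

rng = np.random.default_rng(1)
lam, t, reps = 2.0, 50.0, 2000

counts = []
for _ in range(reps):
    arrivals = np.cumsum(rng.exponential(1 / lam, size=int(3 * lam * t)))
    counts.append(np.searchsorted(arrivals, t, side="right"))
counts = np.array(counts)
print("estimated E[N(t)]:", counts.mean(), "   lam*t:", lam * t)
print("estimated Var(N(t)):", counts.var(), "  (Poisson, so also lam*t)")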
LIMIT THEOREMS
Limit Theorems
• With probability 1,

    \frac{N(t)}{t} \to \frac{1}{\mu} \quad \text{as } t \to \infty.

• Elementary renewal theorem:

    \frac{m(t)}{t} \to \frac{1}{\mu} \quad \text{as } t \to \infty.

• Central limit theorem: For large t, N(t) is approximately
normally distributed with mean t/μ and variance

    \frac{t \sigma^2}{\mu^3},

where σ² is the variance of the time between arrivals;
in particular,

    \frac{\mathrm{Var}(N(t))}{t} \to \frac{\sigma^2}{\mu^3} \quad \text{as } t \to \infty.
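A small added simulation can check the variance limit: for uniform(0, 4) interarrival times, μ = 2 and σ² = 4/3, so Var(N(t))/t should approach (4/3)/2³ = 1/6.

import numpy as np

rng = np.random.default_rng(2)
t, reps = 2000.0, 4000

def count_uniform(t):
    # Enough uniform(0, 4) interarrivals that their sum almost surely exceeds t.
    x = rng.uniform(0, 4, size=int(t))   # mean 2, so about t/2 are needed
    return np.searchsorted(np.cumsum(x), t, side="right")

n = np.array([count_uniform(t) for _ in range(reps)])
print("Var(N(t))/t:", n.var() / t, "   sigma^2/mu^3 =", (4 / 3) / 8)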
Beverly has a radio that works on a single battery. As soon as the battery in use
fails, Beverly immediately replaces it with a new battery. If the lifetime of a
battery (in hours) is distributed uniformly over the interval (30, 60), then at what
rate does Beverly have to change batteries?

Solution: If we let N(t) denote the number of batteries that have failed by time t,
we have that the long-run rate at which Beverly replaces batteries is given by

    \lim_{t \to \infty} \frac{N(t)}{t} = \frac{1}{E[X]} = \frac{1}{45} \text{ per hour},

since a uniform (30, 60) lifetime has mean E[X] = 45.
That is, in the long run, Beverly will have to replace one battery every
45 hours.
Suppose Beverly does not keep any surplus batteries on hand, and so each time a
failure occurs she must go and buy a new battery. If the amount of time it takes for
her to get a new battery is uniformly distributed over (0, 1), then what is the
average rate that Beverly changes batteries?

Solution: The time between renewals is now a battery lifetime plus the time needed to buy
a new battery, with mean 45 + 1/2 = 91/2 hours, so the long-run rate is

    \lim_{t \to \infty} \frac{N(t)}{t} = \frac{1}{45 + 1/2} = \frac{2}{91},

and so in the long run, Beverly will be putting in a new battery at the rate of
2/91. That is, she will put in two new batteries every 91 hours.
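The following added sketch simulates Beverly's replacement cycle (a uniform(30, 60) lifetime followed by a uniform(0, 1) trip to buy a battery) and compares the empirical replacement rate with 2/91.

import numpy as np

rng = np.random.default_rng(3)
horizon = 1_000_000.0            # hours

clock, replacements = 0.0, 0
while True:
    cycle = rng.uniform(30, 60) + rng.uniform(0, 1)   # lifetime + time to buy a battery
    if clock + cycle > horizon:
        break
    clock += cycle
    replacements += 1

print("empirical rate:", replacements / horizon, "   2/91 =", 2 / 91)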
Suppose that potential customers arrive at a single-server bank in
accordance with a Poisson process having rate λ. However, suppose
that the potential customer will enter the bank only if the server is
free when he arrives.

That is, if there is already a customer in the bank, then our arriver,
rather than entering the bank, will go home. If we assume that the
amount of time spent in the bank by an entering customer is a
random variable having distribution G, then

(a) what is the rate at which customers enter the bank?


(b) what proportion of potential customers actually enter the bank?
Solution: In answering these questions, let us suppose that at time 0 a
customer has just entered the bank. (That is, we define the process to
start when the first customer enters the bank.) If we let μG denote the
mean service time, then, by the memoryless property of the Poisson
process, it follows that the mean time between entering customers is

    \mu = \mu_G + \frac{1}{\lambda}.

Hence, the rate at which customers enter the bank will be given by

    \frac{1}{\mu} = \frac{\lambda}{1 + \lambda \mu_G}.

On the other hand, since potential customers arrive at rate λ, it
follows that the proportion of them entering the bank will be given by

    \frac{\lambda / (1 + \lambda \mu_G)}{\lambda} = \frac{1}{1 + \lambda \mu_G}.

In particular, if λ = 2 and μG = 2, then only 1/(1 + 4), that is one customer out of five, will actually
enter the system.
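For intuition, here is a small added simulation of the loss system just described: Poisson(λ) arrivals, and an arrival enters only if the server is free. The exponential service distribution is an assumption made for the example; only its mean μG matters for the long-run rates.

import numpy as np

rng = np.random.default_rng(4)
lam, mu_G, horizon = 2.0, 2.0, 100_000.0

t, busy_until = 0.0, 0.0
arrived = entered = 0
while True:
    t += rng.exponential(1 / lam)        # next potential customer (Poisson arrivals)
    if t > horizon:
        break
    arrived += 1
    if t >= busy_until:                  # server is free, so the customer enters
        entered += 1
        busy_until = t + rng.exponential(mu_G)   # service time with mean mu_G

print("entry rate:", entered / horizon,
      "   lambda/(1 + lambda*mu_G) =", lam / (1 + lam * mu_G))
print("proportion entering:", entered / arrived,
      "   1/(1 + lambda*mu_G) =", 1 / (1 + lam * mu_G))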
Example
Two machines continually process an unending number of jobs. The
time that it takes to process a job on machine 1 is a gamma random
variable with parameters n = 4, λ = 2, whereas the time that it takes to
process a job on machine 2 is uniformly distributed between 0 and 4.

Approximate the probability that together the two machines can process at least 90 jobs by time t = 100.

Solution: If we let Ni(t) denote the number of jobs that machine i can
process by time t, then {N1(t), t ≥ 0} and {N2(t), t ≥ 0} are independent
renewal processes. The interarrival distribution of the first renewal
process is gamma with parameters n = 4, λ = 2, and thus has mean 2
and variance 1.
Correspondingly, the interarrival distribution of the second renewal
process is uniform between 0 and 4, and thus has mean 2 and variance
16/12.
Therefore, N1(100) is approximately normal with mean 50 and variance
100/8; and N2(100) is approximately normal with mean 50 and variance
100/6.
Hence, N1(100) + N2(100) is approximately normal with mean 100 and
variance 175/6. Thus, with Φ denoting the standard normal distribution
function, we have

    P\{N_1(100) + N_2(100) \ge 89.5\}
      = P\left\{ \frac{N_1(100) + N_2(100) - 100}{\sqrt{175/6}} \ge \frac{89.5 - 100}{\sqrt{175/6}} \right\}
      \approx \Phi\!\left(\frac{10.5}{\sqrt{175/6}}\right) \approx \Phi(1.944) \approx 0.974.
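The arithmetic above can be reproduced in a few lines (added here as a convenience); Φ is evaluated with the error function from Python's standard library.

import math

def phi(x):
    """Standard normal cdf via the error function."""
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

mean = 50 + 50
var = 100 / 8 + 100 / 6                 # 175/6
z = (89.5 - mean) / math.sqrt(var)      # continuity-corrected threshold
print("P{N1(100)+N2(100) >= 90} ~", 1 - phi(z))   # ~ phi(1.944) ~ 0.974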
RENEWAL REWARD PROCESS
Renewal Reward Processes
Suppose that each time a renewal occurs we receive a reward.
Assume Rn is the reward earned at the nth renewal and {Rn, n ≥ 1}
are independent and identically distributed (Rn may depend on Xn).
The total reward up to time t is

    R(t) = \sum_{n=1}^{N(t)} R_n.

If E[R] < ∞ and E[X] < ∞, then

    P\left\{ \frac{R(t)}{t} \to \frac{E[R]}{E[X]} \text{ as } t \to \infty \right\} = 1

and

    \frac{E[R(t)]}{t} \to \frac{E[R]}{E[X]} \quad \text{as } t \to \infty.
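As an added illustration, the sketch below simulates a renewal reward process in which the cycle length Xn is uniform(30, 60) and the reward is taken, purely as an assumption for the example, to be Rn = Xn², so that E[R]/E[X] = E[X²]/E[X] = 2100/45 ≈ 46.7.

import numpy as np

rng = np.random.default_rng(5)
horizon = 500_000.0

t, total_reward = 0.0, 0.0
while True:
    x = rng.uniform(30, 60)      # cycle length X_n
    if t + x > horizon:
        break
    t += x
    total_reward += x ** 2       # assumed reward: R_n = X_n^2 (depends on X_n)

print("R(t)/t:", total_reward / horizon, "   E[R]/E[X] = 2100/45 =", 2100 / 45)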
Let πi denote the limiting (or stationary) probabilities of the embedded Markov chain
of successive states visited by the process. That is, the πi will be the unique
nonnegative solution of

    \pi_j = \sum_i \pi_i P_{ij}, \qquad \sum_j \pi_j = 1.

Now, since the process spends an expected time μi in state i whenever it visits
that state, it seems intuitive that Pi, the long-run proportion of time in state i,
should be a weighted average of the πi, where each πi is weighted proportionately
to μi. That is,

    P_i = \frac{\pi_i \mu_i}{\sum_j \pi_j \mu_j}.
Consider a machine that can be in one of three states: good condition,
fair condition, or broken down. Suppose that a machine in good
condition will remain this way for a mean time μ1 and then will go to
either the fair condition or the broken condition with respective
probabilities ¾ and ¼ .

A machine in fair condition will remain that way for a mean time μ2 and
then will break down. A broken machine will be repaired, which takes a
mean time μ3, and when repaired will be in good condition with
probability 2/3 and fair condition with probability 1/3. What proportion
of time is the machine in each state?
Solution: Letting the states be 1, 2, 3, we have that the πi satisfy

    \pi_1 = \tfrac{2}{3}\pi_3, \qquad
    \pi_2 = \tfrac{3}{4}\pi_1 + \tfrac{1}{3}\pi_3, \qquad
    \pi_3 = \tfrac{1}{4}\pi_1 + \pi_2, \qquad
    \pi_1 + \pi_2 + \pi_3 = 1,

which gives π1 = 4/15, π2 = 1/3, π3 = 2/5. Hence the long-run proportions are

    P_i = \frac{\pi_i \mu_i}{\pi_1 \mu_1 + \pi_2 \mu_2 + \pi_3 \mu_3}.

For instance, if μ1 = 5, μ2 = 2, μ3 = 1, then the machine will be in good
condition 5/9 of the time, in fair condition 5/18 of the time, and in broken
condition 1/6 of the time.
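The linear system above is easy to solve numerically; the short added script below recovers the stationary probabilities and the stated time proportions.

import numpy as np

# Embedded Markov chain of the machine: states 1 = good, 2 = fair, 3 = broken.
P = np.array([[0.0, 3/4, 1/4],
              [0.0, 0.0, 1.0],
              [2/3, 1/3, 0.0]])
mu = np.array([5.0, 2.0, 1.0])           # mean times spent per visit

# Solve pi = pi P together with sum(pi) = 1 as a least-squares system.
A = np.vstack([P.T - np.eye(3), np.ones(3)])
b = np.array([0.0, 0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

proportions = pi * mu / (pi * mu).sum()
print("pi =", pi)                         # [4/15, 1/3, 2/5]
print("time proportions =", proportions)  # [5/9, 5/18, 1/6]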
Thank You
