A certain television show airs weekly. Each time it airs, its ratings are categorized as "high," "average," or "low," and transitions between these categorizations each week behave like a Markov chain. If the ratings are high this week, they will be high, average, or low next week with probabilities 40%, 10%, and 50% respectively. If the ratings are average this week, they will be high, average, or low next week with probabilities 60%, 20%, and 20% respectively. If the ratings are low this week, they will be high, average, or low next week with probabilities 70%, 20%, and 10% respectively. Write all answers as integers or decimals. If the ratings are average this week, what is the probability that they will be low two weeks from now? If the ratings are average the first week, what is the probability that they will be low the second week? If the ratings are low the first week, what is the probability that they will be high the third week?
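Questions like these reduce to powers of the transition matrix: the one-step answer is read directly from the matrix, and the two-step answers come from its square. Below is a sketch of that computation in plain Python, with states indexed as high = 0, average = 1, low = 2 (an indexing choice made here for illustration):

```python
# One-step transition matrix; rows are this week's rating,
# columns are next week's rating, in the order (high, average, low).
P = [
    [0.4, 0.1, 0.5],  # high    -> high, average, low
    [0.6, 0.2, 0.2],  # average -> high, average, low
    [0.7, 0.2, 0.1],  # low     -> high, average, low
]

def matmul(A, B):
    """Multiply two 3x3 matrices given as nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

P2 = matmul(P, P)  # two-step transition probabilities

H, A, L = 0, 1, 2
print(round(P2[A][L], 2))  # average this week -> low two weeks from now: 0.36
print(P[A][L])             # average week 1 -> low week 2 (one step): 0.2
print(round(P2[L][H], 2))  # low week 1 -> high week 3 (two steps): 0.47
```

For instance, the first entry expands to 0.6(0.5) + 0.2(0.2) + 0.2(0.1) = 0.36, summing over the intermediate week's rating; the one-step probability 0.2 needs no matrix power at all.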
Elementary Linear Algebra (MindTap Course List)
8th Edition
ISBN: 9781305658004
Author: Ron Larson
Publisher: Cengage Learning
Chapter 2: Matrices
Section 2.5: Markov Chains
Problem 47E: Explain how you can determine the steady state matrix X of an absorbing Markov chain by inspection.