Prediction Revisited

Contents
Mark P. Kritzman. Prediction Revisited
Table of Contents
List of Tables
List of Illustrations
Guide
Pages
PREDICTION REVISITED: THE IMPORTANCE OF OBSERVATION
Timeline of Innovations
Essential Concepts
Preface
1 Introduction
Relevance
Informativeness
Similarity
Roadmap
Note
2 Observing Information
Observing Information Conceptually
Central Tendency
Spread
Information Theory
The Strong Pull of Normality
A Constant of Convenience
Key Takeaways
Observing Information Mathematically
Average
Spread
Information Distance
Observing Information Applied
Appendix 2.1: On the Inflection Point of the Normal Distribution
References
Notes
3 Co-occurrence
Co-occurrence Conceptually
Correlation as an Information-Weighted Average of Co-occurrence
Pairs of Pairs
Across Many Attributes
Key Takeaways
Co-occurrence Mathematically
The Covariance Matrix
Co-occurrence Applied
References
Note
4 Relevance
Relevance Conceptually
Informativeness
Similarity
Relevance and Prediction
How Much Have You Regressed?
Partial Sample Regression
Asymmetry
Sensitivity
Memory and Bias
Key Takeaways
Relevance Mathematically
Prediction
Equivalence to Linear Regression
Partial Sample Regression
Asymmetry
Relevance Applied
Appendix 4.1: Predicting Binary Outcomes
Predicting Binary Outcomes Conceptually
Predicting Binary Outcomes Mathematically
References
Notes
5 Fit
Fit Conceptually
Failing Gracefully
Why Fit Varies
Avoiding Bias
Precision
Focus
Key Takeaways
Fit Mathematically
Components of Fit
Precision
Fit Applied
Notes
6 Reliability
Reliability Conceptually
Key Takeaways
Reliability Mathematically
Reliability Applied
References
Notes
7 Toward Complexity
Toward Complexity Conceptually
Learning by Example
Expanding on Relevance
Key Takeaways
Toward Complexity Mathematically
Complexity Applied
References
8 Foundations of Relevance
Observations and Relevance: A Brief Review of the Main Insights
Spread
Co-occurrence
Relevance
Asymmetry
Fit and Reliability
Partial Sample Regression and Machine Learning Algorithms
Abraham de Moivre (1667–1754)
Pierre-Simon Laplace (1749–1827)
Carl Friedrich Gauss (1777–1855)
Francis Galton (1822–1911)
Karl Pearson (1857–1936)
Ronald Fisher (1890–1962)
Prasanta Chandra Mahalanobis (1893–1972)
Claude Shannon (1916–2001)
References
Abraham de Moivre
Pierre-Simon Laplace
Carl Friedrich Gauss
Francis Galton
Karl Pearson
Ronald Fisher
Prasanta Chandra Mahalanobis
Claude Shannon
Notes
Concluding Thoughts
Perspective
Insights
Prescriptions
Index
WILEY END USER LICENSE AGREEMENT
Excerpt from the Book
MEGAN CZASONIS
MARK KRITZMAN
.....
Informativeness is related to information theory, the creation of Claude Shannon, arguably the greatest genius of the twentieth century.1 As we discuss in Chapter 2, information theory posits that information is inversely related to probability. In other words, observations that are unusual contain more information than those that are common. We could stop here and rest on Shannon's formidable reputation to validate our inclusion of informativeness as one of the two components of relevance. But it never hurts to appeal to intuition. Therefore, let us consider the following example.
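Before turning to that example, a rough numerical sketch of the inverse relationship may help. The snippet below computes Shannon's information content, -log2 p, for a common observation and an unusual one; the probabilities are illustrative assumptions, not figures from the book.

import math

def information_content(p: float) -> float:
    # Shannon information content, in bits, of an outcome with probability p.
    return -math.log2(p)

# Illustrative (assumed) probabilities.
common_outcome = 0.50    # a routine, middling observation
unusual_outcome = 0.01   # a rare, surprising observation

print(f"Common observation:  {information_content(common_outcome):.2f} bits")   # 1.00
print(f"Unusual observation: {information_content(unusual_outcome):.2f} bits")  # about 6.64

The rarer the observation, the more bits it carries, which is the sense in which unusual observations are said to be more informative.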
Suppose we would like to measure the relationship between the performance of the stock market and a collection of economic attributes (think variables) such as inflation, interest rates, energy prices, and economic growth. Our initial thought might be to examine how stock returns covary with changes in these attributes. If these economic attributes behaved in an ordinary way, it would be difficult to tell which of the attributes were driving stock returns or even if the performance of the stock market was instead responding to hidden forces. However, if one of the attributes behaved in an unusual way, and the stock market return we observed was also notable, we might suspect that these two occurrences are linked by more than mere coincidence. It could be evidence of a fundamental relationship. We provide a more formal explanation of informativeness in Chapter 2, but for now let us move on to similarity.
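As a hedged sketch of this intuition, the snippet below uses synthetic data and hypothetical variable names to flag months in which one economic attribute behaved unusually and the stock market return was also notable. The simulated data and the 1.5-standard-deviation cutoff are assumptions made purely for illustration, not the book's method.

import numpy as np

rng = np.random.default_rng(0)  # synthetic data, purely illustrative

# Hypothetical monthly observations: the change in one economic attribute
# (say, inflation) and the stock market return in the same month.
attribute_change = rng.normal(0.0, 1.0, size=120)
stock_return = 0.4 * attribute_change + rng.normal(0.0, 1.0, size=120)

def z_scores(x: np.ndarray) -> np.ndarray:
    # Standardized deviations from the sample mean.
    return (x - x.mean()) / x.std()

z_attr = z_scores(attribute_change)
z_ret = z_scores(stock_return)

# Flag months in which the attribute was unusual AND the return was notable;
# the 1.5-standard-deviation cutoff is an arbitrary illustrative choice.
joint_surprises = (np.abs(z_attr) > 1.5) & (np.abs(z_ret) > 1.5)
print(f"Months flagged as joint surprises: {joint_surprises.sum()} of {len(z_attr)}")

Months flagged this way are the kind of observations the passage suggests deserve extra weight when judging whether the attribute and the market are genuinely linked.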
.....