
Notation

Symbol      Meaning
σ           Standard deviation
μ           Mean
V           Variance
E           Expectation
f(x)        Function of x
x → y       Mapping from x to y
(a, b)      Open interval
[a, b]      Closed interval
(a, b]      Half-open interval
Δ           Differential operator
Π           Product operator
Σ           Summation operator
|x|         Absolute value of x
‖x‖         Norm of x
#A          Number of elements in A
A ∩ B       Intersection of sets A, B
A ∪ B       Union of sets A, B
A × B       Cartesian product of sets A, B
∈           Element of
∧           Logical conjunction
¬           Logical negation
{}          Set delimiters
P(X|Y)      Probability of X given Y
∀           For all
∃           There exists
A ⊆ B       A is a subset of B
A ⊂ B       A is a proper subset of B
f_X(x)      Probability density function of random variable X
F_X(x)      Cumulative distribution function of random variable X
∼           Distributed according to
∝           Proportional to
≜           Equal by definition
:=          Equal by definition
⊥           Perpendicular to
∴           Therefore
⇒           Implies
≡           Equivalent to
X           Matrix X
x           Vector x
sgn(x)      Sign of x
R           Real line
R^n         n-dimensional vector space
R^(m×n)     m × n-dimensional matrix space
U(a, b)     Uniform distribution on the interval (a, b)
N(μ, σ²)    Normal distribution with mean μ and variance σ²
→ (a.s.)    Converges almost surely
→ (d)       Converges in distribution
→ (P)       Converges in probability
Tr          Trace of a matrix (sum of its diagonal entries)
diag        Matrix diagonal
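
As a bridge between this notation and the book's Python toolchain, the following minimal sketch (our own illustration, not the author's code; the parameter values, sample sizes, and variable names are arbitrary choices) evaluates E, V, and σ for X ∼ N(μ, σ²) with scipy.stats, and demonstrates the convergence-in-probability arrow → (P) by letting the sample mean approach μ as the sample size grows:

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

mu, sigma = 1.0, 2.0                  # mean μ and standard deviation σ
X = stats.norm(loc=mu, scale=sigma)   # X ∼ N(μ, σ²)

print(X.mean())   # E(X) = μ,  prints 1.0
print(X.var())    # V(X) = σ², prints 4.0
print(X.std())    # σ,         prints 2.0

# Weak law of large numbers: the sample mean converges in probability
# (the "→ (P)" arrow above) to μ as the sample size n grows.
for n in (10, 1_000, 100_000):
    samples = X.rvs(size=n, random_state=rng)
    print(n, samples.mean())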
Index

A
AdaBoost, 459
Almost Sure Convergence, 139
Association test, 185

B
Backpropagation, 466
Backtracking algorithm, 474
Bagging, 456
Bernstein-von Mises theorem, 341
Beta distribution, 101
Bias/Variance trade-off, 382
Boosting, 459
Boosting trees, 398
Breakdown point, 233

C
Cauchy-Schwarz inequality, 65, 73
Censoring, 273
Central limit theorem, 145
Chebyshev Inequality, 130
Chi-squared distribution, 94
Cluster distortion, 453
Complexity penalty, 376
conda package manager, 2
Conditional Expectation Projection, 67
Confidence intervals, 154, 205
Confidence sets, 206
Confusion matrix, 376
Convergence in distribution, 144
Convergence in probability, 141
Convolution, 484
Convolutional Neural Network Using Keras, 490
Cook's distance, 221
Cox proportional hazards model, 276
Cross-Validation, 379
ctypes, 41
Curse of dimensionality, 260
Cython, 41

D
Delta Method, 157
Dirichlet distribution, 103

E
Exact line search, 473
Expectation Maximization, 279
Explained variance ratio, 439
Exponential Family, 417
Exponentially Weighted Moving Average, 477

F
False-discovery rate, 177
FastICA, 448
Feature engineering, 378
Fisher Exact Test, 178
Functional Deep Learning, 469

G
Gamma distribution, 100
Gauss-Markov, 242
Generalized Likelihood Ratio Test, 172
Generalized Linear Models, 416
Generalized Maximum Likelihood Estimators, 231
Generalized PCA, 443
Gradient Boosting, 403

H
Hazard functions, 274
Hoeffding Inequality, 131
Huber Functions, 233

I
Idempotent property, 66
Independent Component Analysis, 447
Information entropy, 107, 394
Inner Product, 67
Inverse CDF Method, 118, 120
ipcluster, 44

J
Jensen's inequality, 134
Jupyter notebook, 21

K
Keras, 462
Kernel Density Estimation, 246
Kernel trick, 437
Kolmogorov-Smirnov test, 269
Kullback-Leibler Divergence, 111

L
Lagrange multipliers, 423
Lasso regression, 431
Lebesgue integration, 48
Loglinear models, 309

M
Mann-Whitney-Wilcoxon Test, 263
Markov inequality, 128
Maximal margin algorithm, 433
Maximum A Posteriori Estimation, 224
Maximum Pooling, 487
MCAR, 335
Measurable function, 49
Measure, 49
M-estimates, 231
Mini-batch gradient descent, 476
Minimax risk, 147
Minimum mean squared error (MMSE), 67
Missing at random (MAR), 335
Missing not at random (MNAR), 335
Moment generating functions, 114
Momentum, 477
Monte Carlo Sampling Methods, 117
Multi-Layer Perceptron, 463
Multilinear regression, 361
Multinomial distribution, 92
Multiple Imputation, 338
multiprocessing, 43

N
Negative binomial distribution, 105
Negative multinomial distribution, 105
Newton's method, 472
Neyman-Pearson test, 170
Normal distribution, 91

O
Out-of-sample data, 367

P
Pandas, 24
  dataframe, 27
  series, 25
Parametric regression models, 276
Perceptron, 457, 461
Permutation test, 175
Plug-in principle, 152
Poisson distribution, 97
Polynomial regression, 362
Probability Proportional to Size Cluster Sampling, 304
Projection operator, 65
P-Values, 169
Pypy, 42

R
Random forests, 397
Receiver Operating Characteristic, 167
Rectified Linear Activation, 488
Rejection Method, 122
Ridge regression, 426
Robust Estimation, 230
runsnakerun, 42

S
SAGE, 29
Sampling without replacement, 296
scipy, 24
Seaborn, 138
Shatter coefficient, 372
Silhouette coefficient, 454
Stochastic gradient descent, 475
Stratified Random Sampling, 306
Strong law of large numbers, 145
Sufficient statistics, 206
Survey sampling, 292
Survival curves, 271
SWIG, 41
Sympy, 29
  lambdify, 150
  Statistics module, 137

T
Theano, 481
Tower property of expectation, 68

U
Uniqueness theorem, 115
universal functions, 4

V
Vapnik-Chervonenkis Dimension, 371

W
Wald Test, 176
Weak law of large numbers, 145

X
xarray, 31
