Maths Codes

The document provides a comprehensive overview of various mathematical and statistical techniques, including finding eigenvalues and eigenvectors, reducing polynomials using the Cayley-Hamilton theorem, calculating Z-transforms and their inverses, and analyzing probability distributions. It also covers hypothesis testing methods such as the small sample t-test and chi-square test. Each section includes code snippets for practical implementation of these concepts.


1. Finding eigenvalues & eigenvectors of a matrix.

A]
clc
A = [1 1; 1 1];
s = poly(0, 's');
charPoly = det(A - s * eye(A)); // characteristic polynomial in s
l = roots(charPoly); // eigenvalues are the roots of the characteristic polynomial
for i = 1:2
vectors = [A(1,2); l(i) - A(1,1)];
v1 = vectors / gcd([A(1,2), l(i) - A(1,1)]); // scale to smallest integer entries
printf('Eigenvalue = %g\n', l(i))
disp(v1, "Corresponding eigenvector:")
end

B]
clc
A = [8 -8 -2 ; 4 -3 -2; 3 -4 1];
s = poly(0, 's');
charpoly = det(A-s*eye(A));
l = roots(charpoly);
for i=1:3
printf('Eigenvalue = %g\n', l(i));
x = [A(1,2) A(1,3); A(2,2)-l(i) A(2,3)];
y = [A(1,1)-l(i) A(1,3); A(2,1) A(2,3)];
z = [A(1,1)-l(i) A(1,2); A(2,1) A(2,2)-l(i)];
vectors = [det(x);-det(y);det(z)];
disp(vectors);
end
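As a hedged cross-check on both variants above (not part of the original scripts), Scilab's built-in spec() returns eigenvalues and eigenvectors directly; a minimal sketch, reusing the matrix from B]:

// Cross-check with Scilab's built-in spec(): eigenvalues on the diagonal of D,
// eigenvectors (normalized differently than above) in the columns of V
A = [8 -8 -2; 4 -3 -2; 3 -4 1];
[V, D] = spec(A);
disp(diag(D), "Eigenvalues from spec():");
disp(V, "Eigenvectors (columns) from spec():");
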
2. Reduction of higher degree polynomials using Cayley-Hamilton Theorem.
clc
clear

// Define the 3x3 matrix


A = [2, 1, 1; 0, 1, 0; 1, 1, 2];

// Define the variable s as a polynomial


s = poly(0, 's');

// Calculate the characteristic polynomial


charPoly = det(A - s*eye(A));

// Display the characteristic polynomial


disp("Characteristic Polynomial = " + string(charPoly));

// Define the numerator polynomial


numerator = s^8 - 5*s^7 + 7*s^6 - 3*s^5 + s^4 - 5*s^3 + 8*s^2 - 2*s + 1;
denominator = charPoly;

// Perform polynomial division


[remainder, quotient] = pdiv(numerator, denominator);

// Display the quotient and remainder


disp("reduced polynomial = " + string(remainder));
disp(" s^8 - 5*s^7 + 7*s^6 - 3*s^5 + s^4 - 5*s^3 + 8*s^2 - 2*s + 1 = " + string(remainder));
3. Finding diagonal matrix using eigenvalues & eigenvectors.
clc
// Define your own matrix
A = [ -1, 2, -3; 6, 8, -6; 9, -1, 4 ];

// Compute eigenvalues and eigenvectors


[C, D] = spec (A);

// Display eigenvalues
disp("The-Eigenvalues of matrix-A-are:");
disp (D);

// Display corresponding eigenvectors


disp("The corresponding Eigenvectors of matrix-A-arse: ");
disp (C);

// Compute inverse of modal matrix


M = inv (C);
disp("The inverse of the modal matrix is: ");
disp (M);

// Compute the diagonalized matrix


D = M*A*C;
disp("The diagonalized matrix of A is: ");
disp (D);
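A hedged sanity check that can be appended to the script above (not in the original): reconstruct A from the modal matrix and the diagonal matrix, and use clean() to suppress floating-point noise in the displayed diagonal form:

// Reconstruct A from C * D * inv(C); the error should be numerically zero
A_rebuilt = C * D * inv(C);
disp(max(abs(A - A_rebuilt)), "Reconstruction error:");
// Diagonal form with tiny round-off entries zeroed out
disp(clean(D), "Diagonalized matrix (cleaned):");
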
4. Finding Z transform of functions.

A]
clc
f = [9,-4, 3, 7, 8];
disp("Input sequence x [n] (n ranges from -2 to 2):", f);

z = %z;

k = [-2, -1, 0, 1, 2]; // sample indices n for the entries of f, matching the range stated above

Z_transform = 0;

for i= 1:length (f)


Z_transform = Z_transform + f(i) * z^(-k(i));
end

disp("Z-transform-of-x: ");
disp(Z_transform);

B]
clc

a=3
n=9

f = zeros (1, n);


for j=1:n
f(j) = a^j;
end

disp("Input sequence :", f);

z = %z;

Z_transform = 0;
for i = 1:length (f)
Z_transform =Z_transform + f(i) * z^(-i);
end

disp("2-transform-of-x:");
disp (Z_transform);
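A hedged check on part B] (appended to the script above, not in the original): for the geometric sequence a^n, n >= 1, the infinite-sum Z-transform has the closed form a/(z - a) for |z| > a, so the truncated 9-term sum should be close to it at a test point well away from the pole (z0 = 10 is an arbitrary choice):

// Compare the truncated sum with the closed form a/(z - a) at an arbitrary test point z0
z0 = 10;
approx = sum(f .* z0.^(-(1:length(f)))); // truncated sum evaluated at z0
exact = a / (z0 - a); // closed form of the infinite geometric series
disp(approx, "Truncated sum at z0:");
disp(exact, "Closed form a/(z0 - a):");
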
5. Finding inverse Z transform of functions.

syms z n;
Z = 1 / ((z - 3) * (z - 1));
X = iztrans(Z, z, n);
disp('The inverse Z-transform is:');
disp(X);
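The snippet above relies on symbolic functions (syms, iztrans) from MATLAB's Symbolic Math Toolbox; base Scilab has no direct equivalent. By partial fractions, 1/((z - 3)(z - 1)) = (1/2)[1/(z - 3) - 1/(z - 1)], whose causal inverse is x[n] = (3^(n-1) - 1)/2 for n >= 1. A minimal Scilab sketch tabulating these values for comparison:

// Tabulate the analytic inverse Z-transform x[n] = (3^(n-1) - 1)/2 for n >= 1
for n = 1:5
printf("x[%d] = %g\n", n, (3^(n-1) - 1)/2);
end
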
6. Finding probability for the given probability distribution.

A]
clc
clear

x = [0 1 2 3 4 5];
px = [0.2 0.2 0.25 0.15 0.1 0.1];

y = 2*x.^2 - 3; // Transformation of X to Y
py = px; // Probabilities remain same since mapping is one-to-one

printf("Probability Distribution of Y:\n");


printf("Y\tP(Y)\n");
for i = 1:length(y)
printf("%d\t%f\n", y(i), py(i));
end

// Probability Y >= 15
py_15 = sum(py(y >= 15));

// Probability Y >= 0
py_0 = sum(py(y >= 0));

// Probability Y >= 29
py_29 = sum(py(y >= 29));

printf("Probability of Y >= 15: %f\n", py_15);


printf("Probability of Y >= 0: %f\n", py_0);
printf("Probability of Y >= 29: %f\n", py_29);
B]
clc
clear

// Define the probability distribution of X


x = [1 2 3 4 5 6];
px = ones(1,6)/6; // fair die: each outcome has probability exactly 1/6

// Print the probability distribution of X


printf("Probability Distribution of X:\n");
printf("X\tP(X)\n");
for i = 1:length(x)
printf("%d\t%f\n", x(i), px(i));
end

// Find the probability of X >= 2


X_2 = x >= 2;
px_2 = sum(px(X_2)); // Logical indexing works if sizes match

// Find the probability of X < 6


X_6 = x < 6;
px_6 = sum(px(X_6));

// Print the results


printf("Probability of X >= 2: %f\n", px_2);
printf("Probability of X < 6: %f\n", px_6);

C]
x=[ 0 1 2 3 4 ];
px=[1/16 4/16 6/16 4/16 1/16];
printf("Probability Distribution of each Faces:\n");
printf("x\tp(x)\n");
for i=1:length(x)
printf("%d\t%f\n",x(i),px(i));
end
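The distribution in C] is the Binomial(4, 1/2) pmf, e.g. the number of heads in four fair coin tosses: p(x) = C(4, x)/16 for x = 0, ..., 4, which gives exactly the listed values 1/16, 4/16, 6/16, 4/16, 1/16.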
7. Finding Expectation & Variance of continuous probability distribution.
A]
clc
clear
// Define the probability density function (PDF)
deff("y = f(x, k)", "y = k * x^2 * (1 - x^3)");

// Assume the constant 'k' as 1 for the initial normalization step


k_initial = 1;
deff("y = f_initial(x)", "y = f(x, " + string(k_initial) + ")");
total_prob_initial = intg(0, 1, f_initial);
k = 1 / total_prob_initial; // Normalize

// Define the properly normalized PDF


deff("y = f_normalized(x)", "y = " + string(k) + "*x^2*(1 - x^3)");
total_prob = intg(0, 1, f_normalized);
disp("Total Probability (Should be 1): ", total_prob);

// Calculate the Mean [E[X]]


deff("y = g(x)", "y = x * (" + string(k) + "*x^2*(1 - x^3))");
mean_x = intg(0, 1, g);
disp("Mean (E[X]): ", mean_x);

// Calculate the Expected Value of X² [E[X²]]


deff("y = h(x)", "y = x^2 * (" + string(k) + "*x^2*(1 - x^3))");
expected_x2 = intg(0, 1, h);
disp("E[X²]: ", expected_x2);

// Calculate the Variance: Var(X) = E[X²] - (E[X])²


variance = expected_x2 - mean_x^2;
disp("Variance: ", variance);
B]
clc
clear
p_oppose = 0.40;
p_favor = 0.60;

EX = 0*p_oppose+1 * p_favor;
EX2 = (0^2) * p_oppose + (1^2) * p_favor;
VarX = EX2 - (EX)^2;

disp(EX, 'Expected value E(X):');


disp(VarX, 'Variance Var(X):');

C]
clc
clear
p1=1/6
p2=1/6
p3=1/6
p4=1/6
p5=1/6
p6=1/6
EX=(1*p1)+(2*p2)+(3*p3)+(4*p4)+(5*p5)+(6*p6)
EX2 = ((1^2)*p1) + ((2^2) *p2) + ((3^2) *p3) + ((4^2) *p4)+((5^2) *p5)+((6^2) *p6)

VarX = EX2 - (EX)^2;


disp ( 'Expected value E(X) :' ,EX);
disp ( 'Variance Var(X) :', VarX);
8. Finding probability using Normal distribution.
clc
clear

mu = 4400;
sigma = 620;

p1 = 1-cdfnor("PQ", 3300, mu, sigma);


disp("P(X < 3300) = " + string(p1));

p2 = cdfnor("PQ", 5400, mu, sigma);


disp("P(X ≤ 5400) = " + string(p2));

p3 = cdfnor("PQ", 4400, mu, sigma) - cdfnor("PQ", 3500, mu, sigma);


disp("P(3500 ≤ X ≤ 4400) = " + string(p3));
9. Testing of hypothesis using small sample test.
clc
clear
//Step.1
mu = 140
disp("H_0 : mu = " + string(mu)) // Null Hypothesis
disp("H_1 : mu != " + string(mu)) // Alternative Hypothesis

//Step.2
alpha = 0.05 // level of significance given
alpha = alpha / 2 // as it is both tail
df = 26 - 1 // degree of freedom

//Step.3
T_critical = abs(cdft("T", df, alpha, 1 - alpha));
disp("Critical value = " + string(T_critical))

// t-test statistic calculation


xbar = 147
mu = 140
s = 16
n = 26
T = (xbar - mu) / (s / sqrt(n));

// Display results
disp("Calculated t-value: " + string(T));
disp("Critical t-value: " + string(T_critical));

// Make the decision


if abs(T) > T_critical then
disp("Reject the null hypothesis (H0).");
else
disp("Fail to reject the null hypothesis (H0).");
end
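As an additional hedged check (not part of the original), the two-tailed p-value can be computed with cdft("PQ", ...); rejecting H0 when p < 0.05 gives the same decision as the critical-value comparison above:

// Two-tailed p-value: twice the upper-tail area beyond |T|
[P, Q] = cdft("PQ", abs(T), df);
p_value = 2 * Q;
disp(p_value, "Two-tailed p-value:");
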
10. Testing the hypothesis using Chi-square test.
clc
clear

// Null and Alternative Hypothesis


disp("H_0 : Goodness of fit") // Null Hypothesis
disp("H_1 : Not Goodness of fit") // Alternative Hypothesis

// Step 2: Define significance level and degrees of freedom


alpha = 0.05; // Level of significance
n = 7;
df = n - 1;

// Step 3: Compute critical chi-square value


chi_critical = cdfchi("X", df, 1-alpha, alpha);
disp("Critical value = " + string(chi_critical));

// Chi-square test statistic calculation


O = [13, 15, 9, 11, 12, 10, 14]; // Observed values
E = sum(O) / n; // Expected frequency
chi = 0;

for i = 1:n
chi = chi + ((O(i) - E)^2) / E;
end

// Display results
disp("Calculated chi-value: " + string(chi));
disp("Critical z-value: " + string(chi_critical));

// Make the decision


if abs(chi) > chi_critical then
disp("Reject the null hypothesis (H0).");
else
disp("Fail to reject the null hypothesis (H0).");
end
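A hedged cross-check (appended to the script above, not in the original): the p-value of the observed statistic via cdfchi("PQ", ...), where rejecting H0 for p < 0.05 matches the critical-value decision:

// Upper-tail p-value of the observed chi-square statistic
[P, Q] = cdfchi("PQ", chi, df);
disp(Q, "p-value:");
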
