MATH 1200 - ASSIGNMENT 4

YOUR NAME

4.
We need to prove that if gcd(a, b) = 1 and there exists an integer c such that c|(a + b),
then gcd(a, c) = gcd(b, c) = 1.

Proof:

Let d = gcd(a, c). Then d|a and d|c. Since c|(a + b), we can write a + b = kc for
some integer k, and hence b = kc − a. Because d|c, we also have d|kc; together with d|a,
this gives d|(kc − a), that is, d|b. Thus d is a common divisor of a and b, and since
gcd(a, b) = 1, it follows that d = 1. Therefore, we have shown that gcd(a, c) = 1.

Similarly, let e = gcd(b, c). Then e|b and e|c. Writing a = kc − b, the same argument as
above shows that e|kc and e|b, so e|(kc − b), that is, e|a. Thus e is a common divisor of a
and b, and since gcd(a, b) = 1, it follows from the definition of the greatest common divisor
that e = 1. Therefore, we have shown that gcd(b, c) = 1.

Hence, we have shown that if gcd(a, b) = 1 and there exists an integer c such that c|(a + b),
then gcd(a, c) = gcd(b, c) = 1.
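The result above can also be checked numerically. The following sketch (not part of the assignment; the helper `divisors` and the search bound 40 are arbitrary choices for illustration) exhaustively tests every coprime pair (a, b) in a small range against every divisor c of a + b:

```python
# Sanity check of the claim: if gcd(a, b) = 1 and c | (a + b),
# then gcd(a, c) = 1 and gcd(b, c) = 1.
from math import gcd

def divisors(n):
    """Return all positive divisors of n (naive trial division)."""
    return [d for d in range(1, n + 1) if n % d == 0]

checked = 0
for a in range(1, 40):
    for b in range(1, 40):
        if gcd(a, b) != 1:
            continue  # hypothesis gcd(a, b) = 1 not met
        for c in divisors(a + b):
            # every divisor c of a + b must be coprime to both a and b
            assert gcd(a, c) == 1 and gcd(b, c) == 1
            checked += 1

print(f"verified {checked} cases with no counterexample")
```

No assertion fires over the whole search range, which is consistent with the proof.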

Date: March 30, 2023.

