
Assuming that gcd(a, b) = 1, prove that gcd(2a + b, a + 2b) = 1 or 3

Proof:  Let d = gcd(2a + b, a + 2b)

Then there exist integers r and s such that 2a + b = dr and a + 2b = ds

Now, 3a = 2(2a + b) - (a + 2b) = 2dr - ds = d(2r - s), and hence d | 3a

Also, 3b = 2(a + 2b) - (2a + b) = 2ds - dr = d(2s - r), and hence d | 3b

Since d | 3a and d | 3b, d is a common divisor of 3a and 3b, so d ≤ gcd(3a, 3b) = 3 gcd(a, b) = 3 [∵ gcd(a, b) = 1]

So at this point, we can say that d = 1, 2, or 3
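As a quick numerical illustration (a sketch, not part of the proof), the short Python check below computes d = gcd(2a + b, a + 2b) for a few arbitrarily chosen coprime pairs and shows that both of the values 1 and 3 genuinely occur:

```python
from math import gcd

# Illustrative check (not part of the proof): for a few coprime pairs (a, b),
# d = gcd(2a + b, a + 2b) comes out as 1 or 3, never 2.
for a, b in [(1, 1), (1, 2), (2, 3), (3, 5), (4, 7)]:
    d = gcd(2 * a + b, a + 2 * b)
    print(f"a = {a}, b = {b}: gcd(2a + b, a + 2b) = {d}")
# e.g. (1, 1) gives 3 and (1, 2) gives 1, so both cases in the statement occur.
```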

But we note that d cannot be equal to 2.

To see this, suppose that d = 2. Then since d | (2a + b), it must be true that 2 | (2a + b)

⟹ 2a + b = 2n for some integer n

⟹ b = 2(n - a)

i.e., 2 | b

Similarly, since d | (a + 2b), it must be the case that

2 | (a + 2b)

⟹ a + 2b = 2m for some integer m

⟹ a = 2(m - b)

i.e., 2 | a

But 2 | a and 2 | b together contradict our original assumption that a and b are relatively prime, so d ≠ 2. Hence d = 1 or 3

∴ gcd(2a + b, a + 2b) = 1 or 3 (Proved)
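For readers who like to double-check such results computationally, here is a minimal brute-force sketch (the search bound N = 200 is an arbitrary choice for illustration): it confirms that over all coprime pairs with 1 ≤ a, b ≤ N the gcd is always 1 or 3, and that both values occur.

```python
from math import gcd

# Brute-force sanity check of the result over small coprime pairs.
# N is an arbitrary illustrative bound, not part of the proof.
N = 200
values = set()
for a in range(1, N + 1):
    for b in range(1, N + 1):
        if gcd(a, b) == 1:
            values.add(gcd(2 * a + b, a + 2 * b))

print(values)            # expected output: {1, 3}
assert values == {1, 3}  # the gcd is never 2, and both 1 and 3 occur
```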
