Copyright © University of Cambridge. All rights reserved.

'Impossible Square?' printed from https://nrich.maths.org/

We received three good solutions to this problem. First of all, Josh from Lyng Hall wrote in:
The area of the square you make can either be worked out by:
  • base x height

or,
  • area of triangle x number of triangles

Dealing with the triangles first, they are right-angled, with 30 and 60 degree angles in the other two corners and a hypotenuse of unit length. This means, by trigonometry, the other two sides will be $\frac{1}{2}$ and $\frac{\sqrt{3}}{2}$. From this we can deduce the area of the triangle will be: $$\frac{1}{2} \times \frac{\sqrt{3}}{2} \times \frac{1}{2} = \frac{\sqrt{3}}{8}$$ The square's area must be a whole-number multiple of this, so its area is: $$n\frac{\sqrt{3}}{8}$$ The edges of the square must be made up of a combination of sides of the triangle, so they can be written as follows:
  • Base: $a\times \frac{1}{2} + b\times \frac{\sqrt{3}}{2} + c\times 1$
  • Height: $d\times \frac{1}{2} + e\times \frac{\sqrt{3}}{2} + f\times 1$
So its area is: $$\left(\frac{a}{2} + b\frac{\sqrt{3}}{2} + c\right)\times\left(\frac{d}{2} + e\frac{\sqrt{3}}{2} + f\right) = \frac{ad}{4} + ae\frac{\sqrt{3}}{4}+ \frac{af}{2} + bd\frac{\sqrt{3}}{4} + be\frac{3}{4} + bf\frac{\sqrt{3}}{2} + \frac{cd}{2} + ce\frac{\sqrt{3}}{2} + cf$$

If this is to equal $n\frac{\sqrt{3}}{8}$, an irrational multiple of $\sqrt{3}$, then the parts without $\sqrt{3}$ must cancel out. None of $a$, $b$, $c$, $d$, $e$ and $f$ can be negative, as they count edges, so each of the products $ad$, $af$, $be$, $cd$ and $cf$ must be zero. If we take $a$ to be zero then the first two are zero; if $c$ is zero then the last two are zero; this leaves $be$. If $b$ is zero then the base has no length, so it must be $e$ that is zero. Applying this to the expansion leaves: $bd\frac{\sqrt{3}}{4}+ bf\frac{\sqrt{3}}{2}$, which could equal $n\frac{\sqrt{3}}{8}$.

But a square also has to have both side lengths equal. This means (with $a$, $c$ and $e$ zero) that: $$b\frac{\sqrt{3}}{2} =\frac{d}{2} + f$$ This cannot be true, as the right-hand side is rational while the left-hand side is an irrational multiple of $\sqrt{3}$ (and $b$ cannot be zero). The same reasoning can be tried with other selections set to zero, but the only other possible combination is $b$, $d$ and $f$ all zero. You then end up with: $$\frac{a}{2} + c = e\frac{\sqrt{3}}{2}$$ which is again impossible.
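For readers who like to check such calculations by machine, here is a short Python sketch (using sympy, which is our choice of tool and not part of Josh's solution) confirming the side lengths and the area $\frac{\sqrt{3}}{8}$ of the 30-60-90 triangle:

```python
# Check the 30-60-90 triangle with unit hypotenuse: legs 1/2 and sqrt(3)/2,
# and area sqrt(3)/8. Illustrative only; assumes sympy is installed.
from sympy import sin, pi, sqrt, Rational, simplify

short_leg = sin(pi / 6)              # side opposite the 30 degree angle
long_leg = sin(pi / 3)               # side opposite the 60 degree angle
print(short_leg, long_leg)           # 1/2  sqrt(3)/2

area = Rational(1, 2) * short_leg * long_leg
print(simplify(area - sqrt(3) / 8))  # 0, so the area is sqrt(3)/8
```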
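The expansion itself, and the separation into terms with and without $\sqrt{3}$, can likewise be reproduced symbolically, and the final step spot-checked over a small range of values (again only a sketch, not part of the argument):

```python
# Reproduce Josh's expansion and split off the terms without sqrt(3), then
# spot-check that b*sqrt(3)/2 never equals d/2 + f for small non-negative
# integers with b > 0. A finite check, so an illustration rather than a proof.
from sympy import symbols, sqrt, expand, Rational
import itertools

a, b, c, d, e, f = symbols('a b c d e f', nonnegative=True)
base = a * Rational(1, 2) + b * sqrt(3) / 2 + c
height = d * Rational(1, 2) + e * sqrt(3) / 2 + f

area = expand(base * height)
rational_part, root3_part = area.as_independent(sqrt(3))
print(rational_part)   # a*d/4 + a*f/2 + 3*b*e/4 + c*d/2 + c*f
print(root3_part)      # the four terms that carry a factor of sqrt(3)

for b_, d_, f_ in itertools.product(range(1, 10), range(10), range(10)):
    assert abs(b_ * 3 ** 0.5 / 2 - (d_ / 2 + f_)) > 1e-9
print("no small counterexamples to b*sqrt(3)/2 = d/2 + f")
```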


Steven from City of Sunderland College sent in a similar excellent solution. You can read his write-up in this attached Word document.

Robert also gave a solution, and realised that a very advanced (university-level) proof would require us to justify the following results, for which he gave three good proofs. This section would make good reading for any students considering taking a mathematics degree.

In order to solve this problem I will first state and prove some theorems concerning irrational numbers.

Theorem: $\sqrt{3}$ is irrational.

Proof: Assume $\sqrt{3} = a/b$ where $a$ and $b$ are integers and $a/b$ is a fraction in its simplest terms. Then $$ 3 = a^2/b^2 \Rightarrow 3b^2 = a^2 \Rightarrow 3|a^2 \Rightarrow 3|a \Rightarrow 9|a^2 $$ (here $3|a^2 \Rightarrow 3|a$ because 3 is prime). So let $a^2 = 9m$ for some integer $m$. Then $3b^2 = 9m \Rightarrow b^2 = 3m \Rightarrow 3|b^2 \Rightarrow 3|b$. So 3 divides both $a$ and $b$, which means the fraction $a/b$ is not in its simplest form, a contradiction. Thus $\sqrt{3}$ does not equal $a/b$ for any integers $a$ and $b$, and so $\sqrt{3}$ is irrational.
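The two facts this argument leans on can be spot-checked in Python over a finite range; this is an illustration included for interest, not a substitute for the proof:

```python
# Illustration only: check that 3 dividing a**2 forces 3 to divide a, and
# that no fraction with small numerator and denominator squares to exactly 3.
from fractions import Fraction

for a in range(1, 10_000):
    if (a * a) % 3 == 0:
        assert a % 3 == 0          # 3 | a**2  implies  3 | a

assert all(Fraction(p, q) ** 2 != 3
           for p in range(1, 200) for q in range(1, 200))
print("checks passed")
```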

Now I shall state and prove some further theorems:

Theorem: An irrational number multiplied by a non-zero rational number is irrational.

Proof: Let $m$ be an irrational number and let $n$ be a non-zero rational number. Thus $n = a/b$ for integers $a$ and $b$, both non-zero. Let us assume that $mn$ is rational, so that $mn = c/d$ for some integers $c$ and $d$. Therefore, $$ (am)/b = c/d \Rightarrow am = (bc)/d \Rightarrow m = (bc)/(ad) $$ Because $bc$ and $ad$ are both integers, say $e$ and $f$ respectively, $(bc)/(ad) = e/f$ is rational. However $e/f$ also equals $m$, which is known to be irrational, which is a contradiction. Therefore an irrational multiplied by a non-zero rational must be irrational.
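The rearrangement used here can be checked symbolically, and a concrete instance tested, with a few lines of sympy (the expected outputs in the comments rely on sympy's assumption engine; this is our illustration, not part of the proof):

```python
# Solve m*(a/b) = c/d for m, and test one instance of the theorem.
from sympy import symbols, Rational, sqrt, solve, Eq

a, b, c, d, m = symbols('a b c d m', nonzero=True)
print(solve(Eq(m * a / b, c / d), m))          # [b*c/(a*d)]
print((Rational(2, 5) * sqrt(3)).is_rational)  # False
```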

Theorem: A non-zero rational number divided by an irrational number is irrational.

Proof: Let $m$ be irrational and $n$ be a non-zero rational, and let us assume that $n/m$ is rational, say $n/m = a/b$ where $a$ and $b$ are both integers (with $a \neq 0$, since $n \neq 0$). Because $n$ is rational we can write it as $c/d$ where $c$ and $d$ are both integers. Therefore, $$ c/(dm) = a/b \Rightarrow m = (bc)/(ad) $$ Because $ad$ and $bc$ are both integers, say $e$ and $f$ respectively, $m = f/e$, which is a contradiction because $m$ is irrational and $f/e$ is rational. Thus $n/m$ must be irrational.
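A similar quick sympy check for this proof's rearrangement, with one instance of the theorem (again only illustrative):

```python
# Solve (c/d)/m = a/b for m, and test one instance of the theorem.
from sympy import symbols, Rational, sqrt, solve, Eq

a, b, c, d, m = symbols('a b c d m', nonzero=True)
print(solve(Eq((c / d) / m, a / b), m))        # [b*c/(a*d)]
print((Rational(2, 5) / sqrt(3)).is_rational)  # False
```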

Theorem: A rational number plus an irrational number is irrational.

Proof: Let $m$ be irrational and $n$ be rational, hence $n$ can be expressed as $a/b$ for integers $a$ and $b$. Let us assume that $m + n$ is rational, so that $m + n = c/d$ for integers $c$ and $d$. Then $m + a/b = c/d \Rightarrow m = (bc - ad)/(bd)$, which is rational. But we have said that $m$ is irrational, so this is a contradiction, and thus $m + n$ must be irrational.
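And the same style of check for the final proof (illustrative only):

```python
# Solve m + a/b = c/d for m, compare with (b*c - a*d)/(b*d), and test an instance.
from sympy import symbols, Rational, sqrt, solve, Eq, simplify

a, b, c, d, m = symbols('a b c d m', nonzero=True)
sol = solve(Eq(m + a / b, c / d), m)[0]
print(simplify(sol - (b * c - a * d) / (b * d)))  # 0
print((Rational(2, 5) + sqrt(3)).is_rational)     # False
```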