Two common mistakes when it comes to primes: thinking that 1 IS a prime, and forgetting that 2 is prime (the only even prime)!!   The fact that 1 is not a prime is often confusing, right?  After all, it is technically a number that is ‘divisible only by 1 and itself’ – which is how most people remember and word the definition of a prime. So, why isn’t it considered a prime?

Actually, this is not one of those things that ‘came down off the mountain with Moses’.  No high authority decreed that 1 shouldn’t be prime – indeed, there was a time (centuries back) when 1 was considered a prime!  So, it’s one of those things that has developed into a convention over time, for a variety of mathematical reasons.

One hears lots of ‘explanations’ for this, but for me, there are two main reasons. The first is the definition. The most precise definition for a prime is “a number which has exactly two divisors”.  This is almost the same as the definition many of us use (no other divisors besides itself and 1), but it excludes 1 (which only has one divisor). That reason works for me . . .  but I bet that still feels like it’s ‘hand-waving’, doesn’t it? The second reason – below – actually explains the big picture for me and involves a tiny bit of history.
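If you like seeing a definition in code, here’s a quick Python sketch of the ‘exactly two divisors’ idea (the function names are just for illustration – this is a brute-force check, not an efficient primality test):

```python
def divisors(n):
    """Return every positive divisor of n, smallest first."""
    return [d for d in range(1, n + 1) if n % d == 0]

def is_prime(n):
    """Prime = has EXACTLY two divisors (1 and itself)."""
    return len(divisors(n)) == 2

print(divisors(1))   # [1]           -> only ONE divisor, so 1 is not prime
print(divisors(2))   # [1, 2]        -> exactly two, so 2 is prime
print(divisors(7))   # [1, 7]        -> exactly two, so 7 is prime
print(divisors(6))   # [1, 2, 3, 6]  -> four divisors, so 6 is not prime
```

Notice that 1 fails the test not because of some special-case rule in the code, but simply because it has only one divisor.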


Primes are essentially the ‘building blocks’ of all the other integers.  It is true that ANY/EVERY positive integer greater than 1 is either already prime OR can be written as a product of primes.  Examples:  30 = 2 x 3 x 5;  72 = 2 x 2 x 2 x 3 x 3; 35 = 5 x 7, etc.

Actually, even more is true.  It turns out that ANY/EVERY positive integer greater than 1 can be written as a product of primes in only one way!  So, whereas 30 can be ‘factored’ in lots of ways ( 6 x 5 or 10 x 3 or 2 x 15 or 30 x 1), it can only be ‘factored’ into primes in one way, namely 2 x 3 x 5.  (We don’t count 3 x 5 x 2 as ‘different’, by the way.)
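The one-way factoring above is easy to see in code. Here’s a small Python sketch using trial division (again, just for illustration – real factoring algorithms are fancier). Because it always peels off the smallest prime factor first, it lands on the one-and-only prime factorization, written in the usual smallest-to-largest order:

```python
def prime_factors(n):
    """Factor n > 1 into primes by trial division, smallest factor first."""
    factors = []
    d = 2
    while d * d <= n:
        while n % d == 0:   # divide out d as many times as it goes in
            factors.append(d)
            n //= d
        d += 1
    if n > 1:               # whatever is left over is itself prime
        factors.append(n)
    return factors

print(prime_factors(30))  # [2, 3, 5]
print(prime_factors(72))  # [2, 2, 2, 3, 3]
print(prime_factors(35))  # [5, 7]
```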

The phrase above (‘in only one way’) turns out to be important.  Math types love ‘uniqueness’ when they can get it!  This particular uniqueness for prime factorization is considered important.  (It’s actually called the ‘Fundamental Theorem of Arithmetic’.)

So, here’s the big deal, though it will seem like ‘splitting hairs’ to some. 🙂  IF 1 were considered prime, then the ‘uniqueness’ fact above would not be true!!  🙁     You could then write 30 as 2 x 3 x 5 x 1, or even 2 x 3 x 5 x 1 x 1, and so on.

So, long story short and over-simplified: over the years and centuries, it became more important to mathematicians to keep the uniqueness and make 1 a ‘special case’, than it was to keep 1 a prime and lose the uniqueness.  It was just a choice – and a historical development that became a convention.  So 1 is NOT considered prime.