
Optimal Currency Denominations

March 8, 2014

The United States currency denominations (bills and coins) are based on a “binary decimal” series, [1, 2, 5], which is also quite popular around the world. The question is whether this popular set of denominations is optimal, and if it is not, how we can find an optimal one. Any solution should satisfy two requirements: first, the denominations should allow a simple greedy algorithm to solve the change-making problem; second, they should allow every possible monetary amount to be represented by the fewest pieces of currency while also minimizing the number of denominations needed in circulation.

Greedy Change Making

A greedy algorithm for making change is optimal when using United States currency denominations, where optimal means minimizing the number of pieces of currency required to make a particular value. As a simple example, making $20.14 in change requires 1 $20 bill, 1 dime ($0.10), and 4 pennies ($0.01), for a total of 6 pieces of currency. The greedy algorithm works by picking the largest denomination that is less than or equal to the amount still due and repeating this process until zero is reached.
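As a sketch, the greedy procedure described above might look like the following in Ruby. Amounts are handled in cents to avoid floating-point rounding issues; the constant and method names are my own, chosen for illustration:

```ruby
# US denominations in cents, largest first: $100 down to the penny.
US_DENOMINATIONS = [10_000, 5_000, 2_000, 1_000, 500, 100, 25, 10, 5, 1]

# Repeatedly take as many of the largest denomination as still fits,
# returning a hash of denomination (in cents) => count.
def greedy_change(amount_cents, denominations = US_DENOMINATIONS)
  denominations.each_with_object({}) do |denom, change|
    count, amount_cents = amount_cents.divmod(denom)
    change[denom] = count if count > 0
  end
end

greedy_change(2014) # => {2000=>1, 10=>1, 1=>4}, i.e. 6 pieces of currency
```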

A property often cited as what makes a greedy algorithm work well is that the ratio between every denomination and the next one below it is at least 2. (Strictly speaking this is a heuristic rather than a full characterization: the set {$1, $5, $11} satisfies it, yet greedy makes $15 as $11 + 4 x $1 instead of 3 x $5. The US system, however, does satisfy the stronger property that greedy is always optimal.) Looking at the current US denominations, the ratio condition holds throughout: $100->$50 (2x), $50->$20 (2.5x), $20->$10 (2x), $10->$5 (2x), $5->$1 (5x), and similarly for coins: quarter->dime (2.5x), dime->nickel (2x), nickel->penny (5x).

So what would happen if the denominations were changed so this wasn’t true? Imagine the nickel were worth $0.06 instead of $0.05. The nickel->penny ratio (6x) would still be fine, but the dime->nickel ratio would drop to 1.67x, which is less than 2. Now, to make change for, say, $0.12, the greedy algorithm picks a dime plus 2 pennies, for a total of 3 coins. The actual optimal solution is two $0.06 nickels, a solution the greedy algorithm never finds.
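To see the failure concretely, here is a small Ruby sketch (amounts in cents, method names illustrative) that compares the greedy count against a true optimum computed by dynamic programming:

```ruby
# Greedy: repeatedly take the largest denomination that still fits.
def greedy_count(amount, denoms)
  denoms.sort.reverse.sum do |d|
    count, amount = amount.divmod(d)
    count
  end
end

# True optimum via dynamic programming: best[v] is the fewest
# coins needed to make v cents with the given denominations.
def optimal_count(amount, denoms)
  best = [0] + Array.new(amount, Float::INFINITY)
  (1..amount).each do |v|
    denoms.each { |d| best[v] = [best[v], best[v - d] + 1].min if d <= v }
  end
  best[amount]
end

coins = [10, 6, 1]       # dime, hypothetical 6-cent nickel, penny
greedy_count(12, coins)  # => 3 (dime + 2 pennies)
optimal_count(12, coins) # => 2 (two 6-cent nickels)
```

With the real US coin set, the two methods agree on every amount, which is exactly what it means for the greedy algorithm to be optimal.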

How can we improve on the 1-2-5 currency?

First, let’s gather some statistics on how well USD currency currently works by tracking the average number of pieces needed to represent every value from $0.01 up to $99.99. Using the popular denominations ($100, $50, $20, $10, $5, $1, $0.25, $0.10, $0.05, $0.01 = 10 denominations), it takes approximately 8.9 pieces of currency on average to represent a value, with 4 values requiring 17 pieces ($89.94, $89.99, $99.94, $99.99).

A far better approach would be to base the currency on “powers of three”. For instance, what if the USD denominations were ($27, $9, $3, $1, $0.27, $0.09, $0.03, $0.01 = 8 denominations)? Then the average number of pieces needed to represent all those same values would be 8.56, an improvement on the current USD denominations that also saves the need for 2 denominations. The greatest number of pieces required is 16 instead of 17 ($80.80, $80.98, $98.80, $98.98).

The code below can help you test out your own currency choices, assuming they satisfy the property that the greedy algorithm is optimal for them.
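A sketch of such a test harness in Ruby (denominations in cents; the method names are my own): for a candidate denomination set, it computes the average and worst-case number of pieces needed to make every value from $0.01 through $99.99 with the greedy algorithm.

```ruby
# Greedy piece count for one amount, in cents.
def pieces_needed(amount, denoms)
  denoms.sort.reverse.sum do |d|
    count, amount = amount.divmod(d)
    count
  end
end

# Average and worst-case piece counts over $0.01..$99.99.
def evaluate(denoms)
  counts = (1..9_999).map { |cents| pieces_needed(cents, denoms) }
  { average: counts.sum / counts.size.to_f, worst: counts.max }
end

usd         = [10_000, 5_000, 2_000, 1_000, 500, 100, 25, 10, 5, 1]
powers_of_3 = [2_700, 900, 300, 100, 27, 9, 3, 1]

evaluate(usd)         # average ~8.90 pieces, worst case 17
evaluate(powers_of_3) # average ~8.56 pieces, worst case 16
```

Swap in your own array of denominations (in cents) to see how it compares; just remember the harness assumes greedy is optimal for the set you supply.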


From → Algorithms, Ruby
