The History of Credit Cards



For better or worse, credit cards are a cornerstone of the American economy. At the end of 2017, the average American held 3.1 credit cards with an average balance of $6,354, plus 2.5 retail credit cards with an additional balance of $1,841, according to Experian's State of Credit report. But when were credit cards invented?

"Paying with plastic" is so commonplace that total U.S. credit card debt topped $1 trillion last year, according to the Federal Reserve.

But have you ever stopped to think about how we got to this place? Perhaps the most amazing thing about credit cards is how relatively quickly they've become essential to modern capitalism.

Most historians trace the modern credit card to the 1950 founding of Diners Club, which issued the first charge card that could be used to make purchases at multiple retailers. Diners Club was a new twist on an ancient practice.

Here is a brief history of credit cards.

Early Forms of Credit

For thousands of years, merchants have used credit to help their customers finance purchases. For example, seeds could be sold to farmers on terms that permitted payment after the harvest.

Some of the earliest written examples of a credit system appear in the Code of Hammurabi, named after the ruler of Babylon from 1792 to 1750 B.C., in what is now Iraq. These laws established rules for lending and repaying money, and for how interest could be charged.

Historically, a loan was a financial agreement between a single borrower and a single creditor or merchant. In more modern times, a customer might be able to "run a tab" with an individual merchant, which is a revolving line of credit that can be continuously borrowed against and has no fixed payoff date. This is the equivalent of a store credit card that's not part of a larger payment network.


The Merging of "Credit" and "Card"

In the late 19th and early 20th centuries, companies built on the idea of revolving credit by issuing a physical object that could be used to easily identify customer accounts. Some took the form of coins or medals stamped with the name and logo of the merchant, as well as the customer's account number.

Just as with many credit card transactions in the late 20th century, the merchant would make an imprint of the coin or medal on the customer's sales slip. In the 1930s, these coins and medals evolved into rectangular metal cards called Charga-Plates, which looked like a cross between a credit card and a military dog tag.


The Final Countdown

With consumers carrying around rectangular metal cards that they could use to make purchases, there were just a few things missing before someone could create the modern payment card:

First, someone had to conceive of a financial instrument that could be used to make charges at multiple merchants. An early example was the Air Travel Card, which allowed travelers in the 1940s and '50s to purchase tickets on credit from multiple airlines.

The modern payment card was created in 1950 by Ralph Schneider and Frank McNamara, the founders of Diners Club. Theirs was the first general-purpose charge card, but it required consumers to pay each month's statement balance in full.

Later, American Express and others would offer customers the option to carry a balance on their cards. This was the final innovation required to create the financial product we would recognize as a modern credit card.

The Evolution of Credit Card Technology

At first, credit cards worked much like the earlier coins, medals, and plates: the merchant would simply make an imprint of the card, a process familiar to anyone who remembers how many credit card purchases were made up until the 1990s. By the 1980s, however, many cards featured a magnetic stripe on the back, which could be read by specialized computer equipment that was state-of-the-art at the time.

By today's standards, a magnetic stripe is considered primitive, as the information stored on it isn't even encrypted. Just as imprinting gave way to magnetic stripe readers, credit cards with embedded computer chips are now making magnetic stripes obsolete. These embedded computer chips, called EMV smart chips, allow for encrypted, two-way authentication between a merchant's credit card terminal and the payment processing network.

This technology dates back to the 1990s and has been widely adopted in Europe over the last 20 years. However, it's only in the last five years that the United States has undergone its migration to EMV-equipped cards and readers. The encrypted communications make these transactions far less vulnerable to hackers, and the embedded chips are much more difficult for criminals to counterfeit than simple magnetic stripes.

However, some industry experts suggest that the era of EMV smart chips may be relatively short, as wireless payment technologies are rapidly being integrated into smartphones, watches, and other wearable platforms. Finally, many foresee a day when biometric authentication allows consumers to charge purchases using a fingerprint or retinal scan, without carrying any object that contains their account information.

We've come a long way from the days of using metal coins to make charges, and the cards in your wallet today may themselves be obsolete in the near future.