Credit cards, while omnipresent now, were not always widely used by consumers to make purchases. At one time the credit card was seen as a novel and trendy idea, with a limited number of cardholders who were in effect members of a special club. Now, credit cards are viewed as essential purchasing tools that nearly everyone must have, whether for status, transactional ease, or outright necessity. Many purchases, particularly those related to travel and lodging, require a credit card, and the overwhelming majority of internet vendors require one for purchases. In essence, it is nearly impossible not to have a credit card in the 21st century. The credit card has come a long way in its short history.
Though credit cards were slow to catch on after their creation in 1949, thirty years later Madison Avenue would be put to work to drive the expansion of Americans' use of the cards. There was a lot of money to be made by collecting fees for debt creation and debt service, and the largest banks wanted in on the action. Clever marketing campaigns led the public to believe it could access luxury items and vacations once thought to be out of reach, and fueled a growing desire among many Americans to live like the wealthy. People could purchase the 10-day Caribbean cruise or the expensive diamond ring once reserved for those at higher income levels. People began to feel as if they could live like royalty as credit card marketing created the illusion that debt was equal to wealth. Many seemed to care more about how high their credit line was than about how much debt they carried. As a result, credit cards were soon at the heart of a new materialist culture that had people of widely varying incomes and ages going into debt to fuel their desire for more stuff. That debt drove a lucrative credit card industry, which became even more lucrative for card issuers after favorable court rulings in 1978 and 1996, both discussed later in this article.

Americans' debt trajectory rose rather gradually from the 1940s through the 1970s, but began to escalate much more quickly in the 1980s as the "yuppie" came to prominence in American popular culture. Yuppies (young, upwardly mobile professionals) became iconic in the 1980s as credit cards made luxury products and services available to more people through the creation of debt. Yuppies were professionals in their 20s and 30s who found new wealth in the rising use of credit and in a rising stock market, which was seeing a large influx of cash due in part to the growing prominence of 401(k) plans and mutual funds, vehicles that opened the financial markets to the public at large for the first time. Additionally, those in the upper echelon of corporate management saw rising corporate profits and high-level bonuses, made possible by increasing worker productivity and the corresponding flattening of wages for mid-level, blue-collar, and non-professional workers. The rich were taking it all for themselves and letting the good times roll, and everyone who wasn't rich wanted to be rich, or at least to act the part.
Interestingly, the yuppie was an odd sort of counterweight to the young hippie of the 1960s and 70s: yuppies pursued money and status but in some ways adopted the socially liberal traits of the hippie. Hippies railed against the traditional, conservative, stuffy, and elitist financial and cultural "establishment," while yuppies, though young and socially open like hippies, became part of that financial establishment. They were into money and materialism, but were more open to, and less judgmental of, new and different social experiences than their conservative parents had been. Their passion for "things and flings" drove a cultural shift in the United States wherein it became ever more important to prove one's status to others. The proof came in the form of luxury cars, projection televisions, boats, remodeled kitchens, and extravagant vacations.
The 1980s brought a paradigm shift in American politics. The U.S. transformed itself into a country where the profit motive supplanted the public good. Profit-driven business had always been a trait of America's political and economic culture, but the profit motive of the 1980s appeared more individualistic, more personal, more pervasive, and more accepted than at any time before. America moved away from its traditional embrace of serving the public interest, and even farther from the communal ideals advanced by 1960s and 1970s progressives, and whole-heartedly embraced a vibrant consumer culture and all the trappings that came with spending. The rise of the consumer culture correlated directly with a decline in Americans' savings rate, which would put further strain on households some 30 years later.
Saving money used to be a prudent exercise valued by society as a whole. In the 1970s and 1980s, America did not have a chorus of financial pundits on television encouraging citizens to be consumers and speculative investors. The conventional wisdom of the time was to always set aside ten percent of one's income as savings. Prior to the rise of the consumer culture, Americans put a large share of their money in the bank, and did so quite proudly. The savings would provide financial stability in case of a catastrophe, money for the kids to go to college, money for a vacation, and an extra cushion "on top of a pension" for retirement. Americans, from generation to generation, were encouraged to save and did. When they had to pay for something, they paid with cash. Where cash was not sufficient, they took out loans based on installment credit, not revolving credit. By the 1990s, paying with cash seemed to have become a foreign concept as the value of credit card debt reached new heights, and in dramatic fashion.
In 2005, America's 164 million credit card holders charged $2 trillion to their credit cards, amounting to roughly $12,200 per card holder. This charging contributed to massive consumer debt, which rose more than sevenfold in 28 years, from $355 billion in 1980 to $2.6 trillion in 2008, while over the same period the savings rate fell to roughly one-seventh of its 1980 level.
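These figures are straightforward to sanity-check. The short Python sketch below simply reproduces the arithmetic from the numbers cited above; it is a back-of-envelope check, not a data source.

    # Back-of-envelope check of the figures cited above.
    charges_2005 = 2.0e12       # $2 trillion charged in 2005
    cardholders_2005 = 164e6    # 164 million credit card holders
    print(f"Per card holder, 2005: ${charges_2005 / cardholders_2005:,.0f}")
    # -> about $12,195, i.e., roughly $12,200 per card holder

    debt_1980 = 355e9           # $355 billion in consumer debt
    debt_2008 = 2.6e12          # $2.6 trillion in consumer debt
    print(f"Growth, 1980-2008: {debt_2008 / debt_1980:.1f}x")
    # -> about 7.3x, i.e., more than sevenfold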
Clearly, banks and other financial institutions that issued credit cards benefitted from the public spending frenzy and made, and continue to make, billions of dollars on the fees and interest paid on credit card debt.
These institutions enlisted the help of Madison Avenue advertising agencies to come up with ads that appealed to the new consumerist mentality that came to dominate American culture in the 80s, 90s, and early 2000s. Commercials telling viewers that their credit card was accepted "everywhere you want to be" became omnipresent, as did commercials set to popular music such as the rock band Queen's "I Want It All," promising that if you "want it all" and "want it now," you could get what you want merely by swiping your credit card. These commercials broadly appealed to the new consumer mentality. It could be said that they appealed to the id that Sigmund Freud defined in his psychoanalytic theory. The id acts according to what Freud termed the pleasure principle, seeking immediate gratification of psychological needs without accounting for reason or reality. Americans were all too ready to be governed by the pleasure principle, because parents and society at large had created an environment that was safe for a narcissistic, greedy, and self-serving new generation of young adults. Sadly for everyone, this culture contributed to America's economic downfall, and the stories of financial strife that have begun to emerge are shocking.
With the rise of the consumer culture came depressing stories of people falling on hard times and becoming debt slaves to the credit card companies. Some consumers got so far into debt that they lost the ability to pay even the minimum required monthly payment. They lost their jobs, experienced a medical emergency, had to support other family members, or lost their homes, and soon fell into a debt spiral of despair. Credit card companies saw these people as great risks to their revenue streams and began to increase late fees and penalties for those carrying a balance over their credit line. They also raised interest rates, in some cases to two, three, or four times their previous levels. This only worsened struggling card holders' ability to make payments and contributed to an increase in personal bankruptcy filings. To keep bank share values as high as possible for large and wealthy investors, the banks had to find another source of revenue. They turned their sights on customers in good standing, who regularly paid their balances in full or made timely monthly payments. These best customers saw their interest rates rise, their credit limits fall, and their creditors impose harsh terms for submitting payment even an hour late. In effect, customers who paid their bills on time, known as "deadbeats" in industry parlance, were going to subsidize those who fell behind and couldn't make payments.
It is critical to understand the history and evolution of the credit card to appreciate fully where American households now stand. The world's first credit card was invented in New York in 1949, when Frank X. McNamara of the Hamilton Credit Corporation created the Diners Club card after forgetting to bring his wallet to a Manhattan restaurant. McNamara figured that he could create a card that would eliminate the need for diners to carry cash. The result was a cardboard, wallet-sized card that members paid an annual fee to carry and use at member restaurants and nightclubs in Manhattan. McNamara sold restaurants on the idea by explaining that it would increase their repeat business. Within a couple of years, there were 20,000 Diners Club members. Still, credit cards were not used in significant numbers until roughly ten years after their creation, with the introduction of the American Express card. Notably, these early cards were fee-based and did not allow the card holder to carry a balance.
Prior to the advent of credit cards, people either paid with cash or took out loans to fund their consumption habits. Those old enough to remember the 1970s will undoubtedly remember the prominence of the "lay-away" programs that many large stores offered. Lay-away programs allowed families to go into a department store and make a down payment on a particular product they wanted to buy. The buyer could not take the product home that day, but did lock in the product's price that day. The store would hold the product in its storage area until the buyer's payments covered the full price; only then could the buyer take the product home. Lay-away programs were extremely popular with low- and middle-income families and were widely used to make purchases for the Christmas holiday.
After credit cards came into wider use in the 1960s, the issuing companies figured that they could expand their profits by increasing the number of credit card holders. To expand their business, credit card companies began to mass-mail credit cards to households across the country in hopes that consumers would take to the new cards and generate new fees for the issuers. Credit card terms were new to the public, and Congress wanted to protect uninformed consumers from unfair billing and credit practices. The first piece of landmark credit card legislation was passed in 1968 as the Consumer Credit Protection Act, also known as the Truth in Lending Act. While the legislation forced issuing companies to clearly disclose the terms of credit, it did not bar mass mailings of credit cards. Unfortunately for consumers and credit card companies alike, fraud became widespread as many mass-mailed cards were intercepted by criminals who made unauthorized charges on them. Consumer complaints and disputed charges eventually led Congress to pass the Fair Credit Billing Act of 1974, the second major piece of credit legislation. This legislation, in part, required credit card companies to acknowledge consumer complaints and to correct errors within 90 days of a complaint being made. At the time, it was a major step forward in consumer protection law.

Eventually, the practice of issuing revolving credit was introduced. Whereas credit card companies had previously made their profits from charging fees, they were now also charging interest on outstanding debt. The combination of annual fees and interest led to ever higher profits and the rise of the titans of the credit card industry: Visa and MasterCard. Both companies benefitted from new technology. For Visa, it was the creation of the electronic authorization system in 1973; for MasterCard, the security hologram designed to prevent fraud.
The idea of revolving credit would eventually lead to consumer complaints, court battles, and additional reforms. The average interest rate charged by credit-issuing institutions in 1974 was 17.20%, a rate considered usurious in many states. Citizens challenged banks that charged in excess of their state's usury caps, and banks challenged the usury laws themselves. A landmark decision, the Marquette decision, was issued by the Supreme Court in 1978, effectively barring states from applying usury laws to nationally chartered banks that issued credit cards. National banks were subject to federal, not state, regulation; hence state usury laws did not apply, except for the usury law of the state in which the card-issuing bank was located. Since South Dakota had eliminated its usury laws, freeing banks to charge whatever interest they saw fit, many banks relocated their operations there. Average credit card rates would remain in the 17% to 19% range until they began a steady decline in 1991. Rates continued to fall, due to increased competition and decreased costs, until 2004, when they began to rise once again.

Credit card companies scored another court victory in 1996 in the Smiley case. In Smiley, the plaintiff argued that credit card late fees, in that case amounting to fifteen dollars, violated California state law. Citibank successfully argued that the fees were lawful under the National Bank Act. The Act's primacy over state law, combined with the Office of the Comptroller of the Currency's administrative determination that "interest" includes late fees and penalties, meant that nationally chartered banks could set late fees as high as they deemed necessary without worrying about interference from the states. Soon thereafter, many late fees more than doubled, as did actual interest rates for consumers who made late payments. Some late fees went as high as $155. The banks, some would say, were operating with impunity, and the federal government backed their interests to the detriment of the public interest.
With the door to stringent penalties wide open, credit card issuers came up with more creative ways to increase fees, penalties, and interest rates. One of the new methods was universal default, perhaps the most creative and unforgiving device ever designed to extract money from a card holder. Universal default terms give an issuer the right to raise a card holder's interest rate if the card holder is late paying any bill of whatever sort to any creditor. By early 2004, forty percent of banks had added universal default clauses to their terms and conditions.

These legal developments, while creating a groundswell of anti-credit-card sentiment among consumer advocacy groups, did not turn many heads among the public at large until the bursting of the credit bubble and the wholesale collapse of the economy in 2007 and 2008. Even then, not much attention was paid to the credit card problem until late in 2008, as most of the government and the media were focused on the subprime mortgage problem and rising unemployment. Growing credit card defaults, bankruptcies, and other credit-related horror stories finally made the news, and Congress and the president seemed determined to take action.
The Credit Card Accountability Responsibility and Disclosure Act of 2009 was signed into law on May 22, 2009, with its first two provisions taking effect on August 20, 2009. Those provisions immediately made important changes to the way credit card issuers conduct their business. First, issuers now have to provide 45 days' notice of any significant change in credit card terms, including an increase in interest rates. The 45-day period extends the previous 15-day notice period and also gives the card holder the option to decline the rate increase, pay off the current balance in full at the current rate, and use the 45 days to find another card. The other significant change was the requirement that issuers mail billing statements 21 days prior to the due date instead of 14. The new law also prescribed further restrictions to take effect in February 2010, prompting several issuers to immediately raise interest rates, cancel accounts, and cut credit limits on their customer base, a move criticized by several consumer advocates. In fact, average credit card rates have already begun to rise, with the average variable rate increasing from 10.69% in April 2009 to 11.22% in August 2009.
Consumers will find additional relief now that the February provisions have gone into effect. A White House Fact Sheet states that key elements of the new law include: bans on retroactive rate increases for existing balances due to "any time, any reason" clauses or universal default; an end to "late fee traps," which include weekend payment deadlines, due dates that change each month, and deadlines that fall in the middle of the day; and the enforcement of fair interest calculation, meaning that credit card companies (1) must apply payments in excess of the minimum to the highest-interest balance first (illustrated in the sketch at the end of this article), and (2) may not use the balance of a previous month to calculate the interest charges for the current month, a practice called "double-cycle billing." Additionally, institutions will have to get a consumer's permission to process payments that would push charges beyond the consumer's credit limit.

All of these new provisions should benefit the average card holder and act as a counterweight to the nearly 60-year national trend of industry-favorable legislation and court decisions. Still, the public should not assume that card issuers will not look for ways to work around the new legal restrictions. One loophole already identified pertains to the rule barring issuers from raising interest rates on existing balances: if a card carries an introductory promotional rate or a variable rate, the rule does not apply. You can bet that many issuers will be moving consumers into variable-rate accounts; fixed-rate accounts may disappear altogether. Several other loopholes will undoubtedly be exploited in the months and years ahead. The best armament a consumer can have is knowledge, yet the consumer must be willing and able to put that knowledge to good use.
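To make the fair-allocation rule concrete, here is a minimal sketch in Python. It illustrates the rule as described in the Fact Sheet, not any issuer's actual system; the balance buckets, APRs, and payment amounts are hypothetical, and the sketch allocates only the portion of a payment above the minimum, since that is the portion the law governs.

    # Hypothetical illustration of the CARD Act allocation rule:
    # any payment above the minimum must be applied to the
    # highest-APR balance first. All figures below are made up.
    def allocate_excess(payment, minimum, balances):
        """balances maps bucket name -> [balance, apr]; mutated in place."""
        excess = max(payment - minimum, 0.0)
        # Work through the balance buckets from highest APR to lowest.
        for name in sorted(balances, key=lambda n: -balances[n][1]):
            paid = min(excess, balances[name][0])
            balances[name][0] -= paid
            excess -= paid
            if excess <= 0:
                break
        return balances

    balances = {"cash_advance": [500.0, 0.2499],
                "purchases":    [1500.0, 0.1499],
                "promo":        [1000.0, 0.0299]}
    print(allocate_excess(payment=600.0, minimum=50.0, balances=balances))
    # The $550 excess retires the 24.99% cash-advance balance first,
    # then trims the 14.99% purchase balance by the remaining $50.

Under the old regime, an issuer could have applied that same $550 to the 2.99% promotional balance instead, leaving the most expensive debt to keep accruing interest, which is precisely the practice the new rule ends.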