
By: Mike Horrocks

I was raised in an underbanked home! I have known this for a long time, but it feels great to say it and be proud of it. I was raised in Neola, Utah, a small cattle-ranching community of, at the time, 500 or so people. I don't recall ever feeling poor or on the edge financially as a kid; in fact, it was quite the opposite. When I was a freshman in college, I got my own bank accounts and my first major credit card, one that gave cash back. It all just seemed normal. I recall showing my dad my newfound financial life. The concept of getting cash back for purchases was something he wanted in on. He made the call to get his own card, and within minutes the representative on the other end of the call asked if I was willing to co-sign for my dad because he did not have a thick enough credit file. At this point in my dad's life, he had developed and sold a couple of businesses and had bought and managed a successful Angus cattle ranch, but he had done most of it in cash, so he was "off the grid." When I co-signed for my dad, it hit me: in terms of the banking system I was studying at the university, we were an underbanked family!

So what are the lessons here for a banker today?

Underbanked does not equal poor. I never felt poor, the family business was going great, and my dad was always able to meet any obligation or need for the ranch and us kids.

Bankers need to know their customers. When my dad did need access to more capital, there was a great banker at Zions Bank who knew my dad and stood by him even though the traditional file was thin. So know your customers by all means available (traditional credit, alternative data, etc.).

Don't forget the family. By this I mean the associated products and what they can mean to the overall customer picture and relationship. Know what risks and opportunities are there as you try to optimize the relationship.
If you want to read some more, American Banker just published a great set of articles on reaching underbanked consumers and how to retain them; it is worth a quick review. Don't let great customers like my dad slip through your business development net. Attract them, nurture them and build great relationships with them.

Published: February 25, 2015 by Guest Contributor

By: Barbara Rivera

Every day, 2.5 quintillion bytes of data are created – in fact, 90% of the world's data was created in only the last few years. With the staggering amount of data available, we have an unprecedented opportunity to uncover new insights and improve the way our world functions. The implications of these new capabilities are perhaps nowhere as crucial as within our government. Public sector officials carry the great responsibility of conducting complex missions that directly affect our communities, our economy, and our nation's future. The ability to make more informed, insightful choices and better decisions is paramount. Especially at a time of broader global unrest and uncertainty, Americans rely on our government to be transparent, fair, ready and able to make the right decisions – our trust is in the hands of our elected officials and public servants. Data alone is not enough to inform and effect change. However, with integrated information assets, insightful analysts and collaborative processes, data can be transformed into something meaningful and actionable. Our government has already begun leveraging data for good across agencies and varied missions, with more potential unlocked each day. Local governments like Orange County, California are utilizing data through address verification services to keep their voting lists accurate – ensuring the integrity of elections and saving taxpayers thousands of dollars otherwise wasted on mailings to outdated lists. The Orange County Registrar of Voters – the fifth largest voting jurisdiction in the country – has been able to cancel 40,000 voting records, with an estimated savings of $94,000 expected from 2012 through 2016.
The examples are numerous and growing:

A suite of optimization tools helps states find non-custodial parents, determine their capacity and likelihood to pay child support, and trigger alerts with new critical information – maximizing the likelihood of payment and recovery and ultimately improving the welfare of children and reducing poverty.

More than 150 state, county and local law enforcement agencies leverage data to help identify persons of interest, conduct background screening for employees and contractors and provide financial backgrounds for criminal investigations, ensuring our continued safety.

By using the power of data to manage user authentication, credentials and access controls, the government is working harder – and smarter – to protect our security.

The government is leveraging verified commercial data to help agencies validate the fiscal responsibility of potential contractors and monitor existing contractors, which helps provide transparency and reduce risk.

By using data and analytics to authenticate applicants and validate financial data, the government is ensuring access to benefits for those who meet eligibility requirements, while at the same time reducing fraud.

Private sector partners are supporting municipal efforts to improve financial stability in households by providing the current credit standing of consumers and monitoring overall changes in financial behaviors over time, to help counsel and educate citizens.

And that's only the beginning. The possibilities are endless – from healthcare to finance to energy, data can be leveraged for the advancement of our society. It even happens behind the scenes, working to protect information in ways most citizens never realize. Data insights are used to ensure citizens have secure online access to their information – ever see those randomized, personal questions? That's data at work. The same technology is the de facto ID proofing standard for the VA and CMS. How does it all work?
By combing through the data carefully, putting it in context, looking at it in new ways, and thinking about what all this information really means. Much of this is made possible through public-private partnerships between the government and companies like Experian. So the next time someone complains about the slow pace of government, let them know the truth: government is moving quickly, leveraging data and private sector partnerships to uncover new insights that impact the greater good.

Published: February 25, 2015 by Guest Contributor

The evolution of identity verification

Knowing who you are doing business with isn't just a sound business practice to protect your bottom line; in many cases, it also is a legal requirement. Identity verification techniques have been evolving over the past few years to meet business priorities beyond fraud prevention, including customer experience, operational costs and regulatory compliance. We recently wrote about the challenges of customer authentication on mobile devices in meeting new business priorities. Fraud prevention tools have responded to these shifting priorities. While extremely fast and very accurate at detecting fraud, they also:

Are less invasive to customers

Provide a strong return on investment

Ensure consistency in compliance and audit

Listen to what Matt Ehrlich, Experian fraud and identity director of product management, has to say about how verification techniques have changed. Then download our fraud prevention perspective paper to gain more insight on how you can prepare your business.

Published: February 17, 2015 by Guest Contributor

The news of the latest breach last week reported that tens of millions of customer and employee records were stolen by a sophisticated hacker incursion. The data lost is reported to include names, birth dates, Social Security numbers, and addresses. The nature of the stolen data has the potential to create long-term headaches for the organization and tens of millions of individuals. Unlike a retailer or financial breach, where stolen payment cards can be deactivated and new ones issued, the theft of permanent identity information is, well, not easily corrected. You can’t simply reissue Social Security numbers, birth dates, names and addresses. What’s more, the data likely includes identity data on millions of dependent minors, who are prime targets for identity thieves and whose credit goes frequently unmonitored. According to the Identity Theft Resource Center’s 2014 Data Breach Report, a record 783 breaches, representing 85 million records, occurred from January through September 2014 alone. The breaches have ranged across virtually every industry segment and data type. So where does all this breached data go? It goes into the massive, global underground marketplace for stolen data, where it’s bought and sold, and then used by cybercriminals and fraudsters to defraud organizations and individuals. Like any market, supply and demand determines price, and the massive quantity of recent breaches has made stolen identities more affordable to more fraudsters, exacerbating the overall problem. In fact, stolen health credentials can go for $10 each, about 10 or 20 times the value of a U.S. credit card number, according to Don Jackson, director of threat intelligence at PhishLabs, a cyber crime protection company. The big question: So what now? The answer: Assume that all data has been breached, and act accordingly. Such a statement sounds a bit trivial, but it’s a significant paradigm shift. 
It's a clear-headed recognition of the implications of the ongoing, escalating covert war between cybercriminals and fraudsters on one side and organizations and consumers on the other. For individuals, we need to internalize this fact: our data has likely been breached, and we need to become vigilant and defend ourselves. Sign up for a credit monitoring service that covers all three credit bureaus to be alerted if your data or identity is being used in ways that indicate fraud. Include your children as well: a child's identity is far more valuable to a fraudster, who knows it can be several years before the theft is detected, since many parents do not check their child's credit regularly, if at all. For organizations, it's a war on two fronts: data protection and fraud prevention. And the stakes are huge, bigger than many of us recognize. We're not just fighting to prevent financial theft; we're fighting to preserve trust — trust between organizations and consumers at the first level, and ultimately widespread consumer trust in the institutions of finance, commerce and government. We must collectively strive to win the war on data protection, no doubt, and prevent future data breaches. But what breaches illustrate is that, when fundamental identity data is breached, a terrible burden is placed on the second line of defense — fraud prevention. Simply put, organizations must continually evolve their fraud prevention controls and skills and minimize the damage caused by stolen identity data. And we must do it in ways that reinforce the trust between consumers and organizations, enhance the customer experience, and frustrate the criminals. At 41st Parameter, we are at the front lines of fraud prevention every day, and what we see are risks throughout the ecosystem. Account opening is a particular vulnerability, as consumer identity data obtained in the underground will undoubtedly be used to open lines of credit, submit fraudulent tax returns, etc.
unbeknownst to the consumer. Since so much data has been breached, many of these new accounts will look "clean," presenting a major challenge for traditional identity-based fraud and compliance solutions. But it's more than new accounts — account takeover, transactions, loyalty, every stage is in jeopardy now that so much identity data is on the loose. Even the call center is vulnerable, as the very basis for caller authentication often relies on components of identity. At 41st Parameter and Experian Fraud & Identity solutions, we advocate a comprehensive layered approach that leverages multiple solutions such as FraudNet, Precise ID, KIQ, and credit data to protect all aspects of the customer journey while ensuring a seamless, positive user experience across channels and lines of business. Read our fraud perspective paper to learn more. Now is the time to take action. (Source: http://www.reuters.com/article/2014/09/24/us-cybersecurity-hospitals-idUSKCN0HJ21I20140924)

Published: February 11, 2015 by Guest Contributor

By: Scott Rhode

This is the second of a three-part blog series on the residential solar market, looking at: 1) the history of solar technology, 2) current trends and financing mechanisms, and finally 3) overcoming market and regulatory challenges with Experian's help. Let's discuss the current trends in solar and, more importantly, the mechanisms used to finance solar in the US residential market. As I discussed in the last blog, the growth in this space has been astronomical. To illustrate this growth, a recent article in The Washington Post by Matt McFarland highlighted that solar-related jobs are significantly outpacing the rest of the labor market in year-over-year growth. The article states that since 2010 the number of solar-related jobs in the US has doubled, bringing the total number of jobs in this industry to 173,807. While this is still small in comparison to other sectors of our economy, it underscores how much growth has occurred in a short amount of time.

So what is driving this explosive growth? There are a few factors to consider; however, in the residential solar market, financing is the main catalyst. As you might expect, there are a variety of financial products in the market, giving the consumer lots of choices. First, there are traditional loans like home improvement loans, home equity loans, or energy efficiency loans offered by a bank, credit union, or specialty finance company. For homeowners who choose not to secure their loan against their property, there are a variety of specialty lenders that will offer long-term, unsecured loans that only file a UCC against the panels themselves.
For these types of offerings, some specialty lenders will even have special credit plans built around the 30% Solar Investment Tax Credit (ITC): the homeowner gets a deferred-interest plan with the expectation that once they receive the tax credit from the federal government, they will pay off the special plan and all of the deferred interest will be waived. If the customer does not pay in full, the plan rolls into their regular loan plan and the customer has a higher cost of financing.

Second, there is a lease product offering zero to little down and a monthly payment that is less than the savings the homeowner will see on their utility bill. Of all the financing options, the lease has been the biggest driver of growth, since it offers an inexpensive, no-hassle way to get all the benefits of going solar without breaking the bank. What seems unusual to most people unfamiliar with this concept is the term of the lease, which is typically 20 years. However, when you consider that most manufacturers warrant their panels for 25 years and many have a usable life of 40 years, this term does not seem all that unusual. The benefits of this program look something like this:

The homeowner has an average electric utility bill of $350/month.

A solar company quotes the customer savings of $200/month in the form of a net metering energy credit, so their bill after solar is now $150/month.

The lease payment for the installed solar array, metering equipment, and monitoring software is $150/month.

The homeowner's net savings is an average of $50/month with nothing out of pocket.

Over the life of the lease, energy prices will increase, which means more savings over time so long as escalators in the contract do not exceed the increase in energy prices.

The lessor "owns" the equipment and is responsible for maintenance, performance, and insurance.

With this product comes complexity.
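The monthly lease arithmetic above can be sketched as a quick calculation. This is an illustrative sketch only, using the figures from the example; the function name is our own, not an industry tool:

```python
def lease_savings(utility_bill, net_metering_credit, lease_payment):
    """Monthly economics of a residential solar lease (illustrative sketch)."""
    bill_after_solar = utility_bill - net_metering_credit  # net-metering credit lowers the bill
    net_savings = utility_bill - (bill_after_solar + lease_payment)
    return bill_after_solar, net_savings

# Figures from the example: $350 average bill, $200 credit, $150 lease payment
bill_after, savings = lease_savings(350, 200, 150)
print(bill_after, savings)  # 150 50
```

If utility rates rise faster than any contractual escalator, the same arithmetic yields a growing monthly saving over the 20-year term.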
Many companies offering this program do not have the cash or the appetite to take on massive debt, so they partner with Tax Equity investors to make this transaction possible. Because of the 30% ITC and accelerated depreciation, this transaction is very favorable for a Tax Equity investor like Google, US Bank, or Bank of America Merrill Lynch. There are a number of structures they can use; however, the sale-leaseback structure is the easiest and most efficient way to fund the transaction. While this is not "known" to the end customer, it is important because the Tax Equity investor effectively owns the asset and has the final say in setting credit policy. This transaction does require that the developer have a stake as well; however, many developers go to the debt market for "back leverage" on their stake so that they can reduce the impact to their balance sheet. This complexity carries a cost, as the cost of capital is higher than for most traditional loan products from established financial services firms. That said, the fact that the lease allows the customer to monetize the tax credit and accelerated depreciation in the amount financed balances out the higher cost of capital. In the next blog we will touch more on the challenges this product, in particular, has in the market.

Last, but not least, there is another mechanism gaining popularity in the market: community solar. One of the obstacles of the lease and Tax Equity arrangement is that the lease is available only to single-family-residence homeowners and, if they have multiple homes, only for the homeowner's primary residence. That means that people who rent, own a condo, own a vacation home, or own a small business do not qualify for this type of lease. As a result, community solar has become a great option. With community solar, the panels are put in an ideal location for maximum exposure to the sun, and they often produce 10-15% more power than panels on a rooftop.
Portions of this solar farm can be sold, rented, or sublet to consumers regardless of their living situation. As the panels produce electricity, that power is sold to the local utility, and the customer receives the value as a credit on their next utility bill. In this structure, the customer is not required to put money down in most cases, and they sign up for a specific term. Like a rooftop lease, this structure often has a Tax Equity investor that funds the project. Again, this allows the investor to take the 30% ITC and accelerated depreciation, which, in turn, gets monetized and lowers the cost of construction. In the final installment of this blog series, I will discuss some of the challenges this market faces as the ITC expiration date approaches and the market matures. Leasing is driving the market, so if the ITC is not renewed, the market will need a plan to find other innovative ways to keep solar affordable so more consumers can realize the benefits of going solar.

Solar Financing — The current and future catalyst behind the booming residential solar market (Part 1)

Published: February 9, 2015 by Guest Contributor

Customer experience strategies for success

Sometimes it's easier to describe something as the opposite of something else. Being "anti-" something can communicate something meaningful. Cultural movements in the past have taken on these monikers: consider the "anti-establishment" or "anti-war" movements. We all need effective anti-virus protection. And there are loads of skin products marketed as "anti-aging," "anti-wrinkle," or "anti-blemish." But when you think about a vision for the customer experience that your company aspires to deliver, this "anti-X" approach falls flat. Would you want to aspire to basically "not stink"? Would that inspire you and your team to run through walls to deliver on that grand aspiration? Would it motivate customers to stick with you, buy more of what you sell, and tell others about you? I think not… but it sure seems like many out there do indeed aspire to "not stink." Sure, there are great companies out there who have set a high standard for customer experience, placing it at the center of their strategies and their success. Some, like Zappos, started that way from the beginning. Others, like The Ritz-Carlton, realized that they had lost their way and made the commitment to do the hard work of reaching and sustaining excellence. On the other hand, there are hundreds of firms with a weak commitment to, or even understanding of, the importance of customer experience to their strategy and performance. Their leaders may give lip service or pay attention only for a few days or hours following the release of reports from leading analysts and firms. They may have posters and slogans that talk about putting the customer first, or similar platitudes. These companies probably even have talented and passionate professionals working tirelessly to improve the customer experience in spite of the fact that nobody seems to care much. What these firms lack is a clear customer experience strategy.
As nature abhors a vacuum, customers and employees are free to infer or just guess at it. Focusing on customer experience only when a report comes out – and paying special attention only when weak results put the firm near the bottom of the ranking – leads people to conclude that all that really matters is to "not stink." In other words, don't stand out for being bad… but don't worry much about being good, as it is not important to the company's strategy or results. I think this implicit "don't stink" strategy helps explain a fascinating insight from a Forrester survey in 2013: "80% of executives believe their company is delivering a superior customer experience, yet in 2013 only 8% of companies surveyed received a top grade from their customers." Many leaders simply have not invested the energy and commitment necessary to define a real customer experience vision that reflects a deep understanding of the role it plays in the company's strategy. Beyond setting that vision, a big and sustained commitment is required to deliver on it, measure results, and continuously adjust as customer needs evolve. Like all journeys, a great customer experience starts with one step. Establishing a customer experience strategy is the first one – and "don't stink" simply stinks as a strategy. Download our recent perspective paper to learn how exceptional customer experience can give companies the competitive edge they need in a market where price, products and services can no longer be differentiators.

Published: January 27, 2015 by Guest Contributor

By: Mike Horrocks

Managing your portfolio can be a long and arduous process that ties up your internal resources, but without it there is an increase in risk and potential losses. The key is to use loan automation to pull together data in a meaningful manner and go from a reactive to a proactive process that can:

Address the overall risks and opportunities within your loan portfolio

Get a complete view of the credit and operational risk associated with a credit relationship or portfolio segment

Monitor and track actionable steps by leveraging both internal and external data

Watch how to avoid the five most common mistakes in loan portfolio management to help you reduce overall risk and also identify cross-sell and upsell opportunities. With a more automated process, your lending staff can focus on bringing in new business rather than reacting to delinquencies and following up on credit issues.

Published: January 22, 2015 by Guest Contributor

There are two sides to every coin, and in banking the question is often: do you want to chase the depositor of that coin, or lend it out? The Federal Reserve's decision to hold interest rates at record lows since the economic downturn gave U.S. banks' loan portfolios a nice boost from 2010-2011, but the subsequent actions and banking environment resulted in deposit growth outpacing loans – leading to a marked reduction in loan-to-deposit ratios across banks since 2011. In fact, there is currently almost $1.30 in deposits for every $1.00 in loans outstanding. This, in turn, has manifested itself as a reduction in net interest margins for all U.S. banks over the last three years – a situation unlikely to improve until the Fed hikes interest rates. Additionally, banks have found that while they are now holding on to more of these deposits, additional regulations – the CFPB looking to evaluate account origination processes, Basel III liquidity requirements, CCAR, and CIP and KYC obligations – have all made holding these deposits more costly. In fact, the CFPB suggests four items it believes will improve financial institutions' checking account screening policies and practices:

Increase the accuracy of data used from CRAs

Identify how institutions can incorporate risk screening tools while not excluding potential accountholders unnecessarily

Ensure consumers are aware and notified of information used to decision the account opening process

Ensure consumers are informed of what account options exist and how they can access products that align with their individual needs

Lastly, to add to this already challenging environment, technology has switched the channel of choice to the smartphone and introduced a barrage of risks associated with identity authentication – as well as operational opportunities.
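The deposit figure cited above translates into a loan-to-deposit ratio of roughly 77 percent. As a minimal sketch (the numbers are the illustrative ones from the text, not reported bank totals):

```python
def loan_to_deposit_ratio(total_loans, total_deposits):
    """Loan-to-deposit (LTD) ratio: the share of deposits put to work as loans."""
    return total_loans / total_deposits

# Almost $1.30 in deposits for every $1.00 of loans outstanding
ratio = loan_to_deposit_ratio(1.00, 1.30)
print(f"{ratio:.0%}")  # 77%
```

A falling ratio means a growing share of deposits sits unlent, which is exactly the squeeze on net interest margins the post describes.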
As leaders in retail banking who address the needs of your customers, I would like to extend an invitation on behalf of Experian for you to participate in our latest survey on the changing landscape of DDA opportunities. How are regulations changing your product set, what role does mobile play now and in the future, and what are your top priorities for 2015 and beyond? These are just a few of the insights we would like to gain from experts such as you. To access our survey, please click here. Our brief survey should take no more than seven minutes to complete, and your insights will be highly valued as we look to better support you and your organization's demand deposit product needs. Our survey period will close in three weeks, so please respond now. As a sign of our appreciation for your insights, we will send all participants an anonymized aggregation of the responses so that you can see how others view the retail banking marketplace. So take advantage of this chance to learn from your peers, participate in this industry study, and don't leave your strategy to the flip of a coin.

Published: January 20, 2015 by Guest Contributor

This is the second post in a three-part series. Imagine the circumstances of a traveler coming to a never-before-visited culture. The opportunity is the new sights, cuisine and cultural experiences. Among the risks are previously unencountered pathogens and the strength of the overall health services infrastructure. In a similar vein, all too frequently we see the following conflict within our client institutions. The internal demands of an ever-increasing competitive landscape drive businesses to seek more data; improved ease of accessibility and manipulation of data; and acceleration in creating new attributes supporting more complex analytic solutions. At the same time, requirements for good governance and heightened regulatory oversight are driving improved documentation and controlled access, all with improved monitoring and documented and tested controls. As always, the traveler/businessperson must respond to the environment, and the best medicine is to be well-informed of both the perils and the opportunities. The good news is that we have seen many institutions invest significantly in their audit and compliance functions over the past several years. This has provided the lender with both better insights into its current risk ecosystem and the improved skill set to continue to refine those insights. The opportunity is for the lender to leverage this new strength. For many lenders, this investment largely has been in response to broadening regulatory oversight to ensure there are proper protocols in place to confirm adherence to relevant rules and regulations and to identify issues of disparate impact. A list of the more high-profile regulations would include:

Equal Credit Opportunity Act (ECOA) — to facilitate enforcement of fair lending laws and enable communities, governmental entities and creditors to identify business and community development needs and opportunities of women-owned, minority-owned and small businesses.

Home Mortgage Disclosure Act (HMDA) — to require mortgage lenders to collect and report additional data fields.

Truth in Lending Act (TILA) — to prohibit abusive or unfair lending practices that promote disparities among consumers of equal creditworthiness but of different race, ethnicity, gender or age.

Consumer Financial Protection Bureau (CFPB) — evolving rules and regulations with a focus on perceptions of fairness and value through transparency and consumer education.

Gramm-Leach-Bliley Act (GLBA) — requires companies to give consumers privacy notices that explain the institutions' information-sharing practices. In turn, consumers have the right to limit some, but not all, sharing of their information.

Fair Debt Collection Practices Act (FDCPA) — provides guidelines for collection agencies that are seeking to collect legitimate debts while providing protections and remedies for debtors.

Recently, most lenders have focused their audit/compliance activities on the analytics, models and policies used to treat consumer/client accounts and relationships. This focus is understandable, since it is these analytics and models that are central to the portfolio performance forecasts and Comprehensive Capital Analysis and Review (CCAR)–mandated stress-test exercises that have received greater emphasis in responding to recent regulatory demands. Thus far at many lenders, this same rigor has not yet been applied to the data itself, which is the core component of these policies and frequently complex analytics. The strength of both the individual consumer–level treatments and the portfolio-level forecasts is negatively impacted if the underlying data is compromised. This data/attribute usage ecosystem demands clarity and consistency in attribute definition; extraction; and new attribute design, implementation to models and treatments, validation and audit.
When a lender determines there is a need to enhance its data governance infrastructure, Experian® is a resource to be considered. Experian has this data governance discipline within our corporate DNA — and for good reason. Experian receives large and small files on a daily basis from tens of thousands of data providers. To be sure the data is of high quality and does not contaminate the legacy data, rigorous audits of each file received are conducted, and detailed reports are generated on issues of quality and exceptions. This information is shared with the data provider in a cycle of continuous improvement. To further enhance the predictive insights of the data, Experian then develops new attributes and complex analytics leveraging the base and developed attributes for analytic tools. This data and the analytic tools are then utilized by thousands of authorized users/lenders, who manage broad-ranging relationships with millions of individuals and small businesses. These individuals and businesses in turn have the right to dispute errors, both perceived and actual, with Experian. This demanding cycle underscores the value of the data and of our rigorous data governance infrastructure. This very same process occurs at many lenders' sites. Certainly, a similar level of data integrity, born of a comprehensive data governance process, is warranted there as well. In the next and final blog in this series, we will explore how a disciplined business review of an institution's data governance process is conducted. Discover how a proven partner with rich experience in data governance, such as Experian, can provide the support your company needs to ensure a rigorous data governance ecosystem. Do more than comply. Succeed with an effective data governance program.
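The audit-and-exception cycle described above can be illustrated with a minimal sketch. The field names, sample file and function are hypothetical; real bureau file audits are far more extensive than a missing-field check:

```python
import csv
import io

def audit_data_file(csv_text, required_fields):
    """Minimal file-level quality audit: count rows and flag records with
    missing required fields, producing an exception report (illustrative only)."""
    rows, exceptions = 0, []
    for line_no, record in enumerate(csv.DictReader(io.StringIO(csv_text)), start=1):
        rows += 1
        missing = [f for f in required_fields if not (record.get(f) or "").strip()]
        if missing:
            exceptions.append({"line": line_no, "missing": missing})
    return {"rows": rows, "exceptions": exceptions}

# Hypothetical incoming provider file with one incomplete record
sample = "account_id,balance,status\nA1,100,open\nA2,,open\n"
report = audit_data_file(sample, ["account_id", "balance", "status"])
print(report["rows"], len(report["exceptions"]))  # 2 1
```

The exception report is the artifact that would go back to the data provider in the continuous-improvement loop the post describes.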

Published: December 18, 2014 by Guest Contributor

Opening a new consumer checking account in the 21st century should be simple and easy to understand as a customer, right? Unfortunately, not all banks have 21st-century systems or processes, reflecting the fact that negotiable order of withdrawal (NOW) accounts, or checking accounts, were introduced decades ago and often required the consumer to open the account in person. A lot has changed, and consumers demand simpler, more transparent account opening processes with product choices that match their needs at a price they’re willing to pay. Financial institutions that leverage modernized technology capabilities and relevant decision information have the best chance to deliver consumer-friendly experiences that meet consumer expectations. It is obvious to consumers when we in the financial services industry get it right and when we don’t. The process to open a checking account should be easily understood by consumers and should offer appropriate product choices that aren’t “one size fits all”. Banks with more advanced core-banking systems that incorporate relevant, compliant decision data and transparent, consumer-friendly approval processes have a huge opportunity to differentiate themselves from competitors. The reality is that bank deposit management organizations throughout the United States continue to evolve their checking account screening strategies, technology and processes. They do so to keep up with evolving expectations from consumer advocacy regulators such as the Consumer Financial Protection Bureau (CFPB) and to improve the transparency of checking account screening for a greater number of consumers.
The CFPB advocates that financial institutions adopt new checking account decision processes and procedures that maintain sound management practices for mitigating fraud and risk expense while improving consumer transparency and increasing access to basic consumer financial instruments. Bank shareholders demand that these accounts be extended to consumers profitably. The CFPB recognizes that checking accounts are a basic financial product used by almost all consumers, but it has expressed concern that checking account screening processes may prevent access for some consumers and may be too opaque about the reasons a consumer may be denied an account. The gap between the expectations of the CFPB and shareholders and bank deposit management organizations’ current products and procedures is not as wide as it may seem. Closing the gap requires a more holistic approach to checking account screening that uses 21st-century technology and decision capabilities; core banking technology and checking products developed decades ago leave banks struggling to enact much-needed improvements for consumers. The CFPB recognizes that many financial institutions rely on reports provided by specialty consumer reporting agencies (CRAs) to decide whether to approve new customers. These CRAs specialize in checking account screening and provide financial institutions with consumer information that helps determine whether a consumer should be approved. Information such as the consumer’s check-writing and account history, including closed accounts or bounced checks, is an important factor in determining eligibility for the new account. Financial institutions are also allowed to screen consumers for credit risk when deciding whether to open a consumer checking account, because many consumers opt in to overdraft functionality attached to the checking account.
Richard Cordray, the CFPB Director, clarified the regulatory agency’s position on how consumers are treated in checking account screening processes in his prepared remarks at a forum on this topic in October 2014: “The Consumer Bureau has three areas of concern. First, we are concerned about the information accuracy of these reports. Second, we are concerned about people’s ability to access these reports and dispute any incorrect information they may find. Third, we are concerned about the ways in which these reports are being used.” The CFPB suggests four items it believes will improve financial institutions’ checking account screening policies and practices:

1. Increase the accuracy of data used from CRAs.
2. Identify how institutions can incorporate risk screening tools without excluding potential accountholders unnecessarily.
3. Ensure consumers are aware of and notified about the information used to decision the account opening process.
4. Ensure consumers are informed of what account options exist and how to access products that align with their individual needs.

Implementing these steps shouldn’t be too difficult for deposit management organizations as long as they fully leverage software such as Experian’s PowerCurve customized for deposit account origination and relevant decision information such as Experian’s Precise ID platform and VantageScore® credit score, combined with consumer product offerings developed within the bank and offered in an environment that is real-time where possible and considers the consumer’s needs. Enhancing checking account screening procedures to take into account a consumer’s life stage, affordability, unique risk profile and financial needs will satisfy the expectations of consumers, regulators and financial institution shareholders.
Financial institutions that use technology and data wisely can reduce expenses for their organizations by efficiently managing fraud, risk and operating costs within the checking account screening process while also delighting consumers.  Regulatory agencies are often delighted when consumers are happy.  Shareholders are delighted when regulators and consumers are happy.  Reengineering checking account opening processes for the modern age results in a win-win-win for consumers, regulators and financial institutions. Discover how an Experian Global Consultant can help you with your banking deposit management needs.

Published: December 12, 2014 by Guest Contributor

Originally contributed by: Bill Britto Smart meters have made possible new services for customers, such as automated budget assistance and bill management tools, energy use notifications, and "smart pricing" and demand response programs. It is estimated that more than 50 million smart meters had been deployed as of July 2014. Utilities and customers alike are benefiting from these deployments. It is now obvious that the world of utilities is changing, and companies are beginning to cater more to their customers by offering them tools to keep their energy costs lower. For example, several companies offer prepay to customers who do not have bank accounts; for many of those "unbanked" customers, prepay may be the only way to sign up for utility service. Understanding the value of prospects and the need to automate decisions to achieve higher revenue and curb losses is imperative for the utility. This is where a decisioning solution like PowerCurve OnDemand can make a real difference for utility customers by providing modified decision strategies based on market dynamics and business and economic environments. Imagine what a best-in-class decision solution can do by identifying what matters most about consumers and businesses and by leveraging internal and external data assets to replace complexity with cost efficiency. Solutions like PowerCurve OnDemand deliver the power and speed to market to respond to changing customer demands, driving profitability and growing customer lifetime value — good for business and good for customers.

Published: November 22, 2014 by Aaron Czajka

A new co-marketing agreement for MainStreet Technologies’ (MST) Loan Loss Analyzer product with Experian Decision Analytics’ Baker Hill Advisor® product will provide the banking industry with a comprehensive, automated loan-management offering. The combined products provide banks greater confidence for loan management and loan-pricing calculations. Experian Decision Analytics’ Baker Hill Advisor product supports banks’ commercial and small-business loan operations comprehensively, from procuring new loans through collections. MST’s Loan Loss Analyzer streamlines the estimation and documentation of the Allowance for Loan and Lease Losses (ALLL), the bank’s most critical quarterly calculation. The MST product automates the most acute processes required of community bankers in managing their commercial and small-business loan portfolios. Both systems are data-driven, configurable and designed to accommodate existing bank processes. The products already effectively work together for community banks of varying asset sizes, adding efficiencies and accuracy while addressing today’s increasingly complex regulatory requirements. “Experian’s Baker Hill Advisor product-development priorities have always been driven by our user community. Changes in regulatory and accounting requirements have our clients looking for a sophisticated ALLL system. Working with MainStreet, we can refer our clients to an industry-leading ALLL platform,” said John Watts, Experian Decision Analytics director of product management. “The sharing of data between our organizations creates an environment where strategic ALLL calculations are more robust and tactical lending decisions can be made with more confidence. It provides clients a complete service at every point within the organization.” “Bankers, including many using our Loan Loss Analyzer, have used Experian’s Baker Hill® software to manage their commercial loan programs for more than three decades,” said Dalton T. Sirmans, CEO and MST president.
“Bankers who choose to implement Experian’s Baker Hill Advisor and the MST Loan Loss Analyzer will be automating their loan management, tracking, reporting and documentation in the most comprehensive, user-friendly and feature-rich manner available.” For more information on MainStreet Technologies, please visit http://www.mainstreet-tech.com/banking For more information on Baker Hill, visit http://ex.pn/BakerHill

Published: November 19, 2014 by Guest Contributor

By: John Robertson I began this blog series asking the question “How can banks offer such low rates?” and exploring the dynamics of loan pricing in a normalizing rate environment. I outlined a simplistic view of loan pricing as: Interest Income + Non-Interest Income − Cost of Funds − Non-Interest Expense − Risk Expense = Income before Tax. Along those lines, I outlined how perplexing it is to think that at some of these current levels banks could possibly make any money. I suggested these offerings must be loss leaders, offered in anticipation of more business in the future or perhaps of additional deposits to maintain a hold on the relationship over time. Or, I shudder to think, banks could be short funding the loans with the excess cash on their balance sheets. I did stumble across another possibility while proving out an old theory, which was very revealing. The old theory, stated by a professor many years ago, was “Margins will continue to narrow… forever.” We’ve certainly seen that in the consumer world. In pursuit of proof of this theory, I went to the trusty Uniform Bank Performance Report (UBPR) and looked at net interest margin results from 2011 until today for two peer groups: insured commercial banks from $300 million to $1 billion, and insured commercial banks greater than $3 billion. What I found was that margins have in fact narrowed anywhere from 10 to 20 basis points for those two groups during that span, even though non-interest expense stayed relatively flat. Not wanting to stop there, I looked at one of the biggest players individually and found an interesting difference in its C&I portfolio. Its non-interest expense was comparable to the others’, as was its cost of funds, but the swing component was non-interest income. One line item on the UBPR’s income statement is Overhead (i.e., non-interest expense) minus non-interest income (NII). This bank had a strategic advantage when pricing its loans due to its fee income generation capabilities.
They are looking not just at spread but at contribution as well, to ensure they meet their stated goals. So why do banks hesitate to ask for a fee when a customer wants a certain rate? Someone seems to have figured it out. Your thoughts?
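The simplistic loan-pricing view outlined at the start of this post can be sketched as a quick back-of-the-envelope calculation. All figures below are hypothetical and purely illustrative, not any bank's actual pricing model:

```python
# A minimal sketch of the loan-pricing identity described in this post.
# Income before Tax = Interest Income + Non-Interest Income
#                     - Cost of Funds - Non-Interest Expense - Risk Expense

def income_before_tax(interest_income, non_interest_income,
                      cost_of_funds, non_interest_expense, risk_expense):
    """Revenues minus funding, operating and risk costs."""
    return (interest_income + non_interest_income
            - cost_of_funds - non_interest_expense - risk_expense)

# Hypothetical $1,000,000 loan priced at 4.00% with a 0.50% fee,
# funded at 1.25%, with 1.50% operating and 0.40% risk expense.
loan = 1_000_000
ibt = income_before_tax(
    interest_income=loan * 0.0400,
    non_interest_income=loan * 0.0050,
    cost_of_funds=loan * 0.0125,
    non_interest_expense=loan * 0.0150,
    risk_expense=loan * 0.0040,
)
print(ibt)  # ~13500: the loan clears about 1.35% of principal pre-tax
```

Note how the fee (non-interest income) is the swing component: drop it to zero and the same loan earns roughly 0.85% of principal, which illustrates why the fee-generating bank in the UBPR comparison could undercut competitors on rate.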

Published: October 30, 2014 by Guest Contributor

More than 10 years ago, I spoke about a trend at the time toward underutilization of the information being managed by companies. I referred to this trend as “data skepticism.” Companies weren’t investing the time and resources needed to harvest the most valuable asset they had: data. Today the volume and variety of data are only increasing, as is the necessity to successfully analyze any relevant information to unlock its significant value. Big data can mean big opportunities for businesses and consumers. Businesses get a deeper understanding of their customers’ attitudes and preferences, making every interaction with them more relevant, secure and profitable. Consumers receive greater value through more personalized services from retailers, banks and other businesses. Recently, Experian North America CEO Craig Boundy wrote about that value, stating, “Data is Good… Analytics Make it Great.” The good we do with big data today in handling threats posed by fraudsters is the result of a risk-based approach that prevents fraud by combining data and analytics. Within Experian Decision Analytics, our data decisioning capabilities unlock that value to ultimately provide better products and services for consumers. The same expertise, accurate and broad-reaching data assets, targeted analytics, knowledge-based authentication, and predictive decisioning policies our clients use for risk-based decisioning have helped Experian become a global leader in fraud and identity solutions. The industrialization of fraud continues to grow, with an estimated 10,000 fraud rings in the U.S. alone and more than 2 billion unique records exposed as a result of data breaches in 2014. Experian continues to bring together new fraud platforms to help the industry better manage fraud risk. Our 41st Parameter technology has been able to detect over 90% of all fraud attacks against our clients and reduce their operational costs to fight fraud.
Combining data and analytics assets can detect fraud, but more importantly, it can also identify good customers so legitimate transactions are not blocked. Gartner reported that by 2020, 40% of enterprises will be storing information from security events to analyze and uncover unusual patterns. Big data uncovers remarkable insights that guide the future of our fraud prevention efforts and can also mitigate the financial losses associated with a breach. In the end, we need more data, not less, to keep up with fraudsters. Experian is hosting Future of Fraud and Identity events in New York and San Francisco to discuss current fraud trends and how to prevent cyber-attacks, aimed at helping the industry. The past skepticism no longer holds true: companies are realizing that data combined with advanced analytics can give them the insight they need to prevent fraud in the future. Learn more on how Experian is conquering the world of big data.
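The risk-based approach described above, combining multiple data signals into one decision so good customers pass without friction, can be illustrated with a highly simplified sketch. The signal names, weights and thresholds here are invented for illustration and do not reflect any Experian scoring model:

```python
# Hypothetical weighted-signal risk score; all weights and cutoffs invented.
SIGNAL_WEIGHTS = {
    "new_device": 30,           # first time this device is seen
    "geo_mismatch": 25,         # IP geolocation far from billing address
    "velocity_spike": 35,       # many attempts in a short window
    "known_good_history": -40,  # long, clean relationship lowers risk
}

def risk_score(signals):
    """Sum the weights of every signal observed on the transaction."""
    return sum(SIGNAL_WEIGHTS[s] for s in signals)

def decision(signals, review_at=30, decline_at=60):
    """Approve quietly, step up authentication, or decline by score band."""
    score = risk_score(signals)
    if score >= decline_at:
        return "decline"
    if score >= review_at:
        return "step-up"   # e.g. knowledge-based authentication
    return "approve"       # legitimate customers pass without friction

print(decision({"new_device", "known_good_history"}))              # approve
print(decision({"new_device", "geo_mismatch", "velocity_spike"}))  # decline
```

The point of the sketch is the asymmetry the post describes: a strong "known good" signal offsets a risky one, so a returning customer on a new phone is approved rather than blocked, while a pile-up of risk signals is declined outright.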

Published: October 21, 2014 by Guest Contributor

If rumors hold true, Apple Pay will launch in a week. Five of my last six posts have covered Apple’s likely and actual strategy in payments and commerce, and the rich tapestry of control, convenience, user experience, security and applied cryptography that constitutes the backdrop. What follows is a summation of my views, with a couple of observations from having seen the Apple Pay payment experience up close. About three years ago I published a similar commentary on Google Wallet that, for kicks, you can find here. I hope what follows is a balanced perspective, as I try to cut through some FUD, provide some commentary on the payment experience, and offer up some predictions that are worth the price you pay to read my blog. First, the criticism. Apple Pay doesn’t go far enough: Fair. But you seem to misunderstand Apple’s intentions here. Apple did not set out to make a mobile wallet. Apple Pay sits within Passbook, which is itself a wrapper of rewards and loyalty cards issued by third parties. Similarly, Apple Pay is a wrapper of payment cards issued by third parties. Even the branding disappears once you provision your cards: when you are at the point of sale and your iPhone 6 is in proximity to the reader (or enters the magnetic field created by the reader), the screen turns on and your default payment card is displayed. One does not need to launch an app or fiddle around with Apple Pay. And for that matter, it’s even more limited than you think. Apple’s choice to leave the Passbook-driven Apple Pay experience as threadbare as possible seems an intentional choice to force consumers to interact with their bank apps rather than Passbook for any rich interaction. In fact, the transaction detail displayed on the back of the payment card you use is limited, but you can launch the bank app to view and do a lot more.
Similarly, the bank app can prompt a transaction alert that the consumer can select to view more detail. Counter to what has been publicized, Apple can, if it chooses, view transaction detail including consumer info, but it retains only anonymized info on its servers. The contrast with Google is apparent: during the early Google Wallet days, issuers dangled the same anonymized transaction info to appease Google in return for participation in the wallet. If your tap doesn’t work, will you blame Apple? Some claim that any transaction failure, such as a non-working reader, will cause consumers to blame Apple. This does not hold water, simply because Apple does not get between the consumer, his chosen card and the merchant during payment. It provides the framework to trigger and communicate a payment credential and then quietly gets out of the way. This is where Google stumbled, by wanting to become the perennial fly on the wall. So if for whatever reason the transaction fails, the consumer sees no Apple branding at which to direct blame. (I draw a contrast later on below with Samsung and LoopPay.) Apple Pay is not secure: Laughable, and pure FUD. This article references a UBS note arguing that Apple Pay is insecure compared to a pure cloud-based solution such as the yet-to-be-launched MCX. This stems from a total misunderstanding of not just Apple Pay but the hardware/software platform it sits within (and I am not just talking about the benefits of a TouchID, network tokenization, issuer cryptogram, Secure Element–based approach), including the full weight of the security measures baked into iOS and the underlying hardware, which come together to offer the best container for payments. And against all that backdrop of applied cryptography, Apple still sought to overlay its payments approach on an existing framework.
So that, when it comes to risk, it leans away from the consumer and toward a bank that understands how to manage risk. That’s the biggest disparity between these two approaches, Apple Pay and MCX: Apple built a secure wrapper around the existing payments hierarchy, while the latter seeks to disrupt that status quo. Let the games begin: Consumers should get ready for an ad blitz from each of Apple Pay’s launch partners over the next few weeks. I expect we will also see these efforts concentrated around pockets of activation, because setting up Apple Pay is the next step after entering your Apple ID during activation. For that reason, each of those launch partners understands the importance of reminding consumers why their card should be top of mind. There is also a subtle but important difference between the top-of-wallet (or default) card for payment in Apple Pay and in its predecessors (Google Wallet, for example). Changing your default card in Google Wallet was an easy task, wholly encapsulated within the app. Whereas in Apple Pay, changing your default card is buried under Settings, and I suspect that once you choose your default card, you are unlikely to bother with it again. And here’s how quick the payment interaction is within Apple Pay (it takes under 3 seconds): bring your phone into proximity of the reader; the screen turns on; Passbook is triggered and your default card is displayed; you place your finger and authenticate using TouchID; a beep notes the transaction is completed. You can flip the card to view limited transaction detail. Yes, you could swipe down and choose another card to pay, but that’s unlikely. I remember how LevelUp used very much the same strategy to sign up banks, stating that over 90% of its customers never change their default card inside LevelUp. This will be a blatant land grab over the next few months as tens of millions of new iPhones are activated.
According to what Apple has told its launch partners, it expects over 95% of activations to add at least one card. What does this mean for banks that won’t be ready in 2014 or haven’t yet signed up? As I said before, there will be a long tail of reduced utility as we get into community banks and credit unions. The risk is amplified because Apple Pay is the only way to enable payments in iOS that uses Apple’s secure infrastructure and NFC. For those still debating whether it was a shotgun wedding, Apple’s approach had five main highlights that appealed to a bank:

1. Utilizing an approach that was bank-friendly (and friendly to the status quo): NFC.
2. Securing the transaction beyond the prerequisites of EMV contactless, via network tokenization and TouchID.
3. Apple’s preference to stay entirely an enabler, facilitating a secure container infrastructure to host bank-issued credentials.
4. Compressing the stack: further shortening the payment authorization required of the consumer by removing the need for PIN entry, and not introducing any new parties into the transaction flow that could have added delays, costs or complexity to the round trip.
5. A clear description of the costs to participate. Free is ambiguous; free leads to much angst as to what the true cost of participation really is (remember Google Wallet?). Banks prefer clarity here, even if it means 15 bps on credit.

As I wrote above, Apple opting to color strictly inside the lines forces the banks to shoulder much of the responsibility for the ‘before’ and ‘after’ of payment. Most of the bank partners will be updating or activating parts of their mobile apps to start interacting with Passbook and Apple Pay. Much of that interaction will use existing hooks into Passbook and provide richer transaction detail and context within the app.
This is an area of differentiation for the future, because banks that lack the investment, talent and commitment to build a redeeming mobile services approach will struggle to differentiate on retail footprint alone. And as smarter banks build entirely digital products for an entirely digital audience, the generic approaches will struggle, and I expect at some point this will drive bank consolidation at the low end. On the other hand, if you are an issuer, the ‘before’ and ‘after’ of payments that you are able to control, and the richer story you are able to weave, along with offline incentives, can aid in recapture. The conspicuous and continued absence of Google: So whither Android? Uniformity in payments for Android is as fragmented as the ecosystem itself. Android must now look to Apple for lessons in consistency. For example, consider how Apple uses the same payment credential stored in the Secure Element for both in-person retail transactions and in-app payments. It may look trivial, but when you consider that Apple came dangerously close (justifiably so) in its attempt to obtain rate-economics parity between those two payment scenarios from issuers, Android flailing around without a coherent strategy is inexcusable. I will say this again: Google Wallet requires a reboot. And word from within Google is that a reboot may not imply a singular or even a cohesive approach. Google needs to swallow its pride and converge the Android payments and commerce experience across channels, similar to iOS. Any delay or inaction risks growing apathy from merchants who must decide which platform is worth building for or focusing on. Risk vs. reward is already skewed in favor of iOS: Even if Apple was not convincing enough in its attempt to ask for card-present rates for its in-app transactions, it may have managed to shift liability to the issuer, similar to 3DS and VBV; that in itself poses an imbalance in favor of iOS.
For a retail app on iOS, there is now an incentive to utilize Apple Pay instead of all the other competing payment providers (PayPal, for example, or Google Wallet), because transactional risk shifts to the issuer if my consumer authenticates via TouchID and uses a card stored in Apple Pay. I now have both an incentive to prefer iOS over Android and an opportunity to compress my funnel: much of my imperative to collect data during the purchase was an attempt to quantify fraud risk, and the need for that goes out the window if the customer chooses Apple Pay. This is huge, and the repercussions go beyond Android, into CNP fraud, CRM and loyalty. Networks, tokens and new endpoints (e.g., LoopPay): The absence of uniformity in Android has provided a window of opportunity for others, regardless of how fragmented those approaches may be. Networks will soon parlay the success of tokenization in Apple Pay into Android as well. A prime example is LoopPay. If, as rumors go, Samsung bakes LoopPay into its flagship S6, and Visa’s investment translates into Loop using Visa tokenization, Loop may find the ubiquity it is looking for on both ends. I don’t necessarily see the value accrued to Samsung in launching a risky play here, specifically because of the impact of putting Loop’s circuitry within the S6: any transaction failure in this case will be attributed to Samsung, not to Loop, the merchant or the bank. That’s a risky move, and I hope a well-thought-out one. I have some thoughts on how the Visa tokenization approach may solve some of the challenges LoopPay faces on merchant EMV terminals, and I will share those later.
The return of the comeback: Reliance on networks for tokenization does allay some of the challenges faced by payment wrappers like Loop, Coin, etc., but they all focus on the last mile, and tokenization does little more for them than kick the can down the road, delaying the inevitable a little while longer. The ones that benefit most are the networks themselves, which now have wide acceptance of their tokenization service, with themselves firmly entrenched in the middle. The EMVCo tokenization standard made no assumptions about the role of a token service provider (TSP); in fact, issuers or third parties could each play the role sufficiently well. The networks, however, have left no room for ambiguity here. As TSPs, the networks have more to gain from legitimizing more endpoints than ever before, because these translate into more token traffic and, subsequently, incremental revenue: transactional fees plus additional managed services costs (the on-behalf-of, or OBO, service costs incurred by a card issuer or wallet provider). It has never been a better time to be a network. I must say, a whiplash effect for all of us who called for their demise with the Chase-VisaNet deal. So my predictions for Apple Pay, a week before its launch: We will see a substantial take-up and provisioning of cards into Passbook over the next year, with easy in-app purchases acting as the carrot for consumers. Apple Pay will be a quick affair at the point of sale: when I tried it a few weeks ago, it took all of 3 seconds. A comparable swipe with a PIN (which is what Apple Pay equates to) took up to 10, and a dip with an EMV card took 23 seconds on a good day. I am sure this is not the last time we will be measuring things. The substantial take-up of in-app transactions will drive signups: consumers will sign up because Apple’s array of in-app partners will include the likes of Delta, and any airline that shortens the whole ticket-buying experience to a simple TouchID authentication has my money.
Apple Pay will cause MCX to fragment: Even though I expect the initial take up to be driven more on the in-app side vs in-store, as more merchants switch to Apple Pay for in-app, consumers will expect a consistency in that approach across those merchants. We will see some high profile desertions – driven partly due to the fact that MCX asks for absolute fealty from its constituents, and in a rapidly changing and converging commerce landscape – that’s just a tall ask. In the near-term, Android will stumble: Question is if Google can reclaim and steady its own strategy. Or will it spin off another costly experiment in chasing commerce and payments. The former will require it to be pragmatic and bring ecosystem capabilities up to par – and that’s a tall ask when you lack the capacity for vertical integration that Apple has. And from the looks of it – Samsung is all over the place at the moment. Again – not confidence inducing. ISIS/SoftCard will get squeezed out of breath: SoftCard and GSMA can’t help but insert themselves in to the Apple Pay narrative by hoping that the existence of a second NFC controller on the iPhone6 validates/favors their SIM based Secure Element approach and indirectly offers Softcard/GSMA constituents a pathway to Apple Pay. If that didn’t make a lick of sense – It’s like saying ‘I’m happy about my neighbor’s Tesla because he plugs it in to my electric socket’. Discover how an Experian business consultant can help you strengthen your credit and risk management strategies and processes: http://ex.pn/DA_GCP This post originally appeared here.
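The network-tokenization model discussed in this post (a token service provider standing between the card number and the endpoint) can be illustrated with a toy sketch. This is purely conceptual: the class, method names and token format are invented, and a real TSP follows the EMVCo tokenisation specification rather than anything this simple:

```python
import secrets

class TokenServiceProvider:
    """Conceptual token vault: maps a PAN to a domain-bound surrogate.
    Illustrative only -- not any network's actual tokenization API."""

    def __init__(self):
        self._vault = {}  # token -> (pan, domain)

    def tokenize(self, pan, domain):
        # Issue a random 16-digit surrogate restricted to one domain
        # (e.g. one provisioned device). No real PAN ever leaves the vault.
        token = "9" + "".join(str(secrets.randbelow(10)) for _ in range(15))
        self._vault[token] = (pan, domain)
        return token

    def detokenize(self, token, domain):
        # The surrogate is useless outside the domain it was issued for.
        pan, bound_domain = self._vault.get(token, (None, None))
        return pan if bound_domain == domain else None

tsp = TokenServiceProvider()
token = tsp.tokenize("4111111111111111", domain="device-123")
print(tsp.detokenize(token, "device-123"))  # real PAN, inside its domain
print(tsp.detokenize(token, "stolen-db"))   # None: a leaked token is low-value
```

The sketch shows why the entity running the vault captures the value: every payment from every legitimized endpoint must round-trip through it, which is exactly the "firmly entrenched in the middle" position described above.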

Published: October 21, 2014 by Cherian Abraham
