By: Maria Moynihan

Cybersecurity, identity management and fraud are prevalent challenges across both the public and private sectors. Industries as diverse as credit card issuers, retail banking, telecom service providers and eCommerce merchants face fraud threats ranging from first-party fraud and commercial fraud to identity theft. If you think the problem isn't as bad as it seems, the statistics speak for themselves:

- Fraud accounts for 19% of the $600 billion to $800 billion in waste in the U.S. healthcare system annually
- Medical identity theft makes up about 3% of the 8.3 million overall victims of identity theft
- In 2011, there were 431 million adult victims of cybercrime in 24 countries
- In fiscal year 2012, the IRS's specialized identity theft unit saw a 78% spike over the prior year in the number of ID theft cases submitted

The public sector can easily apply the same best practices found in the private sector for ID verification, fraud detection and risk mitigation. Here are four surefire ways to get ahead of the problem:

- Implement a risk-based authentication process in citizen enrollment and account management programs
- Include the right depth and breadth of data through public and private sources to best identity-proof businesses or citizens
- Offer real-time identity verification while ensuring security and privacy of information
- Provide a Knowledge Based Authentication (KBA) software solution that asks applicants approved random questions based on "out-of-wallet" data

What fraud protection tactics has your organization implemented? See what industry experts suggest as best practices for fraud protection and stay tuned as I share more on this topic in future posts. You can view past Public Sector blog posts here.
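To make the risk-based authentication idea more concrete, here is a minimal sketch in Python of how an enrollment flow might route applicants by an identity-proofing score and step up to out-of-wallet KBA questions. The thresholds, field names and question pool are invented for illustration and are not drawn from any specific product.

```python
# Illustrative only: thresholds and the question pool are assumptions, not policy.
import random

OUT_OF_WALLET_POOL = [
    "Which of these streets have you previously lived on?",
    "In which county was your vehicle registered?",
    "What is the approximate amount of your monthly mortgage payment?",
]

def authenticate_applicant(identity_score: int) -> dict:
    """Route an applicant based on a hypothetical 0-100 identity-proofing score."""
    if identity_score >= 80:
        return {"decision": "verified", "kba_questions": []}
    if identity_score >= 50:
        # Medium risk: step up with randomly selected out-of-wallet questions.
        return {"decision": "step_up_kba",
                "kba_questions": random.sample(OUT_OF_WALLET_POOL, 2)}
    # High risk: refer to manual review rather than auto-deny.
    return {"decision": "manual_review", "kba_questions": []}

print(authenticate_applicant(91))
print(authenticate_applicant(63))
```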
By: Maria Moynihan

State and Federal agencies are tasked with overseeing the integration of new Health Insurance Exchanges, and with that responsibility comes the effort of managing information updates, ensuring smooth data transfer, and implementing proper security measures. The migration process for HIEs is no simple undertaking, but with these three easy steps, agencies can plan for a smooth transition:

Step 1: Ensure all current contact information is accurate with the aid of a back-end cleansing tool. Back-end tools clean and enhance existing address records and can help agencies maintain the validity of records over time.

Step 2: Identify and remove duplicates. Duplicate identification is a critical component of any successful database migration. By identifying and removing existing duplicate records, and preventing the creation of new ones, agencies keep constituents from opening multiple cases and reduce the opportunity for fraud.

Step 3: Validate contact data as it is captured. This step is extremely important, especially as information gets captured across multiple touch points and portals. Contact record validation and authentication is a best practice for any database or system gateway.

Agencies, and those particularly responsible for the successful launches of HIEs, are expected to leverage advanced technology, data and sophisticated tools to improve efficiencies, quality of care and patient safety. Without accurate, standard and verified contact information, none of that is possible. Access the full Health Insurance Exchange Toolkit by clicking here.
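As a rough illustration of Step 2 above, the sketch below flags likely duplicate constituent records by normalizing names and addresses before comparing them. It uses only the Python standard library; the matching rule, field names and threshold are simplified assumptions, not a production matching engine.

```python
# Simplified duplicate-detection sketch; real migrations would use a dedicated
# matching engine with probabilistic rules, but the shape of the logic is similar.
import re
from difflib import SequenceMatcher

def normalize(text: str) -> str:
    """Lowercase, strip punctuation, and collapse whitespace."""
    return re.sub(r"\s+", " ", re.sub(r"[^\w\s]", "", text.lower())).strip()

def likely_duplicates(a: dict, b: dict, name_threshold: float = 0.85) -> bool:
    """Flag two constituent records as probable duplicates."""
    same_dob = a["dob"] == b["dob"]
    same_address = normalize(a["address"]) == normalize(b["address"])
    name_similarity = SequenceMatcher(
        None, normalize(a["name"]), normalize(b["name"])).ratio()
    return same_dob and same_address and name_similarity >= name_threshold

rec1 = {"name": "Maria Moynihan", "dob": "1975-03-02", "address": "12 Main St."}
rec2 = {"name": "Mari Moynihan",  "dob": "1975-03-02", "address": "12 main st"}
print(likely_duplicates(rec1, rec2))  # True: merge or review before migration
```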
Big news [last week], with Chase entering into a 10-year expanded partnership with Visa to create a 'differentiated experience' for its merchants and consumers. I would warn anyone thinking "offers and deals" when they hear "differentiated experience" – because I believe we are running low on merchants who have a perennial interest in offering endless discounts to their clientele. I cringe every time someone waxes poetic about offers and deals driving mobile payment adoption – because I have yet to meet a merchant who wanted to offer a discount to everyone who shopped. There is an art and a science to discounting, and merchants want to identify customers who are price sensitive and develop appropriate strategies to increase stickiness and build incremental value.

It's as if everyone everywhere is throwing everything and the kitchen sink at making things stick. On one end, there are the payments worshippers, for whom the art of payment is the centrepiece – the tap, the wave, the scan. We pore over the customer experience at the till, believing that if we make it easier for customers to redeem coupons, they will choose us over the swipe. But what about the majority of transactions where a coupon is not presented, where we swipe simply because it's the easiest, safest and most boring thing to do? Look at the Braintree/Venmo model, where payment is but a necessary evil. Which means the payment is pushed so far behind the curtain that the customer spends nary a thought on her funding source of choice. Consumers are issuer agnostic to a fault – a model propounded by Square's Wallet. After all, when the interaction is tokenized, when a name or an image could stand in for a piece of plastic, then what use is there for an issuer's brand?

So what are issuers doing? Those that have a processing and acquiring arm are increasingly looking at creative transaction routing strategies for transactions where the issuer finds it has a direct relationship with both the merchant and the consumer. This type of selective routing enables the issuer to negotiate pricing with the merchant directly – thereby encouraging the merchant to incent its customers to pay using the card issued by that same issuer. For this strategy to succeed, issuers need both to sign up merchants directly and to encourage their customers to spend at these merchants using their credit and debit cards. FIs continue to believe that they can channel customers to their chosen brands, but "transactional data doth not maketh the man" – and I continue to be underwhelmed by issuer efforts in this space. Visa ending its ban on retailer discounts for specific issuer cards this week must be viewed in context with this bit – as it fuels rumors that other issuers are looking at the private payment network option – with merchants preferring their cards over competitors' explicitly. The wild wild west, indeed.

This drives processors either to cut deals directly with issuers or to move far deeper into the merchants' hands. This is where the Braintree/Venmo model can come into play – where the merchant, aided by an innovative processor who can scale, can replicate the same model in the physical world. We have already seen what Chase Paymentech plans to do. There aren't many that can pull off something similar. Finally, what about Affirm, the new startup by Max Levchin? I have my reservations about the viability of a Klarna-type approach in the US – where there is a high level of credit card penetration among US customers.
Since Affirm will require customers to choose it as a payment option over other funding sources – PayPal, credit cards and others – there has to be a compelling reason for a customer to choose Affirm. And at least in the US, where we are card-entrenched and every day we make it easier for customers to use their cards (look at Braintree or Stripe), it's a tough value proposition for Affirm. Share your opinions below. This is a re-post from Cherian's personal blog at DropLabs.
Last January, I published an article in the Credit Union Journal covering the trend among banks to return to portfolio growth. Over the year, the desire to return to portfolio growth and maximize customer relationships continues to be a strong focus, especially in mature credit markets, such as the US and Canada. Let's revisit this topic, start to dive deeper into the challenges we've seen, explore the core fundamentals for setting customer lending limits, and share a few best practices for creating successful cross-sell lending strategies.

Historically, credit unions and banks have driven portfolio growth with aggressive outbound marketing offers designed to attract new customers and members through loan acquisitions. These offers were typically aligned to a particular product, with no strategy alignment between multiple divisions within the organization. Further, when existing customers submitted a new request for credit, they were treated the same as incoming new customers, with no reference to the overall value of the existing relationship. Today, however, financial institutions are looking to create more value from existing customer relationships to drive sustained portfolio growth by increasing customer retention, loyalty and wallet share.

Let's consider this idea further. By identifying the needs of existing customers and matching them to individual credit risk and affordability, effective cross-sell strategies can achieve portfolio growth while simultaneously increasing customer satisfaction and promoting loyalty. The need to optimize customer touch-points and provide the best possible customer experience is paramount to future performance, as measured by market share and long-term customer profitability. By also responding rapidly to changing customer credit needs, you can further build trust, increase wallet share and profitably grow your loan portfolios. In the simplest sense, the more of your products a customer uses, the less likely the customer is to leave you for the competition. With these objectives in mind, financial organizations are turning towards the practice of setting holistic, customer-level credit lending parameters. These parameters are often referred to as umbrella, or customer-level, lending limits.

The challenges

Although the benefits of enhancing existing relationships are clear, there are a number of challenges that bring to mind some important questions:

- How do you balance the competing objectives of portfolio loan growth while managing future losses?
- How do you know how much your customer can afford?
- How do you ensure that customers have access to the products they need, when they need them?
- What is the appropriate communication method to position the offer?

Few credit unions or banks have lending strategies that differentiate between new and existing customers. In most cases, new credit requests are processed identically for both customer groups. The problem with this approach is that it fails to capture and use the power of existing customer data, which will inevitably lead to suboptimal decisions. Similarly, financial institutions frequently provide inconsistent lending messages to their clients.
The following scenarios can potentially arise when institutions fail to look across all relationships to support their core lending and collections processes:

- A customer is refused additional credit on the facility of their choice, whilst simultaneously being offered an increase in their credit line on another.
- A customer is extended credit on a new facility whilst being seriously delinquent on another.
- A customer receives marketing solicitations for three different products from the same institution, in the same week, through three different channels.

Essentials for customer lending limits and successful cross-selling

By evaluating existing customers on a periodic (monthly) basis, financial institutions can holistically assess the customer's existing exposure, risk and affordability. By setting customer-level lending limits in accordance with these parameters, core lending processes can be rendered more efficient, with superior results and enhanced customer satisfaction. This approach can be extended to consider a fast-track application process for existing relationships with high-value, low-risk customers. Traditionally, business processes have not identified loan applications from such individuals to provide preferential treatment.

The core fundamentals of the approach necessary for setting holistic customer lending (umbrella) limits include:

- The accurate evaluation of credit and default risk
- The calculation of additional lending capacity and affordability
- Appropriate product offerings for cross-sell
- Operational deployment

Follow my blog series over the next few months as we explore the core fundamentals for setting customer lending limits and share a few best practices for creating successful cross-sell lending strategies.
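As a simple illustration of those fundamentals, the Python sketch below derives a customer-level (umbrella) lending limit from a risk grade, income, an affordability estimate and existing exposure, so that any new credit request can be checked against the remaining headroom. The multipliers, caps and field names are invented for illustration only and are not a recommended lending policy.

```python
# Hypothetical umbrella-limit calculation: all factors and caps are illustrative.
RISK_MULTIPLIER = {"low": 0.40, "medium": 0.25, "high": 0.10}  # share of annual income

def umbrella_limit(annual_income: float, monthly_free_cash: float,
                   risk_grade: str, existing_exposure: float) -> dict:
    """Compute total customer-level capacity and the headroom left for new credit."""
    income_cap = annual_income * RISK_MULTIPLIER[risk_grade]
    affordability_cap = monthly_free_cash * 12 * 0.5  # keep debt service well below free cash
    total_limit = min(income_cap, affordability_cap)
    return {"total_limit": round(total_limit, 2),
            "headroom": round(max(total_limit - existing_exposure, 0.0), 2)}

# A new loan request would be fast-tracked only if it fits within 'headroom'.
print(umbrella_limit(annual_income=80_000, monthly_free_cash=1_200,
                     risk_grade="medium", existing_exposure=4_000))
```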
MCX – MerChants reduX: The post that follows is a collection of thoughts around MCX, why it deserves respect, and yet how it is indeed mortal and bleeds like all others. For those who are not familiar with MCX – it's a consortium of over 30 leading national retailers with a singular purpose: to create a seamlessly integrated mobile commerce platform. The website for MCX is http://www.mcx.com. The consortium is led by merchants like Walmart, Target, CVS, BestBuy, Gap, Sears etc.

By 2012, the mobile payments space was already fragmented, which itself may have precipitated the launch of MCX. And for a number of solutions looking for traction, things ground to a halt when MCX pitched merchants a solution that needed no costly upgrades and promised to route transactions over low-cost routing options. My friends on the issuer side privately confide that MCX has in fact succeeded in throwing a monkey wrench in their mobile payment plans – and merchant acceptance looks to be ambiguous around incumbent initiatives such as Isis and GoogleWallet, as well as for alternative payment initiatives. It had been easy to call it mere posturing and ignore it in the early days, but of late there is a lot of hand wringing behind the scenes and too many furrowed brows, as if the realization finally struck that merchants were indeed once again crucial to mobile payment adoption.

MCX – Its raison d'être

Meanwhile, the stakeholders behind MCX have been religious in their affirmation that MCX lives by two core tenets: first, it aims to drastically reduce payment acceptance costs through any and all means, and second, it keeps merchant data firmly within the merchants' purview. I can't help but think that the latter was no more than an afterthought, because merchants individually can choose whether they wish to share customer preferences or Level III data with third parties, but they need all the collective clout they can muster to push networks and issuers to agree to reduce card acceptance costs. So if one distils MCX down to its raison d'être, then it looks to be aimed squarely at No. 1. Which is fair when you consider that merchants believe card fees are one of their biggest operating expenses. In 2007, 146,000 convenience stores and gas stations nationwide made a total of $3.4B in profits, yet they paid out $7.6B in card acceptance costs (Link).

And MCX is smart to talk about the value of merchant data, the need to control it, yada yada yada. But if that were indeed more important, Isis could have been the partner of choice – someone who would treat customer and transaction data as sacrosanct and leave it behind for the merchants to fiddle with (vs. GoogleWallet's mine..mine..mine.. strategy). But much as HomeDepot was disappointed when it first saw GoogleWallet – no interchange relief, only incremental benefits at the point of sale, and all its data swooped up in return – Isis also offers little relief to MCX or its merchants, even though it requires no transaction or SKU-level data in return. Does it mean that carriers have no meaningful role to play in commerce? Au contraire. They do. But it's around fraud and authentication. It's around identity. And around creating a platform for merchants to deliver coupons and alerts to opted-in customers.
But they seem to be stuck imitating Google in figuring out a play at the front end of the purchase funnel, to become a consumer brand. The last thing they want to do is leave it to Apple to figure out the "identity management" question, which the latter seems best equipped to answer by way of scale, the control it exerts in the ecosystem, its vertical integration strategy that allows it to fold biometrics meaningfully into its lineup, and its ability to start with its own services to offer customer value.

Did we say Apple? It's a bit early to play fast and loose with Apple predictions, but its AuthenTec acquisition should rear its head sometime in the near future (2013, considering Apple's manufacturing lead times): a biometric solution packaged neatly with an NFC chip and secure element could address three factors that have held back customer adoption of biometrics:

- Ubiquity of readers
- Issues around secure local storage and retrieval of biometric data
- Standardization in accessing and communicating said data

An on-chip secure solution to store biometric data – in the phone's secure element – can address qualms around a central database of biometric data open to all sorts of malicious attacks. Standard methods to store and retrieve credentials stored in the SE will apply here as well. Why NFC? If NFC was originally meant to seamlessly and securely share content, what better way to sign that content, to have it be attributable to its original author, or to enforce one's rights to said content, than with one's digital signature? Identity is key, not just when enforcing digital rights management on shared content, but also to secure commerce and address payment and fraud risk.

Back to MCX. The more I read, the more it seems MCX is trying to imitate Isis in competing for customer mindshare, in attempting to become a consumer brand – rather than simply trying to be a cheaper platform for payment transactions. As commerce evolves beyond being cleanly classified under "Card Present" and "Card Not Present" – as transactions originate online but get fulfilled in stores – merchants expect rules to evolve alongside reality. For example, when customers are able to order online but pick up in-store after showing a picture ID, why would merchants have to pay "Card Not Present" rates, when risk is what we attribute higher CNP rates to, and why is there an expectation of the same amount of risk even in this changed scenario? And beyond, as technology innovation blurs the lines that neatly categorized commerce, where we replace "Card Present" with "Mobile Present" and mobile carries a significant amount of additional context that could be scored to quantify risk, why shouldn't it be? It's a given that networks will have to accommodate the reduced risk in transactions where mobile plays a role, where the merchant or the platform enabling the transaction can meaningfully use that context to validate customer presence at the point of sale – and that they will expect an appropriate interchange reduction in those scenarios.

MCX – A brand like Isis or a platform?

But reading portions of the linked NRF blog, and elsewhere, it reflects a misplaced desire on MCX's part to become a consumer-facing solution – an app that all MCX partners will embrace for payment. This is so much like the Isis solution of today – which I have written about – and why it flies in the face of reason.
Isis – the nexus between carriers and FIs – is a powerful notion, if one considers the role it could play in enabling an open platform around provisioning, authentication and marketing. But for that future to materialize, Isis has to stop competing with Google, accept that it has little role to play by itself at the front end of the funnel, and recede to its role of an enabler – one that puts its partner FI brands front and center, allows Chase's customers to pay using Chase's mobile app instead of Isis, and drives down the fraud risk at the point of sale by meaningfully authenticating the customer via his location, the mobile assets carriers control, and further, the historical data they have on the customer. It's those three points of data, and the scale Isis can bring, that put it credibly in the payments value chain – not the evaporating control around the secure element.

In the same vein, the value MCX brings to merchants is the collective negotiating power of over 30 national merchants. But is it a new consumer brand, or is it a platform focused on routing the transaction over the least-cost routing option? If it's the latter, then it has a strong parallel in PayPal. And as we may see PayPal pop up as accepted tender in many a retailer's mobile app and checkout aisle going forward, MCX is likely to succeed by emulating that retailer-aligned strategy rather than following a brand of its own. Further, if MCX wants customers to pay using less costly means – whether they be private label, prepaid or ACH – then it and its partners must do everything they can to shift the customer focus away from preferred payment methods and toward the customer experience and resulting value around loyalty. MCX must build its value proposition elsewhere, and make its preferred payment methods the bridge to get the customer there.

Another example where the retailer focused too much on the payment, and less so on the customer experience, is the Safeway Fast Forward program. The value proposition is clear for the customer – pay using your Safeway Fast Forward card number and a self-assigned PIN for simpler checkout. However, to set up an account, the customer must provide a state-issued ID (driver's license) and, on top of it, his Social Security Number (Safeway Fast Forward requirements here). What customer would, for the incremental convenience of paying via his Fast Forward card and PIN, be willing to entrust Safeway with his Social Security Number? Clearly Safeway's risk team had a say in this, and instead of coming up with better ways to answer questions around risk and fraud, they introduced a non-starter, which killed any opportunity for meaningful adoption.

MCX & adoption

So where does that leave MCX? Why will I use it? How will it address questions around adoption? It's a given that it will have to answer the same questions around fraud and authentication during customer on-boarding and at a transactional level. Further, it's not enough these days to answer questions pertaining only to the customer; one must also address questions relating to the integrity and reputation of the device the customer uses – whether that be a mobile device or a laptop PC. But beyond fraud and auth, there are difficult questions around what would compel a techno-luddite who has historically paid using a credit instrument to opt for an ACH-driven (I am guessing) MCX payment scheme. Well, for one: MCX and its retail partners can control the purchasing power of MCX credits.
Suppose, after aggregating customer profiles across retailers, MCX determines that the Addams family spends a collective $400 on average per month across all the MCX retailers. MCX could propose that if, instead, the Addams family were to commit to buying $450 in MCX credits each month, they would receive an additional $45 in credits that could be used on specific retail categories (or outright across all merchandise). Would Morticia be interested? If she were, what would that mean to MCX? It would eliminate having to pay interchange on approximately $500, and further it would enable its partners to capture incremental spend of roughly 10% that did not exist before. Only merchants will be able to pull this off – by leveraging past trends, close relationships with CPG manufacturers and giving Morticia new reasons to spend in the manner they want her to. But then again, where does MCX stop in providing a level playing field for its partners, and step back so that merchants can start to compete for their customers and their spend? And finally, can it survive the natural conflicts that will arise, and limit its scope to areas that all can agree on, for long enough for it to take root? Should MCX become the next Isis or the next PayPal? Which makes most sense? What do you think? Please leave your opinions below... (This blog post is an adaptation of the original post found at http://www.droplabs.co/?p=662)
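The arithmetic behind the Addams family example can be made explicit. The short Python sketch below shows the two levers: prepaid MCX credits remove interchange on the committed spend, and the bonus credits are the incremental spend referred to above. The 2% interchange rate is an assumption used purely for illustration.

```python
# Worked version of the Addams family example; the interchange rate is assumed.
current_monthly_spend = 400       # average spend across MCX retailers today
committed_credits = 450           # customer prepays this amount each month
bonus_credits = 45                # extra purchasing power granted (10% of commitment)
assumed_interchange_rate = 0.02   # illustrative card acceptance cost avoided

spend_off_card_rails = committed_credits + bonus_credits
interchange_avoided = spend_off_card_rails * assumed_interchange_rate

print(f"Spend shifted off card rails:   ${spend_off_card_rails}")   # approx $500
print(f"Interchange avoided per month:  ${interchange_avoided:.2f}")
print(f"Commitment above current spend: ${committed_credits - current_monthly_spend}")
print(f"Bonus credits (the ~10% incremental spend): ${bonus_credits}")
```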
By: Maria Moynihan

Fact: In fiscal year 2011, the federal government allocated ~$608M to investigate and prosecute cases of alleged fraud in health care programs.

Fact: Medicare and Medicaid related scams cost taxpayers more than $60B a year.

These statistics are profound, especially when so many truly need – and rightfully deserve – access to health benefits. To make the facts a bit more tangible, how would you feel if you heard that neighbors of yours were submitting claims to Medicare for treatments that were never provided? In essence, you've got thieves for neighbors, don't you?

Thankfully, government agencies are responding. Even while being challenged with reduced budgets and limited resources, they are investing in efficient processes, advanced data, analytics and decisioning tools to improve their visibility into individuals at the point of application. By simply making adjustments to one or all of these areas, agencies can pinpoint whether or not individuals are who they say they are. Only with precision, relevancy and efficiency of information can fraud and abuse be curtailed. Below are a few examples of how to improve your eligibility systems or processes today. Or, simply download the Issue Brief, Beyond Traditional Eligibility Verification, for more detail.

- Use scores, models, and screening questions to assess a beneficiary's true identity or level of identity fraud risk.
- Use income and asset estimation models to compare to stated income as a validation step in determining benefits eligibility.
- Create a single system for automatic identification and verification of beneficiaries and businesses applying for service.
- Tighten controls around business identity to weed out fraud rings, syndicates and other forms of business fraud.

The Bottom Line: Only with process, information, or system improvements can government agencies move the needle on the growing and pressing issue of fraud and abuse.
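To show what the first two suggestions above might look like in practice, here is a minimal Python sketch that combines an identity-risk score with a comparison of stated income against a model-estimated income to produce a single screening outcome. The tolerance bands and score thresholds are invented for illustration and are not policy guidance.

```python
# Illustrative eligibility screening: thresholds and bands are assumptions.
def screen_application(stated_income: float, estimated_income: float,
                       identity_risk_score: int) -> str:
    """Return 'pass', 'review', or 'fail' for a benefits application."""
    income_gap = abs(stated_income - estimated_income) / max(estimated_income, 1)
    if identity_risk_score >= 80 or income_gap > 0.50:
        return "fail"      # likely identity fraud or grossly misstated income
    if identity_risk_score >= 50 or income_gap > 0.25:
        return "review"    # route to a caseworker for document verification
    return "pass"

print(screen_application(stated_income=18_000, estimated_income=19_500,
                         identity_risk_score=22))   # pass
print(screen_application(stated_income=18_000, estimated_income=41_000,
                         identity_risk_score=35))   # fail: stated income far below estimate
```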
Research shows that investing in superior customer management can easily exceed returns of 20 percent in the first year of implementation – a return that compounds in subsequent years as a result of customer-centric strategies that drive customer loyalty, new customer referrals and increased revenue opportunities. Customer loyalty is a key driver that differentiates retail banks when trying to retain existing customers and attract new ones, and it is cited by customers themselves as the way to win their business today. Achieving superior customer management, however, can be expensive and operationally prohibitive; and let's not forget to mention that there are a number of different approaches that aim to meet such a standard but fail because critical qualitative insights are not captured in back-end systems of record (SOR). These "black-box" strategies struggle to be widely adopted across the enterprise and die a slow, internal political death – with wasted resources left on the floor. They also leave the customer feeling frustrated and dissatisfied, maybe even ready to flee. One such example was recently illustrated in an article in Credit Union Times.

Changing the retail bank's approach to adopt best practices in developing holistic customer-centric strategies is paramount to improving customer experiences, and the bottom line. Quantitative data alone represents only a partial view of reality, whereas holistic customer strategies exploit the full value of the enterprise by synthesizing customer knowledge from the SOR with external financial information held outside your firm and critical qualitative input from customer-facing staff. Customer-facing staff are critical to the adoption of such strategies and need to be actively engaged to extract customer learnings that will lead to the modification and alignment of customer-level treatment strategy designs and predictive models with the real world. A collaborative approach, blending art and science, ensures complete adoption across the enterprise and measurable customer experience improvements that can be monetized for shareholders through improved customer retention and new customer acquisitions. Get access to details on the framework to design and deploy such customer-centric strategies.
By: Maria Moynihan

The public sector is not unlike the private sector when it comes to data. Both require accuracy and relevancy for optimized processes and decision-making. For government agencies, maintaining a holistic view of constituents is more important than ever. By linking data across department systems, governments improve operations, citizen profiling and overall record management. No longer do agencies have to muddle through records for Maria Moynihan, Mari Moynihan, M Moynihan, and other variations of name or contact information when they are all truly one and the same. Unfortunately, without the right tools and know-how, database maintenance, record deduplication and account validation can be a daunting process. Below are five critical steps to help government agencies execute successful linkage of database records:

Step 1: Engage stakeholders. Data stewards are not mind readers. They work with finite data and rely on stakeholders to provide insight. Seek input from users across departments and functions.

Step 2: Identify impacts and priorities. Data errors and disparate data prevent stewards from amalgamating records and defining a master database. Focus on areas of strategic priority.

Step 3: Create success criteria. Look for and set quantifiable metrics for matching. Consider what data needs to be linked and what thresholds are acceptable given the objectives.

Step 4: Define new standards. Create established workflows and guidelines for evaluating, merging and purging records.

Step 5: Leverage matching technology. Integrate robust deduplication tools to design multiple workflows and handle a variety of matching challenges.

In short, without data stewards seeking input from stakeholders, understanding the data impacts, and establishing a clear process with defined methodologies and technology for deduplication, government agencies will remain challenged in trying to figure out whether Maria, Mari and M are the same person in their databases. Click here to see the full guide to Creating a Single View.
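Steps 3 and 5 above are where matching thresholds become concrete. The Python sketch below scores the similarity of two constituent records with weighted fields and routes the pair to auto-merge, steward review, or no action based on thresholds. The weights, thresholds and fields are illustrative assumptions, not those of any particular matching product.

```python
# Hypothetical weighted match score with auto-merge / review thresholds.
from difflib import SequenceMatcher

WEIGHTS = {"name": 0.40, "dob": 0.35, "phone": 0.25}   # illustrative field weights
AUTO_MERGE, NEEDS_REVIEW = 0.90, 0.70                   # illustrative thresholds

def field_similarity(a: str, b: str) -> float:
    return SequenceMatcher(None, a.lower().strip(), b.lower().strip()).ratio()

def match_decision(rec_a: dict, rec_b: dict) -> str:
    score = sum(WEIGHTS[f] * field_similarity(rec_a[f], rec_b[f]) for f in WEIGHTS)
    if score >= AUTO_MERGE:
        return f"auto-merge (score={score:.2f})"
    if score >= NEEDS_REVIEW:
        return f"steward review (score={score:.2f})"
    return f"keep separate (score={score:.2f})"

a = {"name": "Maria Moynihan", "dob": "1975-03-02", "phone": "555-0142"}
b = {"name": "M Moynihan",     "dob": "1975-03-02", "phone": "5550142"}
print(match_decision(a, b))
```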
By: Joel Pruis

The commercial lending segment – traditional C&I, CRE and other – is one of the last areas to be "automated" or captured within an automated lending platform. Many of us talk about the need to automate this segment, but the discussion needs to start with the question "What does it mean to automate originations in the commercial segment?" Let's start to break this down and define it. Previously, we have covered how to define small business for your respective financial institution. If you use that as a measurement of what is small business, the remaining segment would by default be your commercial segment. It seems obvious, but it is good to reiterate to keep the context on commercial. What we are not planning to cover is the small business segment, where there is relatively high application volume and low total dollar production. If we compare small business and commercial across two major characteristics, the distinction becomes clearer. The above chart represents the typical situation – the probable, not the possible, scenario. For example, there are situations where the sales lead time is days, not months, for a commercial lending opportunity, or where a small business application can sometimes take over a year to arrive.

I like analogies, so let's compare mass-produced furniture vs. custom furniture to small business vs. commercial. Mass-produced furniture is high volume but low dollar per unit (small business), while custom furniture is low volume but high dollar per unit (commercial). Basically, the furniture being mass produced has a low need for any customization, with high demand and a low cost of production on a per-unit basis. Conversely, custom furniture production has relatively low volume but a higher cost of production on a per-unit basis. The custom furniture maker has no set designs and no set product line, but rather examples of past work. There are no set materials to be used and no prescribed method for manufacturing any particular item. While one customer may want a dining room table that has leaves to expand the seating as needed, another may want a drop-leaf table or simply a static table top. It is up to the customer to decide what criteria best suit the need. The talented furniture maker will lend his or her expertise to provide the best product and solution for the customer, but the end result is ultimately up to the customer. Such a design will be worked and potentially reworked multiple times before the right design is actually approved by the customer. Once the design is approved, work begins on creating the piece of furniture. The creation may follow a standard set of procedures or may not. The key is that there is no set way that must be followed. The furniture maker will not wait until all material is available but rather can start on portions of the furniture (turning the legs, rough-cutting the wood for the table top). While there is likely an agreed-upon delivery date, success depends on completing the furniture by that date, not on following a set prescribed path to completion.

It is possible to design and capture the small business origination process, with its defined roles and responsibilities, in a detailed process map. The small business origination process can measure and monitor service level agreements and set expectations with the client around the entire process before the application is even taken.
Prescribed order and dependency around activity- and/or task-level process mapping can be accomplished in the small business origination process with a high degree of accuracy and consistency from one application to the next. The commercial loan origination process, however, cannot be captured with a high degree of accuracy or consistency. Individual efforts can certainly be captured, and specific service level agreements can be established. For example, the spreading of financial statements can follow a prescribed methodology, and service level agreements can be established around it. However, attempts to establish service level agreements that, when combined, could adequately set expectations for total turnaround times, estimated completion times and prescribed methodologies would result in much lower compliance with such prescribed processes, rendering them meaningless. Joel Pruis is a senior business consultant with Experian's Global Consulting Practice. To learn more about strategy consulting and access more thought-leadership from our team, please visit www.experian.com/consultingservices.
Contributed by: David Daukus

As the economy recovers from the recession, consumers are becoming more responsible with their credit card usage; credit card debts have not increased and delinquency rates have declined. Delinquency rates as a percentage of balances continue to decline, with the short-term 30-59 DPD rate now at 0.9%. With mixed results, where is the profit opportunity? Further studies from Experian-Oliver Wyman state that the average bankcard balance per consumer remained relatively flat at $4,170, but the highest credit tiers (using VantageScore® credit score A and B segments) saw average balances increase to $2,422 and $3,208, respectively. It's time to focus on what you have – your current portfolio – and specifically how to:

- Increase credit card usage in the prime segments
- Assign the right lines to your cardholders
- Understand who has the 'right' spend

Risk score alone doesn't provide the most accurate insight into consumer accounts. You need to dig deeper into individual accounts to uncover behavioral trends and get the critical information needed to grow your portfolio. Leading financial institutions are looking at consumer payment history, such as balance and utilization changes. These capture a consumer's credit situation more accurately than a point-in-time view. When basic principles are applied to credit data, different consumer behaviors become evident and can be integrated into client strategies. For example, if two consumers have the same VantageScore® credit score, credit card balances, and payment status, does that mean they have the same current credit status? Not necessarily. By looking at their payment history, you can determine which direction each is heading. Are they increasing their debt or are they paying it down? These differences reveal their riskiness and credit needs. Therefore, with payment history added to the mix, you can more accurately allocate credit lines between consumers and simultaneously reduce risk exposure.

Spend is another important metric to evaluate to help grow your portfolio. How do you know if a consumer primarily uses a credit card when making purchases? Wouldn't you want to know the right amount of credit to provide based on the consumer's need? Insight into consumer spending levels provides a unique understanding of a consumer's credit needs. Knowing spend allows lenders to provide the necessary high lines to the limited population of very high spenders, while reducing overall exposure by providing lower lines to low spenders. Spend data also reveals wallet share – knowing the total spend of your cardholder allows you to calculate their external spend. With wallet share data, you can capture more spend by adjusting credit lines or rewards that will entice consumers to spend more using your card. Once you have a more complete picture of a consumer, adjusting lines of credit and making the right offer is much easier. Take some of the risk out of managing your existing customers and finding new ones. What behavioral data have you found most beneficial in making lending decisions?

Source: Experian-Oliver Wyman Market Intelligence Reports
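As a concrete illustration of the trending idea, the Python sketch below classifies two consumers with identical scores by the direction of their recent month-end balances and suggests opposite line actions. The trend rule and the suggested actions are invented for illustration and are not a recommended line-management policy.

```python
# Two consumers with the same score can be on opposite trajectories.
# The classification rule and line actions below are illustrative assumptions.
def balance_trend(balances: list) -> str:
    """Classify the direction of a consumer's recent month-end balances."""
    change = balances[-1] - balances[0]
    if change > 0.05 * max(balances[0], 1):
        return "increasing debt"
    if change < -0.05 * max(balances[0], 1):
        return "paying down"
    return "stable"

def suggested_action(trend: str) -> str:
    return {"paying down": "candidate for a line increase / spend stimulation",
            "increasing debt": "hold or reduce exposure; watch utilization",
            "stable": "no change"}[trend]

consumer_a = [4_200, 3_900, 3_500, 3_100]   # same score, deleveraging
consumer_b = [3_100, 3_500, 3_900, 4_200]   # same score, adding debt
for balances in (consumer_a, consumer_b):
    trend = balance_trend(balances)
    print(trend, "->", suggested_action(trend))
```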
I'm here in Vegas at the Money2020 conference and I am fascinated by my room key. This is not the usual "insert into the slot, wait for it to turn green or hear it chime" key card; this is a "tap and hold to the door scanner till the door opens" RFID key card. It is befitting the event I am about to attend – Money2020 – the largest of its kind, bringing together over 2,000 mobile money aficionados, strategists and technologists from all over the world for a couple of days to talk about how payment modalities are shifting and the impact of these shifts on existing and emerging players. Away from all the excitement of product launches, I hope some will be talking about one of the major barriers to consumer adoption of alternate payment modalities such as mobile – security and fraud.

I was in Costa Mesa last week and, in the process of buying something for my wife with my credit card, triggered the card fraud alert. My card was declined and I had to use a different card to complete my transaction. As I was walking out, my smartphone registered a text alert from the card issuer – asking me to confirm that it was actually I who attempted the transaction, and if so, to respond by texting 1 for yes or 2 for no. All good and proper up to this point. If someone had stolen my card or my identity, this would have been enough to stop fraud from recurring. In this scenario the payment instrument and the communication device were separate – my plastic credit card and my Verizon smartphone. In the next couple of years, these two will converge, as my payment instrument and my smartphone become one. At that point, will the card issuer continue to send me text alerts asking for confirmation? If, instead of my wallet, my phone was stolen, what good will a text alert to that phone do to prevent the recurrence of fraud? Further, if one card was shut down, the thief could move on to other cards within the wallet if, just as today, there are no frameworks for fraud warnings to propagate across the other cards in the wallet. Further, fraud liability is about to shift to the merchant with the 2013 EMV mandate.

In recent years, there has been significant innovation in payments – to the extent that we have a number of OTT (over-the-top) players, unencumbered by regulation, who have been able to sidestep existing players – issuers and card networks – in positioning mobile as the next stage in the evolution of payments. Google, PayPal, Square, Isis (a carrier consortium formed by Verizon, T-Mobile and AT&T), and a number of others have competing solutions vying for customer mindshare in this emerging space. But when it comes to security, they all revert to a 4-digit PIN – what I call the proverbial fig leaf of security. Here we have a device that offers real-time context – whether it be temporal, social or geo-spatial – all inherently valuable in determining customer intent and fraud, and yet we feel it's adequate to stay with the PIN, a relic as old as the payment rails these newer solutions are attempting to displace.

Imagine what could have been in the previous scenario if, instead of reaching for my card, I had reached for my mobile wallet. Upon launch, the wallet, leveraging the device context, determines that it is thousands of miles away from the customer's home, scores the fraud risk accordingly, and asks the customer to answer one or more "out-of-wallet" questions that must be correctly answered.
If the customer fails, or prefers not to answer, the wallet can suggest alternate ways to authenticate – including IVR. Based on the likelihood of fraud, the challenge/response scenario could include questions about open trade lines or simply the color of her car. Will the customer appreciate this level of proactiveness on the issuer's part to verify the legitimacy of the transaction? Absolutely. Merchants, who so far have been on the sidelines of the mobile payment euphoria, but for whom fraud is a real issue affecting their bottom line, will also see the value. The race to mobile payments has been all about quickly shifting spend from plastic to mobile, and incenting that by enabling smartphones to store and deliver loyalty cards and coupons. The focus needs to shift to, or at least include, how smartphones can be leveraged to reduce fraud at the point of sale – by bringing together the context of the device and a real-time channel for multi-factor authentication.

It's relevant to talk about Google Wallet (in its revised form) and fraud in this context. Issuers have been up in arms, privately and publicly, over how Google displaces the issuer from the transaction by inserting itself in the middle and settling with the merchant before firing off an authorization request to the issuer on the merchant's behalf. Issuers are worried that this could wreak havoc with their built-in fraud measures, as the authorization request will be masked by Google and could potentially result in the issuer failing to catch fraudulent transactions. Google has been assuaging issuers' fears on this front, but has yet to offer something substantial – as it clearly does not intend to revert to where it was before – having no visibility into the payment transaction (read my post here). This is clearly shaping up to be an interesting showdown – would issuers start declining transactions where Google is the merchant of record? And how much more risk is Google willing to take to become the entity in the middle? This content is a re-post from Cherian's personal blog: http://www.droplabs.co/?p=625
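A minimal sketch of the wallet flow described above might score the device context first and only then decide whether to step up to out-of-wallet questions or an IVR callback. In the Python below, the distance rule, scoring weights and step-up choices are assumptions made purely for illustration.

```python
# Illustrative device-context risk scoring for a mobile wallet transaction.
# Weights, thresholds, and the distance rule are assumptions.
import math

def distance_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points (haversine formula)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi, dlmb = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 6371 * 2 * math.asin(math.sqrt(a))

def wallet_risk(txn_lat, txn_lon, home_lat, home_lon, amount, new_device: bool) -> str:
    score = 0
    if distance_km(txn_lat, txn_lon, home_lat, home_lon) > 500:
        score += 40                      # far from the customer's usual location
    if amount > 300:
        score += 20                      # unusually large purchase
    if new_device:
        score += 40                      # wallet running on an unrecognized device
    if score >= 70:
        return "step up: out-of-wallet questions or IVR callback"
    if score >= 40:
        return "step up: PIN plus one challenge question"
    return "approve silently"

# A customer based near Boston attempting a purchase in Costa Mesa, CA:
print(wallet_risk(33.66, -117.91, 42.36, -71.06, amount=220, new_device=False))
```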
Part 2: Common myths about credit risk scores and how to educate consumers

In light of what I've heard in the marketplace through the years, I wanted to provide some information to help debunk some common myths about credit scores.

Myth: There is only one credit score.
Reality: There are multiple credit scores that lenders can use to evaluate consumer creditworthiness. As noted in a recent New York Times article, there are 49 FICO score models. Make sure your customers know that an underwriting decision is based on more than just a credit score – multiple factors are evaluated to make a lending decision. The most important thing a consumer can do is ensure their credit report is accurate.

Myth: The probability of default remains constant for a credit score over time.
Reality: The probability of default can shift dramatically based on macroeconomic conditions. In 2005, a score of 700 in any given model may have had a probability of default of 2 percent, while in 2009 the same score could have had a probability of default of 8 percent. This underscores the value of conducting an annual validation of the credit model you are using to ensure your institution is making the most accurate lending decisions based on your risk tolerance. One of the benefits of utilizing the VantageScore® model is that VantageScore® Solutions, LLC produces an annual validation, so you can ensure your institution is adjusting its strategies to meet changing economic conditions.

Myth: If the underlying credit report is the same at each credit reporting company, I will have the same score at each company.
Reality: Traditional credit scoring models are completely different at each credit reporting company, which leads to vastly different scores or probabilities of default based on the same information. As a risk manager, this is very frustrating, as I may not understand which score most accurately assesses the consumer's probability of default. The only model that is the same across all credit reporting agencies is the VantageScore® model, where this is a patented feature that ensures the lender receives a consistent score (probability of default) across all bureau platforms.

I hope these brief examples help clear up some confusion about credit scores. In Part 3 of this series, I will outline how to evaluate the risk of traditionally unscoreable consumers. If you have any thoughts or experiences from a lending perspective, please feel free to share them below.

Courtesy of "Why You Have 49 Different FICO Scores," The New York Times, August 27, 2012.
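The second myth is really about the score-to-probability-of-default mapping drifting over time. The Python sketch below shows the kind of annual validation that implies: recompute the observed default rate for each score band from outcome data and compare it with the rate assumed in current strategies. All of the numbers are made up for illustration, and the 2-point drift tolerance is likewise an assumption.

```python
# Illustrative annual validation of a score-to-PD mapping; all figures are made up.
assumed_pd = {"640-679": 0.10, "680-719": 0.04, "720-850": 0.01}

# Hypothetical one-year outcomes: (accounts booked, accounts that defaulted)
observed = {"640-679": (5_000, 650), "680-719": (12_000, 960), "720-850": (20_000, 300)}

for band, (booked, defaults) in observed.items():
    observed_pd = defaults / booked
    drift = observed_pd - assumed_pd[band]
    flag = "RECALIBRATE" if abs(drift) > 0.02 else "ok"
    print(f"{band}: assumed {assumed_pd[band]:.1%}, observed {observed_pd:.1%} -> {flag}")
```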
By: Kyle Aiman

Let's face it, debt collectors often get a bad rap. Sure, some of it is deserved, but the majority of the nation's estimated 157,000 collectors strive to do their job in a way that will satisfy both their employer and the debtor. One way to improve collector/debtor interaction is for the collector to be trained in consumer credit and counseling. In a recent article published on Collectionsandcreditrisk.com, Trevor Carone, Vice President of Portfolio and Collection Solutions at Experian, explored the concept of using credit education to help debt collectors function more like advisors instead of accusers. If collectors gain a better understanding of consumer credit – how to read a credit report, how items may affect a credit score, how a credit score is compiled and what factors influence the score – perhaps they can offer suggestions for improvement. Will providing past-due consumers with a plan to help improve their credit increase payments? Read the article and let us know what you think!
By: Mike Horrocks

It has been over a year since the Occupy Wall Street crowd made their voices heard in Zuccotti Park. At the anniversary of that movement, there has been a lot of debate about whether the protest has fizzled out or is still alive and planning its next step. Either way, it cannot be ignored that it raised a voice in how consumers view their financial institutions and what actions they are willing to take, e.g., "Bank Transfer Day." In today's market, customer risk management must be balanced with retention strategies. For example, here at Experian we value the voice of our clients and prospects, and I personally lead our win/loss analysis efforts. The feedback we get from our customers is priceless. In a recent American Banker article, some great examples were given of how tuning into the voice of the consumer can turn into new business and an expanded market footprint. Some consumers, however, will do their talking by looking at other financial institutions or by slowly (or maybe rapidly) using your institution's services less and less. Technology Credit Union saw great results when it used retention triggers off credit data to get back out in front of its members with meaningful offers. Maximizing the impact of internal data and spotting the customer-focused trends that can help with retention is an even better approach, since that data is taken at the "on-us" account level and can help stop risks before the customer starts to walk out the door. Phil Knight, the founder of Nike, once said, "My job is to listen to ideas." Your customers have some of the best ideas on how they can be retained and not lost to competitors. So think about how you can listen to the voice and actions of your customers better, before they leave and take a walk in the park.
By: Teri Tassara

Negative equity – owing more on your home than it is worth – has become a much too common theme in the past few years. According to CoreLogic, 11 million consumers are underwater, representing roughly 1 out of 4 homeowners with a mortgage. The irony is that with mortgage rates remaining at historic lows, the consumers who could benefit the most from refinancing can't qualify because of their negative equity. The Mortgage Bankers Association recently reported that approximately 74% of home loan volume in 2Q 2012 was mortgage refinances. Consumers who have been able to refinance to take advantage of the low interest rates already have, some even several times over. But there is a segment of underwater consumers who are paying more than their scheduled amount in order to qualify for refinancing – which translates to a growth opportunity in mortgage loan volume.

Based on an Experian analysis of actual payment amounts on mortgages, the actual payment amount was reported on about 65% of open mortgages (the actual payment amount is the amount the consumer paid the prior month). And when the actual payment is reported, the study found that 82% of consumers pay within $100 of their scheduled payment and 18% pay more than their scheduled amount. Actual payment amount information as reported on the credit file, used in combination with other analytics, can be a powerful tool to identify viable candidates for a mortgage refinance versus those who may benefit from a loan modification offer. Consumers methodically paying more than the scheduled payment amount may be trying to qualify for refinancing. Conversely, if a consumer is not able to pay the scheduled payment amount, that consumer may be an ideal candidate for a loan modification program. Either way, actual payment amount can provide insight that creates a favorable situation for both the consumer and the lender, mitigating additional and unnecessary risk while providing growth opportunity. Find other related blog posts on credit and housing market trends.
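A simple sketch of that segmentation splits mortgage holders into refinance prospects and loan-modification prospects based on how their reported actual payment compares to the scheduled amount. In the Python below, the $100 band mirrors the study cited above, while the routing rules themselves are illustrative assumptions.

```python
# Illustrative segmentation using actual vs. scheduled mortgage payment.
# The $100 tolerance mirrors the study above; the routing rules are assumptions.
from typing import Optional

def classify_borrower(scheduled: float, actual: Optional[float]) -> str:
    if actual is None:
        return "no actual payment reported"           # roughly 35% of open mortgages
    if actual >= scheduled + 100:
        return "refinance prospect (paying ahead of schedule)"
    if actual <= scheduled - 100:
        return "loan modification candidate (paying below schedule)"
    return "paying as scheduled"

for scheduled, actual in [(1_500, 1_850), (1_500, 1_520), (1_500, 1_250), (1_500, None)]:
    print(classify_borrower(scheduled, actual))
```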