All posts by Guest Contributor

This is the first post in a three-part series.

You’ve probably heard the adage “There is a little poison in every medication,” typically attributed to Paracelsus (1493–1541), the father of toxicology. The trick, of course, is to prescribe the correct balance of agents to improve the patient while doing the least harm. One might think of data governance in a similar manner. A well-disciplined, well-executed data governance regimen delivers significant improvements to the organization. Conversely, an overly restrictive, poorly designed or ineffectively monitored data governance ecosystem can cause significant harm: suboptimal models and scorecards, inaccurate reporting, imprecise portfolio outcome forecasts and poor regulatory reports, ultimately resulting in costly remediation and loss of reputation. In this blog series, we will address the issues and best practices associated with the broad mandate of data governance.

In its simplest definition, data governance is the management of the availability, usability, integrity and security of the data employed in an enterprise. A sound data governance program includes a governing body or council, a defined set of procedures and a plan to execute those procedures. Upon quick reflection, though, effective data governance is not simple at all. Data is ubiquitous, is becoming more available, encompasses aspects of our digital lives not envisioned as little as 15 years ago and is constantly changing as people’s behavior changes. To add another level of complexity, regulatory oversight is becoming more pervasive, as regulations passed since the Great Recession have become more intrusive, granular and demanding. When addressing issues of data governance, lenders, service providers and insurers find themselves trying to incorporate a wide range of issues. Some of these are time-tested best practices, while others previously were never considered.
Here is a reasonable checklist of data governance concerns to consider:

- Who owns the data governance responsibility within the organization?
- Is the data governance group seen as an impediment to change, or is it a ready part of the change management culture?
- Is the backup and retrieval discipline (redundancy and recovery) well planned and periodically tested?
- How agile and flexible is the governance structure in accommodating new data sources?
- How does the governance structure document and reconcile similar data across multiple providers?
- Are there appropriate, documented approvals and consents from the data provider(s) for all disclosures?
- Are systemic access and modification controls and reporting fully deployed and monitored for periodic refinement?
- Does the monitoring of data integrity, persistence and entitled access enable a quick-fix culture, where issues are identified and resolved at the source of the problem rather than settled by downstream processes?
- Are all data sources, including those that are proprietary, fully documented and subject to systemic accuracy/integrity reporting?
- Once obtained, how is the data stored and protected, in both definition and accessibility?
- How do we alter data and leverage the modified outcome?
- Are there reasonable audits and tracking of downstream reporting?
- In the event of a data breach, does the organization have well-documented protocols and notification thresholds in place?
- How recently, and to what extent, have all data retrieval, manipulation, usage and protection policies and processes been audited?
- Are there scheduled, periodic reports to the institution’s board on issues of data governance?

Certainly, many institutions have most of these aspects covered. However, “most” is imprecise medicine, and ill effects are certain to follow. As Paracelsus stated, “The doctor can have a stronger impact on the patient than any drug.” As in medical services, those impacts on data governance initiatives can be beneficial or harmful.
In our next blog, we’ll discuss observations of client data governance gaps and lessons learned in evaluating existing data governance ecosystems. Make sure to read the Compliance as a Differentiator perspective paper for deeper insight on regulations affecting financial institutions and how you can prepare your business. Discover how a proven partner with rich experience in data governance, such as Experian, can provide the support your company needs to ensure a rigorous data governance ecosystem. Do more than comply. Succeed with an effective data governance program.

By: Ori Eisen

This article originally appeared on WIRED.

When I started 41st Parameter more than a decade ago, I had a sense of what fraud was all about. I’d spent several years dealing with fraud while at VeriSign and American Express. As I considered the problem, I realized that fraud was something that could never be fully prevented. It’s a dispiriting thing to accept that committed criminals will always find some way to get through even the toughest defenses. Dispiriting, but not defeating.

The reason I chose to dedicate my life to stopping online fraud is that I saw where the money was going. Once you follow the money and see how it is used, you can’t “un-know.” The money ends up supporting criminal activities around the globe – not buying grandma a gift.

Over the past 10 years the nature of fraud has become more sophisticated and systematized. Gone are the days of the lone wolf hacker seeing what they could get away with. Looking back, those days seem almost simple. Not that I should be saying it, but fraud and the people who perpetrated it had a cavalier air about them, a bravado. It was as if they were saying, in the words of my good friend Frank Abagnale, “catch me if you can.” They learned to mimic the behaviors and clone the devices of legitimate users. This allowed them to have a field day, attacking all sorts of businesses and siphoning away their ill-gotten gains.

We learned too. We learned to look hard and close at the devices that attempted to access an account. We looked at things that no one knew could be seen. We learned to recognize all of the little parameters that together represented a device. We learned to notice when even one of them was off.

The days of those early fraudsters have faded. New forces are at work to perpetrate fraud on an industrial scale. Criminal enterprises have arisen. Specializations have emerged.
Brute force attacks, social engineering, sophisticated malware – all these tools, and so many more – are being applied every day to cracking various security systems. The criminal underworld is awash in credentials, which are being used to create accounts, take over accounts and commit fraudulent transactions. The impact is massive. Every year, billions of dollars are lost due to cybercrime. Aside from the direct monetary losses, customers lose faith in brands and businesses, resources must be allocated to reviewing suspect transactions, and creativity and energy are squandered chasing down new risks and threats.

To make life just a little simpler, I operate from the assumption that every account, every user name and every password has been compromised. As I said at the start, fraud isn’t something that can be prevented. By hook or by crook (and mainly by crook), fraudsters are finding cracks they can slip through; it’s bound to happen. By watching carefully, we can see when they slip up and stop them from getting away with their intended crimes.

If the earliest days of fraud saw impacts on individuals, and fraud today is impacting enterprises, the future of fraud is far more sinister. We’re already seeing hints of fraud’s dark future. Stories are swirling around the recent Wall Street hack. The President and his security team were watching warily, wondering if it was the result of state-sponsored activity. Rather than just hurting businesses or their customers, we’re on the brink (if we haven’t crossed it already) of fraud being used to destabilize economies. If that doesn’t keep you up at night, I don’t know what will. Think about it: in less than a decade we have gone from fraud being an isolated irritant (not that it wasn’t a problem) to being viewed as a potential, if clandestine, weapon. The stakes are no longer the funds in an account or even the well-being of a business. Today – and certainly tomorrow – the stakes will be higher.
Fraudsters, terrorists really, will look for ways to nudge economies toward the abyss. Sadly, the ability of fraudsters to infiltrate legitimate accounts and networks will never be fully stifled. The options available to them are just too broad for every hole to be plugged. What we can do is recognize when they’ve made it through our defenses and prevent them from taking action. It’s the same approach we’ve always had: they may get in, but we do everything possible to prevent them from doing harm.

In an ideal world, bad guys would never get through in the first place, but we don’t live in an ideal world. In the real world they’re going to get in. Knowing this isn’t easy. It isn’t comforting or comfortable. But in the real world there are real actions we can take to protect the things that matter – your money, your data and your sense of security. We learned how to fight fraud in the past, we are fighting it with new technologies today and we will continue to apply insights and new approaches to protect our future.

Download our Perspective Paper to learn about the factors contributing to the evolving fraud landscape.

By: John Robertson

I began this blog series asking the question “How can banks offer such low rates?” and exploring the relationship of pricing in today’s rate environment. I outlined a simplistic view of loan pricing as:

  Interest Income
+ Non-Interest Income
- Cost of Funds
- Non-Interest Expense
- Risk Expense
= Income before Tax

Along those lines, I outlined how perplexing it is to think that, at some of these current levels, banks could possibly make any money. I suggested these offerings must be loss leaders, made in anticipation of more business in the future or, possibly, additional deposits to maintain a hold on the relationship over time. Or, I shudder to think, banks could be short funding the loans with the excess cash on their balance sheets.

I did stumble across another possibility while proving out an old theory, and it was very revealing. The old theory, stated by a professor many years ago, was “Margins will continue to narrow… forever.” We’ve certainly seen that in the consumer world. In pursuit of proof of this theory, I went to the trusty UBPR (Uniform Bank Performance Report) and looked at net interest margin results from 2011 until today for two peer groups: insured commercial banks from $300 million to $1 billion, and insured commercial banks greater than $3 billion. What I found was that margins have, in fact, narrowed anywhere from 10 to 20 basis points for those two groups during that span, even though non-interest expense stayed relatively flat.

Not wanting to stop there, I started looking at one of the biggest players individually and found an interesting difference in their C&I portfolio. Their non-interest expense number was comparable to the others, as was their cost of funds, but the swing component was non-interest income. One line item on the UBPR’s income statement is Overhead (i.e., non-interest expense) minus non-interest income (NII). This bank had a strategic advantage when pricing their loans due to their fee income generation capabilities.
They are not just looking at spread but contribution as well to ensure they meet their stated goals. So why do banks hesitate to ask for a fee if a customer wants a certain rate? Someone seems to have figured it out. Your thoughts?
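The simple pricing identity from this post can be sketched in a few lines of code. The dollar figures below are purely hypothetical, chosen only to show how fee income (the "swing component" noted above) changes the outcome:

```python
def income_before_tax(interest_income, non_interest_income,
                      cost_of_funds, non_interest_expense, risk_expense):
    """Simplistic loan-pricing view from the post:
    interest income + non-interest income
    - cost of funds - non-interest expense - risk expense."""
    return (interest_income + non_interest_income
            - cost_of_funds - non_interest_expense - risk_expense)

# Hypothetical $1M C&I loan, annualized dollar figures (illustrative only):
ibt = income_before_tax(
    interest_income=45_000,       # 4.50% coupon
    non_interest_income=7_500,    # fees -- the swing component
    cost_of_funds=12_000,         # 1.20% funding cost
    non_interest_expense=20_000,  # overhead allocation
    risk_expense=5_000,           # expected-loss provision
)
print(ibt)  # 15500
```

With these made-up numbers, dropping the fee income to zero cuts pretax income by nearly half, which is the point of the post: a bank that can generate non-interest income can quote a thinner spread and still hit its profit target.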

By: Mike Horrocks

I am at the Risk Management Association’s annual conference in DC, and I feel like I am back where my banking career began. One of the key topics here is how important the risk rating grade is and what impact a right or wrong risk rating grade can have on the bank. It is amazing to me how a risk rating is often a shot in the dark at some institutions, or can vary with the training of one risk manager versus another. For example, you could have a commercial credit with fantastic debt service coverage tied to a terrible piece of collateral, and that risk rating grade will range anywhere from a prime-type credit (cash flow is king and the loan will never default, so why concern ourselves with collateral?) to low subprime (do we really want that kind of collateral dragging us down or in our OREO portfolio?) to anywhere in between.

Banks need to define the attributes of a risk rating grade and consistently apply that grade. Failure to do so will lead to that poor risk rating grade distorting ALLL calculations (with either an over-allocation or not enough), which then rolls into loan pricing (making you more costly than the market, or not costly enough to match the risk).

The other thing I hear consistently is that banks don’t have the right solutions or resources to complete a project like this. Fortunately, there is help. A bank should never feel like it has to do this alone. I recall the all-hands-on-deck effort when I first started out to make sure we were getting the right loan grading and loan pricing in place at the first super-regional bank I worked at, and that was without all the compliance pressure of today.

So take a pause and look at your loan grading approach: is it passing or failing your needs? If it is not passing, take some time to read up on the topic, perhaps find a tutor (or business partner you can trust) and form a study group of your best bankers.
This is one grade that needs to be at the top of the class. Looking forward to more from RMA 2014!
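To see why an inconsistent grade flows straight into the ALLL, consider a minimal sketch. The grade scale and reserve factors below are hypothetical, invented for illustration only; every institution defines its own:

```python
# Hypothetical reserve factors by risk grade (1 = best), illustrative only.
RESERVE_FACTORS = {1: 0.0025, 2: 0.005, 3: 0.01, 4: 0.02, 5: 0.05, 6: 0.10}

def alll_allocation(loans):
    """Sum balance * reserve factor over (grade, balance) pairs."""
    return sum(balance * RESERVE_FACTORS[grade] for grade, balance in loans)

# A toy portfolio of three commercial credits:
portfolio = [(2, 1_000_000), (3, 500_000), (5, 250_000)]
print(alll_allocation(portfolio))  # 22500.0
```

If the $250,000 credit is misgraded a 3 instead of a 5, its allocation falls from $12,500 to $2,500, understating the reserve, and any loan priced off that grade is priced too cheaply for the risk.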

The ubiquity of mobile devices provides financial services marketers with an effective way to distribute targeted, customized messages that appeal to a single shopper — a marketing segment of one.

By: Joel Pruis

I have just completed the first of two presentations on model risk governance at the RMA Annual Conference. The focus of the presentation was compliance with the model risk governance guidance at smaller asset-sized financial institutions. The big theme across all of the attendees at the first session was the need for resources to execute on model risk governance. Such resources are scarce at smaller asset-sized institutions, forcing reliance on external vendors to assist in the development and ongoing validation of any models in use.

With that said, the one area that cannot be outsourced is the financial institution’s model risk governance responsibility. While resources are few, we can look to existing roles within the organization to support model risk governance, such as:

- Internal Audit: reviewing process, inputs, consistency
- Loan Review: accuracy, consistency, thresholds, etc.
- Compliance: data usage, pricing consistency, etc.

Start gathering your governance team at your organization and begin the effort around model risk governance! Discover how an Experian business consultant can help with your model risk governance strategies and processes. Also, if you are interested in gaining deeper insight on regulations affecting financial institutions and how to prepare your business, download Experian’s Compliance as a Differentiator perspective paper.
Experian hosted the Future of Fraud event this week in New York City, where Ori Eisen and Frank Abagnale hosted clients and prospects, highlighting the need for innovative fraud solutions to stay ahead of the persistent threat of online fraud. Afterward, Ori and Frank appeared on Bloomberg TV, interviewed by Trish Regan, to discuss how retailers can handle fraud prevention. Ori and Frank highlighted that data, especially when combined with analytics, is a requirement for businesses working to prevent fraud now and in the future.

“Data is good. The only way that you deal with a lot of this cyber(crime) is through data analytics. You have to know who I am dealing with. I have to know it is you and authenticate that it is you that wants to make this transaction.” – Frank Abagnale on Bloomberg TV

Charles Chung recently detailed how utilizing data for good can protect the customer experience while providing businesses a panoramic view to ensure data security and compliance to mitigate fraud risk. Ultimately, this view helps businesses build greater consumer confidence and create a more positive customer experience, which is the first, and most important, prong in the fraud balance. Learn more on how Experian is using big data.

More than 10 years ago I spoke about a trend at the time toward underutilization of the information being managed by companies. I referred to this trend as “data skepticism.” Companies weren’t investing the time and resources needed to harvest the most valuable asset they had – data. Today the volume and variety of data are only increasing, as is the necessity to successfully analyze any relevant information to unlock its significant value.

Big data can mean big opportunities for businesses and consumers. Businesses get a deeper understanding of their customers’ attitudes and preferences to make every interaction with them more relevant, secure and profitable. Consumers receive greater value through more personalized services from retailers, banks and other businesses. Recently, Experian North American CEO Craig Boundy wrote about that value, stating, “Data is Good… Analytics Make it Great.” The good we do with big data today in handling threats posed by fraudsters is the result of a risk-based approach that prevents fraud by combining data and analytics. Within Experian Decision Analytics, our data decisioning capabilities unlock that value to ultimately provide better products and services for consumers. The same expertise, accurate and broad-reaching data assets, targeted analytics, knowledge-based authentication and predictive decisioning policies used by our clients for risk-based decisioning have been used by Experian to become a global leader in fraud and identity solutions.

The industrialization of fraud continues to grow, with an estimated 10,000 fraud rings in the U.S. alone and more than 2 billion unique records exposed as a result of data breaches in 2014. Experian continues to bring together new fraud platforms to help the industry better manage fraud risk. Our 41st Parameter technology has been able to detect over 90% of all fraud attacks against our clients and reduce their operational costs to fight fraud.
Combining data and analytics assets can detect fraud, but more importantly, it can also recognize good customers so legitimate transactions are not blocked. Gartner reported that by 2020, 40% of enterprises will be storing information from security events to analyze and uncover unusual patterns. Big data uncovers remarkable insights we can act on in our fraud prevention efforts, and it can also mitigate the financial losses associated with a breach. In the end, we need more data, not less, to keep up with fraudsters. Experian is hosting Future of Fraud and Identity events in New York and San Francisco to help the industry discuss current fraud trends and how to prevent cyberattacks. The past skepticism no longer holds true, as companies are realizing that data combined with advanced analytics can give them the insight they need to prevent fraud in the future. Learn more on how Experian is conquering the world of big data.

By: Joel Pruis

When the OCC put forth its supervisory guidance on model risk governance, the big focus in the industry was on the larger financial institutions that had created their own risk models. The overall intent was to make sure that the larger financial institutions were properly managing the risk they were assuming through the use of the custom risk models they had developed. While we can’t say that model risk governance was a significant issue, the guidance provided by the OCC is intended to give financial institutions the minimum requirements for model risk governance.

Now that the OCC and the Federal Reserve have gone through model risk governance reviews for the largest financial institutions in the US, their attention has turned to the rest of the group. While you may not have developed your own custom scorecard model, you may be using a generic scorecard model to support your credit decisions, for loan origination and/or portfolio management. As a result of using even generic scorecards and models, you have obligations for model risk governance as stated in the guidance. While you may not be basing any decisions strictly on a score alone, the questions you have to ask yourself are:

- Does my credit policy or underwriting guidelines reference the use of a score in my decision process?
- While I may not be doing any type of auto-decision, do I restrict any credit authority based upon a score?
- Do I adjust any thresholds or underwriting guidelines based upon a score that is returned? For example, do I allow a higher debt-to-income ratio if the score is above a certain level?
- How long have I been using a score in my decision processes, where it may have become a significant influence on how I decision credit?

As you can see from the questions above, the guidance covers a significant population of the financial institutions in the US.
As a result, some of the basic components that your financial institution must demonstrate it has done (or will do) are:

- Recent validation of the scorecard against your portfolio performance
- Demonstration of appropriate policy governing the use of credit risk models per the regulation
- Independence around the authority and review of the model risk governance and validations
- Proper support and documentation from your generic scorecard provider per the guidance

If you would like to learn more on this topic, please join me at the upcoming RMA Annual Risk Management Conference, where I will be speaking on Model Validation for Community Banks on Monday, Oct. 27, 9:30 a.m. – 10:30 a.m. or 11 a.m. – 12 p.m. Also, if you are interested in gaining deeper insight on regulations affecting financial institutions and how to prepare your business, download Experian’s Compliance as a Differentiator perspective paper.
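One common piece of evidence for validating a scorecard against current portfolio performance is a population stability index (PSI), which compares the score distribution at model development with the distribution seen today. The sketch below is illustrative; the bin shares are invented, and the 0.10/0.25 thresholds are industry rules of thumb, not regulatory requirements:

```python
import math

def psi(expected_pct, actual_pct):
    """Population Stability Index across score bins:
    sum of (actual - expected) * ln(actual / expected)."""
    return sum((a - e) * math.log(a / e)
               for e, a in zip(expected_pct, actual_pct))

# Share of accounts per score bin at development vs. today (hypothetical):
development = [0.10, 0.20, 0.40, 0.20, 0.10]
current     = [0.12, 0.22, 0.38, 0.18, 0.10]

value = psi(development, current)
print(round(value, 4))
# Common rule of thumb: < 0.10 stable, 0.10-0.25 watch, > 0.25 investigate.
```

A low PSI alone does not validate a model, but documenting it periodically, alongside discrimination measures and override analysis, is the kind of ongoing evidence the guidance expects.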

Card-to-card balance transfers represent a substantial profit opportunity for lenders.

By: Maria Moynihan

Mobile devices are everywhere, and landlines and desktop computers are becoming things of the past. A recent American Marketing Association post mentioned that there already are more than 1 billion smartphones and more than 150 million tablets worldwide. As growth in mobile devices continues, so do expectations around convenience, access to mobile-friendly sites and apps, and security. What is your agency doing to get ahead of this trend?

Allocating resources toward mobile device access and improved customer service is inevitable, and, arguably, investment and shifts in one of these areas ultimately will affect the other. As ease of access to information and services improves online or via mobile app, secure logons, identity theft safeguards and authentication measures must all follow suit. Industry best practices in network security call for advancements in:

- Authenticating users and their devices at the point of entry
- Detecting new and emerging fraud schemes in processes
- Developing seamless cross-checks of individuals across channels

Click here to see what leading information service providers like Experian are doing to help address fraud across devices. There is a way to confidently authenticate individuals without affecting their overall user experience. Embrace the change.

According to a recent 41st Parameter® study, 85 percent of consumers use online or mobile channels to conduct business.

In a recent webinar, we addressed how both the growing diversity of technology used for online transactions and the many different types of access can make authentication complicated. Technology is ever-changing and is continually reshaping the way we live. This leaves our industry to question how device intelligence factors into both the problem and the solution surrounding diverse technologies in the online transaction space. Industry experts Cherian Abraham from the Experian Decision Analytics team and David Britton from 41st Parameter, a part of Experian, weighed in on the discussion.

Putting It All Into Context

Britton harkened back to a simpler time of authentication practices. In the early days of the web, user names and passwords were the only tools people had to authenticate online identities. Eventually, this led organizations to begin streamlining the process. “They did things like using cookies or placing files onto a computer so that the computer would be ‘known’ to the business,” said Britton.

However, those original methods are now struggling to fit into the modern-day authentication puzzle. “The challenge has been that for both privacy reasons and for the advancements of technology we have actually moved to a more privacy-centric environment where those types of things have fallen away in terms of their efficacy. For example, cookies are often easily deleted by simply browsing incognito. So as a result there’s been a counter move approach to how to authenticate online,” said Britton.

New Technology – A Quick Fix?

Don’t be fooled. Newer technologies cannot necessarily provide an easy alternative or incorporate older authentication methods. Britton referenced how the advent of mobile has actually made recognizing the consumer behind the device, the behavior of the machine and the data the consumer is presenting even more complex. Additionally, rudimentary methods of authentication don’t translate well to the mobile environment.
On the other hand, newer technologies and the mobile environment force a more layered approach to authentication methods. “There is a better way, and the better way is to look at a variety of other signals beyond user names and passwords before validating the customer. This is all the more evident when you get to newer channels such as mobile, where consumer expectations are so different and you cannot rely on the customer having to answer a long stream of characters and letters such as a user name or a password,” said Abraham.

Britton weighed in as well on device intelligence and the layered approach. “Our whole philosophy around this has been that if you can recognize aspects of the device in the form of device intelligence, we’re able to actually leverage that information without crossing the boundaries of good privacy management. Furthermore, we are then able to say we recognize the attributes of the device and can recognize the device as that person is attempting to come back into an environment,” said Britton. He emphasized how being able to help companies understand who might be on the other end of the device has made a world of difference. This increasingly points to how authentication will continue to evolve in a multi-device, multi-screen and multi-channel environment.

For more information, access the full webinar, and stay tuned for additional #fraudlifecycle posts.
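The device-intelligence idea Britton describes, recognizing a returning device from the attributes it presents rather than from a stored cookie, can be caricatured in a few lines. The attribute set and exact-hash scheme below are purely illustrative and are not 41st Parameter's actual method, which scores fuzzy, partial matches across many more parameters:

```python
import hashlib

def device_signature(attributes):
    """Hash a canonical, sorted rendering of observable device attributes
    into a short stable ID. A real system tolerates partial matches;
    an exact hash is the simplest possible caricature."""
    canonical = "|".join(f"{k}={v}" for k, v in sorted(attributes.items()))
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

visit_1 = {"ua": "Mobile Safari 17", "tz": "UTC-5",
           "screen": "390x844", "lang": "en-US"}
visit_2 = dict(visit_1)  # same device returning, no cookie involved

assert device_signature(visit_1) == device_signature(visit_2)
```

Because the signature is derived from attributes the device presents on every visit, deleting cookies or browsing incognito (as the post notes) does not erase it, which is why this approach survived where cookie-based recognition fell away.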

Auto loan originations reached $153 billion in Q2 2014, which was a 16 percent increase over the same quarter last year. While the largest contribution came from captive auto lenders at $47 billion (a 14 percent increase), credit unions experienced the largest year-over-year increase of 35 percent, with originations reaching $37 billion in the latest quarter. As auto loan originations continue to grow, lenders can stay ahead of the competition by using advanced analytics to target the right customers and increase profitability. Learn how your automotive portfolio compares through the peer-benchmarking capabilities of IntelliViewSM, and view sample reports by industry. Source: Access the latest credit trends with Experian's IntelliView.
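As a quick sanity check on the growth figures quoted above, the prior-year originations implied by each growth rate can be backed out directly (an illustrative calculation; the implied figures are derived here, not stated in the source):

```python
def implied_prior(current, growth_rate):
    """Back out the prior-year figure implied by a current value
    and its year-over-year growth rate: prior = current / (1 + g)."""
    return current / (1 + growth_rate)

# Q2 2014 originations ($B) and YoY growth quoted in the post:
print(round(implied_prior(153, 0.16), 1))  # total market: ~131.9
print(round(implied_prior(37, 0.35), 1))   # credit unions: ~27.4
```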

According to a recent Experian Data Quality study, three out of four organizations personalize their marketing messages or are in the process of doing so.