“Simplify” and “Unify”: Possibly the Two Most Compelling Words in IT Security

I have been watching the news surrounding the Heartbleed vulnerability closely. A recent American Banker article, entitled “FFIEC Issues Heartbleed Warning; Major Banks Say They’re Protected,” states that most of the banks and online banking vendors contacted for comment report that their sites and software are not at risk. While it’s commendable that these organizations don’t want to alarm the public unnecessarily, no one yet knows the full reach of Heartbleed, or the full extent of what may have been compromised. Every day brings breaking news of new exploits cropping up, from the UK to Canada.

Clearly, the core banking software vendors like Fiserv, Jack Henry and D+H USA (formerly Harland Financial Solutions), as well as the many online banking providers, go to extraordinary lengths to protect their financial data. Likewise, the big banks mentioned in the article have vast IT resources, both people and infrastructure, to address vulnerabilities. Overall, the industry is doing everything it knows how to do with the tools and systems it currently has to reduce its vulnerabilities. And therein lies the problem: today’s technological approach to IT and network security has changed little over the last two decades, while everything that touches it has changed dramatically.

There are too many “things” to manage: users and identities, systems, applications, and data. Today’s solutions are mostly passive in nature, providing primarily ‘after-the-fact’ forensic analysis. Relying on perimeters, rules and signatures, even when they are proactively managed and regularly updated, is too static. And just how many more appliances can you afford to put in your datacenter every time a new business process is introduced or a new threat crops up? So as the gap between today’s security architectures and service delivery models widens, the impact of intrusions, insider attacks, and other exploits will grow, not diminish.

The ideal is technology that detects a threat before it can be exploited, or at the very least as the exploit is occurring. To get there, IT security professionals operating in increasingly complex, ever-growing big data environments need tools that simplify and unify.

By integrating data from every source, our MetaGrid™ technology is able to perform comprehensive collection and analysis of data from all of your network and security devices, enable “stream-time” security analysis, and provide the means to develop a proactive, automated security response that would otherwise be impossible. And instead of tossing out the existing investment in firewalls, intrusion prevention sensors, anti-malware scanners, routers, identity management technologies and other security appliances, MetaGrid™ was specifically designed as a transitional technology that can co-exist with existing security infrastructure, enhancing the performance and shelf life of an organization’s current systems and tools.
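
To make the idea of unified, cross-device correlation concrete, here is a minimal sketch in Python (not MetaGrid™ itself; the event schema, field names and thresholds are all hypothetical) of how events from disparate sources might be normalized into one stream and correlated by identity:

```python
from collections import defaultdict
from datetime import timedelta

# Map each device's raw record into a single hypothetical schema so that
# firewall, IPS and identity-management events can be analyzed together.
def normalize(source, raw):
    return {
        "time": raw["timestamp"],                 # a datetime
        "source": source,                         # e.g. "firewall", "ips", "idm"
        "identity": raw.get("user", "unknown"),
        "action": raw["action"],                  # e.g. "deny", "alert", "login_fail"
    }

# Flag any identity that trips several *different* devices inside a short
# window: events that look minor in isolation become one strong signal.
def correlate(events, window=timedelta(minutes=5), min_sources=3):
    recent = defaultdict(list)
    for ev in sorted(events, key=lambda e: e["time"]):
        bucket = recent[ev["identity"]]
        bucket.append(ev)
        while ev["time"] - bucket[0]["time"] > window:
            bucket.pop(0)                         # expire events outside the window
        sources = {e["source"] for e in bucket}
        if len(sources) >= min_sources:
            yield ev["identity"], sorted(sources)
```

A failed login, an IPS alert and a firewall deny mean little individually; tied to the same identity within five minutes, they are worth a human’s attention.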

For financial institutions to shift the balance of power away from the attacker, it’s going to take a fresh approach. By combining high-performance computing and identity-awareness with advanced neural, behavioral and social analysis and rapid response capabilities, financial institutions will be able to gain the upper hand and better defend themselves against the likes of Heartbleed or any other attack that comes along, now or in the future.

Banks Will Never Eliminate Vulnerabilities Without a Unified Security Strategy

In a recent Bank Technology News article titled “Banks Urged to Beef Up Defenses Against DDoS Attacks, ATM Fraud,” the author reports how the Federal Financial Institutions Examination Council is “urging” (quotes mine) banks to establish better security controls related to ATM fraud, as well as distributed denial-of-service (DDoS) attacks. Not surprisingly, Verizon has ranked ATMs and file servers among the top three bank assets most vulnerable to cyberattack.

With more than 30 years in the financial services industry, I continue to see banks of all sizes struggle to protect themselves. I have also seen how, in response to these threats, CIOs and CISOs have resorted to deploying a wide array of point solutions from a whole host of vendors. This has created a mishmash of disparate, often redundant systems and an IT environment rife with constant upgrades, custom middleware, proprietary interfaces, incompatible databases, operational silos—and frankly, a lot of money paid to IT consultants. This is the reality of operating within the confines of today’s legacy and signature-based security solutions.

Despite the billions spent on IT security each year, organizations find themselves more and more vulnerable to attack. Until this environment changes, there will continue to be major breaches at the large financial institutions, many of which have the potential to precipitate serious financial repercussions on a global scale. Within the small to mid-size bank arena, breaches could ultimately wipe out entire local and regional banks, and could have a ripple effect on key aspects of the U.S. financial system.

The vulnerabilities of the banking sector, as well as other information-driven industries, will only be reduced with technology designed to overcome the limitations of legacy systems. Specifically, organizations must eliminate redundant systems, reduce hardware upgrades, unify disparate data across the IT infrastructure, correlate critical data of all types and from all sources, and protect their data assets from the “unknown unknowns.”

Simply put, the key to data security is looking at all things, all the time, without rules or signatures. If organizations continue to try to manage their security through 50-plus pieces of disparate hardware—SIEMs, firewalls and IPS appliances, for example—where each “box” comes from a different vendor and is designed to look for only one possible “bad” thing, no organization, least of all a bank, will ever be secure.
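
To illustrate what “looking at all things without rules or signatures” can mean in practice, here is a minimal sketch of baseline-and-deviation analysis: learn what normal looks like for a metric, then flag departures. The streaming z-score below (Welford’s online algorithm over hypothetical per-minute event counts) is illustrative only, not a description of any vendor’s product:

```python
import math

class StreamingBaseline:
    """Online mean/variance (Welford's algorithm) over a metric stream."""

    def __init__(self):
        self.n, self.mean, self.m2 = 0, 0.0, 0.0

    def update(self, x):
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)

    def zscore(self, x):
        if self.n < 2:
            return 0.0
        std = math.sqrt(self.m2 / (self.n - 1))
        return 0.0 if std == 0 else (x - self.mean) / std

baseline = StreamingBaseline()
for count in [12, 9, 11, 10, 13, 11, 97]:      # hypothetical events per minute
    if abs(baseline.zscore(count)) > 3:        # flag anything far from baseline
        print(f"anomaly: {count} events/minute")
    baseline.update(count)
```

No signature says “97 events per minute is bad”; the spike is flagged simply because it departs from everything seen before.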

We must move toward an integrated security strategy, leveraging advances in technology developed just in the last few years. Considering that such technology can co-exist within an organization’s current IT infrastructure without discarding the investment already made, it has the potential to be the answer—and the future—for an industry struggling to gain the upper hand against these attacks. With the skyrocketing costs associated with a breach, combined with reputational damage that can last for years, banks can no longer take the hit of lost earnings and exposure in today’s volatile economic climate. It’s time for change.

Better Technology, Not Necessarily More People, Will Ultimately Solve the Data Scientist Shortage

The job of data scientist may recently have been dubbed “The Sexiest Job of the 21st Century” by a Harvard Business Review article, but it could just as well be called “The Most Important Job of the 21st Century”: in today’s volatile business and geopolitical environment, data scientists have a critical role to play in making our world more secure.

Whatever the label, the fact is that we face a looming data scientist shortage. A McKinsey report estimates that by 2018, the U.S. could be short 140,000 to 190,000 data scientists, as well as 1.5 million managers and analysts, leaving a huge gap in the number of people who can actually understand, interpret and make decisions based on data.

While time will enable us to cultivate more qualified data scientists in our colleges and universities, we simply don’t have that luxury. The more feasible—and more immediate—option is to free up existing data scientists to do what they are trained to do: analyze the data and make smart, data-driven decisions. This can only be done by giving them technology far more capable of distilling meaningful, actionable data than today’s legacy systems.

Specifically, they need tools that can fully automate time-consuming (and often manual) tasks such as data cleansing and correlation in today’s big data environments. They need technology that will enable them to shift from a reactive, rules-based, forensic “rear-view mirror” posture to a preemptive one. They need solutions that will automate the collection and analysis of operational data of any type and variety, at any volume and at any velocity. And they need to replace the many disjointed, non-integrated security solutions within their infrastructure with a unifying system that can handle the threats of today’s increasingly sophisticated attacker.
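
As a small illustration of the cleansing-and-correlation drudgery worth automating, consider two devices logging the same network in different shapes. This is a hedged sketch with made-up column names; real pipelines contend with thousands of formats:

```python
import pandas as pd

# Hypothetical raw logs from two devices with inconsistent schemas.
fw = pd.DataFrame({"ts": ["2014-04-10 09:01", "2014-04-10 09:03"],
                   "src_ip": ["10.0.0.5", "10.0.0.9"], "act": ["DENY", "deny"]})
ids = pd.DataFrame({"timestamp": ["2014-04-10 09:02"],
                    "source": ["10.0.0.5"], "signature": ["port-scan"]})

# Cleansing: normalize column names, types and value casing.
fw = fw.rename(columns={"ts": "time", "src_ip": "ip", "act": "event"})
ids = ids.rename(columns={"timestamp": "time", "source": "ip", "signature": "event"})
for df in (fw, ids):
    df["time"] = pd.to_datetime(df["time"])
    df["event"] = df["event"].str.lower()

# Correlation: one unified, time-ordered view across both sources.
unified = pd.concat([fw.assign(device="firewall"), ids.assign(device="ids")])
print(unified.sort_values("time").groupby("ip")["device"].nunique())
```

Done by hand across dozens of devices, this is exactly the work that keeps a data scientist from analyzing anything.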

As part of the big data analytics technology community, it is our duty and responsibility to give these data scientists next-generation solutions that prevent incidents before they ever occur. While we may not be able to solve the shortage itself, we can certainly equip the people working on the front lines—whether inside a Fortune 500 company or on the battlefield—to perform their jobs more effectively and efficiently.

And when that happens, tragedies like the two Fort Hood shootings, scandals like the NSA and WikiLeaks disclosures, and the exposure of millions of people’s financial information in the Target breach, to name a few, may become far less frequent. At Red Lambda, we believe such a day is imminent…

Financial Organizations—Leverage Big Data Analytics to Reclaim the Public Trust

According to a report by global public relations firm Edelman, financial services and banking are the two least trusted industries in the United States. According to the survey, called the “2012 Edelman Trust Barometer: U.S. Financial Services and Banking Industries,” only 46 percent of the American general public trusts financial services, just one percentage point above the global average of 45 percent. And the numbers are even worse among respondents in the 35-to-64 age group.

While these statistics are an improvement over last year’s report, when only 25 percent of Americans trusted the financial and banking sectors, the level of distrust remains extraordinarily high. High, but not necessarily surprising, considering the erosion of public perception brought about by the economic crisis that began in 2008, as well as by the many major data breaches of the last five years. Clearly, financial institutions have a long way to go to regain the public’s trust.

What I did find interesting in that same study, however, is that the technology sector tops the list of the most trusted industries, at 83 percent. I believe there is tremendous opportunity in that statistic.

Since “ethical business practices,” “privacy,” and “listening to customer needs and feedback” were cited in the Edelman survey as the most important actions financial organizations can take to improve trust, every banker needs to ask how data analysis and data discovery tools can deepen their understanding of their customers, and how that insight can be turned into products and services customers truly want and need. When customers feel that a financial services firm acts in their best interest—what the industry refers to as “customer advocacy”—they are willing to invest more, borrow more, and buy more products and services. They are also far less likely to seek out another financial institution.

By leveraging today’s advanced Big Data technologies—particularly analytics and artificial intelligence—financial institutions have a tremendous opportunity to rebuild and regain the public’s trust in tangible ways, from better protecting sensitive data and reducing cybercrime, to providing individually customized products and services.

Using the right data discovery and analytic tools, a banker can not only secure the privacy of sensitive information and retain and grow the revenue of existing customers, but also determine which products and services match a very particular, finite demographic group. Developing such fine-tuned security and marketing is very difficult—if not impossible—with traditional forensic-based security and data discovery methods. Instead, banks can use analytics and artificial intelligence in many ways, from identifying micro-targeted groups or individuals for marketing programs, to ensuring their most valued, high-net-worth customers are being served and fully secured.
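
For the micro-targeting half of that claim, here is a hedged sketch of how unsupervised clustering might surface fine-grained customer segments. The data is synthetic and the features hypothetical; a real deployment would need far richer inputs and strict privacy controls:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Synthetic customers: [age, average monthly balance, products held].
rng = np.random.default_rng(0)
customers = np.vstack([
    rng.normal([30, 2_000, 1.5], [4, 500, 0.5], size=(50, 3)),      # young savers
    rng.normal([55, 60_000, 4.0], [5, 10_000, 1.0], size=(50, 3)),  # high net worth
    rng.normal([42, 9_000, 2.5], [6, 2_000, 0.8], size=(50, 3)),    # mid-market
])

# Scale features so balance doesn't dominate, then cluster into segments.
X = StandardScaler().fit_transform(customers)
segments = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)

for s in range(3):
    group = customers[segments == s]
    print(f"segment {s}: n={len(group)}, mean balance=${group[:, 1].mean():,.0f}")
```

The same segmentation serves both sides of the argument: marketing learns which offers fit which group, and security learns which accounts warrant the closest watch.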

Today’s financial institutions should look to deploy Big Data technologies that enable them to fully exploit the data they already possess in order to capture new business opportunities, increase profits and drive innovation. By doing so, financial institutions would go a long way toward renewing their once-vaunted reputation as a trusted partner.

Banks Need to Think “Smart”—Not Necessarily Big

Having spent 40 years building and delivering core banking software to the financial services industry, I know that financial institutions possess a tremendous amount of data as a result of processing transactions on a massive scale. Most of these organizations have incorporated Big Data strategies into their operations to accomplish everything from learning about customer behavior and developing marketing programs, to increasing retention and loyalty, to improving efficiency and increasing return on assets. Credit card companies have deployed predictive analytics to manage credit lines and collections. Insurance companies use predictive analytics to set premiums. And nearly all banks, insurance companies and government agencies have turned to analytics to root out fraud.

The challenge, however, is that until now, the analytics have been forensic in nature—after the fact.
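
To make the forensic-versus-predictive distinction concrete, here is a toy sketch: rather than explaining a fraud after it has happened, a model trained on past outcomes scores each new transaction before it settles. The data is synthetic, and the features and threshold are invented for the example:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Synthetic history: [amount ($), hour of day, distance from home (km)].
rng = np.random.default_rng(1)
normal = rng.normal([80, 14, 5], [40, 4, 10], size=(500, 3))
fraud = rng.normal([900, 3, 400], [300, 2, 150], size=(25, 3))
X, y = np.vstack([normal, fraud]), np.array([0] * 500 + [1] * 25)

# Forensic analytics would explain X after the fact; a predictive model
# scores each *incoming* transaction while there is still time to act.
model = LogisticRegression(max_iter=1000).fit(X, y)

incoming = np.array([[1_150.0, 2.0, 520.0]])   # hypothetical new transaction
risk = model.predict_proba(incoming)[0, 1]
if risk > 0.5:
    print(f"hold for review (fraud risk {risk:.0%})")
```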

Top 7 Big Data Trends for Financial Services


It’s a new year, ripe with new opportunities as well as new challenges. Here are some of the top trends and observations surrounding Big Data and, more specifically, the financial services sector:

  •  The financial services sector needs to source from larger data sets and work at an even deeper level of granularity to develop better predictive and behavioral models. Numerous data sources, from social networking demographics to consumer micro-payment histories, can be made available and used to greatly improve these models.

Innovation and Cooperation—Not Regulation—Will Help Us Win the Cybersecurity War

With the presidential election literally days away, it’s critical to our national defense posture that, whichever candidate is elected, the issue of cybersecurity rise to the surface again with a sense of urgency, priority and gravitas—but without the threat of costly, overzealous regulation.

In August, the Cybersecurity Act of 2012 came before the Senate but fell short of the 60 votes needed to pass. Not surprisingly, support for the bill was divided between those who believe government should set standards and regulations to make the private sector improve security, and those who argue that regulation would threaten privacy and civil liberties. The latter also contend that the Federal government isn’t nimble enough to keep pace with the evolving threat landscape, and most in this camp believe regulation would stifle or even harm innovation in the private sector and thwart progress in this critically important field.

As a business owner and entrepreneur leading Red Lambda, a new company making inroads into the Big Data and cybersecurity arena, I quite obviously fall into the latter group. I do not believe we can regulate our way out of today’s cybersecurity challenges.

While we all recognize that regulations are necessary for the smooth functioning of our society, it’s easy to look back into our recent legislative history and find cases where the heavy hand of government—despite good intentions—didn’t prevent the problem it was intended to solve, and in fact made the situation far worse.

Security: The Need for the Bigger Picture

“The whole is greater than the sum of its parts.”

          Aristotle

I love that quote from Aristotle. It is a beautiful description of how individual parts, combined to form something new, become greater than the parts themselves. Network security and data mining could benefit from the application of this concept. The industry lacks the holistic approach every CSO dreams of achieving—and that’s one reason we see so many security breaches today.

Start by looking at the security incidents that happen every week. In just one seven-day period in August, two major organizations were compromised. Blizzard Entertainment released a statement to its customers about unauthorized access to account information, urging users to change their passwords. Saudi Aramco, the world’s largest oil company, announced that its internal network had been compromised by a computer virus infecting personal workstations.

Tackling Big Data with Grid Computing

When not working with customers on their IT security needs, I volunteer as a coach and administrator for a youth football club. In both environments, I continue to learn new techniques and capabilities that enable me to help others succeed. What I love about team sports and athletic competition is how their analogies carry over into our professional lives. How many times have you heard the terms “Hail Mary” or “home run” used in business? My focus here is how using MetaGrid™ (the grid computing platform from Red Lambda) is similar to the role of an offensive coordinator in football.

Is Time More Valuable Than Money?

“We really need to take on this project, but I don’t have the resources.
There aren’t enough hours in the day to tackle that problem.”

I’ve heard comments like these for many years throughout my career in the network and security sectors, and now in Big Data.

The “time vs. money” dilemma is common. You know what is needed to solve a problem, but there are not enough hours in the day to commit to solving it. Countless vendors offer interesting point solutions that can help an organization tackle a particular problem. This is how firewalls, load balancers, IPS appliances and WAFs have staked their claims and won customers. However, each solution is unique, requiring its own expertise and a significant investment of time for configuration, monitoring and management. More often than not, the cost to the organization ends up outweighing the benefit. As frustrating as that may be, as Ben Franklin said, “time is money.”