Changes in Risk Management for BFSI companies demands rapid action

Chris Skinner, author of books like Digital Bank and ValueWeb, says: “Now we’re seeing what I call ‘the complete open sourcing of financial services’ through apps, APIs and analytics. So the front-office relationship is in an app, the middle-office processing is through an API, and the back office is all about analytics.”

The sheer amount and pace of change in banking and financial services over the last decade has been mind-numbing, and near nightmarish for risk managers.

Even as banking has become fast, easy and personalized, the tolerance for errors and dishonest business practices has dramatically decreased (rightly so). While digital transformation has opened new business models for financial services companies, customers’ expectations of banking services have risen tremendously. Risk functions in banks now have to manage new types of risk, including model risk and cyber risk, besides ensuring compliance with ever-evolving regulations. They are also expected to deal with these trends at lower cost, because banks (like other services companies) expect new technologies to reduce their operating costs substantially.

The good news: data engineering and advanced analytics are enabling new products, services and risk-management techniques, helping risk managers make better choices about risk. The challenge, of course, is finding the right solution that can scale with the organization, cover all the bases, integrate seamlessly with the bank’s enterprise systems, and do all of this cost-effectively.

Let’s take a look at some of the key technology trends in banking.

Winning customers in the highly competitive, globalized banking and financial services industry is a battle that is increasingly being fought on the digital front. As digital technologies rapidly change life and work in every other sphere, customers expect intuitive experiences, access to services at any time on any device, customized propositions, and instant decisions from their banking too. This entails re-imagining the bank or financial services company from a customer-experience perspective and digitizing it end to end. The risk function plays a critical role here, collaborating with the business and technology functions across the entire transformation journey.

Automation in Compliance
Omni-channel banking has thrown up a challenge: how to accurately validate the identity of people applying for new accounts or performing transactions. Whatever channel is used, before a bank approves a new account or a transaction, it must draw data from multiple, disparate sources, analyze it, and surface the risks quickly enough for informed decision-making. Digitized underwriting processes and the increasing use of data analytics are visible signs of automation in compliance.

Even as regulation becomes more complex and noncompliance less tolerated, banks have to eliminate manual intervention in customer-facing risk processes and seamlessly connect the right behaviors to products and services. Quite simply, automation in compliance is the most reliable way to ensure accurate oversight (and it can save millions).

Real-time decisions and service
Gone are the days of filling up laborious application forms and surviving long IVR-driven calls. Banks now have to offer real-time answers to customer requests through customized processes. As risk managers look for ways to help banks assess risks and make decisions without human intervention, they have to contend with a growing set of non-traditional data sources. For example, some banks have redesigned account opening so that much of the required data is pre-populated from public sources, making the experience as simple, fast and short as possible. However, establishing a secure yet customer-friendly approach to identification and verification then becomes yet another challenge for the risk manager.

Big Data
A humongous amount of customer data is now available and accessible to banks, including payment and spending behavior, social-media presence, and online browsing activity, to aid risk-intelligent decision-making. Companies have started using this external, unstructured data not only for better credit-risk decisions, but also for portfolio monitoring and profitability prediction.

Machine Learning powered Analytics
Machine learning identifies complex, nonlinear patterns in large data sets and surfaces insights that make more accurate risk models possible. These models learn from each new piece of information they acquire, continuously improving the risk function’s ability to predict. Several banks and financial services companies have started using machine learning, especially in credit rating, collections, and credit-card fraud detection.
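To make the idea of a model that "learns with new information" concrete, here is a minimal sketch of an online fraud scorer: a tiny logistic model updated one confirmed transaction at a time via stochastic gradient descent. The feature names and thresholds are illustrative assumptions, not any bank's actual model.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

class OnlineFraudScorer:
    """Tiny logistic model updated one transaction at a time (SGD)."""
    def __init__(self, n_features, lr=0.1):
        self.w = [0.0] * n_features
        self.b = 0.0
        self.lr = lr

    def score(self, x):
        # Probability-like fraud score in (0, 1)
        return sigmoid(sum(wi * xi for wi, xi in zip(self.w, x)) + self.b)

    def update(self, x, label):
        # label: 1 = confirmed fraud, 0 = legitimate
        err = self.score(x) - label
        self.w = [wi - self.lr * err * xi for wi, xi in zip(self.w, x)]
        self.b -= self.lr * err

# Hypothetical features: [normalised amount, is_foreign, odd_hour]
scorer = OnlineFraudScorer(n_features=3)
confirmed_cases = [
    ([0.1, 0, 0], 0), ([0.9, 1, 1], 1),
    ([0.2, 0, 1], 0), ([0.8, 1, 0], 1),
]
# The model keeps improving as investigators confirm more cases
for _ in range(200):
    for x, y in confirmed_cases:
        scorer.update(x, y)

high_risk = scorer.score([0.85, 1, 1])  # large foreign transaction
low_risk = scorer.score([0.15, 0, 0])   # small domestic transaction
```

In production this pattern is what lets the risk model improve continuously: each confirmed fraud or false alarm becomes a labelled update rather than waiting for a quarterly retrain.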

Use of advanced analytics is not just about risk; it is about serving customers with excellence too. To quote Chris Skinner again: “This, to me, is the battleground when I’m talking about the digital revolution, the digital human, the digital bank: If you do not get cognitive, predictive, proactive, custom analytics that give the customer a far more informed view about their financial affairs, you will not be the partner for that customer in their financial future.”

Banks and financial services companies themselves run such large technology functions today that many of them could be called fintech companies. When they look for data engineering and advanced analytics expertise, they need a partner who understands the industry and the risk function, has experience delivering cutting-edge, comprehensive, cost-effective solutions, and can cover all the bases discussed above. At TIBIL Solutions, we have done exactly that, and we are continuously evolving our solutions. Ask for a demo.

Empirical decision making for business excellence

In sales, marketing and business development, as well as business strategy in general, taking decisions based on data (“hard facts”, as it was referred to in an earlier era) is nothing new. In an HBR story, Kristina McElheran and Erik Brynjolfsson opine that at their most fundamental level, all organizations can be thought of as “information processors” that rely on the technologies of hierarchy, specialization, and human perception to collect, disseminate, and act on insights.

We create strategies and take decisions, especially in marketing, based on certain numbers and trends, and on assumptions built on those. Two sweeping changes in the last decade have fundamentally altered the way we use data to make decisions: (1) the opportunities to collect and leverage data have changed dramatically with the advent of digital technologies; (2) the very characteristics of data, its velocity, volume, variety and veracity, have changed even more dramatically, again thanks to digital technologies.

How do these two shifts affect decision-making? Statistics and technology are being combined to make sense of the huge amount of data at our disposal today: to access it, pinpoint observations, craft insights around them, and create actionable steps that enhance decision-making. Products and services are being shaped around our understanding of the data, not just in the way we target and reach customers, but also in the way we market our services. We are not talking about incremental change here; we are talking about a paradigm shift.

The way to be smarter in this new journey is to go beyond the excitement of how data is positively changing business. As the value of your data increases, it needs to be managed so that it remains consistent, reliable and useful. When choosing a technology partner to help you navigate the challenges of big data and make data work for you, keep the following in mind:

  • You require data from all internal and external sources, legacy and current, structured and unstructured, across its different types and forms; you need to standardize it to make it analytics-ready, and make it easily and dynamically accessible to different users within your enterprise (from data analysts to marketing managers to sales staff in the field). This means moving away from traditional methods of extracting, loading and processing data to more agile and scalable ones (NoSQL/NoETL) and a cloud-based data management solution, without losing data integrity.
  • You have to put the data you capture and profile into context to draw the right insights from it. This requires capturing customer interactions with your brand event by event. No wonder, then, that the marriage of artificial intelligence and data analysis is one of the hottest data trends.
  • You will want to adjust your marketing, product or business strategies in real time to take advantage of perceivable trends. This requires the entire data engineering and analytics value chain to be agile, flexible, platform-agnostic, responsive, and integrated with your enterprise technology ecosystem.
  • Staying ahead of the curve requires using predictive analytics to understand patterns in data and basing business decisions on that pattern analysis. By intelligently leveraging artificial intelligence and machine learning, predictive analytics becomes more reliable and robust.
  • Visual analytics helps you grasp the finer details of the data and makes your decision-making faster.
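The first point above, standardizing data from disparate sources to make it analytics-ready, can be sketched in a few lines. The feed names, field names and mapping functions below are hypothetical; the point is that every source, however it arrives, is normalized into one common record shape before analytics ever sees it.

```python
import csv
import io
import json

# Two hypothetical feeds: a legacy CSV export and a JSON event stream.
legacy_csv = "cust_id,txn_amt,txn_dt\nC001,2500.00,2024-01-15\n"
event_json = '{"customer": "C002", "amount": 1200.5, "date": "2024-01-16"}'

def from_csv(text):
    """Map legacy CSV columns onto the standard record shape."""
    for row in csv.DictReader(io.StringIO(text)):
        yield {"customer_id": row["cust_id"],
               "amount": float(row["txn_amt"]),
               "date": row["txn_dt"]}

def from_event(text):
    """Map a JSON event onto the same standard record shape."""
    e = json.loads(text)
    yield {"customer_id": e["customer"],
           "amount": float(e["amount"]),
           "date": e["date"]}

# Analytics-ready: one schema regardless of origin, so any user
# (analyst, marketer, field sales) queries the same structure.
records = list(from_csv(legacy_csv)) + list(from_event(event_json))
```

Each new source only needs its own small mapping function; the downstream analytics never changes, which is what makes the pipeline agile and scalable.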

Your search for the best data engineering and advanced analytics expert, one who can enable you to understand data, manage it, make it easy to work with, and lead with it, will end at TIBIL Solutions.

Our cognitive, cloud-ready Data Lake solutions help you translate data into competitive edge. We enable organizations to make intelligent decisions in real time by integrating various data sources into a data lake and developing an analytics layer that leverages ML/AI algorithms. Check out how we helped leading global organizations. Download our corporate profile. Schedule a demo.

Your Technology, Analytics and Marketing teams are not alone!

Every sector of the economy perceives data as the magic potion: a super-value resource which, when used smartly, delivers a winning edge. Over the last decade, some of the key technology investments organizations made have been in the area of ‘Big Data.’ Even today, amidst all the excitement surrounding the opportunities big data holds, we see teams across development, analytics, and marketing ‘grappling’ with the data rather than gleaning powerful insights from it. If your organization is among them, know that you are not alone; and, more importantly, know that you need to get out of that logjam fast.

In a 2017 survey by NewVantage Partners, 95 percent of the Fortune 1000 business leaders surveyed said that their firms had undertaken a big data project in the last five years. Less than 50 percent said that their big data initiatives had achieved measurable results!

The Gartner Marketing Analytics Survey 2018 found that the average marketing analytics team grew from a couple of people a few years ago to 45 full-time employees (FTEs). Yet when asked which activities marketing analysts spend the majority of their time on, data wrangling topped the list, along with data integration and formatting.

Big Data and Business Intelligence

Every enterprise needs a technology-oriented process for analyzing data and presenting actionable information that helps its people, management, and customers make more informed business decisions. For this, it needs to analyze large data sets (big data) containing a wide variety of data types in order to reveal unseen patterns, unknown relationships, customer interests, and new marketing strategies.

What actually matters is converting the data into information and extracting valuable insights from that information. Existing analytical techniques are not fully equipped to extract useful information in real time from the huge volume of data that arrives from diverse sources in different forms. So much so that, quite often, beneath the desire to use the widest possible set of data to support decisions lies great anxiety about the veracity of that data.

We do know that big data analytics plays an important role in making businesses more effective, helping achieve better customer engagement and satisfaction as well as operational efficiencies. The key objective is to help data scientists, analysts and business teams make effective decisions by analyzing the huge volumes of transactional and other data, something that was not possible with conventional business intelligence tools.

The challenges that undermine your Big Data projects

Let us look at data storage and management. For decades, the most prevalent method of storing and managing data has been the relational database management system (RDBMS). However, an RDBMS works effectively only for structured data; it falls short when dealing with semi-structured or unstructured data, and it struggles with very large or heterogeneous data sets.

The big challenge is extracting hidden, valuable information from big data, because traditional database systems and data mining techniques do not scale to it. Systems need massively parallel processing architectures and distributed storage to cope with big data.

The other challenge is curation. To form better business strategies, professionals need relevant, cleaned, accurate, and complete data (in short, managed data) to perform analysis. Managing data includes tasks such as cleaning, transformation, classification, dimension reduction, and validation.

Let’s talk storage. Since big data runs into terabytes and existing storage capacity is usually limited, it is not easy for enterprises to decide which data is of greater value, which data is irrelevant, and which optimal subset of attributes can represent the whole dataset.

Then we have processing. Data arrives from multiple sources at high velocity and needs to be processed in real time.

Data loading is another issue. Enterprises need to bring data from multiple heterogeneous sources into a single repository. These sources must be mapped to a unified structural framework, with tools and infrastructure that can support the size and speed of big data and transfer data in real time.

Finally, there is the need for interactiveness: multiple users with diverse needs have to mine the data they need, in the form they need it.
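The scalability point above, replacing a single overloaded server with distributed storage and parallel processing, rests on one simple mechanism: partitioning records across nodes by a stable hash of a key, so each node can be scanned independently and the partial results merged. Here is a minimal in-process sketch (Python lists stand in for real storage nodes; the field names are illustrative).

```python
import hashlib

N_SHARDS = 4
shards = [[] for _ in range(N_SHARDS)]   # stand-ins for distributed nodes

def shard_for(key):
    # Stable hash: the same customer always lands on the same node,
    # and keys spread roughly evenly across nodes.
    digest = hashlib.md5(key.encode()).hexdigest()
    return int(digest, 16) % N_SHARDS

def insert(record):
    shards[shard_for(record["customer_id"])].append(record)

# Load 1000 records; no single "server" holds them all
for i in range(1000):
    insert({"customer_id": f"C{i:04d}", "amount": float(i)})

# Map: each shard is scanned independently (in parallel on real nodes).
partials = [sum(r["amount"] for r in s) for s in shards]
# Reduce: merge the partial results into the final answer.
total = sum(partials)
```

This map-then-reduce shape is what lets capacity grow by simply adding nodes, instead of scaling up one machine.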

It’s no dark street

At TIBIL, we solve the puzzle of the sheer volume, veracity, velocity and variety of data through our own unique integrated approach: NoSQL, NoETL, distributed computing, and ML/AI. Our prescriptive, cloud-ready, cognitive, agile and expandable Data Lake solution, Dattaveni, helps you overcome these challenges and lets big data deliver all the opportunities and benefits it promises.

What does an integrated, real-time data management solution look like? It has to seamlessly integrate with your enterprise systems. It should enable access to data from your internal systems (ERP, CRM etc.) and external data (like social or weather feeds) in real time. It has to draw insights from your legacy data. It should be the platform for your cognitive tasks. It should allow you to scale with new data sources for changing business needs. It should also be your business intelligence system with no additional load. That’s our Data Lake solution, Dattaveni.

Want to know more? Give us a shout.

Is your traditional ETL process up for data-driven decision making?

Did you know that the trigger for developing business intelligence systems goes back to the early Cold War era? In his seminal article, “A Business Intelligence System” (1958), Hans Peter Luhn of IBM described business intelligence as “an automatic system…developed to disseminate information to the various sections of any industrial, scientific, or government organization.” In the post-World War II race for development, these sectors required a way to organize and simplify the rapidly growing mass of technological and scientific data.

This establishes one fact loud and clear: the way we use data for decision-making is a game changer for growth. Today, we use a lot of terminology to denote this simple truth that we discovered as early as the 1950s. The big difference now is the need for data-driven decision-making in real time. The big challenge is to gather and aggregate data from a multitude of sources in a seamless, integrated fashion; process it, contextualize it, personalize it, analyze it and bring out sharp insights on the go. This is not as daunting as it may seem. What would be daunting is trying to achieve it with traditional systems of data warehousing, ETL and business intelligence.

Have you encountered this? Production systems generate data continuously, but nobody uses that data in real time for fear of disturbing the production systems. When data from multiple enterprise products has to be aggregated, it is done offline. Structured and unstructured data rarely come together. Analytical tools are static and get updated periodically at best. Are we really talking about data-driven decision-making here?
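The alternative to nightly batch ETL is incremental aggregation: keep the running aggregates up to date as each event arrives, so a dashboard can be read at any moment without rescanning the source systems. A minimal sketch (the event fields and product names are illustrative):

```python
from collections import defaultdict

class RunningStats:
    """Per-key aggregates maintained event by event, no nightly batch ETL."""
    def __init__(self):
        self.count = defaultdict(int)
        self.total = defaultdict(float)

    def ingest(self, event):
        # O(1) update per event: the production system is never rescanned
        k = event["product"]
        self.count[k] += 1
        self.total[k] += event["amount"]

    def mean(self, k):
        return self.total[k] / self.count[k]

stats = RunningStats()
events = [{"product": "loan", "amount": 100.0},
          {"product": "loan", "amount": 300.0},
          {"product": "card", "amount": 50.0}]
for e in events:
    stats.ingest(e)   # a dashboard can read stats after every event

loan_avg = stats.mean("loan")
```

Because every aggregate is always current, "real time" stops being a special report and becomes the default state of the analytics layer.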

The due shift away from SQL

SQL has long been the staple for organizations managing their data. It allows a broad set of questions to be asked against a single database design; it is standardized, letting users apply their knowledge across systems and providing support for third-party add-ons and tools; and it is versatile and proven.

However, with so much variety in data, the real power and excitement lies in playing with it: different users and analysts using it differently, making sense of it in their own ways and for their own unique purposes. It is no wonder that the early adopters of NoSQL database technology were Google, Amazon and Facebook, which were dealing with a huge variety, volume and velocity of data. Today, every progressive, customer-centric, data-driven organization faces the same challenge, making it imperative to use NoSQL in place of relational database deployments for crucial business applications, gaining flexibility and scalability at a lower cost.

The discernible benefits of NoSQL and NoETL

Personalization: The demand for personalization means lots of data and real-time customer engagement. A distributed database such as a NoSQL store is designed to scale elastically to meet demanding workloads and deliver low-latency transactions.

Agility: In contrast to traditional systems, a NoSQL platform seamlessly integrates operational and analytical databases, enabling it to (a) extract information from operational data in real time, (b) manage and feed data from multiple sources to the analytics engine, and (c) store and serve analytics data to the reporting engine.

More with less: Modern web and mobile applications support hundreds of millions of users. Instead of being limited to a single server, organizations should opt for distributed databases that scale out across multiple servers. NoSQL allows capacity to be increased by simply adding commodity servers, making scaling far easier and less expensive. Further, in the age of IoT, NoSQL helps enterprises scale synchronized data access across connected devices and systems, store large volumes of data, and support high performance and availability.

Risk intelligence: Intelligent, responsive and proactive fraud management requires several data points, such as detection-algorithm rules, customer information, transaction information, location, and time of day, processed at scale and in a flash. Elastically scalable NoSQL databases can do this more reliably.
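The fraud-management point above, combining detection rules with per-transaction data points, can be sketched as a tiny rule engine. The rules, thresholds and field names here are purely illustrative; real detection rule sets are far larger and tuned per institution.

```python
# Hypothetical rule set: each rule inspects one data point of a transaction.
RULES = [
    ("large_amount", lambda t: t["amount"] > 10_000),
    ("foreign_ip",   lambda t: t["country"] != t["home_country"]),
    ("odd_hour",     lambda t: t["hour"] < 6),
]

def risk_flags(txn):
    """Return the names of every rule the transaction trips."""
    return [name for name, check in RULES if check(txn)]

suspicious = {"amount": 15_000, "country": "RU",
              "home_country": "IN", "hour": 3}
routine = {"amount": 500, "country": "IN",
           "home_country": "IN", "hour": 14}

flags = risk_flags(suspicious)   # multiple data points trip at once
```

Because each rule is independent, rule evaluation parallelizes naturally across a distributed store, which is exactly where the elastic scaling of NoSQL databases pays off.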

And the aha moment

Here comes the real deal. When you look at the advanced, future-ready data engineering solution that is the Data Lake, where different users can experiment with the data, ‘fail fast’, and rapidly work through the analytics, the adoption of NoSQL and NoETL is a no-brainer.

If you are looking for a team that is not just adept at data engineering and analytics, but also has deep experience creating innovative data solutions with NoSQL and NoETL and building cognitive Data Lakes, give us a shout.

How far can an organization grow?

I have worked with big, well-established corporations as well as small startups. There were always discussions on growth: How much should we grow this year? How is our competition doing? What is the market expecting? Mathematically speaking, linear growth is good and exponential growth is awesome, but for most organizations both trajectories eventually reach a saturation point.

The best goal Apple can have is to equip every citizen in this world with a Mac, iPad, and iPhone, kicking the competition out of the race. The goal is still finite. What does Apple do after equipping every citizen with a Mac, iPad, and iPhone? Toyota can aim for a monopoly in the automobile industry: every vehicle driven in the world should be made by Toyota. What a goal to have! It is still finite. What does Toyota do after achieving this difficult but finite goal? Can Apple and Toyota think of entering the breakfast-cereal market when they are done with their goals? Are these organizations built to metamorphose into new entities that can lead the cereal market? If they can’t, why not? The proposition sounds ludicrous, but how else can an organization achieve perpetual growth if it is not built to metamorphose?

The demand for any product or service is finite. An organization should have the capability and flexibility to take up new products or services as it progresses. What factors dictate an organization’s ability to grow perpetually? The hard skills required? Sales and marketing efforts? Internal processes? Or the vision that drives the organization? In my experience, the most significant factor is the vision. Growth stops at the boundaries created by the vision. I believe organizations should be driven by visions that are not finite. As the organization and the marketplace evolve, the same vision should lend itself to new interpretations. That leads to a new goal, and that leads to further growth.

Disclaimer: Brand names are used only as examples; no criticism of their respective visions is intended.

In Big Data world we need NoETL along with NoSQL

Almost every use case we encounter today needs data in real time or near real time. Traditional ETL methods burden the production systems; that is why we need NoETL methods. TIBIL has delivered such a solution for a fashion retailer in Europe.