Analytics Are Making Us Smarter About Outbreaks

In 2014, Liberia’s hospitals were overflowing with people infected by the deadly Ebola virus, and the sick lay on the ground outside, writhing in pain. Their only hope of getting treatment was that someone else would die first, freeing up a bed. By 2016, when the outbreak ended, more than 11,000 deaths had been reported across West Africa. Experts were unanimous that, even in the worst months of the outbreak, entire countries were unprepared for a catastrophe of that scale.

The World Health Organization (WHO) eventually declared a Public Health Emergency of International Concern, but it came too late, underlining how we suffer from a lack of timely data, unrelated datasets that are difficult to collate, and a shortage of people with the computational skills to help prepare for and respond to global epidemics. What if an early warning system for such outbreaks could have given WHO a heads-up, allowing it to organize an effective response and contain the disease’s spread? Thankfully, with the AI and data science revolution we see today, such a system may not be far in the future.

Why managing pandemics needs data analytics

Detecting an infectious disease is usually an after-the-fact activity, and stopping it from causing an epidemic requires real-time information and analytics, because controlling a pandemic is not just about where the disease is occurring today. It’s about where the disease is likely to occur next and who is most vulnerable to it. That combination of information can help health experts and organizations like WHO look for long-term catalysts, such as how climate affects the spread of a pathogen like the coronavirus.

To help governments across the globe track, respond to, and prevent the spread of the coronavirus, health experts are turning to advanced analytics and AI. Several researchers are even looking at the Internet of Things (IoT) to collect sensory data in real time and track people, health systems, and environments, even in remote regions of the world.

IoT and Big Data are helping with disease control
It is now possible, thanks to IoT and big data analytics in healthcare, to collect data from places where previously it was gathered manually or not at all. For example, smart thermometers feed data in real time to global medical systems, and bench-top analyzers scan patient samples and share the results almost instantly with remotely installed disease monitoring tools. These tools merge IoT data with population data, GIS data, land-use information, social media streams, and other sources to detect emerging public health threats.
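As a sketch of how such a monitoring tool might merge streams, the toy example below aggregates hypothetical smart-thermometer readings by region and joins them with population figures to flag an emerging threat; all names and thresholds are invented for illustration:

```python
from collections import defaultdict

# Hypothetical smart-thermometer readings: (region, temperature in Celsius)
readings = [
    ("north", 38.9), ("north", 39.2), ("north", 36.6),
    ("south", 36.5), ("south", 36.8), ("south", 37.0),
]

# Hypothetical population context merged in from another data source
population = {"north": 120_000, "south": 450_000}

FEVER_THRESHOLD_C = 38.0  # readings at or above this count as febrile
ALERT_FEVER_RATE = 0.5    # flag a region if half its readings are febrile

def fever_rates(readings):
    """Aggregate raw sensor readings into per-region fever rates."""
    total, febrile = defaultdict(int), defaultdict(int)
    for region, temp in readings:
        total[region] += 1
        if temp >= FEVER_THRESHOLD_C:
            febrile[region] += 1
    return {r: febrile[r] / total[r] for r in total}

def flag_threats(readings, population):
    """Merge fever rates with population data and flag emerging threats."""
    return [
        {"region": r, "fever_rate": rate, "population_at_risk": population[r]}
        for r, rate in fever_rates(readings).items()
        if rate >= ALERT_FEVER_RATE
    ]

alerts = flag_threats(readings, population)
print(alerts)  # only the "north" region exceeds the alert rate
```

A real system would of course join many more sources (GIS layers, social streams) and stream continuously, but the core pattern of aggregate-join-threshold is the same.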

By collecting and analyzing data from remote locations, clinical researchers are in a better position to make an evidence-based analysis of a possible outbreak and suggest preventive measures using data from the IoT devices. As a result, identifying and preventing the spread of infectious diseases proactively is now a reality.

Using AI to track pathogens

There are several ways government health agencies can use AI technology to limit the spread of diseases like coronavirus. Researchers are turning to AI to predict locations where new diseases could emerge, integrating global data about known viruses, animal populations, human demographics, and more. AI can also reduce the time required to detect an outbreak, enabling faster action to stop the spread and treat the infected effectively.

According to the founder of Alibaba, the company’s new AI system can detect coronavirus in CT scans of patients’ chests with 96% accuracy in distinguishing it from viral pneumonia. The new algorithm has shortened the process of recognizing the infection to a mere 20 seconds, a big improvement over the 15 minutes traditional methods take to analyze a CT scan. Baidu’s AI tool LinearFold promises to reduce the prediction time for the virus’s RNA structure from 55 minutes to 27 seconds, which is crucial for understanding the virus and initiating drug discovery.

AI can also analyze and aggregate travel, population and disease data to help predict not just how, but also where, a disease might spread. When it comes to treatment, radiologists are using AI technology (machine learning and deep learning) to extract insights from large data sets and make better treatment decisions based on medical imaging. Taking coronavirus as an example, data from chest X-rays of infected people can help build AI models so doctors can make quicker diagnoses. AI can also help shorten the time it takes to create vaccines for newly discovered pathogens by examining data from similar viral diseases and then using it to predict outcomes. 
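As a heavily simplified illustration of imaging-based classification (not any vendor’s actual system), the sketch below uses a toy nearest-centroid classifier on synthetic feature vectors standing in for processed X-ray features; all numbers are invented:

```python
# Toy nearest-centroid classifier over flattened "image" feature vectors.
# The vectors and labels below are synthetic stand-ins, not real X-ray data.

def centroid(vectors):
    """Mean vector of a list of equal-length feature vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def distance_sq(a, b):
    """Squared Euclidean distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

# Two tiny training sets; features might represent opacity per lung region
normal = [[0.1, 0.2, 0.1], [0.2, 0.1, 0.2]]
infected = [[0.8, 0.9, 0.7], [0.9, 0.8, 0.9]]

centroids = {"normal": centroid(normal), "infected": centroid(infected)}

def classify(features):
    """Assign the label of the nearest class centroid."""
    return min(centroids, key=lambda lbl: distance_sq(features, centroids[lbl]))

print(classify([0.85, 0.8, 0.75]))  # nearest to the "infected" centroid
```

Production diagnostic models are deep neural networks trained on thousands of labeled scans, but the underlying idea is the same: learn a representation of each class from labeled examples, then assign new scans to the closest class.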

And it doesn’t stop there. After an outbreak has ended or has at least been contained, governments and global health organizations can use machine learning to simulate different outcomes to test and validate policies, public health initiatives and response plans based on “what if” analyses.

The importance of data analytics is incontrovertible 

While analytics and ML aren’t sitting in local doctors’ offices taking samples to be tested, they are being applied to the overall effort, making doctors and healthcare organizations more efficient and better equipped to fight epidemics. When used effectively, these tools have the potential to save lives. As an example, the Johns Hopkins University’s Center for Systems Science and Engineering has developed a real-time visualization of the coronavirus epidemic, which includes a map and total numbers of cases, deaths, and recoveries. The data, sourced from WHO, the CDC (in the US), and others, is broken down by country, with case counts represented on the map as dots. Predictive analytics can also be applied to data from public locations to predict disease spread and risks, and to plan for the impact of an outbreak on healthcare organizations.

Machine learning can churn out high-resolution world maps highlighting where epidemics are likely to infect people, by using remotely-sensed and other geographic data about environmental, human and animal factors. Experts are taking complex infectious disease datasets and feeding them into large-scale computational disease spread models. This allows them to generate hundreds of terabytes of computer-generated synthetic outbreak simulations that give an idea about expected numbers of cases, hospitalizations, deaths, and even financial losses. 
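A minimal example of the computational disease spread models mentioned above is the classic SIR (susceptible-infected-recovered) compartmental model. The sketch below runs a discrete-time version with illustrative, uncalibrated parameters:

```python
# Discrete-time SIR simulation: the simplest member of the family of
# compartmental models behind large-scale synthetic outbreak generation.
# beta (transmission) and gamma (recovery) are illustrative, not calibrated.

def simulate_sir(population, initial_infected, beta, gamma, days):
    """Return a list of (susceptible, infected, recovered) tuples per day."""
    s = population - initial_infected
    i = float(initial_infected)
    r = 0.0
    history = []
    for _ in range(days):
        new_infections = beta * s * i / population
        new_recoveries = gamma * i
        s -= new_infections
        i += new_infections - new_recoveries
        r += new_recoveries
        history.append((s, i, r))
    return history

history = simulate_sir(population=1_000_000, initial_infected=10,
                       beta=0.3, gamma=0.1, days=200)
peak_infected = max(i for _, i, _ in history)
print(round(peak_infected))  # expected size of the epidemic peak
```

Real synthetic-outbreak pipelines run stochastic, spatially explicit versions of this over detailed population and mobility data, producing the terabytes of simulations described above, but each run rests on the same bookkeeping of flows between compartments.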

Currently, we are at a critical juncture as experts and governments shift their focus towards containing the coronavirus. Surveillance, drug discovery, and diagnosis have become crucial, and with analytics and AI, there will be a tremendous saving of time and, hopefully, lives.

IoT Analytics: Telling It Inside Out / Building The Complete Picture

At an estimated USD 3.9 trillion, Industry 4.0 is seen as the domain with the most to gain from the Internet of Things (IoT). Manufacturing firms the world over are using IoT to improve operational efficiency, to automate, and to innovate, discovering additional sources of revenue with new business models. Enterprises are realizing that data analytics and connected devices are essential for higher efficiency and process improvements.

With the advent of IoT, Industry 4.0 has taken this digitally driven transformation to another level via interconnectivity and access to real-time data. A variety of sensors, dramatic increases in storage capacity and processing power, real-time analytics of unprecedented sophistication, and the ability to translate that data into meaningful action are all helping organizations predictively maintain equipment and operations to optimize performance. Analytics derived from IoT sensors help companies forecast potential issues, minimize downtime, and eliminate guesswork from preventive maintenance. IoT analytics is being used today to structure, process and analyze data, churning out invaluable insights that support better decisions.

One key advantage that the trifecta of IoT, big data and advanced analytics brings is that it enables systemic interoperability and collaboration between diverse teams and operations delivering cost and efficiency benefits. IoT analytics are playing a crucial role in modern industrial systems, adding an information layer to the conventional methods for data collection, storage and analysis.

IoT analytics for preventive maintenance

Until recently, factory managers and machine operators carried out scheduled maintenance manually, regularly repairing machine parts to prevent downtime. This process was time consuming, and despite the effort invested, many of the preventive maintenance steps taken were ineffective. Implementing IoT to monitor asset health, optimize maintenance schedules, and gain real-time alerts to operational risks has allowed enterprises to lower service costs, maximize uptime, and improve production throughput.

They are now building blueprints of a connected system that includes equipment and sensors, business systems, communication protocols, gateways, the cloud, predictive analytics, and visualization. This allows IoT sensor data to be captured and fed into predictive analytics that can flag conditions of impending failure. A predictive analytics dashboard processes operational data, letting engineers act on insights and take corrective action. Rule-based predictive maintenance allows enterprises to bypass the need for large historical data sets – at least initially – or advanced machine learning algorithms, giving them faster results and a first step into advanced analytics.
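The rule-based approach can be sketched as a simple threshold check over incoming sensor readings; the sensor names, limits, and values below are illustrative assumptions, not vendor defaults:

```python
# Sketch of rule-based predictive maintenance: explicit thresholds on live
# sensor readings stand in for ML models until enough history accumulates.

RULES = {
    "vibration_mm_s": ("max", 7.1),    # vibration severity ceiling
    "bearing_temp_c": ("max", 85.0),   # bearing temperature ceiling
    "oil_pressure_bar": ("min", 1.5),  # minimum acceptable oil pressure
}

def evaluate(reading):
    """Return a list of (sensor, value, limit) violations for one reading."""
    violations = []
    for sensor, (kind, limit) in RULES.items():
        value = reading.get(sensor)
        if value is None:
            continue  # sensor missing from this reading; skip its rule
        if (kind == "max" and value > limit) or (kind == "min" and value < limit):
            violations.append((sensor, value, limit))
    return violations

reading = {"vibration_mm_s": 9.4, "bearing_temp_c": 78.0, "oil_pressure_bar": 1.2}
alerts = evaluate(reading)
print(alerts)  # vibration above its ceiling, oil pressure below its floor
```

The appeal of this design is that the rules are auditable and need no training data; when enough history has accumulated, the same evaluation hook can be swapped for a learned model.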

Advanced analytics with predictive alerts and automated root cause analysis can be applied in a later phase, when historical data can be used to accurately predict issues. IoT analytics helps organizations enhance overall efficiency, improve safety procedures, and apply quick fixes to maintenance problems. Machine learning and AI technologies add further impetus by helping enterprises connect disparate data sources and gain insights for forecasting future performance.

Making grids smarter with IoT analytics

The transformation of electrical grids into smart grids is perhaps one of the major technological challenges, and achievements, of the past decade and also one of the key growth areas for IoT analytics. Smart-home technologies and the corresponding analytics are an integral part of many use cases in this field. Smart grid solutions based on IoT technology are playing a huge role in energy conservation by connecting disparate platforms in home automation, building and infrastructure automation, as well as in transmission and distribution systems.

Smart grids collect much more data than manual energy meter reading ever did, which warrants data analysis and highly realistic consumption forecasts that take a multitude of variables into account. Smart grid analytics is expanding because there is exponentially more data available, thanks to IoT sensors, with which to develop analytical models that can even predict future failures. What makes the IoT smart grid better is two-way communication: connected devices and hardware sense and respond to user demand, and also gather performance data and feed it back to the supplier, offering deep analytics and insights. Data analytics combined with grid visualization can lead to better situational awareness, preventive maintenance and fault detection, as well as advanced metering infrastructure and a more secure power system.

IoT sensors connect to a gateway, which in turn connects to the cloud and enables remote access to sensor data via mobile devices. Sensors also collect energy consumption data in real time from devices; this data is analyzed by the gateway, which escalates the necessary output or command message (a utility command, an alert, HVAC control, etc.) to the control system. IoT devices also help analyze the energy utilization of each device, aiding the user in managing device up/down time. Enterprises consuming energy can access historical data from the cloud, derive insights, and optimize their consumption accordingly.
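In its simplest form, the gateway’s analyze-and-escalate step might look like the sketch below, which aggregates per-device consumption and emits an alert command when a device exceeds its expected draw; the device names and limits are hypothetical:

```python
# Sketch of gateway-side energy analytics: aggregate per-device samples and
# escalate a command message when average draw exceeds an expected limit.

expected_max_watts = {"hvac": 3500, "lighting": 800}  # assumed device limits

def analyze(samples):
    """samples: list of (device, watts) readings. Returns per-device average
    draw and the command messages to escalate to the control system."""
    totals, counts = {}, {}
    for device, watts in samples:
        totals[device] = totals.get(device, 0) + watts
        counts[device] = counts.get(device, 0) + 1
    averages = {d: totals[d] / counts[d] for d in totals}
    commands = [
        f"ALERT {d}: avg {avg:.0f} W exceeds {expected_max_watts[d]} W"
        for d, avg in averages.items()
        if avg > expected_max_watts.get(d, float("inf"))
    ]
    return averages, commands

samples = [("hvac", 3600), ("hvac", 3800), ("lighting", 600), ("lighting", 650)]
averages, commands = analyze(samples)
print(commands)  # only the HVAC unit is over its expected draw
```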

The IoT has allowed companies to move towards a new way of doing business by applying it to various processes in order to enhance productivity and efficiency. But the real value of IoT lies in using the data from the cloud and the edge to get better analytics and derive insights from raw data. Analytics can play a much broader role and influence business practices, predictions, ROI, decision-making and more. Combining the IoT with advanced analytics takes businesses to the next level by offering transparency into business operations, insight into market trends and highlighting opportunities for improving the business.

We, at Tibil, understand the immense potential data holds for business and help enterprises harness its power fully, helping them achieve larger business goals through reliable and intelligent management of data. Our IoT Data Analytics services help you leverage your IoT device data to create immersive and insightful reports that can be combined with contextual data. We drive big data analytics, predictive analytics and customer analytics using new generation data analytics technologies, ML/AI and industry-grade statistical models to deliver advanced, real-time analytics. Reach out to us at [email protected] for more information.

Why should you evaluate platform-driven Data Analytics?

A few days back, we asked this question on LinkedIn.

What does having a great Data Analytics Platform mean?

  • Confidence in accessing and using the right data from a single source, without worrying about systems, formats, protocols, and security.
  • Flexibility to build your own custom big data and analytics applications, without worrying about tools and databases.
  • Ability to unlock new possibilities from your data, without worrying about the scale.
  • Capacity to extract, process, analyze and derive insights out of your data in real-time.

The consensus among our teams at TIBIL – having worked on several global client engagements across Data Engineering and Advanced Analytics – is that it means all of the above.

Before we get to a platform-driven approach to Data Analytics, let us understand the business imperative for investing in data and analytics. Quite simply, it is the use of data analysis to drive competitive edge.

You may be looking to achieve it by deriving new insights to develop new products and solutions, or by designing and refining business strategies based on trends. Other business requirements that can effectively be met through data analytics include:

  1.  keeping a tab on the pulse of the customer to make informed decisions and capitalize on trends,
  2.  identifying new market opportunities,
  3.  devising new operational models.

Experience has shown that while the business benefits are many, long-term success depends on moving beyond the hype and embarking on a journey to create the right platform – one that can help the business adapt, scale and innovate, and that delivers sustainable ROI.

While the actual benefits of an iterative analytics process usually come at a high cost, a platform-driven approach to data analytics can not only make the entire process cost-effective but also improve productivity through faster iterations (a fail-fast approach).

Fail fast. Finish strong.
Your ability to create value out of your data depends on your ability to identify the problem and create solutions based on the data – with agility. The faster you test your solutions in the market, the better you can evaluate the opportunity cost. This gives you the luxury of testing different hypotheses with a faster feedback cycle, improving your ability to roll out solutions quickly.

A platform-driven approach helps you move fast by leveraging reusable components, microservices, and API-based architecture, allowing you to focus only on the tweaks to your solution or data models. This leads to faster time to market for your final solution, at lower cost, allowing you to finish strong.

Talk to TIBIL Solutions. Our Data Analytics Platform offers enterprises a jump start on their data and analytics journey, with all the features an enterprise grade platform needs, as well as the flexibility and customization you require.

Data Gravity: Are you maximizing the opportunity?

In the world of Data and Analytics, the term Data Gravity is now almost a decade old. The question is how well you are recognizing the opportunity and trying to maximize it.

For starters, Data Gravity refers to what happens when we move to a Data First philosophy – which anyway has become inevitable today. Data accumulates for the business every single second and it pulls your business to it – infrastructure for storage and management, people for analysis, applications for processing it and making sense of it. As data grows so does its density/mass and its influence on the business.

Increasingly, today, when we speak of Data Gravity, we are referring to the shifting of data to the cloud, and with it the applications and tools used to manage and analyze that data. Most businesses worldwide use as much external data as they generate internally; in several cases, the external data can be far greater. And much of that external data resides in the cloud. For example, a company’s data from social channels is invariably generated and stored in the cloud, so many of the applications or solutions being built to effectively store, process and leverage that data are becoming cloud-based. After all, the location of your analytics has a direct bearing on the time taken to move from raw data to insights.

Coming back to our central question. How do you maximize the opportunity presented by Data Gravity?

  • By creating a data storage, cleansing and enhancement system that gives you the ability to connect all your different data sources to it
  • By providing a secure, consistent and timely view of data, across both on-premise and cloud resources, to all the different users, including internal and external
  • By building the right analytical tools that reduce time to insight

When you are faced with a whole range of data sources, types, and systems generating the data, coupled with many different users of data, each with their own unique needs, this is easier said than done. Rather, this is where the crux of today’s Data and Analytics challenge lies. Can you navigate the teething problems of ingesting data from multiple sources, processing data of different types, presenting it securely to multiple users, and preparing it to support advanced analytics – easily and quickly? If so, your teams can focus on what they need most: generating insights.

The answer lies in changing the lens on traditional data engineering and analytics. Adopt a platform-driven approach to data: multiple sources are linked to the platform, multiple users connect to it, and multiple applications run on it. Sounds exciting? How about moving onto such a platform and having it tweaked to your unique needs rather than building one from scratch?

Check out what we at TIBIL are offering in this space. Ask for a demo.

Who says SMBs don’t need or cannot afford Data Analytics?

A question often debated when it comes to Big Data is whether small and medium businesses (SMBs) need to invest in Data Analytics – and whether it gives enough ROI.

Some commentators on technology believe that Big Data is for big enterprises, and that the analytics and visualization needs of an SMB can be met with easily available online tools. In addition, we see cases of big data projects failing in several large enterprises. The heavy buzz around Data Analytics and the Data Scientist profession has also led some to argue that Big Data is a big bubble and SMBs would be better off staying away.

Every human action and interaction generates some data – some structured, and a lot of it unstructured. Companies that can capture this data easily and effectively and use it to make intelligent decisions quickly stay one step ahead of the competition. The puzzle is the complexity, volume, and speed at which data is being generated. The solution lies in cutting through the maze without burning a hole in the pocket (not to speak of the impact of large-scale digital transformation on the organization). In this context, the question of whether SMBs need powerful Data Analytics begs an answer.

SMBs, being in the same global market as large enterprises, are exposed to the same potential of Big Data. They deal with customers, play in the same competitive market, are an active part of the social network, comprise a huge chunk of the economy, and have the same ambition to capture the market share (or should we say the customer’s mindshare).

Every business captures some level of data. Irrespective of its size, an organization has the best chance to succeed when it transforms data into insight for shaping its business strategy. So, how does Data Analytics help SMBs?

  • As SMBs increase their capability to collect transactional, social and customer data, they will need the ability to process and analyze that data so it becomes useful to the business. Organizations that can do this confidently gain in competitive edge, customer satisfaction and financial performance.
  • The goal of using analytics is to understand how customers digitally interact with your business and determine how you can improve its success through marketing. This includes establishing sales patterns, segmenting users and building data sets that reveal important details about customers’ buying habits. With the right information, you can build effective, targeted marketing campaigns.
  • Data Analytics is about a shift from retrospective business intelligence to forward-facing action. Every business tracks its sales and inventory, its revenue and profit performance. When you infuse analytics into that data, you turn it into forward-looking recommendations for efficiencies such as resource allocation, demand prediction and effective marketing. Further, you can use data from diverse sources (including customer touch points) to tailor your products and services and design your customer experience.
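As a toy illustration of the sales-pattern and segmentation ideas above, the sketch below buckets customers by purchase frequency and total spend; the thresholds and sample orders are assumptions for illustration, not a production model:

```python
# RFM-style segmentation sketch: classify customers from their order history.

orders = [
    ("alice", 120.0), ("alice", 80.0), ("alice", 45.0),
    ("bob", 30.0),
    ("carol", 300.0), ("carol", 250.0),
]

def segment_customers(orders, frequent_orders=2, high_spend=200.0):
    """Return a mapping of customer -> segment label based on simple
    frequency and spend thresholds."""
    stats = {}
    for customer, amount in orders:
        count, spend = stats.get(customer, (0, 0.0))
        stats[customer] = (count + 1, spend + amount)
    segments = {}
    for customer, (count, spend) in stats.items():
        if count >= frequent_orders and spend >= high_spend:
            segments[customer] = "loyal-high-value"
        elif count >= frequent_orders:
            segments[customer] = "frequent"
        else:
            segments[customer] = "occasional"
    return segments

print(segment_customers(orders))
```

Even something this small already separates customers worth a retention campaign from one-off buyers, which is exactly the kind of targeted-marketing input described above.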

SMBs also have an advantage when it comes to adopting Data Analytics: they can be very clear about the outcomes they need and agile in implementation. Instead of hiring Data Scientists or oversimplifying the role and scale of Data Analytics, SMBs can take a long view of what they can achieve with it. Building on that, they can work with a solutions provider who can offer an ingenious data solution that is customized to their needs, easily implemented, and able to scale with their growth. You can talk to Tibil Solutions for such a strategic adoption of Data Analytics.

We would like to hear your views and use cases on how SMBs should create their own custom paths to using Data Analytics for business success. Please chime in.

Changes in Risk Management for BFSI companies demand rapid action

Chris Skinner, author of books like Digital Bank and ValueWeb, says: “Now we’re seeing what I call ‘the complete open sourcing of financial services’ through apps, APIs and analytics. So the front office relationship is in an app. The middle office processing is through an API, and the back office is all about analytics.”

The sheer amount and pace of change in banking and financial services over the last decade has been mind numbing, and near nightmarish for risk managers.

Even as banking has become fast, easy and personalized, the tolerance for errors and dishonest business practices has dramatically decreased (rightly so). While digital transformation has opened new business models for financial services companies, customers’ expectations of banking services have risen tremendously. Risk functions in banks now have to manage new types of risk, including model risk and cyber risk, besides managing compliance with ever-evolving regulations. Additionally, they are expected to deal with these trends at lower cost, because banks (like other services companies) expect to reduce their operating costs substantially when they adopt new technologies.

The good news: data engineering and advanced analytics are enabling new products, services, and risk-management techniques, helping risk managers make better choices about risk. The challenge, of course, is finding the right solution that can scale with the organization, cover all the bases, integrate seamlessly with the bank’s enterprise systems, and do all of this cost-effectively.

Let’s take a look at some of the key technology trends in banking.

Winning customers in the highly competitive, globalized banking and financial services industry is a battle increasingly fought on the digital front. As digital technologies rapidly change life and work in every other sphere, customers expect intuitive experiences, access to services at any time on any device, customized propositions, and instant decisions from their banking. This entails re-imagining the bank or financial services company from a customer-experience perspective and digitizing it. The risk function plays a critical role here, collaborating with the business and technology functions across the entire transformation journey.

Automation in Compliance
Omni-channel banking has thrown up a challenge: how to accurately validate the identity of persons applying for new accounts or performing transactions. Whatever channels are used, before a bank approves a new account or any transaction, it must draw data from multiple, disparate sources, analyze it, and surface the risks quickly for informed decision-making. Digitized underwriting processes and increasing use of data analytics are visible trends in compliance automation.
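A bare-bones sketch of drawing on multiple sources to validate an identity might look like the following; the source names, lookup functions, and match rule are invented for illustration:

```python
# Sketch of automated identity validation: merge results from several
# (hypothetical) verification sources and approve only when enough agree.

def validate_identity(applicant_id, sources, required_matches=2):
    """sources: mapping of source name -> lookup function returning bool.
    Approve when at least `required_matches` independent sources confirm."""
    matches = [name for name, check in sources.items() if check(applicant_id)]
    return {
        "applicant": applicant_id,
        "confirmed_by": matches,
        "approved": len(matches) >= required_matches,
    }

# Stand-in lookups; real systems would query bureaus, registries, etc.
sources = {
    "national_registry": lambda a: a in {"A-100", "A-101"},
    "credit_bureau": lambda a: a in {"A-100"},
    "telecom_records": lambda a: a in {"A-100", "A-102"},
}

print(validate_identity("A-100", sources))  # confirmed by all three sources
print(validate_identity("A-102", sources))  # only one match: not approved
```

The point of the pattern is the audit trail: every decision records exactly which sources confirmed, which is what regulators ask for.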

Even as regulation becomes more complex and noncompliance less tolerated, banks have to eliminate human intervention in customer-facing risk processes and seamlessly connect the right behaviors to products and services. Quite simply, automation in compliance is the best way to ensure accurate oversight (and it can save millions).

Real-time decisions and service
Gone are the days of filling out laborious application forms and surviving long IVR-driven calls. Banks now have to offer real-time answers to customer requests, with customized processes. As risk managers seek ways to help banks assess risks and make decisions without human intervention, they have to contend with more non-traditional data sources. For example, some banks have re-designed account opening with much of the required data prepopulated from public sources to make the experience as simple and fast as possible. However, establishing a secure and customer-friendly approach to identification and verification becomes yet another challenge for the risk manager.

Big Data
Humongous amounts of customer data are available and accessible to banks, including customer payment and spending behavior, social-media presence, and online browsing activity, all of which aid risk-intelligent decision-making. Companies have started using external, unstructured data not only for better credit-risk decisions, but also for portfolio monitoring and profitability prediction.

Machine Learning powered Analytics
Machine learning identifies complex, nonlinear patterns in large data sets and springs insights that make more accurate risk models possible. These models learn from the new information they acquire, continuously improving the risk function’s ability to predict. Several banks and financial services companies have started using machine learning, especially in credit rating, collections, and credit-card fraud detection.
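As a deliberately simple stand-in for the fraud models mentioned above (a z-score screen rather than a learned model), the sketch below flags transactions that fall far outside a customer’s historical spending pattern; all figures are synthetic:

```python
import math

# Minimal anomaly-scoring sketch for card-fraud screening: flag transactions
# many standard deviations away from a customer's historical mean spend.

def zscore_flags(history, new_amounts, threshold=3.0):
    """Flag amounts more than `threshold` standard deviations from history."""
    n = len(history)
    mean = sum(history) / n
    var = sum((x - mean) ** 2 for x in history) / n
    std = math.sqrt(var) or 1.0  # avoid division by zero on flat history
    return [amt for amt in new_amounts if abs(amt - mean) / std > threshold]

history = [42.0, 38.0, 55.0, 40.0, 47.0, 51.0, 44.0]
suspicious = zscore_flags(history, [45.0, 900.0])
print(suspicious)  # only the 900.0 transaction is flagged
```

Production fraud systems replace the z-score with models trained on many behavioral features, but the shape of the pipeline is the same: score each transaction against what is normal for that customer, and escalate the outliers.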

Use of advanced analytics is not just about risk; it is about serving customers with excellence too. To quote Chris Skinner again: “This, to me, is the battleground when I’m talking about the digital revolution, the digital human, the digital bank: If you do not get cognitive, predictive, proactive, custom analytics that give the customer a far more informed view about their financial affairs, you will not be the partner for that customer in their financial future.”

Banks and financial services companies themselves have such large technology functions today that many of them could be called fintech companies. When they look for data engineering and advanced analytics expertise, they need a partner who understands the industry and the risk function, has experience delivering cutting-edge, comprehensive, cost-effective solutions, and can cover all the bases discussed above. At TIBIL Solutions, we have done it and are continuously evolving our solutions. Ask for a demo.