Delphi Analytics Rwanda recently received data from an administrator of finance institutions in Rwanda (our ‘Client’). In order to add value to the data received, Delphi needed to make progress in two distinct areas that bear discussion: Data Organization and Data Enhancement. These challenges are common to many data projects, so we decided to discuss them in this brief post.
There is an equivalence at many levels between data and strategy. First, the complete absence of data – about competitors, opponents, opportunities, or one’s own capabilities – implies a lack of coherent strategy: without information, one can form no view beyond random, blind flailing, which rarely comprises a strategy at all. Thus data and information, in general, form a prerequisite for any real strategy.
Delphi is, at its core, built around an understanding of analytics. Data come in all sizes and types: big data, with its unique challenges and tools; medium-sized data, the old-style database stores we used to think were big; and unstructured data, the type that drives us crazy. Delphi does data – all types.
Project- or Subscription-basis
Delphi takes deep dives into data, either on a project basis (with a limited scope or duration) or on a subscription basis (an ongoing service, with repeated data flows and reporting). We’ve undertaken projects with all types of data in strange and wonderful markets – and we’ve taken on subscription engagements that run for decades. We love challenges.
In general Delphi’s analytical services fall into one of the following categories:
- Litigation support
- Subscription-based reporting
- Financial modeling
- Machine learning / Artificial Intelligence
- Complex-Network and Community analytics
- Data supplements
These are explained below.
Delphi has been called upon to support major litigation between counterparties at the highest levels. Global institutions and government regulators have engaged Delphi as expert witness and quantitative ‘story-teller’ in complex litigation scenarios. If there is a story that can be told from data, then Delphi is the party to find it and tell it in vivid detail.
Add Delphi to the litigation support team for confidence in understanding the data narrative.
Delphi can arrange for regular, complex flows of data to run through sophisticated analytics to provide regular management and oversight reporting.
- We do regular monthly or weekly reporting for customer valuation, for segment statistics, for marketing effectiveness measures and for collection and call-center effectiveness.
- We provide regular collection oversight for loan and receivable servicing (as master servicer and trust reporting entity).
- We provide Statistical Process Control and management reporting.
- We do regular NLP (Natural Language Processing) analysis of unstructured data (e.g., customer sentiment).
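As a flavor of what the NLP reporting above can look like, here is a deliberately simple sketch of wordlist-based sentiment scoring for customer comments. The word lists, function names, and report fields are illustrative assumptions, not Delphi’s production pipeline:

```python
# Illustrative wordlist-based sentiment scoring for customer comments.
# The word lists and report shape are hypothetical, for demonstration only.
POSITIVE = {"helpful", "fast", "great", "resolved", "friendly"}
NEGATIVE = {"slow", "rude", "broken", "unresolved", "waiting"}

def sentiment_score(comment: str) -> int:
    """Return (# positive words - # negative words) for one comment."""
    words = {w.strip(".,!?").lower() for w in comment.split()}
    return len(words & POSITIVE) - len(words & NEGATIVE)

def monthly_summary(comments):
    """Aggregate comment-level scores into a simple management report."""
    scores = [sentiment_score(c) for c in comments]
    return {
        "n": len(scores),
        "positive": sum(s > 0 for s in scores),
        "negative": sum(s < 0 for s in scores),
    }
```

In practice a production pipeline would use trained sentiment models rather than fixed word lists, but the reporting shape – score each comment, then roll up to a period summary – is the same.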
Delphi builds reliable, auditable financial models for:
- Asset valuation and balance-sheet management / capital allocation,
- Development of reporting processes for regular ‘master servicing’ of assets,
- Risk management models and model-risk management validations.
Machine Learning – Artificial Intelligence
Delphi is a great team member for identifying opportunities to embed AI (Artificial Intelligence) and machine learning into a business flow so that processes improve continuously.
This is often done by examining critical input values and measurable outcomes of a business process – either for internal process improvement or for decision-making on customer relationship management issues. Which customers should you promote or treat with special care, and which customers are financially unattractive to your business? Delphi can help you answer these questions.
As part of a regular, AI-based regime of continuous improvement, Delphi is a great partner for creating and implementing Champion / Challenger testing to assess when, and for which customers, processes and decision-making rubrics should change.
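A minimal sketch of how Champion / Challenger routing can work: each customer is deterministically assigned to an arm (so the same customer always receives the same treatment), and outcomes are then compared per arm. The hash-based assignment, the 10% challenger share, and the function names are illustrative assumptions, not a specific Delphi implementation:

```python
import hashlib

def assign_arm(customer_id: str, challenger_share: float = 0.10) -> str:
    """Deterministically route a customer to 'champion' or 'challenger'.

    Hashing the customer id keeps each customer in the same arm across
    contacts, which a fair comparison of outcomes depends on. The 10%
    challenger share is an illustrative default.
    """
    digest = hashlib.sha256(customer_id.encode()).hexdigest()
    bucket = int(digest[:8], 16) / 0xFFFFFFFF  # roughly uniform in [0, 1]
    return "challenger" if bucket < challenger_share else "champion"

def summarize(outcomes):
    """outcomes: list of (arm, success_flag); returns success rate per arm."""
    totals, wins = {}, {}
    for arm, ok in outcomes:
        totals[arm] = totals.get(arm, 0) + 1
        wins[arm] = wins.get(arm, 0) + (1 if ok else 0)
    return {arm: wins[arm] / totals[arm] for arm in totals}
```

The challenger rule is promoted to champion only when its measured success rate beats the incumbent by a margin that survives statistical scrutiny.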
Complex-Networks and Community Analytics
Delphi has specialized in complex networks since the early 2000s. These are networks of things – ‘nodes’, which might be people, for example – linked together in various ways. People linked as Facebook friends, or who follow each other on Twitter, are examples. Sometimes these networks are derived from phone-call or text-message data streams, from e-mail contacts, or from appearances at common events or on common lists. There are many sources of complex-network data.
With data streams that comprise complex-network information, Delphi has been able to contribute significant insight based on what we call ‘Community Analytics.’ Communities are groupings of nodes – people, for example – with tight clusters of common contacts.
Delphi has used complex-networks and community analysis in fraud network detection – and in identifying risks associated with network-proximity to clusters of other risky nodes. The common features of a social circle, for example, are often predictors of common behavior.
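To make the idea of ‘tight clusters of common contacts’ concrete, here is a deliberately simplified sketch: ties are kept only when two nodes share enough common contacts, and the connected groups that remain are treated as communities. This toy approach, including the `min_shared` threshold, is an illustrative assumption and not Delphi’s actual community-detection method:

```python
from collections import defaultdict

def communities(edges, min_shared=2):
    """Group nodes into rough 'communities'.

    A tie between two nodes survives only if they share at least
    `min_shared` common contacts; the connected components of the
    pruned graph are returned as communities. A simplified sketch
    of community detection, for illustration only.
    """
    adj = defaultdict(set)
    for a, b in edges:
        adj[a].add(b)
        adj[b].add(a)
    # Keep only ties backed by enough common contacts.
    strong = defaultdict(set)
    for a, b in edges:
        if len(adj[a] & adj[b]) >= min_shared:
            strong[a].add(b)
            strong[b].add(a)
    # Connected components of the pruned graph.
    seen, groups = set(), []
    for node in adj:
        if node in seen:
            continue
        stack, group = [node], set()
        while stack:
            n = stack.pop()
            if n in seen:
                continue
            seen.add(n)
            group.add(n)
            stack.extend(strong[n] - seen)
        groups.append(group)
    return groups
```

Two tightly knit groups joined by a single bridge contact fall into separate communities under this rule, because the bridge pair shares no common contacts.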
This general line of work has been called social-network analytics, and it has been very useful for predicting common behaviors of groups – including marketing success, as well as contributing factors to credit scoring.
As a practical matter, it has been very useful in such mundane efforts as skip-tracing in sub-prime auto finance populations.
Delphi gained a particular niche in the credit modeling world in the early 1990’s with its introduction of varied data supplements to credit scoring models.
Delphi’s machine-learning and behavioral modeling tools (e.g., the MANN model) allowed Delphi’s credit models, for example, to include many hundreds of variables. Models were enhanced as relevant data were added to the trove of information known about customers. Initially, those data supplements consisted of publicly available data (such as Bureau of Labor Statistics employment information by locale). But, these were soon expanded to incorporate things like magazine subscriptions and other affiliation data.
The recent proliferation of GIS (Geographic Information System) data allows models in boutique and developing markets to include crop information, water availability, proximity to transportation networks, etc. Delphi has established global systems of GIS data for supplementing behavior models in markets such as India, Myanmar, African states, and so forth.
Additionally, there are often ‘derivative’ data from a client’s own data sets that provide custom supplements that might not normally be considered as fields of use in behavior modeling. Delphi has maintained a position on the cutting edge of incorporating diverse sets of supplemental information in modeling.
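As a small, hypothetical example of a ‘derivative’ field, one might compute payment volatility from a client’s own payment history and feed it to a behavior model as a supplement. The formula and field name here are assumptions for illustration, not an actual Delphi variable:

```python
from statistics import mean, pstdev

def payment_volatility(payments):
    """Derive a 'volatility' supplement from a client's own payment
    history: the coefficient of variation (std dev / mean) of monthly
    payment amounts. A hypothetical derived field, for illustration.
    """
    if len(payments) < 2 or mean(payments) == 0:
        return 0.0
    return pstdev(payments) / mean(payments)
```

A borrower who pays the same amount every month scores 0.0; erratic payment amounts push the score up, and either pattern may carry signal a raw payment field would not.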
Delphi has often dealt with the challenge of adding value to the seemingly simple task of data storage. But how does one do this?
We break the value addition challenge down into three areas:
- Data Warehouse issues
- Data Oversight
- Hosting solution development
In each of these areas, Delphi focuses on providing a solution that meets its clients’ needs.
Data Warehouse Solutions
- Data warehouse design – Delphi’s value addition often begins with appropriate data warehouse design. Having a vision of the purpose of the warehouse leads to an understanding of which data elements need to be included. Warehoused data have dimensions that are not normally a part of transaction flow – plus the analysis tasks associated with a data warehouse have completely different access needs than the transactional access of operational data stores. Delphi understands the difference.
- ETL (Extract, Transform, Load) processes – It is often at the earliest stages of the data-gathering process that critical design work is needed. The ETL process can lift the warehouse to great success, or it can sink it with an unsuitable flow. Delphi puts emphasis on these critical elements of storage.
- Analytical support – What good is a large store of data without an adequate store of analytics? Delphi recognizes that data multiply faster than analysts – so staying ahead of the data flow can mean the difference between extraordinary insight and being buried under a mountain of disorganized information. Delphi knows how to quickly prototype analytical processes, evaluate the results, and move to production in efficient, results-oriented scrums. Delphi is great to have on your team.
- Management and Data Management reporting – Of course, Delphi recognizes that senior executives and managers need effective management reporting. But so too, do data managers need effective monitoring of the ‘health’ of the data store. Delphi has excelled at building both management reporting processes and data management reporting processes.
- ‘Edge’ analytics – When data flows grow quickly with more complex sensor and process monitoring devices, Delphi has seen the benefits of pushing analytics to the ‘edge’ – closer to the source of data origination. Rather than pipe unprocessed data to the central store, Delphi has excelled at pushing analytics closer to the edge so that data flows concentrate on anomaly detection and reporting, leaving the normal tedious data flows to be summarized by edge analysis reporting.
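The ‘edge’ analytics idea above can be sketched very simply: the edge device forwards only readings that deviate sharply from a rolling baseline, and summarizes the rest. The window size, threshold, and function names below are illustrative assumptions, not a specific deployed system:

```python
from statistics import mean, pstdev

def edge_filter(readings, window=20, threshold=3.0):
    """Edge-side anomaly filter.

    Forward only readings that deviate from a rolling baseline by more
    than `threshold` standard deviations; summarize everything else so
    the central store is not flooded with routine data. Window size and
    threshold are illustrative defaults.
    """
    anomalies, normal = [], []
    for i, x in enumerate(readings):
        history = readings[max(0, i - window):i]
        if len(history) >= 5:  # need a minimal baseline first
            mu, sigma = mean(history), pstdev(history)
            if (sigma == 0 and x != mu) or (sigma > 0 and abs(x - mu) > threshold * sigma):
                anomalies.append((i, x))
                continue
        normal.append(x)
    summary = {"count": len(normal), "mean": mean(normal) if normal else None}
    return anomalies, summary
```

Only the anomaly list and the compact summary travel upstream; the tedious bulk of normal readings stays at the edge.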
Both at the ‘edge’ of data gathering and at the core of a data repository, it is important to monitor the flows of data. Delphi has taken from its experience with statistical process control the important lesson of viewing data-gathering as a process.
Whether the data are gathered by complex systems of sensors, or are a result of human effort and processes, Delphi has seen that most business processes contain regular patterns of data. We have found that these regular patterns can be monitored to identify deviations in process – or problems with systems, equipment and even with management.
The oversight that Delphi provides for complex data flows can serve as the early-warning signal that attention is needed on key aspects of a business. Delphi can prevent countless problems with its early warnings through data oversight processes.
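The statistical-process-control view of data oversight can be sketched with a Shewhart-style control chart: establish limits from an in-control baseline period, then flag observations that breach them. The three-sigma limits and function names are standard SPC conventions used here for illustration:

```python
from statistics import mean, pstdev

def control_limits(baseline):
    """Shewhart-style three-sigma limits from an in-control baseline period."""
    mu, sigma = mean(baseline), pstdev(baseline)
    return mu - 3 * sigma, mu + 3 * sigma

def out_of_control(values, limits):
    """Indices of observations breaching the control limits."""
    lcl, ucl = limits
    return [i for i, v in enumerate(values) if v < lcl or v > ucl]
```

A daily record count, an error rate, or a field-completeness percentage monitored this way turns a quiet data pipeline into a process with alarms.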
Hosting solution development
Delphi realizes that there are clear indicators of when a data-storage task needs to be converted into a cloud-based solution. The cloud is often the most sensible path to solving capacity, back-up, availability and security concerns.
Delphi provides services relating to:
- Set up – We can help you establish a cloud-based solution for your data storage.
- Management – Delphi can help you build the management tools and processes you need to maintain your cloud-based storage.
- Monitoring – Delphi’s long experience with data oversight allows us to help you develop the appropriate monitoring process for your cloud-data needs.
The creation and development of intellectual property is important to Delphi. A search of the publicly available patent filings under the name of Craig M Allen (Delphi’s founder) yields entries including: Encrypted Blockchain Object Transfers Using Smart Contracts, Publication number 20200177579. Abstract: Encrypted blockchain object transfers using smart contracts are provided herein. The present disclosure includes mechanisms for generating compliance certificates and their combined use with smart contracts.
Most people that work with data soon realize that data reliability and data validity are two of the most fundamental problems that must be faced early in the data acquisition and analysis process.
Reliability of data has to do with the consistency of the measurement and data-collection process. If data collected at multiple times and in multiple situations yield very consistent, reproducible measures, then the measure is considered reliable. An unreliable measure is like an elastic ruler – it is likely to give a different reading each time the measurement is taken. A good example of this is ‘self-reported’ income. A borrower will often state, or ‘estimate,’ on a loan application that they earn more (or sometimes less) than they really earn – so ‘self-reported’ income is not a very reliable measure of real, verifiable income.
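One common way to quantify reliability is test-retest correlation: measure the same quantity on two occasions and check how strongly the readings agree. A minimal sketch, using a hand-rolled Pearson correlation (the data below are illustrative, not real measurements):

```python
from statistics import mean

def pearson(x, y):
    """Pearson correlation, used here as a test-retest reliability check:
    the same quantity measured on two occasions should correlate highly
    if the measure is reliable."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x) ** 0.5
    vy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (vx * vy)
```

A reliable measure (a rigid ruler) yields a correlation near 1.0 across occasions; the elastic ruler of self-reported income yields something noticeably lower against verified income.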
The validity of a measurement has to do with whether the measure truly relates to what one thinks is being measured. Often an assumption or some theory colors the use of a measurement and affects its application. In auto finance, for example, it is common to calculate the ratio of the monthly auto-loan payment to a person’s monthly income – and to presume that this ratio measures the ‘affordability’ of a vehicle for that person. The theory is that a person is less likely to stop making monthly loan payments if the vehicle is more ‘affordable’ given that person’s income.
One can see that the assumed relationship might be incorrect. Vehicles differ in average monthly maintenance costs, some carry irregular, large costs – like replacing the batteries in a hybrid vehicle – and some have very different fuel-mileage efficiencies (which makes a big difference when fuel prices are volatile). So a finance company might believe it is measuring the ‘affordability’ of a vehicle by looking at the monthly loan payment alone – but it is easy to see that this is only one component of affordability. The theory of what constitutes affordability affects whether the measure is ‘valid’ or not.
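The validity gap is easy to see in numbers. Below, the conventional payment-to-income ratio is compared with a broader measure that folds in running costs; all figures are invented for illustration:

```python
def payment_to_income(payment, income):
    """The conventional 'affordability' measure: loan payment / monthly income."""
    return payment / income

def total_cost_to_income(payment, fuel, maintenance, income):
    """A broader affordability measure that includes running costs.
    The same payment can look affordable or not once fuel and
    maintenance enter the picture. Figures used are illustrative.
    """
    return (payment + fuel + maintenance) / income
```

A borrower paying 300 per month on a 3,000 income looks fine at a 0.10 payment-to-income ratio; add 250 in fuel and 80 in maintenance for a thirsty vehicle and the true burden is 0.21 – more than double what the ‘valid-looking’ measure suggested.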
With respect to the usefulness of data, then, it is important to understand the reliability of each data element – as well as to understand the validity of each measure, with respect to the theory of what is believed to be assessed from those data.
In more complex data analytics – such as the creation of financial models that are dependent upon both data and modeling assumptions – it is even more important to understand and assess the totality of the reliability and validity of the data collection and modeling process. Wonderfully complex and sophisticated data collection and modeling processes have been shown to be woefully inadequate at very critical times. This area of model risk management is one of the ‘cutting edge’ concerns of modern business and financial institutions.
On certain occasions, a single publication opens up an enormous field of engagement. Such was the case with the publication of the MANN model papers. The ‘Multidimensional Analysis of Nearest Neighbors’ was characterized as a novel way of aggregating cohorts of borrowers into neighborhoods – collections that share similar attributes. The initial paper was published as: Allen, Craig M., “Credit scoring and risk-adjusted pricing: a review of techniques.”
Obviously Delphi started its focus in the financial markets. But it was fortunate to have engaged in some fairly innovative forays into very different spheres of work. One area that was particularly enjoyable for Delphi involved solving problems with very physical characteristics – process control and process management, as in the manufacture of certain products such as machined parts in the aerospace industry.