Overcome Analysis Paralysis



I mean, you’re talking about hundreds of millions of records to properly understand what’s happening in the ecosystem across a myriad of data sources. And I think that is really one of the most problematic areas. You’ve got your payment system, you’ve got your consumer-facing layer (application, web browser, et cetera), and then multiply that, because there could be multiple distribution points. So bringing all that data together, normalizing it, and being able to trust that data set is probably step number one. It speaks to data integrity, because if your data is flawed at its most fundamental level, then any sense you make of it on the back end is not just going to be irrelevant; it is likely to be wrong. So step one is to really trust the data, and that comes through automation and normalization of the data. And unfortunately, I don’t know about you, but I cannot normalize hundreds of millions of records by hand. So you need software to do that, right, and engines to automate that process. And only from there can you get into prescriptive or descriptive insights, either through automation or from a human perspective, to really understand what’s happening in the overall ecosystem. So, not to be flippant, but before you can make sense of the data, you need to be able to trust it.
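The normalization step described here, mapping records from several source systems onto one shared schema so they can be trusted and joined, could be sketched roughly as follows. The source names, field names, and mappings are hypothetical, purely for illustration; a real pipeline would run this at scale inside an automated engine, as the answer suggests.

```python
# Hypothetical per-source field mappings: raw field name -> canonical field name.
# In practice these would be maintained as configuration, not hard-coded.
FIELD_MAPS = {
    "payments": {"txn_id": "id", "amt_cents": "amount_cents", "ts": "timestamp"},
    "webapp":   {"orderId": "id", "totalCents": "amount_cents", "created": "timestamp"},
}

def normalize(record: dict, source: str) -> dict:
    """Map one source-specific record onto the shared schema,
    keeping provenance so the normalized data can be audited."""
    mapping = FIELD_MAPS[source]
    out = {canonical: record[raw] for raw, canonical in mapping.items()}
    out["source"] = source
    return out

# Two records describing the same transaction, arriving with
# different shapes from different systems.
raw_payment = {"txn_id": "p-1", "amt_cents": 4200, "ts": "2024-01-05T10:00:00Z"}
raw_web     = {"orderId": "p-1", "totalCents": 4200, "created": "2024-01-05T10:00:01Z"}

normalized = [normalize(raw_payment, "payments"), normalize(raw_web, "webapp")]
# Both records now share one schema and can be joined on "id".
```

Only once every source lands in this shared shape can downstream descriptive or prescriptive analysis be meaningful; run on mismatched raw schemas, the same analysis would silently produce wrong answers.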