Aka: A HiveWise Origin Story
In 2018, I took on a contracting role with an advanced AI analytics startup and decided to try to create a way to accurately track ROI for their clients.
I prioritized ROI because hard attribution to value would be a powerful validation for new client sales and would, of course, make existing client renewals and expansions a “fait accompli.”
As a startup, I could understand why they hadn’t yet developed this model. But what perplexed me more was that each of their clients was spending between $100,000 and $800,000 a year on a single analytics package with no clue as to the actual impact it was making. That seemed crazy to me.
I began by diving into the analytics platform’s user data.
Right away I could see who was logging in. I could see what data they were looking at and what queries they were running. I could even see the reports they were sharing and exporting off the platform. But I couldn’t see much else.
For instance, I couldn’t see the questions they were trying to answer. I couldn’t see if they had found the answers they were looking for. I couldn’t see what they did with the reports they downloaded. Who they sent them to. What discussion ensued…
Crucially, I couldn’t see what decisions were being made based on the analytics. And without that, of course, I couldn’t calculate a real ROI.
So I began calling clients. Asking questions. Trying to piece together the post-download journey of analytics reporting through the wild jungles of the client’s organization. I did this over and over until I felt like I got a better picture of what was happening. Here was what I found:
- Few lone rangers: Almost never did the person receiving the reporting immediately make a decision right then and there. They nearly always needed to consult more experts and stakeholders.
- More was less: The more people needed to review the analytics (and the more timely the data), the less chance that data had of turning into a decision.
- No cross-team sharing of best practices: I interviewed multiple teams at the same bank – all of whom were trying to reduce risk and increase the volume of credit granted. All were using similar data sets, yet they were approaching the problem in very different ways – with very different degrees of success.
- Lack of feedback: When business teams hit a wall because they needed more or different data, those requirements rarely made it back to analysts or vendors. Many times they simply moved on.
- No continuity: When an analyst or manager was transferred or left, I was curious how well their replacement could pick up where they had left off. In nearly every case, the new person simply gave up digging through the mountains of messages, emails and PowerPoints their predecessor had left behind. They essentially started over. I found this loss of IP disheartening.
- The ROI was there, just hiding: I did, however, find the ROI I was looking for. I was able to track a number of situations where insights directly led to changes in the client production environment. And with a bit more digging, I was even able to determine the annualized impact of the results in terms of dollars (or Euros). These made great anecdotal case studies. Yet the amount of effort required to manually piece these journeys together was simply not scalable. The magic “ROI calculator” I had in my head was still out of reach.
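The back-of-the-envelope arithmetic behind those anecdotal case studies can be sketched as a simple function. This is only an illustration of the calculation described above; the figures below are hypothetical, not drawn from any actual client.

```python
def annualized_roi(annual_impact: float, annual_cost: float) -> float:
    """Return ROI as a percentage: net annual gain relative to platform cost."""
    return (annual_impact - annual_cost) / annual_cost * 100

# Hypothetical example: a client pays $400,000/year for the analytics package,
# and tracked insights drive $1,000,000/year of impact in production.
print(annualized_roi(1_000_000, 400_000))  # → 150.0
```

The hard part was never this formula; it was manually tracing each insight through the organization to find the `annual_impact` number in the first place.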
Instead of being frustrated by what I found, I was intrigued. Nothing in my interviews led me to believe these issues were unique to advanced analytics. They seemed to impact everything from AI to business intelligence to basic reporting. The challenges of participation, best practices and transparency were clearly broad issues that needed a solution. But what would that solution look like?
The requirements list I began drafting looked something like this:
- Extend the workflow of any analytics platform to encompass the entire data journey.
- Make it easier for groups of people to work together within time frames to make decisions.
- Provide templated workflows based on best practices.
- Create transparency for governance and continual optimization.
- Publish and share decisions with all stakeholders.
- Set KPI objectives and track final results (which would then connect back to the source data).
Not long after, I found myself in a restaurant in Morocco sharing a pigeon pastilla with several people from the MIT Center for Collective Intelligence. One of them, Dr. Mark Klein, had spent the previous 12 years researching and developing digital tools that enable large groups of people to work together to solve complex problems.
The rest, as they say, is history. As Dr. Klein and I compared notes, it became clear that we could combine our visions to create something powerful. For my part, I saw the opportunity to bring transparency and structure to a critical business process – decision making – that directly impacts the P&L of an organization. Dr. Klein saw an opportunity to put decades of research in collective intelligence and decision science to work, first on the commercial side and then, one day, for non-profits and other “for the good of mankind” initiatives. We worked out a deal with MIT by the end of 2019 and HiveWise was officially born.
Now over a year later, I’m thrilled with what we’ve already been able to accomplish with the HiveWise platform and for our clients around the world. We’ve been able to deliver on each of our starting requirements. And we’ve been able to discover new use cases, including enabling broad participation in strategy development and serving as a cornerstone platform for DEI initiatives. Yet, as we look ahead to the coming year, we remain as committed as ever to our original objective – increasing the participation, structure and transparency in the way data turns into action and ROI.