1st October

Data to Insight to Action: The Infinite Loop

At ANCHOVY., we are currently consolidating our own data infrastructure, so we know how hard it is for CEOs and businesses to gather separate sources of data and glean insight into what’s really going on in the business from a high-level, holistic point of view.

ANCHOVY. uses a lot of different cloud-based platforms to manage the day-to-day operations of the business. In this article, we will illustrate how we use the data generated by these platforms to gain insight into our own business.

One of the main platforms we use is Toggl, which we use for time-sheeting: tracking the time spent on client projects. If you want to know why we love Toggl and why it’s our platform of choice, you can read our review of Toggl here.

In summary, we use Toggl to generate data showing exactly how the working time of every employee in the company is allocated across clients.

So, the question then is: what should be done with this data?

The way Toggl works is that employees have to track their time while they are working on a task. This means that employees are responsible for generating their own data. Of course, that’s not always easy to do day in, day out, especially on the hectic days, where hustling is the name of the game. Therefore, it’s important to provide the necessary incentives and “gentle” reminders to use Toggl.

Enter: The Fish Tank

True to ANCHOVY.’s style and culture, this virtual fish tank is populated by, you guessed it, ANCHOVY. employees (and interns), based on their Toggl usage. The fish tank resets every day, so employees have to keep up their Toggl usage to consistently be the biggest fish in the pond.

Sometimes we are docile fish, but sometimes we are sharks… (but still friendly sharks, trust me)

How the aquarium works (Technical)

The Aquarium works in real time: a script sends requests to the Toggl Reports API every few minutes, and the latest Toggl entries are retrieved and stored in a database. Each stored entry contains the start time, duration, and end time of a task done by an employee.
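For the technically curious, a rough sketch of what such a polling script could look like is below, in Python with SQLite standing in for the database. The workspace ID, API token, and table layout are placeholders, and the endpoint and field names follow v2 of Toggl’s Reports API, so treat this as an illustration of the approach rather than our actual production code.

```python
import sqlite3
import time

import requests

# Placeholder credentials -- swap in your own workspace ID and API token.
TOGGL_API_TOKEN = "your_toggl_api_token"
WORKSPACE_ID = "1234567"

# Toggl Reports API (v2) "details" endpoint; the field names used below
# ("user", "start", "end", "dur") follow that version of the API.
REPORTS_URL = "https://api.track.toggl.com/reports/api/v2/details"


def fetch_entries(since: str, until: str) -> list[dict]:
    """Fetch all detailed time entries for a date range (YYYY-MM-DD strings)."""
    entries, page = [], 1
    while True:
        resp = requests.get(
            REPORTS_URL,
            params={
                "workspace_id": WORKSPACE_ID,
                "user_agent": "anchovy-aquarium",
                "since": since,
                "until": until,
                "page": page,
            },
            auth=(TOGGL_API_TOKEN, "api_token"),  # token-based basic auth
        )
        resp.raise_for_status()
        batch = resp.json().get("data", [])
        if not batch:
            return entries
        entries.extend(batch)
        page += 1  # the detailed report is paginated


def store_entries(db: sqlite3.Connection, entries: list[dict]) -> None:
    """Upsert entries keyed by Toggl's entry id; 'dur' is in milliseconds."""
    rows = [(e["id"], e["user"], e["start"], e["end"], e["dur"]) for e in entries]
    db.executemany("INSERT OR REPLACE INTO time_entries VALUES (?, ?, ?, ?, ?)", rows)
    db.commit()


if __name__ == "__main__":
    db = sqlite3.connect("aquarium.db")
    db.execute(
        "CREATE TABLE IF NOT EXISTS time_entries "
        "(id INTEGER PRIMARY KEY, employee TEXT, start TEXT, end TEXT, duration_ms INTEGER)"
    )
    while True:  # poll every few minutes, as the aquarium does
        today = time.strftime("%Y-%m-%d")
        store_entries(db, fetch_entries(today, today))
        time.sleep(300)
```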

The way the aquarium works is that a fish gets bigger as more hours are recorded on Toggl. So when the script queries the Toggl API and new data is returned for employee X, the fish associated with employee X grows. Fish size is relative to all the other fish, so how much bigger employee X’s fish gets also depends on the hours worked by all the other fish (employees*).
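In code, that relative sizing boils down to something like the snippet below. The pixel bounds and the linear scaling rule are made up for illustration; the real aquarium’s rendering logic may differ.

```python
def fish_sizes(hours_by_employee: dict[str, float],
               min_px: int = 40, max_px: int = 160) -> dict[str, int]:
    """Scale every fish sprite relative to the day's biggest tracker."""
    biggest = max(hours_by_employee.values(), default=0.0)
    if biggest == 0:
        return {name: min_px for name in hours_by_employee}  # empty tank, tiny fish
    return {
        name: round(min_px + (max_px - min_px) * hours / biggest)
        for name, hours in hours_by_employee.items()
    }


# Three fish, sized relative to the day's top tracker:
print(fish_sizes({"Alice": 6.5, "Bob": 3.0, "Intern": 1.5}))
# {'Alice': 160, 'Bob': 95, 'Intern': 68}
```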


Performance Reports

To keep up the competitive spirit, employees are also grouped into their respective departments and evaluated on their team’s performance.

At the end of every month, every team is scored on the Average Total Hours per Member: the team’s total tracked hours divided by the number of team members. That is a mouthful, but it simply means that a team hoping to do well needs high performance across all of its members. Every single team member has to pull their weight for the team to get a high overall rating, which creates a sense of shared responsibility and a bit of peer pressure.
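In other words, the monthly team score boils down to something like this toy calculation (illustrative only, not our actual reporting code):

```python
def team_score(hours_by_member: dict[str, float]) -> float:
    """Average Total Hours per Member: team total divided by head count."""
    return sum(hours_by_member.values()) / len(hours_by_member)


# One member falling behind drags the whole team's score down:
print(team_score({"A": 120, "B": 110, "C": 115}))  # 115.0
print(team_score({"A": 120, "B": 110, "C": 40}))   # 90.0
```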

Employee Team Ratings

What kind of value could a company get out of this?

So far, we have demonstrated that collecting data leads to visualisations which, in turn, encourage better data collection and better reporting on that data. But how do we go further? What kind of value could a company get out of collecting all this data?

The beauty of data is that, when collected properly, it can be integrated with other data sources to give a more comprehensive overview of what’s going on in your business. In other words, the value of a data source grows exponentially when connected with other data sources as part of a larger ecosystem.

Since Toggl is used to calculate the time costs incurred in fulfilling client projects, the natural counterpart to complete the picture is to measure the actual revenue generated from fulfilling those same projects. Then we can compare the revenue from a fulfilled project against the actual time cost of finishing it.

The objective is to find the profitability of each service package and, by extension, the profitability of each department, so that the company can make informed decisions about where the problems are and focus its efforts on those specific problems. For a company aiming to stay profitable, attention generally needs to be focused on packages that are currently not profitable, or not as profitable as initially thought, due to hidden, unpriced costs. *¹

A practical demonstration of value

As a practical example, ANCHOVY. itself has several different departments and offers A LOT of different services in standardized packages that clients can choose from. So many, in fact, that it can sometimes lead to choice overload, just like ordering from a restaurant with 100 items on the menu. So it’s important for ANCHOVY. to create the right service packages: ones that offer high value while not becoming unprofitable because too much time is spent fulfilling the service.

I'll have two number 9s, a number 9 large, a number 6 with extra dip, a number 7, two number 45s, one with cheese and a large soda

To solve this problem, we can integrate Toggl data with data from our accounting software to compare the revenue generated by a department against the time spent generating that revenue. By integrating these two data sources, we get a full picture of what’s going on and can answer questions such as the following (a toy sketch of the comparison itself follows the list):

    • Which departments are struggling to be profitable? Is it because of a lack of demand, mispricing, or does it simply take too long to get things done?
    • Which services are yielding a disproportionate percentage of company revenue? What would happen to the company if, overnight, these services were no longer in demand? And how can we mitigate that kind of risk?
    • Which services are handled by multiple departments and are most at risk of being held up by bottlenecks? And is that what actually ends up happening in reality? If so, what can we do to address it?
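To make the comparison concrete, here is the shape of that join with completely made-up numbers; the departments, blended hourly cost, and revenue figures below are placeholders, not our actual data.

```python
# Per-department tracked hours (from Toggl) against invoiced revenue
# (from the accounting software). All figures are invented for illustration.
HOURLY_COST = 35.0  # assumed blended cost of one hour of work, in EUR

hours_by_department = {"Data": 410, "Design": 290, "Dev": 520}
revenue_by_department = {"Data": 18_000, "Design": 9_000, "Dev": 21_500}

for dept, hours in hours_by_department.items():
    cost = hours * HOURLY_COST
    revenue = revenue_by_department.get(dept, 0)
    margin = revenue - cost
    print(f"{dept:<8} revenue €{revenue:>8,.0f}   time cost €{cost:>8,.0f}   margin €{margin:>8,.0f}")

# In this invented example, Design's margin comes out negative -- exactly the
# kind of department the questions above would prompt us to look into.
```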
Guards, Destroy him!
The Shark Tank may or may not be an actual thing. And there’s only one way to find out.

We can also go deeper and analyse what’s going on on a package-by-package basis. For example, one of the data services offered by ANCHOVY. is the data mining package, where clients share access to their data and, in exchange, ANCHOVY. provides periodic performance reports with detailed insights derived from statistical analysis, augmented by Machine Learning/Artificial Intelligence techniques.

Obviously, what can and cannot be done depends heavily on the scale of the client’s business, as well as on what data is being collected and the quality of that data. A chef cannot make food taste good if the ingredients themselves are rotten and inedible.

Since there are a lot of factors that affect the time taken to make data analytics and data mining successful, it can be very tricky to standardize pricing, as the effort required varies wildly from client to client.

Let’s highlight this situation with an example

So, imagine I have a big client, and this is how I spent my time completing a quarterly data mining package for them: 126 hours in total, for work done over 3 months.

Toggl Project Time Sheet

Now imagine that, when we check our data for other clients, we find that the total hours spent average around 65. This makes sense: smaller clients mean less data to go through and fewer things to report on.

Based on this insight, we now know how much of a difference the scale of the client’s business makes to the time taken to do top-notch analytics. We also know that it’s not fair to treat all clients the same, so instead we try to find a win-win solution. For example, we might run a data audit before starting any data mining work. This way, we can come up with a tailored price that better reflects the time needed to still do a high-quality job, instead of doing a poor job at the standard price.
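As a made-up illustration of how the audit could feed into pricing, suppose the standard package price were scaled linearly by the effort the audit predicts. Both the price and the scaling rule below are hypothetical, not our rate card.

```python
# Hypothetical figures for illustration only.
STANDARD_PACKAGE_HOURS = 65      # average effort observed across clients
STANDARD_PACKAGE_PRICE = 4_000   # assumed quarterly price, in EUR


def tailored_quote(audited_hours_estimate: float) -> float:
    """Scale the standard price by the effort the data audit predicts."""
    effort_ratio = audited_hours_estimate / STANDARD_PACKAGE_HOURS
    return round(STANDARD_PACKAGE_PRICE * effort_ratio, 2)


# The big client from the example above: ~126 hours per quarter instead of 65.
print(tailored_quote(126))  # 7753.85 -- roughly double the standard price
```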

To generalise the implications of the example illustrated above:

1. Trying to solve a problem means that new questions have to be formulated and answered (how do we maintain the profitability of the data mining package?).

2. New data needs to be collected, or current data analysed from a different perspective, to answer this new question (we are using Toggl and financial data to answer point 1).

3. We come up with potential solutions based on the insights we found (a pre-emptive Data Audit).

We then re-evaluate whether the proposed Data Audit is having the desired effect, or whether it is creating new problems, which we in turn have to use data to understand and solve, and so on… And thus starts the never-ending loop of data-driven optimization: from Data to Insight to Action, a continuous cycle.

Author
Jeff Zammit
