Firms that have ventured down the path of profitability analysis understand the complexities of the journey. The BIGfish Profit Module adheres to best practices and design concepts to deliver a top-notch reporting experience.
Profit analysis is not a one-size-fits-all proposition. A firm’s make-up influences the variables that come into play during the configuration of the profit module. Does the firm have multiple offices? How should the system allocate shared overhead? Does the firm want to report on profit from a cash or accrual basis?
The cost of providing a service versus its worked, billed or collected value is the simplest definition of profit. Law firms measure realization – the percentage of worked value that is billed or collected – but only by assigning a cost basis do we create a true profit model.
The simplest way to assign a cost basis is to create an annual rate for a unit of production. In law firms the unit of production is an hour of billable work. We commonly create a base rate and a loaded rate. To calculate the base rate, divide the attorney's gross compensation for the prior year by the number of billable hours the attorney generated. To arrive at the loaded rate, add the attorney's share of overhead to the gross compensation and divide by the number of billable hours. With these two rates we can look at work performed (i.e., entered or billed values from the time and billing system) to determine profit. Since the core of this analysis is time entries, aggregations over client, attorney, office, law type, etc. become possible.
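The rate arithmetic above can be sketched in a few lines. This is an illustrative sketch only – the function names and the example figures are assumptions, not values from any actual system.

```python
# Hypothetical sketch of the base-rate and loaded-rate calculation
# described above; names and numbers are illustrative.

def base_rate(gross_compensation: float, billable_hours: float) -> float:
    """Prior-year gross compensation divided by billable hours."""
    return gross_compensation / billable_hours

def loaded_rate(gross_compensation: float, overhead_share: float,
                billable_hours: float) -> float:
    """Compensation plus allocated overhead, divided by billable hours."""
    return (gross_compensation + overhead_share) / billable_hours

def profit(worked_value: float, hours: float, cost_rate: float) -> float:
    """Worked (or billed/collected) value less the cost of the hours."""
    return worked_value - hours * cost_rate

# Example: $240,000 compensation, $60,000 overhead share, 1,500 billable hours
base = base_rate(240_000, 1_500)              # 160.0 per hour
loaded = loaded_rate(240_000, 60_000, 1_500)  # 200.0 per hour
```

With the loaded rate in hand, ten hours worked at a $350 standard rate yields a worked value of $3,500 against a $2,000 fully loaded cost – a $1,500 profit on that time.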
Annualized rates paint an incomplete picture of profit because they do not capture fluctuations in attorney costs over the course of the year. Computing attorney costs by accounting period and taking into account write-offs and adjustments gives a more robust profit number. The BIGfish Profit Module provides tools for firms to capture data from external systems, to key periodic adjustments and to scour the GL to create formula-based costs (overhead distribution, for example).
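To see why period-based costing matters, consider a minimal sketch, assuming hypothetical inputs: a month's compensation, that month's overhead allocation from the GL, and billable hours net of write-offs and adjustments. A slow month drives the effective hourly cost well above the annualized rate.

```python
# Illustrative sketch of period-based costing; all inputs are assumed.

def period_cost_rate(period_comp: float, period_overhead: float,
                     period_hours: float) -> float:
    """Effective hourly cost for one accounting period."""
    return (period_comp + period_overhead) / period_hours

# Same monthly cost, different production levels:
busy_month = period_cost_rate(20_000, 5_000, 140)  # ~178.57 per hour
slow_month = period_cost_rate(20_000, 5_000, 100)  # 250.00 per hour
```

The same $25,000 of monthly cost produces a markedly different hourly rate depending on hours produced, which an annualized rate averages away.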
BIGfish provides many options for presenting profit module data to end users. From standard data tables to advanced visualizations, BIGfish leverages the full power of your data to give you the business insights you require to make sound business decisions.
The old axiom “garbage in, garbage out” applies to data analysis. Even the most sophisticated BI tools may deliver incorrect or incomplete stories when presented with data irregularities. The first step in any good BI process is to clean and transform the data. Most enterprise class BI systems provide utilities to identify and correct bad or missing data.
Some examples of this include: null values, blank values, bad dates and other invalid values. Let’s look at each of these in turn. Null values are often misunderstood. A null value is not a blank; it is the absence of a value. Nulls create all sorts of headaches for filtering, sorting and aggregating data. Nulls are not always bad, but they definitely need to be taken into account. We recommend replacing nulls in numeric fields with a default value (typically 0). Like null values, blank values create a level of unpredictability. Defaulting to some textual value like ‘N/A’ or ‘NONE’ usually takes care of the issue. Bad dates usually come from entry fields in the accounting system that do not validate input or restrict future dates. By way of example, a user accidentally enters 2/1/2129 instead of 2/1/2019. Future dates can cause transactions to drop out of reports, especially on dashboards with date range parameters or aging criteria.
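The cleansing rules above can be sketched as a simple per-row transform. This is a minimal illustration – the field names (`amount`, `state`, `trans_date`) and the use of today's date as the "future" cutoff are assumptions for the example.

```python
# Minimal sketch of the cleansing rules described above:
# default nulls to 0, default blanks to 'N/A', flag future dates.
from datetime import date

def clean_row(row: dict, today: date) -> dict:
    cleaned = dict(row)
    # Nulls in numeric fields default to 0
    if cleaned.get("amount") is None:
        cleaned["amount"] = 0
    # Blank or null text values default to 'N/A'
    if not cleaned.get("state"):
        cleaned["state"] = "N/A"
    # Flag future dates (e.g., 2/1/2129 keyed instead of 2/1/2019)
    cleaned["bad_date"] = cleaned["trans_date"] > today
    return cleaned

row = {"amount": None, "state": "", "trans_date": date(2129, 2, 1)}
cleaned = clean_row(row, date.today())
# amount defaults to 0, state defaults to 'N/A', bad_date is True
```

In practice these rules would run as part of the load into the data warehouse, with flagged rows surfaced for correction in the host system.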
A good BI tool highlights bad dates so that they can be corrected in the host system. Often firms want to group data for analysis by a field from the host system that is not validated – that is, a field where users can enter whatever value they want. A common example would be a field like state. If the state field fails to restrict values to a known list, users tend to key the same piece of data in different ways. California, for example, could have many different entries: CA, CA., CAL, or California. Name entry fields create a more pronounced version of this problem. Best practices recommend reporting on validated data, but exception reports can identify “similar values” for correction in the host accounting system. In addition to simply highlighting data inconsistency, a good BI tool will provide some data “cleansing” capabilities as a part of the creation of the data warehouse. Rules can be set to trap bad data and transform it as a part of the data creation process. When the data warehouse does not match the host system, balancing between the two systems presents challenges; correcting the data in the host system ensures fewer reporting headaches. As firms begin to move toward a more data-driven decision-making process, finding the correct BI platform is half the challenge. Ensuring that system has good quality data to report against is equally important.
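An exception report for "similar values" can be approximated by grouping raw entries under a crude normalized key and flagging any key with more than one spelling. This is a rough sketch only – the two-character grouping key is an assumption chosen for illustration, and a real report would use a more careful matching rule.

```python
# Rough sketch of a "similar values" exception report for an
# unvalidated state field; the grouping heuristic is illustrative.
import re
from collections import defaultdict

def similar_groups(values: list[str]) -> dict[str, list[str]]:
    groups: dict[str, set[str]] = defaultdict(set)
    for v in values:
        # Uppercase, strip non-letters, take a short prefix as the key
        key = re.sub(r"[^A-Z]", "", v.upper())[:2]
        groups[key].add(v)
    # Only keys with more than one raw spelling need human review
    return {k: sorted(vs) for k, vs in groups.items() if len(vs) > 1}

report = similar_groups(["CA", "CA.", "CAL", "California", "NY"])
# {'CA': ['CA', 'CA.', 'CAL', 'California']}
```

The report surfaces the inconsistent spellings for correction in the host accounting system, in line with the recommendation above to fix data at the source rather than only in the warehouse.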