Effective data analysis holds the key to skillfully managing a commercial real estate portfolio. For years, the default tool used to gather and repurpose data has been spreadsheet programs such as Microsoft Excel. On a standalone basis, Excel is undoubtedly powerful. But when individuals from a larger group decide to maintain their own spreadsheets when managing the same portfolios, problems can emerge.
This illustrates what is often referred to as the “single source of truth” conundrum: multiple people create their own versions of what should be a single data source. This becomes problematic when reporting, as diverging datasets inevitably lead to the same question: “Which source is accurate?”
The criticality of property tax management highlights just how important it is to ensure there’s a reliable and accurate single source of truth. Within the property tax department, the volume of data can quickly become overwhelming. The problem is exacerbated by the amount of manual work required to carry out the fundamental duties of the job.
In turn, the manual nature of work within Excel can lead to human errors such as inaccurate data entry, or system glitches that can result in shutdowns and data loss. The issue of human error can’t be overstated. As much as Excel is a software program that offers high-value computing functionality, it fundamentally relies on manual data input, and therein lies the problem. Just think about how one or two mistakes, magnified across a team, can quickly cascade.
Given the criticality of data, and the crucial role it plays in CRE-based reporting and property tax management, having the right tools to counter human limitations can be invaluable. Case in point is itamlink by Rethink Solutions, a tool that standardizes data and gives form and repeatable structure to the property tax process. An overview of this application can be reviewed here.
An effective way to gauge the differences between Excel and itamlink is a side-by-side comparison that reveals the gaps. Once those gaps are identified, we can determine how they are best addressed.
The main risk in relying on spreadsheets concerns the issue of missing data. By their very nature, CRE portfolios contain numerous data points for each property, but the layers continue. A property may sit on multiple parcels or roll numbers. For each parcel/roll number, there are unique property tax assessments and bills. When property tax is managed strategically, the financial burden itself is only one component of the operational activity. For example, consider developing forecasts, reconciling accruals, determining appeal opportunities, managing appeals, and more. This only begins to scratch the surface.
As data gaps begin emerging, they can quickly compromise the integrity of the data presented in financial records and formal reports to key stakeholders including investors, portfolio managers, auditors, or, in the case of property tax, those charged with collecting taxes.
We can then see how the inability to nail down a single source of truth has a cascading effect and impacts data integrity, ultimately canceling out any benefits associated with using an electronic tool (i.e. Excel), if the process is still prone to human error.
Unless there is a specific automated and reliable mechanism for identifying data gaps (consider AI for instance), errors can easily be overlooked. Even with an extremely labor-intensive manual review process (that effectively negates the benefits of using a tool like spreadsheets in the first place) errors can still be missed.
It should be noted that programs like Excel have functions like VLOOKUP or XLOOKUP that are built to help identify specific data points. But these functionalities fail to provide ample warnings of missing data. Unless a program knows what to look for, finding it becomes near impossible.
Case Example: Missing Tax Bills
At scale, such problems are amplified. This is best illustrated through a hypothetical situation where a person is responsible for managing a 10-parcel portfolio. When skimming through records, she finds only eight tax bills. Not having all ten tax bills undoubtedly creates headaches, but the gap will be easy to notice. What if the portfolio consists of a hundred parcels? Five hundred? Would you notice a missing bill? Clearly, the larger the portfolio’s scale, the more magnified the problem.
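The gap-detection described above is simple to express outside a spreadsheet. The sketch below uses invented parcel IDs and amounts purely for illustration: comparing the set of expected parcels against the bills on file surfaces missing items at any portfolio size.

```python
# Hypothetical sketch: flag parcels with no recorded tax bill.
# Parcel IDs and dollar amounts are made-up example data.
expected_parcels = {f"P-{i:03d}" for i in range(1, 11)}  # the 10 parcels we manage

bills_on_file = {  # only 8 bills were received
    "P-001": 4200.00, "P-002": 3875.50, "P-003": 5100.00,
    "P-004": 2990.25, "P-006": 4410.00, "P-007": 3650.75,
    "P-008": 5025.00, "P-010": 4780.00,
}

# A set difference surfaces the gap immediately, whether the portfolio
# has ten parcels or five hundred.
missing = sorted(expected_parcels - bills_on_file.keys())
print("Missing tax bills for:", missing)  # ['P-005', 'P-009']
```

The point is not the code itself but that the check is systematic: a dedicated system runs it continuously, whereas a spreadsheet only reveals the gap if someone happens to look.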
Anyone who has ever worked with a large volume of data in one spreadsheet knows the peril of unformatted data and the assumptions spreadsheets are programmed to make. For example, a value entered as "12.34" will typically be interpreted as a number and reformatted accordingly, and 1-2-22 will be construed as a date.
This shortcut can be acceptable for casual use, but what happens if a particular parcel happens to be labelled with a numerical code? Maybe a parcel is numbered 123, with a decimal suffix indicating the kind of property, such as .01 or .02. Entered as a number, a code like 123.10 is silently stored as 123.1, and its meaning changes.
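This kind of silent coercion is easy to demonstrate. The sketch below (with invented parcel codes) mimics a spreadsheet’s guess that anything numeric-looking is a number, and shows what is lost:

```python
# Illustrative sketch of the coercion spreadsheets apply on data entry.
# naive_parse mimics a spreadsheet guessing that anything numeric-looking
# is a number; the parcel codes here are invented for the example.
def naive_parse(value: str):
    try:
        return float(value)  # "123.10" becomes 123.1 -- the trailing zero is lost
    except ValueError:
        return value         # "1-2-22" stays a string here; Excel would make it a date

codes = ["123.10", "123.01", "1-2-22"]
print([naive_parse(c) for c in codes])  # [123.1, 123.01, '1-2-22']

# Treating every parcel code as text preserves it exactly as entered.
print(codes)                            # ['123.10', '123.01', '1-2-22']
```

The fix, in any tool, is the same: parcel codes are identifiers, not quantities, and should always be stored as text.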
When new data is added to a spreadsheet, the existing data retains its formatting, but the new cells may be imported or entered under different assumptions rather than carrying that formatting forward. A user can propagate formatting manually, but doing so often means copying whole rows, which risks leaving stale data in place where it shouldn’t be.
Data sanitation is a significant issue, and spreadsheets tend to prioritize convenience over accuracy.
The problem here extends beyond the spreadsheet assuming a particular kind of data is something it might not be. The real issue is when a user tries searching through information later, data in the wrong format simply won't show up. Not to mention, any analysis done using this data will reflect the errors in the source fields.
Managing a CRE portfolio rarely centers around a single spreadsheet. Instead, a portfolio manager likely has many spreadsheets on the go at any given time. Not only are these spreadsheets used for multiple purposes, but they may or may not include historical data, and they often contain several variations of the same data.
What happens if data needs to be changed within a given source? This can have rippling effects, given the number of spreadsheets requiring their data be updated too. This can be problematic for a property tax manager who needs to identify all the places and spreadsheets where this data may exist.
Another problem associated with import and export occurs when exported data is fed to external analysis tools. Does internal data have the formatting and organization that is compatible/usable with the external tool, or does it require manual data manipulation prior to transferring that information?
A key benefit of using a platform like itamlink is its ability to leverage numerous integrations built into the system. Not only is data access simplified with consistent formatting, but it also uses the power of APIs to perform data management, data synchronization, and import/export in real time.
There is an assumption by some that though errors might occur with data, there may be simple work-arounds that minimize risk. This can be problematic, as something as seemingly benign as one incorrect data point can in fact create significant losses which can lead to far-reaching repercussions.
Spreadsheets require access to data. But, how can the process of inputting and analyzing that data be efficiently managed?
The dominant option is manual data entry. Each time data requires updating, a user analyzes and identifies any data that has changed and updates every cell accordingly. This process populates the remainder of a user’s reports with derived data, so the results of the analysis can be seen on an as-needed basis. It works, but it is fraught with significant drawbacks: it is slow, it is prone to transcription errors, and it leaves no reliable audit trail of what was changed and when.
Spreadsheet programs have mechanisms to address these problems. For example, Excel allows a user to use a website as a data source. This fix, though helpful, doesn’t always represent a definitive solution: the external site can change its layout or format without notice, refreshes can fail silently, and the underlying data remains outside your control.
The main theme here is an over-reliance on external resources inhibiting your ability to maintain quality control. Problems can quickly mount across spreadsheets with hundreds of data points across hundreds of parcels. The scale of this compellingly demonstrates how quickly the process can spin out of control.
A system like itamlink allows a user to select a single verifiable source of data on which all future data analysis is based. Itamlink verifies data formatting and maintains the integrity of that data regardless of data updates or how many layers of analysis you want to do using that original data. In turn, quicker analysis is facilitated, ensuring data doesn't fall out of sync across different reports using different data sources.
Now consider the impact of collaboration-based work routinely conducted across the organization. Several potential concerns regarding Excel emerge, including version conflicts as multiple copies circulate, the lack of granular access controls, and no clear record of who changed what and when.
Large CRE organizations typically have extensive portfolios and more complex financial and audit obligations. This requires more complex analytical models and an alternative approach. Under a traditional management framework, this means more spreadsheets to manage, which can be hampered by the various factors discussed above, in addition to a computer’s capacity to open and edit them.
Once a spreadsheet gets too large and complex, many low-grade PCs and other devices simply lose their capacity to open and process the data. Such a situation can rapidly scale up hardware costs simply because data processing becomes overwhelming. Pushing spreadsheet software like Excel to its computing limits is beyond frustrating—it’s foolish, especially in the long term.
Ultimately, we need to efficiently view and use data to inform decision-making. For illustrative purposes, consider a user who wants to pull a report on the NOI for each property. For this to be done effectively, forecasts used often require incorporating several years’ worth of data.
The first challenge this leads to is identifying the derived data and reports most relevant to the task. If a user is unaware of what data is most relevant, they will have difficulty conceptualizing or structuring a report yielding any meaningful utility. It is one thing to manage property tax data. It is another to be an expert in what reporting is necessary.
Another challenge is identifying how to run a calculation using readily available data. It may be something as simple as an equation using the data contained in one spreadsheet. From there, it might also require pulling data from multiple sheets across multiple years or records using several nested equations. Given the volume of sources from which data is being pulled, consistency can easily be compromised; the process becomes ripe for human error, and problems are likely to emerge.
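To make the contrast concrete, here is a minimal sketch of the NOI-style calculation discussed earlier, run over consolidated records rather than nested cross-sheet formulas. Property names and figures are invented example data, and NOI is computed here simply as operating income minus operating expenses:

```python
# Hypothetical sketch: one auditable NOI calculation over consolidated
# records, instead of nested formulas spread across multiple sheets.
# Property names and dollar figures are invented example data.
records = [
    {"property": "100 Main St", "year": 2022, "income": 1_200_000, "opex": 450_000},
    {"property": "100 Main St", "year": 2023, "income": 1_250_000, "opex": 470_000},
    {"property": "55 Bay Ave",  "year": 2023, "income":   900_000, "opex": 380_000},
]

# NOI = operating income minus operating expenses, per property per year.
noi = {(r["property"], r["year"]): r["income"] - r["opex"] for r in records}

for (prop, year), value in sorted(noi.items()):
    print(f"{prop} ({year}): NOI ${value:,}")
```

Because the logic lives in one place, a multi-year report draws on the same calculation every time, rather than on whichever formula variant happens to exist in a given sheet.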
A third challenge involves developing those reports independently and from scratch. Though downloaded templates can be used to create reports, they must be connected to the proper data sources, and might not show a user exactly what is actually happening in the embedded formulas. Remember, a spreadsheet, at its core, is nothing more than tables of data; any formatting, display, labeling, are all entirely dependent on user input.
Even with a talented data analyst on your team who can identify the appropriate reports, with spreadsheets that analyst must also develop a logical framework to execute on those reports. Even once the reports exist, there is the question of “look and feel”: spreadsheets offer an inadequate presentation layer, and it takes additional formatting and effort to create something presentable to the public, investors, a board, or other authorities.
Security comes in two forms: access (privacy and security) and redundancy (back-ups).
If all your data is stored in a single spreadsheet, upon which everything else is built, what happens if that spreadsheet breaks or becomes non-functional? What if the file becomes corrupted or is accidentally deleted? What if your spreadsheet is so complex you can no longer isolate the errors that make it chronically prone to crashing?
Good data retention policies often require file backups to be secured off site. If stored onsite, your backups can be lost too if, for example, office sprinklers damage multiple computers. Even with offsite safeguards in place, any work done since the backup operation was last performed must be repeated. Bear in mind, backups can also fail, and many an organization has been left floundering when they discover their backups haven't functioned in years.
Remember that simply putting a spreadsheet in a cloud storage system like Dropbox or Google Drive doesn't always translate to a suitable backup. Such a file can still be corrupted, and the cloud system can easily revert to the corrupted version. Additionally, this doesn't protect against accidental deletion.
A system like itamlink ensures backups are maintained to the highest security and redundancy standards—think bank-grade cloud-based storage. Data is kept secure and redundant in case of failure of any individual component.
A spreadsheet stored locally on an in-office computer is susceptible to data security breaches, just as those stored in the cloud are. Compromised accounts, network hacks and other security vulnerabilities abound. Even if a spreadsheet is password-protected, there are straightforward and relatively uncomplicated workarounds to gain access.
In contrast, itamlink’s data security has been validated through both SOC 1 and SOC 2 audits. With enterprise-grade security protecting your data, it is guarded against unauthorized access as rigorously as is technically possible.
For years, spreadsheets have played an integral role in managing various aspects of data within organizations. Originally, they were built to perform tasks such as simple data management and small-scale reporting. Additionally, they’ve always been great for ad hoc modeling and data manipulation.
But as the complexities within an organization grow (often exponentially), the question becomes how well are spreadsheets equipped to seamlessly adapt to usage of this magnitude? Often, these documents are simply not the best option for large-scale reporting, CRE portfolio management, forecasting and analysis, as well as data management within areas such as property tax.
Throughout this blog, we have provided examples highlighting seemingly benign inefficiencies, but 20 years in property tax software has shown Rethink Solutions that such risks should not be underestimated. Consider a property tax bill that never arrives, and you don’t realize is missing. Consider that the reminders and penalty warnings don’t reach you either. It is not unheard of for oversights like this to go unnoticed until the property tax jurisdiction puts a lien on your property and changes your locks. In this example, a missed property tax bill can lead to a stain on your brand image. When you use software like itamlink, any missing tax documents and data are easily flagged in the course of day-to-day operations, and nightmares like this can be wholly avoided.
The accumulation of risk-avoidance anecdotes like these highlights another aspect of itamlink’s overall utility. The risks associated with growth can effectively be neutralized with a professional tool like itamlink, which does the work while minimizing the effects of human error. Instead, you get pre-built workflows and reports, intelligent analysis, identification of data gaps, and seamless integration. itamlink is a tool that can transform a reactive process into something proactive—and more importantly—strategic.