Dear Analyst Episode #114: How a small real estate investment company uses modern data and cloud tools to make data-driven decisions
When you think of data pipelines, data warehouses, and ETL tools, you may be thinking about some large enterprise that is collecting and processing data from IoT devices or from a mobile app. These companies use tools from AWS and Google Cloud to build complex workflows that get data to where it needs to be. In this episode, you’ll hear about a relatively small company that is using modern cloud and data tools rivaling those of these enterprises. Elite Development Group is a real estate investment and construction company based in York, Pennsylvania with fewer than 50 employees. Doug Walters is the Director of Strategy and Technology at Elite, and he discusses how data at Elite was trapped in QuickBooks and in various other tools like their property management software. He spearheaded projects to build data connectors that aggregate these sources into a modern data stack for making real estate decisions.
Data is stuck in silos
Elite Development Group consists of a few divisions: HVAC, home performance, energy efficiency, etc. All the typical functions you’d expect a real estate company to have. Doug first started working in IT support and realized the company didn’t have easy access to its data to make data-driven decisions. You’ve probably heard this phrase over and over again:
Data is trapped in silos.
You buy some off-the-shelf software (in this case property management) that is meant for one specific use case. Over time, that data needs to be merged with your customer data or sales data. You end up exporting the data in these silos to CSVs to further combine these data sources down the line. For Elite, data was trapped in property management software, QuickBooks, you name it.
Starting the process to export data
After doing a survey of their tools, Doug realized that there weren’t many APIs to easily extract data from the source. So he helped set up data scrapers to get data off of the HTML pages. He also used tools like Docparser to extract data from Word docs and PDFs.
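A scraper like the one Doug describes can be sketched with Python's standard library alone. The table structure and cell contents below are hypothetical (the episode doesn't show the actual vendor pages); the point is how a parser collects table cells into rows:

```python
from html.parser import HTMLParser

class TableScraper(HTMLParser):
    """Collects the text of every <td> cell, grouped into rows by <tr>.

    A minimal sketch -- the real vendor pages aren't public, so the
    HTML below is an illustrative example, not Elite's actual markup.
    """
    def __init__(self):
        super().__init__()
        self.rows = []       # completed rows
        self._row = []       # cells of the row being parsed
        self._in_cell = False

    def handle_starttag(self, tag, attrs):
        if tag == "tr":
            self._row = []
        elif tag == "td":
            self._in_cell = True

    def handle_endtag(self, tag):
        if tag == "tr" and self._row:
            self.rows.append(self._row)
        elif tag == "td":
            self._in_cell = False

    def handle_data(self, data):
        if self._in_cell and data.strip():
            self._row.append(data.strip())

html = """
<table>
  <tr><td>Unit 4B</td><td>Work order open</td></tr>
  <tr><td>Unit 7A</td><td>Completed</td></tr>
</table>
"""
scraper = TableScraper()
scraper.feed(html)
print(scraper.rows)  # [['Unit 4B', 'Work order open'], ['Unit 7A', 'Completed']]
```

From here, the extracted rows could be written to a CSV or pushed into a shared spreadsheet, which is roughly the shape of pipeline described next.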
Most data was either in XLS or CSV format, so Doug was able to set up an automated system where every night he’d get an email with a CSV dump from their property management system. This data then ended up in a Google Sheet for everyone to see and collaborate on. After doing this with property management, Doug started exploring getting the data out from their work order tracking system.
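The nightly-CSV step can be sketched in a few lines. The column names below are illustrative, not the property management vendor's actual export schema:

```python
import csv
import io

# Hypothetical nightly CSV dump, as it might arrive by email attachment.
nightly_dump = io.StringIO(
    "unit_id,tenant,rent,status\n"
    "101,Smith,950,occupied\n"
    "102,,1100,vacant\n"
)

# Parse each line into a dict keyed by the header row -- the records are
# then ready to append to a shared Google Sheet (e.g. via the Sheets API)
# or load into a database.
records = list(csv.DictReader(nightly_dump))

vacant = [r["unit_id"] for r in records if r["status"] == "vacant"]
print(vacant)  # ['102']
```

The appeal of this design is that the CSV email is the only integration point: the source system needs no API, just a scheduled export.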
Creating accurate construction cost estimates
One activity Doug wanted to shine the data lens on was construction cost estimates. Hitting budgets is a big part of the construction process. You have multiple expenditures from a job, and each job needs a specific estimate tied to it. This could all be done in Excel or Google Sheets, but given the importance of this data, Doug decided to create something more durable: an internal database where each cost estimate gets a specific Estimate ID, a unique identifier for that cost estimate.
Since Elite uses QuickBooks for their accounting, each project had to be tied to the unique Estimate ID established previously, and each work order got its own unique Work Order ID. Now Elite is able to run reports on all their projects to see what the cost estimates and actual expenditures were for a job, and do a traditional budget-to-actual variance analysis.
The result? Project teams could start to see when they were about to hit their budgets in real time.
More importantly, this started Doug down a journey of seeing how far he could automate the data extraction and reporting for his company. With the initial implementation, the data could only get refreshed every 24 hours. He eventually set up the system so that any user could click a button to refresh a report. The data workflow evolved from exporting data into Excel and Google Sheets to complex data connectors and business intelligence software.
Income lost due to vacancy metric
When Elite prioritizes which projects to work on, they look at a metric called “income lost due to vacancy.” Without the different data connectors and systems Doug helped set up, this metric wouldn’t exist. It essentially helps a property owner figure out how much income they are losing due to vacancies.
When looking at a portfolio of properties to improve, Elite can use this metric to figure out which project would make more high-rent units available. Previously, they would have to rely on intuition to figure out where to invest more time and money in projects.
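The episode doesn't spell out Elite's exact formula, so the sketch below assumes the common definition: each unit's vacant days prorated against its market rent over the reporting period. Units, rents, and day counts are hypothetical:

```python
# Hypothetical per-unit data for one reporting period.
units = [
    {"unit": "101", "market_rent": 950,  "vacant_days": 0},
    {"unit": "102", "market_rent": 1100, "vacant_days": 30},
    {"unit": "103", "market_rent": 1400, "vacant_days": 15},
]

def income_lost_to_vacancy(units, days_in_period=30):
    """Assumed formula: sum of market rent prorated by vacant days."""
    return sum(
        u["market_rent"] * (u["vacant_days"] / days_in_period) for u in units
    )

print(income_lost_to_vacancy(units))  # 1800.0
```

Ranked across properties, a number like this turns "where should we invest?" from intuition into a comparison of dollar figures.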
Building out the data stack
The list of tools Elite uses to extract and process data rivals that of large enterprises. Here is a rundown of Elite’s data stack:
- Fivetran for data loading and extraction
- Amazon Redshift as the data warehouse
- Google Cloud Functions to run one-off tasks
- dbt for transformation and for pushing data into a datamart
- Sisense to create actionable insights
There are multiple data connectors involved in the ETL process as well. With all these modern tools, Elite is able to get up-to-date data every 5–15 minutes.
As Elite went through this data journey, Doug and his team started to ask some of their vendors to develop an API so they could get more data out. Their data vendors would push back and say they’d never seen these requests from such a small company. Typically these data requests come from their largest customers, which shows how deeply Doug’s team has thought about automating their data workflows.
Advice for small companies working with big data
Doug gives some practical advice on how to use some of these tools that are supposedly meant for large enterprises. The first thing is to experiment with spreadsheets before diving deep into a complicated workflow. Doing your due diligence in a spreadsheet is low stakes and helps you uncover all the various relationships between your data.
In terms of learning how to use these tools, Doug mentioned that most of these vendors have their own free or paid workshops and tutorials. I’m always surprised by how much general data training these vendors provide that may not even be about their software. You can learn about databases, SQL, and data analysis from these vendors.
At a high level, Doug says that the data you collect and visualize needs to be tied to some business strategy. These overall goals might include increasing revenue, increasing customer satisfaction, or ensuring your employees are developing new skills. At Elite, the data has allowed the team to look at their portfolio of real estate from the 30,000-foot level all the way down to individual transactions. Data is actually helping them solve real business problems.
And one last plug for Google Sheets: Doug talked about how you once had to hire an “Excel guru” or a data analyst to help you decipher your spreadsheet files. Now Google Sheets has become so robust, extensible, and–dare I say–easy to use that anyone in the company can pick it up and mold it to their needs. No one ever gets fired for using a Google Sheet 😉.
Other Podcasts & Blog Posts
No other podcasts mentioned in this episode!