This REAA Zoom class will be held Thursday, March 18, from 6:30-8:30 PM and is approved for 2 hours of continuing education in California (only). Brian Melsheimer and I will provide a walk-through of using Microsoft Excel for residential appraisal analysis. Given our limited class time, this will be a relatively quick run-through of some basic skills, including creating a scatter graph, linear trendline analysis to determine time and GLA adjustments, contrasting/grouped pairs analysis, and basic Excel setup tips. We’ll also discuss online tools for appraisers.
The FMHPI is your friend. The index reports monthly changes in house prices nationally, by state, and by metropolitan statistical area (MSA). The data series starts in January 1975 and usually runs two to three months behind the current date. The data is available in Excel or text format, either seasonally adjusted or non-seasonally adjusted.
As an appraiser, I use the FMHPI in several ways for current work:
Regional market trends. I use this data set if I want to show how house prices are trending in my home regions of Sacramento or Fairfield/Vallejo.
Market areas with little activity. I cover Stonyford, a small community on the eastern edge of the coastal mountains with very few sales. I trend surrounding MSAs to show regional activity and offer this as the best evidence available for market trends in Stonyford.
Market areas with significant non-conformity. I cover the Sacramento Delta area, a place with a mix of small towns and small acreage residential properties plus some high quality mansions. Market trends are often influenced by the mix of sales and frequently are not reliable so I use the Sacramento MSA instead.
Where the FMHPI really shines is with retrospective work. My local MLS has data going back to 1998 for my county. That’s great for the past 23 years, but what if a client wants a date of value before then? In the past two years I have had multiple requests for appraisals with dates of value in the 1980s and early 1990s. Each time I used the FMHPI to determine market trends and calculate time adjustments.
Normally I download the data into Excel and generate the chart I want, discussed below. However, you can use the pre-built chart and table tool. This is the default chart:
FMHPI default chart
You can add multiple states or MSAs for comparison by clicking the blue button:
FMHPI with Sacramento MSA
You can also adjust the time period on the chart by sliding the time frame below:
FMHPI Date Range Adjusted
Once you have the chart you want, you can print or download it as a PNG file to stick in your report or work file:
Print or download a PNG graphics file
This tool is versatile and includes a table view showing percentage change.
However, I normally download the data into my Excel work file and create my own graphs. The FMHPI MSA Excel workbook has two sheets, with MSAs A-L on Sheet 1 and M-Z on Sheet 2. Column A is the Month, with MSAs in columns in alphabetical order like this:
Month on the far left column, MSAs in alphabetical order
Here’s my simple process to deal with the way dates are formatted.
Download the Seasonally Adjusted MSAs Index:
Click the link
Open the file and copy the Month Column into your Excel work file plus the columns of data for any MSAs you want to analyze.
Leave an empty column immediately to the left of the MSAs you want to work with to fill in Excel-readable dates. I usually label it “Date Sold.”
Then fill in the Date Sold column to match the dates in the FMHPI Month column: 1975M01 is 1/1/75, 1975M02 is 2/1/75, etc. Fortunately, once you establish the pattern, you can use Excel’s autofill to fill down to the bottom:
Copy the columns, leaving space for Date Sold, autofill date sold
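If you’d rather script the conversion than autofill it, the month labels are easy to parse. Here’s a minimal Python sketch of the same idea (the function name is mine; the logic ports just as well to an Excel formula or to R):

```python
from datetime import datetime

def fmhpi_month_to_date(label):
    """Convert an FMHPI month label like '1975M01' to the first of that month."""
    year, month = label.split("M")
    return datetime(int(year), int(month), 1)

print(fmhpi_month_to_date("1975M01").strftime("%m/%d/%y"))  # 01/01/75
```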
You can now use this data in Excel to make line graphs or scatter graphs. If you don’t need all 46 years of data in your chart, graph only what you need. Here’s an example of how I would prepare the Sacramento MSA data for the past two years and past 10 years plus the graphs:
Data copied from original columns for ease of creating graphs
And here are the resulting graphs:
Line Graph Example
I hope you can see the value of this data. Let me know in the comments if you have questions or are already using this data.
“You will sometimes hear statistics people talk about a ‘composition effect,’ which just means that if you are comparing a group over time, you need to beware of the possibility that the composition of the group is changing.” Why would a residential real estate appraiser care about composition effects?
Much of my time as a residential appraiser is spent determining trends in real estate markets. Every day I create charts like the one below to describe the markets in my reports.
These trendlines assume that the group of homes sold does not differ significantly over time. In most cases, this assumption is reasonable. Sometimes it is false.
For example, the global pandemic has affected buyer tastes. My friend Ryan Lundquist wrote about this recently. After six months of being cooped up from the Covid-19 pandemic, buyers want a larger home. Here’s his chart showing the trends in the Sacramento region:
The average size of a home sold in the Sacramento region has increased 100 sf over the past six months. House size is a primary driver of value, so if all other factors are the same, the average price for that market will increase.
BUT IT’S AN ARTIFICIAL GAIN BECAUSE ANY GIVEN HOME OF THE SAME SIZE WOULD SELL FOR THE SAME PRICE!
Similarly, if homes decrease in size over time with no other changes, that would cause the average price to decline with no impact on individual house prices.
Here’s what I do to have a better understanding of market trends:
I trend sale price of homes sold over time,
I trend price per square foot of homes sold over time, and
I trend home gross living area over time.
Why trend price per square foot? This trick accounts for some variation in size and, in conforming areas, can increase the reliability of the trend analysis. However, significant changes in size will still influence the PSF trendline.
I see this frequently in Winters, California. Winters is a relatively small city of about 10,000 people located on the western edge of the Sacramento Valley not far from Davis. Below are the three graphs for Winters sales from 1/1/18 to 10/1/19 (all data from Metrolist MLS).
Prices are clearly increasing in the top graph.
Prices are essentially stable when trending price per square foot for the same sales. Why?
Here’s the third chart showing size of homes over time:
The average size of a home sold in Winters during this time period increased almost 1 sf per day.
Because of the math,
an increase in the average size of homes sold will push down the price per square foot trendline but will push up the sale price trendline, and
a decrease in the average size of homes sold will push up the price per square foot trendline but will push down the sale price trendline.
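A toy example makes this math concrete. The sketch below assumes a hypothetical market where every home sells for a $100,000 base plus $150 per sf (both numbers are made up), so larger homes sell at a lower price per sf:

```python
# Hypothetical market model: price = $100,000 base + $150 per sf.
# (Both numbers are assumptions.) Because of the fixed base, larger homes
# sell at a LOWER price per sf even though they sell for a HIGHER price.
def price(sf):
    return 100_000 + 150 * sf

early_mix = [1400, 1500, 1600]  # sizes sold early in the period
late_mix = [1700, 1800, 1900]   # sizes sold later: same market, bigger homes

def avg_price(sizes):
    return sum(price(sf) for sf in sizes) / len(sizes)

def avg_psf(sizes):
    return sum(price(sf) / sf for sf in sizes) / len(sizes)

# Mean sale price rises while mean price per sf falls, with zero change
# in what any individual home would sell for.
print(avg_price(early_mix), round(avg_psf(early_mix), 2))
print(avg_price(late_mix), round(avg_psf(late_mix), 2))
```

Run it and the average sale price goes up while the average PSF goes down, even though no individual home changed in value. That is the composition effect in miniature.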
Here are some other examples that may cause a market to appear to change over time without a real change in market conditions:
Average lot size changes, especially for small acreage residential properties
Average age of homes changes, especially when new construction ramps up or ramps down
Better quality homes come to market
An outlier, such as a tear down or the biggest home in the county, pulls the trendline out of true
A small, heterogeneous market susceptible to change from the latest sale
A good habit is to take a look at your data. Do you see changes in your data or is it relatively similar over time?
When I do run into composition effects, such as sale price going up and price per square foot going down, I graph both together on the same graph and explain why there is a difference. I then reconcile.
“The sale price trendline is increasing while the sale price per square foot trendline is decreasing. The average size of homes sold has increased during this time period, skewing the sale price trendline up and the price per square foot trendline down. I conclude that this market has been relatively stable during this time.”
Hope this adds to your understanding of residential real estate markets.
It was my pleasure to speak at the Sacramento chapter of the Real Estate Appraisers Association last night at the Story of Value class with my good friend Ryan Lundquist. I discussed ways to explain markets in residential real estate appraisals, focusing on graphs, and was surprised to see that maybe half of the crowd didn’t include graphs in their reports. This post is the first of several offering advice and instructions on how to create meaningful graphs for residential appraisers.
Four appraisers out of 50 in the room reported using histograms. The histogram is a great tool for analyzing residential real estate markets that all appraisers should use.
What is a histogram?
For our needs, a histogram is a graph that shows the distribution of one continuous variable. The histogram splits the variable into equal-sized bins and counts the number of occurrences. It works well for important residential real estate variables like gross living area, lot size, age, and sale price.
Bin size is key to creating a useful histogram. Bins that are too wide lose meaning because your data clumps together; bins that are too narrow spread the histogram out too much.
The graph above shows recent home sales in a market area by size. There is one sale under 1000 sf and one over 3500 sf. The most frequent size of home sold recently is around 2000 sf, with the bulk of the homes in the 1400 sf to 2000 sf range.
Every report prepared for a lender asks the question “Is the subject conforming?” At a glance, any home in this neighborhood between 1300 sf and 2600 sf is reasonably conforming in size. There are no sales in the 4000 sf to 5000 sf range, so any homes of that size are likely non-conforming. The two extreme sales, at 800 sf and 5000 sf, are unusual and likely non-conforming.
This next histogram examines the frequency of sale price in the market area. The most frequent sale price is in the $240,000 to $280,000 range with $360,000 to $400,000 the second most frequent range. A home in contract at $375,000 is fairly typical. A contract price of $700,000 is very unusual and is indicative of a non-conforming home.
The first two graphs were generated using ggplot2 in RStudio. Here’s an example from Excel 365 showing the year built for homes sold in a market:
Most homes sold in this market area (Placerville, CA small residential acreage) were built in the 1970s and 1980s. A couple of homes were built in the 1800s, and there are a couple of newly built homes.
The lender forms used by appraisers ask for similar information in a table format:
Which describes the market better, the two histograms or the table?
Make A Histogram in Excel
Here’s how to use the latest version of Excel to make a histogram. This page has instructions for the latest version and older versions.
Start with your data in an Excel workbook with field names in the top row and sales data in the rows below.
Starting with the field name, select the data to generate the histogram (Ctrl + Shift + Down Arrow selects all consecutive data below).
Press F11 to insert a graph. Then choose Change Chart Type.
Select Histogram, then press OK.
You’ve made your first histogram!
However, it’s really ugly. Standard formatting for histograms is to have no space between the bins (columns). To fix that, double-click one of the bins to activate the Data Series editor. Select the bars to activate Series Options.
Change the Gap Width to 0%. Notice how the columns come together. If you like having gaps between the columns, set the Gap Width to 6%.
To change the bin width, double-click the x-axis labels (GLA in this case). Under Format Axis > Axis Options, select the Bin Width control and type in what you want. Play with it until you’re happy with the shape of your histogram.
Excel defaulted to a bin width of 370 sf. Below is what the histogram looks like with bin width equal to 100 sf:
Here’s bin width equal to 500:
Here’s bin width equal to 200:
Which one appears most useful to you?
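If you want to experiment with bin widths outside of Excel, the binning idea itself is simple. Here’s a minimal Python sketch using a made-up set of GLA values (all numbers below are illustrative, not from the charts above):

```python
from collections import Counter

def histogram_counts(values, bin_width):
    """Count how many values fall into each bin of the given width."""
    counts = Counter((v // bin_width) * bin_width for v in values)
    return dict(sorted(counts.items()))

# Made-up GLA values for a small market area.
gla = [980, 1400, 1450, 1500, 1600, 1650, 1700,
       1900, 1950, 2000, 2050, 2400, 2600, 3600]

print(histogram_counts(gla, 500))  # wide bins: coarse shape
print(histogram_counts(gla, 200))  # narrow bins: more detail, more gaps
```

Comparing the two printouts shows the same trade-off as the Excel screenshots: the 500 sf bins smooth the distribution into a few tall bars, while the 200 sf bins reveal gaps in the data.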
I use histograms to understand some aspect of a market. How big are the homes? When were the homes built? How big are the lots in the neighborhood? What do homes sell for in the market area?
Then consider where the subject fits in the market. Is it bigger than typical? If so, you have support for concluding market value is higher than typical. Is it smaller? Well, now you can show a reason why the price is lower.
Let’s consider the histogram above. The subject is one of the larger homes in the neighborhood but still relatively common in size. I would expect, without knowing anything else, that the subject’s market value is on the higher side for the neighborhood but with a reasonable number of homes larger than the subject. Take a look at the graph below.
What if the subject was one of the largest homes in the neighborhood? The subject’s market value is likely on the upper end of the neighborhood range. Also, there are fewer directly competitive sales, implying market value may be less reliable in this market area than for a smaller home. Now let’s look at an extreme case.
I pity the appraiser asked to appraise a 6500 sf house in this market. However, you do have sales either smaller or larger. Here’s the time to really open your eyes to what is a competitive sale. Throw this graph in your report and your client will immediately see your data difficulty.
After you arrive at market value and as part of your reconciliation, consider using a histogram to support your market value.
“The subject is newer than typical, above average quality custom home on a larger than typical lot. As shown above, the subject’s market value is on the higher side for the greater market area, as expected based on its superior characteristics.”
I hope you agree that histograms can be a powerful tool for appraisers.
Ways to use histograms:
Exploratory analysis to understand characteristics of a market area
Assist in determining reasonable search criteria for sales comparison
Visual representation of the subject’s position in a market area
Support for market value conclusions
I learned about histograms from George Dell. Thanks George. Get smart by taking his Stats, Graphs, and Data Science classes or at the very least, sign up for his blog. More info on George’s website.
Postscript: I am working towards moving away from Excel to using R, the data analysis package. I’ll link to the R code used for the two graphs as a separate post/update soon.
Everyone knows what Microsoft Excel is, right? Either you have a copy that came with your PC or you’re on the Office 365 subscription model at $69/year for a personal copy or $99/year for 5 users (my subscription of choice). Money flows into Microsoft coffers, satisfying shareholders and most of the greater Seattle area given Microsoft’s reach. Make more Seattleites happy by ordering your copy through Amazon!
R is very different (free). It is open source software available under a public license and is maintained by a group of volunteers (free). Get your free copy here.
R on its own is usable. However, it was designed from the ground up to allow for additions to make it more useful.
RStudio, an open source integrated development environment for R, makes using R much easier for folks like me who are not full-time programmers (also free). RStudio sits on top of R and extends usability significantly. RStudio offers the same basic terminal R does but also gives you additional really useful windows and information. I’ll discuss RStudio in the future but if you can’t wait, here’s a link with more information about RStudio.
You can extend the usability of R by adding packages. Packages are bundles of R code with documentation and example data. Data Camp has a beginner’s guide to R packages here. Managing packages is one of RStudio’s strengths, making installation easy. These are free too.
ggplot2 is a package for creating graphics and should be the first package you download. Two more packages of interest to appraisers just getting into R are tidyverse, a collection of R packages for data science, and rmarkdown, a package for adding R output to documents. You can learn more about all three here.
To summarize, download for free R, RStudio, and the ggplot2, tidyverse, and rmarkdown packages. I’ll talk more about packages in the future as I explore R’s functionality.
This is the second post in a series describing my journey to move my residential appraisal business workflow from Microsoft Excel to R. Last time out, I made the case for why I’m making the change. This post will be a catalog of the ways I use Excel today to serve as a guide for where I need to go.
I use Excel a lot. For each appraisal I work on, I start a separate Excel file. We appraisers are required to retain a work file for each report that supports our conclusions and allows someone auditing us to understand what we did.
Here’s what I do with Excel today:
Store my Neighborhood Market Data downloaded from MLS. I grab all sales in a competitive market area for at least five years back, sometimes ten years. This goes into a Market worksheet.
Store my Competitive Market Data downloaded from MLS. I grab all potential competitive comparable sales and listings going back at least 12 months but frequently further. This goes into a Comparables worksheet.
Create Neighborhood Price Per Square Foot and/or Sale Price trendlines from the Market data. If I have questions about trends, I’ll also look at changes in floor area over time. I create Charts for each data run. I then spend time formatting and labeling so I can include the charts in my reports.
Create Pivot Table summaries of the Neighborhood Market Data. My normal summary table includes all sales summarized by month in a neighborhood with homes sold, mean Days on Market, Low Price, High Price, Mean Sale Price, and Mean PSF. I use a template and replace my old market data with the new data, then refresh, so right now this is really fast in Excel. However, I can’t do Median summaries in Pivot Tables easily, an issue that I expect to be able to handle in R. For most of my reports, I include this pivot table summary.
I use the pivot table summaries to create a column chart showing 12 month change in mean PSF and/or mean sale price as another tool to understand and report changes in my neighborhoods. This is especially important in seasonal markets like Davis, California, where home selling revolves around the university schedule. I’ll include this chart in every Davis appraisal and in other appraisals where necessary to explain market trends.
I occasionally create histograms to show the shape of a market with regard to one variable (primarily sale price and floor area; great for showing where the subject lies in relation to the rest of the market). I’ve seen an example of how easy it is to create a histogram in R. I have high expectations that R will be an improvement over Excel for histograms.
Create PSF and Sale Price scatter graphs of competitive sales. I use the trendline coefficients to determine daily price adjustments for market change. I’ll also look at floor area over time to see if my comparables are changing over time or not to help understand what my market is really doing. I include the scatter graphs in my reports so clients and intended users can understand the subject’s sub-market.
I use pivot tables linked to my comparable sales data for contrasting one variable. For example, I’ll use pivot tables to examine the difference between homes sold with pools and without pools, a significant factor in the Sacramento Valley. I’ll create a table that shows how many comparables sold with pools vs. without pools, the difference in mean sale price, the difference in mean PSF, and, to understand my data, mean floor area and mean year built to see if I’m dealing with an apples-to-apples or apples-to-oranges comparison. If the homes with pools are relatively similar in size and age to the ones without pools, my adjustment is more likely to be strictly the pool. If homes with pools tend to be bigger than those without, I have to consider covariance as part of the explanation for the differences noted. (Covariance is a significant issue in residential real estate markets.)
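The pool vs. no-pool contrast can be sketched outside of Excel as well. Below is a minimal Python version with made-up sales (every figure is an assumption); the point is to compare mean size and age alongside mean price so covariance doesn’t masquerade as a pool adjustment:

```python
# Made-up comparable sales; "pool" is the variable being contrasted.
sales = [
    {"pool": True,  "price": 520_000, "sf": 1850, "year_built": 1992},
    {"pool": True,  "price": 535_000, "sf": 1900, "year_built": 1995},
    {"pool": False, "price": 495_000, "sf": 1820, "year_built": 1990},
    {"pool": False, "price": 505_000, "sf": 1880, "year_built": 1993},
]

def group_means(rows, field):
    """Mean of one field, grouped by the pool flag."""
    groups = {}
    for row in rows:
        groups.setdefault(row["pool"], []).append(row[field])
    return {has_pool: sum(v) / len(v) for has_pool, v in groups.items()}

# Compare mean price, but also mean size and age: if pool homes are also
# bigger or newer, the price gap is not attributable to the pool alone.
for field in ("price", "sf", "year_built"):
    print(field, group_means(sales, field))
```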
I have a Calculators page that I use for random modeling and calculations I need to do by hand. The most significant calculators I have here that I’ll need to move to R are one I use to do the math to figure out the time adjustments for comparables plus others for modeling lot size adjustments. These should be painless to move over to R.
This is the bulk of what I do with Excel today. As I start to shift this workflow over to R, I plan to go into more detail about the special or not so special challenges I encounter. I also have high expectations that R will inspire me to come up with new ways of analyzing and presenting my data.
Reminder for appraiser readers in particular: R is a tool. Excel is a tool. Most of what I plan to discuss in this series is about changing tools. Occasionally, I’ll talk about modeling decisions (like covariance above). However, all of what I’m doing is rooted in the Stats, Graphs, and Data Sciences classes I’ve taken. You need to understand the theory so you can make informed decisions about your modeling choices.
Take classes from George, he’s very willing to help. https://georgedell.com/
I just completed an appraisal of a home for a refinance where the subject had sold eight months prior to date of value. I included the prior sale of the subject as Comp#4 and had two very recent, reasonably similar comparable sales that I included in the sales comparison grid. The subject is located in a suburban neighborhood built out in the 1960s-1970s on a ridge with some homes featuring city light views. Many recent sales feature remodeling or recent updating while others are relatively original.
In most suburban neighborhoods, using sale price per square foot (PSF) is the best way to track market trends because PSF reduces the impact of changes in floor area in your comparable sales data. Generally speaking, the sale price model is subject to variations in floor area and is not the best model in a case like this. So, following my normal practice, I used the PSF linear regression model to determine time adjustments to comparables.
The PSF linear regression formula is:
Slope of the Trendline * Subject’s GLA = Daily Time Adjustment
PSF Linear Regression Model
So, in this case:
0.0992 * 1794 sf = $178/day
PSF Model Time Adjustments
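For readers who want to check the arithmetic, here’s a minimal Python sketch of the PSF model with made-up sales data (the real analysis uses MLS data; only the subject’s 1794 sf GLA comes from this post):

```python
# Made-up sales: (days since start of analysis period, price per sf).
sales = [(0, 240.0), (60, 246.5), (120, 252.0), (180, 258.5), (240, 264.0)]

def ols_slope(points):
    """Ordinary least squares slope of y on x (what Excel's trendline reports)."""
    n = len(points)
    mean_x = sum(x for x, _ in points) / n
    mean_y = sum(y for _, y in points) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in points)
    den = sum((x - mean_x) ** 2 for x, _ in points)
    return num / den

psf_slope = ols_slope(sales)   # $/sf per day
subject_gla = 1794             # subject's GLA from the post
daily_adjustment = psf_slope * subject_gla
print(round(psf_slope, 4), round(daily_adjustment))
```

With these invented sales the slope works out to $0.10/sf/day, giving roughly $179/day for the subject, which mirrors the 0.0992 × 1794 sf = $178/day calculation above.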
Once I applied these time adjustments, my range of value was from $447,000 to $466,000 with an indicated value of $460,000. However, the most recent sale indicated $450,000 at most as did homes in contract competitive with the subject. Something wasn’t right.
So I tried a sale price linear regression model instead.
Sale Price Linear Regression Model:
Slope of the Trendline = Daily Time Adjustment
Sale Price Linear Regression Model
In this case, indicated daily time adjustment is $117/day or
Sale Price Model Time Adjustments
The indicated range of value for the subject was $443,000 to $451,000, in line with the most recent sale and competitive homes in contract.
In this case, the Sale Price Linear Regression Model explained my data better than the PSF model.
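Applying the sale price model’s daily adjustment to a comparable is just multiplication. The sketch below uses the $117/day figure from this post; the comparable’s sale price and the days elapsed are assumptions for illustration:

```python
# $117/day is from this post; the comp's price and the days elapsed
# between its sale and the date of value are assumptions.
daily_adjustment = 117      # $/day, slope of the sale price trendline
days_elapsed = 120          # comp sale date to date of value (assumed)
comp_sale_price = 435_000   # hypothetical comparable

time_adjustment = daily_adjustment * days_elapsed
adjusted_price = comp_sale_price + time_adjustment
print(time_adjustment, adjusted_price)  # 14040 449040
```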
I’ll have posts explaining more about time and real estate in the near future and will be teaching a class this July in Sacramento. Or you can learn straight from the national expert, George Dell. Sign up for his website and classes here. Thanks George for sharing with the rest of us.
Why do you think the linear model worked better in this case?
Time is your friend when valuing homes, especially when analyzing unusual properties. Sometimes when appraising a property, you find six model match sales within three months of date of value in a very narrow range, the value of the subject is obvious, and you’re done. Next please.
Other times, the subject is located on a busy street, is larger than any sale in town in the past 12 months, and has a moat. How do you deal with this nightmare? This is when time is your friend.
Expand your search parameters back in time until you find a competitive sale that brackets or equals the subject’s unusual feature. That sale from three years ago that was bigger than the subject is a comparable you can use to value it today. Or better yet, use the prior sale of the subject from five years ago so you have a sense of the value impact of that moat. Sometimes the best comp is the prior sale of the subject.
If you have time adjustments in your tool box, you’ll have access to a wider range of comparables than someone who doesn’t.
I’m developing a time adjustments class for REAA Sacramento that I plan to teach in July. I’ll talk about time adjustments more on this blog but will leave you with a couple of ways to determine time adjustments, especially when you want to bring forward in time a dated sale to use as a comparable.
Gary Christensen has a video here that discusses how to do a linear time adjustment in Excel. This works well when you have a set of comparable sales that you can analyze and the market has been relatively consistent. I’ll discuss this in detail in the near future.
Another simple way that works well for comparables from 2+ years ago is to look at the percent change in average (median/mean) sale price or price per square foot for the subject’s market area compared to the date of value. That percent difference is your time adjustment. Apply the percentage to the comparable’s sale price and your old sale is ready to use as a comp. Most MLS systems have market analysis tools that will let you look up an average price in the past. In the worst case, the Freddie Mac House Price Index on the MSA level can help. Or you can use pivot tables to track changes in market areas (more to come soon about pivot tables).
I recently appraised a home in a small project where the most recent sale occurred nine years prior to date of value. I used the Freddie Mac House Price Index to determine a 29% negative time adjustment for this sale, subtracted the dollar amount, and the dated sale was ready for comparison.
Freddie Mac Home Price Index Calculation
Freddie Mac Index Value
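The index math takes only a few lines. The -29% figure is from this appraisal; the index levels and sale price below are assumptions chosen to reproduce it:

```python
# Hypothetical FMHPI index levels chosen to reproduce the -29% from this
# appraisal; the comp's sale price is also an assumption.
index_at_comp_sale = 210.0   # index level when the comp sold (assumed)
index_at_value_date = 149.1  # index level at the date of value (assumed)
comp_sale_price = 300_000    # hypothetical dated sale

pct_change = (index_at_value_date - index_at_comp_sale) / index_at_comp_sale
adjusted_price = comp_sale_price * (1 + pct_change)
print(round(pct_change * 100), round(adjusted_price))  # -29 213000
```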
What is the most time between a date of value and a date of sale for a comparable in one of your valuations?
I regularly appraise in Dixon, California, a city of approximately 20,000 located in northeastern Solano County. Dixon is an interesting market because it is on the outer fringe of the San Francisco Bay Area but crosses over into the Sacramento region. Prices fell through the floor during the downturn, hitting bottom in late 2011, but have increased significantly since. Trends over the past several months have been hidden somewhat by seasonality and a lump of somewhat below-average homes selling in the winter. Here’s what I saw this week:
Sale Price Trendline for all homes sold in Dixon, CA since 1/1/16
With a little more data to work with, it looks like Dixon prices are increasing. With about two weeks of inventory at present, expect more increases in the short term.