
Detrimental Crime Scene Discounts

Home of the Suspected East Area Rapist

Recently Joseph James DeAngelo was arrested outside of Sacramento, accused of being the East Area Rapist. The East Area Rapist terrorized California in the 1970s, committing more than 50 rapes and 12 murders before disappearing more than 30 years ago. My friend Ryan Lundquist started a poll and conversation on his blog: what discount would you expect if the East Area Rapist's house came on the market?

The results are interesting. Most respondents fell in the 0-10% and 10-20% brackets. I was in the 0-10% bracket based on the one time I've worked on a similar problem. Several years ago, I was completing a purchase appraisal on a house in one of my markets when I noticed a weird note in the listing: "Blessed by a deacon." What the heck did that mean?

I called the listing agent, a friend of mine, and asked her what she meant by that. It turns out there had been a murder on the site within the past six months. Would have been nice if she'd let me know when I scheduled the appointment that, oh, by the way, there was a murder at the subject…

I frantically called the lender to warn them that a murder had occurred at the subject within the past six months, that I would need time to analyze this new evidence, and that I needed more money for the report because of the extra due diligence. I called my mentor for advice on how to handle it and to see if he had any data (nope). I then searched MLS over the past 10 years, but listing agents don't normally advertise "recent murder here" when trying to sell homes, so I struck out again. No one at the local Realtor meeting could remember any sales of homes after a murder or similar circumstance either. One of my comparables, however, had a death by natural causes within six months of the date of sale.

So after a bunch of due diligence, I had jack squat for data. I took a step back. This was an entry-tier home at a time when inventory was low, in a relatively safe neighborhood where a murder was unlikely to occur again. Three full-price offers were received for the subject, and all three potential buyers were aware of the home's history. Was there a discount because of the murder? My best evidence, the three full-price offers, showed little to no market reaction to the murder. I discussed my research in my report, concluded no market reaction, and sent it in. The purchase closed less than a month later.

This is not exactly the same situation as the East Area Rapist's house coming on the market. First, no reports to date suggest that crimes were committed at the accused's house, while the house I appraised was the site of a murder. Second, the murder at my subject's property was a one-off with little news coverage outside the community where it occurred. The East Area Rapist is notorious throughout California, if not the US, especially to those who were of age at the time of his crimes. A better but still imperfect model might be Dorothea Puente, the Sacramento landlord who murdered at least seven people and buried them in her backyard. Ryan plots the sales of her duplex in his poll results post.

Tony Bizjak, the real estate writer for the Sacramento Bee, liked Ryan’s post enough to turn it into an article and quoted me for the story.

P.S. Randall Bell, PhD, MAI, is the national expert on diminution in value and determining crime scene discounts. His book Real Estate Damages is highly recommended. He thinks the discount will be closer to 25% if the home of the East Area Rapist hits the market.


Moving from Excel to R: What Do I Need To Replicate in R?

This is the second post in a series describing my journey to move my residential appraisal business workflow from Microsoft Excel to R. Last time out, I made the case for why I’m making the change. This post will be a catalog of the ways I use Excel today to serve as a guide for where I need to go.

I use Excel a lot. For each appraisal I work on, I start a separate Excel file. We appraisers are required to retain, for each report, a work file that supports our conclusions and allows someone else auditing us to understand what we did.

Here’s what I do with Excel today:

  • Store my Neighborhood Market Data downloaded from MLS. I grab all sales in a competitive market area going back at least five years, sometimes ten. This goes into a Market worksheet.
  • Store my Competitive Market Data downloaded from MLS. I grab all potential competitive comparable sales and listings going back at least 12 months but frequently further. This goes into a Comparables worksheet.
  • Create Neighborhood Price Per Square Foot and/or Sale Price trendlines from the Market data. If I have questions about trends, I'll also look at changes in floor area over time. I create charts for each data run, then spend time formatting and labeling them so I can include the charts in my reports.
  • Create Pivot Table summaries of the Neighborhood Market Data. My normal summary table covers all sales in a neighborhood summarized by month: homes sold, mean Days on Market, Low Price, High Price, Mean Sale Price, and Mean PSF. I use a template, replace the old market data with the new, and refresh, so right now this is really fast in Excel. However, I can't easily include median summaries in Pivot Tables, an issue I expect R to handle (see the sketch after this list). For most of my reports, I include this pivot table summary.
  • I use the pivot table summaries to create a column chart showing the 12-month change in mean PSF and/or mean sale price, another tool for understanding and reporting changes in my neighborhoods. This is especially important in seasonal markets like Davis, California, where home selling revolves around the university schedule. I include this chart in every Davis appraisal and in other appraisals where it's necessary to explain market trends.
  • I occasionally create histograms to show the shape of a market with regard to one variable (primarily sale price and floor area; great for showing where the subject lies relative to the rest of the market). I've seen how easy it is to create a histogram in R, so I have high expectations that R will be an improvement over Excel here.
  • Create PSF and Sale Price scatter graphs of competitive sales. I use the trendline coefficients to determine daily price adjustments for market change. I also look at floor area over time to see whether my comparables are changing, which helps me understand what my market is really doing. I include the scatter graphs in my reports so clients and intended users can understand the subject's sub-market.
  • I use pivot tables linked to my comparable sales data to contrast one variable. For example, I'll use pivot tables to examine the difference between homes sold with and without pools, a significant factor in the Sacramento Valley. I'll create a table showing how many comparables sold with pools vs. without, the difference in mean sale price, the difference in mean PSF, and, to understand my data, mean floor area and mean year built, so I can tell whether I'm making an apples-to-apples or apples-to-oranges comparison. If the homes with pools are relatively similar in size and age to the ones without, my adjustment is more likely to reflect strictly the pool. If homes with pools tend to be bigger than homes without, I have to consider covariance as part of the explanation for the differences noted. (Covariance is a significant issue in residential real estate markets.)
  • I have a Calculators page that I use for one-off modeling and calculations I need to do by hand. The most significant calculators here that I'll need to move to R are the one that does the math for comparables' time adjustments, plus others for modeling lot size adjustments. These should be painless to move over.
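
As a first taste of the migration, here's a minimal sketch of how that monthly pivot-style summary, including the medians Excel makes difficult, might look in R with dplyr. The file name and column names (sale_date, sale_price, sqft, dom) are hypothetical stand-ins for an MLS export:

    library(dplyr)
    library(readr)

    # Hypothetical MLS export; assumes sale_date parses as a Date
    market <- read_csv("market_data.csv")

    monthly <- market %>%
      mutate(month = format(sale_date, "%Y-%m")) %>%
      group_by(month) %>%
      summarize(
        homes_sold   = n(),
        mean_dom     = mean(dom),
        low_price    = min(sale_price),
        high_price   = max(sale_price),
        mean_price   = mean(sale_price),
        median_price = median(sale_price),  # trivial here, clumsy in an Excel pivot table
        mean_psf     = mean(sale_price / sqft)
      )

    # And the histogram item above is one line in base R:
    hist(market$sale_price)

If this works as expected, one script replaces the template-and-refresh dance entirely.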

This is the bulk of what I do with Excel today. As I start to shift this workflow over to R, I plan to go into more detail about the special (or not-so-special) challenges I encounter. I also have high expectations that R will inspire me to come up with new ways of analyzing and presenting my data.

Reminder for appraiser readers in particular: R is a tool. Excel is a tool. Most of what I plan to discuss in this series is about changing tools. Occasionally, I'll talk about modeling decisions (like covariance above). However, all of what I'm doing is rooted in the Stats, Graphs, and Data Science classes I've taken. You need to understand the theory so you can make informed decisions about your modeling choices.

Take classes from George; he's very willing to help: https://georgedell.com/

Why I’m Switching to R

I’ve come to rely heavily on Microsoft Excel over the years to do my work as a residential appraiser. So much so that I teach classes to other appraisers on how to use Excel in their work. However, after taking George Dell’s first R for Appraisers class recently, I’ve decided to completely revise my workflow and replace what I do in Excel as much as possible with R.

What's R? might be your first question. R is a free software environment for statistical computing, used widely for data analysis. Most university economics programs teach with R these days. Here's the official description (link). You can download a free copy here.

However, if you're going to use R, you need to use RStudio, the free integrated development environment for R. It provides a way of seeing more and doing more with the basic R programming language and really extends what you can do. Free copy and more information here.

Why would someone who has invested heavily in developing skills in Excel move to a brand new software package? Here are my reasons:

  • Data Analytics vs. Spreadsheets (Data Retention)-R is software designed for data analytics from the ground up, and that's what appraisers do: relatively specialized data analytics. Spreadsheets were designed to replace paper ledgers. You can scribble all over a sheet, and if you're not careful, you're very likely to write over your numbers and make a mess. This is a problem if you're trying to preserve your data for the future, say when you need to maintain work files as appraisers are legally required to do. Because an R script reads your data rather than living inside it, R solves this data retention issue.
  • Reusable Processes-R is designed from the ground up to be reusable: drop your data in and get your results. I've done a lot to get Excel to work that way for me, but I still run a ton of manual processes each time I work on an appraisal. Once I know what I'm doing in R, I'll have far fewer manual processes to deal with, and it will be easier for someone else to audit my analysis.
  • Superior Analysis-I use pivot tables a lot in Excel; in fact, I teach a class on pivot tables for appraisers. The big drawback with Excel pivot tables is that I can't include median values in my summaries without a lot of work (coding or buying someone else's product). This is not an issue in R (see the one-liner after this list).
  • Personal Growth-R gives me an opportunity to learn new ways to analyze data, the underlying job function that has given me the most satisfaction throughout my career. I’m excited to learn new ways to do my job better and I expect big changes once I’ve completed the move to R. Also, if this appraising gig doesn’t work out, with R I’ll have a job skill in demand in other industries.

I plan to document my migration from Excel to R here. I’ll share resources I find useful and will discuss issues I run into. I also plan to describe the benefits and drawbacks. This is mainly for me, and maybe Abdur Abdul-Malik and Bruce Hahn, the other appraisers I know making the same journey. And maybe I can help George Dell come up with ideas on how to spread the word to the rest of the industry. Thanks again George for the inspiration.

Which Model Should I Use?


I just completed an appraisal of a home for a refinance where the subject had sold eight months prior to the date of value. I included the prior sale of the subject as Comp #4, along with two very recent, reasonably similar comparable sales in the sales comparison grid. The subject is located in a suburban neighborhood built out in the 1960s-1970s on a ridge, with some homes featuring city-light views. Many recent sales feature remodeling or recent updating, while others are relatively original.

In most suburban neighborhoods, sale price per square foot (PSF) is the best way to track market trends because it reduces the impact of variation in floor area across your comparable sales data. Generally speaking, the sale price model is subject to those variations in floor area and is not the best choice here. So, following my normal practice, I used the PSF linear regression model to determine time adjustments for the comparables.

The PSF linear regression formula is:

Slope of the PSF Trendline ($/sf/day) * Subject's GLA (sf) = Daily Time Adjustment ($/day)

PSF Linear Regression Model

So, in this case:

$0.0992/sf/day * 1,794 sf ≈ $178/day

or

PSF Model Time Adjustments

Once I applied these time adjustments, my range of value was $447,000 to $466,000 with an indicated value of $460,000. However, the most recent sale indicated $450,000 at most, as did homes in contract competitive with the subject. Something wasn't right.

So I tried a sale price linear regression model instead.

Sale Price Linear Regression Model:

Slope of the Sale Price Trendline ($/day) = Daily Time Adjustment ($/day)

Sale Price Linear Regression Model

In this case, the indicated daily time adjustment is $117/day, or

Sale Price Model Time Adjustments

The indicated range of value for the subject was $443,000 to $451,000, in line with the most recent sale and competitive homes in contract.

In this case, the Sale Price Linear Regression Model explained my data better than the PSF model.
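
For readers following my Excel-to-R posts, here's a minimal sketch of how both trendline slopes might be pulled in R. The data frame and column names (sale_date, sale_price, sqft) are hypothetical; because R stores Dates as day counts, the slopes come out in the per-day units used in the formulas above:

    # Hypothetical data frame of competitive sales with a Date column
    comps$psf <- comps$sale_price / comps$sqft

    # PSF model: with a Date predictor, the slope is $/sf/day,
    # so multiply by the subject's GLA (1,794 sf) per the formula above
    psf_fit <- lm(psf ~ sale_date, data = comps)
    psf_daily_adj <- coef(psf_fit)[["sale_date"]] * 1794

    # Sale price model: the slope is already $/day
    sp_fit <- lm(sale_price ~ sale_date, data = comps)
    sp_daily_adj <- coef(sp_fit)[["sale_date"]]

Fitting both models takes seconds, which makes the kind of cross-check described above cheap to run on every assignment.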

I’ll have posts explaining more about time and real estate in the near future and will be teaching a class this July in Sacramento. Or you can learn straight from the national expert, George Dell. Sign up for his website and classes here. Thanks George for sharing with the rest of us.

Why do you think the linear model worked better in this case?

Time is Your Friend When Valuing Homes

Time is your friend

Time is your friend when valuing homes, especially when analyzing unusual properties. Sometimes when appraising a property, you find six model-match sales within three months of the date of value in a very narrow price range, the value of the subject is obvious, and you're done. Next, please.

Other times, the subject is located on a busy street, is larger than any sale in town in the past 12 months, and has a moat. How do you deal with this nightmare? This is when time is your friend.

Expand your search parameters back in time until you find a competitive sale that brackets or equals the subject’s unusual feature. That sale from three years ago that was bigger than the subject is a comparable you can use to value it today. Or better yet, use the prior sale of the subject from five years ago so you have a sense of the value impact of that moat. Sometimes the best comp is the prior sale of the subject.

If you have time adjustments in your tool box, you’ll have access to a wider range of comparables than someone who doesn’t.

I'm developing a time adjustments class for REAA Sacramento that I plan to teach in July. I'll talk about time adjustments more on this blog, but will leave you with a couple of ways to determine them, especially when you want to bring a dated sale forward in time to use as a comparable.

Gary Christensen has a video here that discusses how to do a linear time adjustment in Excel. This works well when you have a set of comparable sales that you can analyze and the market has been relatively consistent. I’ll discuss this in detail in the near future.

Another simple way, which works well for comparables from 2+ years ago, is to look at the percent change in average (median/mean) sale price or price per square foot for the subject's market area between the comparable's sale date and the date of value. That percent difference is your time adjustment. Apply the percentage to the comparable's sale price, and your old sale is ready to use as a comp. Most MLS systems have market analysis tools that will let you look up an average price in the past. In the worst case, the Freddie Mac House Price Index at the MSA level can help. Or you can use pivot tables to track changes in market areas (more to come soon about pivot tables).

I recently appraised a home in a small project where the most recent sale had occurred nine years prior to the date of value. I used the Freddie Mac House Price Index to determine a 29% negative time adjustment for that sale, subtracted the dollar amount, and the dated sale was ready for comparison.

Freddie Mac House Price Index Calculation

Sacramento MSA
Date          Index Value
Dec-05        215.10
Jul-14        152.57
Difference    -29%

Time Adjustment

Sale Price         $363,000
Time Adjustment   -$105,500
Net                $257,500
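
For the R-curious, the same index arithmetic is a few lines. The values come from the tables above; the variable names are just for illustration:

    # Freddie Mac House Price Index, Sacramento MSA (values from the table above)
    index_at_sale  <- 215.10  # Dec-05, the comparable's sale date
    index_at_value <- 152.57  # Jul-14, the date of value

    pct_change <- (index_at_value - index_at_sale) / index_at_sale  # about -29%

    sale_price <- 363000
    adjusted   <- sale_price * (1 + pct_change)  # about $257,500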

What is the most time between a date of value and a date of sale for a comparable in one of your valuations?