Long-awaited housing study a bust

(Reading time: 12 minutes)

The much-awaited regional housing study was finally released at the end of this past January. It is, to say the least, underwhelming.

The study comes in two parts: a “consumer” version marked by larger fonts and a liberal use of photos, and a so-called “technical” version. It was marketed as providing “a deep understanding of the housing market dynamics in the Central Shenandoah Planning District,” which encompasses five counties and five cities. Originally promised for a June 2024 release, the study was eagerly awaited by various local housing groups hoping to use its data as a springboard for further planning. Instead, those expectations were repeatedly put on hold as first one delay was announced and then another, until in some cases the study became an afterthought.

So why the eight-month delay? It wasn’t because new data was being assimilated or existing data was being reanalyzed. Indeed, it’s a fair guess that the study itself was barely tweaked at all during this long dry period: it still refers to several events as upcoming that had already occurred by the time it was made public. Instead, the recurring delays were vaguely attributed to foot-dragging by unnamed localities in the planning district that hadn’t “signed off” on the study in a timely fashion.

Which right there should have been a big red flag that the “regional housing study” was actually a political football. In fact, it’s now clear that this is not so much a “study” as a “plan”—and plans need buy-in from those charged with implementing them. Indeed, while a study suggests an effort to gather basic information from which plans can be developed, this study makes plain that its conclusions were shaped from the start. As explained on p.9, study planners “met with staff from each county and city” who “described each jurisdiction’s housing stock, housing challenges and potential opportunities.” The study’s parameters, in other words, were established from the outset. Instead of conclusions flowing from the data, the data followed the conclusions.

Moreover, the “housing” aspects of the plan are only a minor part of its database, which includes far more information about the region’s demographics than about its housing stock. A more accurate name would be a “householder” study. Its glaring gaps in actual housing information are acknowledged by the study’s own repeated recommendations for still more study, such as its call for Staunton to “conduct a detailed housing demand analysis” and to “conduct a detailed survey and inventory of vacant/underutilized properties in the city.”

“Plans” are recommended courses of action, and there’s nothing wrong with that. But plans need legs, which is to say, they need to be built on a solid, factual base if they’re to have merit. Anyone reading their recommendations should be able to see how those proposals were derived from the available evidence. Yet in this case the cart precedes the horse, with the technical report dedicating just 79 pages to facts and numbers, compared to the 218 pages of proposals for how that information should be applied. Nor are those 79 pages weighed down with dense data dumps and spreadsheets: much of what’s there consists of generalized observations and broad conclusions, unburdened by the kind of detail that would allow readers to develop alternative understandings.

Take, for example, a section in the technical report headlined “Age and Condition of Housing Stock” that opens as follows: “Stakeholders across the Central Shenandoah footprint mentioned concerns about housing conditions. Focus group participants discussed dilapidated single-family homes that need to be demolished; for-sale inventory that needs updates and in some cases, substantial repair; housing that need [sic] rehabilitation and modifications for current residents; multi-family rental housing that has been neglected by landlords; and mobile homes that need replacement, among other conditions-related challenges.”

That reads like a précis for the section that should follow, a quick summary of compelling issues that can then be explored in more satisfying detail. But it’s all a tease. How many dilapidated homes are ripe for demolition? Where are they located? How extensive are the repairs needed by the for-sale inventory, and how quickly should they be undertaken before these homes fall into the “ripe for demolition” category? What would be the estimated cost of such intervention? Which multi-family housing units need remedial attention, and how many families are affected? Good and reasonable questions all, and every one of them goes unanswered, here and everywhere else in the study.

But even on its own meager terms, the study’s scant data is only part of the problem. The issue is not just quantity but quality: what’s offered is so far past its “sell-by” date that it might as well be tossed into the trash.

While many of the study’s conclusions are based on unidentified focus groups and interviews with anonymous “experts”—their identities cloaked, for inexplicably dark reasons, to “protect the anonymity” of participants—its main statistical underpinnings are drawn from U.S. Census Bureau and HUD surveys that largely or completely predate the Covid pandemic. This choice presumably was one of convenience, since such federal data are widely available and require far less effort—or expense—to obtain than more region-specific information. But because these are federal sources, which encompass the whole country and therefore have to distill enormous quantities of data, what’s available is neither granular enough nor timely enough to be especially useful at a local level.

As a result, most of the regional housing study’s findings are based on American Community Survey estimates, which are five-year averages spanning 2017-2021 (and in some instances 2018-2022). Others are drawn from the still more dated 2019 Comprehensive Housing Affordability Strategy data, another five-year averaging of surveys spanning the even earlier 2015-2019 period. In other words, the study’s assertions about current housing cost burdens, as just one example, describe a world in which there has been no pandemic, no dislocation of job markets and spike in unemployment, no subsequent inflation and jump in mortgage rates, no moratorium on evictions and no billions of dollars of government assistance pumped into the economy to avert economic collapse. All of which, it goes without saying, produced massive distortions in housing markets.

Even when the study does (rarely) cite alternative data sources, what it provides lags current information by at least a couple of years. For example, it references the 2022 Point in Time survey to discuss the extent of local homelessness, even though 2024 data—collected in January of that year—was available long before the report was issued. Similarly, although the study turns to sales data from Virginia Realtors to explore time-on-market and related issues, it uses information drawn from 2015-2022. By contrast, a realtor who participates in one of the SAW (Staunton-Augusta-Waynesboro) housing groups does a comprehensive sales analysis of local markets every month and has years of more timely information and analysis at his fingertips, some as recent as a month ago.

Just how much difference a couple of years can make is evident in the study’s assertion, based on 2022 Realtor sales data, that the median home sales price in Staunton is $250,000. That claim fails the straight-face test, given the 43.85% run-up in the city’s real estate property tax assessments over the period 2021-2025. Indeed, the local realtor mentioned in the previous paragraph observed that the average home sales price in the SAW region was $324,403 at the start of 2024, after average appreciation of roughly 9% per year over 17 years.
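A quick back-of-the-envelope calculation shows how stale a 2022 figure becomes at that pace. The sketch below is mine, not the study’s: it simply compounds the study’s $250,000 figure at the realtor’s roughly 9% annual rate, and the comparison to $324,403 is loose, since that number is a regional average rather than a city median.

```python
# Back-of-the-envelope only: compound the study's 2022 Staunton median
# ($250,000) at the roughly 9% yearly appreciation the local realtor describes.
# Steady compounding is an assumption of this sketch, not a figure from the study.

median_2022 = 250_000        # study's claimed median sales price (2022 data)
annual_appreciation = 0.09   # approximate regional appreciation rate cited above

for years_later in (1, 2, 3):
    projected = median_2022 * (1 + annual_appreciation) ** years_later
    print(f"{2022 + years_later}: roughly ${projected:,.0f}")

# 2023: roughly $272,500
# 2024: roughly $297,025
# 2025: roughly $323,757
```

Two to three years of routine appreciation alone pushes the figure well past $300,000, in the neighborhood of the $324,403 regional average reported at the start of 2024.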

Relying on data that is many years old to describe the present in so dynamic a context means losing nuance at best and completely mischaracterizing current developments at worst. Yet at no point does the housing study acknowledge this limitation, or attempt to assess which of its conclusions are therefore least reliable. Like an AI hallucination, it confidently asserts a reality that doesn’t exist, mapping out future action while staring fixedly into a rearview mirror. It compounds the problem by withholding the basic data needed for a critical examination of its assessments and conclusions. Indeed, it goes out of its way to frustrate outside analysis, as when it acknowledges that it “has not documented the source of each estimate discussed” for “readability” reasons—a claim made in the “technical” report, which presumably should be loaded up with “technical” information but isn’t.

So, for example, all discussion about household income is restricted to comparing wages for different job categories, which can encompass widely ranging job titles and pay levels, rather than examining the more useful baseline of the minimum wage. In Virginia that would be an especially helpful metric because of the state’s significant boosts to the minimum, from $7.25 an hour in 2020—where it had been stuck for many years—to $12 in 2024, a period squarely within the study’s information black hole. The pitfalls this poses were recently illustrated by Staunton’s Consolidated Plan, which overlooked the increase and thereby completely misstated the affordability of the city’s housing stock.
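To see why overlooking that jump matters, here is a minimal illustrative calculation (my arithmetic, not the study’s or the Consolidated Plan’s), using the common rule of thumb that housing is affordable at no more than 30% of income and assuming a 2,080-hour full-time work year:

```python
# Illustrative arithmetic only: how the minimum-wage jump shifts what counts
# as "affordable" rent. The 30%-of-income benchmark and 2,080-hour work year
# are conventions assumed for this sketch, not figures from the housing study.

HOURS_PER_YEAR = 2080          # 40 hours/week x 52 weeks
AFFORDABILITY_SHARE = 0.30     # "no more than 30% of income on housing" benchmark

for year, wage in ((2020, 7.25), (2024, 12.00)):
    annual_income = wage * HOURS_PER_YEAR
    affordable_monthly_rent = annual_income * AFFORDABILITY_SHARE / 12
    print(f"{year}: ${wage:.2f}/hr -> ${annual_income:,.0f}/yr, "
          f"affordable rent about ${affordable_monthly_rent:,.0f}/mo")

# 2020: $7.25/hr -> $15,080/yr, affordable rent about $377/mo
# 2024: $12.00/hr -> $24,960/yr, affordable rent about $624/mo
```

At $7.25 an hour, affordable rent works out to roughly $377 a month; at $12, roughly $624. A difference that large is more than enough to change any conclusion about how much of the city’s housing stock is “affordable.”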

In its unwillingness to cite specific data, the housing study falls back on generalities that are too sweeping or obvious to be useful. Housing “that is for sale or for rent (aka ‘on the market’) is scarce.”  Staunton “continues to grapple with providing adequate housing infrastructure for its most vulnerable residents.” When it comes to housing, “there is not enough supply to serve renters with extremely low incomes.” And in a surprisingly cautious assessment, “the rental market is approaching a too-tight scenario.” All true—notwithstanding the hedge about “approaching”—and all well-known for quite some time. This study does little to go beyond the obvious.

EVEN ON ITS OWN (limited) terms, the housing study makes some questionable assertions while also raising legitimate issues that it then ignores.

On p.64, the study notes that focus groups “explained that a substantial amount of the region’s housing stock needs critical home repair.” Although “critical” implies urgency, the observation does not lead to further analysis or remedial recommendations beyond a suggestion for “enhancements to rental inspection programs in Staunton and Waynesboro.” Both cities have opted to enforce the state’s property maintenance code, giving tenants in substandard housing some recourse, but Augusta County has not. The housing study doesn’t feel a need to point that out.

On p.66, the study acknowledges that the area “continues to grapple with providing adequate housing infrastructure for its most vulnerable residents, including those experiencing chronic housing insecurity [i.e. homelessness], mental health conditions and substance use disorder.” The study then quotes a 2023 report from the Virginia Department of Behavioral Health and Developmental Services calling for the Valley Community Service Board to more than double its existing 120 units of supportive housing—but goes no further in developing the recommendation. The consumer version of the study, meanwhile, does call for development of “a strategic plan” to help “those experiencing homelessness.”

The study elsewhere finds that roughly 3% of SAW housing stock—more than 1,800 housing units—consists of long-term vacancies, which is to say, empty housing that is not being held for seasonal, recreational or occasional use. Some of these vacancies “may represent an opportunity to increase the available housing stock by encouraging owners to rent or sell their units,” the study suggests, without further elaboration. The consumer version of the report, meanwhile, concedes that Staunton has “vacant and abandoned properties that contribute to blight and hinder community growth”—but since the housing study doesn’t know how many such properties exist, or where they’re located, the best it can do is urge the city to find out.

There’s much more of this kind of thing. The point here is not to nitpick, but to note that the housing study raises many more questions than it answers—questions not of the “how shall we cope with this” variety, but of what’s actually happening. Questions, in other words, that a regional housing study might reasonably have been expected to answer. Instead, the study’s center of gravity is defined by extensive menus of remedial actions that undoubtedly will keep city planners busy for years to come: calls for additional studies, for development of new taxes, bonds, grants and other financing vehicles, and for seeking out public and private partnerships—all of which is well and good and even essential, but all of which could have been initiated without this document.

Meanwhile, it’s too easy to lose sight of why the regional housing study—at least as it was widely understood—was so anticipated. One clue is on page 17, which observes that “there are approximately 5,000 households at risk of homelessness in the Central Shenandoah footprint.” Already ahead of them are “an estimated 265 people comprising 186 households who are unhoused.” Given the current political onslaught in Washington, D.C., on anything that even remotely looks like compassion for one’s neighbors, it’s not fanciful to think that those 5,000 at-risk households may see their ranks diminished—by sliding into the unhoused category.

The regional housing study, in either its consumer or “technical” versions, makes us no better prepared to deal with that.