October 21, 2021 By Seth Dobrin 3 min read

Cities across the United States are increasingly seeing their local communities affected by widespread neighborhood change, whether from blight or from the adverse effects of gentrification. While gentrification can often increase the economic value of a neighborhood, the potential threat it poses to local culture, affordability, and demographics has created significant and time-sensitive concerns for city governments across the country. Until now, most attempts to measure neighborhood change have been backward-looking and rules-based, which can lead governments and community groups to make decisions based on inaccurate and out-of-date information. That’s why IBM partnered with the Washington D.C.-based nonprofit research organization Urban Institute, which for more than 50 years has led an impressive range of research efforts spanning social, racial, economic and climate issues at the federal, state, and local levels.

Measuring neighborhood change as or before it occurs is critical for enabling timely policy action to prevent displacement in gentrifying communities, mitigate depopulation and community decline, and encourage inclusive growth. The Urban Institute team recognized that many previous efforts to measure neighborhood change relied on nationwide administrative datasets, such as the decennial census or American Community Survey (ACS), which are published at considerable time lags. As a result, the analysis could only be performed after the change had happened and displacement or blight had already occurred. Last year, the Urban Institute worked with experts in the US Department of Housing and Urban Development’s (HUD) Office of Policy Development and Research on a pilot project to assess whether they could combine novel real-time HUD USPS address vacancy and Housing Choice Voucher (HCV) data with machine learning methods to accurately nowcast neighborhood change.

Together, the IBM Data Science and AI Elite and Urban Institute team built on that pilot to develop a new method for predicting local neighborhood change from the latest data across multiple sources, using AI. This new approach began by defining four types of neighborhood change: gentrifying, declining, inclusively growing, and unchanging. IBM and Urban Institute then leveraged data from the US Census, Zillow, and the Housing Choice Voucher program to train individual models across eight different metropolitan core based statistical areas, using model explainability techniques to describe the driving factors for gentrification.
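The workflow described above — labeling tracts with one of the four change types, then training a separate model per metro area and inspecting what drives its predictions — can be sketched as follows. This is an illustrative toy, not the project's actual method: the labeling thresholds, column names (`cbsa`, `income_change`, `pop_change`), and the use of random-forest feature importances in place of richer explainability techniques are all assumptions made for the sake of the example.

```python
# Illustrative sketch of the per-metro modeling workflow.
# Column names and labeling thresholds are hypothetical.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier

def label_tract(row):
    """Toy labeling rule -- the real definitions are far more nuanced."""
    if row["income_change"] > 0.1 and row["pop_change"] > 0:
        return "gentrifying"
    if row["income_change"] < -0.1 and row["pop_change"] < 0:
        return "declining"
    if row["income_change"] > 0 and row["pop_change"] > 0:
        return "inclusively_growing"
    return "unchanging"

def train_per_metro(df, feature_cols):
    """Train one classifier per metro area (CBSA) and pair it with a
    ranked list of its most influential features."""
    models = {}
    for cbsa, tracts in df.groupby("cbsa"):
        clf = RandomForestClassifier(n_estimators=100, random_state=0)
        clf.fit(tracts[feature_cols], tracts["label"])
        # Feature importances stand in here for the explainability
        # techniques used to describe drivers of gentrification.
        importances = sorted(
            zip(feature_cols, clf.feature_importances_),
            key=lambda pair: -pair[1],
        )
        models[cbsa] = (clf, importances)
    return models
```

Training one model per core based statistical area, rather than a single national model, lets each model learn the local relationship between housing indicators and neighborhood change.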

The IBM Data Science and AI Elite team is dedicated to empowering organizations with the skills, methods, and tools needed to embrace AI adoption. Their support enabled the teams to surface insights from housing and demographic changes across several metropolitan areas in a collaborative environment, speeding up future analyses in different geographies. The new approach markedly improved on older rules-based techniques in both precision (from 61% to 74%) and accuracy (from 71% to 74%). The results suggest a strong future for the application of data to improving urban development strategies.
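The two metrics reported above measure different things: accuracy is the share of all tracts labeled correctly, while precision asks, of the tracts flagged with a given change type, how many truly belong to it. A minimal illustration with made-up predictions (not the project's data):

```python
# Accuracy vs. precision on a toy set of neighborhood-change labels.
from sklearn.metrics import accuracy_score, precision_score

y_true = ["gentrifying", "unchanging", "declining", "gentrifying"]
y_pred = ["gentrifying", "unchanging", "unchanging", "declining"]

# Accuracy: fraction of tracts whose predicted label matches the truth.
acc = accuracy_score(y_true, y_pred)

# Macro-averaged precision: per class, how often a flagged tract
# really has that label, averaged over the classes.
prec = precision_score(y_true, y_pred, average="macro", zero_division=0)
```

Higher precision matters for policy use: when a tract is flagged as gentrifying, officials acting on that flag want it to be correct.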

The partnership put an emphasis on developing tools that enabled collaborative work and asset production, so that policymakers and community organizations could leverage the resulting approaches and tailor them to their own communities.

IBM Cloud Pak® for Data as a Service was used to easily share assets, such as Jupyter notebooks, between the IBM and Urban Institute teams. During the engagement with Urban Institute, the teams leveraged AutoAI capabilities in Watson Studio to rapidly establish model performance baselines before moving on to more sophisticated approaches. This capability is especially valuable for smaller data science teams looking to automatically build model pipelines and quickly iterate through feasible models and feature selection, which are highly time-consuming tasks in a typical machine learning lifecycle.
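The baseline-first pattern that AutoAI automates can be sketched in plain scikit-learn — this is a generic analogue of that workflow on synthetic data, not the AutoAI API itself:

```python
# Baseline-first model development: establish a performance floor,
# then iterate through stronger candidates. Data here is synthetic.
from sklearn.datasets import make_classification
from sklearn.dummy import DummyClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Step 1: a trivial baseline sets the floor any real model must beat.
baseline = DummyClassifier(strategy="most_frequent").fit(X_tr, y_tr)

# Step 2: iterate through candidate pipelines, keeping those that
# clearly outperform the baseline.
model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
```

Automating this loop — pipeline construction, candidate iteration, and feature selection — is what makes the approach valuable for small data science teams.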

Together, this engagement and collaboration aims to empower the field to use publicly available data to provide a near real-time assessment of communities across the country. In addition to providing insights on existing data, the project can help uncover shortcomings in available data, enabling future field studies to fill the gaps more efficiently.

For more details on the results, check out our assets, which provide an overview of how the different pieces fit together and how to use them. And if you want to dig deeper into the methods, read our white paper.

IBM is committed to advancing tech-for-good efforts, dedicating IBM tools and skills to work on the toughest societal challenges. IBM is pleased to showcase a powerful example of how social sector organizations can harness the power of data and AI to address society’s most critical challenges and create impact for global communities at scale. IBM’s Data and AI team will continue to help nonprofit organizations accelerate their mission and impact by applying data science and machine learning approaches to social impact use cases.

Interested in learning more? Discover how other organizations are using IBM Cloud Pak for Data to drive impact in their business and the world.
