October 21, 2021 By Seth Dobrin 3 min read

Cities across the United States are increasingly seeing their local communities affected by widespread neighborhood change, whether from blight or from the adverse effects of gentrification. While gentrification can often increase the economic value of a neighborhood, the threat it poses to local culture, affordability, and demographics has created significant and time-sensitive concerns for city governments across the country. Many attempts to measure neighborhood change have so far been backwards-looking and rules-based, which can lead governments and community groups to make decisions based on inaccurate, out-of-date information. That’s why IBM partnered with the Urban Institute, a Washington, D.C.-based nonprofit research organization that for more than 50 years has led an impressive range of research spanning social, racial, economic, and climate issues at the federal, state, and local levels.

Measuring neighborhood change as or before it occurs is critical for enabling timely policy action to prevent displacement in gentrifying communities, mitigating depopulation and community decline, and encouraging inclusive growth. The Urban Institute team recognized that many previous efforts to measure neighborhood change relied on nationwide administrative datasets, such as the decennial census or American Community Survey (ACS), which are published at considerable time lags. For that reason, the analysis could only be performed after the change had happened and displacement or blight had already occurred. Last year, the Urban Institute worked with experts in the US Department of Housing and Urban Development’s (HUD) Office of Policy Development and Research on a pilot project to assess whether they could combine novel real-time HUD USPS address vacancy and Housing Choice Voucher (HCV) data with machine learning methods to accurately now-cast neighborhood change.

Together, the IBM Data Science and AI Elite and Urban Institute team built on that pilot to develop a new method for predicting local neighborhood change from the latest data across multiple sources, using AI. This new approach began by defining four types of neighborhood change: gentrifying, declining, inclusively growing, and unchanging. IBM and Urban Institute then leveraged data from the US Census, Zillow, and the Housing Choice Voucher program to train individual models across eight different metropolitan core based statistical areas, using model explainability techniques to describe the driving factors for gentrification.
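To make the shape of this approach concrete, here is a minimal, purely illustrative sketch of training one classifier per metro area on tract-level features and reading off which feature drives its predictions. The feature names, metro names, synthetic data, and choice of a random forest with impurity-based importances are all assumptions for illustration; the project’s actual models and explainability techniques are described in the white paper.

```python
# Illustrative sketch only (not the project's actual code): one model per
# metro area, four neighborhood-change classes, with a simple
# explainability view via feature importances. All names are hypothetical.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

CLASSES = ["gentrifying", "declining", "inclusively_growing", "unchanging"]
FEATURES = ["median_home_value", "hcv_households", "vacancy_rate", "median_income"]

rng = np.random.default_rng(0)

def train_metro_model(n_tracts=500):
    # Synthetic stand-in for tract-level census, Zillow, and HCV features.
    X = rng.normal(size=(n_tracts, len(FEATURES)))
    y = rng.integers(0, len(CLASSES), size=n_tracts)
    return RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)

# Train a separate model for each metropolitan area.
models = {metro: train_metro_model() for metro in ["dc", "chicago"]}

# One simple explainability view: rank features by impurity-based importance.
for metro, model in models.items():
    ranked = sorted(zip(FEATURES, model.feature_importances_),
                    key=lambda kv: -kv[1])
    print(metro, ranked[0][0])  # most influential feature for this metro
```

Training per-metro models, as the teams did, lets each model learn locally relevant drivers rather than forcing one nationwide pattern.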

The IBM Data Science and AI Elite team is dedicated to empowering organizations with the skills, methods, and tools needed to embrace AI adoption. Their support enabled the teams to surface insights from housing and demographic changes across several metropolitan areas in a collaborative environment, speeding up future analyses in different geographies. The new approach demonstrated a marked improvement over older rules-based techniques in both precision (from 61% to 74%) and accuracy (from 71% to 74%). The results suggest a strong future for the application of data to improving urban development strategies.
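For readers less familiar with these metrics, the following toy example shows how multiclass accuracy and precision can be computed; the macro averaging used here is an assumption for illustration, and the labels and predictions are made up (they do not reproduce the reported figures).

```python
# Toy illustration of multiclass accuracy vs. precision on made-up labels.
from sklearn.metrics import accuracy_score, precision_score

y_true = ["gentrifying", "declining", "unchanging", "gentrifying", "inclusive"]
y_pred = ["gentrifying", "unchanging", "unchanging", "declining", "inclusive"]

# Accuracy: fraction of tracts assigned the correct class overall.
acc = accuracy_score(y_true, y_pred)

# Precision (macro): for each class, the share of its predictions that were
# correct, averaged across classes. Averaging choice is an assumption here.
prec = precision_score(y_true, y_pred, average="macro", zero_division=0)

print(acc, prec)
```

Precision matters for this use case because a tract flagged as “gentrifying” may trigger policy intervention, so false positives carry real costs.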

The partnership put an emphasis on developing tools that enabled collaborative work and asset production, so that policymakers and community organizations could leverage the resulting approaches and tailor them to their own communities.

IBM Cloud Pak® for Data as a Service was used to easily share assets, such as Jupyter notebooks, between the IBM and Urban Institute teams. During the engagement with Urban Institute, the teams leveraged AutoAI capabilities in Watson Studio to rapidly establish model performance baselines before moving on to more sophisticated approaches. This capability is especially valuable for smaller data science teams looking to automatically build model pipelines and quickly iterate through feasible models and feature selection, which are highly time-consuming tasks in a typical machine learning lifecycle.
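The baseline-first workflow that AutoAI automates can be sketched manually; the following scikit-learn example on toy data is an analogy, not the AutoAI API: establish a trivial baseline, then confirm a candidate model clearly beats it before investing in more sophisticated approaches.

```python
# Manual sketch of the baseline-first workflow (AutoAI automates this).
# Toy data; the real project used census, Zillow, and HCV features.
from sklearn.datasets import make_classification
from sklearn.dummy import DummyClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=400, n_classes=4, n_informative=6,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Step 1: trivial baseline (always predict the most common class).
baseline = DummyClassifier(strategy="most_frequent").fit(X_tr, y_tr)

# Step 2: a first real candidate model to compare against the baseline.
model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)

print(baseline.score(X_te, y_te), model.score(X_te, y_te))
```

Establishing the baseline early gives every later, more complex model a clear bar to clear, which is exactly the iteration loop the teams sped up with automation.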

Together, this engagement and collaboration aims to empower the field to use publicly available data to provide a near real-time assessment of communities across the country. In addition to providing insights on existing data, the project can help uncover shortcomings in available data, enabling future field studies to fill the gaps more efficiently.

For more details on the results, check out our assets which provide an overview of how the different pieces fit together and how to use them. And if you want to dig deeper into the methods, read our white paper.

IBM is committed to advancing tech-for-good efforts, dedicating IBM tools and skills to work on the toughest societal challenges. IBM is pleased to showcase a powerful example of how social sector organizations can harness the power of data and AI to address society’s most critical challenges and create impact for global communities at scale. IBM’s Data and AI team will continue to help nonprofit organizations accelerate their mission and impact by applying data science and machine learning approaches to social impact use cases.

Interested in learning more? Discover how other organizations are using IBM Cloud Pak for Data to drive impact in their business and the world.
