IBM® SPSS® Categories enables you to visualize and explore relationships in your data and predict outcomes based on your findings. It uses categorical regression procedures to predict the values of a nominal, ordinal or numerical outcome variable from a combination of numeric and ordered or unordered categorical predictor variables. The software features advanced techniques such as predictive analysis, statistical learning, perceptual mapping and preference scaling.

This module is included in the SPSS Statistics Professional edition for on-premises deployments, and in the “Complex sampling and testing” add-on for Subscription plans.

## Feature spotlights

### Analyze differences between categories

Use correspondence analysis to display and analyze differences between the categories of nominal variables in a low-dimensional map.

### Incorporate supplementary information

Incorporate supplementary information on additional variables or categories by projecting them into an existing solution.

### Uncover associations and relationships

Use symmetrical normalization to produce a biplot that displays row and column categories in the same space, making associations easier to see.
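The computation behind such a biplot can be illustrated outside SPSS. The sketch below (not SPSS syntax) performs a correspondence analysis of a small, made-up brand-by-attribute contingency table in NumPy: it decomposes the standardized residuals with an SVD and applies symmetrical normalization, which spreads the singular values equally over rows and columns so both sets of points can share one plot.

```python
import numpy as np

def correspondence_analysis(table):
    """Correspondence analysis of a two-way contingency table,
    returning symmetrically normalized 2-D row and column
    coordinates plus the principal inertias."""
    P = table / table.sum()                      # correspondence matrix
    r = P.sum(axis=1)                            # row masses
    c = P.sum(axis=0)                            # column masses
    # Matrix of standardized residuals
    S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))
    U, sv, Vt = np.linalg.svd(S, full_matrices=False)
    # Symmetrical normalization: both rows and columns get sqrt of
    # the singular values, so the biplot treats them comparably.
    rows = (U * np.sqrt(sv)) / np.sqrt(r)[:, None]
    cols = (Vt.T * np.sqrt(sv)) / np.sqrt(c)[:, None]
    return rows[:, :2], cols[:, :2], sv**2

# Hypothetical brand-by-attribute frequency table (illustration only)
table = np.array([[20.0,  5.0, 10.0],
                  [ 8.0, 15.0,  7.0],
                  [ 4.0,  6.0, 18.0]])
row_pts, col_pts, inertia = correspondence_analysis(table)
```

Plotting `row_pts` and `col_pts` on the same axes yields the biplot; categories that appear close together are more strongly associated.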

### Easily work with categorical data

Benefit from tools to help you analyze and interpret your multivariate data and its relationships more completely. For example, understand which characteristics consumers associate most closely with your product or brand, or determine how customers perceive your products compared with other products that you or your competitors offer.

### Use categorical regression procedures

Predict the values of a nominal, ordinal or numerical outcome variable from a combination of numeric and ordered or unordered categorical predictor variables. Use regression with optimal scaling to describe, for example, how job satisfaction can be predicted from job category, geographic region and the amount of work-related travel.
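As a rough illustration of this kind of model (and not of SPSS's CATREG procedure, which replaces fixed codes with optimally scaled quantifications), the NumPy sketch below fits the simpler dummy-coded analogue on entirely hypothetical data: satisfaction predicted from a job category, a region and a travel amount.

```python
import numpy as np

# Hypothetical data: job satisfaction (1-5) predicted from job
# category (3 levels), region (2 levels) and travel days per month.
category     = np.array([0, 1, 2, 0, 1, 2, 0, 1])      # nominal codes
region       = np.array([0, 0, 1, 1, 0, 1, 0, 1])      # nominal codes
travel       = np.array([2., 5., 9., 1., 6., 8., 3., 7.])
satisfaction = np.array([4., 3., 2., 5., 3., 1., 4., 2.])

def dummies(x):
    """Dummy-code a nominal variable: one column per non-reference level."""
    levels = np.unique(x)[1:]                  # drop the reference level
    return (x[:, None] == levels).astype(float)

X = np.column_stack([np.ones_like(travel),     # intercept
                     dummies(category),
                     dummies(region),
                     travel])
coef, *_ = np.linalg.lstsq(X, satisfaction, rcond=None)
fitted = X @ coef
r_squared = 1 - ((satisfaction - fitted) ** 2).sum() / \
                ((satisfaction - satisfaction.mean()) ** 2).sum()
```

Optimal scaling goes further than dummy coding: instead of fixed 0/1 indicators, each category receives a quantification chosen to maximize the fit, which is what CATREG does.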

### Take advantage of optimal scaling

Quantify variables so that the Multiple R is maximized. Optimal scaling may be applied to numeric variables when residuals are non-normal or when predictor variables are not linearly related to the outcome variable. Regularization methods such as ridge regression, the lasso and the elastic net can improve prediction accuracy by stabilizing the parameter estimates.
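The stabilizing effect of regularization is easy to demonstrate outside SPSS. The minimal NumPy sketch below (synthetic data, not SPSS output) computes the closed-form ridge estimate on two nearly collinear predictors; the penalty shrinks the coefficient vector relative to ordinary least squares, which may wander far from the true values when predictors are collinear.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 30
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.01, size=n)       # nearly collinear with x1
y  = x1 + x2 + rng.normal(scale=0.1, size=n)   # true coefficients: (1, 1)
X  = np.column_stack([x1, x2])

def ridge(X, y, alpha):
    """Closed-form ridge estimate: (X'X + alpha*I)^{-1} X'y."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + alpha * np.eye(p), X.T @ y)

ols   = ridge(X, y, 0.0)   # ordinary least squares
penal = ridge(X, y, 1.0)   # ridge: coefficients shrink, variance drops
```

The norm of the ridge solution never exceeds the OLS norm and decreases as the penalty `alpha` grows; in practice `alpha` is chosen by cross-validation.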

### Present your results clearly using perceptual maps

Use dimension reduction techniques to see relationships in your data. Summary charts display similar variables or categories to provide you with insight into relationships among more than two variables.
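SPSS Categories builds such maps from categorical data via optimal scaling; as a simplified stand-in, the sketch below applies plain principal component analysis (via an SVD of centered data) to a hypothetical product-by-attribute rating matrix, projecting the products onto a 2-D perceptual map.

```python
import numpy as np

# Hypothetical rating matrix: rows = products, columns = attributes
# (e.g. price, quality, style, durability). Illustration only.
ratings = np.array([[7., 3., 8., 6.],
                    [6., 4., 7., 5.],
                    [2., 8., 3., 7.],
                    [3., 7., 2., 8.],
                    [5., 5., 5., 5.]])

# PCA via SVD of the column-centered data: the first two principal
# components give each product a position on a 2-D "map".
Z = ratings - ratings.mean(axis=0)
U, sv, Vt = np.linalg.svd(Z, full_matrices=False)
map2d    = U[:, :2] * sv[:2]   # product coordinates on the map
loadings = Vt[:2].T            # attribute directions in the same space
```

Products that plot close together are rated similarly across attributes, and the `loadings` vectors show which attributes drive each direction of the map.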

### Get these optimal scaling and dimension reduction techniques

Techniques include correspondence analysis (CORRESPONDENCE), categorical regression (CATREG), multiple correspondence analysis (MULTIPLE CORRESPONDENCE), categorical principal components analysis (CATPCA), nonlinear canonical correlation analysis (OVERALS), proximity scaling (PROXSCAL) and preference scaling (PREFSCAL).

## Technical details

#### Software requirements

• For on-premises deployments: purchase the Professional edition
• For Subscription plans: purchase the “Complex sampling and testing” add-on

#### Hardware requirements

• Processor: 2 GHz or faster
• Display: 1024x768 or higher
• Memory: 4 GB of RAM required, 8 GB of RAM or more recommended
• Disk space: 2 GB or more