A New Solution to the Additive Constant Problem in Metric Multidimensional Scaling
Lee G. Cooper; Psychometrika, Vol. 37, No. 3, (September, 1972), 311-322.
A new solution to the additive constant problem in metric multidimensional scaling is developed. This solution determines, for a given dimensionality, the additive constant and the resulting stimulus projections on the dimensions of a Euclidean space which minimize the sum of squares of discrepancies between the formal model for metric multidimensional scaling and the original data. A modification of Fletcher-Powell style functional iteration is used to compute solutions. A scale-free index of goodness of fit is developed to aid in selecting solutions of adequate dimensionality from multiple candidates.
Obtaining Squared Multiple Correlations From a Correlation Matrix Which May Be Singular
Ledyard R Tucker, Lee G. Cooper, William Meredith; Psychometrika, Vol. 37, No. 2, (June, 1972), 143-148.
A theorem is presented relating the squared multiple correlation of each measure in a battery with the other measures to the unique generalized inverse of the correlation matrix. This theorem is independent of the rank of the correlation matrix and may be utilized for singular correlation matrices. A coefficient is presented which indicates whether the squared multiple correlation is unity or not. Note that not all measures necessarily have unit squared multiple correlations with the other measures when the correlation matrix is singular. Some suggestions for computations are given for simultaneous determination of squared multiple correlations for all measures.
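As a rough illustration of the relationship this theorem generalizes, the sketch below checks the familiar nonsingular special case, SMC_i = 1 - 1/(R⁻¹)ᵢᵢ, against a direct regression R². The simulated data and all numbers are invented; the paper's actual contribution, the singular case via the generalized inverse, is not shown here.

```python
import numpy as np

rng = np.random.default_rng(5)

# Nonsingular special case: for an invertible correlation matrix R, the
# squared multiple correlation of measure i with the remaining measures
# is SMC_i = 1 - 1/(R^-1)_ii. Data below are simulated for illustration.
data = rng.normal(size=(500, 4))
data[:, 3] += 0.8 * data[:, 0]                 # induce some correlation
R = np.corrcoef(data, rowvar=False)

smc = 1.0 - 1.0 / np.diag(np.linalg.inv(R))

# Cross-check the SMC of measure 0 against a direct regression R^2:
X = np.column_stack([np.ones(500), data[:, 1:]])
y = data[:, 0]
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
r2 = 1.0 - (y - X @ coef).var() / y.var()
print(smc[0], r2)  # the two agree
```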
Voting for a Political Candidate Under Conditions of Minimal Information
Masao Nakanishi, Lee G. Cooper, Harold H. Kassarjian; The Journal of Consumer Research, Vol. 1, No. 2, (September, 1974), 36-43.
Until very recently, the major focus of research in the field of consumer behavior has been on the selection of products, brands, and decision choices, primarily in the sphere of marketing. The purpose of this paper was to modify a model developed to measure market share so that it accounts for the variables that enter into the selection of a political candidate and predicts voting behavior.
Simplified Estimation Procedures for MCI Models
Masao Nakanishi, Lee G. Cooper; Marketing Science, Vol. 1, No. 3, (Summer, 1982), 314-322.
Structural transformations of the MCI model are presented which make the model easy to estimate with dummy variables using widely available regression packages. The MCI model is empirically shown to provide better predictive power than several other models of similar form which, unlike the MCI model, do not produce logically consistent market-share estimates.
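The transformation usually associated with this estimation approach is log-centering: taking logs of shares and explanatory variables and subtracting the row-wise (choice-set) means turns the nonlinear share equation into a linear regression. A minimal sketch with invented data (one explanatory variable, no dummy variables, which is a simplification of the paper's setup):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: market shares driven by price for m brands over
# T periods; the brand count, prices, and beta_true are all invented.
T, m = 50, 4
price = rng.uniform(1.0, 3.0, size=(T, m))
beta_true = -2.0
attraction = price ** beta_true * rng.lognormal(0.0, 0.1, size=(T, m))
share = attraction / attraction.sum(axis=1, keepdims=True)

def log_center(x):
    # Deviations from the row-wise mean of logs (the geometric mean)
    # remove the share denominator and linearize the MCI equation.
    logx = np.log(x)
    return logx - logx.mean(axis=1, keepdims=True)

y = log_center(share).ravel()
X = log_center(price).ravel()[:, None]

beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
print(beta_hat)  # close to beta_true
```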
Standardizing Variables in Multiplicative Choice Models
Lee G. Cooper, Masao Nakanishi; The Journal of Consumer Research, Vol. 10, No. 1, (June, 1983), 96-108.
To use multiplicative competitive interaction (MCI) models as part of a theory of the evaluative process in choice, we need a method to transform interval-scale consumer judgments into positive, ratio scales. We develop a coefficient, zeta-squared, that possesses the needed scale requirements and other theoretically desirable properties, and report four research studies to demonstrate the diversity of applications of multiplicative choice models using zeta-squared. We also discuss the relations of MCI models to Luce choice models to illustrate the potential of zeta-squared for representing the effects of similarity on choice, and consider some of the benefits of standardizing variables in MCI models or multinomial logit models.
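A small sketch of the transform, taking zeta-squared in the form usually associated with it (standardize, then map z ≥ 0 to 1 + z² and z < 0 to 1/(1 + z²)); the ratings below are invented:

```python
import numpy as np

def zeta_squared(x):
    # Map interval-scale scores to a positive ratio scale: standardize
    # within the set, then send z >= 0 to 1 + z^2 and z < 0 to
    # 1/(1 + z^2), so results are positive and equal 1 at the mean.
    z = (x - x.mean()) / x.std()
    return np.where(z >= 0, 1.0 + z**2, 1.0 / (1.0 + z**2))

ratings = np.array([2.0, 4.0, 5.0, 7.0])  # illustrative judgments
zs = zeta_squared(ratings)
print(zs)  # all positive, preserving the rank order of the ratings
```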
Two Logit Models for External Analysis of Preferences
Lee G. Cooper, Masao Nakanishi; Psychometrika, Vol. 48, No. 4, (December, 1983), 607-620.
A logit vector model and a logit ideal point model are presented for external analysis of paired comparison preference judgments aggregated over a homogeneous group. The logit vector model is hierarchically nested within the logit ideal point model so that statistical tests are available to distinguish between these two models. Generalized least squares estimation procedures are developed to account for heteroscedastic sampling error variances and specification error variances. Two numerical illustrations deal with judgments concerning employee compensation plans and preferences for salt and sugar in the brine of canned green beans.
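The core of the vector model can be sketched as follows: the probability that one stimulus is preferred to another is a logistic function of the difference in their projections on a preference vector. The coordinates and the vector below are invented, and the estimation machinery (generalized least squares) is not shown:

```python
import numpy as np

rng = np.random.default_rng(6)

# Hypothetical external stimulus space and preference vector.
n_stim, n_dim = 6, 2
coords = rng.normal(size=(n_stim, n_dim))   # stimulus coordinates
w = np.array([1.0, -0.5])                   # invented preference vector

def p_prefer(j, k):
    # Logit vector model: preference probability depends on the
    # difference of projections on w through the logistic function.
    return 1.0 / (1.0 + np.exp(-(coords[j] - coords[k]) @ w))

print(p_prefer(0, 1), p_prefer(1, 0))  # complementary probabilities
```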
Competitive Maps: The Structure Underlying Asymmetric Cross Elasticities
Lee G. Cooper; Management Science, Vol. 34, No. 6, (June, 1988), 707-723.
A special case of three-mode factor analysis is used to portray the systematic structure underlying asymmetric cross elasticities for a broad class of market-share attraction models. Analysis of the variation over retail outlets and weeks reveals competitive patterns corresponding to sales for the major brands in the market as well as patterns reflecting shelf-price competition. Analysis of the brand domain results in a joint space. One set of brand positions portrays how brands exert influence over the competition. The other set of points portrays how brands are influenced by others. The interset distances (angles) provide direct measures of competitive pressures. Maps are formed as spatial representations of each of the competitive patterns discovered.
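The joint-space idea can be illustrated in miniature: an asymmetric cross-elasticity matrix decomposed so that one set of brand points captures influence exerted and the other captures influence received. The sketch below uses a plain two-mode SVD of a single invented matrix, a simplified stand-in for the three-mode analysis in the article:

```python
import numpy as np

# Invented asymmetric cross-elasticity matrix for three brands
# (rows: brand whose action changes; columns: brand whose share responds).
E = np.array([
    [-1.2,  0.4,  0.1],
    [ 0.6, -0.9,  0.2],
    [ 0.1,  0.3, -0.7],
])

U, s, Vt = np.linalg.svd(E)
influence   = U[:, :2] * np.sqrt(s[:2])   # how brands exert influence
receptivity = Vt[:2].T * np.sqrt(s[:2])   # how brands are influenced

# Inner products between the two point sets reproduce the elasticities
# up to the discarded (smallest) singular value.
approx = influence @ receptivity.T
print(np.abs(E - approx).max())
```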
Modeling Asymmetric Competition
Gregory S. Carpenter, Lee G. Cooper, Dominique M. Hanssens, David F. Midgley; Marketing Science, Vol. 7, No. 4, Special Issue on Competitive Marketing Strategy, (Autumn, 1988), 393-412.
The effects of the marketing actions of one brand can be distributed among its competitors' market shares in a complex manner. This paper presents and illustrates methods for modeling brand competition and brand strategies in markets where competitive effects can be differentially and asymmetrically distributed. We discuss the empirical specification, parameter estimation and competitive-strategy implications of the models proposed. Price and advertising competition among eleven brands of an Australian household product is used to illustrate the application of these procedures.
The Discounting of Discounts and Promotion Thresholds
Sunil Gupta, Lee G. Cooper; The Journal of Consumer Research, Vol. 19, No. 3, (December, 1992), 401-411.
This study examines consumers' response to retailers' price promotions. It shows that consumers discount the price discounts. It also suggests that the discounting of discounts and changes in purchase intention depend on the discount level, store image, and whether the product advertised is a name brand or a store brand. The study goes one step further to investigate the existence of promotion thresholds. We use experimental data and an econometric methodology to gather empirical evidence that consumers do not change their intentions to buy unless the promotional discount is above a threshold level. This threshold point differs for name brands and store brands. Specifically, we find that the threshold for a name brand is lower than that for a store brand. In other words, stores can attract consumers by offering a small discount on name brands while a larger discount is needed for a similar effect for a store brand. The study also indicates the existence of a promotion saturation point above which the effect of discounts on changes in consumers' purchase intention is minimal. These results confirm consumers' S-shaped response to promotions.
Truth in Concentration in the Land of (80/20) Laws
David C. Schmittlein, Lee G. Cooper, Donald G. Morrison; Marketing Science, Vol. 12, No. 2, (Spring, 1993), 167-183.
Among the more prominent truisms in marketing are 80/20 type laws, e.g., 20 percent of the customers account for 80 percent of the purchases. These kinds of statistics indicate a certain degree of concentration in customer purchases; i.e., the extent to which a large portion of the product's total purchases are made by a small fraction of all customers. Such concentration levels, suggesting that markets can be segmented in various ways, are often reported in basic marketing texts. We show that a meaningful interpretation of these concentration statistics is not nearly as easy or immediate as computing them. The key factors influencing the degree of apparent concentration in purchases are reviewed, and we present a modeling approach for estimating the true level of relevant concentration among customers.
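One factor behind apparent concentration can be simulated directly: even when customers differ in their true buying rates, the purchase counts we observe add Poisson-style noise on top, so observed concentration overstates the true concentration in rates. A sketch with invented gamma/Poisson parameters (not the paper's estimates):

```python
import numpy as np

rng = np.random.default_rng(4)

# True buying rates vary across customers (gamma heterogeneity);
# observed purchases are Poisson counts around those rates.
n = 100_000
rates = rng.gamma(shape=0.8, scale=2.0, size=n)   # "true" rates
purchases = rng.poisson(rates)                     # one period observed

def top20_share(x):
    # Fraction of the total accounted for by the top 20% of customers.
    x = np.sort(x)[::-1]
    return x[: int(0.2 * len(x))].sum() / x.sum()

print(top20_share(rates), top20_share(purchases))
# Observed purchases look more concentrated than the true rates are.
```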
Breeding Competitive Strategies
David F. Midgley, Robert E. Marks, Lee G. Cooper; Management Science, Vol. 43, No. 3, (March, 1997), 257-275.
We show how genetic algorithms can be used to evolve strategies in oligopolistic markets characterized by asymmetric competition. The approach is illustrated using scanner tracking data of brand actions in a real market. An asymmetric market-share model and a category-volume model are combined to represent market response to the actions of brand managers. The actions available to each artificial brand manager are constrained to four typical marketing actions for each brand, drawn from the historical data. Each brand's strategies evolve through simulations of repeated interactions in a virtual market, using the estimated weekly profits of each brand as measures of its fitness for the genetic algorithm. The artificial agents bred in this environment outperform the historical actions of brand managers in the real market. The implications of these findings for the study of marketing strategy are discussed.
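The evolutionary loop can be sketched in miniature. Below, a "strategy" is a string of weekly marketing actions and "profit" is an invented fitness function; in the paper, fitness instead comes from estimated market-response models, and nothing about the payoffs here reflects the actual study:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy setup: a strategy is 12 weekly actions coded 0-3, with an
# invented payoff per action and a penalty for repeating an action.
N_WEEKS, POP, GENS = 12, 40, 60
PAYOFF = np.array([1.0, 1.4, 0.9, 1.2])   # hypothetical payoffs

def fitness(strategy):
    base = PAYOFF[strategy].sum()
    repeats = (strategy[1:] == strategy[:-1]).sum()
    return base - 0.5 * repeats

pop = rng.integers(0, 4, size=(POP, N_WEEKS))
for _ in range(GENS):
    fit = np.array([fitness(s) for s in pop])
    # Tournament selection, one-point crossover, mutation, elitism.
    idx = rng.integers(0, POP, size=(POP, 2))
    winners = np.where(fit[idx[:, 0]] >= fit[idx[:, 1]], idx[:, 0], idx[:, 1])
    parents = pop[winners]
    mates = parents[rng.permutation(POP)]
    children = parents.copy()
    for i in range(POP):
        cut = rng.integers(1, N_WEEKS)
        children[i, cut:] = mates[i, cut:]
    mutate = rng.random(children.shape) < 0.02
    children[mutate] = rng.integers(0, 4, size=int(mutate.sum()))
    children[0] = pop[np.argmax(fit)]      # keep the best strategy found
    pop = children

best = max(pop, key=fitness)
print(fitness(best))
```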
Strategic Marketing Planning in Turbulent Environments: The Case of PromoCast
Lee G. Cooper, Troy Noble, Elizabeth Korb; Canadian Journal of Marketing Research, 19 (1999), 46-66.
Even in times of turbulent change, we must learn to act strategically and plan responsibly. We cannot rely on plans that are outdated as soon as they are written. Our planning documents must be dynamic - updatable as events unfold and new uncertainties arise. Such motivations drove the development of the method for strategic marketing planning presented in Cooper (2000). This article illustrates that method as applied to the introduction of PromoCast.
Strategic Marketing Planning for Radically New Products
Lee G. Cooper; Journal of Marketing, Vol. 64, No. 1, (January, 2000), 1-16.
This article outlines an approach to marketing planning for radically new products, disruptive or discontinuous innovations - those new products or services that change the dimensionality of the consumer decision process. The uncertainty that goes with radical innovation creates a great challenge for marketing managers, but one that must often be confronted in high-technology marketing. The planning process begins with an extensive situation analysis that pays particular attention to environmental change coming from political, behavioral, economic, sociological, and technological sources. These environmental forces are examined from the points of view of the company, the business ecosystem or value network (what we used to call "the industry"), and the infrastructure. The stakeholders and factors identified in the situation analysis are woven into the economic webs surrounding the new product. The webs are mapped into Bayesian networks. This involves a combination of knowledge engineering and specification of focused research projects. The Bayesian nature of the planning document enables planners to update information as events unfold and to simulate the impact that changes in assumptions underlying the web have on the prospects for the new product. This approach to marketing planning provides a dynamic alternative to a static planning document that is outdated before it is read. The method is illustrated using the historical case concerning the introduction of video tape recorders by Sony and JVC, and the contemporary case concerning the introduction of electric vehicles. A complete numerical example concerning a software development project is given in an appendix.
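The updating idea at the heart of the planning document can be shown with a two-node sketch: evidence about one uncertain event revises the probability of success. All probabilities below are invented for illustration and bear no relation to the article's cases:

```python
# Minimal two-node Bayesian update: learning that an "infrastructure
# ready" event occurred revises the product's success probability.
p_infra = 0.6                                # P(infrastructure ready)
p_success_given = {True: 0.7, False: 0.2}    # P(success | infra state)

# Prior predictive probability of success, before any evidence:
prior = (p_infra * p_success_given[True]
         + (1 - p_infra) * p_success_given[False])

# An event unfolds: the infrastructure turns out to be ready.
posterior = p_success_given[True]

print(prior, posterior)  # 0.5 -> 0.7
```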
PromoCast: A New Forecasting Method for Promotion Planning
Lee G. Cooper, Penny Baron, Wayne Levy, Michael Swisher, Paris Gogos; Marketing Science, Vol. 18, No. 3, (Fall, 1999), 301-316.
This article describes the implementation of a promotion-event forecasting system, PromoCast, and its performance in several pilot applications and validity studies. Pilot studies involved retail grocery chains with 95 to 185 stores per trading area. The goal was to provide short-term, tactical forecasts useful for planning promotions from a retailer's perspective. Thus, the forecast system must be able to handle any of the over 150,000 UPCs in each store's item master file and must be scalable to produce approximately 800,000,000 forecasts per year across all the retailers served by ems, inc. This is a much different task from the one that confronts a manufacturer, even one with a broad product line. Manufacturers can benefit from custom modeling in a product line or category. Retailers need a production system that generates forecasts that help promotion planning. Marketing scientists have typically approached promotion analysis from the manufacturer's perspective. One objective of this article is to encourage marketing scientists to rethink promotion analysis from a different perspective.
From the retailer's point of view the "planning unit" is the promotion event. Neither weekly store tracking data nor shopping-trip data from consumer panels are easily aggregated to reflect total sales during a promotion event. We describe the promotion-event databases and the statistical model developed using these databases. The data are the strategic asset. Our goal is to help retailers use their data to increase the profitability of promotions. We have data on the performance of each UPC in each store under a variety of promotion conditions, on each store's adeptness at executing various styles of promotions, as well as on chain-wide historical performance for each UPC. We use many historical averages from these databases to build a 67-variable, regression-style model. The forecast incorporates a simple bias correction needed when using a log-transformed dependent variable (the natural log of total unit sales). We argue that the historical averages matching the planned ad and display conditions provide a benchmark superior to the widely used "base-times-lift" method. When aggregated into case units (the natural unit for product ordering), 69% of the forecasts in our first validation study were within ± one case, compared to 39% within ± one case using the appropriate historical averages. We report the results of two over-time validity studies that reflect the value of our model for retailers. The limitations and implications of this planning tool for managerial decision making concerning stocking levels are discussed.
Whenever historical data are the strategic asset, we face inherent limitations. Our model does not forecast new products. The forecast error increases when an existing product is promoted in a new way. Over 99.5% of the time, we have full data from which to create a forecast. However, with a database for a typical chain market containing over 20 million promotion events in the 30-month time frame we use, 100,000 events have less than ideal data. The breadth of the database (typically 150,000 UPCs) makes it impractical to incorporate data on competitive offerings. And we find that regression-style modeling is not adept at incorporating information on the 1,200 subcommodities managed in our pilot stores or the 1,000 manufacturers who supply those stores.
Despite these limitations we show the value of using promotion event data, how tactical forecasts based on these data can directly impact the bottom line of grocery retailers, and how store-by-store forecasts can help retailers with problems of running out of stock or overstocking.
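The "simple bias correction" mentioned above is the standard adjustment when a model is fit on log units and forecasts are exponentiated back: E[y] = exp(xβ + σ²/2) under lognormal errors. A sketch on invented data (not the PromoCast model or its 67 variables):

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated promotion events: true log unit sales linear in one driver.
n = 5000
x = rng.normal(size=n)
log_units = 3.0 + 0.8 * x + rng.normal(0.0, 0.5, size=n)
units = np.exp(log_units)

# Fit OLS on log units, then back-transform with and without the
# lognormal bias correction exp(s^2 / 2), s^2 = residual variance.
X = np.column_stack([np.ones(n), x])
coef, *_ = np.linalg.lstsq(X, np.log(units), rcond=None)
s2 = (np.log(units) - X @ coef).var(ddof=2)

naive = np.exp(X @ coef)                 # biased low for expected units
corrected = np.exp(X @ coef + s2 / 2.0)

print(naive.mean(), corrected.mean(), units.mean())
```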
Turning Datamining into a Management Science Tool
Lee G. Cooper, Giovanni Giuffrida; Management Science, Vol. 46, No. 2, (February, 2000), 249-264.
This article develops and illustrates a new knowledge discovery algorithm tailored to the action requirements of management science applications. The challenge is to develop tactical planning forecasts at the SKU level. We use a traditional market-response model to extract information from continuous variables and use datamining techniques on the residuals to extract information from the many-valued nominal variables such as the manufacturer or merchandise category. This combination means that a more complete array of information can be used to develop tactical planning forecasts. The method is illustrated using records of the aggregate sales during promotion events conducted by a 95-store retail chain in a single trading area. In a longitudinal cross validation, the statistical forecast (PromoCast™) predicted the exact number of cases of merchandise needed in 49% of the promotion events and was within ± one case in 82% of the events. The dataminer developed rules from an independent sample of 1.6 million observations and applied these rules to almost 460,000 promotion events in the validation process. The dataminer had sufficient confidence to make recommendations on 46% of these forecasts. In 66% of those recommendations the dataminer indicated that the forecast should not be changed. In 96% of those promotion events where "no change" was recommended this was the correct "action" to take. Even including these "no change" recommendations, the dataminer decreased the case error by 9% across all promotion events in which rules applied.
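The hybrid structure can be sketched simply: fit a market-response regression on the continuous variables, then mine the residuals for rules over a many-valued nominal variable. Below the "mining" step is reduced to a per-category mean-residual adjustment on invented data; the article's actual algorithm derives confidence-weighted rules:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical promotion events: one continuous driver (discount depth)
# plus a nominal variable (manufacturer) the regression cannot encode.
n = 2000
discount = rng.uniform(0.0, 0.5, size=n)
maker = rng.integers(0, 5, size=n)
maker_effect = np.array([0.0, 0.3, -0.2, 0.1, -0.4])  # invented
y = 2.0 + 1.5 * discount + maker_effect[maker] + rng.normal(0.0, 0.1, n)

# Step 1: market-response regression on the continuous variable.
X = np.column_stack([np.ones(n), discount])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ coef

# Step 2: "mine" the residuals over the nominal variable; here just
# the mean residual per manufacturer, used as a forecast adjustment.
rules = np.array([resid[maker == m].mean() for m in range(5)])
forecast = X @ coef + rules[maker]

rmse_base = np.sqrt((resid ** 2).mean())
rmse_rules = np.sqrt(((y - forecast) ** 2).mean())
print(rmse_base, rmse_rules)  # residual rules reduce forecast error
```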