Cell2Cell Case
It was a cold and cloudy morning at Cell2Cell headquarters as Sarah A. Stanford headed to her cubicle holding a cup of cappuccino in one hand and a bulky database manual in the other. Although a veteran of the database marketing field with a doctorate in Statistics, her new assignment was more challenging than she had imagined. Churn management in the wireless industry was complicated, but Sarah was excited to be working on perhaps the most important marketing issue facing Cell2Cell, one of the leading providers of cellular telephone service. Sarah's mission, as described by her boss, Charles R. Morris, was to (1) develop a statistical model for predicting customer churn, (2) use the model to identify the most important drivers of churn, and (3) with these new insights, recommend a customer churn management program to the CBM (Customer Base Management) Group. Sarah always felt comfortable with data crunching, so she wasn't afraid of the first two tasks. But the third task, developing the churn management program, was a different story. It would require her to combine her analytical and creative skills to devise a program that would reduce the number of customer defections.
INDUSTRY BACKGROUND1
The cellular telephone industry has always offered a compelling value proposition: convenient, mobile telephone service. The industry started inauspiciously in 1921, when the Detroit, Michigan Police Department first used a mobile radio in a vehicle. This system, as well as the others that followed, suffered from the same problem: lack of bandwidth. This meant that the radio frequency at which these telephones communicated could support only a limited number of telephones. For mobile telephone service to become a consumer product, this 1st generation or analog era needed a technological breakthrough. That breakthrough came in 1983 with the first United States application of cellular telephone service in Chicago, Illinois. Cellular technology set up a honeycomb pattern of transmitters in a given service area. Cellular phone users were transferred from one transmitter to the next as they traveled through the service
1 See Appendix 1 for a more extensive industry history, including a description of the most recent 3G technology.
Research Associate Emilio del Rio prepared this case under the supervision of Professor Scott Neslin of Dartmouth College, Visiting Scholar 2002 to the Teradata Center for Customer Relationship Management at Duke University, as the basis for class discussion rather than to illustrate either effective or ineffective handling of an administrative situation. Copyright © 2002 by the Fuqua School of Business. All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, used in a spreadsheet, or transmitted in any form or by any means - electronic, mechanical, photocopying, recording, or otherwise - without the permission of the Fuqua School of Business. Version: (A) 8/26/02
area. Although each honeycomb could handle only a limited number of users, the capacity of the system was multiplied because each honeycomb had to support only the customers within its borders. Advanced switching equipment made the hand-off technologically feasible and seamless to the user. The cellular telephone network increased capacity significantly, and by 1987 there were 1,000,000 cell phone subscribers in the US. However, even that system soon began to saturate. The answer was digital transmission technology, the 2nd generation of mobile telephone service. Digital technology increased capacity at lower cost and higher reliability. It was based on slicing each telephone call into digital segments. Different segments from different users could be mixed in the system for efficiency, then re-combined as they were transmitted to the receiving customer. The digital era took off in the 1990s and heralded widespread use of cell phone service. By 2001, there were an estimated 128,000,000 subscribers in the US.
There are two essential requirements for churn models. First, the model should be clear, understandable, and accessible to marketing managers. A model that can only be understood by a statistician creates communication gaps that limit its use in designing the appropriate marketing actions for retaining a customer. Second, a churn model is only as good as the historical data on which its predictions are based. Hence, churn management requires a frequently updated, extensive database, i.e., heavy investment in information technology. In conclusion, effective churn management is becoming a matter of survival for cellular carriers. One way to address the challenge is through a Proactive Retention Program, whereby customers at high risk of churning are identified ahead of time and targeted with appropriate marketing actions. Churn modeling offers the promise of facilitating that effort.
CELL2CELL HISTORY
Cell2Cell is the 6th largest wireless company in the US, with approximately 10 million subscribers. It serves more than 210 metropolitan markets, comprising more than 2,900 cities and communities, and covers nearly all 50 states. Starting as a small family-owned company, Cell2Cell grew exponentially thanks to wise, consistent decisions and a powerful market vision. Today, it employs nearly 20,000 people and has one of the largest retail store networks in the country. Aggressive management and a skilled engineering department helped the company make the correct investment decisions and the best technology implementations with efficient capital expenditure. The company went public in 1992. Immediately after, its stock price skyrocketed, increasing Cell2Cell's power to enter its most ambitious mergers & acquisitions period. The second turning point came in 1994 with the FCC auction of new digital service licenses, when the company surprisingly won 18 major trading areas (MTAs). From that moment on, fueled with major public and private funding, Cell2Cell began the expansion process that led to today's company. The greatest strengths of the firm are its network infrastructure and its marketing. The company launched an aggressive network partnership program through which affiliates expand the company's coverage by offering Cell2Cell service in their service areas. In this way, the company secured coverage with its own network in the primary markets while the affiliates brought service to secondary markets. On the marketing side, the company built a large distribution system of stores throughout the country, making Cell2Cell phones and services available in more than 8,000 retail outlets, including more than 150 Cell2Cell retail stores. The company's products can also be found at Best Buy, CompUSA, Wal-Mart and Office Depot. Recently, the firm struck a deal to make its services available online through Barnes & Noble.
Cell2Cell's marketing campaigns focus primarily on brand recognition, coverage and quality of service. They tout the reach of its national network using mass advertising media such as television, radio and print ads. Additionally, Cell2Cell's hallmark has always been its customer service; the company has received many awards recognizing its outstanding customer satisfaction. Like many competitors, Cell2Cell is vulnerable to the current state of the stock market. The company recently cancelled a large stock offering and is currently focusing on other ways to generate funds. Also involved in the 3G race, Cell2Cell feels the financial crunch from investing in the requisite network infrastructure. Additionally, the company has recently reported a decrease in customer acquisition and a slight increase in customer churn. This has created even more pressure on Cell2Cell's marketing managers.
In the late nineties, at the peak of customer acquisition in the cellular industry, Charles asked one of his employees to calculate how many customers were leaving the company. A couple of days later, he was startled by the results: a little more than 5% of subscribers were leaving the company every month! That meant that more than half of the customers who began the year with the company were gone by the end of it! Charles was aware that the company's huge annual acquisition rate more than compensated for this loss. Nonetheless, he thought there was an important warning embedded in that figure. He presented his vision to top management with a proposal to create a Customer Churn Unit, but the project was turned down on the basis that "new customers are the center of the strategy and there are many years ahead until we start worrying about churn." Charles was disappointed with this decision. A year later, a Cell2Cell board member became aware of Charles's work and ideas about customer care, and offered him the head position at the Database Marketing CRM Unit. Charles saw the offer as an excellent opportunity to start tackling new problems from a fresh perspective, and so joined Cell2Cell.
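The arithmetic behind Charles's alarm is simple compounding: a monthly churn rate, applied twelve times, can erase roughly half the year-start base. A minimal sketch (the 5% and 6% rates below are illustrative; the case only says "a little more than 5%"):

```python
# Sketch of the churn arithmetic in the case: a constant monthly churn
# rate compounds over 12 months, so rates a bit above 5% a month
# really do remove more than half the customer base in a year.

def annual_churn(monthly_churn):
    """Fraction of the year-start subscriber base lost after 12 months,
    assuming the monthly churn rate stays constant."""
    return 1 - (1 - monthly_churn) ** 12

print(f"{annual_churn(0.05):.1%}")  # 5% monthly -> 46.0% gone in a year
print(f"{annual_churn(0.06):.1%}")  # 6% monthly -> 52.4% gone, "more than half"
```

Note that the annual loss is not simply 12 x 5% = 60%, because each month's churn applies to an already-shrunken base.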
initiative is significant, and therefore the results have to be impressive. Charles calculated that for a predictive model to be effective, it has to obtain a lift of at least 1.75 to 1 in the target group. This means that the group to be targeted with the retention offer has to be at least 75% more likely to churn than the average customer. Even though these issues made the proactive approach challenging, both Cell2Cell and Charles R. Morris recognized that managing churn and customer relationships is destined to be a key differentiator in the race for subscribers. Charles knew from his previous successes with database marketing that it takes time to refine these types of techniques, and that on many occasions it is necessary to test them, and even fail with them, before they can be corrected and improved.
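Charles's 1.75-to-1 criterion can be made concrete. One common way to measure it is top-decile lift: rank customers by predicted churn probability and compare the churn rate in the top 10% against the overall rate. The sketch below uses a toy data set; the scoring function, the decile cutoff, and the numbers are illustrative assumptions, not Cell2Cell's actual model:

```python
# Sketch: computing top-decile lift for a churn model's scores.
# Lift = churn rate among the top 10% highest-scored customers,
# divided by the overall churn rate. A lift of 1.75 means the
# targeted group is 75% more likely to churn than average.

def top_decile_lift(scores, churned):
    """scores: predicted churn probabilities; churned: 1 if the
    customer actually churned, 0 otherwise."""
    ranked = sorted(zip(scores, churned), key=lambda p: p[0], reverse=True)
    n_top = max(1, len(ranked) // 10)          # size of the top decile
    top_rate = sum(c for _, c in ranked[:n_top]) / n_top
    overall_rate = sum(churned) / len(churned)
    return top_rate / overall_rate

# Toy illustration: 10 customers, hypothetical model scores, 1 = churned.
scores  = [0.9, 0.8, 0.7, 0.6, 0.5, 0.4, 0.3, 0.2, 0.1, 0.05]
churned = [1,   1,   0,   0,   1,   0,   0,   0,   0,   0]
print(top_decile_lift(scores, churned))  # well above the 1.75 threshold
```

In this toy case the single top-decile customer churned (rate 100%) against a 30% base rate, giving a lift of about 3.3; a model that merely matched the base rate in its top decile would have a lift of 1.0.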
Total Service Revenues (in $ billions)
Ending Subscribers (in millions)
Subscriber Growth (%)
Subscriber Penetration (%)
Avg. Monthly Service Revenues per Subscriber ($)*
Avg. Monthly Service Revenues per Subscriber ($)~

* Including roaming revenue; calculated by Standard & Poor's.
~ Excluding roaming revenue; calculated by Standard & Poor's.
Source: Cellular Telecommunications and Internet Association.
Market Share (%):  27.9   21.2   16.4   11.2   7.7    6.5    5.9    3.3
ARPU ($):          49.00  52.38  63.80  61.00  72.00  47.56  49.06  47.26
(2) CTIA's Semi-Annual Wireless Industry Survey
(3) "Attitudes toward Wireless Phones," Peter D. Hart Research Associates, Inc.
(4) "On the Mobile," Dr. Sadie Plant
The Data Era (3rd Generation Technologies - 3G) - A Glimpse of the Future
(1) "Industry Survey: Telecommunications: Wireless," Standard & Poor's, November 1, 2001
(2) "3G Market Evolution Primer," Sapan A. Shahani, Cambia Networks
(3) "US Wireless Carrier Market Trends," Mark Beamen, Faulkner Information Services
(4) "The Rise of the 3G Empire: US Wireless Equipment," Deutsche Bank Securities Inc.
APPENDIX 1
The Analog Era (1st Generation Technologies - 1G) - The Cellular Technology2,3,4,5
The first application of mobile communications dates from 1921, when the Detroit, Michigan Police Department made the earliest significant use of mobile radio in a vehicle.6 The system in place was a one-way transmitter and it operated at a frequency of 2 MHz. Soon after, the channels on this band became overcrowded, leaving no room for other services. However, the initiative proved that mobile communication was technically feasible. In the early 1940s, new frequencies between 30 and 40 MHz became available, and the first public mobile radio telephone system in the US was inaugurated in St. Louis, Missouri; later, a new public system began operations along the highway between New York and Boston. By then, it was clear that a new market had been developing for this technology, and opportunities became evident for the established carriers. In 1947, unable to satisfy demand on the existing frequencies but fully recognizing the enormous potential of the business, AT&T proposed that the Federal Communications Commission (FCC) allocate a larger number of frequencies so that mobile telephone service could become a mass phenomenon. However, in 1949, in one of the most controversial decisions in US technological history, the commission turned down the proposal on the basis that other uses of the spectrum (i.e., for police units and fire departments) would better serve the public interest. This ruling eroded the telephone companies' incentives for research and development in this sector and shifted their focus to the better-secured basic telephone services market. Scant attention was given to mobile services throughout the 1950s and 1960s, consigning the idea to near oblivion. This era is often cited as the main reason why the wireless sector and its technology are much more advanced in the rest of the world, mainly Europe followed by Asia, than in the US.
Nearly two decades later, the lack of innovation took its toll: in 1968, the FCC faced "serious congestion" on the frequencies then available. It started to show interest, for the first time, in a truly efficient, high-capacity mobile telephone service. Consequently, the FCC reconsidered its previous position, stating that if the technology for building a better mobile service worked, then the Commission would increase the frequency allocation for more mobile phones. AT&T took advantage of the opportunity and retrieved a concept it had developed back in 1947: cellular architecture. The communications giant was the only company that responded to the FCC's request for proposals with a technical report asserting feasibility. Basically, the cellular idea was to divide the target area into small cells, each a few miles in radius. Each cell would have a low-height, low-power base station (a two-way transmitter) that would link to the other cells and to the public wireline telephone network through a central switch (Exhibit 1 - Cellular Network). As a result, all of these individual areas would work collectively and cover the entire area. Each tower or base station would use only a few of the total frequencies allocated to the system, and as mobile users moved across the cells, their communications would be handed off from cell to cell (from tower to tower) and among the available frequencies without any noticeable transition. The most
2 "A Brief History of Cellular," Waveguide, http://www.wave-guide.org/archives/waveguide_3/cellularhistory.html
3 "Mobile Telephone History," TelecomWriting.com
4 "How Cell Phones Work," Marshall Brain and Jeff Tyson
5 "Cellular Telephone Basics," Tom Farley
6 Dobson, Kenneth S., "How Detroit Police Reinvented the Wheel," The Detroit News, http://detnews.com/history/police/police.htm
important twist in this architecture is that the same frequency could be re-used in several simultaneous conversations; this is the chief difference between traditional mobile systems and cellular systems. In older mobile telephone services a single frequency served an entire area, but in a cellular system that frequency is used again and again. This re-usability feature solved the two main obstacles: the limited availability of frequencies and their inefficient use. In summary, the advantages of cells would be low-power operation, handoff, reuse, and cell-splitting (using multiple transmitters or towers in the same geographic zone or cell in order to handle more traffic). From a commercial standpoint, the beauty of the cellular concept was that a finite number of frequencies could accommodate a large, and theoretically infinite, number of customers. The cellular approach therefore enabled telephone companies to transform narrow, limited mobile radio systems into a massive consumer item. Nonetheless, the technology faced two big barriers: testing the concept in the real world (the entire cellular idea had been developed in an internal AT&T paper and had never been proved empirically, although much indicated it would be straightforward to implement) and obtaining the FCC's approval. By 1977, with this architecture in hand and after a long and complicated authorization process with the FCC, AT&T and Bell Labs constructed and operated the first developmental analog cellular system in the Chicago area with over 2,000 trial customers. This early network, using large-scale integrated circuits throughout, a dedicated computer and switching system, and custom-made mobile telephones and antennas, proved that a large cellular system could work. Despite this huge improvement, the Bell System's impending breakup and a new FCC competition requirement delayed cellular once again.
It wasn't until October 12, 1983 that the FCC finally authorized the regional Bell operating company Ameritech to begin the first United States commercial 800 MHz cellular service in Chicago, Illinois. This first generation of services (1G) was called AMPS, or Advanced Mobile Phone Service, and the access type (the underlying technology the network uses for communication management) was FDMA (Frequency Division Multiple Access). As incredible as it might appear, despite the obvious appeal of mobile communication, it took cellular services 37 years to become commercially available in the US. However, a new telecommunications era had begun.
The Digital Era (2nd Generation technologies - 2G) - The Wireless Boom
The success was practically instantaneous: consumers welcomed the new cellular services, which quickly outstripped the old standards; demand boomed, and by 1987 subscribers exceeded one million in the US. But analysts realized that the first-generation AMPS network would be saturated by the end of the decade. This outlook left companies with three clear alternatives: (i) move into new spectrum bands, (ii) split existing cells into smaller cells, or (iii) introduce new technologies that used the existing bandwidth more efficiently. Given that the first option was unacceptable to the FCC and the second required large capital expenditures on new network equipment, the FCC decided that the best route would be new technology. As a result, in 1987 the FCC decided to stimulate innovation by declaring that cellular licensees could employ alternative cellular technologies in the same 800 MHz AMPS frequency band as long as these solutions did not interfere with existing services. The industry then searched for new transmission techniques that would increase the efficiency of the radio spectrum, lower system costs, and improve the reliability and capacity of the network. To better organize this effort, the Cellular Telecommunications Industry Association (CTIA) was established in the US to work with cellular service operators and industry manufacturers. They collectively defined a series of technology requirements for the new concept and set the milestone of introducing the new products and services by 1991.
Indeed, in early 1991, after many proposals and debates, the first version of the TDMA (Time Division Multiple Access) IS-54 standard was released (a.k.a. Digital AMPS). A few years later, in 1994, this was replaced by the improved TDMA IS-136 standard (also called D-AMPS), which is still used by many carriers in the US today. TDMA is so named because the frequency bands used in the conventional FDMA technique were divided into time slots, with each user having access to one time slot at regular intervals. It does this by digitally slicing parts of each voice conversation into a single data stream of 1s and 0s, like filling up one boxcar after another with freight. The technique could not have existed without the invention of the digital signal processor (DSP) by Texas Instruments in late 1983. The DSP is to cell phones what the microprocessor is to the computer. In addition to being digital, TDMA technology helped solve network saturation problems by tripling AMPS network capacity. After 1991, other digital standards emerged around the world: TDMA (described above), GSM (Global System for Mobile Communication), and CDMA (Code Division Multiple Access). So, in 1994, the FCC finally decided to allocate new spectrum (outside the traditional 800 MHz band) specifically for what it called Personal Communication Services (a.k.a. PCS), the second generation of cellular technologies (2G). For this, the commission conducted a series of auctions to sell 99 licenses in the 1.9 GHz band to provide wireless PCS services across the US and its territories. Unlike the European approach (cooperation among carriers), there was no requirement that the bidders conform to any one technology standard (competition among carriers). The result was US$20 billion going to the US Treasury and a heterogeneous telecommunications infrastructure.
In the years following the auction, network operators deployed new cellular services in each PCS technology. AT&T, BellSouth and Southwestern Bell used TDMA. Sprint PCS, GTE, Bell Atlantic and AirTouch chose CDMA. Microcell, Sprint Spectrum, BellSouth and Omnipoint leaned toward GSM. This last standard was the most widely adopted technology around the world, especially in Europe (Exhibit 2 - Standards Adoption). With the advent of digital cellular services, the industry literally exploded (Exhibit 3 - Digital vs. Analog Devices). With the capacity problems subdued and the FCC's blessing, wireless operators finally had an open field in which to exploit the huge opportunity this market represented. As a consequence, cellular market growth in the US exceeded all earlier forecasts, and subscriber growth became exponential (Exhibit 4 - Subscriber Growth). The keys to this success were greater marketing efforts, a well-defined market, a clear and convenient product, expanding distribution, lighter handsets, greater cellular functionality and the phone's transition into a lifestyle accessory (Exhibit 5 - Popular Devices). Wireless communications are having a wide-ranging impact on the US and the rest of the world. The cell phone as a product and concept shifted from a convenient communication device to a valuable tool that occupies an instrumental part of life. A recent survey conducted by Peter D. Hart Research Associates revealed that (i) 62% of Americans would prefer a wireless phone to television if stranded on a desert island, (ii) 64% plan to purchase a wireless phone, and (iii) 62% said that wireless phones are improving their lives significantly. From a niche product, cellular has moved quickly to the mass market (Exhibit 6 - Handset Sales). Recently, a ground-breaking study commissioned by Motorola identified a variety of behaviors that demonstrate the dramatic impact cell phones are making as accessories for conducting life, love and work.
"Whatever it is called and however it is used, the cell phone alters the possibilities and practicalities of many aspects of everyday life," says Dr. Sadie Plant, the study's chief researcher. "The cell phone changes the nature of communication, and affects identities and relationships. It affects the development of social structures and economic activities, and has a considerable bearing on its users' perceptions of themselves and the world."
Among other powerful findings, the study revealed that cell phones have given people a new-found personal power, enabling unprecedented mobility and allowing them to conduct their business wherever they go. Additionally, the study identified six distinctive types of cell phone users based upon common traits and characteristics, and compared these types with six different kinds of birds. Owls, for example, tend to keep their cell phone use to a minimum, making and taking only necessary calls, while starlings tend to be more aggressive, pushing their way through crowds while talking loudly on their cell phones. "The cell phone is helping people to cross borders - both physical and cultural," said Helen Normoyle, senior director of Consumer Insights for Motorola's Personal Communications Sector. "The ever-evolving changes in the way it is used may tell us much about the changing nature of the world and its cultures in the future."
The Data Era (3rd Generation Technologies - 3G) - A Glimpse of the Future
From the analog era to the digital era, one thing was always true: wireless communications took the world by storm. The combination of analog and digital networks propelled the industry to new levels; falling prices enabled explosive growth in demand, and plunging equipment costs enticed network operators to constantly improve their networks and coverage. As better and newer services became available, businesses and consumers became hooked on the convenience and flexibility offered by wireless technology. In fact, during the late 90s wireline telecommunications revenues and subscribers declined consistently, whereas wireless experienced a steep increase (Exhibit 7 - Wireless vs. Wireline). However, the dynamism the wireless market demonstrated during its first two eras attracted a huge number of players, and competition became increasingly intense. In the US, industry consolidation combined with a savage war for each and every customer led to a clear decline in net average revenue per user (ARPU). According to the CTIA, since 1980 this metric declined from $100 a month to a low of $39 a month in 1998 (Exhibit 8 - ARPU). Operators answered this downward trend by simplifying their pricing structures and offering fixed-rate packages of minutes with no roaming charges. As a result, they increased minutes of use per customer (and thereby slightly raised ARPU), but at the irreversible cost of sacrificing margin per user. Although these actions partially stopped the bleeding, it is apparent that the days of wireless carriers relying on voice traffic as a growth driver will soon end. Henceforth, companies will have to shift away from the voice business model they have been leveraging over the past decade to a new one. The solution? Data services.
With voice services likely to become a low-margin (though necessary) product, carriers are expecting wireless data to revive their ARPUs and data services to become the next moneymaking machine in the industry. The technology that will enable this third era is the third generation of wireless technologies. 3G will offer faster, more robust and ubiquitous data/internet connections and will finally lead to the total wireless convergence of voice, data and video. Additionally, the adoption of 3G will enable carriers to double their voice spectrum and therefore double the voice traffic their networks can handle; in fact, many industry analysts believe that this, rather than the availability of data services, is the main advantage of 3G. As with the earlier adoption of 2G technologies, where three basic and incompatible protocols were developed (TDMA, CDMA and GSM), the proponents of 3G are pushing the adoption of two main technologies: WCDMA (Wideband CDMA, or UMTS) and CDMA2000. The first is the 3G technology for carriers that have GSM networks (nearly all European and Asian carriers; in fact, GSM is the most widely adopted protocol in the world), whereas CDMA2000 is the 3G technology for carriers that had implemented CDMA technologies (most US carriers). What happened to
those who had chosen TDMA technologies? They can choose either 3G standard (Exhibit 9 - Technology Map). While CDMA2000 and WCDMA are incompatible but very similar in performance and reliability, they have a couple of crucial differences. CDMA2000 comprises various cost-efficient phases that allow a carrier to implement 3G services smoothly. Most importantly, the technology does not require additional bandwidth and therefore no additional spectrum. By contrast, WCDMA does not have this cost-efficient phased approach, and it requires additional bandwidth. Despite these disadvantages, WCDMA is the most widely adopted standard around the globe (mainly because it is the 3G path for all GSM-type networks), which will result in greater availability (the dream of worldwide roaming) and a greater variety of hardware at lower prices. In the US, major carriers chose between two strategies: (i) Sprint PCS and Verizon Wireless are currently deploying early phases of CDMA2000, and (ii) Cingular, AT&T and VoiceStream, committed to WCDMA technologies, decided to delay their 3G rollouts until 2004 while deploying 2.5G technologies (an enhanced version of 2G technologies) in the meantime. However, many challenges and obstacles cast a shadow on 3G. From a technological standpoint, many analysts argue that substitute technologies such as 2.5G (a transitional technology that shares some features with 3G, like higher bandwidth, data rates and additional capabilities, but is still not as efficient or fast as the third generation) and 802.11b (wireless LANs) will stifle 3G deployment, as they offer many of the features the average consumer is looking for in the mobility category. From a consumer perspective, carriers and OEMs are desperately looking for the killer app that will entice consumers to subscribe to 3G services and pay the premium price they will surely cost.
Many analysts believe that this app will be e-mail, but it is still uncertain whether that sole application will be a big enough draw to the technology. Additionally, many carriers around the world incurred heavy debts to acquire 3G spectrum licenses and to deploy the expensive new networks; this raised doubts about whether carriers can price the services for consumer adoption and still make a profit. Nevertheless, the 3G market presents an outstanding opportunity and a logical evolution from the already mature voice-2G market. The world is slowly but surely moving away from narrowband networks to broadband networks. A recent survey by Jupiter Research showed that the US mobile data market is growing exponentially and is expected to break the 100-million-subscriber mark in 2005 (Exhibit 10 - Data Users). Undoubtedly, 3G will significantly impact the mobile experience for the end user by extending the utility of the mobile phone and by empowering the user with a whole new array of services and uses.