
The Value Of Webcams For Virtual Teams

International Journal of Management & Information Systems, Second Quarter 2012, Volume 16, Number 2

Joel Olson, Kaplan University, USA
Frank Appunn, Kaplan University, USA
Kimberly Walters, Kaplan University, USA
Lynn Grinnell, St. Petersburg College, USA
Chad McAllister, Walden University, USA

ABSTRACT

The latest low-cost technology solutions provide practical and reliable video options from standard personal computers using the Internet. By adding video to an established and geographically dispersed team process, this exploratory research seeks to establish the experience of participants and the perceived effectiveness of the team. Building on the literature, this qualitative research applies a content analysis design to text transcriptions of weekly audio logs from participants. This approach analyzes the rich content provided by team members to discover which elements of trust, technology, and effectiveness find support. By understanding the influences of adding video to teams, leaders and managers should be able to make informed decisions regarding the adoption of video for each participant. The evolution of attitudes regarding the use of the technology over a period of six weeks provides further considerations for deployment.

Keywords: Virtual Teams; Distributed Teams; Webcam Teams; Video Teams

INTRODUCTION

Virtual teams have become a common occurrence within and between organizations, with many studies identifying a variety of methods to improve outcomes (Chen et al, 2007; Hambley et al, 2007; Liu et al, 2008; Sridhar et al, 2007; van der Kleij et al, 2009). Teams often rely on technology to provide a variety of communication options to facilitate performance (Karpova et al, 2009; Kleij et al, 2009; Reed and Knight, 2010; Thomas and Bostrom, 2008; Wiggins, 2009). Against this backdrop, it follows that the evolution of technology will enable the broader deployment of increasing levels of rich-media options. The increased availability of fast network access and the falling real cost of technology allow for the use of increasingly sophisticated rich-media options.

The trend towards increased utilization of virtual teams can also be seen in actual individual and organizational behavior. Organizations have recognized the value of telecommuting or remote users, evidenced by growth of as much as 900% in the number of organizations surveyed in 2004 using telecommuting or remote users (Johnson, 2004). Simultaneously, the general population has indicated its increased comfort with technology through the increased utilization of secure transactions such as e-banking (Bielski, 2004). More recently, reduced cost and wider availability have changed the urban dynamic and led to less need for organizations to establish their offices in a single physical location (Ioannides et al, 2008). The implication is a separation of function and geography. One can find further support for this trend by considering the growth of outsourcing beyond the traditional areas to include service provision (Narayanan et al, 2011). The organizational benefits of outsourcing include access to workers with a better match of skills, reduced cost, and data access. Employees see the benefit of reduced travel, time savings, and improved support for sustainability (Wheelen and Hunger, 2010). Globalization has also driven the increase in utilization of virtual teams.
The competitive nature of business and the rising need for global, quality knowledge workers increase the need to exploit remote integration (Tarique and Schuler, 2010). There is a need for higher levels of interaction and less reliance on simple repetitive tasks conducted at separate locations. Modern business relies on increasingly sophisticated interaction between larger numbers of remote workers. This requires rich communication to support the organization (Wiggins, 2009).

Organizational management and leadership are also affected by this trend of increasing virtual team utilization (Balthazard et al, 2009; Nydegger and Nydegger, 2010). Purvanova and Bono (2009) and Hambley et al. (2007) highlight that leaders can use technology-mediated relationships under the correct conditions to increase performance. Virtual team members also report success in terms of satisfaction (Golden and Veiga, 2008), trust (Greenberg et al, 2007; Robert et al, 2009), and comfort (Lewis et al, 2005).

Objective and Purpose

Challenges remain, as there continue to be reports of the negative influence of technology on teams (Thomas and Bostrom, 2008). Virtual teams are growing and provide a growing proportion of the productive output of organizations. The change in interaction, leading to less social contact and changing methods for sharing tasks, has presented a number of challenges for individuals. Participants have varying degrees of comfort with remote teams due to geographical dispersion, less traditional work hours, and a need for more structure for interaction. In adjusting to the changed interaction and the evolving technology options, team members face a continuing challenge to achieve the level of function found in a traditional setting.

The use of video has the opportunity to provide a new dynamic for individual integration and improved performance at reduced cost for organizations. Video provides increased live interaction, facilitating the focus and attentiveness needed to improve communication and increase levels of trust. Video also approximates the previous practice of leaders and managers, better positioning them to be effective in virtual environments. Video has the potential to continue actualizing the promise of technology by facilitating closer relations, reducing cost, and increasing the productivity of virtual teams.

Research Question

Research on how video influences teams has been limited. Several studies have been limited to students (Bluemink and Järvelä, 2004; Hambley et al., 2007; Jarmon et al, 2009). Other investigations relied on specialized technology not generally available to average users (Couzins and Beagrie, 2004; Hertel et al, 2005; Nakanishi, 2004). This study addressed the question: what is the impact of webcams on the trust and perceived effectiveness of virtual teams? The study used low-cost webcams with no special equipment specification, and it did not make use of any special travel arrangements or training. The subjects in this study had experience with computer-mediated virtual teams using telephone and webinars; however, they had never utilized webcams or video. This experience parallels the experience of many virtual teams, which increases the probability that the study findings would have broad relevance and be scalable to other organizations and teams.
The research question required a qualitative method to explore the individual expectations and experiences of the team members over six weeks. Content analysis was used to contextualize individual experience in light of existing trust and effectiveness theory. A literature review was conducted to find appropriate sources to determine a clear set of attributes to use as a content frame.

LITERATURE

Content analysis relies on finding appropriate sources to define a clear set of specific attributes. A considered review of the literature found trust and perceived team effectiveness to be most important.

Trust

The study of social psychology has not been linear. It reads more like a dictionary of interesting topics than a novel with a clear story line. Trust has been studied for some time, resulting in a number of trust theories; however, an integrative theory of organizational trust has not emerged (Kramer, 1999). The result is a conflicted record of contradictory findings that are difficult to compare (Schiller and Mandviwalla, 2007).

Trust has been pursued in terms of individual choice (Arrow, 1974; Kreps, 1990; Miller, 1992). That individual choice has been framed as being social, rational, and relational. For some individuals, trust choices are about social moral duty. The emphasis is on obligation and duty. These individuals have an internal framework linking trust decisions to appropriate moral action (Jarvenpaa et al, 1998). A utilitarian perspective drives rational choice. Economic (Williamson, 1993) and social (Coleman, 1990) factors are assessed to determine trust decisions. Trust decisions are a rational choice based on the calculation of self-interest (Kramer, 1999). The relational frame has been more popular and has been forwarded by several researchers (Mayer et al, 1995; McAllister, 1995; Tyler and Kramer, 1996). Relational choice has approached trust in terms of individual personality (Frost et al, 1978), culture (Farris et al, 1973), and interpersonal relationships (Deutsch, 1958; Mayer et al., 1995). Interpersonal relationships have been further studied as collective factors (Cummings and Bromiley, 1996) and individual factors (Mayer et al., 1995). Jarvenpaa et al. (1998) linked both collective and individual trust factors to virtual teams.

Trust has been suggested as a key factor influencing the effectiveness of virtual teams. Sarker et al. (2003) defined virtual team trust (VTT) as "the degree of reliance individuals have on their remotely located team members taken collectively (i.e., as a group)" (p. 37). They identified three types of trust that are applicable to virtual teams: personality-based, institutional-based, and cognitive trust. Cognitive trust was further divided into three dimensions: stereotyping (subdivided into message-related, technology-related, and physical appearance/behavior), unit grouping, and reputation. Personality-based trust was defined as trust "that develops during infancy when one seeks and receives help from one's caretakers" (Bowlby as cited in Sarker et al., 2003, p. 37) and results in "a general propensity to trust others" (Rotter as cited in Sarker et al., 2003, p. 39). Institutional-based trust draws on institutional theory, which states that "norms and rules of institutions (such as organizations) surrounding individuals guide their behavior" (Sarker et al., 2003, p. 37).
Sarker et al. (2003) argued that cognitive trust develops through two types of interactions: increased familiarity through tasks, and social interaction not related to tasks (e.g., humor, personal anecdotes). Cognitive trust can be broken down into three categories: unit grouping, reputation categorization, and stereotyping. Unit grouping "refers to the fact that team members share common goals that make them see each other positively and trustingly" (Sarker et al., p. 37). Reputation categorization suggests that "individuals with good reputations are trusted" (Sarker et al., p. 37). Finally, positive stereotypes based on physical appearance or other interaction modes lead to trusting. Sarker et al. (2003) developed and validated a survey of virtual team trust based on these factors.

Perceived Team Effectiveness

Perceived team performance has been defined in multiple ways in the literature. The current study follows the exploratory work of Lurey and Raisinghani (2001), who presented a framework for assessing a team's effectiveness. One advantage of this framework is that it contains both process and outcome measures. Thus, information on how teams develop over time can be assessed as well as their overall effectiveness. The framework consists of three factors. The first factor is an outcome measure based on the team's productivity level. Productivity level is defined as "the extent to which the group's output, product, or service, meets the required standards" (Lurey and Raisinghani, 2001, p. 3). A supervisor or other manager not within the team would judge this factor. The remaining factors are process measures. The second factor is the team's ability to learn and improve over time, based on "the process of conducting the work, not the actual outcome that is generated" (Lurey and Raisinghani, 2001, p. 4). This factor incorporates an element of future performance and the team's ability to learn. The third factor relates to individual team members' level of satisfaction. It is also a process variable rather than an outcome variable. This third factor implies that the team has a responsibility to "care for its members and provide the right opportunities for personal development and growth" (Lurey and Raisinghani, 2001, p. 4).

Interestingly, Lurey and Raisinghani (2001) found a primarily insignificant relationship between overall team performance and the teams' tools and communication patterns. However, the specific Pearson correlation between video conferencing and performance was -.43, and between video conferencing and satisfaction was -.23, indicating a significant relationship in a negative direction. Video conferencing was not a primary method of communication for the teams in that study; the majority of the teams used video conferencing only once per month or less frequently. This was suggested as a potential area for future research, with a caveat that other factors were shown to have a greater influence on effectiveness.

DATA COLLECTION AND METHODOLOGY

Data Collection

Data were collected from five participants who were members of a research team at a large online university. Three were faculty members, one was a department chair, and one was a faculty development coordinator. Participants worked for the organization part- or full-time, and all worked virtually. Four participants were men and one was a woman.
Some of the faculty had met once in person at a faculty retreat in January 2010. The team had existed for five months prior to the start of data collection. The team started weekly Adobe Connect sessions in February 2010, with audio via a conference bridge for all members and the group leader using a webcam. The team intensified the video experience in August 2010, using WebEx with all team members on webcams. A baseline audio log was created by each team member the week prior to the intensified video experience. Participants met weekly over a six-week data collection period and recorded impressions of their experiences immediately following each meeting. To record impressions, participants responded to a four-question, open-ended survey. The survey questions were:

1. What impact did video have on your team experience? Why?
2. What impact did video have on the development of trust in your virtual team? Why?
3. What impact did video have on your own effectiveness? The effectiveness of your team? Why?
4. Other comments:

Each weekly log was transcribed by a third-party organization, and identifying information was removed from the transcripts.

Methodology

Content analysis seeks to confirm a preexisting theory within the data, moving from theory through observation to confirmation. It is a deductive approach seeking to confirm historic ideas. This approach is far more structured than most qualitative approaches, with little latitude for the researchers to discover new ideas. Content analysis aims to establish the presence of content in a body of data (Robson, 2002).

Based on the literature, two historic approaches were selected to inform the preparation of codes for this study. For trust, Sarker et al. (2003) provide a system to measure trust on a personality, institutional, and cognitive basis; the latter is subdivided into unit grouping, reputation, and stereotyping. Perceived team effectiveness comes from Lurey and Raisinghani (2001). Analysis of perceived team effectiveness places a focus on satisfaction and performance, where performance includes both the execution and the outcome of the team interaction (see Table 1 for a full list of codes and definitions).

Prior to coding, each researcher created a set of proposed codes based on these two existing theories. The researchers then reviewed their proposed codes to clarify code definitions prior to determining the final codebook. The agreed-upon unit of measure for the text data was the sentence, with no more than two codes per sentence, choosing the most important codes when there were more potential meanings. While each sentence was analyzed individually for meaning, the context of the surrounding text contributed to the interpretation. This context provided important meaning given the unstructured nature of the audio responses.

Intercoder reliability was also addressed continually throughout the coding process. After coding the first study participant, the researchers compared their codes. Reliability statistics (Kappa and percent agreement) were calculated after the completion of coding for each participant's data. If the coders did not reach an acceptable level of agreement for a participant, they reviewed the codebook again to improve their understanding of the definitions. Once agreement was reached, they would code the next participant's data. If acceptable agreement was not reached, they would recode all previous participants' data after a review of the codebook.
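As an illustration only, the short Python sketch below shows how the two reliability statistics named above, percent agreement and Cohen's Kappa, could be computed for a pair of coders. The study itself reported these statistics without describing its calculation tooling, so the function names, the simplification to one code per sentence, and the sample code lists are all hypothetical.

```python
from collections import Counter

def percent_agreement(coder_a, coder_b):
    """Share of sentences to which both coders assigned the same code."""
    matches = sum(1 for a, b in zip(coder_a, coder_b) if a == b)
    return matches / len(coder_a)

def cohens_kappa(coder_a, coder_b):
    """Agreement corrected for chance, using each coder's marginal code frequencies."""
    n = len(coder_a)
    p_observed = percent_agreement(coder_a, coder_b)
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    # Chance agreement: probability that two independent coders with these
    # marginal distributions would assign the same code to a sentence.
    p_expected = sum((freq_a[c] / n) * (freq_b[c] / n)
                     for c in set(coder_a) | set(coder_b))
    return (p_observed - p_expected) / (1 - p_expected)

# Hypothetical sentence-level codes for one participant's weekly log.
coder_a = ["V+", "V+", "Tech", "TUnit+", "PEPerf-", "V-"]
coder_b = ["V+", "Tech", "Tech", "TUnit+", "PEPerf+", "V-"]

print(f"Percent agreement: {percent_agreement(coder_a, coder_b):.1%}")
print(f"Cohen's Kappa: {cohens_kappa(coder_a, coder_b):.2f}")
```

Cohen's Kappa is the more conservative of the two statistics because it discounts the agreement that would occur by chance, which is why it is typically held to a lower acceptance threshold than plain percent agreement.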
Table 1. Codes, Definitions, and Sources

Code | Definition | Source
Video impact positive (V+) | A statement of positive impact of the video | —
Video impact negative (V-) | A statement of negative impact of the video | —
Technology learning curve (Tech) | Technology learning curve present – large, reasonable | —
Trust – Personality positive (TPers+) | Mention of trust related to having the tendency to trust – trusting nature | Sarker, Valacich, and Sarker (2003)
Trust – Personality negative (TPers-) | Mention of trust related to having the tendency to trust – trusting nature | Sarker, Valacich, and Sarker (2003)
Trust – Institutional positive (TInst+) | Mention of trust related to being an employee of the same organization | Sarker, Valacich, and Sarker (2003)
Trust – Institutional negative (TInst-) | Mention of trust related to being an employee of the same organization | Sarker, Valacich, and Sarker (2003)
Trust – Cognitive Unit grouping positive (TUnit+) | Unit grouping (sharing common goals) | Sarker, Valacich, and Sarker (2003)
Trust – Cognitive Unit grouping negative (TUnit-) | Unit grouping (sharing common goals) | Sarker, Valacich, and Sarker (2003)
Trust – Cognitive Reputation positive (TRep+) | Reputation (good reputation = trusted) | Sarker, Valacich, and Sarker (2003)
Trust – Cognitive Reputation negative (TRep-) | Reputation (good reputation = trusted) | Sarker, Valacich, and Sarker (2003)
Trust – Cognitive Stereotyping positive (TSter+) | Stereotyping (physical appearance/behavior) | Sarker, Valacich, and Sarker (2003)
Trust – Cognitive Stereotyping negative (TSter-) | Stereotyping (physical appearance/behavior) | Sarker, Valacich, and Sarker (2003)
Perceived effectiveness – Satisfaction with team positive (PESat+) | Care for members and provide the right opportunities for personal development and growth | Lurey and Raisinghani (2001)
Perceived effectiveness – Satisfaction with team negative (PESat-) | Care for members and provide the right opportunities for personal development and growth | Lurey and Raisinghani (2001)
Perceived effectiveness – Performance – Execution (process, procedures) positive (PEPerf+) | Team's ability to learn and therefore improve itself and its members while conducting its work | Lurey and Raisinghani (2001)
Perceived effectiveness – Performance – Execution (process, procedures) negative (PEPerf-) | Team's ability to learn and therefore improve itself and its members while conducting its work | Lurey and Raisinghani (2001)
Perceived effectiveness – Performance – Outcome positive (PEOut+) | The extent to which the group's output meets the required standard | Lurey and Raisinghani (2001)
Perceived effectiveness – Performance – Outcome negative (PEOut-) | The extent to which the group's output meets the required standard | Lurey and Raisinghani (2001)

RESULTS

There were 1271 sentences across the five participants and the seven logs. The analysis used Microsoft Excel files, merging sheets from each researcher, applying an assortment of text formulae, and then using frequency counts. With the limited number of units, this approach allowed flexibility, negligible training, and accurate assessment. Further text formulae validated the input and identified researcher errors.
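The Excel-based workflow just described could equally be scripted. The sketch below is an assumption for illustration rather than the study's actual tooling: it merges the two researchers' code columns, keeps only the sentences on which they agreed, and reports both absolute frequencies and within-participant percentages. All column names and sample values are hypothetical.

```python
import pandas as pd

# Hypothetical layout: one row per coded sentence, with each researcher's code.
data = pd.DataFrame({
    "participant": ["S1", "S1", "S2", "S2", "S3"],
    "week":        [1, 1, 1, 2, 2],
    "coder_a":     ["V+", "Tech", "V+", "TUnit+", "PEPerf-"],
    "coder_b":     ["V+", "Tech", "V-", "TUnit+", "PEPerf-"],
})

# Keep only the units where both researchers recognized the same code.
agreed = data[data["coder_a"] == data["coder_b"]].rename(columns={"coder_a": "code"})

# Absolute frequency of each agreed code per participant ...
counts = agreed.groupby(["participant", "code"]).size().unstack(fill_value=0)

# ... and the share of each participant's sentences carrying that code,
# which keeps a verbose participant from dominating the comparison.
totals = data.groupby("participant").size()
percentages = counts.div(totals, axis=0).mul(100).round(1)

print(counts)
print(percentages)
```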
With up to two potential interpretations per sentence, Table 2 shows the percentage spread, across the participants, of the 637 instances on which the two researchers agreed. There were 240 occasions where the two researchers did not recognize the same code; these are excluded from Table 2. The implication is that the participants were not restrained or influenced to limit their audio logs to the specific content anticipated in this research. The agreement between researchers results in a simple inter-rater statistic of 72.6% and a Cohen's Kappa of 59%. Both statistics are comfortably above the acceptance norm.

Table 2. Total number of responses and percentage analysis

 | V+ | V- | Tech | TUnit+ | TUnit- | PEPerf+ | PEPerf- | PEOut+ | PEOut- | Codes | Number
Total | 16.1 | 7.8 | 5.5 | 5.6 | 0.9 | 4.6 | 4.2 | 2.2 | 1.4 | 50.1 | 1271
S1 | 20.3 | 9.4 | 6.0 | 5.6 | 0.6 | 4.5 | 0.9 | 0.6 | 1.1 | 49.5 | 531
S2 | 5.9 | 2.0 | 4.6 | 3.9 | 1.3 | 3.9 | 9.2 | 3.9 | 2.6 | 42.5 | 153
S3 | 15.5 | 5.3 | 10.1 | 3.9 | 1.9 | 8.2 | 6.3 | 5.3 | 2.9 | 60.9 | 207
S4 | 9.9 | 10.5 | 2.7 | 6.6 | 0.6 | 1.8 | 6.3 | 0.3 | 0.6 | 42.6 | 333
S5 | 46.8 | | 2.1 | 10.6 | | 12.8 | | 14.9 | | 87.2 | 47
All but S1 | 13.0 | 6.6 | 5.1 | 5.5 | 1.1 | 4.7 | 6.5 | 3.4 | 1.6 | 50.5 | 740

Notes: V+ indicates improvement through video while V- is negative. Tech indicates a technical comment; similarly, TUnit considers cognitive trust within the sample unit, PEPerf perceived effectiveness in execution, and PEOut perceived effectiveness of the outcome. Codes refers to the percentage of sentences that had some recognition, while Number refers to the absolute number of units, or sentences.

The diversity of the participants was confirmed by individual results that diverged across all 19 codes tested in this research. The biggest difference was the positive influence of video, where the response ranged from a high of 46% to a low of 10% of the available codes. Percentages are used to represent this statistic because response rates differed greatly per participant. The most verbose participant provided 531 sentences, or 41% of the total responses, double the average. The lowest response rate, at 47 sentences, represents less than half of the average rate. The implication is that the views of a single participant could overshadow the research. In interpreting the outcome, the researchers considered the absolute number of responses as well as the percentage of responses by participant and in total, on the view that all three perspectives would contribute to the results. The detailed analysis of the results considers the group outcome, individual results, and a review of meaning across the seven audio logs that were spread across two months.

Group Results

Of the 19 codes defined as relevant for this research, 10 found little or no support from the participants. The personality-based trust code had only one positive sentence recognized by the researchers and no negative findings at all. Trust in the shared institution found four points of agreement and no negative support. A cognitive basis of trust for a positive or negative reputation found no support. Similarly, stereotypical cognitive trust, which considers physical appearance and behavior, found only two instances of support. The last code considered less important relates to perceived effectiveness in terms of satisfaction with the team members and individual opportunity.
Participants provided content where the researchers recognized 15 instances of positive satisfaction and 2 negative opinions of satisfaction. At 1.4% of the total matched codes, participants' perceived effectiveness satisfaction was deemed too low for consideration.

Group results regarding video, the target of this research, found 204 instances, or 16.1% of research units, supporting benefits from video webcams. A further 99 occurrences, or 7.8%, identified some negative facet related to video use. This represents twice as many positive comments as negative ones. Cognitive trust within the unit showed a significant positive rating at 4.7%, nearly 7 times larger than the negative.

The perceived effectiveness of process identified 59 points of positive influence and 53 negative. The perceived effectiveness for outcomes showed a stronger proportion of positive results at 28 instances, 50% higher than negative. Numerous studies have identified technology as an issue; the group outcome of this study found 70 comments related to technology, representing 5.5% of all comments. These included both positive and negative comments.

A Participant-Centric View

The five participants in this study were coded as S1, S2, S3, S4, and S5. The most verbose responses came from S1 and the most limited number of responses came from S5. The first consideration regarding participants is to exclude the verbose participant. Considering the remaining four participants resulted in fewer positive comments for video; however, they provided increased support for perceived effectiveness outcomes. Perceived effectiveness processes turned negative. The conflicting data outcomes raised concern regarding the data. Fortunately, further analysis of individual responses provided important insights.

A review of the data showed that there were three different types of respondents. The first and third participants (S1 and S3) provided 2 to 3 times as many positive outcomes for the use of video, despite the concurrent high number of technology comments. Cognitive trust in the team and perceived effectiveness for process were particularly positive. Perceived effectiveness outcomes showed mixed results. A second group, S2 and S4, provided less support for the use of video. S4 provided more negative responses than positive, and surprisingly few technology comments. The introduction of video resulted in concerns for appearance and the degree of attentiveness shown to other participants. Despite this, S4 commented regularly, at 6.6%, that there was an improvement in the cognitive trust within the unit. Both S2 and S4 responded with far more negative comments regarding perceived effectiveness performance, and mixed results for perceived effectiveness outcomes. Finally, S5, the participant with very few responses, provided exceptionally strong support for video, team trust, and both performance measures. Had this participant provided as many responses as S1, the outcome of this research would have shown far stronger support for video.

The analysis of the responses and grouping of individual participants highlights the contradictory experiences of team members that use video. This would also explain why research has found it difficult to provide obvious answers regarding the adoption of technology to improve virtual and remote team processes.
It also highlights the need to anticipate contradictory reactions among staff, and that individuals can reverse expectations. The outcome also points to the importance of the individual and of perceptions in the adoption of technology.

Longitudinal Analysis

The adoption of technology and the benefits derived from learning new technology often require users to become familiar with the features and processes involved. In response to the previous analysis, the data for this research were organized into the seven logs: an initial entry and then six weekly entries recorded immediately after using video. The initial or baseline logs showed an anticipation of positive support for video, cognitive trust, and perceived effectiveness. There was some trepidation regarding technology. The progression through the six weeks showed growing support for video and some reduction in technology issues. Cognitive trust in the team started with a few comments; however, these grew towards the end. There was no specific trend for perceived effectiveness responses.

A review of all longitudinal responses at both individual and group levels identified a number of anomalies. First, individuals would provide very different responses on a weekly basis. One subject had a particularly negative demeanor regarding video, providing 50% of all the negative video comments in the last session. Other participants reacted differently. The fifth log provided a better group response than the sixth, despite the improving trend.

Convergence

Content analysis is a qualitative approach that uses quantitative techniques to analyze and verify findings. The two researchers who analyzed the logs had no common background and had never met or worked together. Validity, or credibility, comes from inter-rater statistics that show convergence. The plain inter-rater statistic should achieve 70% agreement, and this research achieved 72.6%. Another measure, Cohen's Kappa, should achieve 50%; the responses reached 59% for the group, with individual response levels from 53% to 70%. External validity, or transferability, should be strong with a mix of cultures, locations, and technical adeptness within the group. None had worked together in a single physical location and many came from different departments. The diverse individual responses found in the research underscore the breadth of participants. Reliability, or documentation, included an example text, careful tracking of every code, and repeated coding where there was limited convergence. The researchers never considered or compared individual codes, relying instead on shared understanding. The use of percentage responses, rather than counts, overcame the risk of skewed results from a disproportionate number of comments between the different logs.

SUMMARY

Group, individual, and longitudinal analysis provided support for video, leading to some support for improved perceived effectiveness. Despite the general trend, both individual and longitudinal results showed a number of conflicting comments.
Even within a single respondent, one could detect uncertainty, as shown by "and then I realized that it actually did help me stay focused on the task of the call and be more engaged." Further reflection often resulted in introspection and further insights for individuals, such as "another observation that just occurs to me about how I feel is when I'm working, I will wear reading glasses now, and when I'm on this video conference, I don't." The negative participant provided a further insight that would remove technology as a cause with the comment "the more I worked with the video, the more I have determined it has a negative impact." Another reflection supported the traditional view: "so, we weren't very effective as a team trying to learn this new technology." The value of open-ended responses is underscored by a first comment, "it may have limited our effectiveness because we spent less time working on the task," which was followed a few sentences later by "although it was less time chronologically, it was more effective time."

DISCUSSION

Testing existing concepts and ideas from the literature, this qualitative content analysis research found varying degrees of support for the representative literature (Lurey and Raisinghani, 2001; Sarker et al., 2003). In the group of participants, there was no significant support for trust other than cognitive trust within the unit, nor for perceived satisfaction of effectiveness. Perceptions of performance execution had some support, as did performance outcome. The use of video had relatively strong support, with 13% of all codes recognized. All of the items found in the results also had a number of negative comments. In the case of satisfaction of effectiveness, negative comments exceeded positive items. The outcome indicated very large variances between participants in terms of the detail provided, opposing impressions regarding performance, the use of technology, and the value of video. The longitudinal consideration across the reporting weeks showed some support for the growth of comfort with technology; however, this had some limitations. Considering all of the results, one should conclude that individuals provide inconsistent views regarding all forms of performance and trust when the use of video is extended in virtual teams. The ability to provide rich responses seems to have facilitated a deeper insight into the individuals in this group. It also raises some questions regarding existing assumptions of the value of technology for remote groups. Despite the previous finding, it is noteworthy that the majority of participants provided resounding support for continuing the use of video subsequent to this study, and one might want to verify the use of such solutions over periods that exceed seven meetings.

From a research method viewpoint, the wide disparity between participants in the volume of comments, in total and per code, would lead one to suggest more sensitive approaches to implementing technology. Clearly, individual feelings vary significantly over time and between persons. Future researchers might consider using proportionate response data and not absolute numbers. Despite using percentages, results from two participants weighed heavily on the performance outcomes. The use of technology and video for virtual teams finds support in this research; however, it has limitations, and it did not follow expectations based on earlier research.
Future research might consider unbounded qualitative research using some form of open analysis. A further alternative would be to test much larger groups with a quantitative approach and include a longitudinal component. A longitudinal design would add value given that trust may be more relevant to the beginning of team development; in the current study, the participants had some experience working as a team prior to data collection. This research would also suggest careful attention to the analysis of variance across the participants.

AUTHOR INFORMATION

Joel Olson, PhD in Human Resources from the College of Education, Colorado State University, Ft. Collins, and an MA in theology from Denver Seminary, has extensive experience in nonprofit leadership and consultation, education, and instructional design. Most recently, he has served the Reformed Church in America and the Evangelical Presbyterian Church as a consultant for churches in crisis. Currently he serves as the Leadership and Management Academic Department Chair for the School of Business and Management at Kaplan University. E-mail: jolson@kaplan.edu

Frank Appunn, Professor, holds a PhD from Capella University in Organization, Management, and Technology. He has published on technology, information security, and teams. His research considers the confluence of technology, people, and organizations, while information assurance forms another interest area. He teaches at multiple institutions, leads a leadership degree with specialist areas including technology, security, business, and project management, and chairs 20 doctoral dissertation committees. E-mail: Frank@Appunn.net

Kimberly Walters is a Professor of Human Resources at Kaplan University and holds a PhD in Industrial and Organizational Psychology. She serves the University as a course curriculum leader and as a faculty advisor for the SHRM Student Chapter. Her research interests include examining the relationship between trust and effectiveness in virtual workers and studying reality-based learning in the online classroom. E-mail: kwalters@kaplan.edu. Corresponding author.

Lynn Grinnell, Professor, holds a PhD from the University of South Florida in Curriculum and Instruction and an M.S. in Organization Business Management. She has published on sustainability management, educational measurement, and teams. Her research examines the ethical influence of individuals and teams and the measurement of learning. E-mail: Grinnell.Lynn@spcollege.edu

Chad McAllister is lead faculty at Walden University in the DBA Program and holds a PhD from Capella University in Organization and Management, with previous degrees in electrical engineering. His research interests involve issues in new product development and innovation, including virtual team performance. He serves as VP of Education for the Rocky Mountain Product Development and Management Association chapter. E-mail: chad.mcallister@waldenu.edu

REFERENCES

1. Arrow, K. (1974). The Limits of Organization. New York, NY: Norton.
2. Balthazard, P. A., Waldman, D. A., and Warren, J. E. (2009). Predictors of the emergence of transformational leadership in virtual decision teams, The Leadership Quarterly, 20(5), 651-663.
3. Bielski, L. (2004). Bucking the back to bricks trend, ABA Banking Journal, 96(11), 25.
4. Bluemink, J., and Järvelä, S. (2004). Face-to-face encounters as contextual support for Web-based discussions in a teacher education course, The Internet and Higher Education, 7(3), 199-215.
5. Chen, M., Liou, Y., Wang, C.-W., Fan, Y.-W., and Chi, Y.-P. J. (2007). TeamSpirit: Design, implementation, and evaluation of a Web-based group decision support system, Decision Support Systems, 43(4), 1186-1202.
6. Coleman, J. (1990). Foundations of Social Theory. Cambridge, MA: Harvard University Press.
7. Couzins, M., and Beagrie, S. (2004, February). How to... make the most of video conferencing, Personnel Today, p. 29.
8. Cummings, L. L. and Bromiley, P. (1996). The organizational trust inventory (OTI): Development and validation. In R. M. Kramer and T. R. Tyler (Eds.), Trust in Organizations: Frontiers of Theory and Research. Thousand Oaks, CA: Sage.
9. Deutsch, M. (1958). Trust and suspicion, Journal of Conflict Resolution, 2, 265-279.
10. Farris, G. F., Senner, E. E., and Butterfield, D. A. (1973). Trust, culture, and organizational behavior, Industrial Relations: A Journal of Economy and Society, 12(2), 144-157.
11. Frost, G. F., Stimpson, D. V., and Maughan, M. R. (1978). Some correlates of trust, Journal of Psychology, 99, 103-108.
12. Golden, T. D., and Veiga, J. F. (2008). The impact of superior-subordinate relationships on the commitment, job satisfaction, and performance of virtual workers, The Leadership Quarterly, 19(1), 77-88.
13. Greenberg, P. S., Greenberg, R. H., and Antonucci, Y. L. (2007). Creating and sustaining trust in virtual teams, Business Horizons, 50(4), 325-333.
14. Hambley, L. A., O'Neill, T. A., and Kline, T. J. B. (2007). Virtual team leadership: The effects of leadership style and communication medium on team interaction styles and outcomes, Organizational Behavior and Human Decision Processes, 103(1), 1-20.
15. Hertel, G., Geister, S., and Konradt, U. (2005). Managing virtual teams: A review of current empirical research, Human Resource Management Review, 15(1), 69-95.
16. Ioannides, Y. M., Overman, H. G., Rossi-Hansberg, E., and Schmidheiny, K. (2008). The effect of information and communication technologies on urban structure, Economic Policy, 23(54), 201-242.
17. Jarmon, L., Traphagan, T., Mayrath, M., and Trivedi, A. (2009). Virtual world teaching, experiential learning, and assessment: An interdisciplinary communication course in Second Life, Computers and Education, 53(1), 169-182.
18. Jarvenpaa, S. L., Knoll, K., and Leidner, D. E. (1998). Is anybody out there? Antecedents of trust in global virtual teams, Journal of Management Information Systems, 14, 29-64.
19. Johnson, J. T. (2004). The costs and benefits of remote workers, Network World, 21(51), 24.
20. Karpova, E., Correia, A.-P., and Baran, E. (2009). Learn to use and use to learn: Technology in virtual collaboration experience, The Internet and Higher Education, 12(1), 45-52.
21. Kleij, R. v. d., Jong, A. d., Brake, G. t., and Greef, T. d. (2009). Network-aware support for mobile distributed teams, Computers in Human Behavior, 25(4), 940-948.
22. Kramer, R. M. (1999). Trust and distrust in organizations: Emerging perspectives, enduring questions, Annual Review of Psychology, 50, 569-598.
23. Kreps, D. M. (1990). Corporate culture and economic theory. In J. Alt and K. Shepsle (Eds.), Perspectives on Positive Political Economy. New York, NY: Cambridge University Press.
24. Lurey, J. S. and Raisinghani, M. S. (2001). An empirical study of best practices in virtual teams, Information and Management, 38(8), 523-544.
25. Lewis, D., Shea, T., and Daley, T. M. (2005). The effect of virtual team membership on attitudes towards technology usage: A study of student attitudes in the United States, International Journal of Management, 22(1), 3-10.
26. Liu, X., Magjuka, R. J., and Lee, S.-h. (2008). The effects of cognitive thinking styles, trust, conflict management on online students' learning and virtual team performance, British Journal of Educational Technology, 39(5), 829-846.
27. Mayer, R. C., Davis, J. H., and Schoorman, F. D. (1995). An integrative model of organizational trust, Academy of Management Review, 20, 709-734.
28. McAllister, D. J. (1995). Affect- and cognition-based trust as foundations for interpersonal cooperation in organizations, Academy of Management Journal, 38, 24-59.
29. Miller, G. J. (1992). Managerial Dilemmas: The Political Economy of Hierarchies. New York, NY: Cambridge University Press.
30. Nakanishi, H. (2004). FreeWalk: A social interaction platform for group behaviour in a virtual space, International Journal of Human-Computer Studies, 60(4), 421-454.
31. Narayanan, S., Jayaraman, V., Luo, Y., and Swaminathan, J. M. (2011). The antecedents of process integration in business process outsourcing and its effect on firm performance, Journal of Operations Management, 29(1-2), 3-16.
32. Nydegger, R. P., and Nydegger, L. B. (2010). Challenges in managing virtual teams, Journal of Business and Economics Research, 8(3), 69-82.
33. Purvanova, R. K., and Bono, J. E. (2009). Transformational leadership in context: Face-to-face and virtual teams, The Leadership Quarterly, 20(3), 343-357.
34. Reed, A. H., and Knight, L. V. (2010). Effect of a virtual project team environment on communication-related project risk, International Journal of Project Management, 28(5), 422-427.
35. Robert, L. P., Jr., Dennis, A. R., and Hung, Y.-T. C. (2009). Individual swift trust and knowledge-based trust in face-to-face and virtual team members, Journal of Management Information Systems, 26(2), 241-279.
36. Robson, C. (2002). Real World Research (2nd ed.). Malden, MA: Blackwell.
37. Sarker, S., Valacich, J. S., and Sarker, S. (2003). Virtual team trust: Instrument development and validation in an IS educational environment, Information Resources Management Journal, 16(2), 35-55.
38. Schiller, S. Z. and Mandviwalla, M. (2007). Virtual team research: An analysis of theory use and a framework for theory appropriation, Small Group Research, 38(1), 12-59.
39. Sridhar, V., Nath, D., Paul, R., and Kapur, K. (2007). Analyzing factors that affect performance of global virtual teams, Second International Conference on Management of Globally Distributed Work, Bangalore, India, available at http://www.globalwork.in/GDW07/pdf/14-159-170.pdf
40. Tarique, I., and Schuler, R. S. (2010). Global talent management: Literature review, integrative framework, and suggestions for further research, Journal of World Business, 45(2), 122-133.
41. Thomas, D., and Bostrom, R. (2008). Building trust and cooperation through technology adaptation in virtual teams: Empirical field evidence, Information Systems Management, 25(1), 45-56.
42. Tyler, T. R. and Kramer, R. M. (1996). Whither trust? In R. M. Kramer and T. R. Tyler (Eds.), Trust in Organizations: Frontiers of Theory and Research. Thousand Oaks, CA: Sage.
43. van der Kleij, R., Lijkwan, J. T. E., Rasker, P. C., and De Dreu, C. K. W. (2009). Effects of time pressure and communication environment on team processes and outcomes in dyadic planning, International Journal of Human-Computer Studies, 67(5), 411-423.
44. Wheelen, T. L., and Hunger, J. D. (2010). Strategic Management and Business Policy: Achieving Sustainability (12th ed.). Upper Saddle River, NJ: Prentice Hall.
45. Wiggins, B. (2009). Global teams and media selection, World Conference on Educational Multimedia, Hypermedia and Telecommunications 2009, Honolulu, HI, USA.
46. Williamson, O. (1993). Calculativeness, trust, and economic organization, Journal of Law and Economics, 36(1), 453-486.