

Suggestions for international research using electronic surveys

Susana Costa e Silva, Catholic University of Portugal, Portugal*
Paulo Duarte, University of Beira Interior, Portugal

This paper aims to draw useful indications for researchers who wish to conduct international marketing research using electronic surveys. These suggestions are based on our own studies in which data on the perceptions of international companies' managers were collected via a mixed mode of surveying that combined e-mail and web-based applications. The experience was rich in problem solving and uncovered several practical details that might escape a researcher less experienced in electronic survey instruments.

Keywords: Web based surveys, Survey design, Cross-cultural research, Survey system's comparison

Introduction

The experience portrayed in this paper on the use of electronic surveys is based on a study that aimed to quantify the influence of several constructs, such as trust and risk perceptions, on the satisfaction with a specific international marketing relationship. In order to obtain the data needed, an electronic survey was conducted. The objective of this paper is to present some useful suggestions on how to plan, set up and execute an internet-based survey based on the experience obtained. Specifically, we try to highlight some suggestions to be followed and some pitfalls to be avoided in similar studies that involve, simultaneously: the use of a survey, the administration of the survey through the Internet, the difficulties associated with obtaining responses, and the administration of the survey to a sample of firms with partners from different countries.

*Correspondence details and biographies for the authors are located at the end of the article.

The Marketing Review, 2014, Vol. 14, No. 3, pp. 297-309
http://dx.doi.org/10.1362/146934714X14024779061992
ISSN 1469-347X print / ISSN 1472-1384 online ©Westburn Publishers Ltd.
The research process and major results

A survey is a particular type of data collection instrument that can be conducted through the Internet using e-mail and a web-based application form. The Internet, as a means of administering surveys, is still at a young stage (for further developments on this topic, see Fricker and Schonlau, 2002; Kiernan, Kiernan, Oyler, and Gilles, 2005; and Zhang, 1999). In fact, paper surveys dominate empirical research in international business (Yang, Wang, & Su, 2006). However, this new approach to data collection is gaining the preference of researchers along with the Internet's proliferation and the spread of its use all over the world. Thus, learning about other researchers' experiences may be regarded as an attempt to transform tacit into explicit knowledge and thereby systematise useful comprehension about this important part of an empirical study, especially in international research.

One may distinguish three stages in conducting a survey: planning and conception, setup and execution, and validation. In this paper, we focus mainly on the execution phase, because that is where concerns about the use of electronic surveys are most pertinent. However, we also give some suggestions concerning the other phases.

Planning and conception phase

At the planning and conception stage, the researcher should assess the objectives of the study and examine the literature to see how other researchers have conducted their studies, in order to identify weaknesses or limitations already reported. Likewise, in this phase, researchers should try to ensure comparability with previous studies in order to provide valuable results and conclusions. The decision to conduct an electronic survey is based upon an examination of the trade-off between the benefits and drawbacks associated with this type of survey, set against the equivalent postal survey. Table 1 synthesises some important questions that should be answered by the researcher in this initial phase when designing the survey. These questions must be answered before moving on to the setup and execution phase of the survey. Some flexibility is required in order to accommodate unanticipated problems.

Table 1 Important questions at the conception phase of a survey

Questionnaire design
What constructs are to be measured?
What words are appropriate to use?
• Are there already validated scales that can be used?
• Is it necessary to develop specific measures?
• What special care with wording should be taken?
Are questions grouped according to the constructs under analysis?
Is it possible to use Likert scales (see note 1)?
• Do all scales have the same number of points?
• Do all the anchors used reflect an improvement when options pass from 1 to 2, from 2 to 3, and so on?
• Are there negatively worded items which require special attention?
Are questions with blank spaces to fill in reduced to a minimum in order to facilitate their treatment?
Is it possible to substitute open response questions with closed response questions?
Is the form of the questionnaire appealing to respondents?
• How long is completion supposed to take?
Is there a space for observations?
Is the respondent given the possibility of identifying him/herself and/or his/her company if desired?

Pre-test
Is a pre-test to be considered?
What is the desirable number of pre-tests? How representative of the sample are they?

Internet vs. postal survey
Is it possible to weigh the advantages and disadvantages of postal and internet-based surveys according to criteria such as: time necessary, cost (see note 2), complexity of the process of developing and administering a web-based survey, and response rates (see note 3)?
• Is it viable to survey the intended sample through the Internet?
• Are there means to survey the members of the sample via web or e-mail?
• What is the age of potential respondents? Are they familiar with computers?
• Is the Internet widely used in the subject country? Is access to the Internet costly?
• Is it costly to conduct the survey through the Internet? How costly is an electronic survey when compared with a paper survey?
• How would a survey conducted through the Internet diminish the response rate when compared with its administration via post? Are response rates available for similar studies?
Is it possible to have the internet questionnaire form as similar as possible to the paper format?
In the case of an internet-based survey, is it possible to collect information on the respondent's location (city/country)? Is it possible to collect data on the Internet Protocol (IP) address (this may give some control over duplication of responses)?
Is it possible to ensure confidentiality in an internet-based survey?

Notes:
1 Likert scales are easy to deal with in quantitative analysis and avoid the problem of outliers in responses, since respondents cannot respond outside the pre-defined range of values.
2 It is important to bear in mind that the cost of programming, sending and maintaining the survey electronically should be compared with the equivalent cost of mailing the survey: production of the questionnaires, envelopes, stamps, and pre-paid mail (which is normally the case, in order to increase response rates).
3 For further developments on this subject, see Fricker and Schonlau (2002) and Archer (2003).

Choosing the online survey system

Researchers have two main options for deploying the questionnaire. They can use a survey software system that can be individually managed, such as LimeSurvey. However, to install the software, users must have their own webserver or rent a dedicated server from a hosting company that meets the software requirements (e.g., PHP and MySQL). The package then needs to be configured according to one's needs, demanding some specific knowledge and extra work to get it running. Another option is to use an available online survey system. These systems provide quicker and easier solutions for gathering data. Most of the systems available in the market offer an intuitive dashboard for the creation of survey questions within templates, and generate a URL that can be embedded in an e-mail or posted on a website. Some can even be automatically integrated in social networks like Facebook or Twitter so that direct access to the research instrument is assured. Table 2 presents a list of some of the best known online survey systems currently available, and the respective URL locations.

Before choosing the web-based survey system, the researcher should evaluate some issues. The first is the budget available. Most of the systems offer both free and paid versions. Free versions are normally more limited, so the available free features must be assessed to check whether they provide what is necessary to conduct the study. A second issue to explore is the type of question that needs to be asked. It is necessary to take a close look at the instrument to see if there are special considerations concerning some item response requisites, like drag-and-drop facility, continuous sum, table ranking or video-based questions. Nevertheless, the most common types of questions are available in almost every system. Some special ones are only available in some products, as is the case of those requiring the use of pictures, drill-down or bipolar semantic tables. Regarding this specific feature, the highest distinction goes to Qualtrics and SurveyGizmo, as they provide 32 and 38 question types, respectively. SurveyMonkey and Zoomerang are two well-known web survey systems. However, these display the lowest number of question types: just 15.
Table 2 Survey tools*

Online solutions:
2ask: http://www.2ask.net
eSurveysPro: http://www.esurveyspro.com
FluidSurveys: http://fluidsurveys.com
Google forms: http://drive.google.com
Key Survey: http://www.keysurvey.com
KwikSurveys: http://kwiksurveys.com
LimeSurvey: http://www.limesurvey.org
PollDaddy: http://www.polldaddy.com
Qualtrics: http://www.qualtrics.com
QuestionPro: http://www.questionpro.com
SurveyGizmo: http://www.surveygizmo.com
SurveyMonkey: http://www.surveymonkey.com
Wufoo: http://wufoo.com
Zoomerang: http://www.zoomerang.com

Integrated solutions:
Constant Contact: http://www.constantcontact.com
FormSite: http://www.formsite.com
Moodle: http://www.moodle.org

* Information current at September 2013.

The number of anticipated items and participants should also be considered. The free versions often have a maximum number of questions per survey (usually 10) and/or a maximum number of responses (typically 50 or 100 per survey). Some systems allow an even smaller number of responses, as is the case with QuestionPro. In the case of special features such as 'branching' (when the researcher needs to direct the respondent to different questions on the basis of their previous responses), 'piping' (when the researcher needs to prefill a question based on the respondent's responses to a previous question) or 'randomisation', the paid version must be considered, since the implementation of these features is limited or even absent in free versions. Other additional features should also be evaluated according to the requisites of the investigation, since it is also unlikely that free versions allow, for instance, controlling the time spent completing the survey. The export functionality is also limited in the free versions. Although the majority of systems provide the user with some basic and innovative analyses and reports, the researcher may need to export data to statistical software packages to perform advanced data analysis. Almost all systems export data in CSV and Excel formats, but some provide special file types, such as SPSS-compatible ones.

Finally, the need to customise the questionnaire should also be taken into account. Different systems offer different options for creating custom URLs, for adding images (e.g., logos), and for creating colour schemes. In the case of international surveys, it is also important to check whether there is the option to have the questionnaire translated into several languages and to send personalised links according to the geographic location of the respondent. Multiple languages are not usually supported, not even in paid systems.

An interesting online tool is the totally free KwikSurveys system. This online survey system offers unlimited questions and responses, plus full results export. However, it has only seven types of questions and very limited logic ability, lacking all the advanced features existing in other systems, such as a 'save and continue later' option. For some simple projects, Google forms can also perform the task well. Although not regarded as a full survey system, it deserves to be mentioned due to its availability, simplicity, low cost and the unlimited number of forms and responses. With only seven types of questions, customised themes, and section and page break features, Google forms allows the creation of quick and simple forms that can be used as surveys. Nevertheless, it lacks almost all the functions of a complete survey system, such as piping, the use of logic, randomised questions, an adequate number of question types, and the option to customise questions.

Setup and execution phase

Before moving on to this phase, it is assumed that a decision has been made to implement an electronic survey and that the researcher is, at this point, confronted with the tasks associated with the implementation of the survey itself. Several assignments are identified in the execution phase, as follows.
According to Dillman (2000), a "tailored design method" of surveying may be appropriate when the Internet is to be used or internet-related behaviours are the focus of the study. This means that mixed modes of surveying may be adequate considering the context, in an attempt to harmonise the different modes and procedures to the different survey situations. From the authors' experience with databases of international marketing relationships, it is possible to draw up a list of the names of managers and their respective e-mail contacts, but in this case particular attention and care must be taken to respect data protection laws. However, one should not rely too much on responses collected via e-mail, as there is a likelihood of technical incompatibilities between the programmes used by the sender and receiver to process text and e-mail messages. Thus, we suggest it is better to combine e-mail and the web for sending the survey and collecting the responses, respectively.

The web is considered a meeting place like any other central location, and therefore a sample based on the internet is considered to be as good as others (Aaker, Kumar, & Day, 2004). In a recent study carried out by the authors, the web was not used as a point of contact because that strategy would not have allowed the target sample to be reached: managers of companies involved in international alliances. Thus, only the managers who received a customised e-mail were directed to the web platform from where the survey could be completed. We also verified that, when needed, some contact names and/or e-mails could be confirmed by telephone. Hence, a third contact method was added in the initial phase, though only to confirm some names. However, there are cases in which the telephone can also be used in the pre-announcement and reminder phases of the survey.
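One practical way to direct only the invited managers to the web platform is to give each e-mail address its own unguessable survey link, which also helps trace submissions back to invitations later. The sketch below is illustrative only; the function name, addresses and URL are our assumptions, not details from the authors' study:

```python
import secrets

def personalised_links(emails, base_url):
    """Generate one unguessable survey link per invited respondent.

    A per-respondent token lets the platform admit only invited managers
    and later match each submission to its invitation.
    """
    return {email: f"{base_url}?k={secrets.token_urlsafe(16)}"
            for email in emails}

# Hypothetical contacts and survey address, for illustration only.
links = personalised_links(["m1@example.com", "m2@example.com"],
                           "https://survey.example.org/s")
```

The token doubles as a duplicate-detection key, complementing the IP-based checks discussed later in the execution phase.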
After defining the sample parameters and the database or databases to be used, it is necessary to confirm that there are enough elements with an e-mail address in order to conduct an e-mail survey. At this point, the researcher has to be aware of the errors that a mixed mode of survey like this may produce, which are basically the same as in other types of survey: sampling, coverage, measurement, and non-response. The first derives from the fact that we are surveying a sample and not the population itself (for further developments on this type of error, see, for example, Couper, 2000, and Grandcolas, Rettie, and Marusenko, 2003). In the authors' case, from the initial database, a frame was drawn based on the sampled individuals who had an e-mail address, a valid procedure that Couper (2000) labelled a "list-based sample of high-coverage population". Coverage error (the mismatch between the target population and the frame population), measurement error (the deviation of the respondents' answers from their true value on the measure), and non-response error (arising because not all people included in the sample are willing or able to complete the survey) are not within the objectives of this paper. For more details on these errors, see, for example, Aaker et al. (2004); Chisnall (2001); Couper (2000); Grandcolas et al. (2003); Groves (1989); Hair, Anderson, Tatham, and Black (1998); and Manfreda, Batagelj, and Vehovar (2002).

The text of the e-mail needs to captivate the potential respondent; it is on this basis that he/she will decide whether or not to answer the questionnaire. Thus, besides being personalised, it should be appealing, short and informative. It should contain information on who is conducting the study, what its purposes are, why the respondent's input is needed, and the importance of his/her contribution. A note on confidentiality and on the time needed to complete the questionnaire should also be emphasised here. The link to the survey itself should be embedded in the text. Further, it is desirable that the domain name belongs to an institution, since this may enhance the credibility of the study. Thus, if the survey is outsourced to a service provider (in our case, outsourcing the webpage creation was chosen over creating it with university means, being more advantageous in terms of cost and time), the link in the e-mail should indicate the name of the institution and be redirected. The association with a server that bears the name of an institution brings credibility to the study, which may lead to an increase in the response rate.

The web survey should be carefully designed and thoroughly tested on different browsers. According to Couper (2000), the survey instrument must be easy to understand and to complete, must be designed to keep respondents motivated to provide optimal answers, and must serve to reassure respondents regarding the confidentiality of their responses. The design of the web survey, such as the placement of the questions, the flow of the instrument, and typographical features, is very important to consider in order to improve its quality and thereby minimise measurement errors. This is particularly important in electronic surveys, where response rates are known to be lower (Fraze, Hardin, Brashears, Haygood, & Smith, 2003) and the decision not to respond is likely to be made more quickly (Fricker & Schonlau, 2002). However, some factors may favour responses. The authors suggest the following:

• an appealing graphic layout, which includes, for instance, restraining the use of colour in order to maintain readability, ensuring a good resolution, restraining the screen to 800x600 pixels to fit older and smaller screens, and using scroll-down options, check boxes and radio buttons;
• a readable way of presenting questions, preferably in a conventional format similar to the traditional paper design, with questions grouped according to theme;
• a restricted number of questions per page, since placing a large number of questions on one single page could cause errors and discourage the respondent (the same applies to the number of answer choices);
• the possibility for the respondent to save the responses given and complete the survey at a later date if necessary; and
• the ability to print the survey.

Another advantage associated with web-based surveys is the ability to inhibit respondents from moving forward in the survey without full completion of previous questions. Thus, responses become compulsory in the sense that a message is displayed each time the respondent wants to skip a question. This avoids missing values in some questions, as frequently happens in paper surveys. This, associated with the use of Likert scales, limits the problem of univariate outliers (for further developments on this subject, see, for instance, Hair et al., 1998).

Responses should be collected in a format that allows subsequent export to a known file format, such as Microsoft Access, Excel, SPSS or the general CSV (comma/tab delimited) file.
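Once responses have been exported, for example as CSV, they can be loaded directly for analysis. A minimal sketch follows; the column names and file contents are invented for illustration, and real exports differ between survey systems:

```python
import csv
import io

# Invented sample of a CSV export from an online survey system;
# the column layout varies between products.
raw = """respondent_id,country,q1_trust,q2_risk
1,PT,5,2
2,ES,4,3
3,PT,3,4
"""

def load_responses(text):
    """Parse a CSV export into a list of dicts, casting scale items to int."""
    rows = []
    for row in csv.DictReader(io.StringIO(text)):
        for item in ("q1_trust", "q2_risk"):
            row[item] = int(row[item])  # Likert responses arrive as strings
        rows.append(row)
    return rows

responses = load_responses(raw)
```

The same records can then be passed on to a statistical package; the point is that an electronic survey delivers data already in machine-readable form, with no data-entry step.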
Including extra information on the place (country and/or city) from which the questionnaire was submitted, and the corresponding IP address, should also be considered. This may be important to ensure that there were not multiple responses coming from the same respondent. We verified twice in our studies that respondents started responding to the questionnaire, stopped without fully completing it, and then submitted it again, which meant starting all over. These mistakes were easily identified because the IP addresses were the same. Other methods are to employ cookies and individual key-based links. Another aspect to be considered at this point is the option of limiting the maximum number of characters in responses.

Execution

The process of sending e-mails through an e-mail programme should be carefully thought out. There are mail merge software programmes that can be used to customise e-mails in order to reduce the chances that the e-mail will be considered unsolicited by the respondent company's server. One of the most effective methods to increase response rates and avoid the spam filter problem is to obtain sponsorship. This has happened in previous studies, for instance in Delerue (2004) and in Porporato (2005), where trade associations or other organisations that regulate the selected sector sent the questionnaires or allowed the use of their institutional name. This strategy, along with others, such as the offer of certain incentives to respond, may produce very encouraging response rates. A good example of an incentive was the offer to respondents of an annual subscription to the car manufacturers' trade association magazine, as happened in Porporato (2005). In any case, the e-mail should have a friendly and polite approach, without too much information but, at the same time, with enough to let the receiver know what the survey is all about.
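The mail-merge step described above can also be done with a few lines of code rather than dedicated software. The template wording, names and link below are placeholders, not the text used in the authors' study; real wording should follow the advice above: short, personalised and informative, with a confidentiality note:

```python
from string import Template

# Placeholder invitation text; substitute real study details before use.
TEMPLATE = Template(
    "Dear $name,\n\n"
    "We are studying international marketing relationships at $institution. "
    "Your answers are confidential and the questionnaire takes about "
    "$minutes minutes to complete:\n\n$link\n"
)

def merge(contacts, institution, minutes, link):
    """Return one personalised e-mail body per contact address."""
    return {c["email"]: TEMPLATE.substitute(name=c["name"],
                                            institution=institution,
                                            minutes=minutes,
                                            link=link)
            for c in contacts}

bodies = merge([{"name": "A. Silva", "email": "a.silva@example.com"}],
               "Example University", 10, "https://survey.example.org/s/abc")
```

Sending each body individually, rather than one message with many recipients, is what reduces the chance of the invitation being flagged as bulk mail.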
The researcher must be as sure as possible of the validity of the e-mail addresses and, if there is the opportunity to send customised e-mails, of the name of the person to whom the e-mail with the survey link is addressed. Mondays, Fridays, the end of the day, proximity to holidays, and critical periods such as the tax season should be avoided. In the first case, the respondents may be confronted with a large number of messages on Monday morning, reducing their willingness to answer. In the other cases, people are off work, and "out-of-office" reply messages are frequent. Responses are less likely due to an overload of priority tasks, particularly if respondents are going away. In any case, it is advisable that follow-up contact be made after the person to be contacted has returned. If the contact information of a substitute person is provided, the request can be redirected to that person. It is also possible that some reply messages may be received indicating that the target person is no longer employed at the company. In this case, and to obtain as many responses as possible, a personalised message should be sent asking for the name of the new person in charge.

We should also note that reminders are associated with an increased response rate. Reminding respondents is a common practice in surveys, with the intention of raising response rates (Dillman, 2000). This should be done two weeks to one month after the e-mails are sent. Less than two weeks does not give enough time to those who wish to respond but, due to time constraints, need a longer period to do so. But it should not take too long either; otherwise respondents may forget what the survey was about. The reminder notice should contain all the information needed, so that a respondent who wants to respond immediately can do so without having to look for the initial e-mail.
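The timing advice above (avoid Mondays and Fridays; send the reminder roughly two weeks after the mailing) is easy to mechanise. A sketch, assuming weekends should also be skipped, which is our addition rather than a point the paper makes:

```python
from datetime import date, timedelta

# weekday() numbering: Monday = 0 ... Sunday = 6
AVOID = {0, 4, 5, 6}  # Monday, Friday, Saturday, Sunday

def next_send_date(start):
    """First day on or after `start` that is not a Monday, Friday or weekend."""
    d = start
    while d.weekday() in AVOID:
        d += timedelta(days=1)
    return d

def reminder_date(sent_on, delay_days=14):
    """Reminder about two weeks after the mailing, shifted to an allowed day."""
    return next_send_date(sent_on + timedelta(days=delay_days))
```

Holidays and country-specific critical periods would still have to be excluded by hand, since they vary across the international sample.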
In the authors' experience, a peak of responses was received on the day on which reminders were sent. However, the authors believe that if more than one reminder is sent, the second should be sent only to those people in the sample who have not yet responded. In fact, a respondent who has already filled out the survey may understand receiving one reminder but will probably consider it unreasonable to receive a second. In any case, no more than three attempts at contact should be made. Thus, if a pre-announcement has already occurred in the initial phase of the survey, then just one reminder should be sent, in order to avoid contact overload.

The authors' experience showed that the researchers' social network of contacts may be highly useful in obtaining more responses. In fact, it was verified that advertising the survey to possible respondents, or to people who might know the respondents, is useful. Thus, "snowball sampling" may be a valuable resource for increasing response rates. This resource is valuable not only in stimulating responses but also in obtaining the names of potential respondents to whom the questionnaire might be sent.

Validation phase

It is important to create a special local folder to collect the e-mails that are being sent and received, in order to avoid overloading the server. An accurate record should be kept of all the e-mails sent, e-mails returned, and e-mails requiring a personalised reply or any kind of special attention. This information may be precious in the future, especially for determining the response rate. In fact, the final sample should be composed only of delivered e-mails (Swoboda et al., 1997, cited in Zhang, 1999). Thus, it is necessary to keep an updated record of all the e-mails that were not delivered. According to Zhang (1999) and Grandcolas et al. (2003), there is a high rate of e-mail address turnover, which makes the updating process very important.
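The bookkeeping described here, where the final sample consists only of delivered e-mails, reduces to a simple set computation. A sketch with invented addresses:

```python
def response_rate(sent, bounced, responded):
    """Response rate over delivered e-mails only, in line with basing the
    final sample on delivered messages (Swoboda et al., 1997)."""
    delivered = set(sent) - set(bounced)
    answered = set(responded) & delivered
    return len(answered) / len(delivered) if delivered else 0.0

# Invented example: four invitations, one bounce, one response.
sent = ["a@x.example", "b@x.example", "c@x.example", "d@x.example"]
bounced = ["d@x.example"]
responded = ["a@x.example"]
rate = response_rate(sent, bounced, responded)
```

The same sets directly give the reminder list: every delivered address not yet in `responded`, which is exactly who a second reminder should target.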
It is also necessary to verify the address if, in some cases, the server of the potential receiver sends "undelivered message" notifications more than once. This is due to the fact that the server keeps trying to deliver the message for a certain period of time, instead of attempting to deliver it just once. This needs to be taken into account when calculating the number of elements of the sample considered unreachable. The unreachable e-mails need to be sorted out: one needs to know why they did not reach the initial e-mail address.

At this point, it is important to verify that there are no double responses, meaning that the same respondent has responded twice. The procedures discussed in the previous section (IP based, cookie based) can be extremely useful in identifying duplicates. While this is not a normal occurrence, it may happen if there was a problem during submission of the questionnaire, or if the respondent had to interrupt the process before having completed the survey. In these cases, one of the questionnaires should be eliminated. It may also be the case that a question has been left unanswered or that a misunderstanding has occurred. In these situations, and when a telephone number or an e-mail address has been provided voluntarily, it is admissible to contact the respondent in order to obtain the remaining information needed.

Another important task to perform at this stage is data cleaning, which means the removal of random and systematic errors from data elements through filtering, merging and translation (Hair et al., 1998). At this point of the study, the researcher looks for errors in data entry. Some errors may be corrected if they are due to mistakes in data entry or to the use of non-standardised codes.
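Duplicate submissions of the kind described (the same IP address, one abandoned attempt and one complete resubmission) can be filtered mechanically. A sketch that keeps the submission with the fewest unanswered items; real studies may also compare cookies, individual link keys or timestamps:

```python
def drop_duplicates(submissions):
    """Keep one submission per IP address, preferring the most complete one."""
    def completeness(s):
        # Count answered items; None marks an unanswered question.
        return sum(1 for v in s["answers"] if v is not None)
    best = {}
    for s in submissions:
        if s["ip"] not in best or completeness(s) > completeness(best[s["ip"]]):
            best[s["ip"]] = s
    return list(best.values())

# Invented example: an abandoned attempt followed by a full resubmission.
submissions = [
    {"ip": "198.51.100.7", "answers": [4, None, None]},
    {"ip": "198.51.100.7", "answers": [4, 5, 3]},
    {"ip": "203.0.113.9",  "answers": [2, 2, 4]},
]
clean = drop_duplicates(submissions)
```

Flagged pairs should still be inspected by hand, since two respondents can legitimately share one IP address behind a corporate network.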
The decision may involve the deletion of data, but this is never a simple decision, since the researcher has already expended considerable time and effort collecting the data and is not likely to be eager to toss some out (Tabachnick & Fidell, 1989). Normally, it is done when respondent morbidity is identified, which happens when, for instance, the respondent ticks only neutral positions across the entire questionnaire. The next step is to verify the need for transformations and/or recoding of variables (for further developments, see, for instance, Coakes and Steed, 2003, and Tabachnick and Fidell, 1989). Recoding may be needed where negatively worded items have been used, and it is very important that researchers do not forget to recode these items. Otherwise, all the statistical inference may be based on assumptions that are in fact exactly the opposite of what the items mean. In any case, with the exception of missing data, most of the problems identified at this stage of the research may also occur in paper surveys. Nevertheless, we must emphasise that electronic surveys may have the advantage of avoiding these problems, not only because of the elimination of missing data but also because the data are collected in a friendly format, which allows for immediate analysis.

If data are to be analysed through multivariate statistical techniques, the next procedure would be the analysis of normality. For this, graphical analysis and statistical tests are usually preferred to infer normality and the necessary transformations. This is easier if the data are already ready to be analysed; in paper surveys, the extra work of data entry is needed before analysis can occur. Nevertheless, there are also disadvantages associated with electronic surveys that have to be considered. The most important are the spam label, the smaller response rates and biased samples.

Implications and conclusions

We have seen that conducting a survey is, per se, a very difficult task. Conducting it through the Internet, despite the multiple advantages that we have attempted to identify here, also produces difficulties, which we have tried to address. From our experience, the initial phases are critical to the success of the research process and to obtaining useful and valuable conclusions. Addressing the right questions in the right way helps ensure the quality of the responses and, in consequence, the results of the investigation. Choosing the right tools to design and disseminate the survey is very important in order to prevent and mitigate previously identified issues and to obtain a good response rate. Despite the importance of these simple tasks, many studies keep ignoring them. Many researchers are extensively guided by results and ignore important issues such as the right phrasing of sentences or the periods to avoid because of holidays. These are simple suggestions that can avoid significant difficulties later.

The Internet is indeed a very useful survey instrument, with several advantages over the traditional paper survey. However, the response rate associated with electronic surveys is still very low. In fact, we verified that while this type of approach is out of the question in countries where internet penetration is still very low, it is also problematic in countries with high usage rates, where the Internet is used for the disclosure of all kinds of information, leading people to discredit studies conducted via the web. It is also important to note that many people use the Internet largely for entertainment and recreation, which may contribute to the low response rates. More empirical results on the use of the web for surveys are still necessary, as well as studies of the effects of the Internet on response rates.
However, it is undeniable that the Internet is opening up new avenues in surveying (Fricker & Schonlau, 2002) and that its potential, when correctly used, is unquestionable. From our experience, we conclude that every survey has its own characteristics, and that the best approach will depend on the attributes of the target population, the sample, the object of study, the means necessary to obtain the information (which vary from country to country), and the characteristics of the survey instrument itself. A well-designed survey will increase the chances of obtaining responses, but the correct definition of the potential population, a good sampling approach and an appropriate coverage procedure are also fundamental in obtaining the data. For this to happen, local knowledge of the specific target population, and of the corresponding means of approaching it, is fundamental. This includes an awareness of existing databases and their updating processes, consciousness of the advantages of modern surveying methods, such as electronic surveys, and acquaintance with the local/industrial institutions that may assist in the channelling/sponsorship of the survey.

It is also important to consider the possibility of delegating some of the work that a survey study requires. In some instances, particular tasks requiring specific knowledge, such as web survey production, are more efficiently performed by specialists. In other cases, more generalised tasks, such as the preparation of contact lists and telephone confirmations, can also be delegated. Flexibility and dedication are of paramount importance, mainly because an internet-based survey may produce unexpected problems and demand a prompt and flexible response. Finally, and equally important, we would like to draw attention to the importance of having international teams in cross-cultural research, since this may increase the chances of better adapting the survey to cultural and other local idiosyncrasies.
Local knowledge can be more readily obtained when the researcher is based in, and knows well, the field where the survey is to be applied.

References

Aaker, D.A., Kumar, V., & Day, G.S. (2004). Marketing research (8th ed.). Hoboken, NJ: John Wiley & Sons.
Archer, T.M. (2003). Web-based surveys. Journal of Extension, 41(4), 1-5.
Chisnall, P. (2001). Marketing research. London: McGraw-Hill.
Coakes, S.J., & Steed, L.G. (2003). SPSS: Analysis without anguish: Version 11.0 for Windows. Sydney: John Wiley and Sons Australia, Ltd.
Couper, M.P. (2000). Web surveys: A review of issues and approaches. Public Opinion Quarterly, 64(4), 464-494. http://www.jstor.org/stable/3078739
Delerue, H. (2004). Relational risks perception in European biotechnology alliances: The effect of contextual factors. European Management Journal, 22(5), 546-556. doi: 10.1016/j.emj.2004.09.012
Dillman, D.A. (2000). Mail and internet surveys: The tailored design method (2nd ed.). New York: John Wiley & Sons.
Fraze, S., Hardin, K., Brashears, T., Haygood, J., & Smith, J.H. (2003). The effects of delivery mode upon survey response rate and perceived attitudes of Texas AgriScience teachers. Journal of Agricultural Education, 44(2), 27-37.
Fricker Jr., R.D., & Schonlau, M. (2002). Advantages and disadvantages of Internet research surveys: Evidence from the literature. Field Methods, 14(4), 347-367. doi: 10.1177/152582202237725
Grandcolas, U., Rettie, R., & Marusenko, K. (2003). Web survey bias: Sample or mode effect? Journal of Marketing Management, 19(5-6), 541-561. doi: 10.1080/0267257X.2003.9728225
Groves, R.M. (1989). Survey errors and survey costs. New York: John Wiley & Sons.
Hair, J.F., Anderson, R.E., Tatham, R.L., & Black, W.C. (1998). Multivariate data analysis (5th ed.). New Jersey: Prentice Hall International.
Kiernan, N.E., Kiernan, M., Oyler, M.A., & Gilles, C. (2005). Is a web survey as effective as a mail survey?
A field experiment among computer users. American Journal of Evaluation, 26(2), 245-252. doi: 10.1177/1098214005275826
Manfreda, K.L., Batagelj, Z., & Vehovar, V. (2002). Design of web survey questionnaires: Three basic experiments. Journal of Computer-Mediated Communication, 7(3), 1-31. doi: 10.1111/j.1083-6101.2002.tb00149.x
Porporato, M. (2005, August 7-10). Uses of management control systems in international equity joint ventures: An empirical study of its monitoring and coordination role. Paper presented at the Annual Meeting of the American Accounting Association, San Francisco, California.
Tabachnick, B.G., & Fidell, L.S. (1989). Using multivariate statistics (2nd ed.). New York: Harper & Row.
Yang, Z., Wang, X., & Su, C. (2006). A review of research methodologies in international business. International Business Review, 15(6), 601-617. doi: 10.1016/j.ibusrev.2006.08.003
Zhang, Y. (1999). Using the Internet for survey research: A case study. Journal of the American Society for Information Science & Technology, 51(1), 57-68. doi: 10.1002/(SICI)1097-4571(2000)51:1<57::AID-ASI9>3.0.CO;2-W

About the Authors and Correspondence

Susana Costa e Silva holds a PhD in Marketing from University College Dublin (Ireland) and an MSc in Economics from the University of Porto (Portugal). She is currently Professor of Marketing at the Catholic University of Portugal (Porto), where she is the Director of the Marketing Department and a researcher in the fields of international marketing, cooperation, social marketing and trust. She is also a visiting teacher at several universities in Brazil and Macao. Susana has authored and co-authored several books and book chapters in Portugal, the UK, Brazil and the USA. She has also published articles in scientific journals in the fields of international management and international marketing, and she is on the editorial board of several international journals.
She was awarded the Best International Marketing Paper award at the EIBA (European International Business Academy) Conference in 2006 and, since 2013, she has been EIBA's National Representative for Portugal.

Corresponding author: Susana Costa e Silva, Universidade Católica Portuguesa, Faculty of Economics and Management, Rua Diogo Botelho 1327, 4169-005 Porto, Portugal. E-mail: ssilva@porto.ucp.pt

Paulo Duarte is Professor of Marketing at the Business and Economics Department and Dean of the Master in Marketing degree at the University of Beira Interior, Portugal. Prior to receiving his PhD in Management at the University of Beira Interior, he held a senior marketing position in a fast-moving consumer goods distribution company. Paulo is a researcher in the fields of consumer behaviour, satisfaction, brand management, international careers and firms' internationalisation, and has published several articles and co-authored several book chapters on these topics. He is also on the editorial board of several international journals.

Paulo Duarte, University of Beira Interior, NECE - Research Unit in Business Sciences, Faculty of Human and Social Sciences, Estrada do Sineiro, Edifício Ernesto Cruz, 6200-209 Covilhã, Portugal. E-mail: pduarte@ubi.pt