Web Site Usability in Higher Education

Sandra Christoun
schristoun@bridgew.edu

Hope Aubin
hopeaubin@butlerautomatic.com

Christine Hannon
christine.hannon@stopandshop.com

Robert Wolk
rwolk@bridgew.edu

School of Management & Aviation Science
Bridgewater State College
Bridgewater, Massachusetts 02345 USA

Abstract

The use of Web site technology in higher education presents a challenge, and measurement of Web site usability requires continual analysis. This research investigates students' overall satisfaction with a College's Web site. A research team designed and administered an on-line Likert scale survey to measure student satisfaction with the College Web site's technology, usability, aesthetics, and content. The researchers used an on-line survey application to reach a large audience with fast and inexpensive delivery. The responses collected represented a significant sample of the College's student population: over ten percent of the total college population responded to the survey. The on-line survey results indicate that 89.4% of respondents agreed or strongly agreed that they are, overall, satisfied with the Web site. A test and retest conducted in three classrooms two weeks later supported the validity of the original findings. The instrument and methodology employed provide a benchmark for other institutions of higher learning that need to examine their Web site usability.

Keywords: higher education, web site, technology, usability, online survey, aesthetics

1. INTRODUCTION

Research conducted during the Fall 2004 semester at one of southeastern Massachusetts' premier public institutions of higher education measured and evaluated how well the institution is meeting the technological demands of today's student. Alana Klein (2005), writing in University Business, considers the Web the single most important tool in the college experience. Institutions must appeal to the Web-surfing savvy of experienced and first-time users alike as they create their Web sites (New Chalk, 1996). Designing Web site services with the overall educational experience in mind is a challenge for all institutions. Klein (2005) stated that rather than pumping money into hard-copy materials, higher education should focus on improving its Web sites.

In cooperation with the College's IT Department, a research team designed and administered a survey to a sample population to help the institution determine how satisfied students are with the College's Web site and technological services. The College Web site has multiple purposes: to attract and assist students, and to help stakeholders stay connected. This paper discusses the technology, usability, aesthetics, content, and overall satisfaction associated with the College Web site and highlights student recommendations.

This institution is the fifth largest of the state's 29 public colleges and universities and sits on a 235-acre campus. The campus comprises both residential (29%) and commuter students. There are 9,731 full-time and part-time students, with the majority between the ages of 17 and 24. The College has 261 full-time faculty (91% with terminal degrees), a 20:1 student/faculty ratio, and offers a broad range of undergraduate and graduate programs (College Factbook, 2004).
Overall Student Population (9,731):
* 63% women
* 37% men
* 6% students of color
* 69% full-time
* 88% matriculated

Undergraduate (7,753):
* 60% women
* 40% men
* 7% students of color
* 82% full-time
* 96% matriculated

Graduate (1,978):
* 75% women
* 26% men
* 3% students of color
* 20% full-time
* 60% matriculated

In general, most college Web sites offer the basic services for students to gain information, contact faculty and staff, and access email. The EDUCAUSE Center for Applied Research (ECAR), a nationally recognized center, concluded in a recent study that students use technology more for educational purposes than for any other reason, followed by communication, management of classroom activities, and presentation of assignments (Caruso, 2004). EDUCAUSE is a nonprofit association whose mission is to advance higher education by promoting the intelligent use of information technology (Caruso, 2004). ECAR also found that the greatest benefit of using technology in the classroom is convenience (Caruso, 2004). The way a student connects with his or her surroundings can be the difference between academic success and failure.

The College's IT Department has designed many Web-based services with the student experience in mind. The homepage makes it easy for students to navigate the Web site by organizing information into several categories: prospective students, current students, faculty, staff, alumni, donors, visitors, and parents. The homepage also offers quick links to the most commonly used services, such as Web mail, Blackboard (a virtual classroom tool), and InfoBear (a student account and registration resource featuring on-line registration and transcript information, available course sections, and on-line grade accessibility). Links to the College catalog, admissions office, calendars, clubs and organizations, news, campus events, library databases, and a virtual campus tour can also be found on the homepage. The Web site provides the technology and information necessary for today's students to take full advantage of the services and academic offerings of the College.

In addition to meeting students' needs via the College Web site, the campus is 100% wireless, one of the first and largest such campuses in New England. This gives users ubiquitous access to the Web and e-mail without the restriction of wires. In addition to maintaining an outstanding Web site, the College recognizes the importance of developing technology in all areas of campus life and has designed technological applications to aid students in their quest for information. For a complete list of the College's technological applications, see Appendix A.

2. LITERATURE REVIEW

The importance of the College Web site cannot be overstated; it remains vital to the college experience. Researchers conducting a study at Western Idaho University concluded that a Web site that presents too many navigation challenges will not be effective (Carter, Couch, Frobish, Martin, 2003), and that making links visible and accessible from the homepage is the best practice (Bitler et al, 2000). According to the article "Academic Affairs Online: A Survey of Information Available on Websites in Higher Education," the homepage is the "technological emblem of the institution and an invaluable informational resource" (Bitler et al, 2000). Institutions must continue to strive to create Web sites in an efficient format that meets the needs of today's consumer, the student.
This institution clearly recognizes the importance of the homepage. The rapid growth of technology has changed the way we live, teach, and learn. According to the National Council for Accreditation of Teacher Education (NCATE), students need new skills in the workplace (Tsang-Kosma, 2004). The business world demands that institutions prepare students who are not only skilled, effective problem solvers able to apply what they have learned, but who can also use technology effectively in the global market. "Thus the challenges and educational goals for schools should focus on creating appropriate learning environments that integrate technology as well as foster the needed skills to empower students" (Tsang-Kosma, 2004).

Web Usability

In a recent study, Zaphiris and Ellis (2004) defined Web usability as the requirement that "anyone using any kind of Web browsing technology must be able to visit any site and get a full and complete understanding of the information, as well as have the full and complete ability to interact with the site if that is necessary." The ECAR study supports this definition, concluding that students use technology for writing documents (99.5%), e-mail (99.5%), surfing the Internet for pleasure (97.2%), and classroom activities (96.4%). Studies in higher education have focused on technology in the classroom and do little to assess Web usability in other areas (Bitler et al, 2000).

A usability inspection and test should be conducted after a site has been designed and implemented. The application of these inspection and test methods identifies problems in the design and leads to improvements in five critical usability characteristics: 1) learnability, 2) efficiency, 3) memorability, 4) low error rate, and 5) satisfaction (Holzinger, 2005). Such post-implementation findings, however, can lead to costly changes. The usability inspection and test methods outlined in "Usability Engineering Methods for Software Developers" (Holzinger, 2005) form a set of methods that check and improve the usability of a Web site against established standards. These inspection methods do not involve the end user; instead, usability specialists and analysts walk through the system to judge and discuss elements of the site and how they conform to a set of usability principles.

A more effective method may be to involve the end user. Holzinger states, "testing with end users is the most fundamental usability method and is in some sense indispensable" (2005). This method provides direct information about how people use a system and their exact problems with a specific interface. Other methods for evaluating usability are thinking aloud, field observation, and questionnaires (Holzinger, 2005). Questionnaires are one of the most common methods for testing usability. "This is especially true for issues related to the subjective satisfaction of the users and their possible anxieties, which are difficult to measure objectively" (Holzinger, 2005). The research team worked with the College Web team to create an on-line questionnaire. The College recognizes the need for assessment and found the research team's survey results a valuable tool. A stronger approach would combine several inspection and test methods to ensure complete usability improvement.
M. Levi and F. Conrad (1996), authors of "A Heuristic Evaluation of a World Wide Web Prototype," state that "A heuristic evaluation, along with other inspection methods, differs from more conventional empirical usability testing in significant ways: evaluators are not drawn from the user community, evaluations take less time, evaluations are easier to set up and run, and evaluations cost less." Since the College's site continues to evolve to keep up with the needs of its students, it is important to conduct more surveys at colleges and universities to obtain a better assessment of Web usability in higher education.

3. METHODOLOGY

This study has two purposes. The first was to measure and evaluate how well the College Web site meets the technological demands of today's student population. The second was to measure how well the College's IT Department meets its mission statement: "The purpose of the College Web site as it pertains to students is to provide the tools and information necessary for today's students to take full advantage of the services and academic offerings of the College." The research team analyzed technology, aesthetics, usability, and content using a sample representative of the total population of the College's students. For the survey, the College's Web site itself became the feedback tool. Students answered questions about satisfaction with the College Web site and technology applications.

Survey Implementation

The research team outlined a strategy to administer an on-line survey using a five-point Likert scale to measure overall student satisfaction with the College's Web site. A five-point Likert scale was chosen because it has proven to provide enough depth without too many options, yielding a better response rate and better test-retest validation. Likert scales are among the most common formats for questionnaires, and the mean of the responses can be easily calculated. By reviewing trends and performing statistical analysis on this type of scale, it is possible to ascertain whether respondents' opinions change from survey to survey by comparing means (Kelly, 1999). For example, the survey could be conducted again next year, after changes have been made to the Web site, and the data compared to see what still needs to be improved.
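As an illustration of this comparison of means, the following Python sketch scores a single Likert item for two survey waves. It is a minimal sketch, not the study's actual analysis (which used SPSS); the 1-5 item coding and the sample responses are assumptions for illustration.

```python
# Minimal sketch: compare mean scores on one Likert item across two
# survey waves. Responses are hypothetical, not the study's data.
def likert_mean(responses):
    """Mean of responses coded 1 (Strongly Disagree) .. 5 (Strongly Agree)."""
    return sum(responses) / len(responses)

# Hypothetical responses to "Overall, I am satisfied with the web site"
fall_2004 = [5, 4, 4, 3, 5, 4, 2, 4, 5, 4]
fall_2005 = [5, 5, 4, 4, 5, 4, 3, 4, 5, 5]

print(f"Fall 2004 mean: {likert_mean(fall_2004):.2f}")
print(f"Fall 2005 mean: {likert_mean(fall_2005):.2f}")
print(f"Change after site revisions: "
      f"{likert_mean(fall_2005) - likert_mean(fall_2004):+.2f}")
```

A shift in the mean between waves would indicate whether satisfaction on that item improved or declined after changes to the site.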
One of the most important advantages of using on-line survey technology is the ability to reach a large audience with fast and inexpensive delivery. The research team met with members of the College Web team to evaluate aspects of the Web site and received a copy of an earlier survey that the IT Department had conducted on the College Web site, to determine areas previously studied. The team also researched other surveys used by businesses and educational institutions to measure technology, aesthetics, usability, and content. Researchers at Baruch College in New York City, conducting a study on Web site usability, found comparable results between business and educational Web sites (Baruch College, 2003). Fidelity Investments researchers conducted a comparison of five questionnaires for assessing Web site usability with 123 participants, including one survey developed at the University of Maryland (Tullis, Stetson, 2004). The research team contacted the Chief Information Officer of the College to gain permission to conduct the survey. In an effort to increase the response rate, the team obtained prizes to offer as incentives to survey participants. The prizes included gift certificates to the college bookstore and dining credits for the college cafeteria.

To expedite the survey, the team developed a brief questionnaire using the five-point Likert scale so that survey-takers could complete it in a timely manner, reducing the number of dropouts midway through the survey. After several revisions and a pilot study conducted with College staff and the Web team, the researchers submitted the final survey to the College Web team for approval. The first eight questions covered the demographics of participants: gender, enrollment status (part- or full-time), commuter or resident, undergraduate or graduate, class year and age group, major, how often the site is used, and what the site is used for. The remaining twelve questions examined technology, aesthetics, ease of use, and content. See Appendix B for a complete list of survey questions, and Section 5 for the survey's findings.

Registered students received an email promoting prizes for participation in the on-line survey. The College Web site posted the final survey for two weeks, open to students only. The IT Department put controls in place to block unauthorized users. The Web site posted prizewinners, and students collected prizes within one week. Researchers conducted a test and retest two weeks later in three separate classes to validate the results.

Survey Analysis

Researchers gathered data in a standardized manner, transferring it from the on-line survey application into statistical analysis software (SPSS). Upon completion of the survey period, the research team analyzed the results, ran queries to determine correlations (discussed in Section 4), grouped and recorded answers to the open-ended questions, and prepared a presentation of the findings.

In addition to gathering data from the on-line survey, the research team collected statistics on Web server usage during the same period. Requests to the server totaled 2,751,063 during a seven-day period. The heaviest usage occurred on Wednesday (533,629) and Thursday (517,204), while the lightest usage was on Saturday (200,650). The heaviest usage occurred between ten and eleven o'clock in the morning (205,381). URL requests other than college Web site areas ranged from 49,181 for Google.com to 720 for pickthehottie.com. Within the College Web site, the weather report received 22,417 requests, while the Library received 18,503 requests. Section 5 contains a complete review of the findings.
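Usage figures like these can be tallied directly from Web server access logs. The sketch below is a minimal illustration, assuming NCSA/Apache-style logs and a file named access.log (the paper does not identify the server software or log format); it counts requests by weekday and by hour.

```python
# Minimal sketch: tally requests by weekday and hour from an
# NCSA/Apache-style access log. Log format and file name are assumptions.
import re
from collections import Counter
from datetime import datetime

TIMESTAMP = re.compile(r"\[(\d{2}/\w{3}/\d{4}:\d{2}):")  # e.g. [10/Oct/2004:10:

by_day, by_hour = Counter(), Counter()
with open("access.log") as log:
    for line in log:
        match = TIMESTAMP.search(line)
        if not match:
            continue
        stamp = datetime.strptime(match.group(1), "%d/%b/%Y:%H")
        by_day[stamp.strftime("%A")] += 1   # e.g. "Wednesday"
        by_hour[stamp.hour] += 1            # e.g. 10 for 10:00-10:59

print("Busiest day:", by_day.most_common(1))
print("Busiest hour:", by_hour.most_common(1))
```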
4. DISCUSSION

Sampling

The sample method was a non-probability, easily accessible convenience sample (Kelly, 1999). Unintentionally, the sample reflected the same ratios as the general population, thus resembling quota sampling (Kelly, 1999). The population size was slightly less than 10,000 and the sample size was 1,087. Although the sample did not duplicate the population exactly, it was representative of the entire population, and the research team had no opportunity for personal judgment in choosing the sample (Kelly, 1999). Over 10% (1,087) of the student population participated in the on-line survey, representative of the 9,731-student population in gender, age, major, and college year. Comparing the survey statistics against the 2003 college statistics confirmed the representation. See Section 5, Table 2 for a comparison.

As stated earlier, one of the most important advantages of using on-line survey technology is the ability to reach a large audience with fast and inexpensive delivery. The largest disadvantage of using an on-line survey was response bias. Nearly all students who began the survey completed it; however, the respondents include only students who used the Web site, and it may be that those with the strongest opinions chose to respond. Students did not receive guidance from the research team while answering the survey; therefore, there was a risk that students could misinterpret questions. To reduce this risk, the research team and Web team prescreened questions using a pilot study to avoid misinterpretation and to check for face and content validity. While content experts typically establish content validity, this can sometimes lead to questions that are more complex. According to Hunter and Schmidt (1990), construct validity is a quantitative question rather than a qualitative distinction such as "valid" or "invalid"; it is a matter of degree. The construct validity of this survey was open to some interpretation. "Test bias is a major threat against construct validity, and therefore test bias analyses should be employed to examine the test items" (Osterlind, 1983). The presence of test bias does affect the measurement of construct validity, but the absence of test bias does not guarantee that the test possesses construct validity.

5. FINDINGS

Use of registration for classes and the student information pages (InfoBear) increases with each class year, with the exception of the sophomore class. There is almost a 25% difference between the frequency with which juniors use on-line registration and student information pages and the frequency with which sophomores do. As class year progresses, two features show a decline in use: downloading forms, and campus news and events. Master of Social Work, Management Science, Music, Health Education, and Economics majors use the site most infrequently; Biology and Management Science are the two majors with the highest proportions of students using the Web site monthly or less. More communication about the site may increase traffic from those groups. Departmental Web sites appear to see increased usage as class year increases, although the overall number of such users in our survey is minimal: only about 15 people selected this as a top use.

Table 1: Respondents by Class Year

Class       Frequency   Percent
Freshman    231         21.5
Sophomore   236         22.0
Junior      271         25.3
Senior      277         25.8
Graduate    57          5.3
Total       1,072       100.0

Table 1 summarizes the survey participants by class year, with no group comprising more than 25.8% of the total. The gender of the respondents was 27.6% (297) male and 72.3% (776) female, roughly reflecting the gender ratio of the College community; 92.6% of respondents were full-time students. However, juniors and seniors responded to the survey in greater numbers. Table 2 summarizes the number of survey respondents in each class year against the total number of students in that class year.

Table 2: Survey Respondents Compared to Total Enrollment

Class                        Respondents   Total Student Body
Freshman                     231           1,794
Sophomore                    236           1,785
Junior                       271           1,817
Senior                       277           2,022
Graduate                     57            1,978
Unclassified Undergraduate   --            335
Total                        1,072         9,731
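One way to quantify how closely the class-year distribution of respondents tracks enrollment is a chi-square goodness-of-fit test over the Table 2 counts. The sketch below is our illustration, not an analysis reported by the study; it excludes unclassified undergraduates, for whom no respondent count is reported.

```python
# Chi-square goodness-of-fit of respondents (Table 2) against enrollment.
# Illustrative only; unclassified undergraduates are excluded because no
# respondent count is reported for them.
observed = {"Freshman": 231, "Sophomore": 236, "Junior": 271,
            "Senior": 277, "Graduate": 57}
enrolled = {"Freshman": 1794, "Sophomore": 1785, "Junior": 1817,
            "Senior": 2022, "Graduate": 1978}

n = sum(observed.values())        # 1,072 respondents
total = sum(enrolled.values())    # enrollment across the five classes
for cls in observed:
    expected = n * enrolled[cls] / total  # count expected if proportional
    print(f"{cls:10s} observed {observed[cls]:4d}  expected {expected:6.1f}")

chi_sq = sum((observed[c] - n * enrolled[c] / total) ** 2 /
             (n * enrolled[c] / total) for c in observed)
print(f"chi-square = {chi_sq:.1f} (df = {len(observed) - 1})")
```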
The respondents' frequency of use (Table 3) may highlight a weakness of using a Web-based survey to evaluate a Web site's usability. Respondents reported that they used the site more than once per day (58.8%), while an additional 31.1% reported using the site about once per day. Only 10.1% indicated usage of weekly or less. Measuring Web site usability among already heavy users of this particular Web site, on an instrument available only on that Web site, is a troubling limitation.

Table 3: Frequency of Use

Response                 Frequency   Percent
More than once per day   630         58.8
About once per day       333         31.1
Weekly or less           114         10.1
Total                    1,077       100.0

The survey respondents reported varying degrees of satisfaction with the Web site (see Table 4). Those who strongly agreed or agreed that the Web site contained useful information totaled 93.4%. Only 53.4% agreed that the search function was effective, and only 65.7% agreed that it was easy to find information for their major, indicating two areas in need of improvement.

Table 4: Satisfaction with Web Site

Category                             Strongly Agree or Agree
Information is useful                93.4%
Overall organization                 92.9%
Site is easy to use                  88.9%
Interface is pleasing                86.3%
Overall satisfaction                 89.4%
Appropriate for higher education     85.0%
Site has functions I expect          81.3%
Easy to find information for major   65.7%
Search function is effective         53.4%

Tests for Validity and Reliability

In addition to the on-line survey, a test and retest were conducted beginning two weeks after the original on-line survey ended. Participation was voluntary. A total of sixty-one students in three classrooms took the survey twice, two weeks apart; the subjects were full-time undergraduates. Four students who were present for only one survey date were excluded from the analysis. The eleven items that measure satisfaction were combined into a satisfaction scale. The test-retest reliability, the internal reliability, and the comparison of the paper survey to the on-line survey were all higher than .89 using Cronbach's coefficient alpha.
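For readers unfamiliar with the statistic, the following sketch computes Cronbach's coefficient alpha for a small satisfaction scale. The item scores below are hypothetical, not the study's data.

```python
# Minimal sketch of Cronbach's coefficient alpha:
# alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores)).
def cronbach_alpha(items):
    """items: one list of scores per scale item, aligned by respondent."""
    k = len(items)

    def variance(xs):
        mean = sum(xs) / len(xs)
        return sum((x - mean) ** 2 for x in xs) / (len(xs) - 1)

    totals = [sum(scores) for scores in zip(*items)]  # per-respondent totals
    return (k / (k - 1)) * (1 - sum(variance(i) for i in items)
                            / variance(totals))

# Three Likert items answered by five hypothetical respondents
items = [[4, 5, 3, 4, 5],
         [4, 4, 3, 5, 5],
         [5, 5, 2, 4, 4]]
print(f"alpha = {cronbach_alpha(items):.2f}")
```

Values approaching 1 indicate that the items vary together, so the scale can be treated as measuring a single underlying construct.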
Student Recommendations

The last question on the survey, "What else would you like to see available on the College Web site?", helped to identify areas of the Web site in need of improvement. The most common answers were:
* Bring back the older version of InfoBear, the student/registration information resource. The College recently upgraded to a newer version.
* Add class cancellations for off-campus students to the site.
* Provide more pictures of events on campus, as well as more information on clubs and organizations.
* Provide access to student employment via the Web site, a better search engine, and more local news.
* Provide more department-related information for each major and more information on individual faculty.

6. CONCLUSIONS

Students today have not known a world without technology. This research shows that the College recognizes the importance of meeting the needs of today's student. Developing on-line services that are convenient and easy to use benefits both the institution and the student. By designing, administering, and analyzing a questionnaire on Web technology, usability, aesthetics, and content, the College can see how students view the current Web site and determine what areas to focus on when developing Web services. The survey results indicate that students are looking for well-organized, easy-to-use technology that allows them to efficiently complete clerical tasks, locate information, and communicate electronically.

Technology is a critical tool that continues to develop on college Web sites and serves as the primary resource for information transfer among students, administrators, and professors. Creating the kind of Web services students want empowers them, and students who feel empowered feel good about their institution. The College proves to be a technologically advanced institution that aims to advance the Internet as a virtual learning platform so that students may share ideas, thoughts, data, and opportunities. Many of the suggestions made by this research are now in place. Research into technology, usability, aesthetics, and content continues as the College strives to meet the needs of today's students, both in and out of the classroom. The College Web team found this research valuable in its continued quest to meet its mission statement.

7. ACKNOWLEDGEMENTS

We acknowledge the assistance and hard work of Michael Sale and Christine Perakslis in the research and analysis of this survey.

8. REFERENCES

Baruch College, The City University of New York (2003) "Data Summary for: Web Site Usability Survey."

Bitler, Doris, Walter Rankin, and Joann Schrass (2000, September) "Academic Affairs On-line: A Survey of Information Available on Web sites in Higher Education." College Student Journal, v. 34, 325-33. Retrieved May 6, 2005, from http://web5.infotrac.galegroup.com/itw/infomark413/723/66637177w5/purl=rcl_EAIM_0

Carter, Linda, Shannon Couch, Jennifer Frobish, and Timothy Marten (n.d.) "Responding to Change: Division of Student Affairs at Western Idaho University."

Caruso, Judith B. (2004, October) "ECAR Study of Students and Information Technology, 2004: Convenience, Connection, and Control." EDUCAUSE Center for Applied Research; Director of Policy, Security, and Planning, University of Wisconsin-Madison. Roadmap, pp. 1-4.

Factbook, Bridgewater State College, Institutional Research, Academic Year 2004-2005. Retrieved October 10, 2004, from http://www.wireless.bridgew.edu/depts/IR/factbook.cfm

Fuller, Diane and Patricia Hinegardner (2001, October) "Ensuring Quality Web Site Redesign: The University of Maryland's Experience." Retrieved June 2, 2005, from http://www.pubmedcentral.nih.gov/articlerender.fcgi?arti=57962

Holzinger, A. (2005, January) "Usability Engineering Methods for Software Developers." Communications of the ACM 48(1), 71-74.

Hunter, John and Frank Schmidt (1990) Methods of Meta-analysis: Correcting Error and Bias in Research Findings. Newbury Park: Sage Publications.

Kelley, L. (1999) Measurement Made Accessible: A Research Approach Using Qualitative, Quantitative, & Quality Improvement Methods. Sage Publications, pp. 57-78.

Klein, A. (2005, April) "Business Technology: Look to the Web to Increase Recruitment; the College Website Continues to Influence Enrollment Decisions. Is Your Site Up to Par?" University Business. Retrieved April 16, 2005, from http://www.universitybusiness.com/page.cfm?p=706

Levi, M. and Frederick Conrad (1996, July-August) "A Heuristic Evaluation of a World Wide Web Prototype." Bureau of Labor Statistics. Retrieved April 16, 2005, from www.bls.gov

New Chalk, Encouraging Student Use of Internet Resources (1996, November). Produced by Academic Technology and Networks and the Institute for Academic Technology, two divisions of Information Technology. Retrieved April 16, 2005, from http://www.unc.edu/cit/newchalk/ncv1n1.ht

Osterlind, S. J. (1983) Test Item Bias. Newbury Park: Sage Publications.
Tsang-Kosma, W. (n.d.) "Student Centered Learning + Technology = Rethinking Teachers' Education." Retrieved April 16, 2005, from http://www2.gsu.edu/~mstswh/courses/it7000/papers/student-3.htm

Tullis, T. and Jacqueline N. Stetson (2004) "A Comparison of Questionnaires for Assessing Website Usability." Fidelity Center for Applied Technology, UPA 2004 Conference, Chapel Hill, vol. 1, issue 1. Retrieved October 10, 2004, from http://www.unc.edu/cit/newchalk/

Zaphiris, Panayiotis and Darin Ellis (2004) "Website Usability and Content Accessibility of the Top USA Universities." Institute of Gerontology and Department of Industrial and Manufacturing Engineering, Wayne State University, Detroit, MI. http://agrino.org/pzaphiri/papers/accessibility-webnet.pdf

APPENDIX A

College Technological Applications, Fall 2004

* Notebook initiative - all freshmen and sophomores, including transfer students, have notebook computers. The College has negotiated a special discounted price with a popular computer manufacturer.
* Printing in labs and classrooms - students receive the first $30 in printing costs free each semester. Black-and-white printing costs 10 cents per page and color costs 25 cents per page. Students see a notice on the computer screen each time they log on informing them of the amount remaining on their account. Once the money is used, students automatically receive a second $30 allocation to cover printing costs, now charged to their account.
* Computer accessibility - students can access computers in any of the 15 computing labs on campus with a student account.
* Express stations - walk-up computers providing access to e-mail, InfoBear, the campus Web site, and the World Wide Web.
* One Card Project - the official College identification card, providing convenient access to a wide range of campus services such as dining facilities, vending machines, and copiers. It also serves as the library card, replaces keys for access to most buildings on campus for faculty and staff, and is accepted at local venues (e.g., Subway).
* Payment Gateway (beginning fall 2006) - a sophisticated fee payment system that accepts, authorizes, and processes credit card payments and electronic checks, and updates the student information system in real time.
* On-line Student Guide to Information Technology.
* On-line computer-based training.
* On-line courses; distance education.

APPENDIX B

Survey Questionnaire, Fall 2004
Survey Results: 1,087 Respondents

General Information

1. Gender:
   72% (775) Female
   27% (296) Male

2. Enrollment Status:
   92% (995) Full-Time
   7% (75) Part-Time

3. Student Classification:
   94% (1010) Undergraduate Student
   5% (60) Graduate Student

4. Class Year:
   25% (277) Senior
   25% (270) Junior
   21% (235) Sophomore
   21% (231) Freshman
   5% (57) Graduate Student

5. Age Group:
   55% (592) 17-20
   32% (352) 21-24
   4% (46) 25-28
   1% (20) 29-32
   1% (18) 33-36
   1% (18) 45+
   1% (14) 41-44
   1% (13) 37-40

6. Major (check all that apply):
   11% (144) Elementary Education
   10% (136) Psychology
   9% (120) Management Science
   7% (91) English
   6% (81) Communication Arts & Sciences
   5% (69) Criminal Justice
   4% (58) Accounting & Finance
   4% (51) Special Education
   3% (49) Early Childhood Education
   3% (43) History
   3% (41) Social Work
   2% (37) Art
   2% (34) Mathematics
   2% (32) Physical Education
   2% (29) Biology
   2% (27) Political Science
   2% (26) Aviation Science
   2% (26) Master of Education (M.Ed.)
   1% (23) Sociology
   1% (19) Computer Science
   1% (18) Earth Sciences
   1% (13) Music
   0% (12) Chemistry
   0% (12) Health Education
   0% (9) Geography
   0% (9) Spanish
   0% (8) Anthropology
   0% (7) Master of Science in Management (M.S.M.)
   0% (6) Economics
   0% (5) Philosophy
   0% (5) Master of Social Work (M.S.W.)
   0% (3) Physics
   0% (3) Master of Science (M.S.)
   0% (2) Master of Arts in Teaching (M.A.T.)
   0% (1) Master of Arts (M.A.)
   0% (1) Master of Public Administration (M.P.A.)
   0% (0) Chemistry-Geology
   Other

7. I use the web site:
   58% (628) More than once per day
   31% (333) About once per day
   9% (105) Weekly
   0% (4) Monthly or less

8. I most often use the web site for (excluding e-mail and Blackboard; check up to 3):
   21% (714) Registration for Classes/InfoBear
   15% (524) Campus News and Events
   14% (465) Class Cancellation List
   13% (432) Library/Webster/Databases
   8% (279) Campus Directory (formerly "Find People")
   7% (240) Departmental Web Sites
   5% (180) Student Employment Opportunities
   4% (162) Weather
   4% (133) Downloading Forms
   3% (111) Athletic Scores
   2% (74) Technical Support
   Other

The following questions pertain to specific components of the BSC Web site:

1. The overall organization of the web site is easy to understand.
   61% (663) Agree
   30% (332) Strongly Agree
   4% (51) Disagree
   1% (17) No Opinion
   0% (8) Strongly Disagree

2. It is easy to find the information I need for my major.
   53% (568) Agree
   17% (186) No Opinion
   14% (156) Disagree
   12% (133) Strongly Agree
   2% (25) Strongly Disagree

3. The interface of this web site is pleasing. (Interface generally includes how the site can be navigated, menus available, search options, etc.)
   58% (622) Agree
   27% (298) Strongly Agree
   6% (69) Disagree
   6% (67) No Opinion
   0% (10) Strongly Disagree

4. The web site has all the functions and features I expect it to have.
   56% (605) Agree
   24% (264) Strongly Agree
   10% (109) Disagree
   7% (77) No Opinion
   1% (14) Strongly Disagree

5. The information on the web site is useful.
   60% (645) Agree
   32% (350) Strongly Agree
   4% (48) No Opinion
   1% (20) Disagree
   0% (3) Strongly Disagree

6. The design of the web site, including color use, design, and placement of content, is pleasing.
   58% (619) Agree
   29% (310) Strongly Agree
   6% (73) No Opinion
   4% (53) Disagree
   1% (12) Strongly Disagree

7. The web site is easy to use.
   59% (637) Agree
   29% (310) Strongly Agree
   6% (65) Disagree
   3% (38) No Opinion
   1% (15) Strongly Disagree

8. The design of the site is appropriate for a higher education institution.
   56% (602) Agree
   28% (306) Strongly Agree
   11% (119) No Opinion
   2% (31) Disagree
   0% (10) Strongly Disagree

9. The search engine on the web site provides useful results.
   39% (425) Agree
   25% (275) No Opinion
   15% (166) Disagree
   13% (147) Strongly Agree
   5% (56) Strongly Disagree

10. The content on the web site is reliable and up-to-date.
   61% (654) Agree
   20% (223) Strongly Agree
   8% (91) Disagree
   7% (76) No Opinion
   2% (26) Strongly Disagree

11. Overall, I am satisfied with the web site.
   61% (658) Agree
   27% (297) Strongly Agree
   5% (58) Disagree
   3% (39) No Opinion
   1% (17) Strongly Disagree

12. What else would you like to see available on the BSC web site?
   Text box answers