Academic Exchange Quarterly Spring 2011 ISSN 1096-1453 Volume 15, Issue 1
CURRENT VERSION COPYRIGHT © MMX AUTHOR & ACADEMIC EXCHANGE QUARTERLY
Evaluating Ubiquitous Computing Programs
Meridith K. Wentz, University of Wisconsin-Stout
Wendy Marson, University of Wisconsin-Stout
Jane Henderson, University of Wisconsin-Stout
Wentz, PhD, is the Director of the Office of Planning, Assessment, Research and Quality; Marson, M.S., is an Institutional Planner; and Henderson, M.S., is the Director of Learning Technology Services and Co-director of the Nakatani Teaching and Learning Center.
As more and more institutions of higher learning integrate computers into their learning environments, be it classrooms or dorm rooms, evaluation of their impact remains elusive for many institutions. This article describes how one institution implemented an ongoing evaluation of its ubiquitous computing program, how the information gathered was used, and how its assessment model has evolved across time.
This article is presented as a “how-to” for getting started in digital learning assessment, aimed at campuses that are either considering implementing an assessment plan or are not satisfied with their current one. It describes a methodical approach to assessing process issues common to any start-up program involving technology, including support/service, student satisfaction, training in using the software, and how students are using the technology.
We recommend that any evaluation of a ubiquitous computing program begin with a focus on satisfaction, support, training needs, and usage while the program is early in its life cycle. Once the program becomes firmly established, as ours has, we recommend shifting the focus to outcome measures such as student learning. This is the natural evolution of the process/program life cycle. This article focuses on the evaluation of satisfaction, support, training needs, and usage in the early years of the program.
A unique aspect of our approach is a broader evaluation focus on the Digital Learning Environment (DLE) rather than an exclusive focus on the laptop computer. The laptop computer is but one part of the DLE, connecting students and faculty with 24/7 access to services and university-provided software within a wireless environment. The hardware is bundled with general as well as program-specific software. The e-Scholar program is UW-Stout's DLE, offering students a variety of opportunities to be successful in achieving their academic goals. Students pay for the laptop lease, software, support, and administrative costs through a per-credit fee added to their tuition.
The University of Wisconsin–Stout, located in Menomonie, WI, serves approximately 9,000 graduate and undergraduate students. In addition to being “Wisconsin's Polytechnic University,” UW-Stout remains the only totally laptop campus in the UW System, a unique opportunity for students desiring to participate in this type of learning environment.
The laptop program at our institution began in the fall of 2002, following pilots, campus discussion, action by University shared governance, and final Board of Regents approval in the fall of 2001. The e-Scholar program was phased in, beginning with new freshmen in the fall of 2002 and continuing each subsequent year with new freshmen.
Since 2000, the number of laptop, or “ubiquitous computing,” initiatives across the United States and worldwide has continued to increase. Currently, there are over 275 colleges and universities with programs that provide some or all students with laptop computers or that require students to purchase one (Brown, 2010). In K-12 education, Picciano and Seaman (2009) estimated that more than one million K-12 students took online courses during 2007-08, and Watson et al. (2010) report continuing growth in K-12 online learning, with the most growth occurring among individual school districts creating blended learning programs. Although colleges, universities, and school districts differ in how they implement their ubiquitous computing programs and how technology is used, all of them have had to invest significant resources to implement and maintain these programs.
There is no question that students in higher education are using their computers. In the most recent ECAR (EDUCAUSE Center for Applied Research) study (Smith and Caruso, 2010), respondents reported spending a mean of over 20 hours per week on the internet for school, work, and recreation. More than 90% of respondents reported using their college/university website, presentation software, text messaging, social networking websites, and a course or learning management system.
Like other institutions that have implemented ubiquitous computing programs, UW-Stout has struggled with evaluating its e-Scholar program (e-Scholar is the term used to describe our ubiquitous computing program). A primary goal in implementing e-Scholar was to improve student learning outcomes, as reported by many other early adopters of ubiquitous computing (Cuban, 2001; Zemsky & Massy, 2004; Zucker, 2004). Both higher education and K-12 institutions assumed that implementing laptop programs would result in improved student achievement (Texas Center for Educational Research, 2006; U.S. Department of Education, 2003; Zucker, 2004), higher standardized test scores (Texas Center for Educational Research, 2009; Wenglinsky, 1998; Zucker, 2004), the development of 21st century skills that are needed outside of school (Silvernail et al., 2008; Texas Center for Educational Research, 2009; Zucker and Hug, 2008), and higher grades or GPAs (Efaw, Hampton, Martinez & Smith, 2004; Griffith, Gu & Brown, 1999). These goals have not changed. Zucker and Light (2009) state, “With the continuing decline in costs of technology, programs are proliferating worldwide to put networked laptop computers into the hands of millions of students on a routine basis. The reasons policy-makers support these programs are based on economic arguments, equity concerns, and widespread interest in education reform.”
Yet the evidence that laptop computers have had any positive impact on student learning outcomes remains mixed. Some studies report a positive impact on learning (e.g., Burke et al., 2005; Dvorak & Buchanan, 2002; Efaw et al., 2004; McVay et al., 2005; Silvernail et al., 2008; Zucker & Hug, 2008), others report no impact on learning (e.g., Cuban, 2006; Finn and Inman, 2004; Kvavik, Caruso & Morgan, 2004; Weston and Bain, 2010), and still others focus solely on the successful integration of technology into classrooms rather than on how the technology affects learning (e.g., Becker, 2007; Brill and Galloway, 2007; Chen et al., 2009).
We believe there are several reasons for the mixed evidence, including barriers to program implementation, lack of planning, failure to integrate evaluation and planning, lack of stakeholder buy-in at the time of implementation, lack of training, and, in most cases, the limited time span covered by any particular study.
The outcomes assessment piece, often missing in programs of this type, was grounded in the university's vision and value statement and included identification of both the program's goals and the assessment goals. Outcomes assessment goals included: understanding best practices; assessing learning in the areas of process, concept, attitude, and critical analysis; improving instruction; identifying desirable traits in students; enhancing active learning; and identifying the acquisition of concepts, processes, and attitudes through active learning. Identifiable outcomes included: what students did to learn, what they reported learning, what instructors did to enhance learning for students and themselves, and what outcomes (in terms of both quality and quantity) were produced by students. Our evaluation efforts are currently focused on this outcomes assessment.
Assessment of Satisfaction, Support, Training and Usage
In January 2003, a long-term plan was developed to assess learning, critical thinking, and active learning in graduate and undergraduate education. Simultaneously, a short-term plan was developed to assess satisfaction, support, training, and usage.
The short-term assessment plan included four surveys, which are summarized below along with their results. After the first survey, each subsequent survey built on those that preceded it. When enough data had been gathered to identify what problems needed to be addressed, a new survey was created so the body of knowledge about the program could continue to grow. This portion of the assessment plan is the focus of this article.
All four of these surveys have since been discontinued or modified, after the identified student concerns were addressed and few new ideas were surfacing with subsequent survey administrations. However, these assessments were critical in assisting the implementation and improvement of the e-Scholar program, and we would recommend similar assessments at other institutions in the early stages of program implementation.
Survey #1 – Skills inventory (Microsoft Office Proficiency Self-Report)
Timeframe: Summer 2002 – Fall 2004
The first freshman cohort to receive laptops (fall 2002) was administered this instrument in the summer of its entry to UW-Stout. The survey asked whether students could perform specific learning objectives within Microsoft Office (Word, Excel, PowerPoint, and Outlook). The instrument was designed to work in tandem with the Student Questionnaire Survey (SQS): the Skills Inventory would identify possible problem areas in training, which ideally would be confirmed by the SQS a few months later. The instrument also provided an opportunity to identify students who might be at risk due to a lack of skills and to contact them with help resources.
This instrument was administered in 2002, 2003, and 2004 to incoming freshman e-Scholars during summer registration. It was dropped from the assessment plan after the fall 2004 administration.
Summary of results
Students in the second laptop cohort (fall 2003) reported slightly higher abilities than the fall 2002 cohort. Highest self-reported skill level was for Word, followed by e-mail, Excel and PowerPoint. The fall 2004 cohort showed statistically significant increases in knowledge of Word, Excel and PowerPoint compared to the fall 2002 cohort.
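The cohort comparisons above rest on simple two-sample tests of mean self-reported skill. As a minimal sketch of how such a comparison could be run, using Welch's t-statistic and entirely hypothetical 1-5 self-rating data (the actual survey responses are not reproduced in this article):

```python
from statistics import mean, variance

def welch_t(a, b):
    """Welch's t-statistic for two independent samples (unequal variances allowed)."""
    na, nb = len(a), len(b)
    se_squared = variance(a) / na + variance(b) / nb  # sample variances
    return (mean(a) - mean(b)) / se_squared ** 0.5

# Hypothetical 1-5 self-ratings of Word proficiency, illustrative only.
cohort_2002 = [3, 2, 4, 3, 3, 2, 4, 3, 2, 3]
cohort_2004 = [4, 3, 5, 4, 4, 3, 5, 4, 3, 4]

t = welch_t(cohort_2004, cohort_2002)
print(round(t, 2))  # a large positive t suggests the 2004 cohort rated itself higher
```

In practice the t-statistic would be compared against a critical value (or converted to a p-value) at the chosen significance level, with degrees of freedom from the Welch-Satterthwaite approximation.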
Actions taken as a result of this survey:
1. Student training pilots based on the MS Proficiency Assessments were explored and implemented
2. Students identified as “at-risk” after completing the MS Proficiency Assessments were contacted individually and offered mentoring
3. Training resources were developed for web delivery
4. Tech Tips were sent to specific student populations for training outreach (Word, copyright, back-up, etc.), providing training at their fingertips
5. The digital and hardcopy manual “Computing @ UW-Stout” was developed and printed as a resource for students
Survey #2 – (Training and) Expectation Survey
Timeframe: Fall 2002 – Fall 2006
This survey began in fall 2002 and initially consisted of three questions. The first question asked whether the participant had attended the library/Blackboard training session; the second and third asked participants to briefly explain what being an e-Scholar meant to them and how they expected to use their laptop in their daily life as a new student.
After 2004, the training question was dropped, and the survey was known after that as the Expectation Survey. The survey was dropped from the assessment plan following the 2006 report.
Summary of Results
“Being an e-Scholar means having the ability to connect with an entire college campus and being able to utilize its resources”
Being an e-Scholar, according to students, could mean a number of different things. The most frequent responses across five years were: 1) staying connected - to instructors and classmates, to family at home, to friends on and off campus; 2) using/accessing learning technology - having 24/7 access to the internet, to resources, to information, and to communication; and 3) learning - using a laptop computer to aid in learning and having a level playing field because everyone has a laptop. Even though UW-Stout was in its sixth year as a laptop campus, many students were confused about what being an e-Scholar means; it was described by students as a program, a portal, a course management system, and a person. These types of responses comprised about 9% of all responses in 2006. A new theme that emerged in 2006 was participants' use of the word “access” and how their laptops would allow them access to everything offered by UW-Stout's Digital Learning Environment. An example of this was, “I am able to access a different method to learning because of the help of the internet”.
How students expected to use their laptops remained steady across time, with schoolwork/homework listed as the top choice across all five years of data collection. Email/communication was a strong second: “I can communicate with anyone on campus at the push of a button as well as reach valuable research information.” Personal use/entertainment was also a frequent use mentioned by students: “Truthfully, I will mostly use it for the internet. I have a lot of different messaging and fantasy football things. I will use it to talk to my teachers and friends.”
Actions taken as a result of this survey:
1. Developed a document for students explaining UW-Stout’s student expectations for how the laptops will be used, endorsed by Senates.
2. Provided information to the UW-Stout faculty on student expectations, so they could use the information in designing their courses.
Survey #3 – Student Questionnaire Survey (SQS)
Timeframe: Fall 2002 – Fall 2003
The purpose of this process instrument was to gather student issues and concerns regarding several aspects of the laptop program. Initially, in fall 2002, students were asked four open-ended questions:
1. Have you had problems with your laptop?
2. Have you had training issues with your laptop?
3. How have you used your laptop inside the classroom?
4. How have you used your laptop outside the classroom?
Follow-up focus groups were also held in both fall 2002 and fall 2003 to “drill down” into comments made on the survey.
The SQS was repeated in spring 2003, with few changes in results. The instrument was modified for the fall 2003 administration, becoming three Likert-type scale questions related to satisfaction with the e-Scholar program, three questions related to service/support of the laptop itself, two questions related to training/knowledge, and one qualitative question asking for further comment. The earlier qualitative questions about laptop use in and out of the classroom were spun off into a new survey, the One-Minute Laptop Survey. The training and knowledge questions replaced similar questions previously asked on the MS Office Proficiency Self-Report (Survey #1). This entire instrument was discontinued after the fall 2003 administration.
Summary of Results
This survey tended to identify concerns with the program, which is not unusual given the early stage of the program at the time of administration. Having the concerns identified by users allowed those responsible to create action plans, implement changes to improve the program, and monitor progress on subsequent surveys.
One concern, reported by 21% of student respondents (spring 2003), was “no use in class/little use in class” of laptops. In subsequent surveys, particularly the One-Minute survey, this percentage declined drastically. Actions were also taken specifically to address this issue, including the development and eventual publishing of instructional practices related to learning, teaching, and laptops/technology; an enhanced focus and direction on the improvement of teaching and learning; an enhanced focus and commitment on learning assessment related to laptops; and the development of a best-practices learning database.
One-third of respondents reported wireless/connectivity problems and network speed problems. These types of comments have also decreased across time. Knowledge issues also emerged in the areas of security practices, laptop features, and laptop care. However, overall satisfaction with the e-Scholar program increased in fall 2003 compared to spring 2003.
Actions taken as a result of this survey:
1. Various faculty development sessions offered by the Teaching and Learning Center.
2. Technical issues were forwarded to the IT help desk and addressed individually.
Survey #4 – The One-Minute Laptop Survey
Timeframe: Fall 2003 – Fall 2006.
This survey was first administered in fall 2003, taking the two qualitative questions about laptop use in and out of class from the earlier SQS and making them a new survey. It was called the One-Minute survey because it was only a two-question survey, and it was hoped that students would take “a minute” to complete it. Since it was a survey for laptop students only, in the first year of its administration it went only to freshmen and sophomores; in the second year, to freshmen, sophomores, and juniors; and so on. The One-Minute survey was designed to work in tandem with the Expectation Survey (Survey #2), comparing how students expected to use their laptops with how they actually used them and, additionally, looking at how reported laptop use changed across years in school as each new cohort was added. For example, how did seniors use their laptops compared to freshmen?
The survey consisted of only two questions:
1. How do you use your laptop in class?
2. How do you use your laptop outside of class?
A third question, a Likert-type scale question asking about overall satisfaction with the e-Scholar program, was added in fall 2006. This survey instrument was dropped from the assessment plan following the fall 2006 administration, but a new survey focused on student learning is in development for fall administration.
Summary of results
“Most of my classes at this point require the laptop every day.”
“All of my classes use Learn@uwstout.edu, so I use it to look at my assignments, grades and use the drop box.”
Students reported using their laptops in class most often for taking notes, followed by schoolwork, in-class assignments/labs/quizzes, email/keeping in touch, research tool/access web, and internet/research. Reported recreational use of laptops in class increased from #12 in frequency of comments in 2005 to #7 in 2006. Outside of class, until 2005 the most often reported use of laptops was for schoolwork; in 2006, the most frequently reported uses were e-mail and recreational use, followed by schoolwork. Negative comments about the laptop program decreased from a high of 17% in 2004 to 6.5% in 2006, and more comments were made about the frequency of laptop use in the classroom. When comparing the results of this survey with the results of the Expectation Survey, reported use of laptops continued to exceed expected use for e-mail/keeping in touch, schoolwork, recreational use, and taking notes.
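The expected-versus-reported comparison described above amounts to tallying coded response themes from the two surveys and flagging where reported frequency exceeds expected frequency. A minimal sketch of that tally, using hypothetical theme codes rather than the actual survey data:

```python
from collections import Counter

# Hypothetical coded themes from open-ended responses (illustrative only;
# real coding was done by analysts on the actual survey comments).
expected = ["schoolwork", "email", "schoolwork", "notes", "recreation", "schoolwork"]
reported = ["email", "schoolwork", "recreation", "email", "notes", "recreation",
            "schoolwork", "email"]

exp_counts, rep_counts = Counter(expected), Counter(reported)

# Themes mentioned more often in reported use than in expected use.
exceeded = {t for t in rep_counts if rep_counts[t] > exp_counts.get(t, 0)}
print(sorted(exceeded))  # → ['email', 'recreation']
```

With raw counts from two survey administrations, the same comparison could be done on proportions rather than counts, since the response pools differ in size.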
Actions taken as a result of this survey:
1. Various faculty development sessions have been offered by the Teaching and Learning Center. As a result, the percentage of students reporting that instructors did not use the laptops in class decreased.
UW-Stout's approach to evaluating its e-Scholar program was to combine different elements into a comprehensive view of a learning environment and then use the data gathered for decision-making to further improve the e-Scholar program.
Beginning with the basic process of opening up the computer and turning it on (the SQS) and then using it for word processing (the skills inventory), problems were identified, and solutions were developed and implemented.
The training and expectation survey, along with the SQS, helped identify the gaps both between how students expected to use their laptops for learning vs. how they actually used them, and how students expected the entire digital learning environment to function. This information was disseminated to decision-makers around campus, and programs were subsequently developed for faculty to learn how to use the digital learning environment in ways the students expected.
Finally, the One-Minute laptop survey continued to look at how students were using their laptop computers both in and out of class, and compared with the training and expectation survey to see how expectations and actual use aligned. Not surprisingly, students continued to report that they used the available technology more than they had anticipated when entering school. Recreational use continued to increase, as did overall satisfaction.
The timeline for gathering the above data encompassed five years, 2002-2006. By fall of 2006, it was felt by decision-makers on campus that little new information was coming forward about support/service, student satisfaction, training, and use of the technology. This part of the process/program cycle was complete.
Becker, K. (2007). Digital Game-based Learning once Removed: Teaching Teachers. British Journal of Educational Technology, 38(3), 478-488.
Brill, J. & Galloway, C. (2007). Perils and Promises: University Instructors' Integration of Technology in Classroom-based Practices. British Journal of Educational Technology, 38(1), 95-105.
Brown, J. (2010). List of Notebook Initiatives. Retrieved November 18, 2010
Burke, M., Colter, S., Little, J., & Riehl, J. (2005). Strategies for the Mobile Learning Environment: Harnessing Collaborative Learning within Nomadic Communities. Paper presented at the National Learning Infrastructure Initiative Annual Conference, New Orleans, LA.
Chen, C.-H., Hwang, G.-J., Yang, T.-C., Chen, S.-H., Huang, S.-Y. (2009). Analysis of a Ubiquitous Performance Support System for Teachers. Innovations in Education and Teaching International, 46(4), 421-33.
Cuban, L. (2001). Over Sold and Underused: Computers in the Classroom. Cambridge: Harvard University Press.
Cuban, L. (2006, October 18). Commentary: The Laptop Revolution has no Clothes. Education Week, p.29.
Dvorak, J., & Buchanan, K. (2002, June 24-29). Using Technology to Create and Enhance Collaborative Learning. Paper presented at the World Conference on Educational Multimedia, Hypermedia and Telecommunications, Denver, CO.
Dwyer, D. (2000). Changing the Conversation about Teaching, Learning and Technology: A Report about Ten Years of ACOT Research. Cupertino, CA: Apple Computer.
U.S. Department of Education (2003). $15 Million in Grants Awarded to Help States Study Technology's Impact on Student Achievement. Retrieved November 12, 2006, from http://www.ed.gov/news/pressreleases/2003/11/11102003.html
Efaw, J., Hampton, S., Martinez, S., & Smith, S. (2004). Miracle or Menace: Teaching and Learning with Laptop Computers in the Classroom. Educause Quarterly (3), 10-18.
Evaluation of the Texas Technology Immersion Project. (2006). Austin, TX: Texas Center for Educational Research.
Evaluation of the Texas Technology Immersion Project. Final outcomes for a four-year study (2009). Austin, TX: Texas Center for Educational Research.
Finn, S., & Inman, J. (2004). Digital Unity and Digital Divide: Surveying Alumni to study Effects of a Campus Laptop Initiative. Journal of Research on Technology in Education, 36(3), 297-317.
Griffith, R. S., Gu, Y., & Brown, D. G. (1999). Assessment of the Impact of Ubiquitous Computing. Paper presented at the Association for Institutional Research 39th Annual Forum, Seattle, Washington.
Kvavik, R., Caruso, J., & Morgan, G. (2004). ECAR Study of Students and Information Technology, 2004: Convenience, Connection, and Control (Volume 5): EDUCAUSE Center for Applied Research.
McVay, G., Snyder, K., Kimberlee, D., & Graetz, K. (2005). Evaluation of a Laptop University: A Case Study. British Journal of Educational Technology, 36(3), 513-524.
Picciano, A., & Seaman, J. (2009). K-12 Online Learning: A 2008 Follow-up of the Survey of U.S. School District Administrators. Newburyport, MA: The Sloan Consortium
Silvernail, D., Small, D.,Walker, L.,Wilson, R.,Wintle, S. (2008). Using Technology in Helping Students Achieve 21st Century Skills: A Pilot Study. Portland, ME: Center for Education Policy, Applied Research, and Evaluation.
Smith, S. & Caruso, J. (2010). The ECAR Study of Undergraduate Students and Information Technology, 2010 Key Findings. Boulder, CO: EDUCAUSE Center for Applied Research.
Weston, M.E. & Bain, A. (2010). The End of Techno-Critique: The Naked Truth About 1:1 Laptop Initiatives and Educational Change. The Journal of Technology, Learning, and Assessment, 9(6). Retrieved January 3, 2011 from http://www.jtla.org.
Watson, J., Murin, A., Vashaw, L., Gemin, B., Rapp, C., et al. (2010). Keeping Pace with K-12 Online Learning. An Annual Review of Policy and Practice. Evergreen, CO: The Evergreen Educational Group.
Wenglinsky, H. (1998). Does it Compute? The Relationship Between Educational Technology and Student Achievement in Mathematics (Policy information report). Princeton: Educational Testing Service.
Zemsky, R., & Massy, W. (2004). Thwarted Innovation: What Happened to e-Learning and Why. The Learning Alliance.
Zucker, A. (2004). Developing a Research Agenda for Ubiquitous Computing in Schools. Journal of Educational Computing Research, 30(4), 371-386.
Zucker, A. & Hug, S. (2008). Teaching and Learning Physics in a 1:1 Laptop School. Journal of Science Education and Technology, 17, 586-594.
Zucker, A. & Light, D. (2009). Laptop Programs for Students. Science 2, January, 323(5910), 82-85.