28 August 2008

ePortfolio Requirement: Student Maintained

As mentioned in a previous post, I have 3 requirements for ePortfolio: that it be web-based, faculty designed, and student maintained. This week I will address each of those aspects.

Presumably, an institution or academic program establishes student ePortfolios to demonstrate competencies, whether those are defined as skills or knowledge or both. Concomitant learning comes through the student's maintenance of the ePortfolio project and may not be articulated in a purpose statement by the student, nor in a Student Learning Outcome statement by the institution.

A summary follows in this format:
Concomitant learning for student.
How it can be supported by program or institution.

Make a sustained effort over time. Understand the scope of the project and plan accordingly. The academic program can assist by displaying sample ePortfolios and providing a template that helps the student plan the work in segments or chunks. Further, the program can provide a practice field, either in the form of a low-stakes review of an initial effort in the ePortfolio or even a stand-alone "practice portfolio."

Work independently. Exercise your own judgment. The academic program can encourage independence by presenting clear guidelines and expectations. With appropriate consideration for level (undergraduate or graduate), the program can further support independence by limiting oversight to specific and well-publicized check-points. At these checks on progress, the program should give students meaningful feedback so that students benefit from formative assessment that they can act on, building confidence in their own judgment between check-points.

Set the pace for one's work. Create and follow a schedule. The program's schedule of check-points can support pacing directly, first by setting deadlines for the class or cohort and then by gradually moving to individual student-selected deadlines.

Self-evaluate. Develop techniques for review of your own work so that you literally become your own editor, your own critic. The academic program can provide coaching for students with self-evaluation as the goal. Depending on level, peer or mentor review may be more productive. (Not all instructors are well suited to this role.) Through the sharing of rubrics and evaluation criteria, the program can communicate to students the customary standards for the ePortfolio. Of course, students may use additional criteria in their own evaluation, such as level of cool (hard to define and a great criterion to leave to the individual).

Learn ancillary skills. Standards of professional presentation and intellectual property (including copyright of original works) may not be addressed directly in course work but affect the impact of your portfolio. The academic program can provide information along these lines, either through additional ePortfolio guidelines or support from a portfolio coach.

Make technology decisions. Select technology that enhances ePortfolio content. Students may need assistance in understanding the ramifications of technology decisions with respect to file storage, bandwidth, and a back-up strategy. The institution can support students' choices with campus services or online knowledge bases. Specific to the ePortfolio platform, the institution should provide technical assistance via an internal or external HelpDesk.

Student maintenance of the ePortfolio doesn't necessarily emerge from the student's initiative. It relies on the academic program's adoption of a system, publication of guidelines, and supervision of processes. But when those foundations are in place, students can assume responsibility for the ePortfolio, and the result is much more than a collection of evidence of competencies.

© 2008 Mary Bold, PhD, CFLE. The content of this blog or related web sites created by Mary Bold (www.marybold.com, www.boldproductions.com, College Intern Blog) is not under any circumstances to be regarded as legal or professional advice. Bold is the co-author of Reflections: Preparing for your Practicum or Internship, geared to college interns in the child, education, and family fields. She is a consultant and speaker on assessment, distance learning, and technology. She can be contacted at bold[AT]marybold.com using standard email format instead of [AT].

27 August 2008

ePortfolio Requirement: Faculty Designed

As mentioned in a previous post, I have 3 requirements for ePortfolio: that it be web-based, faculty designed, and student maintained. This week I will address each of those aspects.

Faculty-designed portfolios provide opportunity to:

  • display student skills and competencies, reflecting course work that the faculty directed.
  • align ePortfolio elements with course outcomes and program outcomes.
  • align these student products with the goals or expectations of accrediting bodies.

When faculty identify course-embedded assessments (for example, to serve as measures of Student Learning Outcomes), these may also be incorporated into the student portfolio. The faculty can design the portfolio for evaluation during the semester that the assessment was made, or for evaluation at the end of the degree. In my experience, a faculty group is happiest when it plans for the "end" or exit portfolio reflecting an entire program (which in itself requires a lot of discussion and cooperation).

The most powerful web-based portfolio platforms support online evaluation, complete with rubric scoring and feedback mechanisms between student and faculty. In these systems, rubrics are typically created within the platform and faculty can share in this work.

While templates are available, customized rubrics produce the best results because evaluation results match a program's own outcomes. Results are thus diagnostic, and a good portfolio platform includes reporting features that can drill down to a single rubric criterion.
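To make the drill-down idea concrete, here is a minimal sketch in Python (not any vendor's actual reporting code) that assumes rubric scores are stored as simple student-criterion-rating records and pulls the ratings for one criterion:

```python
from statistics import mean

# Hypothetical rubric scores: (student, criterion, rating on a 1-4 scale).
scores = [
    ("ana",   "thesis clarity",  3),
    ("ana",   "use of evidence", 4),
    ("ben",   "thesis clarity",  2),
    ("ben",   "use of evidence", 3),
    ("carla", "thesis clarity",  4),
    ("carla", "use of evidence", 2),
]

def drill_down(scores, criterion):
    """Return the mean rating and the individual ratings for one rubric criterion."""
    ratings = [r for _, c, r in scores if c == criterion]
    return mean(ratings), ratings

avg, ratings = drill_down(scores, "use of evidence")
print(f"use of evidence: mean {avg:.2f} across {len(ratings)} students")
```

A real platform stores far more context (course, term, evaluator), but the reporting principle is the same: filter to one criterion, then summarize.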

A set of rubrics created by faculty can also reflect the language of the program. I've run across a great variety of terms that faculty use to describe unacceptable or weak efforts. Some programs use frank rubric ratings of unacceptable, unsatisfactory, not passing. Others take a gentler approach: novice, not yet appropriate, developing. When faculty consider the language for their rubrics, a pattern typically emerges and it will influence other curriculum decisions.

The faculty-designed portfolio leads naturally to a set of guidelines for students. Built upon the cooperative design work, the guidelines reflect the collective voice of the faculty. That's a big advantage for a portfolio spanning an entire program.

© 2008 Mary Bold, PhD, CFLE. The content of this blog or related web sites created by Mary Bold (www.marybold.com, www.boldproductions.com, College Intern Blog) is not under any circumstances to be regarded as legal or professional advice. Bold is the co-author of Reflections: Preparing for your Practicum or Internship, geared to college interns in the child, education, and family fields. She is a consultant and speaker on assessment, distance learning, and technology. She can be contacted at bold[AT]marybold.com using standard email format instead of [AT].

26 August 2008

ePortfolio Requirement: Web-based

As mentioned in a previous post, I have 3 requirements for ePortfolio: that it be web-based, faculty designed, and student maintained. This week I will address each of those aspects.

Student portfolios on the web: "Electronic portfolio" can mean any number of things, from a PowerPoint slideshow emailed to an instructor, to a set of files burned to CD or DVD, to an integrated body of work displayed on the Internet. This last option is the least cumbersome for both creators and viewers, and it is the superior option if online scoring is incorporated.

Benefits of web-based ePortfolios:

  • The product is created and stored on a server that can be accessed anytime, from anywhere. Thus, students can work from any computer with Internet access.
  • Viewing the ePortfolio can also happen anytime, from anywhere, typically by clicking on a URL link that the student sends through email. This convenience is crucial if students intend to share their work externally (e.g., in making employment applications).
  • Web portfolios can be mounted by an institution (using local servers and even home-grown software), by students working independently, or through commercial software such as that offered by Chalk & Wire, LiveText, TaskStream, et al.
  • Evaluators can score the product online if the platform is designed to allow that. TaskStream* permits instructors to create and use rubrics for a variety of student works. Depending on the platform, instructors can also set up lesson plans, discussion boards, and data collection forms for students. Instructors can also create their own web folios and web pages. (TaskStream has these features as do some other platforms.)
  • Students can store their source material (course work, research files) and have access to it 24/7.

Provisions required for web-based ePortfolios: While students are increasingly adept at creating online personas and products, they do not generally design their web work in styles appropriate for demonstrating academic competencies. They need guidelines for what should be in a portfolio and often the support of a template as well. Web platforms can build in such resources.

  • Students in the arts may require additional support in terms of server space. If video and audio clips are suitable for inclusion in the ePortfolio, the institution may need to supplement standard storage allowances or students may need to purchase additional storage. This is not a large cost but may require advance planning.
  • Templates or folio requirements built into the student interface ease the learning curve for students and streamline evaluation by instructors. When these requirements are aligned with Student Learning Outcomes, institutional assessment efforts are also streamlined. Providing a template or requirements takes considerable planning and coordination by faculty, instructional support staff, and administrators.
  • Training and ongoing HelpDesk support are needed for technical issues and also portfolio standards. Instructors or their assistants must anticipate requests for help at least while students are learning how to use new software. (Over the past 7 years that I have coordinated ePortfolio projects, I have seen less and less demand from students for training.) If a commercial platform is used, the vendor may provide a "free" accompanying HelpDesk.

Technology considerations. Web-based portfolios can utilize the basic file types that are common today (e.g., HTML, Flash, PDF) as well as newer rich Internet applications (RIAs) and the interactive options generally referred to as Web 2.0 (this is a YouTube link to a popular explanation of Web 2.0 technology).

Location of platform. An institution's choice of ePortfolio software may hinge on decisions about local or outsourced servers, campus-based or outsourced technical support, institution- or student-paid accounts, and integration with other software (for example, linked to or part of a learning management system or LMS). I lean toward the multiple vendor model, meaning that student ePortfolios should be separate from their LMS for courses. That doesn't sound efficient, I know, but I'll address it in a future post and explore the reasons.

*TaskStream: When I use TaskStream as my example, I always disclose that I have served as a consultant to the company, although as I like to tell clients, I chose TaskStream before TaskStream chose me.

© 2008 Mary Bold, PhD, CFLE. The content of this blog or related web sites created by Mary Bold (www.marybold.com, www.boldproductions.com, College Intern Blog) is not under any circumstances to be regarded as legal or professional advice. Bold is the co-author of Reflections: Preparing for your Practicum or Internship, geared to college interns in the child, education, and family fields. She is a consultant and speaker on assessment, distance learning, and technology. She can be contacted at bold[AT]marybold.com using standard email format instead of [AT].

21 August 2008

Technology Note: Eternal September

As campuses prepare for students in the next couple of weeks, at least some of the folks in IT have a memory of Eternal September, which described the influx (and impact) of new users on computer networks. Across the 1980s, September on campuses meant the arrival of newbies (new computer users) who disrupted the flow of communications on Usenet (think early Internet). As students learned the rules of Usenet, the system returned to normal. Of course, the cycle would begin again the next September.

It was in 1993 that the month became Eternal September: AOL opened the commercial portal to Usenet and newbies flooded the system. (The term Eternal September was actually coined in 1994, with a backward glance.) Internet pioneers bemoaned the activity of the raw beginners who knew no netiquette standards. By sheer numbers, Internet newbies overwhelmed the system for years. They outnumbered the experienced users for a decade or more (the timing is debated but typically set at 2003-2005). At that point, Internet users with at least a year's experience finally outnumbered the newbies and "acculturation" therefore occurred more quickly. Today, the experienced-to-newbie ratio on the wider Internet finally looks more like the ratio that campuses saw on Usenet in the 1980s. Eternal September has ended.

Campus life continues to revolve around Septembers, though, and departments beyond IT plan for and accommodate new students and faculty. There's not a function of higher education that doesn't involve new participants and their incorporation into a culture of practice (including institutional effectiveness and assessment). Luckily, the people hanging around campuses are energized by the start of the school year and they work hard to keep the Septembers short.

© 2008 Mary Bold, PhD, CFLE. The content of this blog or related web sites created by Mary Bold (www.marybold.com, www.boldproductions.com, College Intern Blog) is not under any circumstances to be regarded as legal or professional advice. Bold is the co-author of Reflections: Preparing for your Practicum or Internship, geared to college interns in the child, education, and family fields. She is a consultant and speaker on assessment, distance learning, and technology.

20 August 2008

Accountability for Student Identity

Online education's challenge. The summer is ending with a swirl of commentary about a passage in the new Higher Education Act requiring that institutions ensure that online students are doing their own work. Another market for educational services? Indeed. Already underway.

Electronic monitoring in the lead. If online programs aim to provide a completely off-campus course of study, then vetting the student must also rely on an online strategy. (One alternative would be to require proctored in-person exams, but then the program is no longer fully online, of course.) Among the potential strategies to employ:

1. Web cameras can capture an image of the student and the environment around the computer or work area.

2. User-centric software can create a profile of the student's computer, report on the student's use of online resources during an exam, and even report on the student's typing speed (see the sketch after this list).

3. Even more sophisticated security can provide access to a testing computer via fingerprint, eye scan, etc.
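
To illustrate the typing-speed idea from strategy 2, here is a rough sketch with invented numbers; it is not any vendor's actual method, just the general shape of comparing observed speed against a stored baseline:

```python
# Rough sketch (not any vendor's actual method) of the typing-speed idea:
# given timestamps, in seconds, for each keystroke a student makes, estimate
# characters per minute and compare against a stored baseline profile.
def chars_per_minute(keystroke_times):
    """Estimate typing speed from an ordered list of keystroke timestamps (seconds)."""
    if len(keystroke_times) < 2:
        return 0.0
    elapsed = keystroke_times[-1] - keystroke_times[0]
    return (len(keystroke_times) - 1) / elapsed * 60 if elapsed > 0 else 0.0

baseline_cpm = 180.0  # hypothetical speed from the enrolled student's profile
exam_times = [0.0, 0.4, 0.9, 1.3, 2.0, 2.6, 3.1]  # invented keystroke timestamps
observed = chars_per_minute(exam_times)
flagged = abs(observed - baseline_cpm) / baseline_cpm > 0.5  # crude 50% tolerance
print(f"observed {observed:.0f} cpm, baseline {baseline_cpm:.0f} cpm, flagged: {flagged}")
```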

Such strategies are possible today and the marketplace is now filling with companies ready to pitch their products and services to higher ed.

If you are a faculty member, you are already wondering who will monitor such hardware and software. (You are hoping it's not you or your teaching assistant.)

If you are an administrator, you are already calculating how you will pay for the hardware and software...and the work force to monitor them.

If you are a campus marketer, your thoughts are racing ahead to (a) explaining the student fees that may be imposed to fund the monitoring, (b) recruiting students for a program that suggests students are cheaters, and (c) peddling harder to find new students if the market contracts.

The challenges are no less for the companies seeking to sell security services. Even though we marvel at estimates of tens of thousands of dollars to support a semester's worth of security for an online program, we know the companies are in a developing market with unknown final costs and presumably unknown prospects. Yes, distance learning must live up to a high standard of accountability for student identity—but the standards are not well defined. Is there a prospect of a big market? Yes. But there's also the prospect of changing market segments.

Which institutions can afford to be accountable? A long time ago (that would be in the 1990s), distance learning pioneers wondered out loud if smaller institutions could compete in online offerings. As technology advanced on campuses, it became clear that the cost of distance learning was not a barrier and, in fact, smaller schools could shore up their enrollments by rolling out online courses and degrees. (A more threatening argument also emerged: if a small school wanted to maintain market share, it must include online choices.) Today, as security tools emerge, smaller institutions will once again have to answer whether they can "afford" to run a distance program.

© 2008 Mary Bold, PhD, CFLE. The content of this blog or related web sites created by Mary Bold (www.marybold.com, www.boldproductions.com, College Intern Blog) is not under any circumstances to be regarded as legal or professional advice. Bold is the co-author of Reflections: Preparing for your Practicum or Internship, geared to college interns in the child, education, and family fields. She is a consultant and speaker on assessment, distance learning, and technology.

19 August 2008

Assessment Lexicon Resource

While most books on assessment include a glossary of assessment terms, James Madison University's Dictionary of Student Outcomes Assessment does it better. The online, referenced tool offers a "Textual Search" that retrieves every mention of the word you enter, either as a Term, a Definition, or (most valuable option) both. The search result is a sometimes lengthy table of terms, cross-references, synonyms, definitions, and sources/references.
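
For illustration only (this is not JMU's implementation), the Textual Search behaves roughly like a lookup that matches a query against terms, definitions, or both; the glossary entries below are invented:

```python
# Sketch of the "Textual Search" idea: match a query against terms only,
# definitions only, or both, and return every hit. Entries are made up.
glossary = {
    "formative assessment": "Assessment used to give feedback during learning.",
    "summative assessment": "Assessment of learning at the end of a unit or program.",
    "rubric": "A scoring guide listing criteria and performance levels.",
}

def textual_search(query, where="both"):
    """Return glossary entries whose term and/or definition mention the query."""
    q = query.lower()
    hits = {}
    for term, definition in glossary.items():
        in_term = q in term.lower()
        in_def = q in definition.lower()
        if (where in ("term", "both") and in_term) or \
           (where in ("definition", "both") and in_def):
            hits[term] = definition
    return hits

print(textual_search("assessment"))              # matches terms and definitions
print(textual_search("criteria", "definition"))  # matches definitions only
```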

The JMU Dictionary draws from regional and state accreditation sources researched by the University's Center for Assessment and Research Studies faculty and students. Here's the catch, though it's one easily noticed when you use the tool: the references are dated and some obvious terms are missing.

For example, you won't find NCLB or even much from the Department of Education (what is cited is from the 1990s, predating the passage of NCLB in 2001). The most recent reference date is 2001, although the compilation work was evidently completed in 2003. (A frequent "Retrieved" date for online references is February 2003.)

You will find a sound selection of terminology from the 1990s that continues to serve the assessment community. What I like is the cross-referencing called "Cross Sources" that is provided whether you use the Textual Search or just Browse by Alphabet.

I also like the concomitant message that comes with this Dictionary. A campus culture of assessment does not emerge overnight: JMU's projects reflect a history of sustained focus and support.

© 2008 Mary Bold, PhD, CFLE. The content of this blog or related web sites created by Mary Bold (www.marybold.com, www.boldproductions.com, College Intern Blog) is not under any circumstances to be regarded as legal or professional advice. Bold is the co-author of Reflections: Preparing for your Practicum or Internship, geared to college interns in the child, education, and family fields. She is a consultant and speaker on assessment, distance learning, and technology.

14 August 2008

Tech Tip: Fast Feedback

Saving time at the keyboard appeals for any task, but I'll admit I am drawn to it primarily when staring at a stack of electronic submissions awaiting feedback. While every evaluator may aspire to giving only personalized feedback, a more pragmatic approach is to include the automated variety at least some of the time. When time or sheer numbers demand fast response, these two strategies can help.

Text substitution in any Windows application (including email): Using a small app called Texter, available from Lifehacker, you can devise your own set of codes for text substitution. Line 1 below shows my shortcuts as typed in Texter; line 2 shows the automatically substituted text:

  1. T hed comy looks to schy js for comn abt ass. [45 characters]

  2. The higher education community looks to scholarly journals for communication about assessment. [94 characters]

If you cannot install an app on your office computer, you can accomplish similar text substitution inside Microsoft Word. Access the option called AutoCorrect Options to enter your codes. This is the same feature that permits you to type a string of characters such as (c) and have Word transform it to the symbol ©.
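
For the curious, the substitution idea itself is simple enough to script. Here is a minimal sketch in Python, using the made-up shortcut codes from line 1 above; it is not how Texter or Word works internally, just the same expand-a-code concept:

```python
import re

# Shortcut codes mapped to their expansions, in the spirit of Texter or
# Word's AutoCorrect; the abbreviations here are my own invented codes.
SHORTCUTS = {
    "T":    "The",
    "hed":  "higher education",
    "comy": "community",
    "schy": "scholarly",
    "js":   "journals",
    "comn": "communication",
    "abt":  "about",
    "ass":  "assessment",
}

def expand(text):
    """Replace each whole-word shortcut with its stored expansion."""
    def sub(match):
        word = match.group(0)
        return SHORTCUTS.get(word, word)
    return re.sub(r"\b\w+\b", sub, text)

print(expand("T hed comy looks to schy js for comn abt ass."))
# -> The higher education community looks to scholarly journals for
#    communication about assessment.
```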

© 2008 Mary Bold, PhD, CFLE. The content of this blog or related web sites created by Mary Bold (www.marybold.com, www.boldproductions.com, College Intern Blog) is not under any circumstances to be regarded as legal or professional advice. Bold is the co-author of Reflections: Preparing for your Practicum or Internship, geared to college interns in the child, education, and family fields. She is a consultant and speaker on assessment, distance learning, and technology.

13 August 2008

Online Survey Software

No product endorsements implied... these are the online survey options that have popped up on listservs in recent months, as well as some "old regulars" that have served schools well for years. I've kept my comments to a few key words.

Inquisite - offers 30-day trial, price quotes via web form, names higher education as a leading market, founded 1997.

Limesurvey - free, open source software; currently accepting applications for beta version of limeservice (hosted version).

Psychdata - online sign-up for free account for a free survey and access to manual; does not require naming your institution for the trial; features IRB aspects of surveys; founded 2001.

SPSS Dimensions ASP - hosted online survey services, with tools such as mrinterview and different levels of consulting help; founded 1968.

StudentVoice - campus members can collect data via web-based surveys or handheld devices (PDAs to record responses to in-person polling); this About page includes a link to a list of member institutions; founded 1999.

Surveymonkey - transparent in terms of options and pricing (which includes a limited but free level of service); extremely user-friendly with short learning curve; founded 1999.

Vovici - (voh-VEE-see) - offers a free trial; based on WebSurveyor software tools.

Zoomerang - options and prices are similar to Surveymonkey's; founded 1999.

As noted, these survey services have all received high marks on forums. I have used only 3 of them: Psychdata, Surveymonkey, and Zoomerang. I have applied to the limeservice beta group. The prominent theme for my usage is clear: free or low-cost service. (My Psychdata account was "free" only in the sense that I did not pay for it personally; my university held a site license.) The free or very low cost Surveymonkey and Zoomerang are user-friendly hosted services. Survey writing procedures are pretty intuitive and the creation screens have embedded instructions.

© 2008 Mary Bold, PhD, CFLE. The content of this blog or related web sites created by Mary Bold (www.marybold.com, www.boldproductions.com, College Intern Blog) is not under any circumstances to be regarded as legal or professional advice. Bold is the co-author of Reflections: Preparing for your Practicum or Internship, geared to college interns in the child, education, and family fields. She is a consultant and speaker on assessment, distance learning, and technology.

12 August 2008

Inducements: Heavy & Light

Inducements to learn about assessment. Higher education assessment is not a big-draw topic on most campuses. Unless your institution holds Assessment Days or similar high-profile events, chances are that you have to work very hard to pull in a crowd for training or professional development. Rotating crowd-pleasers from the categories below will keep inducements fresh.

Leadership endorsement
  • President's or Provost's attendance as often as you can swing it
  • Either person's signature on an email invitation

Pay per day
  • Stipend or task payment for attendance or service
  • Token stipend of $25
  • Gift certificate of $10 (to just about anywhere)

Promotional product
  • USB drive with school logo
  • Tape measure with logo
  • Desk fan (unless that's a bad joke in your buildings)
  • Good quality highlighter set (affordable pens don't work, figuratively and literally)

Food
  • Outstanding treat like Chocolate Fondue (advertise it)
  • Box lunch that attendees can take away at the end (advertise it)
  • Full meal in the middle of the day (advertise it)

Faculty and staff will also come for new information. When ePortfolio software was new, I saw a crowd of 72 gather for a webinar. Following a SACS Institute one year, a panel discussion by 4 attendees drew a standing-room-only crowd. A gift is not always needed, obviously. The best strategy is to provide genuinely new information or service but to have a few rewards planned, too.

© 2008 Mary Bold, PhD, CFLE. The content of this blog or related web sites created by Mary Bold (www.marybold.com, www.boldproductions.com, College Intern Blog) is not under any circumstances to be regarded as legal or professional advice. Bold is the co-author of Reflections: Preparing for your Practicum or Internship, geared to college interns in the child, education, and family fields. She is a consultant and speaker on assessment, distance learning, and technology.

07 August 2008

Teaching about Data Collection

This table, Overview of Basic Methods to Collect Information, concisely describes the purpose (along with advantages and challenges) of surveys, interviews, focus groups, etc. The Challenges column, especially, can help a faculty group decide which method to adopt when beginning a new initiative. Recommended use: for people new to assessment.

The publisher, Free Management Library, advises that article authors hold the copyright but that you can link to the table in the Library. Follow the link above and capture the URL for re-use in a web page or email. (See the Library's menu item Copyright and Reprint.)

© 2008 Mary Bold, PhD, CFLE. The content of this blog or related web sites created by Mary Bold (www.marybold.com, www.boldproductions.com, College Intern Blog) is not under any circumstances to be regarded as legal or professional advice.

06 August 2008

The Tracking Portfolio

ePortfolio: my favorite form of assessment. I've been lucky to be in a position to develop assessment methods as technology emerged to support affordable versions. That means a tracking portfolio can be web-based, efficiently storing and scoring student works and saving a lot of trees in the bargain. And I've had the pleasure of designing a few. I'll make my ethical disclosure early: I am a TaskStream fan. The disclosure is that I have served as a consultant to the company, although as I like to tell clients, I chose TaskStream before TaskStream chose me. (In a future article I will explain the criteria I used to compare portfolio platforms; I tested three platforms in four semesters in real-world conditions.)

Dual purpose: tracking and exiting. The ideal use of an ePortfolio is a tracking tool that also serves as the basis for an "exit portfolio," or a representative body of work that is submitted by the student at the end of a degree program. That dual purpose creates a win-win proposition for a program. Especially for a program implementing a portfolio for the first time, both wins should be spelled out. Instituting a portfolio involves a learning curve and change, so an explicit explanation of benefits is needed to assure participants that the "win" is worth the effort.

The student's win:
Portfolio that can be used in a job search
Skill-building in professional presentation
A place to put their stuff

The institution's win:
Tracking mechanism for students across degree
Online rubric scoring of student works
Efficient management of embedded assessments
A place to put our stuff

I have three requirements for a tracking portfolio. The ePortfolio must be web-based, faculty designed, and student maintained. I will address those three topics on August 26, 27, and 28. We'll call it ePortfolio Week. In the meantime, I recommend this web site of the Inter/National Coalition for Electronic Portfolio Research. In systematic assessment with cohort schools, the Coalition tracks portfolio development over 3 years' time. The results are beginning to add up.

© 2008 Mary Bold, PhD, CFLE. The content of this blog or related web sites created by Mary Bold (www.marybold.com, www.boldproductions.com, College Intern Blog) is not under any circumstances to be regarded as legal or professional advice. Bold is the co-author of Reflections: Preparing for your Practicum or Internship, geared to college interns in the child, education, and family fields. She is a consultant and speaker on assessment, distance learning, and technology.

05 August 2008

Accountability

Last month I attended a conference on assessment in Fort Worth and recorded the language as it unfolded from speakers who included consultants, vendors, and host institution administrators.

What was late to unfold: accountability. In fact, we were in session for 55 minutes before the word emerged. I think that wouldn't have been the case a year or two ago. Here are the words that preceded it:
  • scorecard
  • dashboard
  • tracking
  • deploy
  • KPIs
  • strategy maps
  • implementation
  • alignment
  • operational review
  • systematic
  • consistent
  • accountability

There were a lot of other words spoken that day, of course. Enough that I finally quit tracking the vocabulary. But I've found a new diversion for the coming year: my own little content analysis of the lexicon of the assessment world.
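
For the technically inclined, the tally itself is trivial to script. A minimal sketch, with invented session notes standing in for my real ones:

```python
from collections import Counter
import re

# Invented session notes -- stand-ins for what I might jot down at a conference.
notes = """
The dashboard tracks KPIs for each unit; the scorecard rolls them up.
Alignment with strategy maps drives the operational review.
Accountability requires systematic, consistent tracking.
"""

words = re.findall(r"[a-z]+", notes.lower())
lexicon = Counter(words)

# Tally just the assessment buzzwords I care about.
watchlist = ["accountability", "dashboard", "scorecard", "tracking", "alignment"]
for term in watchlist:
    print(f"{term:15s} {lexicon[term]}")
```

The interesting part, of course, is not the counting but deciding which words belong on the watchlist and what their rise or fall says about the field.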

A plug for the people presenting at "Higher Education - Aligning Your Strategic Goals": Jan Lyddon and Bruce McComb (of Achieving the Dream and other consulting experience) led us in a full exploration of strategic thinking about assessment, David Cook and colleagues (of Information Builders) described the Performance Management Framework software in what was surely the least-pressuring vendor demo I've ever seen, and Rebel Jones (administrator at the University of North Texas Health Science Center) demonstrated how his unit collects and analyzes assessment data for a medical school. Good show!

© 2008 Mary Bold, PhD, CFLE. The content of this blog or related web sites created by Mary Bold (www.marybold.com, www.boldproductions.com, College Intern Blog) is not under any circumstances to be regarded as legal or professional advice. Bold is the co-author of Reflections: Preparing for your Practicum or Internship, geared to college interns in the child, education, and family fields. She is a consultant and speaker on assessment, distance learning, and technology.