30 September 2008

College Data on the Web: U-CAN




As the College Portrait: Voluntary System of Accountability (VSA) rolls out its new website (which will be reviewed here later), I looked at the independent college alternative called U-CAN. Profiles of accredited, private, not-for-profit schools are presented on a website that's old enough to already be in its second version.

Fast facts about U-CAN:

  • stands for University and College Accountability Network
  • displays basic facts about schools (admissions, tuition, etc.)
  • sponsored by the National Association of Independent Colleges and Universities
  • data display not limited to NAICU members
  • charges no fees (to schools or to web visitors)

U-CAN launched in 2007 with 600 schools displaying data; the updated web site (called U-CAN 2.0) was launched September 17 with 728 participating institutions and advanced search across 17 variables. The interface is sophisticated with multiple methods for locating schools (including browsing by Institution, by State, by Affiliation/Mission) and fast-loading linked pages. The first view of a school's data is a standard HTML web page, so there is no delay for a file to open. The site design is strong and clearly intended to be user-friendly in typesetting, navigation, and use of color.

Each school's data page follows a standard template that can also be printed in a 2-page PDF file. The display also permits linking to a school's own web site for specific information. A subtle plus: links to the institution's own web site open in new browser windows, allowing exploration of those web pages while still maintaining a window for the U-CAN profile. (For the Abilene Christian University profile, I counted 22 outbound links.)

U-CAN's home page requires a lot of reading to decide where to start, but the organization gets brownie points for including the links that matter to its most youthful visitors: Twitter, Facebook, YouTube, and Wikipedia. I visited them all and found some charming videos on the YouTube link. Another extra, the U-CAN blog, details the history of the program.

© 2008 Mary Bold, PhD, CFLE. Email contact: bold[AT]marybold.com. The content of this blog or related web sites created by Mary Bold (www.marybold.com, www.boldproductions.com, College Intern Blog) is not under any circumstances to be regarded as legal or professional advice. Bold is the co-author of Reflections: Preparing for your Practicum or Internship, geared to college interns in the child, education, and family fields. She is a consultant and speaker on assessment, distance learning, and technology.

25 September 2008

3A Software: Accountability, Accreditation, Assessment

Choose Your A. With no clear front runner, the terms accountability, accreditation, and assessment all refer to a class of software competing in the higher ed market. The purpose of said software encompasses all the terms, of course. Institutions seek one or more tools to demonstrate accountability for learning. The tools must operate in a style that is suitable for accreditation purposes and must support multiple forms of assessment. That's a pretty tall order for any software application and I'm naturally suspicious of any company that claims to have it fully developed.

Range of products. The current software options range from complete content management applications to more specialized tools. That means the prices range from hundreds of thousands of dollars (per year) to the more moderate tens of thousands. Some companies package their accountability software with student portfolio software as a low-cost add-on.

TaskStream AMS (stands for Accountability Management System), Blackboard Outcomes, Tk20, and LiveText are well-known platforms that function as tools for accountability, accreditation, and assessment. All of them have faced the challenge of designing software that can be used efficiently by small schools as well as large schools, and customized for local needs.

Considerations in the selection of software:

  • Purpose: to organize the work of on-going systematic assessment OR ALSO to serve as the location of data collection and analysis.
  • Scope: for use by assessment report writers only OR ALSO by the faculty and staff members who are responsible for assessment on the front line.
  • Display: as an internal platform for campus members only OR ALSO as a platform for sharing with accreditors and other stakeholders.
  • Integration: to be linked to other campus systems OR to stand alone.

(My own preference for an assessment system is to maintain a separate platform. I'll blog more about that this fall.)


24 September 2008

College Rankings

Whether you live by them or not, published college rankings affect every campus, even if only through the effort expended in discounting them. If the institution feels the pressure to actually rank, by which we always mean rank highly, then a lot more energy must be expended.

To the public, rankings appear organized and orderly. Almost always employing highly readable charts, magazines like U.S. News & World Report seem to support students' and parents' comparison shopping. Resisting the assumptions that follow (College A is better than College B) is difficult.

Research voice. The Beyond Ranking Project of the Education Conservancy provides a contrast that includes a research base about ranking. Students' voices were captured in EC's 2007 survey on college admissions and the web summary includes a sampling of their comments (scroll to bottom of that web page).

Corporate voice. The Boeing Company is making its own ranking of engineering schools based largely on employee evaluations of the graduates of those schools. While college assessment officers frequently seek (or at least desire) employer input about graduates' readiness for the workplace, feedback on actual job performance has not been common. An approximation is sometimes gathered through evaluations of interns or students in practicum placements, but always with the emphasis on pre-professional competency. A "hire rate" with an internship site is sometimes established, but that's also a function of the economy, not just interns' performance. So, the academy will have some adjustments (in thinking and maybe even in practices) if Boeing's program receives a warm reception. (Boeing will not publish a list, but right about now it will begin notifying deans of how their schools "ranked" in the Boeing review of data. I think we can expect some sharing of news, at least by happy deans.)

For sheer enjoyment, contrast all of the above with Stuart Rojstaczer's website, College Ranking Service.


23 September 2008

Projected College Enrollment for 2017

Projections of student enrollments to 2017 have been released by the Department of Education. As the graphic above for higher education indicates, numbers will be up. The age breakdown shows a slower growth rate for younger students and slightly brisker growth rates for students up to age 35.

(This graphic is enhanced for web viewing. Find the original (and many more) in the PDF publication available at the NCES web site.)

This chart reports actual figures from 1997 and 2006 for the six age brackets. The light blue bars reflect the projected number of students for 2017, in millions. The NCES report goes into more detail, with high and low projections, as well. (The light blue bars here are based on the report's "middle alternative" figures, which are thought to be the most likely projections.)


18 September 2008

Internet Resource: Turning the Pages


The British Library collection called Turning the Pages places historical texts online in a fashion that does justice to the old books as well as to the technology. The short list of texts: Sketches by Leonardo (his personal notebook), the Diamond Sutra (oldest printed book dated 868), first atlas of Europe (Mercator in the 1570s), Jane Austen's History of England.

The hyperlink above is a preferred starting point that explains options according to your computer and operating system. The fast way to find your version is simply to click on a book title and see what will load. If you have problems, then delve into the small print to identify your setup: a PC or Mac using the Shockwave plug-in, the Mac OS X alternative download, or version 2.0 for Microsoft Vista and Windows XP with the .NET 3 framework.

Controls across the bottom of the interface open a Help screen that explains all functions; for some books, a hyperlinking list of Contents; Text for translation; Notes for explanation; Audio; and Magnify. (Some books have more controls than others.)

Less animation but the same text, enlargements, and some audio: If you cannot install the Shockwave plug-in or you need a more accessible page, the Library also provides a set of pages with static images.


17 September 2008

Updated Locations for HEOA Hearings

This week's opening of hearings for HEOA—Higher Education Opportunity Act of 2008—is on schedule for Friday, September 19, at Texas Christian University (TCU) in Fort Worth, Texas. The TCU location is the Brown-Lupton University Union. Locations for all 6 hearings are now posted at a dedicated Dept of Ed web page. The list conveniently includes map links.

The hearings are a precursor to the standard negotiated rulemaking that accompanies the long-awaited legislation. At the public meetings, anyone can request a short speaking time; sign up for that at the start of the day. The Department will also accept written comments up to 8 October 2008. (See web page for those details.)


16 September 2008

Checklists for Evaluation Projects in Higher Ed

The Evaluation Checklist Project at Western Michigan's Evaluation Center presents a menu of checklists (blind, peer-reviewed) across categories such as evaluation management, models, criteria, institutionalization, and even checklist creation. Most of the entries are in checklist format; just a few are text-heavy.

In the Institutionalization category, two complementary lists can be put to good use to introduce the main concepts of institutional-level evaluation to stakeholders new to the process. (The two lists are described below.) A related 2002 list under Evaluation Management is also very short; it serves the same purpose for the evaluation staffer who must sell the process. The title is apt: Making Evaluation Meaningful to All Education Stakeholders.

Institutionalizing Evaluation Checklist (2002) - Daniel Stufflebeam's list of 17 steps fits on one page and, with its repeated mention of stakeholders, can serve as a "big picture" view of evaluation for those stakeholders who don't know the scope of an evaluation system. The title clearly leads to an institution-wide view.

A Checklist for Building Organizational Evaluation Capacity (2007) - Boris Volkov and Jean King cite the 2002 work by Stufflebeam as complementary to the newer work, but stress that their emphasis is evaluation capacity building (ECB). Their 8 steps consist of a total of 35 sub-steps in the pattern of a general directive (cultivate a positive...context) answered with specific actions (make sure that key leaders...).

I also especially like two of the entries under Evaluation Models: Constructivist and Qualitative. They are among the longer checklists but that's a fit for their models.

On the Checklist web page (not on this blog page), allow your cursor to hover over each title for an abstract. With the exception of one Excel file, the checklists are presented as PDF files; the file sizes are listed on the web site but in the list below I have augmented that information with page counts. Also, I have added the publication year here because that's what I wanted to know when I began exploring the site. The list below follows the same pattern as the web site's for ease of location.

Evaluation Management
1999 - 5 pg - Plans and Operations - Stufflebeam
2001 - 6 pg - Budget Development - Horn
1999 - 1 pg - Contracts - Stufflebeam
2004 - 5 pg - Design - Stufflebeam
1976 - 1 pg - Negotiating Agreements - Stake
2001 - 2 pg - Feedback Workshops - Gullickson & Stufflebeam
2004 - 3 pg - Reports - Miron
2002 - 3 pg - Making Evaluation Meaningful to All Education Stakeholders - Gangopadhyay

Evaluation Models
2007 - 16 pg - CIPP Model - Stufflebeam
2001 - 15 pg - Constructivist (a.k.a. Fourth Generation) Evaluation - Guba & Lincoln
2000 - 2 pg - Deliberative Democratic Evaluation - House & Howe
2007 - 22 pg - Key Evaluation Checklist - Scriven
2003 - 13 pg - Qualitative Evaluation - Patton
2002 - 6 pg - Utilization-Focused Evaluation - Patton

Evaluation Values & Criteria
2001 - 3 pg - General Values and Criteria - Stufflebeam
1994 - 1 pg - Duties of the Teacher - Scriven
n.d. - 3 pg - Educational Products - Scriven
2001 - 3 pg - Institutionalization of Technology in Schools - Nelson, Post, & Bickel
1977 - 32 pg - Large-Scale Assessment Programs - Shepard
2002 - 12 pg - Research & Development Centers - Stufflebeam

Metaevaluation
2005 - 44 pg - AEA Guiding Principles - Stufflebeam, Goodyear, Marquart, & Johnson
2001 - 4 pg - Legal Viability of Personnel Evaluations - Stufflebeam & Pullin
2000 - 7 pg - Personnel Evaluations - Stufflebeam
2000 - 7 pg - Personnel Evaluation Systems - Stufflebeam
1999 - 8 pg - Program Evaluations (short version) - Stufflebeam
1999 - 11 pg - Program Evaluations (long version) - Stufflebeam
1999 - 11 pg - Program Evaluation Models - Stufflebeam

Evaluation Capacity Building & Institutionalization
2007 - 4 pg - Evaluation Capacity Building - Volkov & King
2002 - 1 pg - Institutionalizing Evaluation - Stufflebeam

Checklist Creation
2000 - 10 pg - Checklist Development - Stufflebeam
2003 - 3 pg - Checklist for Formatting Checklists - Bichelmeyer


11 September 2008

Upcoming Assessment Hearings & Conferences

HE Act: Hearings on the Higher Education Act
Focus: Preliminary hearings to inform the regulation-writing that is generally called "negotiated rulemaking."

Announcement in the Federal Register about the hearings:
http://edocket.access.gpo.gov/2008/E8-20776.htm

Dept of Ed page that will be updated with more details about the hearings:
http://www.ed.gov/policy/highered/leg/hea08/index.html

Dates and Locations of Hearings:
Sept 19: Texas Christian University, Fort Worth, TX
Sept 29: University of Rhode Island, Providence, RI
Oct 2: Pepperdine University, Malibu, CA
Oct 6: Johnson C. Smith University, Charlotte, NC
Oct 8: US Department of Education, Washington, DC
Oct 15: Cuyahoga Community College, Cleveland, OH

Upcoming Conferences:

F-Y Assmt: National Conference on First-Year Assessment
Oct 12 - 14, Hyatt Regency San Antonio, TX

SAIR: Southern Association for Institutional Research
Oct 18 - 21, Sheraton Nashville Downtown, TN

Assmt Inst: 2008 Assessment Institute
Oct 26 - 28, The Westin Indianapolis, IN


10 September 2008

Students from another planet

Or at least another world.

If the avatar (online personage) at right is not a familiar illustration, you haven't been to the Habbo Hotel. But you might want to go because that's where a lot of your future students are now.

Virtual world perspective. Research from KZero assures us that Habbo is bigger than Second Life (SL), the virtual world that most folks in higher ed have heard of, visited, or resisted. Why should we care about these online spaces? Because if we consider only real world (RL) assessment, we will mistakenly assume that measuring global perspective (yesterday's post) is enough.

The KZero worldview. KZero (double click on the web site's charts for readability) does not pretend to predict the future of virtual learning environments (our educational application) or even the growth of online communities. But the message from the numbers is undeniable: the next wave of college students will arrive on campus not only having used computers for their whole school careers (as in 100% of them) but also having created their own alternate existence in one or more social networks.

Experience gap. The generation gap becomes an experience gap, but not with the older generation being able to claim to be more experienced. While commerce, marketing, and entertainment dominate online experiences such as Habbo and Second Life (and gaming such as World of Warcraft and social networking such as Facebook), educational use is growing. Assessment of that use will grow from the work of the pioneering educators but it will be more valuable if assessment administrators are already familiar with these developments.

Personal experimentation. The graphic here is my own avatar in Habbo Hotel. I modeled it after my junior high appearance, so anyone who knows me in Second Life may see only a slight resemblance. The study of how humans navigate virtual worlds is well under way and already has theories about why some people (like me) aim for approximations of real name and real looks in creating their avatars. Other folks create very different online personalities and, no, our older term of pseudopersonality just doesn't suffice anymore. Similarly, behavior in an online world may be very different from real life (RL). I don't mind disclosing that my first action in my Habbo Hotel room was to drag my stool to the window "for the view." Just what I do in real hotels, too.


09 September 2008

Measuring "Global Perspectives"

An instrument aimed at measuring undergraduate students' "global perspective" has emerged from the pilot stage to a norming group of 2500+ respondents. (Instrument development since 2007 has included about double that number.) The 46-item Global Perspectives Inventory is described on the web site of Global Perspectives Institute (http://www.gpinv.org), along with its history (see menu item Development of the GPI on that site).

An online instrument, the GPI relies on self-report. Results are not returned on individuals; the institution receives results at the group level. See the homepage text of the web site for a code to use to access a copy of the survey.

The GPI is currently affordable, with this offer good through June 2009: $500 for two administrations, scoring and interpretation provided by the company. I checked with the company to confirm that there is no top limit for number of students at this price.


04 September 2008

Transparency: Results from Outside Reviews

The Quality Matters (QM) web site lists distance learning "Courses Recognized," meaning that the courses successfully passed the QM review that is billed as "inter-institutional quality assurance in online learning." (Click on Course Reviews on the web site menu for years 2004 through 2008.)

While no accreditation claim is made, the QM review serves as a means for external peer review that an institution can report to its stakeholders. Distance learning courses continue to be dogged by charges of lesser quality than their on-ground counterparts and schools understandably seek a good housekeeping seal of approval. QM provides that endorsement through a peer review team that focuses on course design according to aligned standards.

QM grew out of a FIPSE grant at MarylandOnline, a state consortium of two- and four-year institutions. The QM organization now has more than 120 subscribers in more than 30 states. Through a fee-based subscription, an institution can submit distance learning courses for review. Not all courses pass—but all receive thorough feedback.

The Quality Matters research-based rubric has just been revised, demonstrating that it is an evolving instrument. As a national reviewer of 2-3 courses per month, I have used the new rubric in its launch. It's tougher.


03 September 2008

Transparency and Grade Creep/Inflation

The Chronicle recently highlighted Princeton University's well-known effort to address grade inflation, and I followed up with a web search. A friend who has taught at PU recently told me about the institution's initiative, and not in wholly positive terms. Regardless of how quickly or slowly the goals are met, the level of transparency is admirable. (See link below for the actual results.)

The strategy aims at departmental numbers, not individual course grades. This is how the policy is summarized on the website:

A’s (A+, A, A-) shall account for less than 35 percent of the grades given in undergraduate courses and less than 55 percent of the grades given in junior and senior independent work. The standard by which the grading record of a department or program is evaluated is the percentage of A’s given over the previous three years.
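The arithmetic behind the policy is straightforward to sketch. The short Python example below pools a department's grade counts over the previous three years and compares the A-share to the 35 percent undergraduate-course threshold. The grade counts and function names here are my own invention for illustration; nothing in this snippet comes from Princeton's actual data or systems.

```python
def a_rate(grade_counts):
    """Percentage of A-range grades (A+, A, A-) among all grades given."""
    a_grades = sum(grade_counts.get(g, 0) for g in ("A+", "A", "A-"))
    total = sum(grade_counts.values())
    return 100.0 * a_grades / total

# Hypothetical pooled grade counts for one department over three years:
dept_three_year = {"A+": 40, "A": 210, "A-": 150, "B+": 300, "B": 250, "C": 50}

rate = a_rate(dept_three_year)
meets_course_target = rate < 35.0  # policy threshold for undergraduate courses
print(f"{rate:.1f}% A's; meets 35% target: {meets_course_target}")
```

In this invented example the department's three-year A-share is 40 percent, so it would fall short of the target; the same function with a 55 percent threshold would cover the independent-work standard.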

Results for the first 3 years of the initiative appear in Word (.doc) files at the web site of the Dean of the College (http://www.princeton.edu/odoc/faculty/grading/results/).


02 September 2008

Transparency and Course Evaluation

As higher education responds to calls for accountability, the Internet supports the effort. Increasingly, institutions use web sites to display data, not just periodic reports for accreditation. For a close-up look at a Colorado school's course evaluations, read on.

The University of Colorado at Boulder maintains a web view of students' ratings of courses and instructors. From the web site of the Office of Planning, Budget, and Analysis, anyone can check on results of the Faculty Course Questionnaire [click on the top link in the main window: FCQ results]. On the search page, scroll down for options of most interest to outsiders (including the option of display in a table in the browser as opposed to an Excel file).

When results appear on screen, you can click on a recurring link for the Guide to Interpretation. This text is directly from the Guide:

Information from FCQs is used by
  • students for selecting courses and instructors,
  • instructors for improving their teaching, and
  • deans and department chairs for promotion, tenure, salary, and course-assignment decisions.

Yes, the results really are used for these things!

Remember that several factors in addition to quality of instruction may influence the ratings. These include department, class size, class level, and whether the class is required or an elective.
