Criticism of college and university rankings (2007 United States)

Criticism of college and university rankings (2007 United States) refers to a movement that developed in 2007 among faculty and administrators at American institutions of higher education. It follows earlier movements in the U.S. and Canada (by schools such as Reed College, Stanford University, and Alma College in the 1990s, and by a number of Canadian universities in 2006) that criticized the practice of college rankings.

Sarah Lawrence College

In 2007, some educators in the United States began to question the impact of rankings on the college admissions process, due in part to the 11 March 2007 Washington Post article "The Cost of Bucking College Rankings"[1] by Dr. Michele Tolela Myers (the former President of Sarah Lawrence College). Because Sarah Lawrence College dropped its SAT submission requirement for undergraduate applicants in 2003[2] (thus joining the SAT-optional movement for undergraduate admission), SLC has no SAT data to send to U.S. News for its national survey. Of this decision, Myers states, "We are a writing-intensive school, and the information produced by SAT scores added little to our ability to predict how a student would do at our college; it did, however, do much to bias admission in favor of those who could afford expensive coaching sessions."[1] Sarah Lawrence is currently one of only a few American colleges that completely disregard SAT scores in the admission process.[3]

As a result of this policy, in the same Washington Post article, Dr. Myers stated that: "I was recently informed by the director of data research at U.S. News, the person at the magazine who has a lot to say about how the rankings are computed, that absent students' SAT scores, the magazine will calculate the college's ranking by assuming an arbitrary average SAT score of one standard deviation (roughly 200 points) below the average score of our peer group. In other words, in the absence of real data, they will make up a number. He made clear to me that he believes that schools that do not use SAT scores in their admission process are admitting less capable students and therefore should lose points on their selectivity index." [1][4]
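The imputation rule Myers describes is simple arithmetic: a school that withholds SAT data is scored as if its average were one standard deviation (roughly 200 points) below its peer group's mean. The sketch below is an illustrative reconstruction of that rule only, not U.S. News's actual code or methodology; the function name and figures are hypothetical.

```python
def impute_missing_sat(peer_scores, reported_score=None, sd=None):
    """Return the school's reported SAT average if it exists; otherwise
    assume one standard deviation below the peer-group mean, as described
    in Myers's account of the U.S. News policy."""
    mean = sum(peer_scores) / len(peer_scores)
    if reported_score is not None:
        return reported_score
    if sd is None:
        # sample standard deviation of the peer group
        var = sum((s - mean) ** 2 for s in peer_scores) / (len(peer_scores) - 1)
        sd = var ** 0.5
    return mean - sd

# With a peer average near 1350 and an SD of roughly 200 points, a
# non-reporting school would be scored as if its average were about 1150.
```

The criticism in the article follows directly from this rule: the imputed value is a penalty chosen by the magazine, not a measurement.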

Myers further stated that "several faculty members and deans suggested that perhaps it was time to stop playing ranking roulette and opt out of the survey."[1] Myers next argued that at the 33rd Annual Conference of NEAIR (North East Association for Institutional Research) in 2006, a talk given by U.S. News[5] "indicated that if a school stops sending data, the default assumption will be that it performs one standard deviation below the mean on numerous factors for which U.S. News can't find published data. Again, making up the numbers it can't get. The message is clear. Unless we are willing to be badly misrepresented, we had better send the information the magazine wants."[1][4]

U.S. News and World Report official response

U.S. News and World Report issued a response to this article on 12 March 2007 which stated: "Sarah Lawrence's decision is unique, and the magazine's handling of it is still under consideration. Some colleges have made SAT or ACT scores optional in the admissions process, but to our knowledge, no other major college has decided to disregard them completely. Our rankings are painstakingly tabulated, using the best data available. U.S. News data researchers regularly participate in briefings and conferences where the most complicated nuances of the process are discussed with the ranked institutions. We regularly adjust to changes in the educational environment, and we plan to address this circumstance in a similar manner." [3]

Presidents Letter

The Presidents Letter (dated May 10, 2007), developed by Lloyd Thacker of the Education Conservancy, was sent to college and university presidents in the United States in May 2007, concerning the U.S. News and World Report college rankings. The letter does not ask for a full boycott but rather states that:

while we believe colleges and universities may want to cooperate in providing data to publications for the purposes of rankings, we believe such data provision should be limited to data which is collected in accord with clear, shared professional standards (not the idiosyncratic standards of any single publication), and to data which is required to be reported to state or federal officials or which the institution believes (in accord with good accountability) should routinely be made available to any member of the public who seeks it.[6]

Instead, it asks presidents not to participate in the "reputational survey" portion of the overall survey (this section accounts for 25% of the total rank and asks college presidents to give their subjective opinion of other colleges). The letter also asks presidents not to use the rankings as a form of publicity:

Among other reasons, we believe [...] rankings: imply a false precision and authority that is not warranted by the data they use; obscure important differences in educational mission in aligning institutions on a single scale; say nothing or very little about whether students are actually learning at particular colleges or universities; encourage wasteful spending and gamesmanship in institutions' pursuing improved rankings; overlook the importance of a student in making education happen and overweight the importance of a university's prestige in that process; and degrade for students the educational value of the college search process. We ask you to make the following two commitments: 1. Refuse to fill out the U.S. News and World Report reputational survey. 2. Refuse to use the rankings in any promotional efforts on behalf of your college or university, and more generally, refuse to refer to the rankings as an indication of the quality of your college or university.[6]

List of colleges and universities

Twelve college and university presidents originally signed the letter in early May.[7] The letter currently has sixty-one signatures, though others may be added at a later date.[8]

Annapolis Group meeting

On 19 June 2007, during the annual meeting of the Annapolis Group (which represents over 100 liberal arts colleges), members discussed the letter to college presidents. As a result, "a majority of the approximately 80 presidents at the meeting said that they did not intend to participate in the U.S. News reputational rankings in the future." [9] However, the decision to fill out the reputational survey or not will be left up to each individual college as: "the Annapolis Group is not a legislative body and any decision about participating in the US News rankings rests with the individual institutions." [10]

The statement also said that its members "have agreed to participate in the development of an alternative common format that presents information about their colleges for students and their families to use in the college search process." [10] This database will be web based and developed in conjunction with higher education organizations including the National Association of Independent Colleges and Universities and the Council of Independent Colleges.

The new database was described in TIME magazine as "a web-based alternative to the rankings that is being spearheaded by the 900-member National Association of Independent Colleges and Universities. NAICU's easy-to-read template, which is expected to be rolled out by hundreds of schools in September, allows students and their families to pull up extensive information organized in an objective format that includes such data as what percentage of students graduate in four years compared to those who graduate in five or six years." [11]

U.S. News and World Report official response and debate

On 22 June 2007, U.S. News and World Report editor Robert Morse issued a response in which he argued, "in terms of the peer assessment survey, we at U.S. News firmly believe the survey has significant value because it allows us to measure the "intangibles" of a college that we can't measure through statistical data. Plus, the reputation of a school can help get that all-important first job and plays a key part in which grad school someone will be able to get into. The peer survey is by nature subjective, but the technique of asking industry leaders to rate their competitors is a commonly accepted practice. The results from the peer survey also can act to level the playing field between private and public colleges." [12]

In reference to the alternative database discussed by the Annapolis Group, Morse also argued, "it's important to point out that the Annapolis Group's stated goal of presenting college data in a common format has been tried before [...] U.S. News has been supplying this exact college information for many years already. And it appears that NAICU will be doing it with significantly less comparability and functionality. U.S. News first collects all these data (using an agreed-upon set of definitions from the Common Data Set). Then we post the data on our website in easily accessible, comparable tables. In other words, the Annapolis Group and the others in the NAICU initiative actually are following the lead of U.S. News." [12]

A debate on this issue was published as a podcast in the 25 June 2007 issue of Inside Higher Ed. The debate was between Lloyd Thacker, director of the Education Conservancy and a well-known critic of the U.S. News rankings, and U.S. News editor Brian Kelly. It was moderated by Inside Higher Ed reporter Scott Jaschik.[13]

Statement from Annapolis Group chair

Chair of the Annapolis Group, and president of Gettysburg College, Katherine Haley Will discussed this decision further in a 9 July 2007 article for The Washington Post. In the article, Will states that the decision was not based upon "a lack of concern about providing accurate, comprehensive information to help students and their families make decisions about college." Rather, she argued against the methodology of the U.S. News rankings. In particular, she argues against "the largest single factor in the U.S. News rating formula," the reputational survey, as "it is unrealistic to expect academic officials to know enough about hundreds of institutions to fairly evaluate the quality of their programs." Will then argues that, "by contrast, 1 percent of the U.S. News ratings formula is assigned to student-to-faculty ratios, which many faculty members and students consider the most important factor in educational experience." Will states that the members of the Annapolis Group will offer the same information in an alternative, free format that will not rank schools, as "an educational experience can't be reduced to one number, a school's so-called rank. The simplicity of a rank is understandably more appealing than spending hours poring over college catalogues and visiting campuses, but myriad complex variables can't be reduced to a single number." Instead, Will asks students and parents to "compare schools on a variety of factors [...] they should visit campuses and go on what feels like a good match rather than relying on filtered or secondhand information. We must encourage students to look inside their hearts and trust their instincts when it comes to choosing a college, not whether parents or friends think a university is cool or prestigious."[14]

College presidents: responses

A number of presidents have issued responses to these events. One of them, Presbyterian College president John Griffith, compared this movement to a form of revolution: "I have lived long enough to come to the conclusion that major shifts occur every quarter century or so in the way American culture approaches matters of importance. We often call those shifts revolutions because people revolt against old and outmoded ways of doing things in favor of new approaches, new technologies and new ideas that better meet the needs of the time. We have experienced revolutions in information technology, travel and communication. There is one going on now that is symbolized by the introduction of iPhones this past week; we know what this one is about. But there is another revolution going on related to choosing a college -- and the role that public rankings play in that choice -- that may be less clear." [15]

Economics and endowment

Presidents have also discussed the role of endowment, correlating a high ranking on the survey with institutional wealth. President of Muhlenberg College Peyton Helm argued that "most of the other factors weighted by U.S. News in their rankings (in a secret formula they will not reveal, that is changed every year, and that independent researchers have been unable to replicate) are based, ultimately, on institutional wealth [...] A trustee once asked me what it would take for Muhlenberg to be ranked in the top five by U.S. News. My answer was simple: A check for $800 million placed directly in the endowment would do it -- even if we never changed another thing we were doing." Helm also noted that "what you won't read in U.S. News is that most of the data they use is public information, readily available on the Web sites of most colleges and universities, as well as on the U.S. Department of Education Web site. There is no single formula for weighting these factors -- they will have different significance for different students and families. So, next year I and many other leaders of our nation's best colleges and universities will be working on a new and better Web-based tool for families engaged in the college search."[16] Millsaps College president Frances Lucas further noted that "she previously had paid little attention to the rankings debate because her own institution was rated highly in U.S. News. But after learning more about the magazine's methodology and discussing the issue with colleagues at this week's meeting, she concluded that the rankings were based too heavily on measurements determined by institutional wealth."[17]

Historically Black Colleges and Universities (HBCUs)

President Walter Kimbrough of the historically black Philander Smith College argued that U.S. News "focuses on institutional resources, student selectivity and graduation rates to select the top institutions. But since many HBCUs struggle with these issues, he says the rankings in effect discourage students from going to those schools [...] If there are people looking at the rankings as a measurement of the quality of an institution, they think [HBCUs] do not have any type of qualities [...] [The rankings] do not tell you who the best schools are, just the most privileged."[18][19]

Reputational survey

Former president of Sarah Lawrence College, Michele Tolela Myers, in discussing her decision to no longer submit information to U.S. News, stated, "they will do what they will do, [...] we will do what we will do. And we want to do it in a principled way."[20] Myers also indicated in a press release for the college magazine, Sarah Lawrence, that the college will be involved in developing the new database of colleges discussed in the Annapolis Group statement as they "believe in accountability and openness, and that the public has a right to solid and reliable information about the important decisions involved in choosing a college." The press release also indicated that Sarah Lawrence "plans not to participate in the peer reputational survey or data collection for U.S. News and World Report’s rankings" as, according to Myers, "by submitting data and the peer reputation survey we have tacitly been endorsing these rankings [...] all the information we have provided to U.S. News in the past will be available to the public through other channels.” [21]

Other presidents have also commented on the reputational survey. Former president of Scripps College, Nancy Y. Bekavac, stated in a press release on the college website that Scripps will no longer submit the reputational survey to U.S. News as "for years we have known of flaws in the methodology; many of us have spoken with editors at U.S. News in an attempt to improve its approach [...] but nothing can really improve a system that seeks to reduce 3,300 educational programs in American higher education to one set of numbers, and then rank them. College presidents, academic deans and deans of admission do not know enough about other institutions to make meaningful comparisons. This gives a false sense of reliability to what is a ranking system without any real validity."[22] Sweet Briar College president Elizabeth S. Muhlenfeld stated that "one of our colleagues likened it to trying to rank composers. It's a great analogy. How can you say that Beethoven and Brahms are better than Mahler or Mozart?"[23] Trinity Washington University president Patricia McGuire argued that "the survey asks me to 'rate the academic quality of undergraduate programs,' assigning each school a single score using a 1-to-5 scale from 'marginal' to 'distinguished.' That I have little real information about these 181 institutions does not seem to matter to the U.S. News editors [...] Some of the actual best colleges in this nation do not fare well in the U.S. News survey because they do not have the wealth, big-time sports notoriety or public relations clout to influence the peer voting system."[24] Finally, DePauw University president Robert G. Bottoms argued that "I, in fact, did not fill out the reputational survey for this past year. I came to the conclusion that I am not in a position to make judgments on other schools, many of which I have little or no familiarity with. The fact that one quarter of a college's ranking is based upon what is, in essence, its popularity, is very disturbing and we choose not to be a part of the process."[25]

Third-party involvement

Catharine Bond Hill, president of Vassar College, took a different approach to the question of the reputational survey. She argued that "many of us in higher education dislike popular college rankings such as the annual academic beauty pageant from US News & World Report. But expecting them to go away is naive, and attempting to undermine them is unwise since students and families could perceive that as petulant and paternalistic. Worse, it could seem as if we have something to hide." Rather than withholding the reputational survey, she argued, it would be of value to focus on "a third-party non-profit or foundation," sending it "the same data that we already submit to US News and other rating organizations." On this point, she argues, "a one-size ranking does not fit all, because students and families care about different things [...] What if a school doesn't use the SAT in making admissions decisions and therefore doesn't collect or report these data? In a new system, that school couldn't be ranked if a student chose a positive weight for the SATs. Students would know that the school doesn't value that piece of information. They could then run the rankings with other information (maybe class rank and other indicators of academic achievement), excluding the SAT, and see what those rankings look like. Alternatively, they could decide they actually do care about the average SATs of the student body and decide to look at other schools. Fair enough."[26]
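Hill's proposal amounts to a user-weighted ranking in which a school missing a chosen metric is simply excluded, rather than being assigned a made-up value as in the U.S. News imputation dispute above. The following is a minimal sketch of that idea under stated assumptions; the school names, metric names, and numbers are all hypothetical, not real institutional data.

```python
def custom_rankings(schools, weights):
    """Rank schools by a user-chosen weighted score. A school that does
    not report a metric the user has weighted is left unranked instead
    of being penalized with an imputed value."""
    ranked = []
    for name, metrics in schools.items():
        if any(metrics.get(key) is None for key in weights):
            continue  # school doesn't report this metric; exclude it
        score = sum(weights[key] * metrics[key] for key in weights)
        ranked.append((name, score))
    return sorted(ranked, key=lambda pair: pair[1], reverse=True)

# Hypothetical data, normalized to 0-1 for illustration.
schools = {
    "College A": {"sat": 0.9, "class_rank": 0.7},
    "College B": {"sat": None, "class_rank": 0.95},  # SAT-optional school
}

# Weighting SATs excludes College B from the list; weighting class rank
# alone includes both schools, exactly as Hill's scenario describes.
```

The design choice mirrors her argument: exclusion signals "this school doesn't value that piece of information," and the student can re-run the ranking with different weights.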

Faculty, scholars, administrators, and journalists: responses

Economics and endowment

Others have discussed the correlation between economics, college choice, and the rankings. David McGrath, emeritus professor of English, College of DuPage, discussed his own decision to attend Chicago State University in the 24 July 2007 Chicago Tribune article, Ode to a fourth-tier college. Of this decision, he noted that, "I qualified for admission elsewhere, but CSU was close to my part-time job, and it was cheap [...] I never required a student loan since I earned enough as a grocery bagger to pay tuition and fees in 1970 that totaled $300 per year. All told, a pretty good value, even for a fourth-tier school." McGrath considered it a "good value," because, "CSU eventually led to a teaching career, and my working alongside professors from Princeton, Northwestern, and the University of Chicago. I earned the same ample salary and benefits as they, and, more important, was privileged to engage in the same kind of fulfilling work." He also referenced the 2000 Krueger-Dale study (which compared groups of students who received the same SAT scores, attended both high and low-income schools, and found no difference in post-graduation success rates [27]) and noted that, "too often, it seems, students choose colleges the way they choose jeans or athletic shoes. They would rather bust the family budget than be caught dead in sweats bearing an unrecognizable school logo. But it's their ability, work ethic and dedication that determine the height of their achievement." [28]

Author and journalist, Peter Sacks, narrows the argument by suggesting a direct correlation between the wealth of school and its rank. He suggests that "the ranking amounts to little more than a pseudo-scientific and yet popularly legitimate tool for perpetuating inequality between educational haves and have nots -- the rich families from the poor ones, and the well-endowed schools from the poorly endowed ones. Toss in the most heavily weighted factor in the U.S. News survey, the assessment of deans, college presidents, admissions officials and others regarding their peer institutions (a beauty contest that constitutes a full 25 percent of the U.S. News ranking), and you get the perfect recipe for a self-perpetuating, class-based rankings system driven by brand names, marketing hype, and prestige."[29]

Where colleges recruit: socioeconomic status and SAT/ACT scores

In the summer of 1996, Marshall University sociology professor Dr. William Westbrook was having a conversation with a recent Master's graduate. The graduate asked about Marshall's traditional admissions policy, which appeared to have a lower SAT/ACT requirement than his undergraduate school, Shepherd University, which had a "selective" admissions policy and an apparently higher SAT/ACT requirement.

The graduate drew an incorrect inference from these averages, assuming that Shepherd was arbitrarily turning away applicants who scored below some fixed ACT or SAT cutoff.

Dr. Westbrook clarified the situation by first explaining that every university or college will meet its enrollment quota with the best students available, and then asking where Marshall University and Shepherd University each got their students.

Marshall typically recruited from Cabell, Logan, Wayne, and Putnam Counties, WV, and Lawrence County, OH.

Shepherd typically recruited both from West Virginia, locally in the Eastern Panhandle as well as elsewhere in the state, and from the comparatively wealthy Maryland and Virginia suburbs surrounding Washington, D.C.

Dr. Westbrook made the point that SAT and ACT scores are an indirect measure of socioeconomic status. County K-12 school boards are funded by property taxes assessed on home values, and near the end of high school, a college-bound student takes the SAT or ACT.

The difference between Shepherd University and Marshall University is that Shepherd recruited more from areas where home values were higher, homeowners paid more in property taxes, and school systems were consequently better funded. Students in those systems benefited as a result, took the SAT or ACT, and some of them applied to Shepherd, were accepted, and attended.

Colleges and universities, then, have less control over their apparent ACT and SAT requirements than the graduate assumed and than most parents might imagine. Large-scale political and economic forces, such as the level of federal employment and commensurate incomes around Washington, or other regional differences in living standards, have more to do with setting those requirements than the decisions of college committees trying to enroll the best students they can under circumstances they cannot control.[30]

Gaming and methodology

Assistant to the Dean for the University of North Carolina School of Law, Sarah E. Wald, noted, "the rankings purport to give an overall order to colleges and graduate schools to help students make the best decisions about where to attend school. But universities all know how misleading and even destructive these rankings can be. It's common knowledge how the statistics can be 'gamed.' Colleges can solicit applications from students with little chance of acceptance to boost how selective they appear. Schools can adjust when they allow faculty to take leave in order to raise the faculty/student ratio. And admitting more "risky" students on transfer rather than in the initial class results in a higher freshman SAT average."[31]

Professor Marty Kaplan of the USC Annenberg School for Communication further argued that "the problem with U.S. News' college rankings isn't that institutions of higher education shouldn't be held accountable for the quality of services they provide [...] The problem is that the fierce competition among colleges to raise their rankings torques the priorities of colleges toward the criteria that U.S. News uses [...] So this week, when an association of 80 liberal arts college presidents, including Barnard, Sarah Lawrence and Kenyon, announced that a majority of them would no longer participate in the U.S. News annual survey, and that they would fashion their own way to collect and report common data, it was bad news for the magazine, but good news for families. It's also good news for American higher education, some of whose institutions may now become less timid about accepting the quirky applicant, less nuts about generating journalistic puff pieces, and more bold about declaring (and living up to) unique educational missions that don't derive from focus groups."[32]

Senior scholar at The Carnegie Foundation for the Advancement of Teaching, Alexander C. McCormick, adds to the above discussion by arguing against the way in which the Carnegie Classification of Institutions of Higher Education is used in the creation of the U.S. News rankings. The problem with this use, he argues, is that there is "no basis for inferring national versus regional focus, because it’s not a factor in the classification criteria. So it should come as no surprise that the national and regional lists contain a great many inconsistencies and bizarre placements [...] By continuing to rely on the Carnegie Classification, they avoid the tough job of defining their terms."[33]

Rebuttals and further responses

Some have offered rebuttals to this criticism.

Consumer demand

U.S. News and World Report editor Robert Morse noted that "a couple of journalists are making the case for the U.S. News rankings, explaining why the actions of a group of college presidents who have signed the letter boycotting the U.S. News peer survey may not be in the best interests of prospective students and their parents."[34] Morse refers to the article "They Protest Too Much," published in the conservative magazine National Review on 28 June 2007, in which John J. Miller states, "the magazine's editors and writers aren't interfering with higher education so much as responding to a consumer demand for more information about it. The demand exists because colleges and universities are among the least accountable institutions in American life [...] the U.S. News rankings indisputably measure something—and something is better than nothing, which is why parents of high school students pore over the magazine's tables and charts. This is rational behavior for people on the verge of spending more huge sums of money on the education of a single child. Like wise investors, they want to know if they're getting a good deal."[35] He also refers to the 27 June 2007 Washington Post op-ed "A College Course in Cynicism," in which Robert Samuelson states, "[w]hat's so shameful about this campaign against the rankings is its anti-intellectualism. Much information is in some way incomplete or imperfect. The proper response to evidence that you dislike or dispute is to supplement or discredit it with better evidence. The wrong response is to suppress it. And yet, that's the agenda of these college presidents. By not cooperating with the U.S. News survey, they hope to sabotage the rankings. They say they'll provide superior information. But they want to control what parents and students see. This is soft censorship. What their students will learn, if they're paying attention, is a life lesson in cynicism: how eminent authorities cloak their self-interest in high-sounding, deceptive rhetoric."[36]

Response from Dickinson College

The Provost and Dean of Dickinson College, Neil Weissman, responded to Robert Samuelson's rebuttal, in the 30 June 2007 Letter to the Editor, for the Washington Post, College Rankings Are Lame Science, in which he states, "when Dickinson College chose not to participate in the U.S. News and World Report rankings of colleges, I imagined the decision would evoke some criticism, but never the charge Robert J. Samuelson made of 'anti-intellectualism' [op-ed, June 27]. 'Intellectual' to me means thoughtful. The problem with the U.S. News rankings is that they are not 'intellectual.' They are, as some higher education experts label them, lame science. Mr. Samuelson also missed the point in suggesting nonparticipating colleges are trying to censor U.S. News. The magazine is of course free to continue its rankings, as are others. We are simply saying that we will not participate in an exercise that, in our view, misleads prospective students more than it helps and drives up college costs by encouraging spending in pursuit of rankings on a fictional prestige ladder invented by U.S. News."[37]

Consumer information

Professor of journalism at Elon University, Michael Skube, argued in the editorial "The No. 1 reason to rank colleges" against the arguments made in the 11 March Washington Post article "The Cost of Bucking College Rankings"[1] by the President of Sarah Lawrence College, Michele Tolela Myers. Skube states that, while having some merit, these arguments were "partly beside the point [...] U.S. News survey, for all its imperfections, performs the useful service of comparing apples with academic apples. In some ways, one might even argue that its nuts-and-bolts consumer information is at least as practical as the bar charts and numbers a car buyer might find in Consumer Reports or Car and Driver. What factors go into the rankings? Student retention accounts for 25% at schools U.S. News calls master's level and those that provide primarily the bachelor's degree (called 'comprehensive' schools, oddly enough)." Skube also notes the objections made to the reputation survey portion of the U.S. News survey and responds by stating that "one can see why." However, he argues, "sometimes just the facts will do, and the U.S. News manual offers them in great heaps [...] Sarah Lawrence, for example, does not take into consideration SAT or ACT scores. Don't even send 'em, it tells high school students. That tells me all I need to know about Sarah Lawrence. It tells me that Sarah Lawrence doesn't take aptitude as seriously as I'd like. The university depends far more on high school grades, which, as anyone who has taught at the college level knows, cannot be trusted. If last year's freshman classes at several colleges all had composite high school grade point averages of 3.6 to 3.8, I don't know how the intellectual caliber of one differs from another. But if one college attracted high school students whose SATs averaged 1100 to 1200, and another attracted students with SATs averaging 1300 to 1400, I know the latter is more selective. Sarah Lawrence might not care about such things, but I do."[38]

Response from Sarah Lawrence College

Michele Tolela Myers, former president of Sarah Lawrence College, responded to Michael Skube in a 12 July 2007 letter to the editor of the Los Angeles Times, "Argument may be a rank disgrace." On the general topic of U.S. News methodology, she states, "What many of us dispute is the validity of a single score computed by using 'data points' to which weights are arbitrarily ascribed (why should retention count for 20% instead of 30%; why is peer assessment 25% instead of 10%; and who decides?). How can a single measure be valid when, in some cases, values are made up when they are not provided (the case of the missing SATs at Sarah Lawrence, the point of my Washington Post op-ed)? However, that's exactly what U.S. News does each year. Professional statisticians have reported that the methodology used by the magazine is seriously flawed and cannot be trusted." She also responds to Skube's discussion of Sarah Lawrence's decision not to consider SAT or ACT scores: "Skube says he knows 'all he needs to know about Sarah Lawrence' because the college does not use SAT scores in its admission process, and therefore he infers we don't take aptitude seriously. Perhaps he doesn't know the research showing that SAT tests do not measure aptitude and at best provide a guess about academic performance in the first year of college. I do not think Elon University's SAT scores tell all there is to know about Elon. To think so would be falling into the trap of using one single measure as a proxy for the complex nature of any college. Which is precisely why the rankings are flawed."[39]
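Myers's objection about arbitrarily ascribed weights can be illustrated with a minimal sketch. The college names, metric values, and weightings below are hypothetical (this is not U.S. News's actual formula); it only shows that a composite "single score" can reverse the order of two schools when the retention weight moves from 20% to 30%, the kind of choice she questions.

```python
# Illustrative sketch with hypothetical numbers: a composite ranking
# score as a weighted sum of normalized "data points", showing that
# the resulting order depends on the (arbitrary) choice of weights.

def composite_score(metrics, weights):
    """Weighted sum of metric values; weights are fractions summing to 1."""
    return sum(metrics[name] * w for name, w in weights.items())

# Hypothetical metrics on a common 0-100 scale for two colleges.
colleges = {
    "College A": {"retention": 90, "peer_assessment": 75},
    "College B": {"retention": 75, "peer_assessment": 80},
}

# The two retention weightings Myers mentions (20% vs. 30%),
# with peer assessment filling the remainder.
weights_20 = {"retention": 0.20, "peer_assessment": 0.80}
weights_30 = {"retention": 0.30, "peer_assessment": 0.70}

for weights in (weights_20, weights_30):
    ranked = sorted(colleges,
                    key=lambda c: composite_score(colleges[c], weights),
                    reverse=True)
    print(weights, "->", ranked)
# With a 20% retention weight, College B ranks first (79 vs. 78);
# with 30%, College A ranks first (79.5 vs. 78.5).
```

The underlying data do not change between the two runs; only the weighting does, which is the core of the argument that a single composite number is not a valid measure.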

Notes

  1. ^ a b c d e f Tolela Myers, Michele (11 March 2007). "The Cost of Bucking College Rankings". The Washington Post. http://www.washingtonpost.com/wp-dyn/content/article/2007/03/09/AR2007030901836.html. 
  2. ^ Gross, Jane (13 November 2003). "Sarah Lawrence College Drops SAT Requirement, Saying a New Writing Test Misses the Point". The New York Times. http://select.nytimes.com/gst/abstract.html?res=F00614F93C5C0C708DDDA80994DB404482&n=Top%2fReference%2fTimes%20Topics%2fOrganizations%2fS%2fSarah%20Lawrence%20College. Retrieved 30 April 2010. 
  3. ^ a b "U.S. News Statement on College Rankings". U.S. News and World Report. 12 March 2007. http://www.usnews.com/usnews/blogs/news_blog/070312/us_news_statement_on_college_r.htm. 
  4. ^ a b Jaschik, Scott (12 March 2007). "Would U.S. News Make Up Fake Data?". Inside Higher Ed. http://insidehighered.com/news/2007/03/12/usnews. 
  5. ^ "NEAIR 33rd Annual Conference Program at a Glance". North East Association for Institutional Research. http://www.neair.org/conferences/philly06/neair%20-%20program%20at%20a%20glance%20final%20110106.pdf. 
  6. ^ a b "Presidents Letter". 2007-05-10. http://www.educationconservancy.org/presidents_letter.html. 
  7. ^ "Battle Lines on ‘U.S. News’". Inside Higher Ed. 7 May 2007. http://www.insidehighered.com/news/2007/05/07/usnews. 
  8. ^ "Presidents Letter". Education Conservancy. http://www.educationconservancy.org/presidents_letter.html. 
  9. ^ Jaschik, Scott (20 June 2007). "More Momentum Against ‘U.S. News’". Inside Higher Ed. http://www.insidehighered.com/news/2007/06/20/usnews. 
  10. ^ a b "Annapolis Group Statement on Rankings and Ratings". Annapolis Group. 19 June 2007. http://www.collegenews.org/x7131.xml. 
  11. ^ Rawe, Julie (20 June 2007). "A Better Way to Rank Colleges?". TIME. http://www.time.com/time/nation/article/0,8599,1635326,00.html. 
  12. ^ a b Morse, Robert (22 June 2007). "About the Annapolis Group's Statement". U.S. News and World Report. http://www.usnews.com/blogs/college-rankings-blog/2007/6/22/about-the-annapolis-groups-statement.html#read_more. 
  13. ^ Jaschik, Scott (25 June 2007). "Debate: Top Critic vs. ‘U.S. News’ Editor". Inside Higher Ed. http://www.insidehighered.com/news/2007/06/25/rankings. 
  14. ^ Will, Katherine Haley (9 July 2007). "Breaking Ranks: A College Can't Be Reduced to a Number in a Magazine". Washington Post. http://www.washingtonpost.com/wp-dyn/content/article/2007/07/08/AR2007070800922.html. 
  15. ^ Griffith, John (4 July 2007). "Revolutions: iPhones and college choice". The Times and Democrat. http://www.timesanddemocrat.com/articles/2007/07/04/opinion/doc468c1d9d9c8a1056983370.txt. 
  16. ^ Helm, Peyton (29 June 2007). "'Hearsay' isn't the way to choose a college". The Morning Call. http://www.mcall.com/news/opinion/anotherview/all-left_col-c.5920286jun29,0,1390216.story?coll=all-newsopinionanotherview-hed. 
  17. ^ Hoover, Eric (20 June 2007). "Liberal-Arts College Group Plans to Help Develop Alternative to Commercial Rankings". Chronicle of Higher Education. http://chronicle.com/daily/2007/06/2007062006n.htm. 
  18. ^ Kamara, Margaret (28 June 2007). "Are U.S. News Rankings Inherently Biased Against Black Colleges?". Diverse Issues in Higher Education. http://www.diverseeducation.com/artman/publish/article_7831.shtml. 
  19. ^ "Growing Challenge to ‘U.S. News’". Inside Higher Ed. 18 May 2007. http://insidehighered.com/news/2007/05/18/usnews. 
  20. ^ Finder, Alan (20 June 2007). "Some Colleges to Drop Out of U.S. News Rankings". New York Times. http://www.chicagotribune.com/news/local/chi-070620collegerank,0,1372864.story. 
  21. ^ "Sarah Lawrence College Endorses Annapolis Group Actions". Sarah Lawrence College. http://www.slc.edu/index.php?pageID=4282&detail=log&id=862&returnURL=%2Findex.php%3FpageID%3D4282. 
  22. ^ "Scripps College Joins Annapolis Group in Support of Better Information for Parents and Prospective Students". Scripps College. 21 June 2007. http://www.scrippscollege.edu/dept/newscenter/news/2007/annapolis-group.html. 
  23. ^ Macenka, Joe (7 July 2007). "Colleges question U.S. News rankings: Six Virginia schools stopped or are likely to stop participating". Richmond Times-Dispatch. http://www.inrich.com/cva/ric/news.apx.-content-articles-RTD-2007-07-07-0109.html. 
  24. ^ McGuire, Patricia (16 May 2007). "Colleges Should Boycott Bogus Ratings Game". Hartford Courant. http://www.courant.com/news/opinion/op_ed/hcmcguire0516.artmay16,0,7943960.story?coll=hc-headlines-oped. 
  25. ^ "DePauw Joins Annapolis Group Peers in Quest to Deliver Better Information to Prospective Students and Parents". DePauw University. 20 June 2007. http://www.depauw.edu/news/index.asp?id=19624. 
  26. ^ Hill, Catharine (19 July 2007). "Parents and students deserve a program to create their own rankings.". Christian Science Monitor. http://www.csmonitor.com/2007/0719/p09s02-coop.htm. 
  27. ^ Krueger-Dale Study
  28. ^ McGrath, David (24 July 2007). "Ode to a fourth-tier college". Chicago Tribune. http://newsblogs.chicagotribune.com/news_opinion_letters/2007/07/ode-to-a-fourth.html. 
  29. ^ Sacks, Peter (5 April 2007). "America's Best College Scam". The Huffington Post. http://www.huffingtonpost.com/peter-sacks/americas-best-college-sc_b_45064.html. 
  30. ^ Christopher Marsh (chris-marsh-usa) was the graduate talking to his adviser the summer after he graduated.
  31. ^ Wald, Sarah (30 June 2007). "Dismissing school rankings". Boston Globe. http://www.boston.com/news/globe/editorial_opinion/oped/articles/2007/06/30/dismissing_school_rankings/. 
  32. ^ Kaplan, Marty (20 June 2007). "Reaming College Rankings". Huffington Post. http://www.huffingtonpost.com/marty-kaplan/reaming-college-rankings_b_52995.html. 
  33. ^ McCormick, Alexander (10 May 2007). "Hidden in Plain View". Inside Higher Ed. http://www.insidehighered.com/views/2007/05/10/mccormick. 
  34. ^ Morse, Robert (3 July 2007). "Some Support from Reporters". U.S. News and World Report. http://www.usnews.com/blogs/college-rankings-blog/2007/7/3/some-support-from-reporters.html. 
  35. ^ Miller, John (28 June 2007). "They Protest Too Much". National Review. http://article.nationalreview.com/?q=YTBmYzA2OTcyYjc0MzBiNDhkMzVmNzc5ZDc5YmQ0YmE=. 
  36. ^ Samuelson, Robert (27 June 2007). "A College Course in Cynicism". Washington Post. http://www.washingtonpost.com/wp-dyn/content/article/2007/06/26/AR2007062601687.html. 
  37. ^ Weissman, Neil (30 June 2007). "College Rankings Are 'Lame Science'". Washington Post. http://www.washingtonpost.com/wp-dyn/content/article/2007/06/29/AR2007062902135.html. 
  38. ^ Skube, Michael (8 July 2007). "The No. 1 reason to rank colleges". The Los Angeles Times. http://www.latimes.com/news/opinion/commentary/la-op-skube8jul08,0,6137792.story?track=rss. 
  39. ^ Myers, Michele Tolela (12 July 2007). "Argument may be a rank disgrace". The Los Angeles Times. http://www.latimes.com/news/opinion/letters/la-le-thursday12.5jul12,0,3328873.story?coll=la-news-comment-letters. 

Wikimedia Foundation. 2010.
