After all, what if you had an Olympics and no one could clear the first high jump bar setting? You going to leave the podium empty? No, sir, Mr. Gorbachev, take down that bar!
Here's LSTC Knight in Tailored Armor Nick Allard:
> Over the past several months, fellow deans across the country have asked for a complete, credible and accurate explanation of the July 2014 results. We still are waiting.

I mean, I know that a group of educated professionals at the NCBE already audited these results and wrote a post-Allard summary in their newsletter, but if there's one thing the legal profession has taught me - nay, taught any of us - it's that if you don't get what you want to your satisfaction, the best course is to keep asking for it and pretend that your concerns have never been addressed by the appropriate people. The only way Allard's concerns can possibly be addressed is either a) complete capitulation to his viewpoint or b) an appeal to higher levels of experts who unjustly haven't been given the chance to review them yet.
Indeed, Allard now appeals to experts. Here's more:
> [E]xpert commentators have shown through statistical analysis that, contrary to the claims by the National Conference, the Law School Admission Test scores in 2014 were comparable to the previous year’s and that, in any event, the bar-exam results do not correlate with any measurable change in LSATs. An important new expert analysis by Professor Deborah Merritt at Ohio State University Michael E. Moritz College of Law strongly suggests that National Conference of Bar Examiners’ scoring errors were the source of the problem with the July 2014 exam.

Oh, you got a PhD in test design and verification and have been using these methods for decades? Fuck that shit, I've got a vague correlation between two tangentially related tests that measure entirely different things. Fie on you and your expertise! I'll provide my own.
There's also Professor Merritt, an expert who must be believed since she's a friend of the rascally reformers. Look at her expert analysis.
> After looking closely at the way in which NCBE and states grade the bar exam, I’ve concluded that ExamSoft probably was the major culprit.

The gist here is that an unknown (could be in the millions!) number of bar exam takers probably had severe problems with ExamSoft such that their performance on day 2 of the examination was four (4) raw points lower in the aggregate. Tons of people study relentlessly between day 1 and day 2 of the bar. That dip caused a domino effect: the NCBE concluded that this group was "less able" to pass the bar, which led to lower scaling and equating than normal, thereby affecting examinees even in non-ExamSoft jurisdictions.
> But here’s the rub: NCBE can’t tell from this general analysis why a group of examinees is less able than an earlier group. Most of the time, we would assume that “less able” means less innately talented, less well prepared, or less motivated. But “less able” can also mean distracted, stressed, and tired because of a massive software crash the night before.

Forget that first-time takers had a significantly smaller year-over-year score drop than retakers, and that ExamSoft is about as trusty as running Windows 95 on a Tandy. We've got an alternative hypothesis.
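For anyone who wants to see the mechanics rather than take anyone's word for it, here is a toy sketch of the scaling step the theory leans on. This is not NCBE's actual psychometric procedure (real equating uses anchor items and considerably more rigor); it's just a hypothetical linear rescaling with made-up numbers, showing how a group-wide dip on the MBE side drags everyone's scaled written scores down with it, ExamSoft state or not.

```python
# Toy illustration only - NOT the NCBE's actual equating procedure.
# Many jurisdictions scale written (essay/MPT) raw scores to the distribution
# of their examinees' MBE scaled scores, so a depressed MBE pulls the written
# side down even if the essays themselves were graded identically.
import statistics

def scale_written_to_mbe(written_raw, mbe_scaled):
    """Linearly map written raw scores onto the MBE scaled-score distribution."""
    w_mean, w_sd = statistics.mean(written_raw), statistics.pstdev(written_raw)
    m_mean, m_sd = statistics.mean(mbe_scaled), statistics.pstdev(mbe_scaled)
    return [(w - w_mean) / w_sd * m_sd + m_mean for w in written_raw]

# Hypothetical numbers: identical essay performance, paired first with a
# "normal" year's MBE distribution, then with a year showing a ~4-point dip.
essays   = [30, 35, 40, 45, 50]
mbe_2013 = [135, 140, 145, 150, 155]  # made-up baseline year
mbe_2014 = [131, 136, 141, 146, 151]  # made-up dip year

print(scale_written_to_mbe(essays, mbe_2013))  # centered at 145
print(scale_written_to_mbe(essays, mbe_2014))  # same essays, centered at 141
```

Same essays, lower MBE cohort, lower scaled scores across the board - that's the domino effect the quote describes.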
We must take this further, of course. Students should be allowed to declare any distracting or stressful occurrence in their lives to have factored into their bar examination score in a way that benefits law schools. A proposed list should include:
- The weather outside - was it raining? blisteringly hot? windy?
- The biologic - was the examinee on her menstrual cycle? Pregnant? How were the bathrooms set up? Did they have to pee for the last hour? Any untreated source of pain? That can cause scores to drop. Was the examinee horny the whole time? Sitting next to a well-endowed "hottie"?
- The familial/personal - any divorces? Deaths? Break-ups? Sick children? That's like ten raw score points right there.
- The material - is their phone broken? Car problems? Not wearing the right outfit?
- The national picture - any recent insurrections? What's the president's approval rating? How about the economic outlook?
- The financial - speaking of which, almost nothing is more stressful than financial woes. Does this student have unpayable debt and bleak job prospects from a third-rate and unnecessary educational institution? Because that should be worth at least ten raw points.
- The astrological - the bar exam is at the same general time every year, but Taurus isn't always in the same space!
- The dietary - did the examinee eat breakfast? What did they eat for supper? Did they take a multivitamin? Are they on drugs? We can't assume they aren't on drugs unless they tell us. If every bar examinee did a line of coke before the test, we would have to assume scores would be affected!
- The fatigue - every examinee should be allowed to note the number of hours they slept the night before and whether they feel exhausted or not. Many examinees stay in hotels - was the bed comfy? Was it a Holiday Inn Express or better?
- Finally, every examinee should get a paragraph's worth of lines to explain their stress level prior to taking an expensive multi-day test whose passage is a prerequisite to consideration for employment in the one field they actually have a ~~50-50~~ excellent chance at landing a job in.
As a result, Allard, Merritt and other folks are clearly in the right to demand a full audit and official investigation. The NCBE is clearly biased towards having stressed and uncomfortable graduates fail for no reason.