The following article, published in the April 1995 issue of Progressive Architecture magazine, poses the question, “Will the new ARE change the profession?” Eight years later, we ask whether the profession will change the new ARE.
The new exam: will it change the profession?
By Michael Crosbie
In her book on the profession, Architecture: The Story of Practice, social scientist Dana Cuff appraised the architectural registration exam this way: “The exam is not about design ability, talent, or any of the other mysterious architectural qualities; it is about competence. A brilliant designer will not pass without technical knowledge of construction conventions, graphic standards, and building codes about fire safety, accessibility, and energy.” Cuff hit upon the exam’s most remarkable quality. For most of us, the exam, written by the National Council of Architectural Registration Boards, is the great leveler. It can bring low the exalted designers we knew in school, and raise high those who rarely drew praise in studio.
But the exam is changing. The grueling, macho rite of passage is about to become a thing of the past. While the standards for being admitted to the exam are getting tougher, the new exam and how it will be administered appear to be getting easier and less threatening. Will these changes alter our conception of a registered architect?
Who Can Take the Exam?
Qualifications to sit for the exam vary among the 55 registration boards that belong to NCARB (the 50 states, plus Washington, D.C., Guam, the Northern Mariana Islands, Puerto Rico, and the Virgin Islands). The general drift among the boards is to make admission to the exam tougher. For example, 32 boards now require candidates to have a degree from an NAAB-accredited architecture program or an NCARB-sanctioned equivalent.
The number of boards demanding that candidates complete the Intern Development Program to sit for the exam is also on the rise. Right now, 35 boards require IDP for admission to the exam. In another two years that number will grow to at least 40 and by the year 2000, NCARB’s Executive Director Samuel Balen believes, virtually all the boards will require IDP.
For most intern architects IDP is a blessing and a curse. The program requires that interns document their practice experience in the office of a registered architect for various categories such as schematic design, code research, specifications and materials research, and construction documents. Completion requires 700 IDP “value units,” one unit being equivalent to eight hours. The candidate’s record-keeping must be verified by an architect within the firm, and be reviewed and approved by an architect-sponsor outside the firm.
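For readers keeping score, the value-unit arithmetic above works out as follows. A brief sketch (the 40-hour week and 50-week working year used here are illustrative assumptions, not NCARB figures):

```python
# IDP value-unit arithmetic, per the figures cited above:
# 700 value units are required, and one unit equals eight hours.
UNITS_REQUIRED = 700
HOURS_PER_UNIT = 8

total_hours = UNITS_REQUIRED * HOURS_PER_UNIT  # 5,600 hours of documented experience

# Assuming a 40-hour week and roughly 50 working weeks per year
# (an illustrative assumption, not an NCARB figure):
years = total_hours / (40 * 50)

print(total_hours)       # 5600
print(round(years, 1))   # 2.8
```

In other words, a full-time intern faces nearly three years of documented, verified experience before sitting for the exam.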
Interns complain most about IDP’s byzantine recording methods, the paperwork, the bureaucracy, and the cost. Most boards require that NCARB certify the IDP experience. Starting a council record costs $195 and the annual maintenance fee is $30. “For an intern, this can be steep,” says Elizabeth Koski, vice president of the American Institute of Architecture Students, who sits on NCARB’s IDP coordinating committee. Some firms are not keen on IDP, because it requires employers to keep accurate account of intern time and to help the intern complete the program. “More firms are coming around to it,” says Koski. “They’re finding that, in the long run, people who complete IDP make better employees.” Requiring IDP for the exam will continue to have a profound effect on the profession, because it obligates firms to provide the intern with a coherent, well-rounded intern experience. Firms that buck the trend may find interns shunning them.
Once you’re qualified to sit for the exam, get out your wallet. The cost of taking the entire exam can run anywhere from $400 to $1,100, depending on the registration board (another hit for the lowest-paid members of the profession). That’s just to get through the door. There are study guides and seminars that will help the candidate to prepare, and these can cost several hundred dollars. Taking time off for the exam often results in loss of pay, and there are often travel and accommodation expenses.
More stringent requirements, the cost, and (most important) the general state of the economy and the profession have had a devastating effect on the number of people taking the exam. According to NCARB statistics, in 1985 close to 5,500 candidates sat for the exam for the first time. A decade later, that number has dwindled to less than 3,000 nationally.
The Dreaded Division C
The exam is structured in ten divisions: Pre-Design (A); Site Design-Written (B); Site Design-Graphic (B); Building Design (C); Structural Technology-General and Long Span (D/F); Structural Technology-Lateral Forces (E); Mechanical, Electrical, Plumbing, and Acoustical Systems (G); Materials and Methods (H); Contract Documents and Services (I). All but two of these divisions are given in a multiple-choice format; candidates answer questions by filling in a gridded answer sheet, just as on the SATs. Site Design-Graphic and Building Design are tested graphically, with candidates hunched over drawing boards for hours.
Until just recently, Division C was the stuff of legends. Within the space of 12 continuous hours, candidates were given an entire building to design and were required to show floor plans, building sections, structural plans, elevations, and a host of life-safety compliance factors. For most candidates (myself included) Division C was a nightmare. You had to juggle and satisfy dozens of program and code requirements, integrating them into a solution that was sensible and not a death trap.
On P/A’s recent fax-back page, by which we encouraged comments on the ARE, Division C drew the most fire. “It rewards physical stamina, very quick thinking, and a certain amount of luck,” one respondent wrote. Another confessed: “I passed on the fourth attempt, without knowing what made that effort better than the rest. That 12 hours is the most grueling and inhumane experience I have undergone.”
Divide and Conquer
Last June, NCARB introduced a completely overhauled Division C, but humanitarianism had little to do with the change. NCARB determined that the old exam was not a fair appraisal of what the candidate knew and didn’t know, because examinees essentially created their own test scenarios. “Some people were designing straightforward buildings,” says Sam Balen, “while others were making the building design so complex that they couldn’t resolve all of the issues they needed to.”
Mistakes in one part of the exam also had reverberating effects throughout the building design. For example, a candidate who wasn’t very good at laying out a rational plan diagram might later find that the mechanical system couldn’t be resolved, or that the structure was impossible to figure out. The complexity of the problem also meant that many candidates couldn’t finish the required drawings (even though NCARB maintained that the 12-hour exam could be completed in 10 hours), and so could fail this part. Effective time management was essential to pass.
The single-building problem, the format for which was virtually unchanged for more than 70 years, was also demanding on the graders. They were coached to ignore some aspects of the design, and to hunt through the drawings to verify that the candidate had solved all of the problems, particularly those regarding life safety.
The new Division C is modeled on the Site Design-Graphic division, where candidates are asked to solve a number of “vignette” problems in 12 continuous hours. There are now six in Division C: Block Diagram, Schematic Design (floor plans), Building Section, Structural Plan, Mechanical and Electrical Plan, and Accessibility. Candidates are asked to solve discrete problems in each vignette. For example, for the Structural Plan, they are given a building plan and section, a construction system, and are required to show footings, columns, beams, joists/trusses, headers/lintels, and bearing walls. The Section vignette provides two floors of a building wing, all the materials and their sizes, and requires a transverse section. For Accessibility, candidates must draw an exterior ramp and stair for the handicapped. Schematic Design is the most demanding, requiring candidates to design and draw plans for two floors of a building. Each vignette has a suggested time allotment, which makes time management easier.
The results of the new exam have been profound. Candidates who have taken the new Division C exam report that it was much easier to understand what they were being tested on and how to respond appropriately. Candidates who commented on the new format on our fax-back page said that they had had more than enough time to complete all six vignettes, some with hours to spare. Graders I interviewed said that candidates seemed to have a firmer grasp of the test. “The candidates understood the exam better,” says George Terrien, who graded the June test and sits on NCARB’s ARE Research and Development Subcommittee. He perceived that more candidates were completing the exam, and that there were fewer wild responses than to the single-building format. “It was clear whether they knew the material or they didn’t, and from that perspective it was easier to grade.”
The pass rates for the new format confirm these observations. The percentage of those candidates with an accredited degree taking the test for the first time and passing Division C has ranged, over the past five years, anywhere from 24 to 45 percent. In June, the rate shot up to 53 percent.
Is It Too Easy?
The question of whether the new format for Division C is too easy is a difficult one to answer, because several factors are involved. The most significant is grading. The old, single-building format was graded as a single entity. Candidates who completed the exam failed if their solutions had serious life-safety deficiencies or code infractions. Graders were trained to focus on those issues, and if they were in doubt, the candidate was given the benefit. Grades were “pass” or “fail.”
The new grading system for Division C is the same as for the Site Design-Graphic exam and evaluates each vignette on a scale of one to four, according to NCARB’s ARE Graphic Handbook. The highest score, four, is given for a solution that “shows an understanding of the concept … and an acceptable solution”; three for “an understanding of the concept and a marginally acceptable solution”; two for concept understanding “but a poor solution”; one for a “lack of understanding of the concept and/or an extremely poor solution”; a blank sheet gets a zero. The graders are instructed by NCARB coordinators on what is acceptable, marginal, or poor, and are given specific criteria for grading. Each exam is scored by at least two graders (who spend from three to six minutes per exam). The grades for the six vignettes are then averaged together, according to their relative weight, and a final grade is assigned, either “pass” or “fail.”
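The weighted-average scheme described above can be sketched in a few lines. The individual vignette weights and the passing cutoff shown here are hypothetical stand-ins; NCARB’s actual values are not given in this article:

```python
# A minimal sketch of the Division C weighted pass/fail calculation.
# The weights and the 2.5 cutoff are hypothetical, for illustration only.
def division_c_grade(scores, weights, cutoff=2.5):
    """Average vignette scores (0-4) by relative weight; return pass/fail."""
    assert len(scores) == len(weights)
    weighted_avg = sum(s * w for s, w in zip(scores, weights)) / sum(weights)
    return "pass" if weighted_avg >= cutoff else "fail"

# Six vignettes, with Schematic Design weighted most heavily
# (the 3x weight is an assumption, reflecting its described importance):
vignettes = ["Block Diagram", "Schematic Design", "Building Section",
             "Structural Plan", "Mechanical/Electrical Plan", "Accessibility"]
weights = [1, 3, 1, 1, 1, 1]

print(division_c_grade([3, 3, 2, 4, 2, 3], weights))  # prints "pass"
```

Under a scheme like this, a strong Schematic Design score can carry a candidate past one or two weak vignettes, which is precisely the point made below.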
The grading system makes it possible to do well on the heavily weighted Schematic Design vignette, maybe not so well on one or two others, and still pass. Racking up points is important. According to one grader I interviewed, who also teaches an exam prep course, “I tell candidates that, whatever you do, don’t hand in a blank sheet. At least draw a few lines that show you’ve started.”
A more serious concern is how well the new format gauges a candidate’s skills at design integration. Because Division C is now six unrelated vignettes, candidates do not have to demonstrate graphically that they can produce an integrated solution involving all of the factors being tested in this division. This is troubling because it appears to be at odds with what architects are considered good at doing: thinking integratively and generating solutions to multifaceted problems.
“Because the new format sets up the scenarios for you, it takes a lot of the initial conceptualization away from the candidates,” says Larry Paul, an architect who graded the new exam and is a commissioner on California’s state registration board. Paul says he understands NCARB’s rationale for changing the format, that it seems more fair and more focused, but he has reservations as to whether it’s really closer to the way architects actually practice, as NCARB claims. For example, the new exam no longer requires the candidate to draw an elevation. NCARB considers such an ability a “lower-order skill” that doesn’t need to be tested.
“The nature of being an architect is that you set up your own scenarios,” says Paul. “If you’ve made some basic errors in the beginning, it’s going to have ramifications all the way through the process. That’s what we should be testing for. Theoretically, you could pass all of these vignettes and still not be able to put together a whole building on your own.” For Paul, and for other architects I spoke with, that deficiency strikes at the very heart of competency.
NCARB responds to these concerns by pointing out that Division C is only one of ten needed to pass the exam, and that the exam itself is only part of a long registration process that includes schooling and internship. But the exam is the only element in the process that the registration boards have any direct control over in terms of content. Passing the exam is the last thing that separates a candidate from a license, and Division C has been the highest hurdle to clear.
Brave New Exam
The exam is now undergoing further radical change. Within two years, the test completed with paper and pencil will be history. In 1997 NCARB will administer a fully computerized exam, for both multiple-choice and graphic divisions, that will also be graded by computer. In 1984, NCARB’s then president, Robert Oringdulph, challenged the Council to harness computer technology to improve the exam’s content and grading consistency. CAD was just catching on in practice, and it made sense that the exam be administered in a form that would likely come to dominate the architectural office.
Computerization offers a lot of advantages. NCARB anticipates having hundreds of test stations around the country. The exam will be administered daily throughout the year, and candidates will be able to take the exam when it’s convenient (barring any scheduling back-ups at the test center of choice). The divisions can be taken one at a time, a few at a sitting, or all at once, depending on the candidate. This will make it easier for candidates to afford the exam, because the payments can be spread out over months (candidates will also be able to charge exam fees to a credit card). Examinees won’t have to take a block of time off from work; a division might be taken in an afternoon or on a weekend.
The computerized exam will also be easier on the registration boards. They won’t have to gear up in June and December to administer the exam, find and rent a facility, hire proctoring staff, and mail out grades en masse. In an age of shrinking state budgets, this is a welcome change. NCARB will be freed from coordinating the national grading sessions, rounding up a hundred or so graders twice a year, flying them to the grading site, and covering their accommodations.
NCARB worked with Educational Testing Service (the testing giant responsible for the SAT, GRE, and a host of professional exams) to develop the computerized exam. The multiple-choice (fill-in-the-grid) divisions were fairly easy to adapt to the computer. Questions will be presented on a screen in a form virtually the same as the one in the test booklets. Candidates will click on the right answer.
Daily testing brings up security concerns. Won’t candidates be able to memorize questions and pass them on? Jeff Kenney, NCARB’s director of exam development, says that every candidate will be given a multiple-choice exam composed of “pre-tested” questions. ETS and NCARB have studied how well these questions have performed in past exams and what their relative difficulty is. For a typical division, the computer will administer two 50-question “testlets” which will, says Kenney, be comparable in content and difficulty. The computer will simultaneously grade the answers and calculate whether the candidate has a “mastery” of the material or not. Borderline candidates will be given another 50 questions, until the computer determines their competency. Each multiple-choice division has 20 testlets for the computer to draw from, and they can be given to candidates in different combinations. Thus the exams will be virtually impossible to memorize or duplicate. And candidates will have to wait six months to retake a division.
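The adaptive testlet procedure Kenney describes can be sketched as a simple loop: start with two 50-question testlets, then keep drawing testlets for borderline candidates until mastery or non-mastery is clear. The mastery and non-mastery thresholds below are illustrative assumptions, not ETS’s actual scoring model:

```python
import random

# A sketch of the adaptive "testlet" logic described above. The 75% mastery
# and 55% non-mastery thresholds are hypothetical; ETS's real model is not
# published in this article.
def administer_division(testlet_pool, grade_testlet,
                        pass_frac=0.75, fail_frac=0.55):
    """Draw 50-question testlets until mastery or non-mastery is determined."""
    pool = list(testlet_pool)
    random.shuffle(pool)            # different combinations for each candidate
    correct = answered = 0
    for testlet in pool[:2]:        # start with two testlets (100 questions)
        correct += grade_testlet(testlet)
        answered += 50
    for testlet in pool[2:]:        # borderline: keep drawing 50-question testlets
        frac = correct / answered
        if frac >= pass_frac:
            return "mastery"
        if frac <= fail_frac:
            return "no mastery"
        correct += grade_testlet(testlet)
        answered += 50
    return "mastery" if correct / answered >= pass_frac else "no mastery"
```

With 20 testlets per division and candidates drawing them in different combinations, the number of distinct exams is large enough to defeat memorization, just as Kenney claims.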
Will CAD Nerds Excel?
An obvious concern in computerizing the graphic portions of the exam is whether candidates with CAD experience will have an advantage over those who have never touched a computer. “We developed the software from scratch,” says Peter Brittingham of Educational Testing Service, who has been working on the computerized exam. “We intentionally didn’t make it like any other CAD package on the market.”
I visited ETS in Princeton, New Jersey, to see a demonstration of the computerized exam. I’ve never worked with CAD, but I found the computerized vignettes fairly easy to manipulate. The screen presents the vignette just as it now appears in the exam sketch pad. There are tool bars on the left side of the screen, one set for “sketching” and another set for “drawing” the solution. There are also measurement functions, which can tell you the length and angle of any line on the screen and the square footage of rooms. Lines are easy to erase and layers can be turned off to replicate the function of tracing paper.
What about candidates who have never worked on a computer? ETS has been doing extensive field testing, bringing in such people as young interns and middle-aged architects to work on the problems. Sometimes they’re observed by cognitive psychologists to determine whether the computer is getting in the way of administering the test. ETS has also given vignettes to test volunteers to complete by computer and by paper and pencil to study the differences. According to NCARB’s Jeff Kenney, “interface” problems with the computerized exam have pretty much been worked out. In fact, the new computerized format will allow NCARB to include more vignettes: eight for Site Design and ten for Building Design. To meet the demand of variety in daily testing, there will be six versions of each vignette, and four “isomorphs” (computer lingo for appreciable variations) for each version. For those with computer anxiety, a tutorial disk containing sample vignettes will be available for candidates to practice with at their leisure before they sit for the exam.
Untouched By Human Hands
There are other advantages of computerizing the test, Kenney points out. “Candidates with physical disabilities who found the paper and pencil test too demanding will be able to take this one,” he says, adding that paralyzed candidates can manipulate the “drawing tools” with special features. The exam will be far less physically demanding on able-bodied candidates as well. No need to bring a pencil. Just point and click with your mouse.
But won’t the profession lose something in the process? Maybe I’m hopelessly old-fashioned, but an architectural registration exam that doesn’t test the candidate’s graphic communication skills strikes me as odd. Isn’t an architect’s ability to draw, even at a minimal level, an important measure of professional competence? I applaud NCARB’s efforts to make the exam as fair as possible, but is the profession–and is the public–ready to accept as competent an architect who doesn’t need to demonstrate how to use a pencil?
A driving force behind the new vignette format for Division C was to make the test gradable by computer. Because the vignettes require the candidates to perform very specific, measurable tasks, programs can be devised to grade the graphic exams. Most candidates will champion this breakthrough, especially those who are convinced that they flunked the exam because they were graded by someone who got up on the wrong side of the bed. The computer will grade every candidate using exactly the same criteria.
But this raises the question: if a computer can grade the exam, why not just let computers design the solutions and remove architects from the process altogether? In fact there are CAD programs available right now to check drawings for code compliance and life-safety factors. Can computers that design, at least as the exam defines that term, be far behind? “Knowledge-based expert systems are continually getting more sophisticated,” says Thomas Seebohm of the University of Waterloo School of Architecture, who directs Waterloo’s program in architectural computing. “There isn’t a program yet that will do the drawings, but we’re just scratching the surface.”
In a world where architects can use computers to check drawings for code compliance the way writers use them to check for spelling and grammatical errors, will this ability become just another “lower-order skill,” in NCARB’s parlance, that doesn’t need to be tested? How, then, does the exam test competence?
Back to the Future
An answer might be found in the history of the exam. Long before standardized tests, candidates for architectural registration in many states were tested not only with a written exam, but also orally. Even today, four state registration boards–California, Maine, Rhode Island, and Washington–require candidates to be interviewed before they’re granted a license. The fact that California still requires an oral exam is significant because its board is responsible for registering between 20 and 25 percent of all the architects in the U.S.
What does an oral exam tell the registration board about a candidate that a written examination doesn’t? “We’ve found that candidates may be able to pass the written exam,” says Larry Paul, a California registration commissioner who has conducted candidate interviews, “but when they come before our board they’re not able to sufficiently answer some fairly basic questions.” Paul adds that California has kept the oral exam not only because of the state’s stringent seismic codes, but also to screen more effectively the large number of candidates for registration. “It’s added protection for the public.”
Weeks before the interview, candidates are sent an outline of the test and a list of documents and laws they should be familiar with. The oral exam usually takes about an hour and contains about 35 questions. There is an explicitly stated exam methodology so all the examiners are testing candidates in the same way. Candidates who fail the interview are given feedback about areas where they need to improve. Paul says that the process seems to run smoothly and hasn’t been a bureaucratic headache.
Paul believes the interview enhances the state board’s appraisal of a candidate who has passed the NCARB’s registration exam. “The written and graphic exams tend to focus on the knowledge of single-issues items. But architectural practice involves analyzing and integrating a larger amount of knowledge into a coherent design. The integration is what we try to accomplish with the oral exam. We set up practice scenarios, ask candidates to analyze the information, and tell us how they would respond. Architects are called upon to do that every day.”
Will the new exam change the profession? With less emphasis on testing integrative skills, one could argue, the exam encourages a view of architectural specialization: a competent architect is one who can solve highly defined, abstract problems without a comprehensive understanding of the solution’s ramifications. This runs counter to our notion of architects as generalists.
If the exam is viewed as a version of “natural selection,” it’s likely that such integrative skills will atrophy. It’s also possible that the exam will hurry the profession along a path toward less and less responsibility for the building project in its totality. This trend is ironic given the computer’s capacity to help us to think about and visualize architecture in multifaceted ways. With the high-tech of the computerized exam, it may become even more critical for examiners to further test for competency face-to-face, to judge the architect’s skills of integrative thinking that no computer may ever duplicate.
We should also turn the question on its head: can the profession change the exam? In developing the exam, NCARB works with little or no oversight by the profession. To calibrate the exam to practice, every five years NCARB surveys architects as to what tasks must be performed in an office. This “task analysis” is a narrow view of the profession and not a good barometer of how it is changing, as architects pursue avenues of nontraditional practice that also affect the public’s health, safety, and welfare.
An exam written for the public’s benefit should have formal mechanisms for comment by architects on its content. Even if NCARB chooses to ignore critiques of the exam by the profession and those engaged in new patterns of practice, the opportunity should exist. Without it, NCARB risks perpetuating an exam out of touch with architecture, and a profession ill-qualified for its future.