Strangers are manipulating your systems. By pulling cords and pushing buttons, these users will stretch the limits of your best-laid plans and intentions. For every multifamily technology programmer who claims to have developed the industry’s ultimate deliverable, there are a couple of guys down in accounting and a property manager somewhere in Denver who, with the simplest of observations, can send that system back to the corporate office crashing and burning all the way. It’s the beta test, and it’s where multifamily technologies live or die. Virtually every technology in use by the multifamily industry today has endured the beta test gauntlet. Whether you plan to implement off-the-shelf solutions or your own internal systems this year, understanding the cadence of beta testing and systems rollout is integral to gaining all of the efficiency and productivity that your technology has to offer.
Step one: Forget what you think you know about technology. “To pretend that technology people—even paired with the right subject-matter experts—can design, build, and internally test something and know that they have nailed it is the height of arrogance,” says Donald Davidoff, group vice president of strategic systems for Englewood, Colo.-based Archstone, who is involved in the rollout of virtually all of the company’s technologies, including LRO, lead management systems, and the use of call centers.
“It doesn’t matter how well you have designed it or how smart you think you are: The real world [will] exercise that system in ways that you never predicted,” Davidoff says. “Someone in the field is either going to use it differently than you expected or have some bright idea that might even extend applications into a different arena.”
Even multifamily technology software and systems providers agree that an “ignorance is bliss” mentality best serves systems once they are submitted for beta. “It’s not until you get it into people’s hands that a client will say, ‘How come it can’t do this and this?’” explains Monte Jones, Lakewood, Colo.-based First Advantage SafeRent’s senior vice president of national sales. “The refinements you make through that process are what beta testing is all about.”
Don’t get ahead of yourself, though. Tech providers and IT execs alike recommend careful consideration prior to entering the beta test. At First Advantage SafeRent, every new product in the company’s suite of resident screening, renter’s insurance, and performance analytics systems faces an interdisciplinary board for a go/no-go decision to proceed into beta testing, Jones says. Such initial “proof of concept” procedures allow marketing, accounting, and management pros to vet any given tech initiative. “We will look at all of the logistics—the operational side, the financial side, the technology side,” Jones explains. “For every eight to 10 initiatives that the board considers, one idea will make it through to the beta stage.”
Following the proof-of-concept stage, steps often include alpha testing and demos of version-one products to a development group. Once the beta test on larger user groups begins, however, things don’t get any easier. Nor should they, Davidoff argues. “The clear best practice in rolling out technologies is to apply the old adage: Crawl before you walk and walk before you run. It is critical at every stage to pay attention to your validation, to make sure that you really are right—not just rationalizing that you are right.”
At Archstone, the duration and user group size of beta tests vary dramatically depending on the complexity of the technology but virtually always include quantitative, qualitative, and scalability analyses, Davidoff says. Qualitative testing is user-centric and should include primary evaluations of a technology’s interface: Is it intuitive and bug-free? Does the technology basically feel right? Quantitative testing, on the other hand, involves budgetary comparisons between a beta test group and a control group to determine the business case for moving forward: Is the technology expensive to implement from an equipment or training standpoint? Does the technology demonstrate a savings in productivity or efficiency that corresponds to capital investments in development and rollout?
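The quantitative test boils down to a simple cost/benefit comparison. As a rough sketch, with every figure invented for illustration (none of these numbers come from Archstone or the article), the arithmetic looks like this:

```python
# Hypothetical quantitative beta analysis: do the savings measured in the
# beta group, relative to the control group, justify the capital investment?
# All figures below are invented for illustration only.

def business_case(beta_cost_per_unit, control_cost_per_unit, units, investment):
    """Return (annual savings, payback period in years) for a beta rollout."""
    annual_savings = (control_cost_per_unit - beta_cost_per_unit) * units
    payback_years = investment / annual_savings if annual_savings > 0 else float("inf")
    return annual_savings, payback_years

savings, payback = business_case(
    beta_cost_per_unit=95.0,      # annual ops cost per unit in the beta group
    control_cost_per_unit=110.0,  # same metric in the control group
    units=2_000,                  # units covered by the beta portfolio
    investment=180_000,           # development + training + equipment
)
# savings -> 30000.0, payback -> 6.0 years
```

A long payback period like the one above is exactly the kind of finding that would send a system back through another go/no-go review before a wider rollout.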
Finally, scalability analyses will determine if system requirements are viable considering available personnel and technological assets: Will the program overwhelm your help desk? How does it affect the speed of your systems platform? Does it eat memory or bandwidth? “You might put 20 properties on the resource, and everything is great. Then you move it to 40 properties, and everything is still great. Then you move it to 80 properties, and all of a sudden your systems are as slow as molasses,” Davidoff says. “That’s when you discover that it’s a good thing you didn’t try to drop it on 150 communities all at once.”
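The crawl-walk-run progression Davidoff describes can be thought of as a gated rollout: expand to the next tier of properties only after the current tier validates. A minimal sketch, where the phase sizes, metrics, and load model are all hypothetical:

```python
# Hypothetical sketch of the "crawl, walk, run" rollout described above.
# Property counts, the validation metric, and thresholds are invented.

def rollout(phases, validate):
    """Expand to each phase only if the current phase validates."""
    deployed = 0
    for count in phases:
        result = validate(count)  # e.g., response time, help-desk load
        if not result["ok"]:
            return deployed, f"halted before {count} properties: {result['reason']}"
        deployed = count
    return deployed, "full rollout validated"

def check(count):
    """Toy validator: assume the platform bogs down past 80 properties."""
    response_ms = 200 + count * 5  # invented load model
    if response_ms > 600:
        return {"ok": False, "reason": f"avg response {response_ms} ms"}
    return {"ok": True}

deployed, status = rollout([20, 40, 80, 150], check)
# With this toy model, the rollout succeeds at 20, 40, and 80 properties
# but halts before the jump to 150 -- mirroring Davidoff's anecdote.
```

The point of the pattern is that each phase produces a measurement before the next one is attempted, so a scaling failure strands 80 properties rather than 150.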
Naturally, complex programs such as portfolio modeling will have more narrowly defined user groups, which can significantly impact the time a technology takes to migrate from the drawing board to the motherboard. At Marietta, Ga.-based Wood Partners, the development and rollout of the company’s GEM green building program evaluation module in 2008 took about eight months to complete. “It’s definitely a fighter pilot system, so we’ll train about six employees in the company to use it across all of our projects,” says Wood Partners director Rick Mercer of the Microsoft Excel spreadsheet, which provides a comparative cost/value analysis of building a given project to EarthCraft House, LEED for Homes, and the National Association of Home Builders’ National Green Building Standard. “About five developmental people worked out the testing and wrestled this thing to the ground.”
Expanding Your Reach
Although Wood Partners also conducted a cross-disciplinary proof of concept review, Mercer says the system was almost singularly oriented to preconstruction and design personnel, limiting the need to expose GEM to a huge user group for testing. Still, the beta test for GEM endured two difficult go/no-go hurdles before finally proving itself out.
“At one point, it looked like it would be too cumbersome, and we were not going to be able to handle it,” Mercer says, adding that he chose to incorporate naysayers into the development team, helping to speed up release of the final product. “The people who might have problems with the system are going to be a vital part of your solution. They enable programmers to be a lot more responsive to the greater end user needs of the technology.”
The team at leasing solutions provider Realty DataTrust opted for a public beta test of PadZing.com, its real-time market data portal. “We wanted to get as much input from as many different people as possible,” says Mike Mueller, president of the Scottsdale, Ariz.-based company. “Blasting it out to the world helped us figure out exactly who would be interested in using PadZing.” The public beta test also revealed programming bugs that might have otherwise been difficult to find.
Ultimately, a successful beta test allows for user input into technology rollouts without overwhelming those users from an interface, ease-of-use, or program architecture standpoint. “People who use technology are not interested in, don’t know, and don’t want to know how it works. They want to use it to solve a problem,” says Doug Weiss, vice president of web development for Norfolk, Va.-based ForRent.com. “The challenge that we run into is that we get excited about technology for technology’s sake and forget that real people have to use it. Otherwise, it is destined to become the next eight-track cassette cartridge.”
Every Step Matters
Here’s how to ensure your beta tests pay precise attention to detail and employ strict methodologies.
1. Prove it. A cross-departmental team should review all tech initiatives at the “proof of concept” stage and provide preliminary feedback. Finance, marketing, operations, and IT should all be able to judge a system’s plausibility.
2. Manage the time. The step-by-step monotony of gradually introducing demo, alpha, and beta versions of systems can seem onerous, but the practice can drastically reduce problems up front. Consider requiring that technology survive several go/no-go votes of confidence.
3. Know your audience. Minor updates to existing technologies for a narrow user group might beta test out in a week. A global shift to a new property management system could take more than a year. Also, consider the impact of beta tests on your users as well as the stress that new technology might put on help desk personnel.