Risk Management and Planning for an Alien Invasion
Track: Security, Records and Data Management
Prepare for the worst and hope for the best. How likely is an alien invasion? Likely enough that you should plan for one? Risk management plans identify known and unknown risks, assign a potential response to each risk (Mitigate it? Avoid it?), establish the root cause of the potential risk, determine what action to take if the event happens, and provide a contingency plan of action so your program can live long and prosper.
- Risk: Aliens invade and sell copies of your exam on iPads
- Potential Response: Mitigate
- Root Cause: Delivery trucks containing iPads were not secured against aliens
- Action: Buy all the iPads
- Contingency: Wipe the minds of the iPad purchasers and republish the exam without iPad compatibility
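The risk-register entry above has a regular shape, so a program could capture each entry as a small structured record. A minimal sketch in Python, assuming hypothetical names (`Response`, `RiskEntry`) and a simplified set of response types; a real register would add fields like likelihood, impact, and owner:

```python
from dataclasses import dataclass
from enum import Enum


class Response(Enum):
    """Common risk-response strategies."""
    MITIGATE = "mitigate"
    AVOID = "avoid"
    TRANSFER = "transfer"
    ACCEPT = "accept"


@dataclass
class RiskEntry:
    """One row of a risk register, mirroring the fields listed above."""
    risk: str
    response: Response
    root_cause: str
    action: str
    contingency: str


# The alien-invasion example, expressed as a register entry.
entry = RiskEntry(
    risk="Aliens invade and sell copies of your exam on iPads",
    response=Response.MITIGATE,
    root_cause="Delivery trucks containing iPads were not secured against aliens",
    action="Buy all the iPads",
    contingency="Wipe the minds of the iPad purchasers and republish the exam",
)

print(entry.response.value)  # → mitigate
```

Keeping entries structured like this makes it easy to sort the register by response type or audit which risks still lack a contingency.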
Program sponsors must engage in proactive risk management and apply the principles of due care and due diligence to protect their programs. As technology continues to expand in scope and availability, and diminish in scale, what was once an unlikely, improbable risk could become pervasive tomorrow. Risk management practices must be employed openly and routinely to be effective.
Management at the program level can help avoid or mitigate risks in many ways. Examples include requiring multi-factor authentication to access item banks, watermarking hard copies of documents, establishing security practices that keep sensitive data off personal computers, limiting phone usage during workshops, and using secure FTP sites to transfer sensitive information.
Psychometric analysis can be a cost-effective way to review potential security concerns related to exams, candidates, and test centers equitably, because the same metric is applied to every unit of analysis. Examples include:
- basic longitudinal tracking of mean scores and/or mean test times, to see whether scores trend upward or total exam time trends downward after an exam form is released;
- reviewing candidate performance against pre-defined criteria, such as score versus exam time, score versus item times, or score on item subsets;
- reviewing a test center's performance against that of other test centers, either for a single exam title or across an entire certification program.
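The first example above, longitudinal tracking of mean scores and mean test times, can be sketched with a simple least-squares slope over successive administrations. This is an illustrative sketch, not a prescribed method: the data values, the `trend_slope` helper, and the flagging thresholds are all hypothetical, and a real program would choose thresholds from its own baseline variability:

```python
def trend_slope(values):
    """Least-squares slope of values over equally spaced time points
    (e.g., one mean per month since the exam form was released)."""
    n = len(values)
    mean_x = (n - 1) / 2
    mean_y = sum(values) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in enumerate(values))
    den = sum((x - mean_x) ** 2 for x in range(n))
    return num / den


# Hypothetical monthly means for one exam form after release.
mean_scores = [71.2, 71.5, 72.8, 74.1, 76.0, 78.3]  # percent correct
mean_times = [95, 94, 88, 81, 74, 70]               # minutes

# Rising scores combined with falling test times is the classic
# exposure signature; the cutoffs here are illustrative only.
if trend_slope(mean_scores) > 1.0 and trend_slope(mean_times) < -1.0:
    print("flag: possible item exposure on this form")
```

The same slope check generalizes to the other examples: apply it per candidate cohort, per item subset, or per test center, and compare each unit against the distribution of slopes across all units.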