In a recent article from UDG News Service, it was reported that hackers are paying top dollar (over $30,000) for older model mobile phones because they can be reprogrammed to facilitate illegal online bank transfers.
At the time the phone was made and its firmware developed, such banking transactions were probably not considered part of the function of a phone. Later advances in banking, networking, and telephony, however, have made modern mobile phones more like mobile computers, capable of functioning just like a laptop.
While reading this article, it occurred to me that if QFD had been used in the development of this phone, a "security deployment" should have been included as one of the steps.
In QFD, we have many deployments (see the diagram below for a partial list of deployments). Most common are quality, function, technology, reliability, parts, and manufacturing. Other deployments such as cost, environment, mechanism, task, hygiene, safety, etc. are less frequently used because they require more advanced skills.
You may notice that most of the deployments relate to positive design elements such as quality, function, parts, etc. Other deployments focus on negative design elements such as reliability/failure, safety/danger, etc. The purpose of these deployments is to predict what could go wrong and design it out, either by prevention or by mitigation. Advanced practitioners of QFD will also recognize that these negative deployments are kept separate from the quality deployment (for example, the House of Quality, or HoQ), so that negative elements do not gain or lose priority at the expense of the positive quality elements.
Think about it this way. In the case of this mobile phone security violation, if we reworded this into a customer need in the House of Quality, it might read something like this: "My phone cannot be used for illegal purposes even years after I have replaced it with a new model." It would no doubt earn a low priority when compared to a positive customer need such as: "I can locate where my friends are calling from," "I can book a lunch reservation easily," etc.
On the other hand, if we reworded the security violation into a customer need like "The residual value of my obsolete phone will be 300x higher than what I paid for it," it might earn a higher priority than these positive customer needs. In other words, were we to mix these negative design elements with positive customer needs, the priorities of the customer needs could be severely distorted. Instead, QFD prefers that these deployments be kept separate, either in different columns of the Maximum Value Table or in different matrices if you are doing traditional "house" QFD.
So, how might a Security Deployment work in a case where the technology enabling the risk has not yet even been invented? We might look first at risk assessments commonly done in IT QFDs.
In a case study done at National City Bank, "Quality Infrastructure Improvement: Using QFD to Manage Project Priorities and Project Management Resources," QFD was used to manage the risk of inadequate resources being spread over too many projects. This assessment was used to prioritize projects and resources based on value and risk to the corporation.
Another common risk assessment was presented by members of Toshiba Corporation's software engineering center at the 9th International Symposium on QFD in 2003. This paper on software FMEA (failure modes and effects analysis), "QFD for Software Development Considering Future Design Risks," aimed to explore potential future risks and countermeasures.
Most papers on IT development share a common premise: to decrease the total development cost of software, customer requirements and specifications should be made clear before development starts. However, even when developers try their best to gather and analyze the Voice of the Customer, on a practical level it is very difficult to prevent specification changes entirely. Customers often do not know how they will actually use a technology product, or do not notice the discrepancy between the product's features and their expectations, until they see the behavior of a finished IT product. They then start modifying the software or adding functions that the developers did not anticipate, experimenting and putting the product to uses beyond what the manufacturer originally intended in the design.
This makes it difficult for developers to compile a comprehensive list of all potential future risks, so the Toshiba team prepared a list of questions related to each of the quality characteristics and sub-characteristics defined in ISO/IEC 9126, such as functionality, accuracy, interoperability, and security. These can then be prioritized based on the risk they impose on the most important customer needs, where "customers" include both their own management (VOM) and external users (VOC).
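To make this prioritization step concrete, here is a minimal sketch of scoring quality sub-characteristics against weighted customer needs in the usual QFD-matrix fashion. All need weights and relationship strengths below are invented for illustration; they are not taken from the Toshiba paper.

```python
# Hypothetical sketch: score each ISO/IEC 9126 sub-characteristic by its
# weighted relationship to the customer needs (VOM and VOC combined).
# Every number below is invented for illustration only.

need_weights = {                      # normalized importance of each need
    "secure even after resale": 0.5,  # a management (VOM) concern
    "easy mobile banking": 0.3,       # an end-user (VOC) concern
    "fast call setup": 0.2,
}

# Relationship strengths use the common QFD 0/1/3/9 convention.
relationships = {
    "security":         {"secure even after resale": 9, "easy mobile banking": 3, "fast call setup": 0},
    "interoperability": {"secure even after resale": 1, "easy mobile banking": 9, "fast call setup": 3},
    "accuracy":         {"secure even after resale": 1, "easy mobile banking": 3, "fast call setup": 1},
}

def prioritize(rel, weights):
    """Weighted-sum score per sub-characteristic, highest priority first."""
    scores = {c: sum(weights[n] * s for n, s in links.items())
              for c, links in rel.items()}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

for characteristic, score in prioritize(relationships, need_weights):
    print(f"{characteristic}: {score:.2f}")
```

With these invented weights, "security" lands at the top precisely because it links strongly to the management-side need, even though end users would rarely voice it themselves.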
The security breach in these older mobile phones shows that future modification of an IT product, and the risk arising from it, is no longer limited to actions taken by the customer; developers must also consider the behavior of unknown abusers for whom the product is not intended. Security risks include access inspection, access controllability, data destruction prevention, data encryption, etc. If the mobile phone firmware developers had prioritized and analyzed these security risks from the viewpoint of managers, might they not have discovered the potential for hackers to abuse the phone and taken appropriate countermeasures?
The common calculation of the RPN (risk priority number), multiplying three rating scores (frequency of occurrence, severity, and detectability) on a 1-10 ordinal scale, violates proper use of math: ordinal numbers encode only rank order, so their products are meaningless. Click the table on the right for an example of this RPN math problem.
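Why multiplying ordinal scores is unsound can be demonstrated with a short, invented example: an order-preserving relabeling of one ordinal scale, which changes nothing about the rank information the ratings carry, can flip the resulting RPN ranking.

```python
# Invented example: multiplying ordinal ratings is unsound because the result
# depends on arbitrary scale labels, not just the rank order they encode.

def rpn(occurrence, severity, detectability):
    """Classic FMEA risk priority number: the product of three ratings."""
    return occurrence * severity * detectability

# Two hypothetical failure modes rated on 1-10 ordinal scales.
a = (5, 5, 5)   # moderate occurrence, severity, and detectability
b = (9, 2, 6)   # frequent but mild, and fairly hard to detect

print(rpn(*a), rpn(*b))    # 125 vs 108: A outranks B

# Relabel the severity scale monotonically (2 -> 4, 5 stays 5). Rank order
# is preserved, so as ordinal data the ratings carry the same information.
relabel = {2: 4, 5: 5}
a2 = (a[0], relabel[a[1]], a[2])
b2 = (b[0], relabel[b[1]], b[2])

print(rpn(*a2), rpn(*b2))  # 125 vs 216: now B outranks A
```

Since nothing about the underlying judgments changed, a procedure whose ranking flips under such a relabeling cannot be a valid way to prioritize risks.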
The Analytic Hierarchy Process (AHP), which produces ratio-scale priorities, should be used instead. The proper process is explained in the QFD Green Belt® and QFD Black Belt® courses.
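As a rough illustration only (this is a generic AHP sketch, not the procedure taught in those courses), ratio-scale priorities can be derived from a pairwise comparison matrix by approximating its principal eigenvector. The judgments below are hypothetical:

```python
# Generic AHP sketch: derive ratio-scale priorities from a pairwise
# comparison matrix by power iteration toward its principal eigenvector.
# The pairwise judgments below are hypothetical, for illustration only.

def ahp_priorities(matrix, iters=100):
    n = len(matrix)
    w = [1.0 / n] * n                       # start from equal weights
    for _ in range(iters):
        w = [sum(matrix[i][j] * w[j] for j in range(n)) for i in range(n)]
        total = sum(w)
        w = [x / total for x in w]          # renormalize each pass
    return w

# Pairwise judgments on Saaty's 1-9 scale for three of the security risks
# named above: M[i][j] = importance of risk i relative to risk j.
risks = ["data encryption", "access controllability", "access inspection"]
M = [
    [1,     3,   5],
    [1/3,   1,   2],
    [1/5, 1/2,   1],
]

for risk, weight in zip(risks, ahp_priorities(M)):
    print(f"{risk}: {weight:.3f}")
```

Unlike RPN products of ordinal scores, the resulting weights are ratio-scale numbers that sum to one, so they can legitimately be multiplied and compared in later QFD calculations.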
© QFD Institute