Complexity Tables, General System Characteristics and FPA Summary Tutorial

5.2 IFPUG FPA Complexity Tables General System Characteristics and FPA Summary

Hello and welcome to the Software Estimation course offered by Simplilearn. In the previous module, we covered the counting rules for data and transaction functions. In this module, we will complete the IFPUG FPA. Let us look at the agenda of this module in the next slide.

5.3 Agenda

We will begin with a quick recap of the FPA counting process. Next, we will understand what complexity tables are. This will be followed by a discussion on unadjusted function points. We will then take up the fourteen general system characteristics. The module will conclude with a look at adjusted function points, and some tips and tricks to keep in mind while using FPA. Let us recap the FP counting process in the next slide.

5.4 Recap of the FP Counting Process

In the previous module, we understood that data functions are broken into data element types and record element types, while transaction functions are broken into data element types and file types referenced. These components are individually analyzed and counted to determine the unadjusted function point count. Complexity tables are used in IFPUG Function Point Analysis to rate the data and transaction functions, based on their respective DET, RET, and FTR counts, and thereby determine the unadjusted function point count. We will look at these complexity tables in the next slide.

5.5 Function Point Analysis Complexity Tables

The functional complexity of the data or transaction functions is determined using the respective complexity tables. Table-1 shows the functional complexity table for ILFs and EIFs, Table-2 is the functional complexity table for EIs, and Table-3 is the functional complexity table for EQs and EOs. These tables help in determining the complexity of the data or transaction functions, based on the number of DETs and RETs for data functions, and the number of DETs and FTRs for transaction functions. For example, if an ILF has ten DETs and 3 RETs, then its complexity is ‘Low’. Similarly, an EO with 30 DETs and 5 FTRs would be rated as ‘High’. Once all the data and transaction functions are identified and their DETs, RETs, and FTRs are counted, these complexity tables are used to determine the complexity of each function. Note that these complexity tables are fixed and cannot be changed; the values in the DET vs. RET and DET vs. FTR matrices cannot be altered. In the next slide, we will see how the complexity of the data and transaction functions is used to determine the unadjusted function point count.
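To make the lookup concrete, here is a minimal sketch in Python (illustrative only, not official IFPUG tooling) that encodes the three matrices, assuming the standard IFPUG band boundaries for DETs, RETs, and FTRs. The function names and structure are the author of this sketch's own choices.

```python
# Illustrative encoding of the fixed IFPUG complexity matrices.
# Band boundaries assume the published IFPUG tables:
#   ILF/EIF : RETs (1 / 2-5 / 6+)   vs. DETs (1-19 / 20-50 / 51+)
#   EI      : FTRs (0-1 / 2 / 3+)   vs. DETs (1-4 / 5-15 / 16+)
#   EO/EQ   : FTRs (0-1 / 2-3 / 4+) vs. DETs (1-5 / 6-19 / 20+)

def ilf_eif_complexity(dets: int, rets: int) -> str:
    """Rate an ILF or EIF as Low / Average / High from its DET and RET counts."""
    det_band = 0 if dets <= 19 else 1 if dets <= 50 else 2
    ret_band = 0 if rets == 1 else 1 if rets <= 5 else 2
    matrix = [
        ["Low",     "Low",     "Average"],   # 1 RET
        ["Low",     "Average", "High"],      # 2-5 RETs
        ["Average", "High",    "High"],      # 6+ RETs
    ]
    return matrix[ret_band][det_band]

def ei_complexity(dets: int, ftrs: int) -> str:
    """Rate an EI from its DET and FTR counts."""
    det_band = 0 if dets <= 4 else 1 if dets <= 15 else 2
    ftr_band = 0 if ftrs <= 1 else 1 if ftrs == 2 else 2
    matrix = [
        ["Low",     "Low",     "Average"],   # 0-1 FTR
        ["Low",     "Average", "High"],      # 2 FTRs
        ["Average", "High",    "High"],      # 3+ FTRs
    ]
    return matrix[ftr_band][det_band]

def eo_eq_complexity(dets: int, ftrs: int) -> str:
    """Rate an EO or EQ from its DET and FTR counts."""
    det_band = 0 if dets <= 5 else 1 if dets <= 19 else 2
    ftr_band = 0 if ftrs <= 1 else 1 if ftrs <= 3 else 2
    matrix = [
        ["Low",     "Low",     "Average"],   # 0-1 FTR
        ["Low",     "Average", "High"],      # 2-3 FTRs
        ["Average", "High",    "High"],      # 4+ FTRs
    ]
    return matrix[ftr_band][det_band]

print(ilf_eif_complexity(dets=10, rets=3))   # Low  (the ILF example above)
print(eo_eq_complexity(dets=30, ftrs=5))     # High (the EO example above)
```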

5.6 Unadjusted Function Point (UFP)

Once the complexity of the data and transaction functions is identified using the complexity tables discussed in the previous slide, the table shown in this slide is used to determine the function points contributed by each function. Based on the complexity determined from the number of DETs and RETs, or FTRs, the number of function points contributed by the function is read off this table. For example, if the complexity of an ILF is determined as ‘Low’, then the size of that ILF is seven function points. Similarly, if the complexity of an EQ is ‘High’, then the size of that EQ is six function points. As evident from the table in the slide, ILFs contribute the most function points, followed by EIFs and then the transaction functions. This is logical because, in any application, the storage and data model play a primary role, followed by external references and transactions. Similar to the complexity tables, the values in this table remain constant and should not be changed. The measure that arises out of identifying and counting functions is called the unadjusted function point count, which is the sum of the function points of all the data and transaction functions. This measure signifies only the core functionality provided by the application; it does not include other factors that play a critical role in the application. Those factors are classified under general system characteristics. Let us identify the unadjusted function point count for the currency converter example in the next slide.
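Continuing the illustrative Python sketch, the weight table can be encoded and summed to produce the unadjusted function point count. The weights below assume the standard IFPUG values (ILF 7/10/15, EIF 5/7/10, EI 3/4/6, EO 4/5/7, EQ 3/4/6), and the list of functions at the end is purely hypothetical, included only to show the mechanics.

```python
# Assumed standard IFPUG weights per function type and complexity.
WEIGHTS = {
    "ILF": {"Low": 7, "Average": 10, "High": 15},
    "EIF": {"Low": 5, "Average": 7,  "High": 10},
    "EI":  {"Low": 3, "Average": 4,  "High": 6},
    "EO":  {"Low": 4, "Average": 5,  "High": 7},
    "EQ":  {"Low": 3, "Average": 4,  "High": 6},
}

def unadjusted_fp(functions):
    """Sum the weights of (function_type, complexity) pairs to get the UFP."""
    return sum(WEIGHTS[ftype][complexity] for ftype, complexity in functions)

# Hypothetical counted functions, just to show the mechanics:
example = [("ILF", "Low"), ("EI", "Average"), ("EQ", "High")]
print(unadjusted_fp(example))  # 7 + 4 + 6 = 17 unadjusted function points
```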

5.7 UFP for Currency Converter

The table shown here lists the data and transaction functions identified and counted for the currency converter application. Based on the complexity tables, the size of each function is determined. The sum of all these function points yields the total unadjusted function point count (UFP). In the case of the currency converter, the total UFP is 29 function points. In the next slide, we will be introduced to the concept of general system characteristics.

5.8 General System Characteristics (GSCs)

Apart from the core functionality of the application, there are a few inherent characteristics of the application that need to be included as factors contributing to its overall size. These system characteristics, usually referred to as general system characteristics, are derived from the customer's non-functional requirements. Non-functional requirements are those requirements that convey the attributes the application must possess; examples include the performance of the application, the number of concurrent users it must support, and so on. To get a feel for why GSCs are important, consider an application used to record orders placed by customers. Clearly, an application that must handle 1000 orders per hour is larger than one that needs to handle only 10 orders per hour, because the work needed to design an architecture that supports such high performance requirements must be accounted for. These attributes are not captured by the process of identifying and rating data and transaction functions discussed in the previous module. The general system characteristics are used to fine-tune, or adjust, the size of the application measured in function points. In IFPUG FPA, there are 14 general system characteristics, each rated by its degree of influence on the application. The degrees of influence are rated on a scale of zero to five, with 0 indicating no impact and 5 indicating very high impact. In the next slide we will take a look at the rating guidelines for each GSC.

5.9 GSC Rating Guidelines

Based on the stated user requirements, each GSC must be rated on a scale of zero to five according to its degree of influence. For example, if a particular GSC is not present in the application, it is rated as zero, while a very strong influence throughout the application is rated as 5. A rating of one denotes incidental influence, two denotes moderate influence, three denotes average influence, and four denotes significant influence. The guidelines for each rating on the scale are shown in the table. In the next slide we will take a look at the fourteen general system characteristics as per IFPUG FPA.
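For quick reference, the 0-to-5 scale described above can be captured in a small mapping; the wording below paraphrases the guideline table and is not the exact IFPUG phrasing.

```python
# Degree-of-influence scale for each GSC (paraphrased descriptions).
DEGREE_OF_INFLUENCE = {
    0: "Not present, or no influence",
    1: "Incidental influence",
    2: "Moderate influence",
    3: "Average influence",
    4: "Significant influence",
    5: "Strong influence throughout the application",
}
```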

5.10 The 14 GSCs

The 14 GSCs prescribed by IFPUG function point analysis are shown in this slide. Note that while we considered the logical user requirements from the user's perspective to determine the unadjusted function point count, the GSCs are based on the technical or implementation aspects of the application. Refer to the URL provided here for more detailed information on general system characteristics. We'll briefly understand each of these GSCs in the next slide.

5.11 GSCs at a Glance

Data communications – This factor describes the design attribute of the application for the mode of transfer of data within and outside the application. Various communication protocols like FTP, dial-in, Ethernet, etc., have varying degrees of complexity, and these differences are considered in this GSC.

Distributed data processing – This factor describes the technical attribute of the application that allows distributed data processing. The complexity of an application where the core processing occurs on a single server is different from that of an application where processing is spread across various servers, hardware, and operating systems.

Performance – This factor describes the technical attribute of the application that signifies its performance. Considerations like concurrent users, load time, etc., are part of this factor.

Heavily used configuration – This factor describes the attribute of the application that allows users to configure it based on their needs. For example, consider a simple Windows application where the look and feel is fairly static. Contrast this with an application that allows the user to choose themes, place widgets, and so on. The complexity of the latter application is higher than that of the former, and thus the degree of influence for this factor would be high.

Transaction rate – This factor describes the attribute of the application based on the frequency of the transactions that take place. The complexity of an application that is minimally used would be low compared to an application that handles frequent transactions from its users.

On-line data entry – This factor describes the application's attribute of passing or retrieving data on-line or in real time. If the transactions in the application are predominantly on-line, then this factor's degree of influence would be high.

End-user efficiency – This factor describes the application's attribute of being intuitive to use. If the application's user interface is extremely intuitive and requires little effort from the user, then the degree of influence would be high for this factor.

We'll continue describing the other GSCs in the next slide.

5.12 GSCs at a Glance (contd.)

On-line update – This is similar to the on-line data entry factor of the previous slide. The main difference is that this factor reflects how many ILFs are maintained through such on-line transactions. If the percentage of ILFs updated by on-line transactions is high, the degree of influence would be high for this factor.

Complex processing – This factor describes the application's attribute of containing complex mathematical operations and algorithms to retrieve and report data. If the application is mission-critical with complex operations, the degree of influence for this factor will be high.

Reusability – This factor is an implementation attribute that signifies what percentage of the code has been reused while developing the application. If the amount of reused code is high, or the application has been designed to dynamically generate and reuse code, the degree of influence would be higher.

Installation ease – This factor describes how easy the application is to install. If no external intervention is required to install the application in the user's environment, the degree of influence for this factor is rated high.

Operational ease – This factor is an application attribute that describes the degree to which the application handles operational aspects like start-up, shut-down, back-up, etc. If the application reduces human intervention for operational and maintenance purposes, this factor would be rated high.

Multiple sites – This factor is an implementation attribute that describes the degree to which the need for installation at multiple sites was considered while designing the application. If the application was designed to be installed in just one location, this factor would be rated low. If the application is designed to be installed on a variety of hardware and operating systems, the factor would be rated high.

Facilitate change – This is also an implementation attribute, describing the degree to which the application is developed to facilitate changes in operational logic, processing, or data structure. If the application is developed to handle frequent changes, this factor is rated high.

We will find out how the adjusted function point count is derived in the next slide.

5.13 Adjusted Function Point (AFP)

Once the fourteen GSCs have been rated, the total degree of influence (TDI) is calculated as the sum of the degrees of influence of the fourteen GSCs. The value adjustment factor (VAF) is then calculated as 0.65 plus 1 percent of the TDI, that is, VAF = 0.65 + (0.01 * TDI). Next, the adjusted function point count is calculated by multiplying the value adjustment factor with the unadjusted function point count. Care has to be taken while rating the GSCs, as the value adjustment factor can change the overall function point count by up to 35 percent; that is, it can increase or reduce the adjusted function points by 35 percent. For example, consider that all the GSCs are rated as zero and the unadjusted function point count is 50 function points. Then the TDI is 0 and the value adjustment factor is 0.65, so the adjusted function point count would be 50 multiplied by 0.65, which is equal to 32.5 function points. On the other hand, if all the GSCs are rated as 5, the TDI is 70 and the VAF is 1.35, so the adjusted function point count would be 1.35 multiplied by 50, which equals 67.5 function points. As evident, the GSCs can reduce or increase the unadjusted function point count by a maximum of 35 percent. Before we summarize IFPUG FPA, let us understand some special cases, or tips and tricks, while performing function point analysis in the next slide.
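As a sketch of the arithmetic just described (illustrative Python, not an official tool), the adjustment step can be written as follows; the two calls at the end reproduce the 32.5 and 67.5 figures from the example above.

```python
def adjusted_fp(ufp: float, gsc_ratings: list[int]) -> float:
    """Apply the value adjustment factor: AFP = UFP * (0.65 + 0.01 * TDI)."""
    assert len(gsc_ratings) == 14, "IFPUG defines exactly 14 GSCs"
    tdi = sum(gsc_ratings)       # total degree of influence, ranges 0..70
    vaf = 0.65 + 0.01 * tdi      # value adjustment factor, ranges 0.65..1.35
    return ufp * vaf

print(adjusted_fp(50, [0] * 14))   # 50 * 0.65 = 32.5 function points
print(adjusted_fp(50, [5] * 14))   # 50 * 1.35 = 67.5 function points
```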

5.14 Tips and Tricks

While identifying the application boundary in a typical client-server application, the boundary has to be drawn around both the client and the server. There is usually a tendency to consider the client system and the server as two different entities, but from the user's point of view, the client and the server together make up the application and provide the required functionality.

While identifying the DETs for data or transaction functions, it is possible that a data element is either quantitative or qualitative. Quantitative data elements are data in numerical form; examples are salary details, scores in exams, etc. Qualitative data elements are data which are not numerical; examples are texts, sound bites, pictures, etc. While determining the DETs in a transaction, all such data elements have to be included.

Three types of response messages are possible in a transaction: error, confirmation, and notification messages. Error and confirmation messages indicate that a transaction has failed or succeeded, respectively. They are not elementary or independent processes, so such messages are not considered separate transactions but are counted as data elements of the transaction function. However, notification messages are business requirements. A notification is an elementary process, meaningful to the user, and independent of other transactions. Consider the notification message displayed while withdrawing money from an ATM: if the requested amount is more than the available balance, the system generates an ‘Insufficient Balance’ message. In the background, the system calculates the balance and accordingly generates the message. Thus, the notification message has to be considered an external output, and not a mere data element of a transaction.

In a transaction, two types of data elements are possible - business data and control data. Business data is the core business data that is keyed in (in the case of an EI) or generated (in the case of an EQ or EO) for the user. Examples of business data are employee name, job role, designation, age, etc., in the case of an employee system. Control data is the data element that invokes the transaction or alters the behaviour of the system; control information specifies how, what, and when data will be processed. Examples of control data are buttons, sort commands, filter commands, etc. These data elements are meaningful to the user and hence have to be considered as DETs in a transaction.

With this we complete this module as well as the study of IFPUG FPA. Let us summarize the key elements of FPA in the next slide.

5.15 Summary

We started this module with the IFPUG FPA complexity tables. The data and transaction functions are classified as low, average, or high complexity based on the number of DETs and RETs or FTRs. Once the complexities of the functions are determined, their respective functional sizes are determined. We then covered the general system characteristics (GSCs). There are 14 GSCs, which essentially capture the non-functional requirements of the application. Each GSC has to be rated on a scale of 0 to 5 based on its degree of influence on the application. The sum of these ratings is called the total degree of influence (TDI). The value adjustment factor is then calculated as VAF = 0.65 + (0.01 * TDI). This VAF is used to adjust the unadjusted function point count: the adjusted function point count is calculated as the unadjusted function point count multiplied by the value adjustment factor. With this we come to the end of IFPUG FPA. In the next module, we'll revisit the types of function points, understand the shortcomings of IFPUG, and look at the alternatives.
