Obligation Complexity Measure over Code and
Cognitive Complexity Measures
Abstract: Basili [1] defines complexity as a measure of the resources expended by a system while interacting with a piece of software to perform a given task. Procedural complexity represents the logical structure of a program. The code and cognitive complexity measures available today depend on the code of the program, so these approaches require time before they can be applied. In this paper a new approach named the Obligation Complexity Measure is proposed to make complexity computation code independent. The method is derived from the SRS (Software Requirement Specification). Controlling the procedural complexity of a program helps in reducing both its time and space complexity.
Keywords: Procedural complexity, Code based complexity measures, Cognitive complexity measures and Obligation complexity measure.
Anurag Bhatnagar, M.Tech student, RTU Kota
Shweta Shukla, M.Tech, Banasthali Vidyapith
ISSN 2319-9725
1. Introduction: The Latin word complexus signifies "entwined" or "twisted together". This may be interpreted in the following way: in order to have a complex product we need two or more components joined in such a way that it is difficult to separate them. Similarly, the Oxford Dictionary defines something as "complex" if it is "made of (usually several) closely connected parts". In the same way, when a set of program instructions is written together, the software starts to show complexity. The IEEE [2], on the other hand, defines complexity as the degree to which a system or component has a design or implementation that is difficult to understand and verify. In practical life anything that is not in its smallest part, or is not a single unit, is complex; in the case of software, the true meaning of software complexity is the difficulty of maintaining, changing and understanding the software. Generally we are concerned with the time and space complexity of a program, but this paper is concerned with the procedural complexity of procedural programs and with comparing a new approach, the Obligation Complexity Measure, with established code based and cognitive approaches.
2. Code Based Complexity Measures [3][4][14]: These measures depend on the code of the program. Some of the parameters on which they rely are program size, program flow graphs, and module interfaces. Among the code based complexity measures are Halstead's software science metrics [4] and the most widely known measure, the cyclomatic complexity developed by McCabe [7]. Halstead's software science metrics are purely concerned with the numbers of operators and operands and do not capture the internal structure of a program or module, while McCabe's cyclomatic complexity does not consider the inputs and outputs of a system, since it is based on the flow graph of the program: from the graph's nodes and edges it derives the complexity of the program. The main motive of this paper is to provide an easier approach for calculating the complexity of procedural programs; part of the paper is also devoted to an analysis of the various available methods, classified as code based and cognitive software complexity measures.
2.1 Halstead Complexity Measure [5][6]: A set of software metrics was developed by Maurice Halstead in 1977 to find the complexity of procedural programs. The method measures the computational complexity of a program component directly from its source code [4]. Halstead's measure uses the distinct as well as the total numbers of operators and operands.

Operators [4][5]: Assignment, arithmetic and logical operators are counted as operators. A pair of parentheses, as well as a block-begin/block-end pair, is considered a single operator. A label is considered an operator if it is the target of a GOTO statement. The if..then..else, while and do..while constructs are each considered a single operator. The statement-termination symbol ";" is counted as a single operator. The function name in a function call statement is counted as an operator, but it is not an operator in a function definition or declaration.

Operands [4][5]: Variables and constants used with operators in expressions are operands. Subroutine declarations and variable declarations are considered operands. The argument list in a function call is counted as operands, but it is considered an operand neither in the function definition nor in the function declaration.

N1: total number of operators; N2: total number of operands; n1: number of distinct (non-recurring) operators; n2: number of distinct (non-recurring) operands [8]. The principal calculations in the Halstead complexity measure are:

Length N = N1 + N2 (1)
Vocabulary n = n1 + n2 (2)
Volume V = N log2 n (3)
Latent volume V* = (2 + n2) log2 (2 + n2) (4)
Program level L = V*/V (5)
Effort E = V/L (6)
Time T = E/18 (7)
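As a minimal illustration of these formulas (a sketch, not part of the paper; the counts are assumptions for a single statement such as c = a + b;), the quantities can be computed directly in C:

#include <math.h>
#include <stdio.h>

int main(void)
{
    /* Assumed counts for the statement "c = a + b;":
       operators =, + and ;, operands c, a and b (all distinct). */
    double N1 = 3, N2 = 3;   /* total operators and operands */
    double n1 = 3, n2 = 3;   /* distinct operators and operands */

    double N  = N1 + N2;                    /* length, eq. (1) */
    double n  = n1 + n2;                    /* vocabulary, eq. (2) */
    double V  = N * log2(n);                /* volume, eq. (3) */
    double Vs = (2 + n2) * log2(2 + n2);    /* latent volume, eq. (4) */
    double L  = Vs / V;                     /* program level, eq. (5) */
    double E  = V / L;                      /* effort, eq. (6) */
    double T  = E / 18.0;                   /* time in seconds, eq. (7) */

    printf("N=%.0f n=%.0f V=%.2f E=%.2f T=%.2f\n", N, n, V, E, T);
    return 0;
}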
2.2 McCabe's Cyclomatic Complexity [4][7]: In 1976, Thomas J. McCabe provided a method for calculating cyclomatic complexity from the control flow graph of a program [8]. The cyclomatic complexity measure is an example of open engineering, as it measures the amount of decision logic in a source code function. McCabe's cyclomatic complexity, also known as conditional complexity, is based on control flow: it denotes the number of linearly independent paths through a program's source code [4]. The measure provides a single ordinal number that can be used to compare the complexity of different programs. The metric is calculated using equation (8):

V(G) = e - n + 2p (8)

where e is the number of edges of the graph, n is the number of nodes, and p is the number of connected components of the graph. An equivalent formula for a single-component graph is

V(G) = number of decision nodes + 1

Cyclomatic complexity can be computed earlier in the life cycle than Halstead's metrics, but there are some difficulties with the McCabe metric. Although no one would argue that the number of control paths relates to code complexity, some argue that this number is only part of the complexity picture. According to McCabe's metric, a 5,000-line program with six IF/THEN statements is less complex than a 500-line program with seven IF/THEN statements, which shows that the complexity contributed by non-branching statements is ignored.
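As a small illustration of the decision-node formula above (a sketch, not an example from the paper), the following C function contains two decision nodes, so V(G) = 2 + 1 = 3, matching its three linearly independent paths:

/* Two decision nodes (the two ifs) => V(G) = 2 + 1 = 3. */
int sign(int x)
{
    if (x > 0)
        return 1;     /* path 1 */
    if (x < 0)
        return -1;    /* path 2 */
    return 0;         /* path 3 */
}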
3. Cognitive Complexity Measures [9][10][11][14]: In cognitive informatics, the functional complexity of software in design and comprehension depends on fundamental factors such as inputs, outputs, loop/branch structure, and the numbers of operators and operands [9].
3.1 KLCID Complexity Metric [10][14]: Klemola and Rilling proposed KLCID, which treats identifiers as programmer-defined labels; it measures the use of programmer-defined variables and identifiers (ID) as software is built up [11]:

ID = total number of identifiers / LOC

To calculate KLCID, we need the number of unique lines of code in a module; lines that have the same type and kind of operands, with the same arrangement of operators, are considered equal. KLCID is then defined as

KLCID = number of identifiers in the set of unique lines / number of unique lines containing identifiers

This is a time-consuming method, since it compares each line of code with every other line of the program. KLCID also assumes that the internal control structures of different software are identical.

3.2 Cognitive Functional Size (CFS) [9][11]: Wang proposed the Cognitive Functional Size, which states that the complexity of software depends on its inputs, outputs and internal processing [12]:

CFS = (Ni + No) * Wc

where Ni is the number of inputs, No is the number of outputs, and Wc is the total cognitive weight of the software. The cognitive weight of software [11] is the degree of intricacy, or the relative time and effort required to comprehend the given software, modelled by a number of basic control structures (BCS).

3.3 Cognitive Information Complexity Measure [11][13]: Cognitive informatics plays an important role in understanding the fundamental characteristics of software. CICM [14] is defined as the product of the weighted information count of the software (WICS) and the cognitive weight (Wc) of the BCSs in the software:

CICM = WICS * Wc
Here WICS is the sum of the weighted information counts of the lines of code (WICL). The WICL for the k-th line of code is given by

WICLk = ICSk / (LOCS - k)

where ICSk is the information contained in the software for the k-th line of code and LOCS is the total number of lines of code. ICS is given by

ICS = Σ Ik , k = 1 to LOCS

where Ik is the information contained in the k-th line of code, calculated as

Ik = (identifiers + operands)k

Note that, similar to KLCID, CICM is difficult and complex to calculate, since it computes the weighted information count of each line. In their formulation the authors claim that, based on cognitive informatics, the functional complexity of software depends only on inputs, outputs and internal architecture, not on the operators; yet they also claim that information is a function of identifiers and operators. It is difficult to see how information can be a function of operators: operators are run-time attributes and cannot be taken as information contained in the software.
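To make these definitions concrete, consider an illustrative worked example (the figures are assumptions, not taken from the paper's tables): the two-number addition program used later in this paper has Ni = 2 inputs and No = 1 output, and its control flow is a plain sequence, whose BCS weight in Wang's scheme is 1. Under these assumptions, CFS = (2 + 1) * 1 = 3 cognitive weight units.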
4. Obligation Complexity Measure: Code based complexity measures such as the Halstead complexity measure and McCabe's cyclomatic complexity are based on the source code of procedural programs. Cognitive complexity measures such as Kinds of Lines of Code Identifier Density (KLCID), Cognitive Functional Size (CFS) and the Cognitive Information Complexity Measure (CICM) depend on the internal architecture of procedural programs [1]. Both families of methods must therefore wait for the source code of the program and take more time to apply. It would be more beneficial if the complexity of procedural programs could be calculated in the earlier phases of the software development life cycle, at the time of the preliminary assessment, that is, during requirement analysis. OCM is extracted from the SRS and applied at requirement analysis time, so the merit of this approach is that it can estimate program complexity in the early phases of the software development life cycle, even before analysis and design are carried out. This makes it a cost-effective and less time-consuming approach. It is implemented by considering the following attributes.

4.1 Key In Out (KIO): This parameter is basic and easy to calculate. KIO is defined as

KIO = No. of inputs + No. of outputs + No. of files + No. of interfaces

4.2 Functional Requirement (FR): A procedural program is generally written using a functional approach, although it is not essential that every procedural program be coded that way. The functional requirements define the elementary tasks that must take place. FR is defined as

FR = Σ SPF_i , i = 1 to n

where n is the number of functions and SPF_i is the number of sub-processes or sub-functions obtained after decomposing function i.

4.3 Non Functional Requirement (NFR): This refers to the qualitative requirements of the system; failing to satisfy them leads to customer dissatisfaction. It is represented as

NFR = Σ Count_j , j = 1 to n

where Count_j is the count of non-functional requirements of type j (the types are listed in Table 1).
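As an illustrative aside (the values are assumptions for the two-number addition program considered later, not figures from the paper): KIO = 2 inputs + 1 output + 0 files + 0 interfaces = 3; FR = 1 for the single function add with one sub-process; and NFR = 0 if no qualitative requirements are stated.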
Table 1: Different Types of Non Functional Requirement

4.4 Obligatory Complexity (OC): This is calculated as the sum of all functional requirements, including their decomposition into sub-functions, and all non-functional requirements:

OC = FR + NFR

4.5 Special Complexity Attributes (SCA): This refers to the cost driver attributes of the unique category in the COCOMO Intermediate model proposed by Barry Boehm. Mathematically,

SCA = Σ MF_i , i = 1 to 5

where MF_i is a multiplying factor.

4.6 Design Constraints Imposed (DCI): This is the number of constraints that must be considered during the development of the software:

DCI = Σ C_i , i = 0 to n

where C_i is the number of constraints and varies from 0 to n: C_i = 0 for blind (unconstrained) development, and C_i is non-zero if constraints exist.

4.7 Interface Complexity (IFC): This parameter gives the number of external interfaces to the proposed program:

IFC = Σ I_f , from 0 to n

where I_f is the number of external interfaces and varies from 0 to n: I_f = 0 if there is no external interface, and I_f is non-zero if external interfaces exist.

4.8 Users / Location Complexity (ULC): This parameter captures the number of users accessing the program and the number of locations (single or multiple) from which it is used:

ULC = No. of users * No. of locations

4.9 Program Feature Complexity (PFC): If the program is to be enhanced, features are added, and this parameter expresses program feature complexity as the product of all features that have been added:

PFC = Feature_1 * Feature_2 * ... * Feature_n

Considering all these parameters, the new measure, the Obligation Complexity Measure, is mathematically represented as

OCM = ((KIO + OC) * SCA + (DCI + IFC + PFC)) * ULC

An example program for the addition of two numbers using a user-defined function is considered; all the approaches have been applied to it, and the results are shown in Tables 2 and 3. A histogram chart for comparison is shown in Figure 1. By implementing OCM, Table 2 is obtained.
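Before turning to the tables, a minimal C sketch shows how the OCM formula combines these attributes; the parameter values below are assumptions chosen for an addition program of this kind, not figures taken from Table 2:

#include <stdio.h>

int main(void)
{
    /* Assumed OCM parameters for a two-number addition program;
       these values are illustrative, not the paper's Table 2 data. */
    int kio = 3;            /* 2 inputs + 1 output, no files/interfaces */
    int fr  = 1;            /* one function (add) with one sub-process */
    int nfr = 0;            /* no non-functional requirements counted */
    int oc  = fr + nfr;     /* obligatory complexity: OC = FR + NFR */
    double sca = 1.0;       /* neutral multiplying factor */
    int dci = 0, ifc = 0, pfc = 0;
    int users = 1, locations = 1;
    int ulc = users * locations;

    double ocm = ((kio + oc) * sca + (dci + ifc + pfc)) * ulc;
    printf("OCM = %.2f\n", ocm);   /* prints OCM = 4.00 for these values */
    return 0;
}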
Table 2: Obligation Complexity Measure

Now, by considering the code of the program and implementing the other approaches on it, Table 3 is obtained.
#include <stdio.h>
#include <conio.h>   /* Turbo C specific: clrscr(), getch() */

float add(float, float);

void main()
{
    float a, b, c;
    clrscr();
    printf("Enter the value for a & b\n\n");
    scanf("%f %f", &a, &b);
    c = add(a, b);          /* user-defined function call */
    printf("\nc=%f", c);
    getch();
}

float add(float x, float y)
{
    float z;
    z = x + y;
    return (z);
}
Table 3: Code and Cognitive Complexity Measures

5. Comparison Between Various Complexity Measures: OCM is applied to a program developed in the C language that adds two numbers using a user-defined function. In order to analyze the validity of the result, the OCM is calculated before coding and then compared with the established code based and cognitive complexity measures. The comparison is shown in the chart in Figure 1.
Figure 1: Comparison among various complexity measures. The chart shows both the code based and cognitive complexity measures (including Halstead, KLCID and CICM) along with the Obligation Complexity Measure, on a scale of 0 to 12.
6. Merits of OCM over Code and Cognitive Complexity Measures: Other code based and cognitive complexity measures require the source code of the program, whereas with the Obligation Complexity Measure (OCM) the time and cost of first developing the code are saved, since OCM gauges complexity in the very first phase of the software development life cycle, the requirement analysis phase, in which the software requirement specification is created. Code based and cognitive complexity measures are hard to calculate, but OCM involves less calculation and is an easier approach by which procedural complexity can be measured. The Obligation Complexity Measure also improves the quality of the program, since Design Constraints Imposed (DCI), Interface Complexity (IFC), Users/Location Complexity (ULC) and Program Feature Complexity (PFC) are parameters that are considered whenever they exist. Code based complexity measures, on the other hand, must wait for the source code of the program, and even then they have no way to account for design constraints, the interfaces needed, the locations from which the program is accessed, the number of users who can access it, or any parameter capturing program features.
References
1. Basili, V.R., "Qualitative Software Complexity Models: A Summary", in Tutorial on Models and Methods for Software Management and Engineering, IEEE Computer Society Press, Los Alamitos, Calif., 1980.
2. IEEE Standard Glossary of Software Engineering Terminology, IEEE Std 610.12-1990.
3. Al Qutaish, Rafa E., "An Analysis of the Design and Definitions of Halstead's Metrics".
4. Mall, Rajib, Fundamentals of Software Engineering, 3rd edition, 2011, ISBN 978-81-203-3819-7.
5. Halstead, M.H., Elements of Software Science, Elsevier North-Holland, New York, 1977.
6. Kearney, J.K., et al., "Software Complexity Measurement", Communications of the ACM, Vol. 29, No. 11, November 1986, pp. 1044-1050.
7. McCabe, T.J., "A Complexity Measure", IEEE Transactions on Software Engineering, SE-2(4), pp. 308-320, 1976.
8. Weyuker, Elaine J., "Evaluating Software Complexity Measures", IEEE Transactions on Software Engineering, Vol. 14, No. 9, September 1988.
9. Misra, Sanjay, "A Complexity Measure Based on Cognitive Weights", International Journal of Theoretical and Applied Computer Sciences, Vol. 1, No. 1, 2006, pp. 1-10, GBS Publishers and Distributors (India).
10. Keshavarz, Ghazal, Modiri, Nasser, Pedram, Mirmohsen, "A Model for the Controlled Development of Software Complexity Impacts", International Journal of Computer Science and Information Security (IJCSIS), Vol. 9, No. 6, 2011.
11. Auprasert, Benjapol, and Limpiyakorn, Yachai, "Structuring Cognitive Information for Software Complexity Measurement", World Congress on Computer Science and Information Engineering, 2009.
12. Auprasert, Benjapol, "Towards Structured Software Cognitive Complexity Measurement with Granular Computing Strategies", Proc. 8th IEEE Int. Conf. on Cognitive Informatics (ICCI'09).
13. Kushwaha, Dharmender Singh, and Misra, A.K., "Robustness Analysis of Cognitive Information Complexity Measure Using Weyuker Properties", ACM SIGSOFT Software Engineering Notes, Vol. 31, No. 1, January 2006.
14. Bhatnagar, Anurag, Tak, Nikhar, and Shukla, Shweta, "A Literature Survey on Various Software Complexity Measures", IJASCSE, Vol. 1, Issue 1, 2012.