AUT Research Institutes, Centres and Networks
AUT Research Institutes, Centres and Networks bring focus to research activity. The objectives are to:
- Ensure that resources are concentrated in areas where AUT has capability
- Be the key concentration of research activity
- Provide an education, mentoring and training role for postgraduate students
Browsing AUT Research Institutes, Centres and Networks by Issue Date
Now showing 1 - 20 of 632
- Reliance on correlation data for complexity metric use and validation (ACM, 1991-08) MacDonell, SG. This paper reports the results of an experiment to illustrate the hazards of using correlation data as the sole determinant for software metric use and validation. Three widely cited complexity metrics have been examined in relation to the frequency of software development errors.
- Rigor in Software Complexity Measurement Experimentation (Elsevier, 1991-10) MacDonell, Stephen Gerard. The lack of widespread industry acceptance of much of the research into the measurement of software complexity must be due at least in part to the lack of experimental rigor associated with many of the studies. This article examines 13 areas in which previous empirical problems have arisen, citing examples where appropriate, and provides recommendations regarding more adequate procedures.
- Deriving relevant functional measures for automated development projects (Elsevier, 1993-09) MacDonell, S. The increasing use of computer aided software engineering (CASE) tools, fourth-generation languages (4GLs) and other similar development automation techniques has reduced the impact of implementation methods and individual ability on development task difficulty. It has therefore been suggested that measures derived from software specification representations may provide a consistent basis for relatively accurate estimation of subsequent development attributes. To this end, this paper describes the development of a functional complexity analysis scheme that is applicable to system specification products, rather than to the traditional products of the lower-level design and construction phases.
- Assessing the Graphical and Algorithmic Structure of Hierarchical Coloured Petri Net Models (Australian Computer Society Digital Library, 1994) MacDonell, Stephen Gerard; Benwell, GL. Petri nets, as a modelling formalism, are utilised for the analysis of processes, whether for explicit understanding, database design or business process re-engineering. The formalism, however, can be represented on a virtual continuum from highly graphical to largely algorithmic. The use and understanding of the formalism will, in part, therefore depend on the resultant complexity and power of the representation and on the graphical or algorithmic preference of the user. This paper develops a metric which will indicate the graphical or algorithmic tendency of hierarchical coloured Petri nets.
- Comparative Review of Functional Complexity Assessment Methods for Effort Estimation (IEEE, 1994-05) MacDonell, SG. Budgetary constraints are placing increasing pressure on project managers to effectively estimate development effort requirements at the earliest opportunity. With the rising impact of automation on commercial software development, the attention of researchers developing effort estimation models has recently been focused on functional representations of systems, in response to the assertion that development effort is a function of specification content. A number of such models exist; several, however, have received almost no research or industry attention. Project managers wishing to implement a functional assessment and estimation programme are therefore unlikely to be aware of the various methods or how they compare. This paper therefore provides this information, as well as forming a basis for the development and improvement of new methods.
- The synthesis of five bar path generating mechanisms using genetic algorithms (IEE/IEEE, 1995) Connor, AM; Douglas, SS; Gilmartin, MJ. This paper presents a methodology for the synthesis of multi-degree of freedom mechanisms using genetic algorithms. A five-bar mechanism is a 2-DOF system which requires two inputs to fully describe the output motion. In a hybrid mechanism, one of these inputs is supplied by a constant velocity (CV) motor and one is supplied by a programmable servo motor. Such configurations can offer considerable savings in power consumption when the armature inertia of the servo motor is low compared to the load inertia. In the presented synthesis of such mechanisms the two inputs required are provided by the CV input and the desired position of the end effector. The genetic algorithm is used to search for the optimum link lengths and ground point positions to minimise a multi-criteria objective function. The criteria which contribute to the objective function value are the error between the actual path of the end effector and the desired path, the mobility of the mechanism, and the RMS value of the servo motor displacements.
- The kinematic synthesis of path generating mechanisms using genetic algorithms (WIT Press, 1995) Connor, AM; Douglas, SS; Gilmartin, MJ. This paper presents a methodology for the synthesis of path generating mechanisms using genetic algorithms (GAs). GAs are a novel search and optimisation technique inspired by the principles of natural evolution and survival of the fittest. The problem used to illustrate the use of GAs in this way is the synthesis of a four bar mechanism to provide a desired output path.
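The GA-based synthesis described in the two items above can be illustrated with a minimal sketch. This is not the authors' actual formulation: the chromosome here is just a vector of four link lengths, and the path-error objective is replaced by a stand-in function (squared distance from a hypothetical target vector) so the example stays self-contained. A real implementation would trace the coupler point through a full crank cycle and compare it to the desired path.

```python
import random

TARGET = [2.0, 5.0, 4.5, 3.0]  # hypothetical "ideal" link lengths

def path_error(links):
    # Stand-in for the structural error between actual and desired paths.
    return sum((a - b) ** 2 for a, b in zip(links, TARGET))

def evolve(pop_size=40, generations=100, mutation=0.1):
    rng = random.Random(42)
    pop = [[rng.uniform(0.5, 10.0) for _ in range(4)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=path_error)
        survivors = pop[: pop_size // 2]      # truncation selection
        children = []
        while len(survivors) + len(children) < pop_size:
            p1, p2 = rng.sample(survivors, 2)
            cut = rng.randrange(1, 4)         # one-point crossover
            child = p1[:cut] + p2[cut:]
            if rng.random() < mutation:       # Gaussian mutation of one gene
                i = rng.randrange(4)
                child[i] = max(0.5, child[i] + rng.gauss(0, 0.5))
            children.append(child)
        pop = survivors + children
    return min(pop, key=path_error)

best = evolve()
```

The papers' multi-criteria objective (path error, mobility, RMS servo displacement) would simply be a weighted sum inside `path_error`.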
- Software forensics: old methods for a new science (IEEE Computer Society Press, 1996) MacDonell, SG; Aakjaer, A; Sallis, PJ. Over the past few years there has been a renewed interest in the science of software authorship identification; this area of research has been termed 'software forensics'. This paper examines the range of possible measures that can be used to establish commonality and variance in programmer style, with a view to determining program authorship. It also describes some applications of these techniques, particularly for establishing the originator of programs in cases of security breach, plagiarism and computer fraud.
- Software process engineering for measurement-driven software quality programs: realism and idealism (AUT University, 1996) MacDonell, SG; Gray, AR. This paper brings together a set of commonsense recommendations relating to the delivery of software quality, with some emphasis on the adoption of realistic perspectives for software process/product stakeholders in the area of process improvement. The use of software measurement is regarded as an essential component for a quality development program, in terms of prediction, control, and adaptation as well as the communication necessary for stakeholders' realistic perspectives. Some recipes for failure are briefly considered so as to enable some degree of contrast between what is currently perceived to be good and bad practices. This is followed by an evaluation of the quality-at-all-costs model, including a brief pragmatic investigation of quality in other, more mature, disciplines. Several programs that claim to assist in the pursuit of quality are examined, with some suggestions made as to how they may best be used in practice.
- Effort estimation for the development of spatial information systems (University of Otago, 1996) MacDonell, SG; Benwell, GL. The management and control of software processes has assumed increasing importance in recent times. The ability to obtain accurate and consistent indications of, for example, system quality, developer productivity and schedule projections is an essential component of effective project management. This paper focuses on these 'traditional' software engineering issues in relation to the development of spatial systems. In particular, techniques for development effort estimation are considered and a case study illustrating the application of one specific estimation method (Mark II function point analysis) is presented. Given its original basis in business information systems, the method is adjusted in order to account for (some of) the differentiating characteristics of spatial systems. The method is then retrospectively applied to a recently developed hazards analysis system. The effort estimate obtained is sufficiently close to the actual effort used in development to illustrate the potential of such a technique for project management in the spatial systems domain.
- Alternatives to regression models for estimating software projects (AUT University, 1996) MacDonell, SG; Gray, AR. The use of 'standard' regression analysis to derive predictive equations for software development has recently been complemented by increasing numbers of analyses using less common methods, such as neural networks, fuzzy logic models, and regression trees. This paper considers the implications of using these methods and provides some recommendations as to when they may be appropriate. A comparison of techniques is also made in terms of their modelling capabilities with specific reference to function point analysis.
- Applications of fuzzy logic to software metric models for development effort estimation (IEEE Computer Society Press, 1997) Gray, A; MacDonell, S. Software metrics are measurements of the software development process and product that can be used as variables (both dependent and independent) in models for project management. The most common types of these models are those used for predicting the development effort for a software system based on size, complexity, developer characteristics, and other metrics. Despite the financial benefits from developing accurate and usable models, there are a number of problems that have not been overcome using the traditional techniques of formal and linear regression models. These include the nonlinearities and interactions inherent in complex real-world development processes, the lack of stationarity in such processes, over-commitment to precisely specified values, the small quantities of data often available, and the inability to use whatever knowledge is available where exact numerical values are unknown. The use of alternative techniques, especially fuzzy logic, is investigated and some usage recommendations are made.
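A toy sketch of the fuzzy-logic approach named in the item above: shoulder-shaped membership functions map a system "size" onto two rules ("small size implies low effort", "large size implies high effort"), and a weighted centroid defuzzifies the result. All breakpoints and effort values here are illustrative assumptions, not figures from the paper.

```python
def membership_small(size):
    # Fully "small" at size 0, fading to 0 by size 500.
    return max(0.0, 1.0 - size / 500.0)

def membership_large(size):
    # Begins at size 200, fully "large" from size 800 onward.
    return min(1.0, max(0.0, (size - 200.0) / 600.0))

def estimate_effort(size):
    # Rule outputs (person-hours) are hypothetical singleton values:
    # small -> 100, large -> 1000. Weighted-average defuzzification.
    small, large = membership_small(size), membership_large(size)
    return (small * 100.0 + large * 1000.0) / (small + large)

print(estimate_effort(50))   # small system, dominated by the "low" rule
print(estimate_effort(450))  # overlap region blends both rules
```

The point of the fuzzy formulation, per the abstract, is that such rules can encode expert knowledge even where exact numerical relationships are unknown.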
- Applying soft systems methodology to multimedia systems requirements analysis (AUT University, 1997) Butt, DZ; Fletcher, T; MacDonell, SG; Norris, BE; Wong, WBL. The Soft Systems Methodology (SSM) was used to identify requirements for the development of one or more information systems for a local company. The outcome of using this methodology was the development of three multimedia information systems. This paper discusses the use of the SSM when developing for multimedia environments. Namely, this paper covers the problems with traditional methods of requirements analysis (which the SSM addresses), how the SSM can be used to elicit multimedia information system requirements, and our personal experience of the method. Our personal experience is discussed in terms of the systems we developed using the SSM.
- Early experiences in measuring multimedia systems development effort (Springer-Verlag, 1997) Fletcher, T; MacDonell, SG; Wong, WBL. The development of multimedia information systems must be managed and controlled just as it is for other generic system types. This paper proposes an approach for assessing multimedia component and system characteristics with a view to ultimately using these features to estimate the associated development effort. Given the different nature of multimedia systems, existing metrics do not appear to be entirely useful in this domain; however, some general principles can still be applied in analysis. Some basic assertions concerning the influential characteristics of multimedia systems are made and a small preliminary set of data is evaluated.
- Metrics for database systems: an empirical study (IEEE Computer Society Press, 1997) MacDonell, SG; Shepperd, MJ; Sallis, PJ. An important task for any software project manager is to be able to predict and control project size and development effort. Unfortunately, there is comparatively little work, other than function points, that tackles the problem of building prediction systems for software that is dominated by data considerations, in particular systems developed using 4GLs. We describe an empirical investigation of 70 such systems. Various easily obtainable counts were extracted from data models (e.g. number of entities) and from specifications (e.g. number of screens). Using simple regression analysis, a prediction system of implementation size with accuracy of MMRE=21% was constructed. This approach offers several advantages. First, there tend to be fewer counting problems than with function points since the metrics we used were based upon simple counts. Second, the prediction systems were calibrated to specific local environments rather than being based upon industry weights. We believe this enhanced their accuracy. Our work shows that it is possible to develop simple and useful local prediction systems based upon metrics easily derived from functional specifications and data models, without recourse to overly complex metrics or analysis techniques. We conclude that this type of use of metrics can provide valuable support for the management and control of 4GL and database projects.
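The MMRE figure quoted in the abstract above is the mean magnitude of relative error, a standard accuracy measure for prediction systems; lower is better. A minimal sketch, assuming paired actual and predicted size measurements (the sample numbers are invented for illustration, not taken from the study):

```python
def mmre(actual, predicted):
    # Mean of |actual - predicted| / actual over all observations.
    return sum(abs(a - p) / a for a, p in zip(actual, predicted)) / len(actual)

# Hypothetical sizes: predictions within roughly 10-20% of observed values.
print(mmre([100, 250, 400], [110, 230, 330]))  # → ~0.118, i.e. MMRE ≈ 12%
```

An MMRE of 21%, as reported, means the predictions were on average within about a fifth of the actual implementation sizes.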
- Establishing relationships between specification size and software process effort in CASE environments (Elsevier, 1997-01) MacDonell, SG. Advances in software process technology have rendered some existing methods of size assessment and effort estimation inapplicable. The use of automation in the software process, however, provides an opportunity for the development of more appropriate software size-based effort estimation models. A specification-based size assessment method has therefore been developed and tested in relation to process effort on a preliminary set of systems. The results of the analysis confirm the assertion that, within the automated environment class, specification size indicators (that may be automatically and objectively derived) are strongly related to process effort requirements.
- A comparison of techniques for developing predictive models of software metrics (Elsevier, 1997-06) Gray, A; MacDonell, SG. The use of regression analysis to derive predictive equations for software metrics has recently been complemented by increasing numbers of studies using non-traditional methods, such as neural networks, fuzzy logic models, case-based reasoning systems, and regression trees. There has also been an increasing level of sophistication in the regression-based techniques used, including robust regression methods, factor analysis, and more effective validation procedures. This paper examines the implications of using these methods and provides some recommendations as to when they may be appropriate. A comparison of the various techniques is also made in terms of their modelling capabilities with specific reference to software metrics.
- Optimisation of power transmission systems using a discrete Tabu Search method (Professional Engineering Publishing, 1998) Connor, AM; Tilley, DG. This paper presents a brief description of the Tabu Search method and shows how it can be applied to two different power transmission systems. In the first example a mechanical transmission system is considered: a four bar mechanism is synthesised in order to produce a desired output motion. The second example is a hydrostatic transmission operating under closed loop control in order to maintain a constant operating speed as the loading conditions change.
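A minimal sketch of the discrete Tabu Search named in the item above, not the paper's actual formulation: the search walks a neighbourhood of integer parameter vectors, always moving to the best non-tabu neighbour (even uphill, which is what lets it escape local minima), while a short-term memory forbids recently visited points. The objective here is a hypothetical stand-in; a transmission model would supply the real cost.

```python
from collections import deque

def objective(x):
    # Stand-in cost with minimum at (3, -1); a transmission model goes here.
    return (x[0] - 3) ** 2 + (x[1] + 1) ** 2

def tabu_search(start, iters=50, tenure=5):
    current = best = tuple(start)
    tabu = deque(maxlen=tenure)        # short-term memory of visited points
    for _ in range(iters):
        neighbours = [
            (current[0] + dx, current[1] + dy)
            for dx in (-1, 0, 1) for dy in (-1, 0, 1)
            if (dx, dy) != (0, 0)
        ]
        # Prefer non-tabu moves; fall back to all neighbours if every
        # move is tabu (a simple aspiration-style escape).
        candidates = [n for n in neighbours if n not in tabu] or neighbours
        current = min(candidates, key=objective)
        tabu.append(current)
        if objective(current) < objective(best):
            best = current
    return best

print(tabu_search((10, 10)))  # → (3, -1)
```

Real implementations often store tabu *moves* rather than whole solutions and add aspiration criteria; this sketch keeps only the core mechanics.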
- IDENTIFIED (integrated dictionary-based extraction of non-language-dependent token information for forensic identification, examination, and discrimination): A dictionary-based system for extracting source code metrics for software forensics (IEEE, 1998) Gray, A; Sallis, P; MacDonell, S. The frequency and severity of computer-based attacks such as viruses and worms, logic bombs, trojan horses, computer fraud, and plagiarism of software code have all become of increasing concern to many of those involved with information systems. Part of the difficulty experienced in collecting evidence regarding the attack or theft in such situations has been the definition and collection of appropriate measurements to use in models of authorship. With this purpose in mind a system called IDENTIFIED is being developed to assist with the task of software forensics, which is the use of software code authorship analysis for legal or official purposes. IDENTIFIED uses combinations of wildcards and special characters to define count-based metrics, allows for hierarchical metametric definitions, automates much of the file handling task, extracts metric values from source code, and assists with the analysis and modelling processes. It is hoped that the availability of such tools will encourage more detailed research into this area of ever-increasing importance.
- The optimal synthesis of mechanisms using harmonic information (Taylor & Francis, 1998) Connor, AM; Douglas, SS; Gilmartin, MJ. This paper reviews several uses of harmonic information in the synthesis of mechanisms and shows that such information can be put to even greater use in this field. Results are presented for both single and multi-degree of freedom systems which support this claim. In both cases, the inclusion of harmonic information into the objective function aids the search to locate high-quality solutions.