Browsing SERL - Software Engineering Research Laboratory by Title
Now showing 1 - 20 of 134
- A comparison of modeling techniques for software development effort prediction (Springer-Verlag, 1998) MacDonell, SG; Gray, AR. Software metrics are playing an increasingly important role in software development project management, with the need to effectively control the expensive investment of software development of paramount concern. Research examining the estimation of software development effort has been particularly extensive. Within that research, regression analysis has been used almost exclusively to derive equations for predicting software process effort. This approach, whilst useful in some cases, also suffers from a number of limitations in relation to data set characteristics. In an attempt to overcome some of these problems, some recent studies have adopted less common modeling methods, such as neural networks, fuzzy logic models and case-based reasoning. In this paper some consideration is given to the use of neural networks and fuzzy models in terms of their appropriateness for the task of effort estimation. A comparison of techniques is also made with specific reference to statistical modeling and to function point analysis, a popular formal method for estimating development size and effort.
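The regression-based estimation that this abstract refers to can be sketched in a few lines. The following ordinary-least-squares fit of effort against a single size measure is a minimal illustration only; the project data (sizes in function points, effort in person-hours) are invented for the example and do not come from the paper.

```python
# A minimal sketch of regression-based effort estimation: an ordinary least
# squares fit of effort = a + b * size. Data below are invented, not from
# the paper.

def fit_linear(sizes, efforts):
    """Ordinary least squares fit of effort = a + b * size."""
    n = len(sizes)
    mean_s = sum(sizes) / n
    mean_e = sum(efforts) / n
    b = (sum((s - mean_s) * (e - mean_e) for s, e in zip(sizes, efforts))
         / sum((s - mean_s) ** 2 for s in sizes))
    a = mean_e - b * mean_s
    return a, b

# Hypothetical past projects: size in function points, effort in person-hours.
sizes = [100, 250, 400, 600, 900]
efforts = [400, 950, 1500, 2300, 3400]
a, b = fit_linear(sizes, efforts)
print(f"effort = {a:.1f} + {b:.2f} * size")
print(f"predicted effort for a 500-FP project: {a + b * 500:.0f} person-hours")
```

The data-set limitations the abstract mentions (small samples, outliers, skewed distributions) are exactly what makes a simple fit like this fragile, motivating the neural-network and fuzzy alternatives the paper compares.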
- A comparison of semi-deterministic and stochastic search techniques (Springer, 2000) Connor, AM; Shea, K. This paper presents an investigation of two search techniques, tabu search (TS) and simulated annealing (SA), to assess their relative merits when applied to engineering design optimisation. Design optimisation problems are generally characterised as having multi-modal search spaces and discontinuities, making global optimisation techniques beneficial. Both techniques claim to be capable of locating globally optimum solutions on a range of problems, but this capability is derived from different underlying philosophies. While tabu search uses a semi-deterministic approach to escape local optima, simulated annealing uses a completely stochastic approach. The performance of each technique is investigated using a structural optimisation problem. These performances are then compared to each other as well as to a steepest descent (SD) method.
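The stochastic escape mechanism that this abstract contrasts with tabu search can be sketched compactly. The objective function, move size, and cooling schedule below are illustrative only and are not taken from the paper.

```python
import math
import random

# A minimal sketch of simulated annealing: worse moves are accepted with a
# probability that shrinks as the "temperature" cools, allowing stochastic
# escape from local optima. All parameters here are illustrative.

def objective(x):
    # Multi-modal: a local minimum near x = 3.8 and the global one near x = -1.3.
    return x * x + 10 * math.sin(x)

def anneal(start, temp=10.0, cooling=0.999, steps=5000, seed=0):
    rng = random.Random(seed)
    x, best = start, start
    for _ in range(steps):
        cand = x + rng.uniform(-1.0, 1.0)
        delta = objective(cand) - objective(x)
        # Accept improvements outright; accept worse moves with probability
        # exp(-delta / temp), which shrinks as the temperature cools.
        if delta < 0 or rng.random() < math.exp(-delta / temp):
            x = cand
        if objective(x) < objective(best):
            best = x
        temp *= cooling
    return best

# A few independent restarts make the stochastic search more reliable.
best = min((anneal(5.0, seed=s) for s in range(5)), key=objective)
print(f"best x = {best:.3f}, objective = {objective(best):.3f}")
```

Unlike the semi-deterministic tabu moves sketched under the fluid power circuits entry below, every escape here is a coin flip governed by the temperature, which is the philosophical difference the paper examines.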
- A comparison of techniques for developing predictive models of software metrics (Elsevier, 1997-06) Gray, A; MacDonell, SG. The use of regression analysis to derive predictive equations for software metrics has recently been complemented by increasing numbers of studies using non-traditional methods, such as neural networks, fuzzy logic models, case-based reasoning systems, and regression trees. There has also been an increasing level of sophistication in the regression-based techniques used, including robust regression methods, factor analysis, and more effective validation procedures. This paper examines the implications of using these methods and provides some recommendations as to when they may be appropriate. A comparison of the various techniques is also made in terms of their modelling capabilities with specific reference to software metrics.
- A contextual information retrieval framework (National Advisory Committee on Computing Qualifications (NACCQ), 2005) Limbu, D; Connor, AM; MacDonell, SG. The amount of information on the Internet is constantly growing and the challenge now is one of finding relevant information. Contextual information retrieval (CIR) is a critical technology for today's search engines to facilitate queries and return relevant information. Despite its importance, little progress has been made in CIR due to the difficulty of capturing and representing contextual information about users. Numerous CIR approaches exist today, but, to the best of our knowledge, none of them offer a similar service to the one proposed in this paper. This paper proposes an alternative framework for CIR from the World Wide Web (WWW). The framework aims to improve query results (or make search results more relevant) by constructing a contextual profile based on a user's behaviour, their preferences, and a shared knowledge base, and by using this information in the search engine framework to find and return relevant information.
- A framework for contextual information retrieval from the WWW (International Society for Computers and Their Applications (ISCA), 2005) Limbu, DK; Connor, AM; MacDonell, SG. Search engines are the most commonly used type of tool for finding relevant information on the Internet. However, today’s search engines are far from perfect. Typical search queries are short, often one or two words, and can be ambiguous, therefore returning inappropriate results. Contextual information retrieval (CIR) is a critical technique for these search engines to facilitate queries and return relevant information. Despite its importance, little progress has been made in CIR due to the difficulty of capturing and representing contextual information about users. Numerous contextual information retrieval approaches exist today, but to the best of our knowledge none of them offer a similar service to the one proposed in this paper. This paper proposes an alternative framework for contextual information retrieval from the WWW. The framework aims to improve query results (or make search results more relevant) by constructing a contextual profile based on a user’s behaviour, their preferences, and a shared knowledge base, and using this information in the search engine framework to find and return relevant information.
- A fuzzy logic approach to computer software source code authorship analysis (Springer-Verlag, 1998) Kilgour, RI; Gray, AR; Sallis, PJ; MacDonell, SG. Software source code authorship analysis has become an important area in recent years with promising applications in both the legal sector (such as proof of ownership and software forensics) and the education sector (such as plagiarism detection and assessing style). Authorship analysis encompasses the sub-areas of author discrimination, author characterization, and similarity detection (also referred to as plagiarism detection). While a large number of metrics have been proposed for this task, many borrowed or adapted from the area of computational linguistics, there is a difficulty with capturing certain types of information in terms of quantitative measurement. Here it is proposed that existing numerical metrics should be supplemented with fuzzy-logic linguistic variables to capture more subjective elements of authorship, such as the degree to which comments match the actual source code’s behavior. These variables avoid the need for complex and subjective rules, replacing these with an expert’s judgement. Fuzzy-logic models may also help to overcome problems with small data sets for calibrating such models. Using authorship discrimination as a test case, the utility of objective and fuzzy measures, singularly and in combination, is assessed as well as the consistency of the measures between counters.
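The kind of fuzzy linguistic variable this abstract proposes can be illustrated with a toy example: the degree to which comments match the code's behaviour, expressed as overlapping fuzzy sets rather than a single crisp score. The term names and membership shapes below are invented for illustration and are not taken from the paper.

```python
# A toy fuzzy linguistic variable: a crisp 0..1 "comments match code" score
# mapped to overlapping linguistic terms via triangular membership functions.
# Term names and set boundaries are invented, not from the paper.

def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def comment_match(score):
    """Map a crisp 0..1 'comments match code' score to fuzzy terms."""
    return {
        "poor": tri(score, -0.01, 0.0, 0.5),
        "fair": tri(score, 0.2, 0.5, 0.8),
        "good": tri(score, 0.5, 1.0, 1.01),
    }

# A score of 0.6 is mostly 'fair' with some membership in 'good'.
print(comment_match(0.6))
```

The point of the representation is that an expert can assign a term such as "fair" directly, without a precise numeric rule, which is how the paper argues fuzzy variables sidestep complex subjective scoring.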
- A perspective-based understanding of project success (John Wiley & Sons, 2012) McLeod, L; Doolin, B; MacDonell, SG. Answering the call for alternative approaches to researching project management, we explore the evaluation of project success from a subjectivist perspective. An in-depth, longitudinal case study of information systems development in a large manufacturing company was used to investigate how various project stakeholders subjectively perceived the project outcome and what evaluation criteria they drew on in doing so. A conceptual framework is developed for understanding and analyzing evaluations of project success, both formal and informal. The framework highlights how different stakeholder perspectives influence the perceived outcome(s) of a project, and how project evaluations may differ between stakeholders and across time.
- A prototype tool to support extended team collaboration in agile project feature management (ISRST, 2009) Licorish, S; Philpott, A; MacDonell, SG. In light of unacceptable rates of software project failure, agile development methodologies have achieved widespread industry prominence, aimed at reducing software project risks and improving the likelihood of project success. However, the highly collaborative processes embedded in agile methodologies may themselves introduce other risks. In particular, the fluid and diverse nature of agile team structures may mean that collaboration regarding what is to be delivered becomes more challenging. We have therefore developed a prototype tool intended to enable all stakeholders to have greater access to the features of the emerging system irrespective of their location, via remote feature management functionality. Software engineering experts have evaluated the initial prototype, verifying that it would enhance collaboration and is likely to assist teams in their handling of feature management.
- A simulation framework to support software project (re)planning (IEEE, 2009-08-27) Kirk, D; MacDonell, S. Planning and replanning software projects involves selecting activities according to organisational policies, project goals and contexts, deciding how to effect the activities, and dealing with uncertainty in activity outputs. There is at the present time no general model to support project managers with all of these tasks. The contributions of this paper are to propose a set of properties that are desirable in a model for (re)planning and to create a framework based on these properties. The purpose of the framework is to support the modelling and simulation of (re)planning during software projects. Key aspects of the framework are a focus on project objectives as drivers of activity selection, and activity prediction that supports uncertainty and that may be based on previous activity data, expert opinion or experimental evidence. We present a 'proof-of-concept' case study to illustrate how the framework can be applied to support planning.
- A systematic mapping on the use of visual data mining to support the conduct of systematic literature reviews (Academy Publisher, 2012) Felizardo, KR; MacDonell, SG; Mendes, E; Maldonado, JC. A systematic literature review (SLR) is a methodology used to find and aggregate all relevant existing evidence about a specific research question of interest. Important decisions need to be made at several points in the review process, relating to search of the literature, selection of relevant primary studies and use of methods of synthesis. Visualization can support tasks that involve large collections of data, such as the studies collected, evaluated and summarized in an SLR. The objective of this paper is to present the results of a systematic mapping study (SM) conducted to collect and evaluate evidence on the use of a specific visualization technique, visual data mining (VDM), to support the SLR process. We reviewed 20 papers and our results indicate a scarcity of research on the use of VDM to help with conducting SLRs in the software engineering domain. However, most of the studies (16 of the 20 studies included in our mapping) have been conducted in the field of medicine and they revealed that the activities of data extraction and data synthesis, related to conducting the review phase of an SLR process, have more VDM support than other activities. In contrast, according to our SM, previous studies using VDM techniques with SLRs have not employed such techniques during the SLR’s planning and reporting phases.
- A systematic mapping study on dynamic metrics and software quality (IEEE Computer Society, 2012) Tahir, A; MacDonell, SG. Several important aspects of software product quality can be evaluated using dynamic metrics that effectively capture and reflect the software's true runtime behavior. While the extent of research in this field is still relatively limited, particularly when compared to research on static metrics, the field is growing, given the inherent advantages of dynamic metrics. The aim of this work is to systematically investigate the body of research on dynamic software metrics to identify issues associated with their selection, design and implementation. Mapping studies are being increasingly used in software engineering to characterize an emerging body of research and to identify gaps in the field under investigation. In this study we identified and evaluated 60 works based on a set of defined selection criteria. These studies were further classified and analyzed to identify their relevance to future dynamic metrics research. The classification was based on three different facets: research focus, research type and contribution type. We found a strong body of research related to dynamic coupling and cohesion metrics, with most works also addressing the abstract notion of software complexity. Specific opportunities for future work relate to a much broader range of quality dimensions.
- A systems approach to software process improvement in small organisations (Delta/Publizon, 2009) Kirk, D; MacDonell, S. There is, at the present time, no model to effectively support context-aware process change in small software organisations. The assessment reference models, for example, SPICE and CMMI, provide a tool for identifying gaps with best practice, but do not take into account group culture and environment, and do not help with prioritisation. These approaches thus do not support the many small software organisations that need to make effective changes that are linked to business objectives in short time periods. In this paper, we propose a model on an analogy of ‘software system as human’ and suggest that we can apply the idea of human health to help identify business objectives and improvement steps appropriate for these objectives. We describe a ‘proof-of-concept’ case study in which the model is retrospectively applied to a process improvement effort with a local software group.
- A tabu search environment for engineering design optimisation (AUT University, 2004) Connor, AM; MacDonell, SG
- A tabu search method for the optimisation of fluid power circuits (SAGE Publications Ltd., 1998) Connor, AM; Tilley, DG. This paper describes the development of an efficient algorithm for the optimization of fluid power circuits. The algorithm is based around the concepts of tabu search, where different time-scale memory cycles are used as a metaheuristic to guide a hill climbing search method out of local optima and locate the globally optimum solution. Results are presented which illustrate the effectiveness of the method on mathematical test functions. In addition to these test functions, some results are presented for real problems in hydraulic circuit design by linking the method to the Bathfp dynamic simulation software. In one such example the solutions obtained are compared to those found using simple steady state calculations.
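The memory-guided hill climbing this abstract describes can be sketched minimally: a short-term memory forbids recently visited solutions, forcing the climber to take the best remaining move even when it is uphill. The objective function and integer search space below are invented for illustration and are not taken from the paper.

```python
from collections import deque

# A minimal tabu search sketch: a hill climber whose short-term memory (the
# tabu list) forbids recently visited solutions, driving the search out of
# local optima. The problem and parameters here are illustrative only.

def objective(x):
    # Two basins: a local minimum near x = 20 and the global minimum at x = 7.
    return (x - 7) ** 2 * ((x - 20) ** 2 + 5) / 100.0

def tabu_search(start, iters=60, tenure=5, lo=0, hi=30):
    tabu = deque(maxlen=tenure)  # short-term memory cycle
    x, best = start, start
    for _ in range(iters):
        moves = [n for n in (x - 1, x + 1) if lo <= n <= hi and n not in tabu]
        if not moves:
            break
        # Take the best non-tabu neighbour even when it is uphill: this is
        # the semi-deterministic escape from local optima.
        x = min(moves, key=objective)
        tabu.append(x)
        if objective(x) < objective(best):
            best = x
    return best

print(tabu_search(25))  # climbs out of the basin at x=20 and reaches x=7
```

A production implementation would add the longer time-scale memory cycles the paper mentions (intensification and diversification), but the short-term cycle above is the core escape mechanism.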
- Adopting softer approaches in the study of repository data: a comparative analysis (ACM, 2013) Licorish, SA; MacDonell, SG. Context: Given the acknowledged need to understand the people processes enacted during software development, software repositories and mailing lists have become a focus for many studies. However, researchers have tended to use mostly mathematical and frequency-based techniques to examine the software artifacts contained within them. Objective: There is growing recognition that these approaches uncover only a partial picture of what happens during software projects, and deeper contextual approaches may provide further understanding of the intricate nature of software teams' dynamics. We demonstrate the relevance and utility of such approaches in this study. Method: We use psycholinguistics and directed content analysis (CA) to study the way project tasks drive teams' attitudes and knowledge sharing. We compare the outcomes of these two approaches and offer methodological advice for researchers using similar forms of repository data. Results: Our analysis reveals significant differences in the way teams work given their portfolio of tasks and the distribution of roles. Conclusion: We overcome the limitations associated with employing purely quantitative approaches, while avoiding the time-intensive and potentially invasive nature of field work required in full case studies.
- Alternatives to regression models for estimating software projects (AUT University, 1996) MacDonell, SG; Gray, AR. The use of ‘standard’ regression analysis to derive predictive equations for software development has recently been complemented by increasing numbers of analyses using less common methods, such as neural networks, fuzzy logic models, and regression trees. This paper considers the implications of using these methods and provides some recommendations as to when they may be appropriate. A comparison of techniques is also made in terms of their modelling capabilities with specific reference to function point analysis.
- An automatic architecture reconstruction and refactoring framework (Springer (Studies in Computational Intelligence v.377), 2012) Schmidt, F; MacDonell, SG; Connor, AM. A variety of sources have noted that a substantial proportion of non-trivial software systems fail due to unhindered architectural erosion. This design deterioration leads to low maintainability, poor testability and reduced development speed. The erosion of software systems is often caused by inadequate understanding, documentation and maintenance of the desired implementation architecture. If the desired architecture is lost or the deterioration is advanced, the reconstruction of the desired architecture and the realignment of this desired architecture with the physical architecture both require substantial manual analysis and implementation effort. This paper describes the initial development of a framework for automatic software architecture reconstruction and source code migration. This framework offers the potential to reconstruct the conceptual architecture of software systems and to automatically migrate the physical architecture of a software system toward a conceptual architecture model. The approach is implemented within a proof-of-concept prototype which is able to analyze Java systems and reconstruct a conceptual architecture for them, as well as to refactor each system towards that conceptual architecture.
- An empirical cognitive model of the development of shared understanding of requirements (Springer, 2014-06-01) Buchan. It is well documented that customers and software development teams need to share and refine understanding of the requirements throughout the software development lifecycle. The development of this shared understanding is complex and error-prone, however. Techniques and tools to support the development of a shared understanding of requirements (SUR) should be based on a clear conceptualization of the phenomenon, with a basis in relevant theory and analysis of observed practice. This study contributes to this with a detailed conceptualization of SUR development as a sequence of group-level state transitions based on specializing the Team Mental Model construct. Furthermore, it proposes a novel group-level cognitive model as the main result of an analysis of data collected from the observation of an Agile software development team over a period of several months. The initial high-level application of the model shows it has promise for providing new insights into supporting SUR development.
- An empirical investigation into IS development practice in New Zealand (Association for Information Systems - AIS Electronic Library (AISeL), 2004) McLeod, L; Macdonell, S; Doolin, B. A Web-based survey of 106 large New Zealand organisations was undertaken to gain an understanding of their IS development practices. The survey focussed on the contribution of standard methods and user participation to IS development. Among the findings were that 91% of the respondents used a standard method in the development process in at least some of the projects undertaken in the last three years. All organisations reported using some level of user participation. The majority of organisations agreed that organisational issues had been more important than technical issues in determining the outcome of the IS development in these projects.
- An integrated tool set to support software engineering learning (Software Engineering Research Group (SERG), the University of Auckland, 2007) Philpott, A; Buchan, J; Connor, AM. This paper considers the possible benefits of an integrated Software Engineering tool set specifically tailored for novice developers, and reflects on the experience of having software engineering students produce various components of this tool set. Experiences with a single semester pilot are discussed and future directions for refining the model are presented.