Imperial College London

Mr Lloyd Kamara

Faculty of Engineering, Department of Computing

Computing Support Manager

Contact

+44 (0)20 7594 8400 · l.kamara

Location

305, Huxley Building, South Kensington Campus

Publications

102 results found

Masood S, Yang G-Z, 2001, Macroscopic structure and physiology of the normal and diseased heart, Departmental Technical Report: YY/No, Publisher: Department of Computing, Imperial College London

This paper outlines the macroscopic anatomy and physiology of the heart, linking the micro and macroscopic structure of cardiac muscle fibre to its function during contraction and dilatation. The properties of cardiac muscle cells and the process of contraction at a cellular level are described. The macroscopic structure of the myocardium is further explained as one muscle band wound into a double twist. This helps to elucidate the muscle architecture and structure of the ventricles. Ventricular dynamics are also described as the twisting and untwisting of this muscle band to produce shortening and lengthening. Myocardial perfusion and causes of disease are discussed. Coronary artery disease and its effect on contractility is then described and ways of measuring contractility are introduced.

Report

Pitt J, Kamara L, Artikis A, 2001, Interaction patterns and observable commitments in a multi-agent trading scenario, 5th international conference on autonomous agents, Montreal, Canada, May 2001, Publisher: ACM Press, Pages: 481-488

Conference paper

Artikis A, Kamara L, Pitt J, 2001, Towards an open agent society model and animation, Proceedings of the agent-based simulation II workshop, Passau, Pages: 48-55

Conference paper

Artikis A, Kamara L, Guerin F, Pitt J et al., 2001, Animation of Open Agent Societies, Symposium on information agents for electronic commerce, York, March 2001, Publisher: The Society for the Study of Artificial Intelligence and the Simulation of Behaviour, Pages: 99-108

Conference paper

Kahen G, Lehman MM, Ramil JF, 2000, System dynamics modelling for the management of long term software evolution processes, Departmental Technical Report: 2000/16, Publisher: Department of Computing, Imperial College London

An approach and basic concepts for the study of the system dynamics of long-term software evolution processes are presented. The approach provides a generic context and framework that supports at least three crucial process areas requiring management decisions: resource allocation, release planning, and process performance monitoring. The report exemplifies the approach with an executable model. The latter reflects the global software process at a high level of abstraction and includes phenomenological observations derived from the laws of software evolution and the behaviours thereby implied. It incorporates concepts such as progressive (e.g., functional enhancement) and anti-regressive (e.g., complexity control) activities and enables the study of policies of human resource allocation to classes of activities. The example shows how the model permits assessment of the impact of alternative policies on various evolutionary attributes. It is part of and exemplifies the methods for software process modelling being developed and applied in the FEAST projects.

Report
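
To make the flavour of such an executable model concrete, here is a minimal Python sketch of a system-dynamics loop in the spirit described above. It is not the FEAST model: the effort split, complexity coefficients, and release count are all illustrative assumptions.

    # Minimal system-dynamics sketch (not the FEAST model): each release,
    # effort is split between progressive work (adds functionality, raises
    # complexity) and anti-regressive work (reduces complexity). Growing
    # complexity feeds back to slow progressive output.

    def simulate(releases=8, effort=100.0, anti_regressive_share=0.3):
        size, complexity = 0.0, 1.0
        for r in range(1, releases + 1):
            progressive = effort * (1 - anti_regressive_share)
            anti_regressive = effort * anti_regressive_share
            size += progressive / complexity          # complexity slows growth
            complexity += 0.02 * progressive          # progressive work adds complexity
            complexity = max(1.0, complexity - 0.05 * anti_regressive)
            print(f"release {r:2d}: size={size:7.1f} complexity={complexity:5.2f}")

    simulate()

Raising anti_regressive_share trades short-term growth for lower complexity, which is the kind of policy comparison the report describes.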

Lehman MM, 2000, Rules and tools for software evolution planning and management, Departmental Technical Report: 2000/14, Publisher: Department of Computing, Imperial College London

When first formulated in the early seventies, the laws of software evolution were, for a number of reasons, not widely accepted as relevant to software engineering practice. Over the years, they have gradually become recognised as providing useful inputs to understanding of the software process and have found their place in a number of software engineering curricula. Now eight in number, they have been supplemented by a Software Uncertainty Principle and a FEAST Hypothesis. Based on all these and on the results of the recent FEAST/1 and current FEAST/2 research projects, this paper develops and presents some fifty rules for application in software system process planning and management and indicates tools available or to be developed to support their application. The listing is structured according to the laws that encapsulate the observed phenomena and that lead to the recommended procedure. Each sub-list is preceded by a textual discussion providing at least some of the justification for the recommended procedure. The text is fully referenced. This directs the interested reader to the literature that records observed behaviours, interpretations, models and metrics obtained from some three of the industrially evolved systems studied, and from which the recommendations were derived.

Report

Lehman MM, 2000, Approach to a theory of software process and software evolution, Departmental Technical Report: 2000/2, Publisher: Department of Computing, Imperial College London

This preliminary introduction, extracted from work in progress, is intended to illustrate an approach currently under development. If successfully and convincingly completed, the result should make a significant contribution, providing software engineering technology with the theoretical foundations and framework needed to support further major process and technology improvement. Expressing the theory in an appropriate formalism will represent a still further advance. The present development is a first, essential, step to achieve this outcome.

Report

Kahen G, Lehman MM, 2000, A brief review of feedback dimensions in the global software process, Departmental Technical Report: 2000/15, Publisher: Department of Computing, Imperial College London

FEAST, an ongoing study of the role of feedback in the software process, was prompted by various factors including the need to identify the mechanisms underpinning observed phenomena in a series of metrics-based studies of evolving software systems conducted during the seventies and eighties. Evidence to date indicates that feedback loop mechanisms play a significant role in determining the performance and dynamics of software processes. To improve understanding of the evolutionary behaviour of software systems and to exploit feedback in the context of process improvement it is necessary to improve knowledge of the origin and sources of feedback phenomena. This is also a prerequisite for a systematic definition of control and policy mechanisms for the management of such processes. This paper refers to some of the many dimensions that appear to relate to the issue of feedback in the global software process. It is argued that empirically assessing and modelling the presence and importance of those different dimensions in industrial software processes can bring significant progress to the FEAST investigation.

Report

Giannakopoulou D, 2000, Modelling and analysis of the bounded-retransmission protocol: experience with discrete time in the LTSA, Departmental Technical Report: 2000/5, Publisher: Department of Computing, Imperial College London

The Bounded Retransmission Protocol is an industrial protocol designed for the transfer of large data files over unreliable communication lines. The protocol relies on specific assumptions on the timed behaviour of its components. This paper describes our experience with modelling and analysing the Bounded Retransmission Protocol using the LTSA. The LTSA uses labelled transition systems to specify behaviour, and compositional reachability analysis to incrementally generate, minimise, and analyse a system, based on its software architecture. The tool was not originally designed to deal with real-time applications. However, by modelling time as a discrete entity, the LTSA does not need to be extended in order to handle such systems. We discuss how the features of the tool can be exploited to model and analyse behaviours that involve time.

Report
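
The discrete-time idea the abstract mentions can be illustrated outside the LTSA itself. Below is a hedged Python sketch, not FSP: time is modelled as an ordinary "tick" action of a labelled transition system, and the states, labels, and two-tick timeout are invented for illustration.

    # Time as just another action ("tick") in a labelled transition system:
    # a sender waits at most two ticks for an ack before timing out.
    from collections import deque

    TRANSITIONS = {
        ("wait", 0): {"ack": ("done", 0), "tick": ("wait", 1)},
        ("wait", 1): {"ack": ("done", 0), "tick": ("timeout", 0)},
    }

    def reachable(start=("wait", 0)):
        seen, frontier = {start}, deque([start])
        while frontier:
            state = frontier.popleft()
            for action, nxt in TRANSITIONS.get(state, {}).items():
                if nxt not in seen:
                    seen.add(nxt)
                    frontier.append(nxt)
        return seen

    print(reachable())   # "timeout" is reachable only after two ticks

Reachability analysis over such a graph is the untimed machinery the LTSA already has; discretising time lets it cover timed behaviour without extension.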

Kahen G, Lehman MM, Ramil JF, 2000, Model-based assessment of software evolution processes, Departmental Technical Report: 2000/4, Publisher: Department of Computing, Imperial College London

This paper argues that quantitative process models must be considered essential to support sustained improvement of E-type software evolution processes and summarises some of the experiences gained in the FEAST projects to date. Modelling guidelines are provided.

Report

Ruger SM, Gauch SE, 2000, Feature reduction for document clustering and classification, Departmental Technical Report: 2000/8, Publisher: Department of Computing, Imperial College London

Often users receive search results which contain a wide range of documents, only some of which are relevant to their information needs. To address this problem, ever more systems not only locate information for users, but also organise that information on their behalf. We look at two main automatic approaches to information organisation: interactive clustering of search results and pre-categorising documents to provide hierarchical browsing structures. To be feasible in real world applications, both of these approaches require accurate yet efficient algorithms. Yet, both suffer from the curse of dimensionality - documents are typically represented by hundreds or thousands of words (features) which must be analysed and processed during clustering or classification. In this paper, we discuss feature reduction techniques and their application to document clustering and classification, showing that feature reduction improves efficiency as well as accuracy. We validate these algorithms using human relevance assignments and categorisation.

Report
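
As a hedged illustration of feature reduction, the sketch below projects a toy term-document count matrix onto its top singular vectors with numpy; the corpus, the crude tokenisation, and the choice of k are illustrative assumptions, and the paper's own techniques may differ.

    # Latent-semantic-style reduction: documents go from |vocab| features
    # down to k features via a truncated SVD of the count matrix.
    import numpy as np

    docs = ["cat sat mat", "cat cat mat", "dog barked loud", "dog dog barked"]
    vocab = sorted({w for d in docs for w in d.split()})
    X = np.array([[d.split().count(w) for w in vocab] for d in docs], float)

    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    k = 2
    reduced = U[:, :k] * s[:k]          # each document now has k features
    print(vocab)
    print(reduced.round(2))             # similar documents land near each other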

van Schroeter T, 2000, Auto-regressive spectral line analysis of piano tones, Departmental Technical Report: 2000/7, Publisher: Department of Computing, Imperial College London

Three auto-regressive spectral estimation methods are experimentally tested with a view to musical applications: the Maximum Entropy method, Marple's MODCOVAR algorithm, and an efficient version of Prony spectral line estimation due to Cybenko. A performance analysis measuring the maximum relative error of their frequency estimates for a signal consisting of three sinusoids under variations of the model order (up to 20), signal length (60 to 200 samples) and noise level shows that unless the model order is close to 2/3 of the number of data points (i.e. when it is nearly ill-conditioned), Marple's algorithm gives by far the best results. In a separate experiment, Marple's algorithm was applied to recorded piano sounds; some preliminary results are shown which demonstrate its potential for fast multicomponent analysis.

Report
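
For readers unfamiliar with auto-regressive spectral estimation, here is a hedged numpy sketch of the simplest relative of the methods compared above, the autocorrelation (Yule-Walker) route: fit AR coefficients from the autocorrelation sequence, then read peaks off the model spectrum. Signal, order, and sample rate are toy values; the paper's MODCOVAR and Prony variants are more refined.

    # Yule-Walker AR fit: solve R a = r for x[t] = sum_k a[k] x[t-1-k] + e[t],
    # then evaluate the AR power spectrum and pick its local maxima.
    import numpy as np

    fs, n = 8000, 200
    t = np.arange(n) / fs
    x = sum(np.sin(2 * np.pi * f * t) for f in (440.0, 880.0, 1320.0))
    x += 0.01 * np.random.randn(n)

    p = 12                                    # AR model order
    r = np.correlate(x, x, "full")[n - 1:] / n
    R = np.array([[r[abs(i - j)] for j in range(p)] for i in range(p)])
    a = np.linalg.solve(R, r[1:p + 1])

    freqs = np.linspace(0, fs / 2, 512)
    z = np.exp(-2j * np.pi * freqs / fs)
    denom = 1 - sum(a[k] * z ** (k + 1) for k in range(p))
    psd = 1.0 / np.abs(denom) ** 2
    peaks = freqs[1:-1][(psd[1:-1] > psd[:-2]) & (psd[1:-1] > psd[2:])]
    print(peaks)                              # expect peaks near 440, 880, 1320 Hz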

Ramil JF, 2000, 'Why COCOMO works' revisited or feedback control as a cost factor, Departmental Technical Report: 2000/3, Publisher: Department of Computing, Imperial College London

The achievement of accurate software cost estimation based only on a few factors is a long-standing goal in software engineering. Work in this area is exemplified by a number of algorithmic approaches that have been proposed over the years. COCOMO is one of the most frequently quoted of such approaches. Evidence emerging from observational and simulation studies suggests that feedback mechanisms play an important role in determining software process behaviour, its dynamics and performance. Thus, the presence of feedback mechanisms, and in particular of feedback control, may have a significant influence on software project cost and interval performance, but none of the current algorithmic cost estimation approaches appears, at least explicitly, to account for such influence. Why, in spite of this, do algorithmic approaches provide satisfactory estimates? Why do they work? This paper discusses some possible answers that, at present, must be taken only as hypotheses. The paper provides suggestions for further investigation of the problem.

Report
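
For context, the basic form of COCOMO that the paper alludes to is a two-constant power law. The sketch below uses the classic published organic-mode constants; the 32 KLOC project is hypothetical, and, as the paper observes, no feedback term appears anywhere in the formula.

    # Basic COCOMO, organic mode: effort = 2.4 * KLOC^1.05 person-months,
    # schedule = 2.5 * effort^0.38 months (classic published constants).
    def basic_cocomo(kloc, a=2.4, b=1.05, c=2.5, d=0.38):
        effort = a * kloc ** b                 # person-months
        schedule = c * effort ** d             # elapsed months
        return effort, schedule

    effort, months = basic_cocomo(32)          # a hypothetical 32 KLOC project
    print(f"{effort:.0f} person-months over {months:.1f} months")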

Maros I, Mitra G, 1998, Cooperating sparse simplex algorithm for a distributed memory multiprocessor, Departmental Technical Report: 98/12, Publisher: Department of Computing, Imperial College London

We undertake a computational analysis of the algorithmic components of the sparse simplex (SSX) method and consider the performance of alternative SSX solution strategies. We then combine these alternative strategies within an asynchronous control structure of a parallel algorithm, implemented on a distributed memory multiprocessor machine. Our experimental results not only show (limited) speedup and robust performance, they also give new insight into the internal operation of the SSX and its adaptation on a parallel platform.

Report
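
The "alternative strategies run in parallel" idea can be loosely illustrated as a portfolio race, sketched below with dummy workers named after standard simplex pricing rules. The real algorithm cooperates asynchronously on distributed memory rather than merely racing, so this is an analogy only; all timings and results are stand-ins.

    # Portfolio race: launch alternative (dummy) strategies concurrently
    # and keep whichever finishes first.
    from concurrent.futures import ThreadPoolExecutor, as_completed
    import time, random

    def strategy(name):
        time.sleep(random.uniform(0.1, 0.5))   # stand-in for simplex iterations
        return name, random.random()           # stand-in for an optimum

    with ThreadPoolExecutor() as pool:
        futures = [pool.submit(strategy, s)
                   for s in ("dantzig", "devex", "steepest-edge")]
        winner = next(as_completed(futures)).result()
        print("first strategy to finish:", winner)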

Lehman MM, 1998, FEAST/1 final report, Departmental Technical Report: 99/1, Publisher: Department of Computing, Imperial College London

The FEAST/1 project was conceived in 1994 following formulation of the hypothesis that the software evolution process is a feedback system and must be treated as such to achieve major process improvement. The overall goals of the project were to: provide objective evidence of the presence and impact of feedback and systems dynamics in the software process; demonstrate that they can be exploited for managing and improving industrial processes; and produce justification for more substantial study of the feedback perspective and its implications. This paper gives the specific objectives that were identified for the project and describes work carried out, difficulties encountered and results achieved.

Report

Lehman MM, Ramil JF, 1998, Implications of laws of software evolution on continuing successful use of COTS software, Departmental Technical Report: 98/8, Publisher: Department of Computing, Imperial College London

However completely specified, integration of COTS software into real world systems makes it of type E even though, were it to be fully and absolutely specified, it would satisfy the definition of an S-type system. Thus, the laws of software evolution that apply to E-type systems are also relevant in the COTS context. This paper examines the wider implications of this fact and, in particular, that such systems must undergo continuing evolution. Managerial implications of the laws of software evolution in the context of COTS are also briefly highlighted.

Report

Lehman MM, Ramil JF, Wernick PD, 1998, The influence of global factors on software system evolution, Departmental Technical Report: 98/11, Publisher: Department of Computing, Imperial College London

The FEAST/1 project is investigating a hypothesis that the software process is a multi-level feedback system. Using a number of complementary approaches to model, analyse and interpret metric and other data records of the evolution of systems from various application areas, the project has made significant advances in detecting the effect of feedback and related dynamics in their evolution. Results obtained to date, supported by theoretical reasoning, support the view that software process improvement must focus on the global process including its feedback mechanisms. In addition to technical aspects of the process, organisation, management, marketing, user support and other factors must also be considered. This paper discusses the conceptual framework within which this question is raised, presents preliminary evidence to support tentative conclusions and indicates how the issue might be further investigated.

Report

Meszaros C, 1997, On free variables in interior point methods, Departmental Technical Report: 97/4, Publisher: Department of Computing, Imperial College London

Interior point methods, especially the algorithms for linear programming problems, are sensitive to the presence of unconstrained (free) variables in the problem. While replacing a free variable by two nonnegative ones may cause numerical instabilities, the implicit handling results in a semidefinite scaling matrix at each interior point iteration. In this paper we investigate the effects of regularizing the scaling matrix. Our analysis will prove that the effect of the regularization can be easily monitored and corrected if necessary. We describe the regularization scheme mainly for the efficient handling of free variables, but a similar analysis can be made for the case when the small scaling factors are raised to larger values to improve the numerical stability of the systems that define the search direction. We will show the superiority of our approach over the variable replacement method on a set of test problems arising from a water management application.

Report
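
The variable-replacement method that the paper argues against is easy to state: a free variable is replaced by the difference of two nonnegative ones, x = u - v with u, v >= 0. The numpy sketch below shows the mechanical transformation on a toy problem; the matrix and costs are invented.

    # Splitting a free variable doubles its column: the original column
    # stands for u, and a negated copy is appended for v.
    import numpy as np

    A = np.array([[1.0, 2.0], [3.0, 4.0]])     # constraint matrix; x2 is free
    c = np.array([1.0, -1.0])
    free = 1                                   # index of the free variable

    A_split = np.hstack([A, -A[:, [free]]])    # columns: x1, u, v
    c_split = np.append(c, -c[free])
    print(A_split)                             # solutions map back via x2 = u - v
    print(c_split)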

Gabbay DM, Olivetti N, 1997, Goal-directed proof theory, Departmental Technical Report: 97/12, Publisher: Department of Computing, Imperial College London

This report is the draft of a book about goal-directed proof-theoretical formulations of non-classical logics. It evolved from a response to the existence of two camps in the applied logic (computer science/artificial intelligence) community. There are those members who believe that the new non-classical logics are the most important ones for applications and that classical logic itself is now no longer the main workhorse of applied logic, and there are those who maintain that classical logic is the only logic worth considering and that within classical logic the Horn clause fragment is the most important one. The book presents a uniform Prolog-like formulation of the landscape of classical and non-classical logics, done in such a way that the distinctions and movements from one logic to another seem simple and natural, and within it classical logic becomes just one among many. This should please the non-classical logic camp. It will also please the classical logic camp, since the goal-directed formulation makes it all look like an algorithmic extension of Logic Programming. The approach also seems to provide very good computational complexity bounds across its landscape.

Report
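
The goal-directed, Prolog-like reading is: to prove a goal, find a rule whose head matches it and prove the body goals in turn. A minimal propositional Horn-clause sketch in Python follows; the rule base is invented, and the book's systems cover far richer logics.

    # Backward-chaining prover for propositional Horn clauses: a goal holds
    # if some rule with that head has a body all of whose goals hold.
    RULES = {
        "grandparent": [["parent", "parent_of_parent"]],
        "parent": [[]],                        # a fact: rule with empty body
        "parent_of_parent": [[]],              # a fact
    }

    def prove(goal):
        for body in RULES.get(goal, []):
            if all(prove(g) for g in body):
                return True
        return False

    print(prove("grandparent"))                # True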

Zisman A, 1997, A methodology to assist with a distributed information discovery process for autonomous databases, Departmental Technical Report: 97/3, Publisher: Department of Computing, Imperial College London

In this technical report we discuss a methodology and a support tool to assist the coordinator of a federation with the construction and evolution of hierarchical information structures. The definition of the terms composing the hierarchical information structures is based on the interests of the users and the applications, and the information that each database system shares with the other components. Therefore, the different group names refer to the types of databases participating in the federation. The other levels are related to the entity names, attribute names, class names, object names, and instances of the databases. The methodology consists of constructing a hierarchical information structure by incremental addition of the participating database systems. The support tool assists with the automation of some steps during construction and evolution of the structures.

Report

Vickers S, 1997, Localic completion of quasimetric spaces, Departmental Technical Report: 97/2, Publisher: Department of Computing, Imperial College London

We give a constructive localic account of the completion of quasimetric spaces. In the context of Lawvere's approach, using enriched categories, the points of the completion are flat left modules over the quasimetric space. The completion is a triquotient surjective image of a space of Cauchy sequences and can also be embedded in a continuous dcpo, the "ball domain". Various examples and constructions are given, including the lower, upper and Vietoris powerlocales, which are completions of finite powerspaces. The exposition uses the language of locales as "topology-free spaces".

Report
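
For orientation, a quasimetric is simply a metric without the symmetry axiom. The LaTeX fragment below states the axioms and the standard asymmetric example on the reals; the localic machinery of the paper is, of course, much more than this.

    % A quasimetric: a metric without symmetry.
    d \colon X \times X \to [0,\infty], \qquad
    d(x,x) = 0, \qquad
    d(x,z) \le d(x,y) + d(y,z),
    % but not necessarily d(x,y) = d(y,x).
    % Standard example on \mathbb{R}: d(x,y) = \max(y - x,\, 0),
    % under which sequences "converge from below".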

Meszaros C, 1997, BPMPD User's Manual, Version 2.20, Departmental Technical Report: 97/8, Publisher: Department of Computing, Imperial College London

The purpose of this document is to describe a software package, called BPMPD, which implements the infeasible primal-dual interior point method for linear and quadratic programming problems. This manual describes how to prepare data to solve with the package, how to use BPMPD as a callable solver library and which algorithmic options can be specified by the user.

Report

Meszaros C, 1997, Steplengths in infeasible primal-dual interior point algorithms of convex quadratic programming, Departmental Technical Report: 97/7, Publisher: Department of Computing, Imperial College London

An approach to determine primal and dual stepsizes in the infeasible-interior-point primal-dual method for convex quadratic problems is presented. The approach reduces the primal and dual infeasibilities in each step and allows different stepsizes. The method is derived by investigating the efficient set of a multiobjective optimization problem. Computational results are also given.

Report
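
The usual mechanics behind separate primal and dual stepsizes is a ratio test per space, damped to stay strictly interior; the numpy sketch below shows that baseline, while the paper's multiobjective derivation refines how the two steps are chosen jointly. Vectors and the damping factor are illustrative.

    # Ratio test: step as far along dx as keeps x > 0, damped by a factor.
    import numpy as np

    def max_step(x, dx, damp=0.9995):
        neg = dx < 0
        if not neg.any():
            return 1.0
        return min(1.0, damp * np.min(-x[neg] / dx[neg]))

    x  = np.array([1.0, 2.0, 0.5])
    dx = np.array([-0.5, 1.0, -1.0])
    print(max_step(x, dx))                     # primal steplength
    # the dual steplength is computed the same way from (s, ds)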

Gabbay D, Nossum R, Thielscher M, 1997, Agents in proactive environments, Departmental Technical Report: 97/5, Publisher: Department of Computing, Imperial College London

Agents situated in proactive environments are acting autonomously while the environment is evolving alongside, whether or not the agents carry out any particular actions. A formal framework for simulating and reasoning about this generalized kind of dynamic system is proposed. The capabilities of the agents are modeled by a set of conditional rules in a temporal-logical format. The environment itself is modeled by an independent transition relation on the state space. The temporal language is given a declarative semantics.

Report
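
A toy rendering of "proactive": the environment's transition relation fires at every step whether or not the agent acts. The Python sketch below invents trivial dynamics purely to show that separation; it is not the paper's formal framework.

    # The environment evolves on its own; the agent may or may not act.
    def environment(state):
        return state + 1                       # independent transition relation

    def agent(state):
        return "act" if state % 3 == 0 else None

    state = 0
    for step in range(6):
        action = agent(state)
        if action:
            print(f"step {step}: agent acts in state {state}")
        state = environment(state)             # fires regardless of the agent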

Wernick P, 1997, Adding learning to software process models, Departmental Technical Report: 97/13, Publisher: Department of Computing, Imperial College London

This paper considers an issue raised by Lehman at ISPW 9, where he stated that the software process is a 'learning' process. We examine the ramifications of that statement, and of the nature and impact of that learning, in the context of the process controlling the evolution of a software product over time and multiple releases. We present a high-level model of the software process which describes that process in terms of the gaining and storing of knowledge of a software product and its use and environment, and of the application of that knowledge in making changes to the product.

Report

Vickers S, 1996, Topical categories of domains, Departmental Technical Report: 97/1, Publisher: Department of Computing, Imperial College London

It is shown how many techniques of categorical domain theory can be expressed in the general context of topical categories (where "topical" means internal in the category Top of Grothendieck toposes with geometric morphisms). The underlying topos machinery is hidden by using a geometric form of constructive mathematics, which enables toposes as "generalized topological spaces" to be treated in a transparently spatial way, and also shows the constructivity of the arguments. The theory of strongly algebraic (SFP) domains is given as a case study in which the topical category is Cartesian closed. Properties of local toposes and of lifting of toposes (sconing) are summarized, and it is shown that the category of toposes has a fixpoint object in the sense of Crole and Pitts. This is used to show that for a local topos, all endomaps have initial algebras, and this provides a general context in which to describe fixpoint constructions including the solution of domain equations involving constructors of mixed variance. Covariance with respect to embedding-projection pairs or adjunctions arises in a natural way. The paper also provides a summary of constructive results concerning Kuratowski finite sets, including a novel strong induction principle; and shows that the topical categories of sets, finite sets and decidable sets are not Cartesian closed (unlike the cases of finite decidable sets and strongly algebraic domains).

Report

Sharp DWN, 1996, Enhancements to a pitch detection system, Departmental Technical Report: 96/9, Publisher: Department of Computing, Imperial College London

This paper describes some enhancements to a pitch detection algorithm presented in an earlier paper by the author. The enhancements reduce flickering between adjacent semitones when the algorithm is used as part of a pitch to MIDI converter. The enhancements also permit less memory space to be used and minimise accidental octave jumping. The enhancements illustrate the use of voting techniques for pattern recognition applications.

Report
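
The voting idea can be sketched directly: take the modal note over a short sliding window of raw per-frame estimates, which suppresses single-frame flicker between adjacent semitones. The window length and the MIDI note stream below are illustrative; the paper's enhancements are more elaborate.

    # Majority vote over a sliding window of semitone (MIDI note) estimates.
    from collections import Counter, deque

    def smooth(raw_semitones, window=5):
        recent, out = deque(maxlen=window), []
        for note in raw_semitones:
            recent.append(note)
            out.append(Counter(recent).most_common(1)[0][0])
        return out

    raw = [60, 60, 61, 60, 60, 60, 61, 61, 61, 61]   # flicker between C4 and C#4
    print(smooth(raw))                               # settles on the majority note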

Sharp DWN, While RL, 1996, A tighter bound on the area occupied by a fractal image, Departmental Technical Report: 96/6, Publisher: Department of Computing, Imperial College London

We derive a bounding circle for a fractal image specified by an iterated function system. The radius of the bounding circle is smaller than those of previously published bounds. The bounding circle is important in fractal design and plotting software as it enables a fractal image to be scaled correctly to fit the screen of a digital computer.

Report
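
A simple bounding-circle argument (not the paper's tighter bound) runs as follows: for contractive similitudes w_i(x) = s_i R_i x + t_i, if each map sends the disc B(c, r) into itself then the attractor lies inside it, and that holds whenever r >= |w_i(c) - c| / (1 - s_i) for every i. A numpy sketch on the Sierpinski-triangle IFS follows; the centre c is an arbitrary guess.

    # Radius that makes B(c, r) invariant under every map of the IFS,
    # hence a bounding circle for the attractor.
    import numpy as np

    maps = [  # (scale, translation) pairs; rotations are the identity here
        (0.5, np.array([0.0, 0.0])),
        (0.5, np.array([0.5, 0.0])),
        (0.5, np.array([0.25, 0.5])),
    ]

    c = np.array([0.35, 0.2])                  # any centre works
    r = max(np.linalg.norm(s * c + t - c) / (1 - s) for s, t in maps)
    print(f"attractor fits in a circle of radius {r:.3f} about {c}")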

Dargam FCC, 1996, A compromised characterization to belief revision, Departmental Technical Report: 96/2, Publisher: Department of Computing, Imperial College London

This paper proposes a method for handling logically conflicting inputs into knowledge bases. Basically, it concerns reconciling conflicting inputs with the underlying theory, via restricting their consequences. The main idea is to update the database with as many consistent consequences of the inputs as possible, in the case that the inputs themselves are not allowed to be kept in it. And in the case that a revision applies, the idea is to keep as many as possible of the consistent consequences of the retracted sentences as a compromise. Resolving conflicting updates in dynamic databases, for instance, is a frequent and critically important problem in real applications. Such problems require the revision of theories and knowledge bases. It is not realistic to aim for a generic approach in those cases, since theory revision is fundamentally dependent on application-specific mechanisms, principles and heuristics. The approach we propose here caters for the specific case where compromised solutions for revising knowledge bases apply, when conflicts involving updates occur. In comparison with approaches that require preference between conflicting inputs, or that avoid them by cancelling them out completely, our approach fits as an alternative which provides more informative results. Examples of inputs include database updates, actions, and beliefs. In more practical terms, consider the situation where K is a database and A an input. Assume that A is inconsistent with K. Current belief revision/update approaches will keep A and maintain consistency by selecting some element from K to form a revised database, usually denoted by K*A. There is much research in this area, both theoretical (e.g. the AGM theory of belief revision) and algorithmic (e.g. reason maintenance systems). Our aim is to offer an alternative approach, restricted to some specific applications, which is flexible enough to keep more data in K in the case of conflicts. We view the above situation …

Report
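
A toy rendering of the compromise: when an input cannot be kept, add as many of its consequences as remain consistent. The Python sketch below reduces consistency to "no literal together with its negation", a drastic simplification of the paper's logical machinery; all sentences are invented.

    # Greedily keep consequences of a rejected input that stay consistent
    # with the base; the iteration order encodes a preference.
    def consistent(sentences):
        return not any(("-" + s if not s.startswith("-") else s[1:]) in sentences
                       for s in sentences)

    def compromise(base, consequences):
        kept = set(base)
        for s in consequences:
            if consistent(kept | {s}):
                kept.add(s)
        return kept

    base = {"p", "q"}
    print(compromise(base, ["r", "-q", "s"]))  # keeps r and s, rejects -q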

Dargam FCC, 1996, On compromising updates in labelled databases, Departmental Technical Report: 96/1, Publisher: Department of Computing, Imperial College London

This paper presents a logical system, CIULDS, as a labelled realization of our approach of Compromising Interfering Updates. Basically, this approach proposes a method for handling logically conflicting inputs into knowledge bases, via restricting their consequences. The main idea is to update the database with as many consistent consequences of the inputs as possible, in the case that the inputs themselves are not allowed to be kept in it. And in the case that a revision applies, the idea is to keep as many as possible of the consistent consequences of the retracted sentences as a compromise. Our approach caters for the specific case where compromised solutions for revising knowledge bases apply, when conflicts involving updates occur. In comparison with approaches that require preference between conflicting inputs, or that avoid them by cancelling them out completely, our approach fits as an alternative which provides more informative results, and is directed to some specific applications. Hence, instead of preventing updates from being performed when they introduce inconsistency into the system, our approach proposes to generate the consequences of the conflicting inputs, and to get rid of the inconsistency via a minimal number of retractions of those consequences. We expect the resulting database to be consistent w.r.t. the integrity constraints, and to retain a safe-maximal subset of the consistent non-supported consequences. This reconciliation of conflicting inputs follows some specified postulates for compromised revision. CIULDS is based on the Labelled Deductive Systems (LDS) framework, which deals with labelled formulae as its basic units of information. By labelling the formulae, we are provided with a way of including in the labels extra information to the system. The main motivation for adopting LDS as the underlying framework of this formalization was to take advantage of its labelling facility to control the derivation process of the compromised c…

Report

