• Assessing the credibility of online social network messages.

      Makinde, Oghenefejiro Winnie; University of Derby (University of Derby, 2018-01)
      Information gathered socially online is a key feature of the growth and development of modern society. Presently the Internet is a platform for the distribution of data, and millions of people use Online Social Networks daily as a tool to keep up to date with social, political, educational or other occurrences. In many cases information derived from an Online Social Network is acted upon, and often shared with other networks, without further assessment or judgement. Many people do not check whether the information shared is credible. A user may trust the information generated by a close friend without questioning its credibility, in contrast to a message generated by an unknown user. This work considers the concept of credibility in the wider sense, by asking whether a user can trust the service provider or even the information itself. Two key components of credibility are explored: trustworthiness and expertise. Credibility has been researched in the past using Twitter as a validation tool; that research focused on automatic methods of assessing the credibility of sets of tweets, analysing microblog postings related to trending topics. This research develops a framework that can assist the assessment of the credibility of messages in Online Social Networks. Four types of credibility are explored (experienced, surface, reputed and presumed credibility), resulting in a credibility hierarchy. To determine the credibility of messages generated and distributed in Online Social Networks, a virtual network is created in which nodes with individual views generate messages at random; data from the network are recorded and analysed based on the behaviour exhibited by agents (an agent-based modelling approach). The factors considered for the experiment design included: peer-to-peer networking, collaboration, opinion formation and network rewiring.
The behaviour of agents, the frequency with which messages are shared and used, the pathway of the messages and how this affects the credibility of messages are also considered. A framework is designed and the resulting data are tested against the design. The resulting data validated the framework in part, supporting an approach whereby tagging the message status assists the understanding and application of the credibility hierarchy. Validation was carried out with Twitter data acquired through Twitter's Application Programming Interface (API). There were similarities in the generation and frequency of the message distributions in the network; these findings were also recorded and analysed using the proposed framework. Some limitations were encountered while acquiring data from Twitter; however, there was sufficient evidence of correlation between the simulated and real social network datasets to indicate the validity of the framework.
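The agent-based setup the abstract describes can be sketched as a small simulation. The credibility levels come from the thesis's hierarchy, but the network size, forwarding rule and per-level weights below are purely illustrative assumptions, not the thesis's actual model:

```python
import random

# Credibility hierarchy from the thesis, strongest to weakest;
# the numeric weights are illustrative assumptions.
HIERARCHY = ["experienced", "reputed", "surface", "presumed"]
WEIGHT = {"experienced": 0.9, "reputed": 0.7, "surface": 0.5, "presumed": 0.3}

def simulate(n_agents=50, n_steps=200, p_link=0.1, seed=42):
    """Agents on a random peer-to-peer topology generate tagged messages;
    a peer forwards a message with probability equal to the credibility
    weight of its tag. Returns the share count per credibility level."""
    rng = random.Random(seed)
    links = {a: [b for b in range(n_agents)
                 if b != a and rng.random() < p_link]
             for a in range(n_agents)}
    shares = {tag: 0 for tag in HIERARCHY}
    for _ in range(n_steps):
        sender = rng.randrange(n_agents)
        tag = rng.choice(HIERARCHY)  # message generated at random
        for peer in links[sender]:
            if rng.random() < WEIGHT[tag]:
                shares[tag] += 1
    return shares

counts = simulate()
```

Under this toy rule, messages tagged higher in the hierarchy tend to propagate further, which is the kind of behaviour the framework's message-status tagging is meant to expose.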
    • Cloud BI: A Multi-party Authentication Framework for Securing Business Intelligence on the Cloud

      Al-Aqrabi, Hussain; University of Derby (2016)
      Business intelligence (BI) has emerged as a key technology to be hosted on Cloud computing. BI offers a method to analyse data, thereby enabling informed decision-making to improve business performance and profitability. However, within the shared domains of Cloud computing, BI is exposed to increased security and privacy threats because an unauthorised user may be able to gain access to highly sensitive, consolidated business information. The business process contains collaborating services and users from multiple Cloud systems in different security realms which need to be engaged dynamically at runtime. If the heterogeneous Cloud systems located in different security realms do not have direct authentication relationships, then it is technically difficult to enable a secure collaboration. In order to address these security challenges, a new authentication framework is required to establish trust relationships among these BI service instances and users by distributing a common session secret to all participants of a session. The author addresses this challenge by designing and implementing a multi-party authentication framework for dynamic secure interactions when members of different security realms want to access services. The framework takes advantage of the trust relationship between session members in different security realms to enable a user to obtain security credentials to access Cloud resources in a remote realm. This mechanism helps Cloud session users authenticate their session membership, improving the authentication processes within multi-party sessions. The correctness of the proposed framework has been verified using BAN logic. The performance and overhead have been evaluated via simulation in a dynamic environment. A prototype authentication system has been designed, implemented and tested based on the proposed framework.
The research concludes that the proposed framework and its supporting protocols are an effective functional basis for practical implementation testing, as the framework achieves good scalability and imposes only a minimal performance overhead, comparable with other state-of-the-art methods.
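The core idea of distributing a common session secret and letting members prove session membership can be sketched in a few lines. This is a minimal stand-in, not the thesis's protocol: the coordinator role, identifier format and HMAC-based membership proof are all assumptions for illustration:

```python
import hmac
import hashlib
import secrets

def create_session(members):
    """A session coordinator generates one common session secret and issues
    it to every participant (standing in for the framework's secure
    distribution step, which is abstracted away here)."""
    session_secret = secrets.token_bytes(32)
    return session_secret, {m: session_secret for m in members}

def prove_membership(member_id, session_secret):
    # A member authenticates its session membership with an HMAC tag
    # keyed by the shared session secret.
    return hmac.new(session_secret, member_id.encode(), hashlib.sha256).hexdigest()

def verify(member_id, tag, session_secret):
    expected = prove_membership(member_id, session_secret)
    return hmac.compare_digest(expected, tag)

secret, issued = create_session(["realmA/alice", "realmB/bi-service"])
tag = prove_membership("realmA/alice", issued["realmA/alice"])
assert verify("realmA/alice", tag, secret)        # member of the session
assert not verify("realmB/mallory", tag, secret)  # outsider's claim fails
```

Because every legitimate participant holds the same session secret, any member can check any other member's tag without a direct authentication relationship between their realms, which is the property the framework exploits.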
    • Computational fluid dynamics model of a quad-rotor helicopter for dynamic analysis

      Poyi, Gwangtim Timothy; Wu, Mian Hong; Bousbaine, Amar; University of Derby (Pioneer Research and Development Group, 2016-06-30)
      The control and performance of a quad-rotor helicopter UAV is greatly influenced by its aerodynamics, which in turn is affected by interactions with features in its remote environment. This paper presents details of a Computational Fluid Dynamics (CFD) simulation and analysis of a quad-rotor helicopter. It starts by presenting how SolidWorks software is used to develop a 3-D Computer Aided Design (CAD) model of the quad-rotor helicopter, then describes how CFD is used as a computer-based mathematical modelling tool to simulate and analyse the effects of wind flow patterns on the performance and control of the quad-rotor helicopter. Developing a robust adaptive controller for the quad-rotor helicopter to withstand environmental constraints is not within the scope of this paper; instead, this work accurately models the quad-rotor's static and dynamic characteristics from a limited number of time-accurate CFD simulations.
    • Computer aided design of 3D of renewable energy platform for Togo's smart grid power system infrastructure

      Komlanvi, Moglo; University of Derby (2018-09-04)
      The global requirement for sustainable energy provision will become increasingly important over the next fifty years as the environmental effects of fossil fuel use become apparent. Therefore, the issues surrounding the integration of renewable energy supplies need to be considered carefully. The focus of this work was the development of an innovative computer-aided design of a 3-dimensional renewable energy platform for Togo's smart grid power system infrastructure, and a demonstration of its validity for industrial, commercial and domestic applications. The wind, hydro and PV systems forming the 3-dimensional renewable energy power generation platform introduce a new path for hybrid systems, extending system capabilities to include a stable and constant clean energy supply, reduced harmonic distortion and improved power system efficiency. Issues requiring consideration in high-percentage renewable energy systems include the reliability of the supply when intermittent sources of electricity are being used, and the subsequent necessity for storage and back-up generation. The adoption of genetic algorithms was well suited to minimising the THD, while the CHB-MLI was ideal for connecting renewable energy sources to an AC grid. Cascaded inverters have also been proposed for use as the main traction drive in electric vehicles, where several batteries or ultra-capacitors are well suited to serve as separate DC sources. Simulations under various non-linear load conditions showed that an integral-control-based compensating cascaded passive filter balanced the system even under non-linear loads. The measured total harmonic distortion of the source currents was found to be 2.36%, in compliance with the IEEE 519-1992 and IEC 61000-3 standards for harmonics. This work has succeeded in developing a more complete tool for analysing the feasibility of integrated renewable energy systems.
This will allow informed decisions to be made about the technical feasibility of the supply mix and control strategies, plant type and sizing, and storage sizing, for any given area and range of supply options. The developed 3D renewable energy platform was examined and evaluated using CAD software analysis and a laboratory-based mini test. The initial results showed improvements compared with other hybrid systems and their existing control systems: there was a notable improvement in dynamic load demand and response, and in the stability of the system, with reduced harmonic distortion. This research therefore proposes an innovative solution and a path for Togo in its intention of switching to renewable energy, especially for its smart grid power system infrastructure, and demonstrates its validity for industrial, commercial and domestic applications.
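The THD figure the abstract reports is a standard ratio of harmonic to fundamental content. As a sketch, the per-unit harmonic spectrum below is an invented example (not the thesis's measured data), chosen so the result lands near the reported 2.36%:

```python
import math

def thd_percent(harmonic_amplitudes):
    """Total harmonic distortion of a current waveform: the RMS of the
    harmonic amplitudes divided by the fundamental amplitude, as a
    percentage. Input is [fundamental, h2, h3, ...]."""
    fundamental, *harmonics = harmonic_amplitudes
    return 100.0 * math.sqrt(sum(h * h for h in harmonics)) / fundamental

# Illustrative per-unit spectrum: fundamental plus small residual harmonics
# left after the compensating filter (values are assumptions).
spectrum = [1.0, 0.02, 0.01, 0.008]
thd = thd_percent(spectrum)
ieee519_ok = thd < 5.0  # 5% is the common IEEE 519 current-distortion limit
```

With this spectrum the THD works out to roughly 2.4%, comfortably inside the 5% limit, mirroring the compliance claim made for the measured 2.36%.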
    • A critical analysis of the continued use of Georgian buildings: a case study of Darley Abbey Mills, Derbyshire.

      Deakin, Emmie Louise; University of Derby (2016)
      This thesis undertakes a critical assessment of the impact of statutory legislation and UNESCO World Heritage designation upon the sustainability and continued use of historic industrial buildings, utilising the late 18th-century Georgian industrial buildings of Darley Abbey Mills, Derby, as a case study. The thesis provides an in-depth and longitudinal analysis of the morphology and evolution of Darley Abbey Mills between 2006 and 2015. During this time, the assessment of whether the mills would find a sustainable and continued contemporary use shifted from a concern that the site was slowly disintegrating, with the danger of an important historical artefact being lost forever or becoming irrevocably damaged through lack of maintenance and repair, to a position where the future of the mills looks promising. What makes Darley Abbey Mills so unusual, even unique, is that it possesses the highest possible level of statutory protection but is also under private ownership. The initial finding from an analysis of policy documents and planning applications between 2006 and 2010 was that there was limited engagement with external heritage and conservation stakeholders or the Local Authority; the 'umbrella of statutory protection' was neither providing barriers nor protecting the site, and there was simply a lack of action by all parties. This changed during the period 2010-13, when the site came under new unified ownership. The new owners started to make small adaptations and repairs to the site that enabled them to attract new tenants from the creative and artisan communities; however, this work was not authorised, nor was planning permission sought. Although there was still a lack of enforcement of what can be seen as 'aspirational urbanism', a dialogue was started between the owners and the wider stakeholder community.
Between 2013 and 2015, the relationship between all of the stakeholders became more formalised and an unofficial partnership was formed between the owners and the monitoring bodies. This resulted in the successful planning application to adapt the West Mills and Long Mill, which went some of the way towards ensuring the sustainable and continued use of Darley Abbey Mills.
    • Electro-thermal modelling of electrical power drive systems.

      Trigkidis, Georgios.; University of Derby (2008)
    • Evaluation and improvement on service quality of Chinese university libraries under new information environments.

      Fan, Yue Qian; University of Derby (2018-06)
      The rapid development of information technology in recent years has added a range of new features to the traditional information environment, which has a profound impact on university library services and users. Service quality in library services has reached a broader consensus as a parameter that directly reflects customer satisfaction and loyalty. Exploring evaluation frameworks for service quality in university libraries cannot be neglected in this context. Besides, existing evaluation frameworks for the service quality of university library services face numerous challenges due to their imperfections. Thus, there is an urgency and necessity to explore and enhance the efficiency of these evaluation frameworks. To this end, this thesis conducts a systematic analysis of evaluation frameworks with the aim of identifying, through empirical methods, the core components that need enhancement to achieve effective service quality in Chinese university libraries. Furthermore, the inferences extracted from the analysis have been exploited to provide suitable recommendations for improving the service quality of university libraries.
    • High Performance Video Stream Analytics System for Object Detection and Classification

      Anjum, Ashiq; Yaseen, Muhammad Usman (University of Derby, College of Engineering and Technology, 2019-02-05)
      Due to recent advances in cameras, cell phones and camcorders, particularly the resolution at which they can record an image/video, large amounts of data are generated daily. This video data is often so large that manually inspecting it for object detection and classification can be time-consuming and error-prone, and it therefore requires automated analysis to extract useful information and metadata. Automated analysis of video streams also comes with numerous challenges, such as blurred content and variation in illumination conditions and poses. This thesis investigates an automated video analytics system that takes into account characteristics from both the shallow and deep learning domains. We propose the fusion of features from the spatial frequency domain to perform highly accurate blur- and illumination-invariant object classification using deep learning networks. We also propose the tuning of the hyper-parameters associated with the deep learning network through a mathematical model. The mathematical model used to support hyper-parameter tuning improved the performance of the proposed system during training. The outcomes of various hyper-parameters on the system's performance are compared, and the parameters that contribute to the most optimal performance are selected for video object classification. The proposed video analytics system has been demonstrated to process a large number of video streams, and the underlying infrastructure is able to scale based on the number and size of the video stream(s) being processed. Extensive experimentation on publicly available image and video datasets reveals that the proposed system is significantly more accurate and scalable and can be used as a general-purpose video analytics system.
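Model-guided hyper-parameter tuning of the kind described above can be sketched as follows. The quadratic loss surface and the candidate values are invented stand-ins (the thesis fits its own mathematical model to training runs); the point is that candidates are ranked by the model rather than by training the network once per setting:

```python
import itertools

def modelled_loss(lr, batch):
    """Stand-in analytical model of validation loss as a function of the
    learning rate and batch size. This quadratic surface with an assumed
    optimum at (0.01, 64) is purely illustrative."""
    return (lr - 0.01) ** 2 * 1e4 + (batch - 64) ** 2 * 1e-4

def tune(lrs, batches):
    """Pick the hyper-parameter pair the model predicts performs best,
    avoiding a full training run for every candidate combination."""
    return min(itertools.product(lrs, batches),
               key=lambda pair: modelled_loss(*pair))

best = tune(lrs=[0.001, 0.01, 0.1], batches=[32, 64, 128])
```

The selected pair is then the one carried forward to train the classifier, which is the role hyper-parameter selection plays in the system above.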
    • High Voltage Optical Fibre Sensor for Use in Wire Relay Electrical Protection Systems

      Bashour, Rami; University of Derby (2016)
      The last few decades have seen widespread use of optical fibre sensors in many applications. Optical fibre sensors have significant benefits over existing conventional sensors, such as high immunity to electromagnetic interference, the ability to transmit signals over long distances at high bandwidth, high resolution, usability in hazardous environments and no need for isolation when working at high voltages. The measurement of high voltages is essential for electrical power systems, as it provides a source of electrical information for Relay Protection Systems (RPS) and load management systems. Electrical power systems need to be protected from faults, which can range from short circuits to voltage dips, surges and transients. The optical high voltage sensor developed is based on the principle that the electrostriction displacement of Lead Zirconate Titanate (PZT) changes when a voltage is applied to it. The displacement causes the fibre Bragg grating (FBG) bonded to the PZT material to undergo a resultant change in wavelength. An optical fibre sensor prototype that measures up to 250 V DC has been developed and evaluated. Simulation using ANSYS software has been used to demonstrate the operational capability of the sensor up to 300 kV AC. This sensor overcomes some of the challenges of conventional sensors, such as electromagnetic interference, signal transmission and resolution. A novel optical fibre high voltage sensor based on the Kerr effect has also been demonstrated. The Kerr effect was determined using Optsim (RSoft) software, and Maxwell software was used to model an optical Kerr cell. Maxwell is an electromagnetic/electric field package used for simulating, analysing and designing 2D and 3D electromagnetic materials and devices; it uses highly accurate finite element techniques to solve time-varying, static and frequency-domain electric and electromagnetic fields. A relay protection system on electrical networks is also discussed in detail.
Keywords: Fibre Bragg Grating, Fibre Optics Sensors, Piezoelectricity, Kerr effect, Relay Protection Systems.
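The PZT-plus-FBG sensing principle can be put into numbers with the standard Bragg-shift relation. All constants below are typical textbook values chosen for illustration, not parameters from the thesis prototype:

```python
# Assumed constants (typical textbook values, not from the thesis):
D33 = 500e-12        # PZT piezoelectric coefficient, m/V
THICKNESS = 1e-3     # PZT element thickness, m
LAMBDA_B = 1550e-9   # FBG Bragg wavelength, m
P_E = 0.22           # effective photo-elastic coefficient of silica fibre

def bragg_shift_nm(voltage):
    """Wavelength shift of an FBG bonded to a PZT element under DC voltage:
    the PZT strain d33*V/t is transferred to the grating, shifting the
    Bragg wavelength by lambda_B * (1 - p_e) * strain."""
    strain = D33 * voltage / THICKNESS
    return LAMBDA_B * (1.0 - P_E) * strain * 1e9  # metres -> nanometres

shift = bragg_shift_nm(250.0)  # the prototype's tested range: up to 250 V DC
```

With these assumed values a 250 V input produces a shift of about 0.15 nm, the order of magnitude an interrogator must resolve, which is why FBG voltage sensing hinges on high-resolution wavelength measurement.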
    • Life cycle costing methodology for sustainable commercial office buildings

      Oduyemi, Olufolahan Ifeoluwa; University of Derby (2015)
      The need for a more authoritative approach to investment decision-making and cost control has been a requirement of office spending for many years. The commercial office sector finds itself in an increasingly demanding position to allocate its budgets as wisely and prudently as possible. The significant percentage of total spending devoted to buildings demands a more accurate and adaptable method of achieving quality of service within budget constraints. By adopting life cycle costing (LCC) techniques together with risk management, practitioners can make accurate forecasts of likely future running costs. This thesis presents a novel framework (artificial neural networks and probabilistic simulations) for modelling historical operating and maintenance costs as well as the economic performance measures of LCC. The methodology consisted of eight steps and presented a novel approach to modelling the LCC of operating and maintenance costs of two sustainable commercial office buildings. Finally, a set of performance measurement indicators was utilised to draw inferences from the results. The contribution of this research was therefore to develop a dynamic LCC framework for sustainable commercial office buildings and, by means of two existing buildings, demonstrate how assumption modelling can be utilised within a probabilistic environment. In this research, the key themes of risk assessment, probabilistic assumption modelling and stochastic assessment of LCC have been addressed. Significant improvements to existing LCC models have been achieved, in an attempt to make the LCC model more accurate and meaningful to estate managers and high-level capital investment decision-makers. A new approach to modelling historical costs and forecasting these costs in sustainable commercial office buildings is presented, based upon a combination of ANN methods and stochastic modelling of the annual forecasted data.
These models provide a far more accurate representation of long-term building costs, as the inherent risk associated with the forecasts is easily quantifiable and the forecasts are based on a sounder approach than that previously used in the commercial sector. A novel framework for modelling the facilities management costs of two sustainable commercial office buildings is also presented. This is not only useful for modelling the LCC of existing commercial office buildings, as presented here, but has wider implications for competing-option modelling in commercial office buildings. The assumption modelling processes presented in this work can easily be modified to represent other types of commercial office building. Discussions with policy makers in the real estate industry revealed concerns over how these building costs can be modelled, given that the available historical data represent wide spending patterns and are not cost-specific to commercial office buildings. Similarly, a pilot and main survey questionnaire aimed to ascertain the current level of LCC application in sustainable construction, rank the drivers of and barriers to sustainable commercial office buildings, and determine the applications and limitations of LCC. The survey results showed that respondents strongly agreed that key performance indicators and economic performance measures need to be incorporated into LCC, and that it is important to consider the initial, operating and maintenance costs of a building when conducting LCC analysis. Respondents disagreed that current LCC techniques are suitable for calculating the whole costs of buildings, but agreed that historical cost data suffer from low accuracy.
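The probabilistic side of such an LCC framework can be sketched as a Monte Carlo run that discounts uncertain annual operating and maintenance costs to present value. Every figure below (initial cost, O&M distribution, discount rate, study period) is an illustrative assumption, not data from the two case-study buildings, and the Gaussian draw stands in for the ANN-driven forecasts:

```python
import random

def lcc_monte_carlo(initial_cost, annual_om_mean, annual_om_sd,
                    years=25, discount_rate=0.035, runs=5000, seed=1):
    """Probabilistic life cycle cost: sample uncertain annual O&M costs,
    discount each year to present value, and summarise the distribution
    of total cost over many simulated futures."""
    rng = random.Random(seed)
    totals = []
    for _ in range(runs):
        npv = initial_cost
        for year in range(1, years + 1):
            om = rng.gauss(annual_om_mean, annual_om_sd)  # stochastic O&M
            npv += om / (1 + discount_rate) ** year
        totals.append(npv)
    totals.sort()
    return {"mean": sum(totals) / runs,
            "p5": totals[int(0.05 * runs)],    # optimistic scenario
            "p95": totals[int(0.95 * runs)]}   # pessimistic scenario

result = lcc_monte_carlo(2_000_000, 120_000, 15_000)
```

Reporting a percentile band rather than a single number is what makes the forecast risk "easily quantifiable" for an estate manager: the p5-p95 spread is the uncertainty the deterministic LCC models hide.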
    • A novel service discovery model for decentralised online social networks.

      Yuan, Bo; University of Derby (2018-03)
      Online social networks (OSNs) have become the most popular Internet application, attracting billions of users to share information, disseminate opinions and interact with others in the online society. The unprecedented growing popularity of OSNs naturally makes social network services a pervasive phenomenon in our daily life. The majority of OSN service providers adopt a centralised architecture because of its management simplicity and content controllability. However, the centralised architecture for large-scale OSN applications incurs costly deployment of computing infrastructure and suffers from performance bottlenecks. Moreover, the centralised architecture has two major shortcomings: the single point of failure problem and the lack of privacy, which challenge uninterrupted service provision and raise serious privacy concerns. This thesis proposes a decentralised approach based on peer-to-peer (P2P) networks as an alternative to the traditional centralised architecture. Firstly, a self-organised architecture with self-sustaining social network adaptation has been designed to support decentralised topology maintenance. This self-organised architecture exhibits small-world characteristics, with a short average path length and a large average clustering coefficient, to support efficient information exchange. Based on this self-organised architecture, a novel decentralised service discovery model has been developed to achieve semantic-aware and interest-aware query routing in the P2P social network. The proposed model encompasses a service matchmaking module to capture hidden semantic information for query-service matching and a homophily-based query processing module to characterise users' common social status and interests for personalised query routing. Furthermore, in order to optimise the efficiency of service discovery, a swarm intelligence inspired algorithm has been designed to reduce the query routing overhead.
This algorithm employs an adaptive forwarding strategy that can adapt to various social network structures and achieves promising search performance with low redundant query overhead in dynamic environments. Finally, a configurable software simulator is implemented to simulate complex networks and to evaluate the proposed service discovery model. Extensive experiments have been conducted through simulations, and the obtained results have demonstrated the efficiency and effectiveness of the proposed model.
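The interest-aware forwarding idea can be sketched as follows. This is a simplified stand-in for the thesis's homophily-based routing: interests are plain topic sets, similarity is Jaccard overlap, and the fan-out limit `k` is an assumption:

```python
def jaccard(a, b):
    """Interest overlap between two nodes' topic sets."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

def forward_targets(query_topics, neighbours, k=2):
    """Interest-aware forwarding: send the query only to the k neighbours
    whose interests best match the query, instead of flooding every link.
    `neighbours` maps node id -> that node's interest set."""
    ranked = sorted(neighbours.items(),
                    key=lambda item: jaccard(query_topics, item[1]),
                    reverse=True)
    return [node for node, _ in ranked[:k]]

neighbours = {
    "n1": {"music", "film"},
    "n2": {"cloud", "p2p", "routing"},
    "n3": {"p2p", "privacy"},
}
targets = forward_targets({"p2p", "routing"}, neighbours)
```

Here the query about P2P routing is forwarded to n2 and n3 while the unrelated n1 is pruned, which is how selective forwarding cuts redundant query overhead relative to flooding.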
    • Power efficient and power attacks resistant system design and analysis using aggressive scaling with timing speculation

      Rathnala, Prasanthi; University of Derby (2017-05)
      The growing usage of smart and portable electronic devices demands that embedded system designers provide solutions with better performance and reduced power consumption. With the development of IoT and the growth of embedded systems usage, not only the power and performance of these devices but also their security is becoming an important design constraint. In this work, a novel aggressive scaling technique based on timing speculation is proposed to overcome the drawbacks of traditional DVFS and, at the same time, provide security against power analysis attacks. Dynamic voltage and frequency scaling (DVFS) has proven to be the most suitable technique for power efficiency in processor designs; due to its promising benefits, the technique continues to attract researchers' attention as a means of trading off power and performance in modern processor designs. The issues with traditional DVFS are: 1) because of its pre-calculated operating points, the system is unable to adapt to modern process variations; 2) since Process, Voltage and Temperature (PVT) variations are not considered, large timing margins are added to guarantee safe operation in the presence of variations. The research presented here addresses these issues by employing aggressive scaling mechanisms to achieve greater power savings with increased performance. This approach uses in-situ timing error monitoring and recovery mechanisms to reduce the extra timing margins and to account for process variations. A novel timing error detection and correction mechanism, achieving greater power savings or higher performance, is presented. This technique has also been shown to improve the security of processors against differential power analysis attacks, which can extract secret information from embedded systems without much knowledge of the internal architecture of the device.
Simulated and experimental data show that the novel technique can provide a performance improvement of 24% or power savings of 44%, while imposing only small area and power overheads. Overall, the proposed aggressive scaling technique improves power consumption and performance while increasing the security of processors against power analysis attacks.
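The control loop behind aggressive scaling with in-situ error monitoring can be sketched in software. The error model and every threshold below are invented for illustration; in the real design the error rate comes from on-chip timing-error detectors, not a function:

```python
def tune_voltage(error_rate_at, v_nominal=1.0, v_min=0.6,
                 step=0.05, target_error_rate=0.01):
    """Aggressive scaling sketch: keep lowering the supply voltage while the
    in-situ monitor reports a tolerable timing-error rate, speculating that
    occasional errors will be detected and corrected; stop one step before
    errors exceed the target. All numeric parameters are assumptions."""
    v = v_nominal
    while v - step >= v_min and error_rate_at(v - step) <= target_error_rate:
        v -= step
    return v

# Toy monitor model: no timing errors above 0.8 V, then errors grow linearly.
model = lambda v: 0.0 if v >= 0.8 else (0.8 - v) * 0.5
v_opt = tune_voltage(model)
```

Because the stopping point is measured rather than pre-calculated, the loop lands just above each individual chip's failure point, which is exactly the margin that worst-case DVFS operating points waste.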
    • A prescriptive analytics approach for energy efficiency in datacentres.

      Panneerselvam, John; University of Derby (University of Derby, 2018-02-19)
      Given the evolution of Cloud computing in recent years, users and clients adopting it for both personal and business needs have increased at an unprecedented scale, naturally leading to increased deployments of Cloud datacentres across the globe. As a consequence of this growing adoption, Cloud datacentres have become massive energy consumers and environmental polluters. Whilst the energy implications of Cloud datacentres are being addressed from various research perspectives, predicting the future trends and behaviours of workloads at the datacentres, and thereby reducing the active server resources, is one particular dimension of green computing gaining the interest of researchers and Cloud providers. However, this involves various practical and analytical challenges imposed by the increased dynamism of Cloud systems. The behavioural characteristics of Cloud workloads and users are still not perfectly understood, which restrains the reliability of the prediction accuracy of existing research in this context. To this end, this thesis presents comprehensive descriptive analytics of Cloud workload and user behaviours, uncovering the causes and energy-related implications of Cloud computing. Furthermore, the characteristics of Cloud workloads and users, including latency levels, job heterogeneity, user dynamicity, straggling task behaviours, the energy implications of stragglers, job execution and termination patterns and the inherent periodicity among Cloud workload and user behaviours, have been empirically presented. Driven by the descriptive analytics, a novel user behaviour forecasting framework has been developed, aimed at a tri-fold forecast of user behaviours: the session duration of users, the anticipated number of submissions and the arrival trend of the incoming workloads.
Furthermore, a novel resource optimisation framework has been proposed to provision the optimum level of resources for executing jobs with reduced server energy expenditure and fewer job terminations. This optimisation framework encompasses a resource estimation module to predict the anticipated resource consumption of arriving jobs and a classification module to classify tasks based on their resource intensiveness. Both of the proposed frameworks have been verified theoretically and tested experimentally based on Google Cloud trace logs. Experimental analysis demonstrates the effectiveness of the proposed frameworks in terms of the reliability of the forecast results and in reducing the server energy spent on executing jobs at the datacentres.
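The two modules of the optimisation framework can be sketched as a threshold classifier plus a naive server-count estimate. The thresholds, labels and normalised requests below are illustrative assumptions (the thesis derives its classes from Google trace analytics), and real placement must also respect per-machine packing constraints that this sketch ignores:

```python
import math

def classify_task(cpu_request, mem_request,
                  cpu_threshold=0.5, mem_threshold=0.5):
    """Classify a task by resource intensiveness; requests are normalised
    to a server's capacity and the thresholds are assumptions."""
    cpu_heavy = cpu_request >= cpu_threshold
    mem_heavy = mem_request >= mem_threshold
    if cpu_heavy and mem_heavy:
        return "cpu-and-memory-intensive"
    if cpu_heavy:
        return "cpu-intensive"
    if mem_heavy:
        return "memory-intensive"
    return "lightweight"

def estimate_servers(tasks, server_cpu=1.0, server_mem=1.0):
    """Naive resource estimation: total normalised demand, rounded up,
    bounds the active servers needed, so the rest can be powered down."""
    cpu = sum(c for c, _ in tasks)
    mem = sum(m for _, m in tasks)
    return max(math.ceil(cpu / server_cpu), math.ceil(mem / server_mem))

tasks = [(0.6, 0.2), (0.3, 0.7), (0.1, 0.1)]       # (cpu, mem) per task
labels = [classify_task(c, m) for c, m in tasks]
servers = estimate_servers(tasks)
```

Grouping complementary classes (a CPU-heavy task with a memory-heavy one) is what lets the estimated demand, and hence the number of powered-on servers, stay low.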
    • Service recommendation and selection in centralized and decentralized environments.

      Ahmed, Mariwan; University of Derby (2017-07-20)
      With the increasing use of web services in everyday tasks, we are entering an era of the Internet of Services (IoS). Service discovery and selection in both centralised and decentralised environments have become a critical issue in the area of web services, in particular when services have similar functionality but different Quality of Service (QoS). As a result, selecting a high-quality service that best suits consumer requirements from a large list of functionally equivalent services is a challenging task. In response to the increasing number of services in the discovery and selection process, there is a corresponding increase in service consumers and a consequent diversity in the QoS available. Growth on both sides leads to diversity in the demand and supply of services, which results in partial matches between requirements and offers. Furthermore, it is challenging for customers to select suitable services from the large number of services that satisfy their functional requirements. Therefore, web service recommendation becomes an attractive solution for recommending services to consumers that can satisfy their requirements. In this thesis, first, a service ranking and selection algorithm is proposed that considers multiple QoS requirements and allows partially matched services to be counted as candidates for the selection process. Starting from the initial list of available services, the approach considers those services with a partial match to consumer requirements and ranks them based on the QoS parameters, allowing the consumer to select a suitable service. In addition, since providing weight values for QoS parameters might not be an easy or understandable task for consumers, an automatic weight calculation method has been included that utilises distance correlation between QoS parameters. The second aspect of the work in this thesis is QoS-based web service recommendation.
With an increasing number of web services having similar functionality, it is challenging for service consumers to find suitable web services that meet their requirements. We propose a personalised service recommendation method using the LDA topic model, which extracts the latent interests of consumers and the latent topics of services in the form of probability distributions. In addition, the proposed method improves the accuracy of predicting QoS properties by considering the correlation between neighbouring services, and returns a list of recommended services that best satisfy consumer requirements. The third part of the thesis concerns service discovery and selection in a decentralised environment. Service discovery approaches are often supported by centralised repositories, which can suffer from single point of failure, performance bottleneck and scalability issues in large-scale systems. To address these issues, we propose a context-aware service discovery and selection approach for a decentralised peer-to-peer environment. In this approach, homophily similarity is used for the bootstrapping and distribution of nodes. The discovery process is based on the similarity of nodes and on their previous interactions and behaviour, which aids discovery in a dynamic environment. Our approach considers not only service discovery but also the selection of a suitable web service, taking into account the QoS properties of the web services. The major contribution of the thesis is a comprehensive QoS-based approach to service recommendation and selection in centralised and decentralised environments, with which consumers are able to select suitable services based on their requirements. Experimental results on real-world service datasets show that the proposed approaches achieve better performance and efficiency in the recommendation and selection process.
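The neighbourhood step of QoS prediction can be sketched as similarity-weighted collaborative filtering. This is a simplified stand-in for one component only, not the full LDA-based method; the history vectors and QoS values are invented examples:

```python
import math

def cosine(u, v):
    """Cosine similarity between two consumers' observed-QoS vectors."""
    num = sum(a * b for a, b in zip(u, v))
    den = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return num / den if den else 0.0

def predict_qos(target_history, neighbour_histories, neighbour_qos):
    """Predict an unseen QoS value (e.g. response time) for the target
    consumer as a similarity-weighted average over neighbouring consumers
    who have already invoked the candidate service."""
    weights = [cosine(target_history, h) for h in neighbour_histories]
    total = sum(weights)
    if total == 0:
        return None
    return sum(w * q for w, q in zip(weights, neighbour_qos)) / total

target = [1.2, 0.8, 2.0]                       # QoS observed on shared services
neighbours = [[1.1, 0.9, 2.1], [3.0, 2.5, 0.2]]
qos_on_new_service = [0.9, 2.4]                # neighbours' values for the service
pred = predict_qos(target, neighbours, qos_on_new_service)
```

The prediction is pulled towards the first neighbour's value because that neighbour's invocation history is much closer to the target's, which is the sense in which correlation with neighbours sharpens QoS prediction.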
    • Simulation-based impact analysis for sustainable manufacturing design and management

      University of Derby (2018)
      This research focuses on effective decision-making for sustainable manufacturing design and management. It contributes to the decision-making tools that enable sustainability analysts to capture the economic, environmental and social dimensions in a common framework. The framework enables practitioners to conduct a sustainability impact analysis of a real or proposed manufacturing system and use the outcome to support sustainability decisions. In the past, industries focused mainly on economic aspects in gaining and sustaining their competitive positions; this has changed in recent years following the Brundtland report, which centred on incorporating the sustainability of future generations into today's decisions for meeting present needs (Brundtland, 1987). Government regulations and legislation, coupled with changes in consumers' preference for ethical and environmentally friendly products, are other factors challenging and changing the way companies and organisations perceive and pursue their competitive goals (Gu et al., 2015). A further challenge is the lack of adequate tools to address the dynamism of the manufacturing environment and the need to balance business competitive goals with sustainability requirements. The launch of the Life Cycle Sustainability Analysis (LCSA) framework further emphasised the need to integrate and analyse the interdependencies of the three dimensions for effective decision-making and the control of unintended consequences (UNEP, 2011). Various studies have also demonstrated the importance of interdependence impact analysis and of integrating the three sustainability dimensions at the product, process and system levels (Jayal et al., 2010; Valdivia et al., 2013; Eastwood and Haapala, 2015).
Although tools exist that can assess the performance of one or two of the three sustainability dimensions, they have not adequately integrated all three or addressed holistic sustainability issues. Hence, this research proposes an approach that supports interdependence impact analysis and trade-offs amongst the three sustainability dimensions and enables effective decision-making in a manufacturing environment. This novel approach explores and integrates the concepts and principles of existing sustainability methodologies and frameworks with the simulation model construction process, forming a common descriptive framework for process-level assessment. The thesis deploys a Delphi study to verify and validate the descriptive framework, and demonstrates its applicability in a case study of a real manufacturing system. The results demonstrate the completeness, conciseness, correctness, clarity and applicability of the descriptive framework. The outcome of this research is thus a simulation-based impact analysis framework that provides a new way for sustainability practitioners to build an integrated, holistic computer simulation model of a real system, capable of assessing both the production and the sustainability performance of a dynamic manufacturing system.
    • Towards an efficient indexing and searching model for service discovery in a decentralised environment.

      Miao, Dejun; University of Derby (2018-05)
      Given the growth and outreach of new information, communication, computing and electronic technologies, the amount of data has increased explosively in recent years. Centralised systems have limitations in dealing with this growth because all data are stored in central data centres. Decentralised systems are therefore attracting more attention and increasing in popularity, and efficient service discovery mechanisms have naturally become an essential component of both large-scale and small-scale decentralised systems. This research study aims to model a novel, efficient indexing and searching model for service discovery in decentralised environments comprising numerous repositories with massive numbers of stored services. Its main contributions can be summarised in three components: a novel distributed multilevel indexing model, an optimised searching algorithm and a new simulation environment. Indexing models are widely used for efficient service discovery; for instance, the inverted index is a popular indexing model for service retrieval in consistent repositories. However, redundancies are inevitable in the inverted index, making service discovery and retrieval significantly time-consuming. This thesis proposes a novel distributed multilevel indexing model (DM-index), which offers an efficient solution for service discovery and retrieval in distributed service repositories storing massive numbers of services. The architecture of the proposed indexing model encompasses four hierarchical levels that eliminate redundant information in service repositories, narrow the search space and reduce the number of services traversed during discovery. Distributed Hash Tables have been widely used to provide data lookup services with logarithmic message costs while only requiring the maintenance of limited amounts of routing state.
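For reference, the inverted-index baseline the DM-index improves on maps each keyword to the set of services described by it; discovery is then a set intersection over the query terms. The service names and keywords below are made up:

```python
# Minimal inverted index over service descriptions - the baseline model
# discussed above, not the DM-index itself. All entries are illustrative.
from collections import defaultdict

services = {
    "WeatherSvc": ["weather", "forecast", "city"],
    "MapSvc":     ["map", "route", "city"],
    "TrafficSvc": ["traffic", "route", "city"],
}

# build: keyword -> set of services containing it
index = defaultdict(set)
for name, keywords in services.items():
    for kw in keywords:
        index[kw].add(name)

def discover(*terms):
    """Services matching every query term (set intersection)."""
    sets = [index.get(t, set()) for t in terms]
    return sorted(set.intersection(*sets)) if sets else []

print(discover("route", "city"))   # services offering both keywords
```

Note that a common keyword such as "city" appears in every posting list here; it is this kind of repeated, low-selectivity information that the four-level DM-index architecture is designed to prune before traversal.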
This thesis develops an optimised searching algorithm, named Double-layer No-redundancy Enhanced Bi-direction Chord (DNEB-Chord), to handle retrieval requests in distributed destination repositories efficiently. The DNEB-Chord algorithm achieves faster routing with its double-layer routing mechanism and optimal routing index. The efficiency of the developed indexing and searching model is evaluated through theoretical analysis and experimental evaluation in a newly developed simulation environment, named the Distributed Multilevel Bi-direction Simulator (DMBSim), which can be used as a cost-efficient tool for exploring various service configurations, user retrieval requirements and other parameter settings. Both the theoretical validation and the experimental evaluations demonstrate that the service discovery efficiency of the DM-index outperforms sequential index and inverted index configurations. Furthermore, the experimental results demonstrate that the DNEB-Chord algorithm outperforms Chord in terms of reducing incurred hop counts. Finally, simulation results demonstrate that the proposed indexing and searching model achieves better service discovery performance in large-scale decentralised environments comprising numerous repositories with massive numbers of stored services.
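The baseline that DNEB-Chord improves on is classic Chord routing, where each node greedily forwards a lookup via its finger table, reaching the key's owner in a logarithmic number of hops. The sketch below shows that baseline only; the ring, node identifiers and query are invented, and the double-layer and bi-directional optimisations of the thesis are not reproduced:

```python
# Minimal single-direction Chord-style lookup over an identifier ring,
# for intuition about the hop counts DNEB-Chord reduces.
M = 6                                            # 2**6 = 64 identifiers
RING = [1, 8, 14, 21, 32, 38, 42, 48, 51, 56]    # live node ids, sorted

def successor(ident):
    """First live node clockwise from ident (inclusive)."""
    for n in RING:
        if n >= ident:
            return n
    return RING[0]                               # wrap around the ring

def finger(node, i):
    """i-th finger table entry: successor of node + 2**i (mod ring size)."""
    return successor((node + 2 ** i) % 2 ** M)

def in_half_open(x, a, b):
    """x in (a, b] on the circular identifier space."""
    return a < x <= b if a < b else x > a or x <= b

def between(x, a, b):
    """x in (a, b) on the circular identifier space."""
    return a < x < b if a < b else x > a or x < b

def lookup(start, key):
    """Greedy finger-table routing; returns (owner, forwarding hops)."""
    node, hops = start, 0
    while True:
        succ = successor((node + 1) % 2 ** M)
        if in_half_open(key, node, succ):
            return succ, hops                    # succ owns the key
        nxt = succ                               # fallback: next node
        for i in reversed(range(M)):             # closest preceding finger
            f = finger(node, i)
            if between(f, node, key):
                nxt = f
                break
        node, hops = nxt, hops + 1

print(lookup(8, 54))   # owner of key 54 and the hops taken to reach it
```

Each forward at least halves the remaining ring distance, giving the O(log N) hop bound; DNEB-Chord's double-layer, bi-directional index aims to shave further hops off this routing.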
    • A Trust Evaluation Framework in Vehicular Ad-Hoc Networks

      Adnane, Asma; Franqueira, Virginia N. L.; Anjum, Ashiq; Ahmad, Farhan (University of Derby, College of Engineering and Technology, 2019-03-11)
      Vehicular Ad-Hoc Networks (VANET) are a cutting-edge technology providing connectivity to millions of vehicles around the world. VANET is central to the future of Intelligent Transportation Systems (ITS) and plays a significant role in the success of emerging smart cities and the Internet of Things (IoT). It provides a unique platform for vehicles to intelligently exchange critical information, such as collision avoidance or steep-curve warnings. It is therefore paramount that this information remains reliable and authentic, i.e., originates from a legitimate and trusted vehicle. Given the sensitive nature of messages in VANET, a secure, attack-free and trusted network is imperative for the propagation of reliable, accurate and authentic information. Ensuring such a network in VANET is extremely difficult due to its large-scale and open nature, which makes it susceptible to a diverse range of attacks including man-in-the-middle (MITM), replay, jamming and eavesdropping. Trust establishment among vehicles can increase network security by identifying dishonest vehicles and revoking messages with malicious content. For this purpose, several trust models (TMs) have been proposed, but there is currently no effective way to compare how they would behave in practice under adversarial conditions. Further, the proposed TMs are mostly context-dependent, and because vehicles are randomly distributed and highly mobile, context changes very frequently in VANET; ideally, a TM should perform in every VANET context. It is therefore important to have a common framework for the validation and evaluation of TMs. In this thesis, we propose a novel Trust Evaluation And Management (TEAM) framework, which serves as a unique paradigm for the design, management and evaluation of TMs in various contexts and in the presence of malicious vehicles. Our framework incorporates an asset-based threat model and ISO-based risk assessment for the identification of attacks and critical risks.
TEAM has been built using VEINS, an open-source simulation environment that incorporates the SUMO traffic simulator and the OMNeT++ discrete event simulator. The framework has been tested with implementations of three types of TM (data-oriented, entity-oriented and hybrid) under four different VANET contexts based on the mobility of both honest and malicious vehicles. Results indicate that TEAM is effective in simulating a wide range of TMs, with efficiency evaluated against different Quality of Service (QoS) and security-related criteria. Such a framework may be instrumental for planning smart cities and for car manufacturers.
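To make the data-oriented / entity-oriented / hybrid distinction concrete, a hybrid TM can be caricatured as a weighted mix of sender reputation (entity trust) and agreement among received reports (data trust). The weights, threshold and report format below are assumptions for illustration, not values from the TEAM framework:

```python
# Illustrative hybrid trust evaluation for a VANET safety message.
# All parameters are hypothetical.

def data_trust(reports):
    """Data-oriented component: fraction of neighbours agreeing with
    the majority claim about the event."""
    if not reports:
        return 0.0
    counts = {}
    for claim in reports:
        counts[claim] = counts.get(claim, 0) + 1
    return max(counts.values()) / len(reports)

def hybrid_trust(sender_reputation, reports, w_entity=0.5):
    """Hybrid component: weighted mix of entity and data trust."""
    return w_entity * sender_reputation + (1 - w_entity) * data_trust(reports)

# a collision warning echoed by 4 of 5 neighbours, sender reputation 0.8
score = hybrid_trust(0.8, ["collision", "collision", "clear",
                           "collision", "collision"])
accept = score >= 0.7          # assumed acceptance threshold
print(round(score, 2), accept)
```

A purely data-oriented TM would drop the reputation term, and a purely entity-oriented one the agreement term; a framework like TEAM exists precisely to compare how such variants behave across contexts and under malicious reporters.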