Recent Submissions

  • Parallaxical identities: Architectural semantics of contemporary arts institutions and the curation of cultural identity

    Tracada, Eleni; D'Arcy-Reed, Louis (University of Derby, 2019-09-19)
    The research project interrogates the identity-forming principles beneath contemporary arts museum architecture across physical and psychoanalytical dimensions. In identifying a metaphysical distance, or barrier, between the unconscious of the cultural architectural intervention and the identity within the cities’ fabric, the state of a parallaxical identity manifests itself. The parallaxical identity, developed from Slavoj Žižek’s parallax gap in psychoanalysis, elicits the presentation of ego-ideal, ideal-ego, and superego of architectural interventions seen as regenerative for culture, the city and its communities. Developing the parallax within architecture allows the thesis to include a rigorous interrogation of theory across the disciplines of psychoanalysis, architecture, contemporary art and museology, whilst also remediating the position of architectural practice beyond its conventional boundaries and rhetoric. Adopting a mixed methodology across theoretical and practical disciplines, the thesis reveals unconscious interpretations and embodied analyses through a weaving of para-architectural methods including photography, questionnaires, exploratory installations, written prose, and imagined cultural visualisations. Three major arts institutions act as case study analysands for psychoanalytical observation and diagnosis to take place, informing the resulting framework for observing parallaxical identities, whilst also producing recommendations for the future of the cultural institution of the museum/gallery. Alongside the thesis’ position as a critical commentary, a supplementary PhD exhibition proposal centred on Parallaxical Identities questions the role of architecture as a discipline that necessitates para-architectural and psychoanalytic methodologies, whilst also presenting new artistic works in response to the thesis to reveal to audiences the haptic and hidden structures within architecture and the ‘expected or unexpected’ parallaxical interventions of place.
  • Dynamic collaboration and secure access of services in multi-cloud environments

    Liu, Lu; Zhu, Shao Ying; Kazim, Muhammad (University of Derby, College of Engineering and Technology, 2019-08-19)
    Cloud computing services have gained popularity in both public and enterprise domains, and they process a large amount of user data with varying privacy levels. The increasing demand for cloud services, including storage and computation, requires new functional elements and provisioning schemes to meet user requirements. Multi-clouds can optimise the fulfilment of user requirements by allowing users to choose the best services from a large number of services offered by various cloud providers, as they are massively scalable, can be dynamically configured, and are delivered on demand with large-scale infrastructure resources. A major concern related to multi-cloud adoption is the lack of models for multi-clouds and their associated security issues, which become more unpredictable in a multi-cloud environment. Moreover, in order to trust the services in a foreign cloud, users depend on the assurances given by the cloud provider, but cloud providers offer very limited evidence or accountability to users, which gives providers the ability to hide some behaviour of the service. In this thesis, we propose a model for multi-cloud collaboration that can establish dynamic collaboration between heterogeneous clouds using the cloud on-demand model in a secure way. Initially, threat modelling for cloud services is carried out, leading to the identification of various threats to service interfaces along with the possible attackers and the mechanisms to exploit those threats. Based on this, the cloud provider can apply suitable mechanisms to protect services and user data from these threats. In the next phase, we present a lightweight and novel authentication mechanism, which provides single sign-on (SSO) to users for authentication at runtime between multi-clouds before granting them service access, and which is formally verified. Next, we provide a service scheduling mechanism to select the best services from multiple cloud providers that closely match user quality of service (QoS) requirements. The scheduling mechanism achieves high accuracy by applying a distance correlation weighting mechanism across a large number of service QoS parameters. In the next stage, novel service level agreement (SLA) management mechanisms are proposed to ensure secure service execution in the foreign cloud. The SLA mechanisms ensure that user QoS parameters, including the functional (CPU, RAM, memory, etc.) and non-functional (bandwidth, latency, availability, reliability, etc.) requirements for a particular service, are negotiated before secure collaboration between multi-clouds is set up. The multi-cloud handling user requests is responsible for enforcing mechanisms that fulfil the QoS requirements agreed in the SLA. The monitoring phase involves monitoring the service execution in the foreign cloud to check its compliance with the SLA and reporting it back to the user. Finally, we present use cases of applying the proposed model in scenarios such as the Internet of Things (IoT) and E-Healthcare in multi-clouds. Moreover, the designed protocols are empirically implemented on two different clouds, including OpenStack and Amazon AWS. Experiments indicate that the proposed model is scalable, that the authentication protocols incur only a limited overhead compared to standard authentication protocols, that service scheduling achieves high efficiency, and that any SLA violations by a cloud provider can be recorded and reported back to the user.
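    As an aside on the distance correlation weighting idea mentioned in this abstract, the sketch below is a minimal illustration of the statistic itself, assuming NumPy, invented QoS values and a hypothetical user-satisfaction column as the reference signal; it is not the thesis's scheduling mechanism.

```python
# Minimal sketch: weight each QoS parameter by its empirical distance correlation
# with a reference signal (here a hypothetical user-satisfaction score per service).
import numpy as np

def _centered(x):
    # Double-centred pairwise distance matrix of a 1-D sample.
    d = np.abs(x[:, None] - x[None, :])
    return d - d.mean(axis=0, keepdims=True) - d.mean(axis=1, keepdims=True) + d.mean()

def distance_correlation(x, y):
    a, b = _centered(x), _centered(y)
    dcov2 = (a * b).mean()                      # squared distance covariance
    denom = np.sqrt((a * a).mean() * (b * b).mean())
    return 0.0 if denom == 0 else np.sqrt(max(dcov2, 0.0) / denom)

# Rows = candidate services, columns = QoS parameters (latency ms, cost, availability).
qos = np.array([[120., 0.30, 0.999],
                [ 95., 0.45, 0.995],
                [200., 0.20, 0.990],
                [150., 0.25, 0.998]])
satisfaction = np.array([0.8, 0.9, 0.4, 0.6])   # hypothetical reference scores

weights = np.array([distance_correlation(qos[:, j], satisfaction)
                    for j in range(qos.shape[1])])
weights /= weights.sum()                        # normalise the weights to sum to 1
print(weights)
```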
  • Proposing a framework for organisational sustainable development: integrating quality management, supply chain management and sustainability

    Liyanage, Kapila; Bastas, Ali (University of Derby, College of Engineering and Technology, 2019-07-04)
    Increasing worldwide demand for products and services is placing significant pressure on firms and supply chains operationally and financially, along with negative implications for our planet and the public. New approaches need to be adopted by all members of society, including businesses, to achieve sustainable development. On the other hand, enabling such integration from an organisational management perspective is not straightforward, due to complexities and conflicts associated with balanced integration of economic, environmental and social agendas. Aimed at addressing this important research requirement, a tailored conceptual framework is presented, constructed upon the synergistic principles of quality management (QM) and supply chain management (SCM) to facilitate integration of triple bottom line sustainability into business management. As the first step of the research, a systematic literature review was conducted, evidencing research gaps and opportunities. A conceptual framework was established, and an implementation procedure to facilitate operationalisation of the framework was developed, including a business diagnostic tool that aids current-state maturity assessment as one of the key implementation steps. These developments were verified, validated and improved through the Delphi method, and applied at an organisation in Cyprus as the final validation step, using the action research method. Positive relationships were established and verified conceptually between the ISO 9001 principles of QM, the supply chain integration principle of SCM, and organisational triple bottom line sustainability integration. The relative importance of these principles adopted in the framework was determined based on expert Delphi panel feedback. The action research demonstrated the application of the framework, outlined its contextual implementation factors, and concluded positive effects on the sustainable development of the participating organisation. Several contributions to knowledge were made, including the refinement of existing QM and SCM concepts for organisational sustainability improvement, and the formulation of a practical framework including a novel diagnostic tool to facilitate integration of triple bottom line sustainability through QM and SCM. In particular, a new management perspective was introduced with implications for many organisational managers who adopt ISO 9001 and supply chain integration principles, setting the way for extending these principles beyond their original QM and SCM agendas towards organisational sustainable development.
  • Smart City: A Traffic Signal Control System for Reducing the Effects of Traffic Congestion in Urban Environments

    Hardy, James (University of Derby, 2019-06-10)
    This thesis addresses the detrimental effects of road traffic congestion in the Smart City environment. Urban congestion is a recognisable problem that affects much of the world’s population through delays and pollution, although delays are not an entirely modern phenomenon. The progressive increase in urbanisation and the numbers of powered road vehicles have led to an increasing need to control traffic in order to maintain flows and avoid gridlock situations. Signalised methods typically control flows through reduction, frequently increasing delays, holding traffic within the urban area and increasing local pollution. The current levels of vehicular congestion may relate to an increase in traffic volumes of 300% over 50 years, while traffic control methods based on delaying moving traffic have changed very little. Mobility and socio-economic trends indicate that the number of active road vehicles will increase, or at least remain at the same levels, in the foreseeable future, and as a result congestion will continue to be a problem. The Smart City concept is intended to improve the urban environment through the application of advanced technology. Within the context of road transportation, the urban area consists of a wide variety of low to moderate speed transportation systems ranging from pedestrians to heavy goods vehicles. Urban roadways have a large number of junctions where the transport systems and flows interact, presenting additional and more complex challenges compared to high-speed dual carriageways and motorways. Congestion is a function of population density while car ownership is an indicator of affluence; road congestion can therefore be seen as an indicator of local economic and social prosperity. Congestion cannot be resolved while there is a social benefit to urbanisation, high density living and a materialistic population. Recognising that congestion cannot be resolved, this research proposes a method to reduce the undesirable consequences and side effects of traffic congestion, such as transit delays, inefficient fuel use and chemical pollution, without adversely affecting the social and economic benefits. Existing traffic signal systems manage traffic flows based on traffic arrivals, prediction and traffic census models. Flow modification is accomplished by introducing delays through signal transition in order to prioritise a conflicting direction. It is incorrectly assumed that traffic will always be able to move and therefore that signal changes will always have an effect. Signal transitions result in lost time at the junction. Existing Urban Traffic Control systems have limited capability as they are unable to adapt immediately to unexpected conditions, have a finite response, cannot modify stationary flow and may introduce needless losses through inefficient transition. This research proposes and develops Available Forward Road Capacity (AFRC), an algorithm that can detect the onset of congestion, actively promote clearance, prevent unnecessary losses due to ineffective transitions, and influence other AFRC-equipped junctions to ensure the most efficient use of unoccupied road capacity. AFRC is an additional function that can be applied to existing traffic controllers, becoming active only during congestion conditions; as a result it cannot increase congestion above current levels. By reducing the duration of congestion periods, AFRC reduces delays, improves the efficiency of fuel use and reduces pollution.
AFRC is a scalable, multi-junction generalised solution which is able to manage traffic from multiple directions without prior tuning; it can detect and actively resolve problems with stationary traffic. AFRC is evaluated using a commercial traffic simulation system and is shown to resolve inbound and outbound congestion in less time than Vehicle Actuated and Fully Timed systems when simulating both morning and evening rush-hours.
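    As a toy illustration of the idea behind AFRC (not the algorithm developed in the thesis, whose detection and coordination logic is far richer), the sketch below only serves a signal phase when its downstream link has unoccupied capacity, so that green time is never spent on a movement that cannot discharge; all names and numbers are invented.

```python
# Toy sketch: serve the phase that can actually move the most vehicles, and make
# no transition at all if every downstream link is full (avoiding lost time).

def forward_capacity(link):
    """Unoccupied space on the downstream link, in vehicles."""
    return link["capacity"] - link["occupancy"]

def choose_phase(phases, min_release=1):
    # phases: demand queue plus the downstream link each movement feeds into.
    serviceable = [p for p in phases
                   if p["queue"] > 0 and forward_capacity(p["downstream"]) >= min_release]
    if not serviceable:
        return None   # no ineffective transition when nothing can discharge
    return max(serviceable,
               key=lambda p: min(p["queue"], forward_capacity(p["downstream"])))

phases = [
    {"name": "north", "queue": 12, "downstream": {"capacity": 20, "occupancy": 20}},  # blocked
    {"name": "east",  "queue": 7,  "downstream": {"capacity": 15, "occupancy": 6}},
]
print(choose_phase(phases)["name"])   # -> east
```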
  • High Performance Video Stream Analytics System for Object Detection and Classification

    Anjum, Ashiq; Yaseen, Muhammad Usman (University of Derby, College of Engineering and Technology, 2019-02-05)
    Due to the recent advances in cameras, cell phones and camcorders, particularly the resolution at which they can record an image/video, large amounts of data are generated daily. This video data is often so large that manually inspecting it for object detection and classification can be time-consuming and error-prone, and therefore requires automated analysis to extract useful information and meta-data. Automated analysis of video streams also comes with numerous challenges, such as blurred content and variation in illumination conditions and poses. In this thesis, we investigate an automated video analytics system which takes into account characteristics from both shallow and deep learning domains. We propose the fusion of features from the spatial frequency domain to perform highly accurate blur- and illumination-invariant object classification using deep learning networks. We also propose the tuning of hyper-parameters associated with the deep learning network through a mathematical model. The mathematical model used to support hyper-parameter tuning improved the performance of the proposed system during training. The effects of various hyper-parameters on the system's performance are compared, and the parameters that contribute towards the best performance are selected for video object classification. The proposed video analytics system has been demonstrated to process a large number of video streams, and the underlying infrastructure is able to scale based on the number and size of the video stream(s) being processed. Extensive experimentation on publicly available image and video datasets reveals that the proposed system is significantly more accurate and scalable and can be used as a general-purpose video analytics system.
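    As a small, hypothetical illustration of one ingredient described above, namely frequency-domain features fused with spatial ones, the sketch below (assuming NumPy and SciPy, with a random stand-in patch) keeps the low-frequency block of a 2-D DCT, whose coefficients change comparatively little under blur, and concatenates it with the raw pixels; the thesis's actual feature fusion and deep network are not reproduced here.

```python
# Sketch: fuse spatial (pixel) features with low-frequency 2-D DCT coefficients.
import numpy as np
from scipy.fft import dctn

def frequency_features(gray_patch, keep=8):
    # 2-D DCT of a grayscale patch; keep only the top-left (low-frequency) block.
    coeffs = dctn(gray_patch, norm="ortho")
    return coeffs[:keep, :keep].ravel()

patch = np.random.rand(32, 32)                   # stand-in for a detected object patch
fused = np.concatenate([patch.ravel(), frequency_features(patch)])
print(fused.shape)                               # (1024 + 64,) fused feature vector
```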
  • A Trust Evaluation Framework in Vehicular Ad-Hoc Networks

    Adnane, Asma; Franqueira, Virginia N. L.; Anjum, Ashiq; Ahmad, Farhan (University of Derby, College of Engineering and Technology, 2019-03-11)
    The Vehicular Ad-Hoc Network (VANET) is a novel, cutting-edge technology which provides connectivity to millions of vehicles around the world. It is the future of Intelligent Transportation Systems (ITS) and plays a significant role in the success of emerging smart cities and the Internet of Things (IoT). VANET provides a unique platform for vehicles to intelligently exchange critical information, such as collision avoidance or steep-curve warnings. It is, therefore, paramount that this information remains reliable and authentic, i.e., originated from a legitimate and trusted vehicle. Due to the sensitive nature of the messages in VANET, a secure, attack-free and trusted network is imperative for the propagation of reliable, accurate and authentic information. In the case of VANET, ensuring such a network is extremely difficult due to its large-scale and open nature, which makes it susceptible to a diverse range of attacks including man-in-the-middle (MITM), replay, jamming and eavesdropping. Trust establishment among vehicles can increase network security by identifying dishonest vehicles and revoking messages with malicious content. For this purpose, several trust models (TMs) have been proposed but, currently, there is no effective way to compare how they would behave in practice under adversarial conditions. Further, the proposed TMs are mostly context-dependent. Due to randomly distributed and highly mobile vehicles, context changes very frequently in VANET. Ideally, TMs should perform in every context of VANET. Therefore, it is important to have a common framework for the validation and evaluation of TMs. In this thesis, we propose a novel Trust Evaluation And Management (TEAM) framework, which serves as a unique paradigm for the design, management and evaluation of TMs in various contexts and in the presence of malicious vehicles. Our framework incorporates an asset-based threat model and ISO-based risk assessment for the identification of attacks against critical risks. TEAM has been built using VEINS, an open source simulation environment which incorporates the SUMO traffic simulator and the OMNET++ discrete event simulator. The framework created has been tested with the implementation of three types of TM (data-oriented, entity-oriented and hybrid) under four different contexts of VANET based on the mobility of both honest and malicious vehicles. Results indicate that TEAM is effective in simulating a wide range of TMs, where efficiency is evaluated against different Quality of Service (QoS) and security-related criteria. Such a framework may be instrumental for planning smart cities and for car manufacturers.
  • Simulation-based impact analysis for sustainable manufacturing design and management

    University of Derby (2018)
    This research focuses on effective decision-making for sustainable manufacturing design and management. The research contributes to decision-making tools that can enable sustainability analysts to capture aspects of the economic, environmental and social dimensions in a common framework. The framework will enable practitioners to conduct a sustainability impact analysis of a real or proposed manufacturing system and use the outcome to support sustainability decisions. In the past, industries focused more on the economic aspects of gaining and sustaining their competitive positions; this has changed in recent years following the Brundtland report, which centred on incorporating the sustainability of future generations into our decisions for meeting today’s needs (Brundtland, 1987). Government regulations and legislation, coupled with changes in consumers’ preference for ethical and environmentally friendly products, are other factors that are challenging and changing the way companies and organisations perceive and drive their competitive goals (Gu et al., 2015). Another challenge is the lack of adequate tools to address the dynamism of the manufacturing environment and the need to balance the business’ competitive goal with sustainability requirements. The launch of the Life Cycle Sustainability Analysis (LCSA) framework further emphasised the need for the integration and analysis of the interdependencies of the three dimensions for effective decision-making and the control of unintended consequences (UNEP, 2011). Various studies have also demonstrated the importance of interdependence impact analysis and integration of the three sustainability dimensions at the product, process and system levels (Jayal et al., 2010; Valdivia et al., 2013; Eastwood and Haapala, 2015). Although there are tools capable of assessing the performance of either one or two of the three sustainability dimensions, the tools have not adequately integrated the three dimensions or addressed holistic sustainability issues. Hence, this research proposes an approach to provide a solution for successful interdependence impact analysis and trade-off amongst the three sustainability dimensions and to enable support for effective decision-making in a manufacturing environment. This novel approach explores and integrates the concepts and principles of existing sustainability methodologies and frameworks and the simulation modelling construction process into a common descriptive framework for process-level assessment. The thesis deploys a Delphi study to verify and validate the descriptive framework and demonstrates its applicability in a case study of a real manufacturing system. The results of the research demonstrate the completeness, conciseness, correctness, clarity and applicability of the descriptive framework. Thus, the outcome of this research is a simulation-based impact analysis framework which provides a new way for sustainability practitioners to build an integrated and holistic computer simulation model of a real system, capable of assessing both the production and sustainability performance of a dynamic manufacturing system.
  • Computer aided design of 3D of renewable energy platform for Togo's smart grid power system infrastructure

    Komlanvi, Moglo; University of Derby (2018-09-04)
    The global requirement for sustainable energy provision will become increasingly important over the next fifty years as the environmental effects of fossil fuel use become apparent. Therefore, the issues surrounding integration of renewable energy supplies need to be considered carefully. The focus of this work was the development of an innovative computer-aided design of a 3-dimensional renewable energy platform for Togo’s smart grid power system infrastructure, and its validation for industrial, commercial and domestic applications. The wind, hydro and PV systems forming our 3-dimensional renewable energy power generation system introduce a new path for hybrid systems which extends system capabilities to include a stable and constant clean energy supply, reduced harmonic distortion, and improved power system efficiency. Issues requiring consideration in high-percentage renewable energy systems therefore include the reliability of the supply when intermittent sources of electricity are being used, and the subsequent necessity for storage and back-up generation. The adoption of genetic algorithms in this case was well suited to minimising the total harmonic distortion (THD), as the adoption of the cascaded H-bridge multilevel inverter (CHB-MLI) was ideal for connecting renewable energy sources with an AC grid. Cascaded inverters have also been proposed for use as the main traction drive in electric vehicles, where several batteries or ultra-capacitors are well suited to serve as separate DC sources. Simulations carried out under various non-linear load conditions showed the proportionality of an integral-control-based compensating cascaded passive filter, thereby balancing the system even under non-linear load conditions. The measured total harmonic distortion of the source currents was found to be 2.36%, in compliance with the IEEE 519-1992 and IEC 61000-3 standards for harmonics. This work has succeeded in developing a more complete tool for analysing the feasibility of integrated renewable energy systems. This will allow informed decisions to be made about the technical feasibility of supply mix and control strategies, plant type, sizing and storage sizing, for any given area and range of supply options. The developed 3D renewable energy platform was examined and evaluated using CAD software analysis and a laboratory-based mini test. The initial results showed improvements compared to other hybrid systems and their existing control systems. There was a notable improvement in dynamic load demand and response, and in the stability of the system, with reduced harmonic distortion. The derivatives of this research therefore propose an innovative solution and a path for Togo in its intention of switching to renewable energy, especially for its smart grid power system infrastructure.
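    Since the abstract reports a measured THD figure, the short sketch below shows how total harmonic distortion is conventionally computed from the spectrum of a current waveform; it uses NumPy and an invented test signal (5% third and 2% fifth harmonic), not data from the thesis.

```python
# THD = sqrt(sum of harmonic amplitudes squared) / fundamental amplitude.
import numpy as np

fs, f0, n = 5000, 50.0, 5000                   # 1 s of samples, whole number of 50 Hz cycles
t = np.arange(n) / fs
i = (np.sin(2 * np.pi * f0 * t)                # fundamental
     + 0.05 * np.sin(2 * np.pi * 3 * f0 * t)   # 5% third harmonic
     + 0.02 * np.sin(2 * np.pi * 5 * f0 * t))  # 2% fifth harmonic

spectrum = np.abs(np.fft.rfft(i)) / n * 2      # single-sided amplitude spectrum
fund_bin = int(round(f0 * n / fs))             # FFT bin of the fundamental
harmonics = spectrum[2 * fund_bin::fund_bin]   # amplitudes at 2*f0, 3*f0, ...
thd = np.sqrt(np.sum(harmonics ** 2)) / spectrum[fund_bin]
print(f"THD = {thd * 100:.2f} %")              # about 5.4 % for this test signal
```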
  • Evaluation and improvement on service quality of Chinese university libraries under new information environments.

    Fan, Yue Qian; University of Derby (2018-06)
    The rapid development of information technology in recent years has added a range of new features to the traditional information environment, which has a profound impact on university library services and users. Quality of service in library services has reached a broad consensus as a parameter which directly reflects customer satisfaction and loyalty. In this context, the importance of exploring evaluation frameworks for service quality in university libraries cannot be overlooked. Moreover, existing evaluation frameworks for the service quality of university libraries are facing numerous challenges due to their imperfections. Thus, there is an urgent need to explore and enhance the efficiency of service quality evaluation frameworks. To this end, this thesis conducts a systematic analysis of evaluation frameworks, with the motivation of identifying, through empirical methods, the core components that need enhancement to achieve effective service quality in Chinese university libraries. Furthermore, the inferences extracted from the analysis have been used to provide suitable recommendations for improving the service quality of university libraries.
  • Towards an efficient indexing and searching model for service discovery in a decentralised environment.

    Miao, Dejun; University of Derby (2018-05)
    Given the growth and outreach of new information, communication, computing and electronic technologies in various dimensions, the amount of data has increased explosively in recent years. Centralised systems suffer from limitations in dealing with this issue because all data is stored in central data centres. Thus, decentralised systems are getting more attention and increasing in popularity. Moreover, efficient service discovery mechanisms have naturally become an essential component in both large-scale and small-scale decentralised systems. This research study is aimed at developing a novel, efficient indexing and searching model for service discovery in decentralised environments comprising numerous repositories with massive numbers of stored services. The main contributions of this research study can be summarised in three components: a novel distributed multilevel indexing model, an optimised searching algorithm and a new simulation environment. Indexing models have been widely used for efficient service discovery. For instance, the inverted index is one of the popular indexing models used for service retrieval in consistent repositories. However, redundancies are inevitable in the inverted index, which makes the service discovery and retrieval process significantly time-consuming. This thesis proposes a novel distributed multilevel indexing model (DM-index), which offers an efficient solution for service discovery and retrieval in distributed service repositories holding massive numbers of stored services. The architecture of the proposed indexing model encompasses four hierarchical levels to eliminate redundant information in service repositories, to narrow the search space and to reduce the number of traversed services whilst discovering services. Distributed Hash Tables have been widely used to provide data lookup services with logarithmic message costs, requiring only the maintenance of limited amounts of routing state. This thesis develops an optimised searching algorithm, named Double-layer No-redundancy Enhanced Bi-direction Chord (DNEB-Chord), to handle retrieval requests in distributed destination repositories efficiently. The DNEB-Chord algorithm achieves faster routing performance through its double-layer routing mechanism and optimal routing index. The efficiency of the developed indexing and searching model is evaluated through theoretical analysis and experimental evaluation in a newly developed simulation environment, named the Distributed Multilevel Bi-direction Simulator (DMBSim), which can be used as a cost-efficient tool for exploring various service configurations, user retrieval requirements and other parameter settings. Both the theoretical validation and experimental evaluations demonstrate that the service discovery efficiency of the DM-index outperforms the sequential index and inverted index configurations. Furthermore, the experimental evaluation results demonstrate that the DNEB-Chord algorithm performs better than Chord in terms of reducing the incurred hop counts. Finally, simulation results demonstrate that the proposed indexing and searching model can achieve better service discovery performance in large-scale decentralised environments comprising numerous repositories with massive numbers of stored services.
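    For readers unfamiliar with the baseline the DM-index improves upon, the following is a minimal sketch of a plain inverted index used for conjunctive service retrieval, with invented service descriptions and only the standard library; the DM-index and DNEB-Chord structures developed in the thesis are considerably more involved.

```python
# Toy inverted index over service descriptions: term -> set of service identifiers.
from collections import defaultdict

services = {
    "s1": "cloud storage backup encryption",
    "s2": "image storage cdn",
    "s3": "cloud compute gpu",
}

index = defaultdict(set)
for sid, text in services.items():
    for term in text.split():
        index[term].add(sid)

def lookup(*terms):
    # Services containing every query term (conjunctive retrieval).
    sets = [index.get(t, set()) for t in terms]
    return set.intersection(*sets) if sets else set()

print(lookup("cloud", "storage"))   # -> {'s1'}
```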
  • A novel service discovery model for decentralised online social networks.

    Yuan, Bo; University of Derby (2018-03)
    Online social networks (OSNs) have become the most popular Internet application, attracting billions of users who share information, disseminate opinions and interact with others in the online society. The unprecedented and growing popularity of OSNs has naturally made the use of social network services a pervasive phenomenon in our daily life. The majority of OSN service providers adopt a centralised architecture because of its management simplicity and content controllability. However, the centralised architecture for large-scale OSN applications incurs costly deployment of computing infrastructure and suffers from performance bottlenecks. Moreover, the centralised architecture has two major shortcomings: the single point of failure problem and the lack of privacy, which challenge uninterrupted service provision and raise serious privacy concerns. This thesis proposes a decentralised approach based on peer-to-peer (P2P) networks as an alternative to the traditional centralised architecture. Firstly, a self-organised architecture with self-sustaining social network adaptation has been designed to support decentralised topology maintenance. This self-organised architecture exhibits small-world characteristics, with a short average path length and a large average clustering coefficient, to support efficient information exchange. Based on this self-organised architecture, a novel decentralised service discovery model has been developed to achieve semantic-aware and interest-aware query routing in the P2P social network. The proposed model encompasses a service matchmaking module to capture the hidden semantic information for query-service matching and a homophily-based query processing module to characterise users’ common social status and interests for personalised query routing. Furthermore, in order to optimise the efficiency of service discovery, a swarm-intelligence-inspired algorithm has been designed to reduce the query routing overhead. This algorithm employs an adaptive forwarding strategy that can adapt to various social network structures and achieves promising search performance with low redundant query overhead in dynamic environments. Finally, a configurable software simulator is implemented to simulate complex networks and to evaluate the proposed service discovery model. Extensive experiments have been conducted through simulations, and the obtained results have demonstrated the efficiency and effectiveness of the proposed model.
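    As a small illustration of the two small-world metrics referred to above, the sketch below (assuming the networkx library) measures the average clustering coefficient and average shortest path length on a connected Watts-Strogatz graph, a standard small-world model rather than the thesis's self-organised P2P overlay.

```python
# Measure small-world characteristics on a synthetic 1000-peer overlay.
import networkx as nx

G = nx.connected_watts_strogatz_graph(n=1000, k=10, p=0.1, seed=42)
print("average clustering coefficient:", nx.average_clustering(G))
print("average shortest path length:", nx.average_shortest_path_length(G))
```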
  • A prescriptive analytics approach for energy efficiency in datacentres.

    Panneerselvam, John; University of Derby (University of Derby, 2018-02-19)
    Given the evolution of Cloud Computing in recent years, the number of users and clients adopting Cloud Computing for both personal and business needs has increased at an unprecedented scale. This has naturally led to increased deployment and implementation of Cloud datacentres across the globe. As a consequence of this increasing adoption of Cloud Computing, Cloud datacentres have become massive energy consumers and environmental polluters. Whilst the energy implications of Cloud datacentres are being addressed from various research perspectives, predicting the future trends and behaviours of workloads at the datacentres, and thereby reducing the active server resources, is one particular dimension of green computing gaining the interest of researchers and Cloud providers. However, this involves various practical and analytical challenges imposed by the increased dynamism of Cloud systems. The behavioural characteristics of Cloud workloads and users are still not perfectly clear, which restricts the reliability and prediction accuracy of existing research in this context. To this end, this thesis presents comprehensive descriptive analytics of Cloud workload and user behaviours, uncovering the causes and energy-related implications of Cloud Computing. Furthermore, the characteristics of Cloud workloads and users, including latency levels, job heterogeneity, user dynamicity, straggling task behaviours, energy implications of stragglers, job execution and termination patterns and the inherent periodicity among Cloud workload and user behaviours, have been empirically presented. Driven by descriptive analytics, a novel user behaviour forecasting framework has been developed, aimed at a tri-fold forecast of user behaviours including the session duration of users, the anticipated number of submissions and the arrival trend of the incoming workloads. Furthermore, a novel resource optimisation framework has been proposed to provision the optimum level of resources for executing jobs with reduced server energy expenditure and fewer job terminations. This optimisation framework encompasses a resource estimation module to predict the anticipated resource consumption level for the arrived jobs and a classification module to classify tasks based on their resource intensiveness. Both the proposed frameworks have been verified theoretically and tested experimentally based on Google Cloud trace logs. Experimental analysis demonstrates the effectiveness of the proposed framework in terms of the reliability of the forecast results and in reducing the server energy expenditure spent on executing jobs at the datacentres.
  • Assessing the credibility of online social network messages.

    Makinde, Oghenefejiro Winnie; University of Derby (University of Derby, 2018-01)
    Information gathered socially online is a key feature of the growth and development of modern society. Presently, the Internet is a platform for the distribution of data. Millions of people use Online Social Networks daily as a tool to stay updated on social, political, educational or other occurrences. In many cases, information derived from an Online Social Network is acted upon, and often shared with other networks, without further assessment or judgement. Many people do not check to see if the information shared is credible. A user may trust the information generated by a close friend without questioning its credibility, in contrast to a message generated by an unknown user. This work considers the concept of credibility in the wider sense, asking whether a user can trust the service provider or even the information itself. Two key components of credibility have been explored: trustworthiness and expertise. Credibility has been researched in the past using Twitter as a validation tool; that research focused on automatic methods of assessing the credibility of sets of tweets by analysing microblog postings related to trending topics. This research develops a framework that can assist the assessment of the credibility of messages in Online Social Networks. Four types of credibility are explored (experienced, surface, reputed and presumed credibility), resulting in a credibility hierarchy. To determine the credibility of messages generated and distributed in Online Social Networks, a virtual network is created in which nodes with individual views generate messages at random; data from the network are recorded and analysed based on the behaviour exhibited by agents (an agent-based modelling approach). The factors considered for the experiment design included peer-to-peer networking, collaboration, opinion formation and network rewiring. The behaviour of agents, the frequency with which messages are shared and used, the pathway of the messages and how this affects the credibility of messages are also considered. A framework is designed, and the resulting data are tested against it. The data generated validated the framework in part, supporting an approach whereby tagging the message status assists the understanding and application of the credibility hierarchy. Validation was carried out with Twitter data acquired through Twitter’s Application Programming Interface (API). There were similarities in the generation and frequency of the message distributions in the network; these findings were also recorded and analysed using the framework proposed. Some limitations were encountered while acquiring data from Twitter; however, there was sufficient evidence of correlation between the simulated and real social network datasets to indicate the validity of the framework.
  • Service recommendation and selection in centralized and decentralized environments.

    Ahmed, Mariwan; University of Derby (2017-07-20)
    With the increasing use of web services in everyday tasks, we are entering the era of the Internet of Services (IoS). Service discovery and selection in both centralized and decentralized environments have become a critical issue in the area of web services, in particular when services have similar functionality but different Quality of Service (QoS). As a result, selecting a high-quality service that best suits consumer requirements from a large list of functionally equivalent services is a challenging task. Alongside the increasing number of services in the discovery and selection process, there is a corresponding increase in service consumers and a consequent diversity in the Quality of Service (QoS) available. Increases on both sides lead to diversity in the demand and supply of services, which results in partial matches between requirements and offers. Furthermore, it is challenging for customers to select suitable services from a large number of services that satisfy consumer functional requirements. Therefore, web service recommendation becomes an attractive solution to provide consumers with recommended services which can satisfy their requirements. In this thesis, first, a service ranking and selection algorithm is proposed that considers multiple QoS requirements and allows partially matched services to be counted as candidates in the selection process. From the initial list of available services, the approach considers those services with a partial match of consumer requirements and ranks them based on the QoS parameters, allowing the consumer to select a suitable service. In addition, providing weight values for QoS parameters might not be an easy or intuitive task for consumers; as a result, an automatic weight calculation method has been included for consumer requirements, utilizing the distance correlation between QoS parameters. The second aspect of the work in the thesis is the process of QoS-based web service recommendation. With an increasing number of web services having similar functionality, it is challenging for service consumers to find suitable web services that meet their requirements. We propose a personalised service recommendation method using the LDA topic model, which extracts latent interests of consumers and latent topics of services in the form of probability distributions. In addition, the proposed method is able to improve the accuracy of prediction of QoS properties by considering the correlation between neighbouring services, and returns a list of recommended services that best satisfy consumer requirements. The third part of the thesis concerns providing service discovery and selection in a decentralized environment. Service discovery approaches are often supported by centralized repositories that could suffer from single point of failure, performance bottlenecks, and scalability issues in large-scale systems. To address these issues, we propose a context-aware service discovery and selection approach in a decentralized peer-to-peer environment. In this approach, homophily similarity is used for bootstrapping and distribution of nodes. The discovery process is based on the similarity of nodes and their previous interactions and behaviour, which helps the discovery process in a dynamic environment. Our approach considers not only service discovery but also the selection of a suitable web service, taking into account the QoS properties of the web services.
The major contribution of the thesis is the provision of comprehensive QoS-based service recommendation and selection in centralized and decentralized environments. With the proposed approach, consumers are able to select a suitable service based on their requirements. Experimental results on real-world service datasets show that the proposed approaches achieve better performance and efficiency in the recommendation and selection process.
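    As an illustration of the LDA-based matching step described above, the sketch below (assuming scikit-learn and invented toy service descriptions) maps services and a consumer's interest text to topic distributions and ranks the services by cosine similarity of those distributions; the thesis's method additionally predicts QoS through neighbouring services, which is not shown here.

```python
# Sketch: LDA topic distributions for services and a consumer, then similarity ranking.
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

service_docs = [
    "weather forecast temperature api",
    "currency exchange rate conversion api",
    "temperature sensor weather station data",
]
consumer_interest = ["local weather temperature updates"]

vec = CountVectorizer()
X = vec.fit_transform(service_docs)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)

service_topics = lda.transform(X)                             # per-service topic distributions
user_topics = lda.transform(vec.transform(consumer_interest))

sims = (service_topics @ user_topics.T).ravel() / (
    np.linalg.norm(service_topics, axis=1) * np.linalg.norm(user_topics) + 1e-12)
print(np.argsort(-sims))   # service indices, best match first
```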
  • Power efficient and power attacks resistant system design and analysis using aggressive scaling with timing speculation

    Rathnala, Prasanthi; University of Derby (2017-05)
    The growing usage of smart and portable electronic devices demands that embedded system designers provide solutions with better performance and reduced power consumption. With the growth of IoT and embedded systems usage, not only the power and performance of these devices but also their security is becoming an important design constraint. In this work, a novel aggressive scaling approach based on timing speculation is proposed to overcome the drawbacks of traditional DVFS and, at the same time, provide security against power analysis attacks. Dynamic voltage and frequency scaling (DVFS) is proven to be the most suitable technique for power efficiency in processor designs. Due to its promising benefits, the technique continues to attract researchers’ attention for trading off the power and performance of modern processor designs. The issues with traditional DVFS are: 1) due to its pre-calculated operating points, the system is not able to adapt to modern process variations; 2) since process, voltage and temperature (PVT) variations are not considered, large timing margins are added to guarantee safe operation in the presence of variations. The research work presented here addresses these issues by employing aggressive scaling mechanisms to achieve more power savings with increased performance. This approach uses in-situ timing error monitoring and recovery mechanisms to reduce extra timing margins and to account for process variations. A novel timing error detection and correction mechanism is presented to achieve more power savings or higher performance. This technique has also been shown to improve the security of processors against differential power analysis attacks. Differential power analysis attacks can extract secret information from embedded systems without knowing many details about the internal architecture of the device. Simulated and experimental data show that the novel technique can provide a performance improvement of 24% or power savings of 44% while incurring low area and power overhead. Overall, the proposed aggressive scaling technique provides an improvement in power consumption and performance while increasing the security of processors against power analysis attacks.
  • High Voltage Optical Fibre Sensor for Use in Wire Relay Electrical Protection Systems

    Bashour, Rami; University of Derby (2016)
    The last few decades have seen widespread use of optical fibre sensors in many applications. Optical fibre sensors have significant benefits over existing conventional sensors, such as high immunity to electromagnetic interference, the ability to transmit signals over long distances at high bandwidth, high resolution, usability in hazardous environments and no need for isolation when working at high voltages. The measurement of high voltages is essential for electrical power systems, as it is used as a source of electrical information for Relay Protection Systems (RPS) and load management systems. Electrical power systems need to be protected from faults. Faults can range from short circuits to voltage dips, surges, transients, etc. The optical high voltage sensor developed is based on the principle that the electrostrictive displacement of lead zirconate titanate (PZT) changes when a voltage is applied to it. The displacement strains the fibre Bragg grating (FBG) bonded to the PZT material, causing a resultant change in the reflected wavelength. An optical fibre sensor prototype has been developed and evaluated that measures up to 250 V DC. Simulation using ANSYS software has been used to demonstrate the operational capability of the sensor up to 300 kV AC. This sensor overcomes some of the issues of conventional sensors, such as electromagnetic interference, signal transmission and resolution. A novel optical fibre high voltage sensor based on the Kerr effect has also been demonstrated. The Kerr effect was determined using Optsim (R-Soft) software, and Maxwell software was used to model an optical Kerr cell. Maxwell is an electromagnetic/electric field software package used for simulating, analysing and designing 2D and 3D electromagnetic materials and devices. It uses highly accurate finite element techniques to solve time-varying, static and frequency-domain electric and electromagnetic fields. Relay protection systems on electrical networks are also discussed in detail. Keywords: Fibre Bragg Grating, Fibre Optics Sensors, Piezoelectricity, Kerr effect, Relay Protection Systems.
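    For context, here is a short worked example of the standard fibre Bragg grating relations behind the sensing principle described above; the values are textbook-style assumptions, not figures taken from the thesis. The Bragg wavelength is set by the effective index and grating period, and strain transferred from the PZT shifts it by roughly 1.2 pm per microstrain near 1550 nm.

```python
# lambda_B = 2 * n_eff * Lambda; delta_lambda ~= lambda_B * (1 - p_e) * strain
n_eff, grating_period = 1.447, 535.6e-9   # assumed effective index and grating pitch (m)
p_e = 0.22                                # typical photo-elastic coefficient of silica
strain = 1e-6                             # 1 microstrain, e.g. induced by PZT electrostriction

lambda_b = 2 * n_eff * grating_period     # Bragg wavelength, close to 1550 nm here
shift = lambda_b * (1 - p_e) * strain     # wavelength shift for the applied strain
print(f"lambda_B = {lambda_b * 1e9:.1f} nm, shift = {shift * 1e12:.2f} pm per microstrain")
```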
  • A critical analysis of the continued use of Georgian buildings: a case study of Darley Abbey Mills, Derbyshire.

    Deakin, Emmie Lousie; University of Derby (2016)
    This thesis undertakes a critical assessment of the impact of statutory legislation and UNESCO World Heritage designation upon the sustainability and continued use of historic industrial buildings, utilising the late 18th-century Georgian industrial buildings of Darley Abbey Mills, Derby, as a case study. This thesis provides an in-depth and longitudinal analysis of the morphology and evolution of Darley Abbey Mills between 2006 and 2015. During this time, the assessment of whether the mills would find a sustainable and continued contemporary use shifted from a concern that the site was slowly disintegrating, with the danger of an important historical artefact being lost forever or becoming irrevocably damaged through lack of maintenance and repair, to a position where the future of the mills is looking promising. What makes Darley Abbey Mills so unusual, even unique, is that it possesses the highest possible levels of statutory protection but is also under private ownership. The initial findings from an analysis of policy documents and planning applications between 2006 and 2010 were that there was limited engagement with external heritage and conservation stakeholders or the Local Authority; an ‘umbrella of statutory protection’ was neither providing barriers nor protecting the site, and there was simply a lack of action by all parties. This changed during the period 2010-13, when the site came under new unified ownership. The new owners started to make small adaptations and repairs to the site that enabled them to attract new tenants from the creative and artisan communities; however, none of this work was authorised, nor was planning permission sought. Although there was still a lack of enforcement of what can be seen as ‘aspirational urbanism’, a dialogue was started between the owners and the wider stakeholder community. Between 2013 and 2015, the relationship between all of the stakeholders became more formalised, and an unofficial partnership was formed between the owners and the monitoring bodies that resulted in the successful planning application to adapt the West Mills and Long Mill, moving some of the way towards ensuring the sustainable and continued use of Darley Abbey Mills.
  • Computational fluid dynamics model of a quad-rotor helicopter for dynamic analysis

    Poyi, Gwangtim Timothy; Wu, Mian Hong; Bousbaine, Amar; University of Derby (Pioneer Research and Development Group, 2016-06-30)
    The control and performance of a quad-rotor helicopter UAV is greatly influenced by its aerodynamics, which in turn is affected by interactions with features in its remote environment. This paper presents details of the Computational Fluid Dynamics (CFD) simulation and analysis of a quad-rotor helicopter. It starts by presenting how SolidWorks software is used to develop a 3-D Computer Aided Design (CAD) model of the quad-rotor helicopter, then describes how CFD is used as a computer-based mathematical modelling tool to simulate and analyse the effects of wind flow patterns on the performance and control of the quad-rotor helicopter. With a view to developing a robust adaptive controller for the quad-rotor helicopter to withstand environmental constraints (which is not within the scope of this paper), this work accurately models the quad-rotor's static and dynamic characteristics from a limited number of time-accurate CFD simulations.
  • Cloud BI: A Multi-party Authentication Framework for Securing Business Intelligence on the Cloud

    Al-Aqrabi, Hussain; University of Derby (2016)
    Business intelligence (BI) has emerged as a key technology to be hosted on Cloud computing. BI offers a method to analyse data, thereby enabling informed decision-making to improve business performance and profitability. However, within the shared domains of Cloud computing, BI is exposed to increased security and privacy threats because an unauthorised user may be able to gain access to highly sensitive, consolidated business information. The business process contains collaborating services and users from multiple Cloud systems in different security realms which need to be engaged dynamically at runtime. If heterogeneous Cloud systems located in different security realms do not have direct authentication relationships, then it is technically difficult to enable secure collaboration. In order to address these security challenges, a new authentication framework is required to establish certain trust relationships among these BI service instances and users by distributing a common session secret to all participants of a session. The author addresses this challenge by designing and implementing a multiparty authentication framework for dynamic secure interactions when members of different security realms want to access services. The framework takes advantage of the trust relationship between session members in different security realms to enable a user to obtain security credentials to access Cloud resources in a remote realm. This mechanism can help Cloud session users authenticate their session membership to improve the authentication processes within multi-party sessions. The correctness of the proposed framework has been verified using BAN logic. The performance and the overhead have been evaluated via simulation in a dynamic environment. A prototype authentication system has been designed, implemented and tested based on the proposed framework. The research concludes that the proposed framework and its supporting protocols are an effective functional basis for practical implementation testing, as the framework achieves good scalability and imposes only minimal performance overhead, comparable with other state-of-the-art methods.
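    As a toy sketch of the basic idea of a common session secret shared by members from different realms, the code below (standard library only) lets any admitted member answer a nonce challenge with an HMAC over the session secret; it is an illustration of the concept only, not the BAN-verified multi-party protocol designed in the thesis.

```python
# A coordinator distributes one session secret to admitted members; membership is
# later proved by answering a nonce challenge with HMAC(session_secret, id || nonce).
import hashlib
import hmac
import secrets

session_secret = secrets.token_bytes(32)        # shared with members over channels
members = {"realm_a/user1": session_secret,     # established during admission
           "realm_b/service7": session_secret}

def prove(member_id, nonce):
    # Member's response to a challenge from another session participant.
    return hmac.new(members[member_id], member_id.encode() + nonce, hashlib.sha256).digest()

def verify(member_id, nonce, response):
    expected = hmac.new(session_secret, member_id.encode() + nonce, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

nonce = secrets.token_bytes(16)
print(verify("realm_b/service7", nonce, prove("realm_b/service7", nonce)))   # True
```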
