• Azimuthally Differential Pion Femtoscopy in Pb-Pb Collisions at √sNN = 2.76 TeV

      Andrews, Lee; Barnby, Lee; Evans, David; Graham, Katie; Jones, Peter; Jusko, Anton; Krivda, Marian; Lietava, Roman; Villalobos, Orlando; Zardoshti, Nima; et al. (2017-06-02)
    • Behavioural Digital Forensics Model: Embedding Behavioural Evidence Analysis into the Investigation of Digital Crimes

      Al Mutawa, Noora; Bryce, Joanne; Franqueira, Virginia N.L.; Marrington, Andrew; Read, Janet C.; University of Derby (Elsevier, 2018-12-15)
The state of the art and practice show an increased recognition, but limited adoption, of Behavioural Evidence Analysis (BEA) within the Digital Forensics (DF) investigation process. Yet, there is currently no BEA-driven process model and guidelines for DF investigators to follow in order to take advantage of such an approach. This paper proposes the Behavioural Digital Forensics Model to fill this gap. It takes a multidisciplinary approach which incorporates BEA into the in-lab investigation of seized devices related to interpersonal cases (i.e., digital crimes involving human interactions between offender(s) and victim(s)). The model was designed based on the application of traditional BEA phases to 35 real cases, and evaluated using 5 real digital crime cases, all from the Dubai Police archive. This paper, however, provides details of only one case from this evaluation pool. Compared to the outcome of these cases using a traditional DF investigation process, the new model showed a number of benefits. It allowed a more effective focusing of the investigation, and provided logical directions for identifying the location of further relevant evidence. It also enabled a better understanding and interpretation of victim/offender behaviours (e.g., probable offenders' motivations and modus operandi), which facilitated a more in-depth understanding of the dynamics of the specific crime. Finally, in some cases, it enabled the identification of a suspect's collaborators, something which was not identified via the traditional investigative process.
    • Behavioural evidence analysis applied to digital forensics: An empirical analysis of child pornography cases using P2P networks

      Mutawa, Noora Al; Bryce, Joanne; Franqueira, Virginia N. L.; Marrington, Andrew; University of Central Lancashire; University of Derby; Zayed University (IEEE Computer Society, 2015-08)
      The utility of Behavioural Evidence Analysis (BEA) has gained attention in the field of Digital Forensics in recent years. It has been recognized that, along with technical examination of digital evidence, it is important to learn as much as possible about the individuals behind an offence, the victim(s) and the dynamics of a crime. This can assist the investigator in producing a more accurate and complete reconstruction of the crime, in interpreting associated digital evidence, and with the description of investigative findings. Despite these potential benefits, the literature shows limited use of BEA for the investigation of cases of the possession and dissemination of Sexually Exploitative Imagery of Children (SEIC). This paper represents a step towards filling this gap. It reports on the forensic analysis of 15 SEIC cases involving P2P file sharing networks, obtained from the Dubai Police. Results confirmed the predicted benefits and indicate that BEA can assist digital forensic practitioners and prosecutors.
    • Big data analytics in healthcare: A cloud based framework for generating insights

      Anjum, Ashiq; Aizad, Sanna; Arshad, Bilal; Subhani, Moeez; Davies-Tagg, Dominic; Abdullah, Tariq; Antonopoulos, Nikolaos; University of Derby (Springer, 2017)
      With exabytes of data being generated from genome sequencing, a whole new science behind genomic big data has emerged. As technology improves, the cost of sequencing a human genome has gone down considerably, increasing the number of genomes being sequenced. Huge amounts of genomic data, along with a vast variety of clinical data, cannot be handled using existing frameworks and techniques. This data must be stored efficiently in a warehouse, which raises a number of considerations. Firstly, the genome data has to be integrated effectively and correctly with clinical data. The other data sources, along with their formats, have to be identified. Required data is then extracted from these other sources (such as clinical datasets) and integrated with the genome. The main challenge here is handling the integration complexity, as a large number of datasets are integrated with huge amounts of genome data. Secondly, since the data is captured at disparate locations individually by clinicians and scientists, it brings the challenge of data consistency. Data consistency must not be compromised as data passes through the warehouse, and checks have to be put in place to ensure the data remains consistent from start to finish. Thirdly, to carry this out effectively, the data infrastructure has to be designed appropriately. How frequently the data is accessed plays a crucial role here: data in frequent use will be handled differently from data which is not in frequent use. Lastly, efficient browsing mechanisms have to be put in place to allow the data to be quickly retrieved. The data is then iteratively analysed to obtain meaningful insights, where the challenge is to perform the analysis very quickly. Cloud computing plays an important role, as it is used to provide scalability.
    • Big data analytics: a threat or an opportunity for Knowledge Management?

      Self, Richard; Crane, Lesley; University of Derby (Springer Verlag, 2014-08-26)
      Big Data Analytics is a rapidly developing field which already shows early promising successes. There are considerable synergies between this and Knowledge Management: both have the goal of improving decision-making, fostering innovation, fuelling competitive edge and economic success through the acquisition and application of knowledge. Both operate in a world of increasing deluges of information, with no end in sight. Big Data Analytics can be seen as a threat to the practice of knowledge management: it could relegate the latter to the mists of organizational history in the rush to adopt the latest techniques and technologies. Alternatively, it can be approached as an opportunity for knowledge management in that it wrestles with many of the same issues and dilemmas as knowledge management. The key, it is argued, lies in the application of the latter’s more social and discursive construction of knowledge, a growing trend in knowledge management. This conceptual paper explores the synergies, opportunities and contingencies available to both fields. It identifies challenges and opportunities for future research into the application of Big Data to Knowledge Management.
    • Big Data applications: Making them deliver value

      Self, Richard; University of Derby (31 Media Ltd., 2016-11)
      As a reminder, Software Testing is about both verifying that software meets the specification and also validating that the software system meets the business requirements (PMBOK Guide 4th Ed.). Most of the activity of software testing teams attempts to verify that the code meets the specification. A small amount of validation occurs during User Acceptance Testing, at which point it is normal to discover many issues where the system does not do what the user needs or wants. It is only too clear that current approaches to software testing do not, so far, guarantee successful systems development and implementation.
    • Big Data applications: Making them deliver value

      Self, Richard; University of Derby (2016-09-28)
      This session will use a governance framework consisting of the 12 Vs of Big Data to pose important questions that should be considered in order to ensure successful implementation of value creating applications, especially those involving Big Data. It will cover relevant aspects of the design, development, testing and implementation cycle. Testing software has often been used as the gatekeeper for inadequacies in the analysis, design and coding processes and has been used to engineer quality back into a product. This is extremely expensive and time consuming. There are more effective ways of developing software. The current climate of apparently perpetual regular beta releases of apps can lead to significant reputational risks for businesses, hence the importance of considering the overall cycle as part of the software testing perspective in order to ensure business success.
    • Big earth data: a comprehensive analysis of visualization analytics issues

      Merritt, Patrick; Bi, Haixia; Davis, Bradley; Windmill, Christopher; Xue, Yong; University of Derby (Taylor and Francis, 2019-02-26)
      Big Earth Data analysis is a complex task requiring the integration of many skills and technologies. This paper provides a comprehensive review of the technology and terminology within the Big Earth Data problem space and presents examples of state-of-the-art projects in each major branch of Big Earth Data research. Current issues within Big Earth Data research are highlighted and potential future solutions identified.
    • Big IoT data mining for real-time energy disaggregation in buildings

      Mocanu, Decebal Constantin; Mocanu, Elena; Nguyen, Phuong H.; Gibescu, Madeleine; Liotta, Antonio (IEEE, 2016)
    • Big-Data analytics and cloud computing: Theory, algorithms and applications

      Hill, Richard; Trovati, Marcello; Liu, Lu; Anjum, Ashiq; Zhu, Shao Ying; University of Derby (Springer, 2015)
      This book reviews the theoretical concepts, leading-edge techniques and practical tools involved in the latest multi-disciplinary approaches addressing the challenges of big data. Illuminating perspectives from both academia and industry are presented by an international selection of experts in big data science. Topics and features: describes the innovative advances in theoretical aspects of big data, predictive analytics and cloud-based architectures; examines the applications and implementations that utilize big data in cloud architectures; surveys the state of the art in architectural approaches to the provision of cloud-based big data analytics functions; identifies potential research directions and technologies to facilitate the realization of emerging business models through big data approaches; provides relevant theoretical frameworks, empirical research findings, and numerous case studies; discusses real-world applications of algorithms and techniques to address the challenges of big datasets.
    • Bio-inspired evolutionary dynamics on complex networks under uncertain cross-inhibitory signals

      Stella, Leonardo; Bauso, Dario; University of Sheffield (Elsevier BV, 2018-11-22)
      Given a large population of agents, each agent has three possible choices: option 1, option 2, or no option. The two options are equally favorable, and the population has to reach consensus on one of the two options quickly and in a distributed way. The more popular an option is, the more likely it is to be chosen by uncommitted agents. Agents committed to one option can be attracted by those committed to the other option through a cross-inhibitory signal. This model originates in the context of honeybee swarms, and we generalize it to duopolistic competition and opinion dynamics. The contributions of this work include (i) the formulation of a model to explain the behavioral traits of the honeybees in the case where the interactions are modeled through complex networks, (ii) the study of the individual and collective behavior that leads to deadlock or consensus depending on a threshold for the cross-inhibitory parameter, (iii) the analysis of the impact of the connectivity on consensus, and (iv) the study of absolute stability for the collective system under time-varying and uncertain cross-inhibitory parameter.
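      The deadlock-versus-consensus threshold behaviour described in contribution (ii) can be illustrated numerically. The sketch below uses the standard mean-field cross-inhibition equations from the honeybee decision-making literature, not code from the paper itself, and all parameter values (gamma, rho, alpha, sigma) are assumed purely for illustration:

```python
def simulate(gamma=0.1, rho=0.6, alpha=0.05, sigma=0.8,
             x1=0.01, x2=0.0, dt=0.01, steps=50000):
    """Euler-integrate the mean-field cross-inhibition dynamics.

    x1, x2: fractions committed to options 1 and 2; u = 1 - x1 - x2 is
    the uncommitted fraction.  gamma: spontaneous commitment rate,
    rho: recruitment rate, alpha: abandonment rate, sigma: strength of
    the cross-inhibitory signal.  All values are illustrative.
    """
    for _ in range(steps):
        u = 1.0 - x1 - x2
        dx1 = gamma * u + rho * x1 * u - alpha * x1 - sigma * x2 * x1
        dx2 = gamma * u + rho * x2 * u - alpha * x2 - sigma * x1 * x2
        x1 += dt * dx1
        x2 += dt * dx2
    return x1, x2

# Strong cross-inhibition: a 1% initial bias is amplified into consensus.
x1, x2 = simulate(sigma=0.8)
# No cross-inhibition: both options keep comparable support (deadlock).
y1, y2 = simulate(sigma=0.0)
```

      With a strong cross-inhibitory signal, the small initial bias towards option 1 grows into near-unanimous commitment; with the signal switched off, the two options settle at almost equal support, matching the threshold behaviour the abstract describes.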
    • Bio-inspired evolutionary game dynamics in symmetric and asymmetric models

      Stella, Leonardo; Bauso, Dario; University of Sheffield (Institute of Electrical and Electronics Engineers (IEEE), 2018-05-18)
      A large population of players has to reach consensus in a distributed way between two options. The two options can be equally favorable or one option can have a higher intrinsic value (asymmetric parameters). In both cases, uncommitted players choose one of the two options depending on the popularity of that option, while committed players can be attracted by those committed to the other option via cross-inhibitory signals. We illustrate the model in different application domains including honeybee swarms, duopolistic competition and opinion dynamics. The main contributions of this letter are as follows: 1) we develop an evolutionary game model to explain the behavioral traits of the honeybees where this model originates; 2) we study individuals' and collective behavior including conditions for local asymptotic stability of the equilibria; 3) we study thresholds on the cross-inhibitory signal for the symmetric case and for the corresponding model with heterogeneous connectivity in the case of asymmetric structure with asymmetric parameters; and 4) we study conditions for stability and passivity properties for the collective system under time-varying and uncertain cross-inhibitory parameter in the asymmetric structure and parameters.
    • Blessing of dimensionality at the edge and geometry of few-shot learning

      Tyukin, Ivan Y.; Gorban, Alexander N.; McEwan, Alistair A.; Meshkinfamfard, Sepehr; Tang, Lixin; University of Leicester; Lobachevsky University, Russia; St Petersburg State Electrotechnical University, Russia; University College London; Northeastern University, China; et al. (Elsevier BV, 2021-02-03)
      In this paper we present theory and algorithms enabling classes of Artificial Intelligence (AI) systems to continuously and incrementally improve over time with a priori quantifiable guarantees (more specifically, to remove classification errors). This is distinct from state-of-the-art machine learning, AI, and software approaches. The theory enables building few-shot AI correction algorithms and provides conditions justifying their successful application. Another feature of this approach is that, in the supervised setting, the computational complexity of training is linear in the number of training samples. At the time of classification, the computational complexity is bounded by a few inner product calculations. Moreover, the implementation is shown to be very scalable. This makes it viable for deployment in applications where computational power and memory are limited, such as embedded environments. It also enables fast on-line optimisation using improved training samples. The approach is based on concentration of measure effects and stochastic separation theorems, and is illustrated with an example on the identification of faulty processes in Computer Numerical Control (CNC) milling and with a case study on adaptive removal of false positives in an industrial video surveillance and analytics system.
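      The linear-cost training and few-inner-product inference claims can be made concrete with a toy corrector in the spirit of stochastic separation: in high dimension, a handful of error samples can typically be split from the bulk of the data by a single linear functional. This is a sketch under assumed synthetic data, not the paper's algorithm; all names and values below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy, assumed data: feature vectors a deployed classifier handles
# correctly, plus a handful of inputs it is known to get wrong
# (shifted here so that a separating direction exists).
d = 200
bulk = rng.normal(size=(5000, d))        # correctly handled inputs
errors = rng.normal(size=(5, d)) + 1.0   # few-shot error samples

# "Training": two means and one normalisation -- cost linear in the
# number of samples.
w = errors.mean(axis=0) - bulk.mean(axis=0)
w = w / np.linalg.norm(w)

# Threshold halfway between the projected bulk maximum and the
# projected error minimum.
theta = 0.5 * ((bulk @ w).max() + (errors @ w).min())

def needs_correction(x):
    """Inference: a single inner product routes inputs to the corrector."""
    return float(x @ w) > theta
```

      At deployment, inputs flagged by `needs_correction` would be diverted to a corrected label instead of the original classifier's output; the entire corrector is one vector and one scalar, which is what makes the approach attractive on constrained edge hardware.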
    • Blind image watermark detection algorithm based on discrete shearlet transform using statistical decision theory

      Ahmaderaghi, Baharak; Kurugollu, Fatih; Rincon, Jesus Martinez Del; Bouridane, Ahmed; Queen's University, Belfast (IEEE, 2018-01-15)
      Blind watermarking targets the challenging recovery of the watermark when the host is not available during the detection stage. This paper proposes the Discrete Shearlet Transform (DST) as a new embedding domain for blind image watermarking. Our novel DST blind watermark detection system uses a non-additive scheme based on statistical decision theory. It first computes the Probability Density Function (PDF) of the DST coefficients, modelled as a Laplacian distribution. The resulting likelihood ratio is compared with a decision threshold calculated using the Neyman-Pearson criterion to minimise missed detections subject to a fixed false alarm probability. Our method is evaluated in terms of imperceptibility, robustness and payload against different attacks (Gaussian noise, blurring, cropping, compression and rotation) using 30 standard grayscale images covering different characteristics (smooth, more complex with many edges, and highly detailed textured regions). The proposed method shows greater windowing flexibility, with greater sensitivity to directional and anisotropic features, when compared against the Discrete Wavelet and Contourlet transforms.
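      The detection principle (Laplacian-modelled coefficients, likelihood-ratio test, threshold fixed by a target false-alarm rate under the Neyman-Pearson criterion) can be sketched generically. Note this sketch uses simple additive spread-spectrum embedding on synthetic Laplacian coefficients, whereas the paper's actual DST scheme is non-additive; all parameter values are assumed:

```python
import numpy as np

rng = np.random.default_rng(1)

n, b, strength, pfa = 4096, 1.0, 0.3, 0.01   # coeffs, Laplacian scale, gain, target P(false alarm)
wm = rng.choice([-1.0, 1.0], size=n)          # binary watermark sequence

def llr(coeffs):
    """Log-likelihood ratio for 'watermark present' under a Laplacian
    coefficient model f(y) = exp(-|y|/b) / (2b)."""
    return float(np.sum(np.abs(coeffs) - np.abs(coeffs - strength * wm)) / b)

# Neyman-Pearson threshold: the (1 - pfa) quantile of the LLR over
# unwatermarked hosts, so that P(false alarm) is held at pfa while
# missed detections are minimised.
null_scores = [llr(rng.laplace(scale=b, size=n)) for _ in range(500)]
tau = float(np.quantile(null_scores, 1.0 - pfa))

host = rng.laplace(scale=b, size=n)
marked = host + strength * wm   # detector declares "present" if llr(marked) > tau
```

      The same mechanics carry over to the non-additive DST case: only the form of the likelihood ratio changes, while the threshold is still set from the false-alarm constraint.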
    • Blockchain standards for compliance and trust

      Anjum, Ashiq; Sporny, Manu; Sill, Alan; University of Derby; Digital Bazaar; Texas Tech University (IEEE, 2017-10-12)
      Blockchain methods are emerging as practical tools for validation, record-keeping, and access control in addition to their early applications in cryptocurrency. This column explores the options for use of blockchains to enhance security, trust, and compliance in a variety of industry settings and explores the current state of blockchain standards.
    • Blockchain-Based Distributed Marketplace.

      Kabi, Oliver R.; Franqueira, Virginia N. L.; University of Derby (Springer Nature, 2019-01-03)
      Developments in Blockchain technology have enabled the creation of smart contracts; i.e., self-executing code that is stored and executed on the Blockchain. This has led to the creation of distributed, decentralised applications, along with frameworks for developing and deploying them easily. This paper describes a proof-of-concept system that implements a distributed online marketplace using the Ethereum framework, where buyers and sellers can engage in e-commerce transactions without the need for a large central entity coordinating the process. The performance of the system was measured in terms of cost of use through the concept of ‘gas usage’. It was determined that such costs are significantly less than those of Amazon and eBay for high volume users. The findings generally support the ability to use Ethereum to create a distributed on-chain market; however, there are still areas that require further research and development.
    • Botnet detection used fast-flux technique, based on adaptive dynamic evolving spiking neural network algorithm

      Almomani, Ammar; Nawasrah, Ahmad Al; Alauthman, Mohammad; Betar, Mohammed Azmi Al; Meziane, Farid; Al-Balqa Applied University, Irbid, Jordan; Taibah University, Median, Saudia Arabia; Zarqa University, Jordan; University of Derby (Inderscience, 2021-01-28)
      A botnet is a group of machines controlled remotely by a specific attacker, and it represents a threat to web and data security. Fast-flux service networks (FFSNs) have been employed by bot herders to cover malicious botnet activities and to increase the lifetime of malicious servers by rapidly changing the IP addresses associated with a domain name. In the present research, we propose a new system, the fast flux botnet catcher system (FFBCS), which can detect FF-domains in an online mode using an adaptive dynamic evolving spiking neural network algorithm. Compared with two other related approaches, the proposed system shows a high level of detection accuracy, low false positive and false negative rates, and high performance. The algorithm's proposed adaptation increased the detection accuracy, which reached approximately 98.76%.
    • A boundary class for the k-path partition problem.

      Korpelainen, Nicholas; University of Derby (Elsevier, 2018-06-14)
      We establish the first known boundary class for the k-path partition problem and deduce that for a graph class defined by finitely many minimal forbidden induced subgraphs, the k-path partition problem remains NP-hard unless one of the forbidden induced subgraphs is a subcubic tree (a tree of maximum degree at most 3) with at most one vertex of degree 3.
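      The structural condition in the statement is straightforward to check computationally. A minimal sketch, assuming a tree supplied as an adjacency list (the function and variable names are illustrative):

```python
def is_candidate_tree(adj):
    """True if the (assumed) tree has maximum degree at most 3 and at
    most one vertex of degree 3, i.e. the subcubic trees singled out
    by the hardness result."""
    degrees = [len(nbrs) for nbrs in adj.values()]
    return max(degrees) <= 3 and sum(d == 3 for d in degrees) <= 1

# The star K_{1,3} has exactly one degree-3 vertex and qualifies...
star = {0: [1, 2, 3], 1: [0], 2: [0], 3: [0]}
# ...while a "double star" with two degree-3 vertices does not.
double_star = {0: [1, 2, 3], 1: [0], 2: [0], 3: [0, 4, 5], 4: [3], 5: [3]}
```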
    • Bringing the Blessing of Dimensionality to the Edge

      Tyukin, Ivan Y.; Gorban, Alexander N; McEwan, Alistair; Meshkinfamfard, Sepehr; University of Leicester; Lobachevsky University, Russia (IEEE, 2019-09-30)
      In this work we present a novel approach and algorithms for equipping Artificial Intelligence systems with capabilities to become better over time. A distinctive feature of the approach is that, in the supervised setting, its computational complexity is sub-linear in the number of training samples. This makes it particularly attractive in applications in which computational power and memory are limited. The approach is based on the concentration of measure effects and stochastic separation theorems. The algorithms are illustrated with examples.
    • Calibration approaches for higher order ambisonic microphones

      Middlicott, Charlie; Wiggins, Bruce; University of Derby; Sky Labs (Audio Engineering Society, 2019-10-08)
      Recent years have seen an increase in the capture and production of ambisonic material due to companies such as YouTube and Facebook utilizing ambisonics for spatial audio playback. Consequently, there is now a greater need for affordable high order microphone arrays due to this uptake in technology. This work details the development of a five-channel circular horizontal ambisonic microphone intended as a tool to explore various optimization techniques, focusing on capsule calibration and pre-processing approaches for unmatched capsules.