• Development of an ambisonic guitar system GASP: Guitars with ambisonic spatial performance

      Werner, Duncan; Wiggins, Bruce; Fitzmaurice, Emma; University of Derby (CRC Press/ Routledge, 2021-01-22)
      Ambisonics, pioneered by Michael Gerzon (1977, 1985), is a kernel-based 3D surround sound system. The encoding (recording or panning) of the audio is separated from the decoding (or rendering) of the audio to speaker feeds or, more recently, to head-tracked headphones (by binaurally decoding the Ambisonic sound field). Audio encoded in this way can be rendered to any number of speakers in almost any position in 3D space, as long as the positions of the speakers are known. Moreover, Ambisonics is a system optimised around a number of psychoacoustic criteria which, when implemented, reduce the variability of the audio no matter what speaker arrangement is used for reproduction. This allows for a ‘mix once’ system where subsequent remixing is not necessary when the material is replayed over different loudspeaker systems or headphones, and allows for full 3D reproduction. Ambisonics is finally gaining some traction due to its use in Virtual Reality audio via the ambiX standard (Nachbar et al. 2011), but few instruments exist that make use of this 3D spatial audio format, although previous studies into some aspects of the relationship between instruments, performance and spatialisation are available; for example, see Pysiewicz and Weinzierl (2017), Graham and Bridges (2017), Bates (2010), Puckette (2007) and Graham (2012). The system combines custom and off-the-shelf hardware/software to create both a live-performance Ambisonic guitar system and a virtual reality (VR) ready, binaural performance instrument. The system comprises two aspects: first, an innovative audio project, fusing the musical with the technical and combining individual string timbralisation with Ambisonic surround sound; and second, an artistic musical project, providing alternative experimental surround-sound production ideas for the guitarist and music producer, with potential applications in the Sound Arts world as well as commercial musical applications.
This paper explores the possibilities of the guitar as a spatial instrument, detailing the technical and artistic processes involved in the production and live performance of the instrument. Key features of the described system include: multichannel hexaphonic guitar pickups, which allow the system to process individual strings independently for both timbre and spatial location; guitar production timbre and effects, achieved with Line 6 Helix commercial sound-processing software for individual string timbralisation; Ambisonic surround-sound performance, where spatial positioning is achieved using our own bespoke WigWare algorithms and can be heard over either a circular (2D) or spherical (3D) loudspeaker array, or alternatively over headphones using a binaural implementation; rhythmic gate-switching of individual strings, such that either simple or complex polyrhythms can be programmed or performed live across individual strings (producing results similar to a keyboard-controlled arpeggiator); and ‘Auditory Scenes’, developed for presenting combinations of individual string timbre, spatial, and rhythmic arpeggiator parameters. The system can be applied to post-production sound manipulation, or as a real-time live Ambisonic performance instrument within a concert environment. These two categories can yield differing production possibilities. We have also identified potential applications for guitar training and education.
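As a rough illustration of the Ambisonic encoding stage described above (not the WigWare implementation itself, whose details are not given here), a mono source such as a single guitar string can be panned into horizontal first-order B-format with three weighting equations. The function name and the Furse-Malham W weighting are assumptions for this sketch; VR pipelines using ambiX apply a different (SN3D) normalisation.

```python
import math

def encode_first_order_2d(sample, azimuth_rad):
    """Encode a mono sample to horizontal first-order B-format (W, X, Y).

    Uses the traditional Furse-Malham weighting (W scaled by 1/sqrt(2));
    the ambiX/SN3D convention used in VR pipelines normalises differently.
    """
    w = sample / math.sqrt(2.0)          # omnidirectional component
    x = sample * math.cos(azimuth_rad)   # front-back figure-of-eight
    y = sample * math.sin(azimuth_rad)   # left-right figure-of-eight
    return w, x, y

# Pan one string hard left (90 degrees anticlockwise from front):
w, x, y = encode_first_order_2d(1.0, math.pi / 2)
```

Because decoding is a separate stage, the same (W, X, Y) signals can later be rendered to any known loudspeaker layout or binaurally, which is the ‘mix once’ property the abstract describes.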
    • On some new arithmetic properties of the generalized Lucas sequences

      Andrica, Dorin; Bagdasar, Ovidiu; Babeş-Bolyai University of Cluj-Napoca, Cluj-Napoca, Romania; University of Derby (Springer Science and Business Media LLC, 2021-01-21)
      Some arithmetic properties of the generalized Lucas sequences are studied, extending a number of recent results obtained for Fibonacci, Lucas, Pell, and Pell–Lucas sequences. These properties are then applied to investigate certain notions of Fibonacci, Lucas, Pell, and Pell–Lucas pseudoprimality, for which we formulate some conjectures.
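The abstract does not reproduce its definitions, but the generalized Lucas sequences it refers to are conventionally the sequences U_n(p, q) with U_0 = 0, U_1 = 1 and U_k = p·U_{k-1} − q·U_{k-2}, which specialise to the Fibonacci and Pell sequences (and, with different initial values, the Lucas and Pell–Lucas sequences). A minimal sketch under that standard convention, illustrating one classical arithmetic property (U_m divides U_{mn}); the function name is invented for the example:

```python
def lucas_U(p, q, n):
    """Generalized Lucas sequence of the first kind:
       U_0 = 0, U_1 = 1, U_k = p*U_{k-1} - q*U_{k-2}.
    Fibonacci numbers are U_n(1, -1); Pell numbers are U_n(2, -1).
    """
    a, b = 0, 1
    for _ in range(n):
        a, b = b, p * b - q * a
    return a

fib = [lucas_U(1, -1, n) for n in range(11)]   # 0, 1, 1, 2, 3, 5, ...
pell = [lucas_U(2, -1, n) for n in range(7)]   # 0, 1, 2, 5, 12, 29, 70

# Classical divisibility property: U_m | U_{mn}, e.g. F_5 = 5 divides F_10 = 55.
assert fib[10] % fib[5] == 0
```

Pseudoprimality tests of the kind studied in the paper check whether a composite n nevertheless satisfies congruences of this type that all primes satisfy.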
    • Diagnostic model for the society safety under COVID-19 pandemic conditions

      Varotsos, Costas A.; Krapivin, Vladimir F.; Xue, Yong; University of Athens, Greece; Kotelnikov’s Institute of Radioengineering and Electronics, Russian Academy of Sciences; University of Mining and Technology, Xuzhou, China; University of Derby (Elsevier BV, 2021-01-11)
      The aim of this paper is to develop an information-modeling method for assessing and predicting the consequences of the COVID-19 pandemic. To this end, a detailed analysis of official statistical information provided by global and national organizations is carried out. The developed method is based on an algorithm for multi-channel big data processing that takes demographic and socio-economic information into account. COVID-19 data are analyzed using an instability indicator and a system of differential equations that describe the dynamics of four groups of people: susceptible, infected, recovered and dead. Indicators of global sustainable development in various sectors are considered to analyze the COVID-19 data. Stochastic processes induced by COVID-19 are assessed with the instability indicator, which shows the level of stability of official data and the reduction of the level of uncertainty. It turns out that the number of deaths rises with the Human Development Index. It is revealed that COVID-19 divides the global population into three groups according to the relationship between Gross Domestic Product and the number of infected people. The prognosis for the number of infected people in December 2020 and January–February 2021 indicates negative developments that will subside only slowly.
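The paper's exact equations are not given in the abstract, but the four-group dynamics it mentions have the shape of a standard SIRD compartment model. A minimal forward-Euler sketch with illustrative (not fitted) parameters:

```python
def simulate_sird(beta=0.3, gamma=0.1, mu=0.01, days=160, dt=1.0,
                  n=1_000_000, i0=100):
    """Forward-Euler integration of a generic SIRD model:
       dS/dt = -beta*S*I/N
       dI/dt =  beta*S*I/N - (gamma + mu)*I
       dR/dt =  gamma*I
       dD/dt =  mu*I
    beta, gamma and mu are illustrative rates, not estimates from the paper.
    """
    s, i, r, d = float(n - i0), float(i0), 0.0, 0.0
    history = [(s, i, r, d)]
    for _ in range(int(days / dt)):
        new_inf = beta * s * i / n * dt
        rec = gamma * i * dt
        dead = mu * i * dt
        s -= new_inf
        i += new_inf - rec - dead
        r += rec
        d += dead
        history.append((s, i, r, d))
    return history

hist = simulate_sird()
```

The total population S + I + R + D is conserved by construction, which is a useful sanity check when fitting such a model to official case data.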
    • Simulation and experimental investigation into a photovoltaic and fuel cell hybrid integration power system for a typical small house application

      Djoudi, H; Benyahia, N; Badji, A; Bousbaine, Amar; Moualek, R; Aissou, S; Benamrouche, N; University of Tizi-Ouzou, Tizi-Ouzou, Algeria; French Naval Academy, Brest, France; Haute Alsace University, Mulhouse, France; et al. (Taylor & Francis, 2021-01-08)
      The paper addresses the simulation of a novel real-time implementation of a photovoltaic (PV) and fuel cell (FC) hybrid integration power system. The hybrid system has the potential to reduce dependency on batteries, leading to reduced cost and an increased life span of the whole system through the use of a Proton Exchange Membrane (PEM) fuel cell. The interface structure of the hybrid system has been explored, incorporating Maximum Power Point Tracking (MPPT) for maximum power extraction. The simulation of the hybrid system, including the fuel cell, photovoltaic (PV) panels and battery, has been carried out using SimPowerSystems. An innovative Real-Time Interface (RTI) approach using the concept of Hardware-In-the-Loop (HIL) has been presented for a fast dynamic response of the closed-loop control of the hybrid system. The hybrid system is validated experimentally, using a real photovoltaic panel connected to a PEM fuel cell emulator and a battery. The PV panels are controlled by the perturb-and-observe maximum power point (MPP) technique, and the PEM fuel cell is controlled through a boost DC-DC converter using current-mode control. The whole system is implemented on the dSPACE 1103 platform for real-time interfacing and control strategies. The overall behavior of the hybrid system has been critically analyzed, and a comparison of the simulated and experimental results is presented.
    • NOTRINO: a NOvel hybrid TRust management scheme for INternet-Of-vehicles

      Ahmad, Farhan; Kurugollu, Fatih; Kerrache, Chaker Abdelaziz; Sezer, Sakir; Liu, Lu; Coventry University; University of Derby; Université Amar Telidji Laghouat, Laghouat, Algeria; Queen's University Belfast; University of Leicester (IEEE, 2021-01-05)
      Internet-of-Vehicles (IoV) is a novel technology for ensuring safe and secure transportation by enabling smart vehicles to communicate and share sensitive information with each other. However, the realization of IoV in real life depends on several factors, including the assurance of security from attackers and the propagation of authentic, accurate and trusted information within the network. Further, the dissemination of compromised information must be detected, and vehicles disseminating such malicious messages must be revoked from the network. To this end, trust can be integrated within the network to detect the trustworthiness of the received information. However, most trust models in the literature rely on evaluating the node or the data at the application layer. In this study, we propose a novel hybrid trust management scheme, namely NOTRINO, which evaluates the trustworthiness of received information in two steps. The first step evaluates trust in the node itself at the transport layer, while the second step computes the trustworthiness of the data at the application layer. This mechanism enables vehicles to efficiently model and evaluate the trustworthiness of received information. The performance and accuracy of NOTRINO are rigorously evaluated under various realistic trust evaluation criteria (including precision, recall, F-measure and trust). Furthermore, the efficiency of NOTRINO is evaluated in the presence of malicious nodes, and its performance is benchmarked against three hybrid trust models. Extensive simulations indicate that NOTRINO achieves over 75% trust level, compared to the benchmarked trust models where the trust level falls below 60%, for a network with 35% malicious nodes. Similarly, 92% precision and 87% recall are achieved simultaneously with NOTRINO for the same network, compared to the benchmark trust models where precision and recall fall below 87% and 85% respectively.
    • Efficient resampling for fraud detection during anonymised credit card transactions with unbalanced datasets

      Mrozek, Petr; Panneerselvam, John; Bagdasar, Ovidiu; University of Derby; University of Leicester (IEEE, 2020-12-30)
      The rapid growth of e-commerce and online shopping has resulted in an unprecedented increase in the amount of money that is annually lost to credit card fraudsters. In an attempt to address credit card fraud, researchers are leveraging various machine learning techniques for efficiently detecting and preventing fraudulent credit card transactions. One of the most common issues in the analytics of credit card transactions is the highly unbalanced nature of the datasets, which is frequently associated with binary classification problems. This paper reviews, analyses and implements a selection of notable machine learning algorithms such as Logistic Regression, Random Forest, K-Nearest Neighbours and Stochastic Gradient Descent, with the motivation of empirically evaluating their efficiency in handling unbalanced datasets whilst detecting fraudulent credit card transactions. A publicly available dataset comprising 284,807 transactions of European cardholders is analysed, and the studied machine learning techniques are trained on it to detect fraudulent transactions. Furthermore, this paper also evaluates the incorporation of two notable resampling methods, namely Random Under-sampling and the Synthetic Minority Oversampling Technique (SMOTE), into the aforementioned algorithms, in order to analyse their efficiency in handling unbalanced datasets. The proposed resampling methods significantly increased the detection ability: the most successful combination, Random Forest with Random Under-sampling, achieved a recall score of 100%, in contrast to a recall score of 77% for the model without a resampling technique. The key contribution of this paper is the identification of efficient machine learning algorithms, together with suitable resampling methods, for credit card fraud detection with unbalanced datasets.
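Of the two resampling methods evaluated, Random Under-sampling is simple enough to sketch directly: discard majority-class (legitimate) transactions until the classes balance, then judge the classifier by recall, the metric quoted above. The toy data and function names below are illustrative, not the paper's pipeline:

```python
import random

def random_undersample(X, y, seed=0):
    """Balance a binary dataset by randomly discarding majority-class rows."""
    rng = random.Random(seed)
    pos = [i for i, label in enumerate(y) if label == 1]
    neg = [i for i, label in enumerate(y) if label == 0]
    majority, minority = (neg, pos) if len(neg) > len(pos) else (pos, neg)
    kept = rng.sample(majority, len(minority)) + minority
    rng.shuffle(kept)
    return [X[i] for i in kept], [y[i] for i in kept]

def recall(y_true, y_pred):
    """Fraction of actual frauds (label 1) that the classifier caught."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    return tp / (tp + fn) if tp + fn else 0.0

# Synthetic 1% fraud rate, mimicking the heavy class imbalance described above:
X = [[float(i)] for i in range(1000)]
y = [1 if i < 10 else 0 for i in range(1000)]
Xb, yb = random_undersample(X, y)   # balanced: 10 fraud + 10 legitimate rows
```

Recall is the natural headline metric here because a missed fraud (false negative) is usually costlier than a flagged legitimate transaction.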
    • Explaining probabilistic Artificial Intelligence (AI) models by discretizing Deep Neural Networks

      Saleem, Rabia; Yuan, Bo; Kurugollu, Fatih; Anjum, Ashiq; University of Derby; University of Leicester (IEEE, 2020-12-30)
      Artificial Intelligence (AI) models can learn from data and make decisions without any human intervention. However, the deployment of such models is challenging and risky because we do not know how the internal decision-making happens in these models. In particular, high-risk decisions such as medical diagnosis or automated navigation demand explainability and verification of the decision-making process in AI algorithms. This paper aims to explain AI models by discretizing the black-box process model of deep neural networks using partial differential equations (PDEs). The PDE-based deterministic models would minimize the time and computational cost of the decision-making process and reduce the chances of uncertainty, making predictions more trustworthy.
    • A survey of interpretability of machine learning in accelerator-based high energy physics

      Turvill, Danielle; Barnby, Lee; Yuan, Bo; Zahir, Ali; University of Derby (IEEE, 2020-12-28)
      Data-intensive studies in the domain of accelerator-based High Energy Physics (HEP) have become increasingly achievable due to the emergence of machine learning alongside high-performance computing and big data technologies. In recent years, the intricate nature of physics tasks and data has prompted the use of more complex learning methods. To accurately identify physics of interest, and to draw conclusions against proposed theories, it is crucial that these machine learning predictions are explainable. It is not enough to accept an answer based on accuracy alone; in the process of physics discovery it is important to understand exactly why an output was generated. That is, completeness of a solution is required. In this paper, we survey the application of machine learning methods to a variety of accelerator-based tasks in a bid to understand what role interpretability plays within this area. The main contribution of this paper is to promote the need for explainable artificial intelligence (XAI) for the future of machine learning in HEP.
    • Large-scale Data Integration Using Graph Probabilistic Dependencies (GPDs)

      Zada, Muhammad Sadiq Hassan; Yuan, Bo; Anjum, Ashiq; Azad, Muhammad Ajmal; Khan, Wajahat Ali; Reiff-Marganiec, Stephan; University of Derby; University of Leicester (IEEE, 2020-12-28)
      The diversity and proliferation of knowledge bases have made data integration one of the key challenges in the data science domain. The imperfect representation of entities, particularly in graphs, adds further challenges to data integration. Graph dependencies (GDs) have been investigated in existing studies for the integration and maintenance of data quality on graphs. However, the majority of graphs contain plenty of duplicates with high diversity, so the existence of dependencies over these graphs becomes highly uncertain. In this paper, we propose graph probabilistic dependencies (GPDs), a novel class of dependencies for graphs, to address the issue of uncertainty over large-scale graphs. GPDs can provide a probabilistic explanation for dealing with uncertainty while discovering dependencies over graphs. Furthermore, a case study is provided to verify the correctness of the data integration process based on GPDs. Preliminary results demonstrate the effectiveness of GPDs in terms of reducing redundancies and inconsistencies over the benchmark datasets.
    • Tweets classification and sentiment analysis for personalized tweets recommendation

      Batool, Rabia; Satti, Fahad Ahmed; Hussain, Jamil; Khan, Wajahat Ali; Khan, Adil Mehmood; Hayat, Bashir; University of Derby (Hindawi, 2020-12-17)
      Mining social network data and developing a user profile from unstructured and informal data is a challenging task. The proposed research builds a user profile from Twitter data, which is later helpful for providing the user with personalized recommendations. Publicly available tweets are fetched and classified, and the sentiments expressed in the tweets are extracted and normalized. This research uses a domain-specific seed list to classify tweets. Semantic and syntactic analysis of the tweets is performed to minimize information loss during classification. After precise classification and sentiment analysis, the system builds an interest-based user profile by analyzing the user's posts on Twitter. The proposed system was tested on a dataset of almost 1 million tweets and was able to classify up to 96% of the tweets accurately.
    • Application of the Lomb-Scargle Periodogram to Investigate Heart Rate Variability during Haemodialysis

      Stewart, Jill; Stewart, Paul; Walker, Tom; Gullapudi, Latha; Eldehni, Mohamed T; Selby, Nicholas M; Taal, Maarten W; University of Derby; University of Nottingham; Royal Derby Hospital (Hindawi, 2020-12-08)
      Short-term cardiovascular compensatory responses to perturbations in the circulatory system caused by haemodialysis can be investigated by the spectral analysis of heart rate variability, providing an important variable for categorising individual patients' responses and leading to more personalised treatment. This is typically accomplished by resampling the irregular heart rate to generate an equidistant time series prior to spectral analysis, but resampling can further distort the data series, whose interpretation may already be compromised by the presence of artefacts. The Lomb-Scargle periodogram provides a more direct method of spectral analysis, as this method is specifically designed for large, irregularly sampled, and noisy datasets such as those obtained in clinical settings. However, guidelines for preprocessing patient data have been established in combination with equidistant time-series methods, and their validity when used in combination with the Lomb-Scargle approach is missing from the literature. This paper examines the effect of common preprocessing methods on the Lomb-Scargle power spectral density estimate using both real and synthetic heart rate data, and shows that many common techniques for identifying and editing suspect data points, particularly interpolation and replacement, distort the resulting power spectrum, potentially misleading clinical interpretations of the results. Other methods are proposed and evaluated for use with the Lomb-Scargle approach, leading to the main finding that suspicious data points should be excluded rather than edited and that, where required, denoising of the heart rate signal can be reliably accomplished by empirical mode decomposition. Some additional methods were found to be particularly helpful in conjunction with the Lomb-Scargle periodogram, such as the use of a false alarm probability metric to establish whether spectral estimates are valid and to help automate the assessment of valid heart rate records, potentially leading to greater use of this powerful technique in a clinical setting.
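For readers unfamiliar with the technique, the classic normalised Lomb-Scargle estimate can be computed directly from irregular samples, with no resampling step. This is a textbook sketch of the periodogram itself, not the authors' preprocessing pipeline; the sample times and frequencies below are invented for illustration:

```python
import math

def lomb_scargle(t, y, freqs):
    """Classic normalised Lomb-Scargle periodogram for irregular samples.

    t, y  : sample times and values (equal length)
    freqs : frequencies (Hz) at which to evaluate the power
    """
    n = len(y)
    mean = sum(y) / n
    var = sum((v - mean) ** 2 for v in y) / (n - 1)
    yc = [v - mean for v in y]
    power = []
    for f in freqs:
        w = 2.0 * math.pi * f
        # The offset tau makes the estimate invariant to time-axis shifts.
        tau = math.atan2(sum(math.sin(2 * w * ti) for ti in t),
                         sum(math.cos(2 * w * ti) for ti in t)) / (2 * w)
        c = [math.cos(w * (ti - tau)) for ti in t]
        s = [math.sin(w * (ti - tau)) for ti in t]
        cterm = sum(v * ci for v, ci in zip(yc, c)) ** 2 / sum(ci * ci for ci in c)
        sterm = sum(v * si for v, si in zip(yc, s)) ** 2 / sum(si * si for si in s)
        power.append((cterm + sterm) / (2 * var))
    return power

# Irregularly sampled 0.1 Hz sinusoid, roughly the LF band of heart rate
# variability; the jitter stands in for beat-to-beat timing irregularity.
times = [1.3 * i + 0.4 * math.sin(7 * i) for i in range(40)]
values = [math.sin(2 * math.pi * 0.1 * ti) for ti in times]
p = lomb_scargle(times, values, [0.05, 0.1, 0.2])
```

Excluding a suspect beat simply removes one (t, y) pair from the sums above, which is why exclusion is so much less disruptive than interpolation for this estimator.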
    • Remote sensing evaluation of total suspended solids dynamic with markov model: a case study of inland reservoir across administrative boundary in south China

      Zhao, Jing; Zhang, Fujie; Chen, Shuisen; Wang, Chongyang; Chen, Jinyue; Zhou, Hui; Xue, Yong; Guangdong Engineering Technology Center for Remote Sensing Big Data Application, Guangzhou Institute of Geography, Guangdong Academy of Sciences, China; University of Science and Technology, Kunming, China; China Agricultural University, Beijing, China; et al. (MDPI AG, 2020-12-03)
      Accurate and quantitative assessment of the impact of natural environmental changes and human activities on total suspended solids (TSS) concentration is an important component of water environment protection. Due to the limits of traditional cross-sectional point monitoring, a novel water quality evaluation method based on the Markov model and remote sensing retrieval is proposed to realize large-scale spatial monitoring across administrative boundaries. Additionally, to explore the spatiotemporal characteristics and driving factors of TSS, a new three-band remote sensing model of TSS was built by regression analysis for an inland reservoir, using synchronous field spectral data, water quality samples and remote sensing data from the trans-provincial Hedi Reservoir in the Guangdong and Guangxi Provinces of South China. The results show that: (1) The three-band model based on the OLI sensor explained about 82% of the TSS concentration variation (R² = 0.81, N = 34, p < 0.01) with an acceptable validation accuracy (RMSE = 6.24 mg/L, MRE = 18.02%, N = 15), and is essentially the first model of its kind available in South China. (2) The TSS concentration is high upstream and low downstream: the average upstream TSS of 31.54 mg/L is about 2.5 times the downstream average (12.55 mg/L). (3) Season and rainfall are important factors affecting TSS in the upstream cross-border area; TSS are higher in the dry season, with an average of 33.66 mg/L, and are negatively correlated with rainfall associated with human activity upstream. Although TSS are generally higher in rainy seasons than in dry seasons, the negative correlation with rainfall found here suggests that human activities have a greater impact on water quality than climate change. (4) The Markov dynamic evaluation results show that the water quality improvement in the upstream Shijiao Town is the most obvious, especially in 2018, when the water quality improved by three levels and TSS were at their lowest. This study provides a technical method for remote sensing dynamic monitoring of water quality in a large reservoir, which is of great significance for remediation of the water environment and the effective evaluation of the river and lake chief system in China.
    • Prescribed $k$-symmetric curvature hypersurfaces in de Sitter space

      Ballesteros-Chávez, Daniel; Klingenberg, Wilhelm; Lambert, Ben; Silesian University of Technology, Kaszubska; University of Durham; University of Derby (Cambridge University Press, 2020-11-26)
      We prove the existence of compact spacelike hypersurfaces with prescribed k-curvature in de Sitter space, where the prescription function depends on both space and the tilt function.
    • The number of partitions of a set and Superelliptic Diophantine equations

      Andrica, Dorin; Bagdasar, Ovidiu; Ţurcaş, George Cătălin; “Babeş-Bolyai” University, Cluj-Napoca, Romania; University of Derby; The Institute of Mathematics of the Romanian Academy “Simion Stoilow” Bucharest, Romania (Springer, 2020-11-22)
      In this chapter we start by presenting some key results concerning the number of ordered k-partitions of multisets with equal sums. For these we give generating functions, recurrences and numerical examples. The coefficients arising from these formulae are then linked to certain elliptic and superelliptic Diophantine equations, which are investigated using some methods from Algebraic Geometry and Number Theory, as well as specialized software tools and algorithms. In this process we are able to solve some recent open problems concerning the number of solutions for certain Diophantine equations and to formulate new conjectures.
    • WHAM - Webcam Head-tracked AMbisonics

      Dring, Mark; Wiggins, Bruce; University of Derby (Institute of Acoustics, 2020-11-19)
      This paper describes the development and implementation of a real-time head-tracked auralisation platform using Higher Order Ambisonics (HOA) decoded binaurally based on open-source and freely available web technologies without the need for specialist head-tracking hardware. An example implementation of this work can be found at https://brucewiggins.co.uk/WHAM/.
    • A repairing missing activities approach with succession relation for event logs

      Liu, Jie; Xu, Jiuyun; Zhang, Ruru; Reiff-Marganiec, Stephan; China University of Petroleum; The China Mobile (Suzhou) Software Technology Company, Suzhou, China; University of Derby (Springer, 2020-11-11)
      In the field of process mining, techniques assume that event logs not only continuously record the occurrence of events but also contain all event data. However, as in IoT systems, data transmission may fail due to weak signals or resource competition, which can leave a company's information system unable to keep a complete event log. Based on an incomplete event log, the process model obtained using existing process mining technologies deviates to some degree from the actual business process. In this paper, we propose a method for repairing missing activities based on the succession relation of activities in event logs. We use an activity relation matrix to represent the event log and cluster it. The number of traces in a cluster is used as the similarity measure between incomplete traces and clustering results. Parallel activities are considered when selecting the pre-occurrence and post-occurrence activities of missing activities from incomplete traces. Experimental results on real-life event logs show that our approach performs better than previous methods in repairing missing activities.
    • Impact of social distancing to mitigate the spread of COVID-19 in a virtual environment

      Marti-Mason, Diego; Kapinaj, Matej; Pinel-Martínez, Alejandro; Stella, Leonardo; University of Derby (The Association for Computing Machinery, 2020-11-01)
      A novel strain of coronavirus has spread in the past months to the point of becoming a pandemic of massive proportions. In order to mitigate the spread of this disease, many different policies have been adopted, from strict national lockdowns to milder government measures: one common aspect is that they mostly rely on keeping distance between individuals. The aim of this work is to provide means of visualizing the impact of social distancing in an immersive environment by making use of virtual reality technology. To this aim, we create a virtual environment that resembles a university setting (based on the University of Derby) and populate it with a number of AI agents. We assume that the minimum social distance is 2 meters. The main contribution of this work is twofold: the multi-disciplinary approach that results from visualizing social distancing in an effort to mitigate the spread of COVID-19, and the digital twin application in which users can navigate the virtual environment whilst receiving visual feedback in the proximity of other agents. We named our application SoDAlVR, which stands for Social Distancing Algorithm in Virtual Reality.
    • A case study on the impact live event sound level regulations have on sound engineering practice

      Hill, Adam J.; Burton, Jon; University of Derby (Institute of Acoustics, 2020-11)
      Sound level management is becoming increasingly common at live events in the UK, Europe and beyond. An inspection of regulations across the globe reveals a lack of standardization in sound level limits and averaging times. This case study is built around a dataset generated on a recent tour by a well-known British musical act. The same sound engineer mixed the band throughout the tour, using sound level monitoring software at every show. As the show's configuration, engineer, musicians and running order were generally consistent from day to day, a direct inspection of the influence of sound level limit and averaging time, as well as venue capacity and type (indoor or outdoor), is possible. The results from this study highlight both good and bad sound management practice, with key stakeholders' experience and hearing safety in mind.
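The averaging times discussed above matter because event sound limits are typically expressed as equivalent continuous levels (e.g. LAeq over 15 minutes), which energy-average the short-term levels rather than taking their arithmetic mean. A minimal sketch of that calculation; the two-value example is illustrative:

```python
import math

def leq(levels_db):
    """Equivalent continuous sound level from short-term dB readings.

    Leq = 10*log10(mean(10^(L/10))). Because the average is taken on an
    energy basis, brief loud passages dominate: a doubling of energy adds
    3 dB, so Leq sits well above the arithmetic mean of the readings.
    """
    energies = [10 ** (l / 10.0) for l in levels_db]
    return 10.0 * math.log10(sum(energies) / len(energies))

# Half the time at 100 dB and half at 94 dB averages near the louder half:
print(round(leq([100.0, 94.0]), 1))  # prints 98.0
```

This asymmetry is why the choice of averaging time changes how much headroom an engineer has for loud sections within a limit.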
    • Smart anomaly detection in sensor systems: A multi-perspective review

      Erhan, L.; Ndubuaku, M.; Di Mauro, M.; Song, W.; Chen, M.; Fortino, G.; Bagdasar, Ovidiu; Liotta, A.; University of Derby; University of Salerno, Italy; et al. (Elsevier, 2020-10-15)
      Anomaly detection is concerned with identifying data patterns that deviate remarkably from the expected behavior. This is an important research problem, due to its broad set of application domains, from data analysis to e-health, cybersecurity, predictive maintenance, fault prevention, and industrial automation. Herein, we review state-of-the-art methods that may be employed to detect anomalies in the specific area of sensor systems, which poses hard challenges in terms of information fusion, data volumes, data speed, and network/energy efficiency, to mention but the most pressing ones. In this context, anomaly detection is a particularly hard problem, given the need to find computing-energy-accuracy trade-offs in a constrained environment. We taxonomize methods ranging from conventional techniques (statistical methods, time-series analysis, signal processing, etc.) to data-driven techniques (supervised learning, reinforcement learning, deep learning, etc.). We also look at the impact that different architectural environments (Cloud, Fog, Edge) can have on the sensors ecosystem. The review points to the most promising intelligent-sensing methods, and pinpoints a set of interesting open issues and challenges.
    • Research and implementation of intelligent decision based on a priori knowledge and DQN algorithms in wargame environment

      Sun, Yuxiang; Yuan, Bo; Zhang, Tao; Tang, Bojian; Zheng, Wanwen; Zhou, Xianzhong; University of Derby; Nanjing University, China (MDPI AG, 2020-10-13)
      The reinforcement learning problem of complex action control in a multi-player wargame has been a hot research topic in recent years. In this paper, a game system based on turn-based confrontation is designed and implemented with state-of-the-art deep reinforcement learning models. Specifically, we first design a Q-learning algorithm to achieve intelligent decision-making, based on the DQN (Deep Q-Network), to model complex game behaviors. Then, an a priori knowledge-based algorithm, PK-DQN (Prior Knowledge-Deep Q-Network), is introduced to improve the DQN algorithm, accelerating its convergence and improving its stability. The experiments validate the PK-DQN algorithm and demonstrate that its performance surpasses that of the conventional DQN algorithm. Furthermore, the PK-DQN algorithm is effective in defeating high-level rule-based opponents, which provides promising results for the exploration of smart chess and intelligent game deduction.
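The abstract does not detail PK-DQN, but the temporal-difference update that DQN approximates with a neural network can be shown in tabular form. A minimal Q-learning sketch on an invented toy chain environment; all parameters and names are illustrative:

```python
import random

def train_chain_qlearning(n_states=5, episodes=300, alpha=0.5, gamma=0.9,
                          epsilon=0.2, seed=1):
    """Tabular Q-learning on a toy chain: action 1 moves right, action 0
    moves left (floored at state 0); reaching the last state yields reward
    1 and ends the episode. DQN replaces this lookup table with a neural
    network but applies the same temporal-difference update shown below.
    """
    rng = random.Random(seed)
    q = [[0.0, 0.0] for _ in range(n_states)]
    for _ in range(episodes):
        s = 0
        while s != n_states - 1:
            # Epsilon-greedy action selection.
            if rng.random() < epsilon:
                a = rng.randrange(2)
            else:
                a = max((0, 1), key=lambda x: q[s][x])
            s2 = max(0, s - 1) if a == 0 else s + 1
            reward = 1.0 if s2 == n_states - 1 else 0.0
            # TD target: r + gamma * max_a' Q(s', a')
            target = reward + gamma * max(q[s2])
            q[s][a] += alpha * (target - q[s][a])
            s = s2
    return q

q = train_chain_qlearning()
```

After training, the greedy policy prefers moving right in every non-terminal state; prior-knowledge schemes in the PK-DQN spirit typically shape this learning, for instance by seeding initial values or biasing exploration, to reach such a policy faster.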