• #### Analytical tools for blockchain: review, taxonomy and open challenges.

Bitcoin has introduced a new concept that could feasibly revolutionise the entire Internet as it exists and positively impact many types of industries including, but not limited to, banking, the public sector and supply chains. This innovation is grounded in pseudo-anonymity and thrives on its innovative decentralised architecture based on blockchain technology. Blockchain is pushing forward a race of transaction-based applications that establish trust without the need for a centralised authority, promoting accountability and transparency within the business process. However, a blockchain ledger (e.g., Bitcoin) tends to become very complex, and specialised tools, collectively called “Blockchain Analytics”, are required to allow individuals, law enforcement agencies and service providers to search, explore and visualise it. In recent years, several analytical tools have been developed with capabilities that allow, for example, mapping relationships, examining flows of transactions and filtering crime instances as a way to enhance forensic investigations. This paper discusses the current state of blockchain analytical tools and presents a thematic taxonomy model based on their applications. It also examines open challenges for future development and research.

• #### Application of Big Data for national security: A practitioner's guide to emerging technologies

Application of Big Data for National Security provides users with state-of-the-art concepts, methods, and technologies for Big Data analytics in the fight against terrorism and crime, including a wide range of case studies and application scenarios. This book combines expertise from an international team of experts in law enforcement, national security, and law, as well as computer science, criminology, linguistics, and psychology, creating a unique cross-disciplinary collection of knowledge and insights into this increasingly global issue. The strategic frameworks and critical factors presented in Application of Big Data for National Security consider technical, legal, ethical, and societal impacts, but also practical considerations of Big Data system design and deployment, illustrating how data and security concerns intersect. In identifying current and future technical and operational challenges, it supports law enforcement and government agencies in their operational, tactical and strategic decisions when employing Big Data for national security. The book:

• Contextualizes the Big Data concept and how it relates to national security and crime detection and prevention
• Presents strategic approaches for the design, adoption, and deployment of Big Data technologies in preventing terrorism and reducing crime
• Includes a series of case studies and scenarios to demonstrate the application of Big Data in a national security context
• Indicates future directions for Big Data as an enabler of advanced crime prevention and detection
• #### Application of Caputo–Fabrizio operator to suppress the Aedes aegypti mosquitoes via Wolbachia: an LMI approach

The aim of this paper is to establish stability results based on the Linear Matrix Inequality (LMI) approach for the addressed mathematical model using the Caputo–Fabrizio operator (CF operator). Firstly, we extend some existing results on the Caputo fractional derivative in the literature to the new fractional-order operator without a singular kernel introduced by Caputo and Fabrizio. Secondly, we have created a mathematical model to increase Cytoplasmic Incompatibility (CI) in Aedes aegypti mosquitoes by releasing Wolbachia-infected mosquitoes. In this way, we can suppress the population density of A. aegypti mosquitoes and control the most common mosquito-borne diseases such as Dengue, Zika fever, Chikungunya, Yellow fever and so on. Our main aim in this paper is to examine the behaviour of the Caputo–Fabrizio operator on the logistic growth equation of a population system and then to prove the existence and uniqueness of the solution of the considered mathematical model using the CF operator. We also check the alpha-exponential stability of the system via the linear matrix inequality technique. Finally, a numerical example is provided to check the behaviour of the CF operator on the population system, incorporating real-world data available in the known literature.
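For reference, the Caputo–Fabrizio operator mentioned in the abstract replaces the singular kernel of the Caputo derivative with an exponential one. The abstract does not state the paper's exact model equations; the following gives the standard CF definition from the literature, with $M(\alpha)$ a normalisation function satisfying $M(0)=M(1)=1$, applied to a generic logistic growth law in which $r$ and $K$ denote the usual growth rate and carrying capacity (assumed notation):

```latex
{}^{CF}\!D^{\alpha} f(t)
  = \frac{M(\alpha)}{1-\alpha}
    \int_{0}^{t} f'(s)\,
    \exp\!\left(-\frac{\alpha (t-s)}{1-\alpha}\right) \mathrm{d}s,
  \qquad 0 < \alpha < 1,

\qquad\text{so that a CF logistic model reads}\qquad

{}^{CF}\!D^{\alpha} N(t) = r\,N(t)\left(1 - \frac{N(t)}{K}\right).
```

The kernel $\exp(-\alpha(t-s)/(1-\alpha))$ is non-singular at $t=s$, which is the property the abstract highlights relative to the classical Caputo derivative.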
• #### Application of the Lomb-Scargle Periodogram to Investigate Heart Rate Variability during Haemodialysis

Short-term cardiovascular compensatory responses to perturbations in the circulatory system caused by haemodialysis can be investigated by the spectral analysis of heart rate variability, thus providing an important variable for categorising individual patients' responses and leading to more personalised treatment. This is typically accomplished by resampling the irregular heart rate to generate an equidistant time series prior to spectral analysis, but resampling can further distort a data series whose interpretation may already be compromised by the presence of artefacts. The Lomb-Scargle periodogram provides a more direct method of spectral analysis, as it is specifically designed for large, irregularly sampled, and noisy datasets such as those obtained in clinical settings. However, guidelines for preprocessing patient data have been established in combination with equidistant time-series methods, and their validity when used with the Lomb-Scargle approach is missing from the literature. This paper examines the effect of common preprocessing methods on the Lomb-Scargle power spectral density estimate using both real and synthetic heart rate data, and shows that many common techniques for identifying and editing suspect data points, particularly interpolation and replacement, distort the resulting power spectrum, potentially misleading clinical interpretation of the results. Other methods are proposed and evaluated for use with the Lomb-Scargle approach, leading to the main finding that suspicious data points should be excluded rather than edited and that, where required, denoising of the heart rate signal can be reliably accomplished by empirical mode decomposition. Some additional methods were found to be particularly helpful in conjunction with the Lomb-Scargle periodogram, such as a false alarm probability metric to establish whether spectral estimates are valid and to help automate the assessment of valid heart rate records, potentially leading to greater use of this powerful technique in a clinical setting.
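The workflow the abstract argues for, excluding suspect beats rather than interpolating them, can be illustrated directly with SciPy's Lomb-Scargle implementation, which accepts irregular sample times as-is. The 0.1 Hz component, artefact rate, and frequency grid below are illustrative assumptions, not values from the paper:

```python
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(0)

# Irregularly sampled "heart rate" series: a 0.1 Hz low-frequency
# oscillation observed at beat times, plus measurement noise.
t = np.sort(rng.uniform(0.0, 300.0, 400))                  # beat times, s
hr = 70 + 3 * np.sin(2 * np.pi * 0.1 * t) + rng.normal(0, 0.5, t.size)

# Simulate flagged suspect points (artefacts) and EXCLUDE them rather
# than interpolating -- exclusion leaves the spectrum undistorted.
suspect = rng.random(t.size) < 0.05
t_clean, hr_clean = t[~suspect], hr[~suspect]

# Lomb-Scargle operates on the irregular samples directly: no resampling.
f_hz = np.linspace(0.01, 0.5, 500)                         # HRV band
pgram = lombscargle(t_clean, hr_clean - hr_clean.mean(),
                    2 * np.pi * f_hz, normalize=True)

peak = f_hz[np.argmax(pgram)]
print(f"spectral peak at {peak:.3f} Hz")   # recovers the 0.1 Hz component
```

Note that `lombscargle` expects angular frequencies, hence the `2 * np.pi * f_hz` conversion; the mean is subtracted because the periodogram assumes a zero-mean signal.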

• #### Applications of dynamic diffuse signal processing in sound reinforcement and reproduction.

Electroacoustic systems are subject to position-dependent frequency responses due to coherent interference between multiple sources and/or early reflections. Diffuse signal processing (DiSP) provides a mechanism for signal decorrelation to potentially alleviate this well-known issue in sound reinforcement and reproduction applications. Previous testing has indicated that DiSP reduces low-frequency spatial variance across wide audience areas but is less effective in closed acoustic spaces due to coherent early reflections. In this paper, dynamic implementation of DiSP is examined, whereby the decorrelation algorithm varies over time, thus allowing for decorrelation between surface reflections and direct sounds. Potential applications of dynamic DiSP are explored in the context of sound reinforcement (subwoofers, stage monitoring) and sound reproduction (small-room low-frequency control, loudspeaker crossovers), with preliminary experimental results presented.
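The underlying principle, decorrelating a signal without changing its magnitude spectrum, can be sketched with an allpass (unit-magnitude, random-phase) filter. This is a static illustration of the idea only, not the dynamic DiSP algorithm itself, and all lengths and seeds are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(1)

N = 1024                                   # filter length (arbitrary)
# Unit-magnitude spectrum with random phase -> allpass decorrelation filter.
phase = rng.uniform(-np.pi, np.pi, N // 2 + 1)
phase[0] = phase[-1] = 0.0                 # DC and Nyquist bins must be real
H = np.exp(1j * phase)
h = np.fft.irfft(H, n=N)                   # real impulse response

# The filter passes every frequency at equal magnitude (allpass) ...
assert np.allclose(np.abs(np.fft.rfft(h, n=N)), 1.0)

# ... so filtering alters phase only: the filtered copy keeps the source's
# magnitude spectrum while becoming largely decorrelated from it.
x = rng.standard_normal(4096)              # stand-in source signal
y = np.convolve(x, h)[:x.size]             # decorrelated copy
```

A dynamic implementation, as described above, would vary the phase response over time rather than fixing it once.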

• #### Arabic machine translation: A survey of the latest trends and challenges

Given that Arabic is one of the most widely used languages in the world, the task of Arabic Machine Translation (MT) has recently received a great deal of attention from the research community. Indeed, the amount of research that has been devoted to this task has led to some important achievements and improvements. However, the current state of Arabic MT systems has not reached the quality achieved for some other languages. Thus, much research work is still needed to improve it. This survey paper introduces the Arabic language, its characteristics, and the challenges involved in its translation. It provides the reader with a full summary of the important research studies that have been accomplished with regard to Arabic MT along with the most important tools and resources that are available for building and testing new Arabic MT systems. Furthermore, the survey paper discusses the current state of Arabic MT and provides some insights into possible future research directions.
• #### Artificial neural networks training acceleration through network science strategies

The development of deep learning has led to a dramatic increase in the number of applications of artificial intelligence. However, training deeper neural networks for stable and accurate models yields artificial neural networks (ANNs) that become unmanageable as the number of features increases. This work extends our earlier study, in which we explored the acceleration effects obtained by enforcing, in turn, scale-freeness, small-worldness, and sparsity during the ANN training process. The efficiency of that approach was confirmed by recent, independently conducted studies in which a million-node ANN was trained on non-specialized laptops. Encouraged by those results, our study now focuses on some tunable parameters in pursuit of a further acceleration effect. We show that, although optimal parameter tuning is unfeasible due to the high non-linearity of ANN problems, we can come up with a set of useful guidelines that lead to speed-ups in practical cases. We find that significant reductions in execution time can generally be achieved by setting the revised fraction parameter (ζ) to relatively low values.
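In sparse-training schemes of this kind, the revised fraction ζ controls how many connections are rewired per epoch. The following is a schematic sketch of one prune-and-regrow step on a single weight matrix; the actual training loop, layer shapes, and re-initialisation scheme in the paper may differ:

```python
import numpy as np

rng = np.random.default_rng(42)

def rewire(W, mask, zeta):
    """One prune-and-regrow step: drop the zeta fraction of active
    connections with the smallest magnitude, then regrow the same
    number at randomly chosen inactive positions, so the overall
    sparsity level stays constant."""
    active = np.flatnonzero(mask)
    k = int(zeta * active.size)
    if k == 0:
        return W, mask
    # Prune the k weakest active connections.
    weakest = active[np.argsort(np.abs(W.flat[active]))[:k]]
    mask.flat[weakest] = False
    W.flat[weakest] = 0.0
    # Regrow k connections at random inactive positions.
    inactive = np.flatnonzero(~mask)
    grown = rng.choice(inactive, size=k, replace=False)
    mask.flat[grown] = True
    W.flat[grown] = rng.normal(0.0, 0.01, size=k)   # small re-init
    return W, mask

# Sparse 100x100 layer at ~10% density, rewired with zeta = 0.3.
mask = rng.random((100, 100)) < 0.10
W = np.where(mask, rng.standard_normal((100, 100)), 0.0)
n_active = mask.sum()
W, mask = rewire(W, mask, zeta=0.3)
assert mask.sum() == n_active          # sparsity is preserved
```

A low ζ, as the abstract recommends, rewires few connections per epoch and so perturbs the partially trained network less between updates.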
• #### Assessing Domain Specificity in the Measurement of Mathematics Calculation Anxiety

An online, cross-sectional approach was taken, including an opportunity sample of 160 undergraduate students from a university in the Midlands, UK. Exploratory factor analysis indicated a parsimonious, four-factor solution: abstract maths anxiety, statistics probability anxiety, statistics calculation anxiety, and numerical calculation anxiety. The results support previous evidence for the existence of a separate “numerical anxiety” or “arithmetic computation” anxiety component of maths anxiety and also support the existence of anxiety that is specific to more abstract maths. This is the first study to consider the multidimensionality of maths anxiety at the level of the calculation type. The 26-item Maths Calculation Anxiety Scale appears to be a useful measurement tool in the context of maths calculation specifically.
• #### Assessing undergraduate and postgraduate hard and soft skills in analytics and data science courses

Traditional approaches to assessing undergraduate assignments in software-related courses, including Analytics and Data Science courses, involve the course tutors reading the students’ code and having the students physically demonstrate their artefacts. However, this approach tends only to assess the technical skills of solving the set task. It generally fails to assess the many soft skills that industry is looking for, as identified in the e-skills UK (Tech Partnership) / SAS® report from Nov 2014 and the associated infographic poster. This paper describes and evaluates the effectiveness of a different approach to defining the assessment task and to formatively and summatively assessing the students’ work, in order to effectively evaluate and mark both the soft skills, including creativity, curiosity, storytelling, problem solving and communication, and the technical skills. This approach works effectively at all levels of undergraduate and masters courses.

• #### Authentic-caller: Self-enforcing authentication in a next generation network

The Internet of Things (IoT) or Cyber-Physical System (CPS) is the network of connected devices, things and people that collect and exchange information using emerging telecommunication networks (4G, 5G, IP-based LTE). These networks can also be used to transfer critical information between source and destination, for example informing the control system about an outage in the electrical grid or providing information about an emergency on the national express highway. Such sensitive information requires authorization and authentication of the source and destination involved in the communication. To protect the network from unauthorized access and to provide authentication, telecommunication operators have to adopt mechanisms for seamless verification and authorization of the parties involved in the communication. Currently, next-generation telecommunication networks use a digest-based authentication mechanism, in which the call-processing engine of the telecommunication operator issues a challenge to the request-initiating client or caller, which the client solves to prove its credentials. However, digest-based authentication mechanisms are vulnerable to many known attacks, e.g., the Man-In-The-Middle (MITM) attack and the password-guessing attack, and they incur extensive processing overhead. Several Public-Key Infrastructure (PKI)-based and identity-based schemes have been proposed for authentication and key agreement, but these schemes generally require a smart card to hold long-term private keys and authentication credentials. In this paper, we propose a novel self-enforcing authentication protocol for the SIP-based next-generation network based on a low-entropy shared password, without relying on any PKI or trusted third party. The proposed system shows effective resistance against various attacks, e.g., MITM, replay and password-guessing attacks. We analyze the security properties of the proposed scheme in comparison to the state of the art.
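For context, the digest scheme used by SIP follows HTTP digest authentication (RFC 2617): the client returns a hash over the shared credentials and the server's nonce, so the password never crosses the wire in the clear, but anyone who observes a challenge/response pair can test password guesses offline, which is one of the weaknesses the abstract points to. Below is a minimal sketch of the classic MD5 digest computation (without the optional `qop` extension), not the authors' proposed protocol; all names and values are made up:

```python
import hashlib

def md5_hex(s: str) -> str:
    return hashlib.md5(s.encode()).hexdigest()

def digest_response(user, realm, password, method, uri, nonce):
    """RFC 2617 digest without qop: MD5(HA1 ':' nonce ':' HA2)."""
    ha1 = md5_hex(f"{user}:{realm}:{password}")     # long-term secret
    ha2 = md5_hex(f"{method}:{uri}")                # request fingerprint
    return md5_hex(f"{ha1}:{nonce}:{ha2}")

# The caller computes the response to the server's challenge ...
resp = digest_response("alice", "example.com", "s3cret",
                       "REGISTER", "sip:example.com", "abc123")

# ... and the server, holding the same credentials, verifies it.
expected = digest_response("alice", "example.com", "s3cret",
                           "REGISTER", "sip:example.com", "abc123")
assert resp == expected

# An eavesdropper who captured the nonce can test guesses offline:
guess = digest_response("alice", "example.com", "password1",
                        "REGISTER", "sip:example.com", "abc123")
assert guess != resp     # wrong guess, but cheaply testable offline
```

A password-authenticated key exchange of the kind the paper proposes avoids this by never exposing a value that can be checked against a password guess without interacting with the server.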
• #### Automated analysis of security requirements through risk-based argumentation

Computer-based systems are increasingly being exposed to evolving security threats, which often reveal new vulnerabilities. A formal analysis of the evolving threats is difficult due to a number of practical considerations such as incomplete knowledge about the design, limited information about attacks, and constraints on organisational resources. In our earlier work on RISA (RIsk assessment in Security Argumentation), we showed that informal risk assessment can complement the formal analysis of security requirements. In this paper, we integrate the formal and informal assessment of security by proposing a unified meta-model and an automated tool for supporting security argumentation called OpenRISA. Using a uniform representation of risks and arguments, our automated checking of formal arguments can identify relevant risks as rebuttals to those arguments, and identify mitigations from publicly available security catalogues when possible. As a result, security engineers are able to make informed and traceable decisions about the security of their computer-based systems. The application of OpenRISA is illustrated with examples from a PIN Entry Device case study.
• #### Automatic emotion perception using eye movement information for E-Healthcare systems.

Detecting the emotional state of adolescents is vital for promoting rehabilitation therapy within an E-Healthcare system. Focusing on a novel approach for a sensor-based E-Healthcare system, we propose an eye movement information-based emotion perception algorithm that collects and analyzes electrooculography (EOG) signals and eye movement video synchronously. Specifically, we extract time-frequency eye movement features by first applying the short-time Fourier transform (STFT) to the raw multi-channel EOG signals. Subsequently, in order to integrate time-domain eye movement features (i.e., saccade duration, fixation duration, and pupil diameter), we investigate two feature fusion strategies: feature-level fusion (FLF) and decision-level fusion (DLF). Recognition experiments have also been performed for three emotional states: positive, neutral, and negative. The average accuracies are 88.64% (the FLF method) and 88.35% (the DLF with maximal rule method), respectively. The experimental results reveal that eye movement information can effectively reflect the emotional state of adolescents, providing a promising tool to improve the performance of E-Healthcare systems.
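The first step described above, extracting time-frequency features from multi-channel EOG signals via the STFT, can be sketched with SciPy. The channel count, sampling rate, window length, and the log-power feature below are illustrative assumptions, not the paper's actual configuration:

```python
import numpy as np
from scipy.signal import stft

rng = np.random.default_rng(7)

fs = 256                                   # sampling rate, Hz (assumed)
n_channels, n_samples = 4, fs * 10         # 4-channel, 10 s EOG segment
eog = rng.standard_normal((n_channels, n_samples))   # stand-in EOG data

features = []
for ch in eog:
    # STFT of one channel -> time-frequency map Zxx of shape (freqs, frames).
    f, t, Zxx = stft(ch, fs=fs, nperseg=256)
    # Simple illustrative feature: mean log-power per frequency bin,
    # averaged over the STFT frames of the segment.
    features.append(np.log(np.abs(Zxx) ** 2 + 1e-12).mean(axis=1))

feature_vec = np.concatenate(features)     # one vector per EOG segment
print(feature_vec.shape)
```

Vectors of this kind, concatenated with the time-domain measures the abstract lists (saccade duration, fixation duration, pupil diameter), are what feature-level fusion would feed to a single classifier, whereas decision-level fusion would train a classifier per feature type and combine their outputs.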