• D-Meson Azimuthal Anisotropy in Midcentral Pb-Pb Collisions at √sNN = 5.02 TeV.

      ALICE Collaboration; Barnby, Lee; STFC Daresbury Laboratory (APS Physics, 2018-03-09)
      The azimuthal anisotropy coefficient v2 of prompt D0, D+, D*+, and Ds+ mesons was measured in midcentral (30%–50% centrality class) Pb-Pb collisions at a center-of-mass energy per nucleon pair √sNN = 5.02 TeV, with the ALICE detector at the LHC. The D mesons were reconstructed via their hadronic decays at midrapidity, |y| < 0.8, in the transverse momentum interval 1 < pT < 24 GeV/c. The measured D-meson v2 is similar in magnitude to that of charged pions. The Ds+ v2, measured for the first time, is found to be compatible with that of nonstrange D mesons. The measurements are compared with theoretical calculations of charm-quark transport in a hydrodynamically expanding medium and have the potential to constrain medium parameters.
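      For orientation, the standard definition of the anisotropy coefficient (general convention, not quoted from this abstract): v2 is the second Fourier coefficient of the particle azimuthal distribution relative to the symmetry plane.

```latex
% Standard Fourier decomposition of the azimuthal distribution;
% \Psi_n is the n-th harmonic symmetry-plane angle.
\frac{\mathrm{d}N}{\mathrm{d}\varphi} \propto
  1 + 2\sum_{n=1}^{\infty} v_n \cos\!\big[n(\varphi - \Psi_n)\big],
\qquad
v_2 = \big\langle \cos\!\big[2(\varphi - \Psi_2)\big] \big\rangle .
```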
    • D-meson production in p-Pb collisions at √sNN = 5.02 TeV and in pp collisions at √s = 7 TeV

      ALICE Collaboration; Barnby, Lee; European Organization for Nuclear Research (CERN); University of Birmingham (American Physical Society, 2016-11-23)
      Background: In the context of the investigation of the quark gluon plasma produced in heavy-ion collisions, hadrons containing heavy (charm or beauty) quarks play a special role for the characterization of the hot and dense medium created in the interaction. The measurement of the production of charm and beauty hadrons in proton–proton collisions, besides providing the necessary reference for the studies in heavy-ion reactions, constitutes an important test of perturbative quantum chromodynamics (pQCD) calculations. Heavy-flavor production in proton–nucleus collisions is sensitive to the various effects related to the presence of nuclei in the colliding system, commonly denoted cold-nuclear-matter effects. Most of these effects are expected to modify open-charm production at low transverse momenta (pT) and, so far, no measurement of D-meson production down to zero transverse momentum was available at mid-rapidity at the energies attained at the CERN Large Hadron Collider (LHC). Purpose: The measurements of the production cross sections of promptly produced charmed mesons in p-Pb collisions at the LHC down to pT = 0, and the comparison to the results from pp interactions, are aimed at assessing cold-nuclear-matter effects on open-charm production, which is crucial for the interpretation of the results from Pb-Pb collisions. Methods: The prompt charmed mesons D0, D+, D∗+, and Ds+ were measured at mid-rapidity in p-Pb collisions at a center-of-mass energy per nucleon pair √sNN = 5.02 TeV with the ALICE detector at the LHC. D mesons were reconstructed from their decays D0 → K−π+, D+ → K−π+π+, D∗+ → D0π+, Ds+ → φπ+ → K−K+π+, and their charge conjugates, using an analysis method based on the selection of decay topologies displaced from the interaction vertex. In addition, the prompt D0 production cross section was measured in pp collisions at √s = 7 TeV and p-Pb collisions at √sNN = 5.02 TeV down to pT = 0 using an analysis technique that is based on the estimation and subtraction of the combinatorial background, without reconstruction of the D0 decay vertex. Results: The production cross section in pp collisions is described within uncertainties by different implementations of pQCD calculations down to pT = 0. This also allowed a determination of the total cc̄ production cross section in pp collisions, which is more precise than previous ALICE measurements because it is not affected by uncertainties owing to the extrapolation to pT = 0. The nuclear modification factor RpPb(pT), defined as the ratio of the pT-differential D-meson cross section in p-Pb collisions to that in pp collisions scaled by the mass number of the Pb nucleus, was calculated for the four D-meson species and found to be compatible with unity within uncertainties. The results are compared to theoretical calculations that include cold-nuclear-matter effects and to transport model calculations incorporating the interactions of charm quarks with an expanding deconfined medium. Conclusions: These measurements add experimental evidence that the modification of the D-meson transverse momentum distributions observed in Pb-Pb collisions with respect to pp interactions is due to strong final-state effects induced by the interactions of the charm quarks with the hot and dense partonic medium created in ultrarelativistic heavy-ion collisions.
The current precision of the measurement does not allow us to draw conclusions on the role of the different cold-nuclear-matter effects or on the possible presence of additional hot-medium effects in p-Pb collisions. However, the analysis technique without decay-vertex reconstruction, applied to future larger data samples, should provide access to the physics-rich range down to pT = 0.
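      For reference, the standard expression for the quantity the abstract defines in words (not reproduced from the paper itself): the nuclear modification factor compares the pT-differential cross section in p-Pb collisions to the binary-scaled pp reference.

```latex
% Nuclear modification factor in p-Pb collisions;
% A = 208 is the mass number of the Pb nucleus.
R_{\mathrm{pPb}}(p_{\mathrm{T}}) =
  \frac{1}{A}\,
  \frac{\mathrm{d}\sigma_{\mathrm{pPb}}/\mathrm{d}p_{\mathrm{T}}}
       {\mathrm{d}\sigma_{\mathrm{pp}}/\mathrm{d}p_{\mathrm{T}}}
```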
    • D.I.Y: Hydrophonics.

      Locke, Caroline; University of Derby; University of Chichester (University of Chichester, 2015)
      D.I.Y Too is a new book about “do it yourself” performance, with contributions from over 30 arts practitioners and collectives. It is a sequel of sorts, or rather a continuation, of a recent text that platformed a growing community of voices in theatre, art, dance and performance making. Its aim is to articulate and contextualise an ethos and practice within contemporary art called "DIY" theatre and performance. This book is a text that provokes, prescribes, instructs, argues, plays, advises, promotes and describes. Its emphasis is on how theatre makers can encourage and evolve performance making by sharing their theories and practices, to help empower more artists to engage with this way of working. Critically (or theoretically), this book addresses a wide range of perspectives on "DIY" theatre and performance and identifies key axioms and dichotomies between ethos and style. Contributors: Accidental Collective; Pippa Bailey; Simon Bowes; Daniel Bye; Karen Christopher; Helen Cole; Dirty Market; Fictional Dogshelf; Emma Frankland and Keir Cooper; Gob Squad; Donald Hutera; Mamoru Iriguchi; Dan Koop; Lila Dance; Caroline Locke; LOW PROFILE; Rachel Mars; Harun Morrison; Hannah Nicklin; Joseph O'Farrell (JOF); Paper Cinema; Patternfight; Plastic Castles; Sh!t Theatre; Sleeping Trees; Sleepwalk Collective; Tassos Stevens; Shamira Turner, Little Bulb; Uninvited Guests; Hannah Jane Walker; Melanie Wilson; Greg Wohead; Caroline Wright and Helen Paris.
    • The D2N2 employability framework: Employers and schools supporting young people's routes to work

      Hutchinson, Jo; Dickinson, Berni; Vickers, Rob; Hooley, Tristram; University of Derby (D2N2, 2015)
      The D2N2 Employability Framework provides the methodology by which we can significantly improve the employability and life skills of our young people, regardless of academic ability or which career pathway they choose to take. Collectively, schools, colleges, training providers, wealth-creating companies, social enterprises and the public sector have a duty to ensure that we give our young people the best chance of gaining employment while at the same time addressing the skills needs of employers within our area.
    • Damage in single lap joints of woven fabric reinforced polymeric composites subjected to transverse impact loading

      Choudhry, Rizwan Saeed; Hassan, Syed F.; Li, Shuguang; Day, Richard; National University of Sciences and Technology; University of Manchester; University of Nottingham; Glyndŵr University (Elsevier, 2015-02-16)
      Single lap joints of woven glass fabric reinforced phenolic composites, having four different overlap widths, were impacted transversely using a hemispherical impactor with different velocities in the low velocity impact range. The resulting damage was observed at various length scales (from micro to macro) using transmission photography, ultrasonic c-scan and x-ray micro tomography (XMT), in support of each other. These experimental observations were used for classification of damage in terms of damage scale, location (i.e. ply, interfaces between plies or bond failure between the two adherends) and mechanisms, with changing overlap width and impact velocity. In addition, finite element analysis was used to simulate delamination and disbond failure. These simulations were used to further explain the observed dependence of damage on overlap width and impact velocity. The results from these experiments and simulations lead to the proposal of a concept of lower and upper characteristic overlap width. These bounds relate the dominant damage pattern (i.e. scale, location and mechanism) with overlap width of the joint for a given impact velocity range.
    • Damped forced vibration analysis of single-walled carbon nanotubes resting on viscoelastic foundation in thermal environment using nonlocal strain gradient theory

      Malikan, Mohammad; Nguyen, Van Bac; Tornabene, Francesco; Islamic Azad University; University of Derby; University of Bologna (Elsevier, 2018-08-01)
      In this paper, the damped forced vibration of single-walled carbon nanotubes (SWCNTs) is analyzed using a new shear deformation beam theory. The SWCNTs are modeled as a flexible beam resting on a viscoelastic foundation, embedded in a thermal environment and subjected to a transverse dynamic load. The equilibrium equations are formulated with the new shear deformation beam theory combined with higher-order nonlocal strain gradient theory, in which the influences of both stress nonlocality and strain-gradient size-dependent effects are taken into account. The new shear deformation beam theory requires no shear correction factor and involves only a single unknown variable, as in the Euler-Bernoulli beam hypothesis. The governing equations are solved with an analytical approach, from which the maximum dynamic deflection is obtained for simple boundary conditions. To validate the new beam theory, the resulting natural frequencies are compared with those from a well-known reference. The effects of the nonlocal parameter, half-wave length, damping, temperature and material variations on the dynamic vibration of the nanotubes are discussed in detail.
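      As background, the general one-dimensional constitutive relation commonly used in nonlocal strain gradient beam models (a standard form, not taken verbatim from this paper) shows how the two size-dependent effects enter through two length-scale parameters.

```latex
% One-dimensional nonlocal strain gradient constitutive relation;
% e_0 a is the nonlocal parameter and l the strain-gradient length scale.
\left[1 - (e_0 a)^2 \nabla^2\right] \sigma_{xx}
  = E \left[1 - l^2 \nabla^2\right] \varepsilon_{xx}
```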
    • The Dangerous Rise of Therapeutic Education

      Hayes, Dennis; Ecclestone, Kathryn; University of Derby; University of Sheffield (Routledge Education Classic Editions, 2019-02-07)
      The Dangerous Rise of Therapeutic Education confronts the silent ascendancy of a therapeutic ethos across the educational system and into the workplace. Controversial and compelling, Kathryn Ecclestone and Dennis Hayes’ classic text uses a wealth of examples across the education system, from primary schools to university and the workplace, to show how therapeutic education is turning children, young people and adults into anxious and self-preoccupied individuals rather than aspiring, optimistic and resilient learners who want to know everything about the world. Remaining extremely topical, the chapters illuminate the powerful effects of therapeutic education, including: • how therapeutic learning is taking shape, now and in the future; • how therapeutic ideas from popular culture have come to govern social thought and policies; • how the fostering of dependence and compulsory participation in therapeutic activities that encourage the disclosing of emotions can undermine parents’ and teachers’ confidence and authority; • how therapeutic forms of teacher training undermine faith in the pursuit of knowledge; • how political initiatives in emotional literacy, emotional wellbeing and ‘positive mental health’ propagate a diminished view of human potential throughout the education system and the workplace. The Dangerous Rise of Therapeutic Education is an eye-opening read for every teacher and leader across the field of education, and every parent and student who is passionate about the power of knowledge to transform people’s lives. It is a call for a debate about the growing impact of therapeutic education and what it means for learning now and in the future.
    • The dangerous rise of therapeutic education

      Hayes, Dennis; Ecclestone, Kathryn; Oxford Brookes University (Routledge, 2008-06)
      The silent ascendancy of a therapeutic ethos across the education system and into the workplace demands a book that serves as a wake-up call to everyone. Kathryn Ecclestone and Dennis Hayes' controversial and compelling book uses a wealth of examples across the education system, from primary schools to university and the workplace, to show how therapeutic education is turning children, young people and adults into anxious and self-preoccupied individuals rather than aspiring, optimistic and resilient learners who want to know everything about the world. The chapters address a variety of thought-provoking themes, including: • how therapeutic ideas from popular culture dominate social thought and social policies and offer a diminished view of human potential; • how schools undermine parental confidence and authority by fostering dependence and compulsory participation in therapeutic activities based on disclosing emotions to others; • how higher education has adopted therapeutic forms of teacher training because many academics have lost faith in the pursuit of knowledge; • how such developments are propelled by a deluge of political initiatives in areas such as emotional literacy, emotional well-being and the 'soft outcomes' of learning. The Dangerous Rise of Therapeutic Education is eye-opening reading for every teacher, student teacher and parent who retains any belief in the power of knowledge to transform people's lives. Its insistent call for a serious public debate about the emotional state of education should also be at the forefront of the minds of every agent of change in society… from parent to policy maker.
    • The dark side of competition: How competitive behaviour and striving to avoid inferiority are linked to depression, anxiety, stress and self-harm.

      Gilbert, Paul; McEwan, Kirsten; Bellew, Rebecca; Mills, Alison; Gale, Corinne; University of Derby (British Psychological Society, 2009-06)
      This study was guided by the social rank theory of depression and aimed to explore the relationships of depression, anxiety, stress and self-harm with striving to avoid inferiority, feelings of shame and styles of attachment. Participants diagnosed with depression (n=62) completed a series of questionnaires measuring striving to avoid inferiority, fears of missing out, being overlooked and active rejection, attachment, social rank and psychopathologies. Striving to avoid inferiority was significantly linked to social rank variables and anxious attachment. Mediator analyses revealed that the relationship between striving to avoid inferiority and depression was mediated by the social rank variable of external shame, and also by anxious attachment. These findings suggest that elevated competitive behaviour can have a ‘dark side’. When people feel insecure in their social environments, it can focus them on a hierarchical view of themselves and others, with a fear of rejection if they feel they have become too inferior or subordinate. This may increase vulnerability to depression, anxiety and stress.
    • Data aggregation in wireless sensor networks for lunar exploration

      Zhai, Xiaojun; Vladimirova, Tanya; University of Derby; University of Leicester (IEEE, 2015-09-03)
      This paper presents research work related to the development of Wireless Sensor Networks (WSN) gathering environmental data from the surface of the Moon. Data aggregation algorithms are applied to reduce the volume of multi-sensor data collected by the WSN before it is sent to a Moon orbiter and later to Earth. A particular issue of utmost importance to space applications is energy efficiency, and a main goal of the research is to optimise the algorithm design so that the WSN energy consumption is reduced. An extensive simulation experiment is carried out, which confirms that the proposed algorithms significantly enhance network performance in terms of energy consumption compared to routing the raw data. In addition, the proposed data aggregation algorithms are implemented successfully on a System-on-a-chip (SoC) embedded platform using a Xilinx Zynq FPGA device. The data aggregation has two important effects: the WSN lifetime is extended owing to the saved energy, and the accuracy of the original data is preserved. This research could be beneficial for a number of future security-related applications, such as monitoring of phenomena that may affect Earth's planetary security/safety, as well as monitoring the implementation of Moon treaties preventing the establishment of military bases on the lunar surface.
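      The paper's own algorithms are not given in this abstract; as a minimal sketch of the general idea of in-network aggregation (the summary statistics and values below are hypothetical), a cluster head can forward a compact summary instead of every raw reading, cutting the payload relayed towards the orbiter.

```python
# Minimal sketch of in-network data aggregation at a WSN cluster head.
# Hypothetical illustration only: the chosen summary (count/mean/min/max)
# is not taken from the paper, which develops its own algorithms.
from statistics import mean

def aggregate_readings(readings):
    """Collapse a list of raw sensor readings into one small summary packet."""
    if not readings:
        return None
    return {
        "count": len(readings),
        "mean": mean(readings),
        "min": min(readings),
        "max": max(readings),
    }

# Example: 100 temperature samples shrink to a 4-field summary packet.
raw = [(-53.0 + 0.01 * i) for i in range(100)]
print(aggregate_readings(raw))
```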
    • Data aggregation with end-to-end confidentiality and integrity for large-scale wireless sensor networks.

      Cui, Jie; Shao, Lili; Zhong, Hong; Xu, Yan; Liu, Lu; Anhui University; University of Derby (Springer, 2017-07-17)
      In wireless sensor networks, data aggregation allows in-network processing, which reduces packet transmissions and redundancy and thus helps prolong the overall lifetime of the network. In current studies, the Elliptic Curve ElGamal homomorphic encryption algorithm has been widely used to protect end-to-end data confidentiality. However, these works suffer from an expensive mapping function during decryption: if the aggregated results are large, the base station cannot recover the original data because of the hardness of the elliptic curve discrete logarithm problem. These schemes are therefore unsuitable for large-scale WSNs. In this paper, we propose a secure, energy-saving data aggregation scheme designed for large-scale WSNs. We employ the Okamoto-Uchiyama homomorphic encryption algorithm to protect end-to-end data confidentiality, use message authentication codes (MACs) to achieve in-network false-data filtering, and utilize a homomorphic MAC algorithm to achieve end-to-end data integrity. Two popular IEEE 802.15.4-compliant wireless sensor network platforms, Tmote Sky and iMote 2, have been used to evaluate the efficiency and feasibility of our scheme. The results demonstrate that our scheme achieves better performance in reducing energy consumption. Moreover, system delay, especially decryption delay at the base station, is reduced compared with other state-of-the-art methods.
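      To make the additively homomorphic property concrete, here is a toy sketch (insecurely small parameters, not the authors' implementation): ciphertexts of individual readings are multiplied in-network, and a single decryption at the base station yields the sum.

```python
# Toy Okamoto-Uchiyama sketch showing additively homomorphic aggregation.
# Parameters are far too small to be secure; illustration only (Python 3.8+).
import random
from math import gcd

# Key generation: n = p^2 * q, g with g^(p-1) mod p^2 of order p, h = g^n mod n.
p, q = 104729, 104723            # toy primes; real deployments use ~1024-bit values
n = p * p * q
while True:
    g = random.randrange(2, n)
    if gcd(g, n) == 1 and pow(g, p - 1, p * p) != 1:
        break
h = pow(g, n, n)

def encrypt(m):
    """E(m) = g^m * h^r mod n for random r; m (and any aggregated sum) must stay below p."""
    r = random.randrange(1, n)
    return (pow(g, m, n) * pow(h, r, n)) % n

def decrypt(c):
    """Recover m with the private prime p via L(x) = (x - 1) / p."""
    L = lambda x: (x - 1) // p
    a = L(pow(c, p - 1, p * p))
    b = L(pow(g, p - 1, p * p))
    return (a * pow(b, -1, p)) % p

# In-network aggregation: multiplying ciphertexts adds the plaintexts.
readings = [17, 42, 5, 130]
aggregate = 1
for m in readings:
    aggregate = (aggregate * encrypt(m)) % n

assert decrypt(aggregate) == sum(readings)
print("aggregated sum:", decrypt(aggregate))
```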
    • Data classification using the Dempster–Shafer method.

      Chen, Qi; Whitbrook, Amanda; Aickelin, Uwe; Roadknight, Chris; University of Nottingham (Taylor and Francis, 2014-02-26)
      In this paper, the Dempster–Shafer (D–S) method is used as the theoretical basis for creating data classification systems. Testing is carried out using three popular multiple attribute benchmark data-sets that have two, three and four classes. In each case, a subset of the available data is used for training to establish thresholds, limits or likelihoods of class membership for each attribute, and hence create mass functions that establish probability of class membership for each attribute of the test data. Classification of each data item is achieved by combination of these probabilities via Dempster’s rule of combination. Results for the first two data-sets show extremely high classification accuracy that is competitive with other popular methods. The third data-set is non-numerical and difficult to classify, but good results can be achieved provided the system and mass functions are designed carefully and the right attributes are chosen for combination. In all cases, the D–S method provides comparable performance to other more popular algorithms, but the overhead of generating accurate mass functions increases the complexity with the addition of new attributes. Overall, the results suggest that the D–S approach provides a suitable framework for the design of classification systems and that automating the mass function design and calculation would increase the viability of the algorithm for complex classification problems.
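      As an illustration of the combination step described above (a minimal sketch; the data-driven mass-function design used in the paper is not reproduced here, and the example masses are invented), Dempster's rule combines two mass functions over sets of class labels and renormalises by the conflict.

```python
# Minimal sketch of Dempster's rule of combination for class-membership masses.
# Focal elements are frozensets of class labels; the mass values are hypothetical.
from itertools import product

def combine(m1, m2):
    """Combine two mass functions (dict: frozenset -> mass) with Dempster's rule."""
    combined, conflict = {}, 0.0
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + x * y
        else:
            conflict += x * y            # mass falling on the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict: sources are incompatible")
    return {s: v / (1.0 - conflict) for s, v in combined.items()}

# Two attributes give evidence about classes {A, B}.
theta = frozenset({"A", "B"})
m_attr1 = {frozenset({"A"}): 0.6, theta: 0.4}
m_attr2 = {frozenset({"A"}): 0.3, frozenset({"B"}): 0.5, theta: 0.2}
print(combine(m_attr1, m_attr2))
```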
    • Data completeness in healthcare: A literature survey.

      Liu, Caihua; Talaei-Khoei, Amir; Zowghi, Didar; Daniel, Jay; University of Technology Sydney; University of Nevada (Association for Information Systems, 2017-09-19)
      As the adoption of eHealth has made it easier to access and aggregate healthcare data, data collected daily in clinical care are increasingly used for clinical decisions, health services planning, and public health monitoring. Reliable data quality is a prerequisite for these tasks. There is a body of research on data quality in healthcare; however, a clear picture of data completeness in this field is missing. This research aims to identify and classify current research themes related to data completeness in healthcare. In addition, the paper presents problems with data completeness reported in the reviewed literature and identifies methods that have been adopted to address those problems. This study reviewed 24 papers (January 2011–April 2016) published in information and computing sciences, biomedical engineering, and medicine and health sciences journals. The paper uncovers three main research themes: design and development, evaluation, and determinants. In conclusion, this paper improves our understanding of the current state of the art of data completeness in healthcare records and indicates future research directions.
    • Data Driven Transmission Power Control for Wireless Sensor Networks

      Kotian, Roshan; Exarchakos, Georgios; Liotta, Antonio (Springer, 2015)
    • Data floes: Polar science as catalyst for the arts

      Locke, Caroline; University of Derby (British Antarctic Survey, 2016-10-20)
      In the Autumn of 2016 Locke was invited to participate in a half day workshop where artists and scientists talked together about how they use climate and environmental data sets to explore issues around the communication of science. The workshop took place as part of The Cambridge Festival of Ideas and was an opportunity to meet and share information with researchers from different disciplines whose work involves creating awareness and understanding of nature and science. Connections were made with climate scientists. Consultation began here with Dr Gareth Rees (Cambridge University) and his research into remote sensing techniques and the monitoring of the dynamics of Arctic glaciated and vegetated terrain. This later became an important connection. Locke worked in consultation with Gareth, who facilitated her links with The Norwegian Polar Institute and The Arctic University of Norway in 2020. Data floes: polar science as catalyst for the arts ended with an evening public event as part of The Festival of Ideas.
    • Data Intensive and Network Aware (DIANA) grid scheduling

      McClatchey, Richard; Anjum, Ashiq; Stockinger, Heinz; Ali, Arshad; Willers, Ian; Thomas, Michael; University of the West of England; Swiss Institute of Bioinformatics; National University of Sciences and Technology; CERN; et al. (Springer, 2007-01-27)
      In Grids scheduling decisions are often made on the basis of jobs being either data or computation intensive: in data intensive situations jobs may be pushed to the data and in computation intensive situations data may be pulled to the jobs. This kind of scheduling, in which there is no consideration of network characteristics, can lead to performance degradation in a Grid environment and may result in large processing queues and job execution delays due to site overloads. In this paper we describe a Data Intensive and Network Aware (DIANA) meta-scheduling approach, which takes into account data, processing power and network characteristics when making scheduling decisions across multiple sites. Through a practical implementation on a Grid testbed, we demonstrate that queue and execution times of data-intensive jobs can be significantly improved when we introduce our proposed DIANA scheduler. The basic scheduling decisions are dictated by a weighting factor for each potential target location which is a calculated function of network characteristics, processing cycles and data location and size. The job scheduler provides a global ranking of the computing resources and then selects an optimal one on the basis of this overall access and execution cost. The DIANA approach considers the Grid as a combination of active network elements and takes network characteristics as a first class criterion in the scheduling decision matrix along with computations and data. The scheduler can then make informed decisions by taking into account the changing state of the network, locality and size of the data and the pool of available processing cycles.
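      The abstract describes the weighting factor only qualitatively; the sketch below shows how such a cost model might rank candidate sites (the cost terms, weights and site attributes are hypothetical illustrations, not the published DIANA formula).

```python
# Hypothetical sketch of a DIANA-style site ranking: combine network, compute
# and data-transfer terms into one cost per site and pick the cheapest.
from dataclasses import dataclass

@dataclass
class Site:
    name: str
    bandwidth_mbps: float     # network capacity towards the data source
    latency_ms: float         # round-trip time to the data source
    queue_length: int         # jobs already waiting at the site
    cpu_power: float          # available processing cycles (arbitrary units)

def site_cost(site, data_size_mb, job_cycles, w_net=1.0, w_cpu=1.0, w_queue=0.5):
    """Weighted sum of data-transfer time, compute time and queueing pressure."""
    transfer = data_size_mb / site.bandwidth_mbps + site.latency_ms / 1000.0
    compute = job_cycles / site.cpu_power
    return w_net * transfer + w_cpu * compute + w_queue * site.queue_length

def schedule(sites, data_size_mb, job_cycles):
    """Return the site with the lowest overall access and execution cost."""
    return min(sites, key=lambda s: site_cost(s, data_size_mb, job_cycles))

sites = [
    Site("SiteA", bandwidth_mbps=800, latency_ms=5, queue_length=40, cpu_power=50),
    Site("SiteB", bandwidth_mbps=100, latency_ms=120, queue_length=2, cpu_power=20),
]
print(schedule(sites, data_size_mb=2000, job_cycles=500).name)
```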
    • Data Mining for Monitoring and Managing Systems and Networks

      Liotta, Antonio; Di Fatta, Giuseppe (Springer US, 2014)
    • Data-driven knowledge acquisition, validation, and transformation into HL7 Arden Syntax

      Hussain, Maqbool; Afzal, Muhammad; Ali, Taqdir; Ali, Rahman; Khan, Wajahat Ali; Jamshed, Arif; Lee, Sungyoung; Kang, Byeong Ho; Latif, Khalid; Kyung Hee University, Seocheon-dong, Giheung-gu, Yongin-si 446-701, Gyeonggi-do, Republic of Korea; et al. (Elsevier BV, 2015-10-28)
      The objective of this study is to help a team of physicians and knowledge engineers acquire clinical knowledge from existing practices datasets for treatment of head and neck cancer, to validate the knowledge against published guidelines, to create refined rules, and to incorporate these rules into clinical workflow for clinical decision support. A team of physicians (clinical domain experts) and knowledge engineers adapt an approach for modeling existing treatment practices into final executable clinical models. For initial work, the oral cavity is selected as the candidate target area for the creation of rules covering a treatment plan for cancer. The final executable model is presented in HL7 Arden Syntax, which helps the clinical knowledge be shared among organizations. We use a data-driven knowledge acquisition approach based on analysis of real patient datasets to generate a predictive model (PM). The PM is converted into a refined-clinical knowledge model (R-CKM), which follows a rigorous validation process. The validation process uses a clinical knowledge model (CKM), which provides the basis for defining underlying validation criteria. The R-CKM is converted into a set of medical logic modules (MLMs) and is evaluated using real patient data from a hospital information system. We selected the oral cavity as the intended site for derivation of all related clinical rules for possible associated treatment plans. A team of physicians analyzed the National Comprehensive Cancer Network (NCCN) guidelines for the oral cavity and created a common CKM. Among the decision tree algorithms, chi-squared automatic interaction detection (CHAID) was applied to a refined dataset of 1229 patients to generate the PM. The PM was tested on a disjoint dataset of 739 patients, which gives 59.0% accuracy. Using a rigorous validation process, the R-CKM was created from the PM as the final model, after conforming to the CKM. The R-CKM was converted into four candidate MLMs, and was used to evaluate real data from 739 patients, yielding efficient performance with 53.0% accuracy. Data-driven knowledge acquisition and validation against published guidelines were used to help a team of physicians and knowledge engineers create executable clinical knowledge. The advantages of the R-CKM are twofold: it reflects real practices and conforms to standard guidelines, while providing optimal accuracy comparable to that of a PM. The proposed approach yields better insight into the steps of knowledge acquisition and enhances collaboration efforts of the team of physicians and knowledge engineers.
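      The knowledge-acquisition pipeline above relies on CHAID, whose core step selects split attributes by a chi-squared test against the outcome. As a rough illustration of that single step (the toy patient records and attribute names are invented, and this is not the authors' code), candidate attributes can be ranked by the chi-squared statistic of their contingency table with the treatment decision.

```python
# Rough sketch of the chi-squared attribute ranking at the heart of CHAID.
# The records and attribute names below are invented for illustration only.
from collections import Counter

def chi_squared(records, attribute, outcome="treatment"):
    """Pearson chi-squared statistic for the attribute/outcome contingency table."""
    joint = Counter((r[attribute], r[outcome]) for r in records)
    attr_totals = Counter(r[attribute] for r in records)
    out_totals = Counter(r[outcome] for r in records)
    n = len(records)
    stat = 0.0
    for a in attr_totals:
        for o in out_totals:
            expected = attr_totals[a] * out_totals[o] / n
            observed = joint.get((a, o), 0)
            stat += (observed - expected) ** 2 / expected
    return stat

records = [
    {"stage": "T1", "node_status": "N0", "treatment": "surgery"},
    {"stage": "T1", "node_status": "N1", "treatment": "surgery"},
    {"stage": "T3", "node_status": "N1", "treatment": "chemoradiation"},
    {"stage": "T3", "node_status": "N0", "treatment": "chemoradiation"},
]
# Rank candidate split attributes; higher statistic = stronger association.
for attr in ("stage", "node_status"):
    print(attr, round(chi_squared(records, attr), 2))
```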
    • A day in the life of a subscriptions and document delivery librarian

      Kay, James; University of Derby (Ubiquity Press, 2016-07-05)
      In his role as part of the Learning Resources Development & Delivery team at the University of Derby, James Kay shares responsibility for acquisitions, serials, e-resources, inter-library loans, online reading lists, copyright and cataloguing.