PeerJ Computer Science Preprints: Emerging Technologies
https://peerj.com/preprints/index.atom?journal=cs&subject=10100
Emerging Technologies articles published in PeerJ Computer Science Preprints

Data security analysis based on Blockchain Recurrence Qualitative Analysis (BRQA)
https://peerj.com/preprints/27820
2019-06-24
Mohamed A El-dosuky, Gamal H Eladl
There is no doubt that blockchain has become an important technology that has established itself in practice. With the increasing demand for this technology, it is necessary to develop and update the techniques proposed to work with other technologies, especially in the field of cyber-security, which is a vital and important field. This paper discusses the integration of Recurrence Qualitative Analysis (RQA) with the blockchain, as well as the technical details of how RQA operates to increase blockchain security. The paper reports significant, remarkable and distinctive improvements compared to previous methods.
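The abstract does not describe the RQA computation itself. As a rough illustration only, and not the authors' implementation, recurrence analysis builds a binary recurrence matrix from a time series and summarizes it with measures such as the recurrence rate; the idea of feeding it a per-block numeric signal (e.g. a value derived from each block's hash) is an assumption here:

```python
import numpy as np

def recurrence_matrix(series, eps):
    """Binary recurrence matrix: R[i, j] = 1 when points i and j lie within eps."""
    x = np.asarray(series, dtype=float)
    dist = np.abs(x[:, None] - x[None, :])   # pairwise distances for a 1-D series
    return (dist < eps).astype(int)

def recurrence_rate(R):
    """Fraction of recurrent point pairs, excluding the trivial main diagonal."""
    n = R.shape[0]
    return (R.sum() - n) / (n * n - n)

# Hypothetical signal: one numeric value per block (e.g. derived from its hash)
rng = np.random.default_rng(0)
signal = rng.random(200)
R = recurrence_matrix(signal, eps=0.1)
print(recurrence_rate(R))
```

A chain whose per-block signal suddenly shows anomalous recurrence structure (an unexpected jump or drop in such measures) could then be flagged for inspection, which is one plausible reading of how recurrence analysis supports security monitoring.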
A use case centric survey of Blockchain: status quo and future directions
https://peerj.com/preprints/27529
2019-02-11
Srinath Perera, Frank Leymann, Paul Fremantle
This paper presents an assessment of blockchain technology based on the Emerging Technology Analysis Canvas (ETAC) to evaluate the drivers and potential outcomes. The ETAC is a framework to critically analyze emerging technologies.
The assessment finds that blockchain can fundamentally transform the world. It is ready for specific applications in use cases such as digital currency, lightweight financial systems, ledgers, provenance, and disintermediation.
However, blockchain faces significant technical gaps in other use cases and needs at least 5-10 years to come to full fruition in those spaces. Sustaining the current level of effort (e.g. startups, research) for that period of time may be challenging. We also find that the need for and merits of decentralized infrastructures compared to centralized and semi-centralized alternatives are not always clear. Given the risk involved and the significant potential returns, we recommend a cautiously optimistic approach to blockchain, with a focus on concrete use cases.
The primary contributions of this paper are a use case centric categorization of the blockchain, a detailed discussion on challenges faced by those categories, and an assessment of their future.
Implementation and validity of the long jump knowledge-based system: Case of the approach run phase
https://peerj.com/preprints/27524
2019-02-08
Teerawat Kamnardsiri, Worawit Janchai, Pattaraporn Khuwuthyakorn, Wacharee Rittiwat
This study proposes a method for implementing a Knowledge-Based System (KBS) for the approach-run phase of the long jump. The proposed method was implemented to improve athletes' long jump performance in the approach-run phase. The study also examined the KBS's concurrent validity in distinguishing between professional and amateur populations, and its convergent validity against the Tracker video analysis tool. Seven professional runners aged 19 to 42 years and five amateurs aged 18 to 38 years were captured under ten conditions of different movements (C1 to C10) using a standard video camera (60 fps, 10 mm lens) fixed on a tripod. The results showed that an age-related difference in speed measurement across the ten conditions was evident using the KBS. Good associations were found between the KBS and the Tracker 4.94 video analysis tool across various conditions for three variables: the starting position (r=0.926 and 0.963), the maximum velocity (r=0.972 and 0.995), and the location of maximum velocity (r=0.574 and 0.919). In conclusion, the proposed method is a reliable tool for measuring the starting position, maximum speed and position of maximum speed. Furthermore, it can also distinguish speed performance between professionals and amateurs across multiple movement conditions.
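The abstract does not give the measurement procedure, but the two quantities it validates, maximum velocity and its location from 60 fps footage, and the Pearson correlations against Tracker, can be sketched generically. The function names and the sample positions below are illustrative assumptions, not the authors' code:

```python
import math

def max_velocity(positions_m, fps=60):
    """Estimate maximum instantaneous velocity (m/s) and the frame interval where
    it occurs, from per-frame positions along the runway."""
    dt = 1.0 / fps
    speeds = [(b - a) / dt for a, b in zip(positions_m, positions_m[1:])]
    v_max = max(speeds)
    return v_max, speeds.index(v_max)

def pearson_r(xs, ys):
    """Pearson correlation coefficient, as used to compare KBS against Tracker."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical per-frame positions (metres) for a short accelerating run-up
v, frame = max_velocity([0.0, 0.12, 0.27, 0.45, 0.66])
print(v, frame)
```

An r near 1.0 between the KBS measurements and Tracker's, as reported for starting position and maximum velocity, indicates strong convergent validity.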
GIZAChain: e-Government Interoperability Zone Alignment, based on blockchain technology
https://peerj.com/preprints/27477
2019-01-11
Mohamed A El-dosuky, Gamal H El-adl
E-government provides access to services anytime, anywhere. Many e-government frameworks already exist to integrate e-government services, but efficient full interoperability remains a challenge.
Interoperability per se can be modeled via four maturity stages, in which the interoperability zone is the holy grail of full interoperability, to be reached ultimately through strategy alignment. As e-government services shift in the same way as e-commerce did with the value chain, this implies that e-government can benefit from blockchain. Blockchain is a nascent, promising architecture whose transactions are permanent, verifiable, and recorded in a distributed ledger.
This research article suggests applying blockchain to achieve e-government interoperability. Forms are juxtaposed on the outer borders of the system. These forms are adopted from those used by the UK government, because they are standard and available to Python developers. Once a form has been completed, PySOA calls the requested service before storing the data in the Ontology blockchain. After the service is performed, the policies are analyzed in batch processing using quantgov. A report is submitted to the central government periodically. The Ontology blockchain has a dual role: on the one hand, it works as secure data storage; on the other, it cooperates with PySOA in supporting both technological and semantic interoperability. The most important feature of the proposed method is the presence of the Government Interoperability Zone Alignment (GIZA), which acts as a backbone that coherently connects the internal subcomponents. This linkage is possible because each form has a title that corresponds to the appropriate service name, and each service in turn has a counterpart in the wallets stored in the Ontology blockchain.
To measure interoperability empirically, metrics are needed. This study adopts and quantizes a standard interoperability matrix along three dimensions of interoperability: conceptual (syntax and semantics), organizational (responsibilities and organization per se), and technological (platform and communication); the concerns are data, business, service, and process. Any deviation from the standard contributes to the interoperability score (counting mismatches) or the interoperability grade (summing absolute differences). An estimation was performed for 1000 random cases. The probability of getting a conceptual/technical interoperability score as large as the standard strategy score is estimated at 713/1000 = 0.713 (about 2 in 3), and the probability of getting an organizational interoperability score as large as the standard strategy score at 712/1000 = 0.712 (about 2 in 3). A Markov model is then proposed to provide an accurate representation of the evolution of the strategies over time.
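The abstract names the score (mismatch count) and grade (sum of absolute differences) but not the matrix layout or scale. A minimal sketch, assuming a 3-dimension by 4-concern matrix with maturity levels 0-3 and a hypothetical acceptance threshold for the Monte Carlo step (the paper's exact comparison rule is not given), might look like:

```python
import random

# Assumed layout: 3 dimensions (conceptual, organizational, technological)
# x 4 concerns (data, business, service, process), maturity levels 0..3.
STANDARD = [[3, 3, 2, 3],
            [2, 3, 3, 2],
            [3, 2, 3, 3]]

def score(matrix, standard=STANDARD):
    """Interoperability score: count of cells that deviate from the standard."""
    return sum(m != s for row_m, row_s in zip(matrix, standard)
                      for m, s in zip(row_m, row_s))

def grade(matrix, standard=STANDARD):
    """Interoperability grade: sum of absolute differences from the standard."""
    return sum(abs(m - s) for row_m, row_s in zip(matrix, standard)
                          for m, s in zip(row_m, row_s))

def estimate(trials=1000, threshold=6, seed=42):
    """Monte Carlo estimate over random cases: fraction whose score stays at or
    below a hypothetical threshold."""
    rng = random.Random(seed)
    hits = sum(
        score([[rng.randint(0, 3) for _ in row] for row in STANDARD]) <= threshold
        for _ in range(trials))
    return hits / trials
```

The 713/1000 and 712/1000 figures in the abstract are empirical frequencies of exactly this kind, obtained by counting qualifying random cases out of 1000 trials.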
Towards an open 3D participatory citizen debate
https://peerj.com/preprints/27207
2018-09-14
Thibaud Chassin, Jens Ingensand, Florent Joerin
This paper presents a platform aiming to ease debate between citizens. Since the early 2010s, governments have been seeking new ways to be more accountable and transparent towards their citizens, marking a renewal in public participation. In return, citizens are eager to be heard and to use new tools based on information and communication technologies (ICT) such as Web 2.0. This empowerment of the public carries costs for the authorities, who are mainly concerned about the loss of decision-making power. To face these challenges, several 2D online maps have been developed to help governments direct and centralize citizens' insights. These earlier collaborative mapping tools helped to identify the characteristics of a reliable platform: user-friendly, simple, and accessible (anywhere, at any time). In our implementation, we adopted the third dimension, which provides numerous benefits: (1) a more effective and effortless visualization, (2) an unbiased representation of the environment, and (3) the merging of the participants' cognitive spaces. From our past experiences, we conceptualized the actors' (citizens / facilitator / transcriber) interactions and dynamics in on-site public engagement meetings. From this approach, we evaluated how using a 3D virtual environment to support participation would reshape and enhance the synergies between the actors: (1) centralization of the interactions within the platform, (2) automated analysis of the gathered raw information, (3) reaching a larger part of the population, and (4) lightening of the participatory processes.
A survey on approaches to the protection of personal data gathered by IoT devices
https://peerj.com/preprints/26473
2018-07-25
Henry Tranter
Security is always at the forefront of developing technologies. One can seldom go a week without hearing of a new data breach or hacking attempt from various groups around the world, often taking advantage of a simple flaw in a system's architecture. The Internet of Things (IoT) is one of these developing technologies which may be at risk of such attacks. IoT devices are becoming more and more prevalent in everyday life. From keeping track of an individual's health, to suggesting meals from items available in an individual's fridge, these technologies are taking a much larger role in the personal lives of their users. With this in mind, how is security being considered in the development of these technologies? Are these devices that monitor individuals' personal lives just additional vectors for potential data theft? Throughout this survey, various approaches to the development of security systems concerning IoT devices in the home will be discussed, compared, and contrasted in the hope of providing an ideal solution to the problems this technology may produce.
Virtual and remote laboratories augment self learning and interactions: Development, deployment and assessments with direct and online feedback
https://peerj.com/preprints/26715
2018-03-16
Dhanush Kumar, Rakhi Radhamani, Nijin Nizar, Krishnashree Achuthan, Bipin Nair, Shyam Diwakar
Background. Over the last few decades, in developing nations including India, there have been rapid developments in information and communication technologies, with progress towards sustainable development goals facilitating universal access to education. With the aim of augmenting laboratory skill training, India's Ministry of Human Resource Development (MHRD)'s National Mission on Education through Information and Communication Technology (NME-ICT) launched the Virtual Laboratories project, enabling professors and institutions to deliver interactive animations, mathematical simulators and remotely-controlled equipment for online experiments in biosciences and engineering courses. Towards that mission of improving teaching and learning quality, and with a focus on improving access for users in geographically remote and economically constrained institutes in India, we developed and deployed over 30 web-based laboratories consisting of over 360 computer-based online experiments. This paper focuses on the design, development and deployment of virtual laboratories and assesses the role of online experiments in providing self-learning and novel pedagogical practices for user communities.
Methods. As part of deployment, we evaluated the role of virtual laboratories in facilitating self-organized learning, and their perceived usage as a teaching tool in a blended education system. Direct feedback data were collected through organized workshops from 386 university-level students, 192 final-year higher secondary school (pre-university) students and 234 college professors from various places across India. We also included online feedback from 2012-2018 to interpret usage and the adaptability of virtual and remote labs by online users.
Results. More than 80% of students who used virtual laboratories scored higher in examinations compared to a control group. Of the 386 students, 80% indicated that they adapted to self-learning using virtual laboratories. 82% of university teachers who employ virtual laboratories indicated using them to complement teaching material and reduce teaching time. The increase in online usage and feedback suggests novel trends in incorporating online platforms as pedagogical tools.
Discussion. Feedback indicated virtual laboratories altered and enhanced students' autonomous learning abilities and improved interaction in blended classrooms. Pedagogical analysis suggests the use of ICT-enabled virtual laboratories as a self-organized distance education learning platform for university and pre-university students from economically challenged or time-restrained environments. Online usage statistics indicated a steady increase of new users on this online repository, suggesting global acceptance of virtual laboratories as a complementary laboratory skill-training online repository.
Weather events identification in social media streams: tools to detect their evidence in Twitter
https://peerj.com/preprints/2241
2017-09-21
Valentina Grasso, Imad Zaza, Federica Zabini, Gianni Pantaleo, Paolo Nesi, Alfonso Crisci
Identifying and monitoring the impact of severe weather through social media data is a worthwhile challenge for data science. Recent years have seen an increase in natural disasters, partly due to climate change. Many works have shown that during such events people tend to share specific messages via social media platforms, especially Twitter. These messages not only contribute to "situational" awareness and improve the dissemination of information during emergencies, but can also be used to assess the social impact of crisis events. In this work we present preliminary findings on how the temporal distribution of weather-related messages may help identify severe events that impacted a community. Severe weather events are recognizable by observing the synchronization of the volumes of Twitter streams extracted using different but semantically graduated terms and hashtags, including those containing specific geographic names. Impacting events seem immediately recognizable from the graphical representation of weather streams, when the timelines show a specific parallel pattern that we named the "Half Onion Shape". Different but semantically linked weather streams can exhibit different magnitudes, according to the popularity of their terms, but when a weather event occurs they show the same temporal relative maximum. Given these interesting indications, which need to be confirmed through deeper analysis, and the heavy use of social media such as Twitter during crisis events, it is becoming fundamental to have a suite of suitable tools to monitor social media data. For Twitter data, a comprehensive suite of tools is presented: the DISIT Twitter Vigilance Platform for Twitter data retrieval, management and visualization.
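The "same temporal relative maximum" criterion lends itself to a simple sketch. This is not the authors' implementation; hourly tweet counts per keyword stream, the one-hour tolerance, and the sample volumes are all assumptions made for illustration:

```python
def relative_maxima(counts):
    """Indices where a stream's hourly tweet count is a local (relative) maximum."""
    return {i for i in range(1, len(counts) - 1)
            if counts[i] > counts[i - 1] and counts[i] > counts[i + 1]}

def synchronized_peaks(streams, tolerance=1):
    """Hours at which every keyword stream peaks within `tolerance` hours of one
    another -- the kind of synchronization the 'Half Onion Shape' describes."""
    peak_sets = [relative_maxima(c) for c in streams.values()]
    first, *rest = peak_sets
    return sorted(t for t in first
                  if all(any(abs(t - u) <= tolerance for u in s) for s in rest))

# Hypothetical hourly volumes for three semantically graduated weather terms
streams = {
    "rain":  [2, 3, 2, 5, 20, 8, 3, 2],
    "storm": [0, 1, 1, 2, 9, 4, 1, 0],
    "flood": [0, 0, 1, 1, 4, 6, 2, 1],
}
print(synchronized_peaks(streams))  # → [4]: all streams peak within an hour of hour 4
```

The streams here have different magnitudes, as the abstract notes for terms of different popularity, yet they share an aligned relative maximum, which is the signal of an impacting event.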
System-on-chip sensor fusion front-end self-healing design for network-on-chip digital communications
https://peerj.com/preprints/2694
2017-02-15
Emmanuel Seaman, Jason Yuan Du
With the ultra-scaling of CMOS technology, high-speed and low-power millimeter-wave communication systems for network-on-chip have been attracting more and more attention, owing to the wider bandwidth and higher data rates that can meet the ever-increasing needs of multimedia, massive external data storage and even biomedical applications. However, from a manufacturing perspective, circuit implementations become increasingly susceptible to fabrication process variations as CMOS technology scales, which results in a loss of yield. To solve this issue, a sensor-fusion solution is proposed in this paper by adding multiple on-chip sensors (power detectors, temperature sensors, information envelope detectors and related filters, and instrumentation amplifiers) in a standard CMOS process. These sensors and detectors collect critical system performance and environmental parameters, which are then used by a self-healing and optimization algorithm to adjust the state of system components through digitized control knobs.
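The abstract describes a sense-decide-adjust loop; a heavily simplified software sketch of such a self-healing loop, where the sensor readout, the single knob, its range, and the greedy step policy are all assumptions rather than the paper's algorithm, could look like:

```python
def self_heal(read_sensors, set_knob, knob_range, target_power, steps=32):
    """Greedy one-knob self-healing: nudge a digitized control knob until the
    measured output power is as close to the target as the step size allows."""
    lo, hi = knob_range
    knob = (lo + hi) // 2
    best = None
    for _ in range(steps):
        set_knob(knob)
        power = read_sensors()["power_dbm"]      # one fused sensor reading
        err = power - target_power
        if best is None or abs(err) < abs(best[1]):
            best = (knob, err)
        if abs(err) < 0.25:                      # close enough to target
            break
        knob = max(lo, min(hi, knob - (1 if err > 0 else -1)))
    set_knob(best[0])                            # leave chip at best setting found
    return best

# Hypothetical chip model: output power rises 0.5 dB per knob code
state = {"knob": 0}
def set_knob(v): state["knob"] = v
def read_sensors(): return {"power_dbm": -10.0 + 0.5 * state["knob"]}

print(self_heal(read_sensors, set_knob, knob_range=(0, 63), target_power=0.0))
```

In hardware, the equivalent of `read_sensors` fuses the on-chip power, temperature and envelope detectors, and the knob is a digitized bias or gain setting; the loop compensates for process variation on each individual die, recovering yield.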
IoT and Robotics: a synergy
https://peerj.com/preprints/2760
2017-01-31
Ankur Roy Chowdhury
The Internet of Robotic Things (IoRT) is a concept first introduced by Dan Kara at ABI Research, which augments the existing IoT with active sensorization, thereby opening the doors to novel business ideas at the intersection of both IoT and Robotics. This position paper considers the synergy between IoT and robotics: it discusses the technologies in IoT that would benefit the robotics domain, and the advent of Cloud Robotics and its role in aiding robot functions like sensing, manipulation, and mobility. The paper then discusses the ways in which robots can extend the capabilities of existing IoT infrastructure by acting as a special class of edge device. IoT-aided robotic applications are discussed in various domains such as health care, the military, industrial plants and rescue operations. The paper concludes by considering the use case of an Intelligent Transportation System endowed with an IoRT-inspired architecture.