Individual Sensor

The Experts below are selected from a list of 8739 Experts worldwide, ranked by the ideXlab platform.

Lau M Andersen - One of the best experts on this subject based on the ideXlab platform.

  • Group analysis in FieldTrip of time-frequency responses: a pipeline for reproducibility at every step of processing, going from Individual Sensor space representations to an across-group source space representation
    Frontiers in Neuroscience, 2018
    Co-Authors: Lau M Andersen
    Abstract:

    An important aim of an analysis pipeline for magnetoencephalographic (MEG) data is that it allows the researcher to spend maximal effort on making the statistical comparisons that will answer his or her questions. The example question answered here is whether the so-called beta rebound differs between novel and repeated stimulations. Two analyses are presented: going from Individual Sensor space representations to, respectively, an across-group Sensor space representation and an across-group source space representation. The data analysed are neural responses to tactile stimulations of the right index finger in a group of twenty healthy participants, acquired from an Elekta Neuromag System. The processing steps covered in the first analysis are: MaxFiltering the raw data; defining, preprocessing and epoching the data; cleaning the data; finding and removing independent components related to eye blinks, eye movements and heartbeats; calculating participants' Individual evoked responses by averaging over epoched data and subsequently removing the average response from single epochs; calculating a time-frequency representation and baselining it with non-stimulation trials; and finally calculating a grand average, an across-group Sensor space representation. The second analysis starts from the grand average Sensor space representation; after identification of the beta rebound, its neural origin is imaged using beamformer source reconstruction. This analysis covers reading in co-registered magnetic resonance images, segmenting the data, creating a volume conductor, creating a forward model, cutting out the MEG data of interest in the time and frequency domains, computing Fourier transforms and estimating source activity with a beamformer model in which power is expressed relative to MEG data measured during periods of non-stimulation. Finally, morphing the source estimates onto a common template and performing group-level statistics on the data are covered. Functions for saving relevant figures in an automated and structured manner are also included. The protocol presented here can be applied to any research protocol where the emphasis is on source reconstruction of induced responses whose underlying sources are not coherent.
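    FieldTrip itself is a MATLAB toolbox, so the pipeline above runs in MATLAB. As a language-neutral illustration of two steps the abstract highlights, the numpy sketch below removes the average evoked response from single epochs (isolating induced activity) and expresses time-frequency power relative to non-stimulation trials. All array names, shapes and the random data are hypothetical; this is not the paper's code.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical single-subject epochs: (n_trials, n_channels, n_times).
    epochs = rng.standard_normal((60, 102, 500))

    # Remove the average evoked response from every epoch, so that a
    # subsequent time-frequency decomposition reflects induced activity only.
    induced = epochs - epochs.mean(axis=0, keepdims=True)

    # Hypothetical time-frequency power, (n_trials, n_channels, n_freqs, n_times),
    # for stimulation and non-stimulation (baseline) trials.
    stim_power = rng.gamma(2.0, size=(60, 102, 20, 100))
    rest_power = rng.gamma(2.0, size=(60, 102, 20, 100))

    # Express stimulation power relative to mean non-stimulation power per
    # channel and frequency -- the "baselining with non-stimulation trials"
    # step used for the beta-rebound contrast.
    rest_ref = rest_power.mean(axis=(0, 3), keepdims=True).squeeze(0)  # (102, 20, 1)
    relative_power = stim_power.mean(axis=0) / rest_ref                # >1 = increase
    ```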

  • Group analysis in MNE-Python of evoked responses from a tactile stimulation paradigm: a pipeline for reproducibility at every step of processing, going from Individual Sensor space representations to an across-group source space representation
    Frontiers in Neuroscience, 2018
    Co-Authors: Lau M Andersen
    Abstract:

    An important aim of an analysis pipeline for magnetoencephalographic data is that it allows the researcher to spend maximal effort on making the statistical comparisons that will answer his or her questions, while spending minimal effort on the intricacies and machinery of the pipeline. I here present a set of functions and scripts that set up a clear, reproducible structure for separating raw and processed data into folders and files, such that minimal effort is spent on: 1) double-checking that the right input goes into the right functions; 2) making sure that output and intermediate steps can be accessed meaningfully; 3) applying operations efficiently across groups of subjects; 4) re-processing data if changes to any intermediate step are desirable. Applying the scripts requires only general knowledge of the Python language. The data analysed are neural responses to tactile stimulations of the right index finger in a group of twenty healthy participants, acquired from an Elekta Neuromag System. Two analyses are presented: going from Individual Sensor space representations to, respectively, an across-group Sensor space representation and an across-group source space representation. The processing steps covered in the first analysis are: filtering the raw data; finding events of interest in the data; epoching the data; finding and removing independent components related to eye blinks and heartbeats; calculating participants' Individual evoked responses by averaging over epoched data; and calculating a grand average Sensor space representation over participants. The second analysis starts from the participants' Individual evoked responses and covers: estimating noise covariance; creating a forward model; creating an inverse operator; estimating distributed source activity on the cortical surface using a minimum norm procedure; morphing those estimates onto a common cortical template; and calculating the patterns of activity that are statistically different from baseline. Estimating source activity requires processing each subject's anatomy based on magnetic resonance imaging. The necessary steps are covered here: importing magnetic resonance images; segmenting the brain; estimating boundaries between different tissue layers; making fine-resolution scalp surfaces to facilitate co-registration; creating source spaces; and creating volume conductors for each subject.
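    A minimal MNE-Python sketch of the Sensor-space part of such a pipeline is given below. The file paths, subject IDs and trigger code are hypothetical, and this is only an outline under those assumptions, not the paper's scripts; the paper additionally covers the full folder structure and all source-level steps.

    ```python
    import mne

    subjects = ["sub01", "sub02"]  # hypothetical subject IDs
    evokeds = []

    for sub in subjects:
        # Filter the raw data (hypothetical path).
        raw = mne.io.read_raw_fif(f"data/{sub}_raw.fif", preload=True)
        raw.filter(l_freq=None, h_freq=40.0)

        # Remove eye-blink components with ICA; assumes an EOG channel was
        # recorded (heartbeats are handled analogously via ica.find_bads_ecg).
        ica = mne.preprocessing.ICA(n_components=0.95, random_state=0)
        ica.fit(raw)
        eog_inds, _ = ica.find_bads_eog(raw)
        ica.exclude = eog_inds
        raw = ica.apply(raw)

        # Find events of interest and epoch around the (hypothetical) code 1
        # on the standard Neuromag trigger channel.
        events = mne.find_events(raw, stim_channel="STI101")
        epochs = mne.Epochs(raw, events, event_id={"tactile": 1},
                            tmin=-0.2, tmax=0.6, baseline=(None, 0.0),
                            preload=True)

        # The participant's Individual evoked response.
        evokeds.append(epochs.average())

    # Across-group Sensor space representation.
    grand_average = mne.grand_average(evokeds)
    ```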

Seth Hutchinson - One of the best experts on this subject based on the ideXlab platform.

  • Worst-case performance of a mobile Sensor network under Individual Sensor failure
    International Conference on Robotics and Automation, 2013
    Co-Authors: Hyongju Park, Seth Hutchinson
    Abstract:

    In this paper, we consider the problem of worst-case performance of a mobile Sensor network (MSN) when some of the nodes in the network fail. We formulate the problem as a game in which some subset of the nodes act in an adversarial manner, choosing their motion strategies to maximally degrade the overall performance of the network. We restrict our attention in the present paper to a target detection problem in which the goal is to minimize the probability of missed detection. We use a partitioned cost function that is minimized when each Sensor executes a motion strategy given by Lloyd's algorithm (i.e., each agent moves toward the centroid of its Voronoi partition at each time instant). The probability of missed detection for each correctly functioning Sensor increases with the distance between Sensor and target; adversarial nodes in the network are unable to detect the target, and move to maximally increase the probability of missed detection by the properly functioning Sensors. We pose the problem as a multi-stage decision process, and use forward dynamic programming over a finite horizon to numerically compute optimal strategies for the adversaries. We compare the resulting strategies to a greedy algorithm, providing both system trajectories and the evolution of the probability of missed detection during execution.
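    As a sketch of the coverage baseline the abstract refers to, the Python snippet below implements one discrete Lloyd step: each Sensor is assigned the workspace points nearest to it (its Voronoi cell, approximated on a sampled grid) and then moves toward that cell's centroid. The detection model, the adversaries' strategies and the dynamic-programming search are outside this sketch, and all parameters are illustrative.

    ```python
    import numpy as np

    def lloyd_step(sensors, workspace_samples, step=0.2):
        """One Lloyd iteration: move each Sensor toward the centroid of its
        Voronoi cell, approximated over sampled workspace points."""
        # Assign every sample point to its nearest Sensor.
        d = np.linalg.norm(workspace_samples[:, None, :] - sensors[None, :, :],
                           axis=2)
        owner = d.argmin(axis=1)
        new_sensors = sensors.copy()
        for i in range(len(sensors)):
            cell = workspace_samples[owner == i]
            if len(cell):  # Sensors with empty cells keep their position
                new_sensors[i] += step * (cell.mean(axis=0) - sensors[i])
        return new_sensors

    # Hypothetical example: 5 Sensors covering the unit square.
    rng = np.random.default_rng(1)
    sensors = rng.uniform(0, 1, size=(5, 2))
    grid = np.stack(np.meshgrid(np.linspace(0, 1, 50),
                                np.linspace(0, 1, 50)), axis=-1).reshape(-1, 2)
    for _ in range(20):
        sensors = lloyd_step(sensors, grid)
    ```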


Kare Synnes - One of the best experts on this subject based on the ideXlab platform.

  • Assessing the impact of Individual Sensor reliability within smart living environments
    Conference on Automation Science and Engineering, 2008
    Co-Authors: Chris Nugent, Xin Hong, Josef Hallberg, Dewar D Finlay, Kare Synnes
    Abstract:

    The potential of smart living environments to provide a form of independent living for the ageing population is becoming increasingly recognised. These environments are composed of Sensors, which are used to assess the state of the environment; some form of information management, to process the Sensor data; and a suite of actuators, which can be used to change the state of the environment. When providing a form of support which may impinge upon the well-being of the end user, it is essential that a high degree of reliability can be maintained. Within this paper we present an information management framework to process Sensor-based data within smart environments. Based on this framework, we assess the impact of Sensor reliability on the classification of activities of daily living. From this assessment we show how it is possible to identify which Sensors within a given set of experiments can be considered the most critical, and we consider how this information may be used for managing Sensor reliability from a practical point of view.
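    The abstract does not detail the framework itself, so the sketch below illustrates one generic way to rank Sensor criticality for activity classification: score each Sensor by how much accuracy drops when its readings are withheld (a leave-one-Sensor-out ablation). The data, labels and scikit-learn classifier are hypothetical stand-ins, not the paper's method.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(2)
    X = rng.integers(0, 2, size=(400, 10)).astype(float)  # 10 binary Sensors
    y = (X[:, 0] + X[:, 3] > 1).astype(int)               # hypothetical ADL label

    baseline = cross_val_score(LogisticRegression(), X, y, cv=5).mean()
    criticality = {}
    for s in range(X.shape[1]):
        X_wo = np.delete(X, s, axis=1)  # withhold Sensor s
        acc = cross_val_score(LogisticRegression(), X_wo, y, cv=5).mean()
        criticality[s] = baseline - acc  # large accuracy drop = critical Sensor

    print(sorted(criticality.items(), key=lambda kv: -kv[1]))
    ```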


Hyongju Park - One of the best experts on this subject based on the ideXlab platform.

  • Worst-case performance of a mobile Sensor network under Individual Sensor failure
    International Conference on Robotics and Automation, 2013
    Co-Authors: Hyongju Park, Seth Hutchinson
    Abstract:

    In this paper, we consider the problem of worst-case performance of a mobile Sensor network (MSN) when some of the nodes in the network fail. We formulate the problem as a game in which some subset of the nodes act in an adversarial manner, choosing their motion strategies to maximally degrade the overall performance of the network. We restrict our attention in the present paper to a target detection problem in which the goal is to minimize the probability of missed detection. We use a partitioned cost function that is minimized when each Sensor executes a motion strategy given by Lloyd's algorithm (i.e., each agent moves toward the centroid of its Voronoi partition at each time instant). The probability of missed detection for each correctly functioning Sensor increases with the distance between Sensor and target; adversarial nodes in the network are unable to detect the target, and move to maximally increase the probability of missed detection by the properly functioning Sensors. We pose the problem as a multi-stage decision process, and use forward dynamic programming over a finite horizon to numerically compute optimal strategies for the adversaries. We compare the resulting strategies to a greedy algorithm, providing both system trajectories and the evolution of the probability of missed detection during execution.


Quan Z Sheng - One of the best experts on this subject based on the ideXlab platform.

  • Reduce or remove: Individual Sensor reliability profiling and data cleaning
    Intelligent Data Analysis, 2016
    Co-Authors: Yihong Zhang, Claudia Szabo, Quan Z Sheng
    Abstract:

    Environmental sensing using multitudes of wirelessly connected Sensors is becoming critical for resolving environmental problems, given recent technology advances in the Internet of Things (IoT). Current environmental sensing projects typically deploy commodity Sensors, which are known to be unreliable and prone to producing noisy and erroneous data. Moreover, the majority of current Sensor data cleaning techniques have not moved beyond using the mean or the median of spatially correlated readings, thus providing unsatisfying accuracies. In this paper, we propose a Sensor reliability-based cleaning method, called Influence Mean (IM), which uses weighted aggregation based on Individual Sensor reliabilities. We investigate whether reducing or removing unreliable Sensors is more effective in producing accurate cleaning results, by designing and testing the respective algorithms on synthetic and real datasets. The experimental results show that our method generally improves data cleaning accuracy, particularly when the behaviour of unreliable Sensors differs drastically from that of reliable Sensors.
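    A minimal numpy sketch of the two aggregation rules the abstract contrasts, with made-up readings and reliability scores (the paper's actual reliability estimation is not reproduced here):

    ```python
    import numpy as np

    readings = np.array([20.1, 19.8, 35.0, 20.3, 20.0])  # one faulty Sensor
    reliability = np.array([0.9, 0.85, 0.1, 0.95, 0.9])  # hypothetical scores

    # "Reduce": reliability-weighted mean, in the spirit of Influence Mean.
    reduced = np.average(readings, weights=reliability)

    # "Remove": drop Sensors below a reliability threshold, then average.
    keep = reliability >= 0.5
    removed = readings[keep].mean()

    plain_mean = readings.mean()  # the naive baseline both rules improve on
    print(plain_mean, reduced, removed)
    ```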

  • ICPADS - An Estimation Maximization Based Approach for Finding Reliable Sensors in Environmental Sensing
    2015 IEEE 21st International Conference on Parallel and Distributed Systems (ICPADS), 2015
    Co-Authors: Yihong Zhang, Claudia Szabo, Quan Z Sheng
    Abstract:

    Emerging Internet of Things (IoT)-based environmental sensing projects provide large-scale sensing data from Individual Sensors with high reading frequencies. These readings are usually produced by commodity Sensors of varied reliability, and inevitably contain noise and errors. Most existing data cleaning techniques focus on issues such as communication overhead reduction and energy preservation, and do not take advantage of the unaggregated data from Individual Sensors that IoT environmental sensing projects offer. In this paper, we propose an Expectation Maximization algorithm for finding reliable Sensors in environmental sensing data, assuming that Individual Sensor readings are preserved at high reading frequencies. Our approach simultaneously finds the environmental feature model and the faulty state of the Sensors. Our extensive experiments show that the proposed approach is significantly more effective than existing approaches. In particular, in a case where reliable Sensors and faulty Sensors differ significantly in their readings, the maximum squared error for other approaches exceeds 200, while for our approach it is only 1.23.
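    The abstract does not spell out the model, so the sketch below is a generic two-component EM under assumed Gaussian likelihoods: reliable Sensors are modelled as tightly distributed around the true environmental value, faulty ones as broadly distributed, and the E and M steps alternate between per-Sensor fault responsibilities and the value estimate. The data and the fixed variances are illustrative, not the paper's.

    ```python
    import numpy as np
    from scipy.stats import norm

    readings = np.array([21.0, 20.7, 21.2, 3.0, 20.9, 55.0])  # two faulty Sensors
    mu, sigma_ok, sigma_bad, p_ok = readings.mean(), 1.0, 30.0, 0.8

    for _ in range(50):
        # E-step: responsibility that each Sensor is reliable (variances fixed
        # for simplicity; a fuller EM would re-estimate them too).
        lik_ok = p_ok * norm.pdf(readings, mu, sigma_ok)
        lik_bad = (1 - p_ok) * norm.pdf(readings, mu, sigma_bad)
        r = lik_ok / (lik_ok + lik_bad)
        # M-step: re-estimate the environmental value and mixing weight.
        mu = np.average(readings, weights=r)
        p_ok = r.mean()

    print(mu, r.round(2))  # mu near 21; near-zero responsibility for faulty Sensors
    ```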

  • Cleaning environmental sensing data streams based on Individual Sensor reliability
    Web Information Systems Engineering, 2014
    Co-Authors: Yihong Zhang, Claudia Szabo, Quan Z Sheng
    Abstract:

    Environmental sensing is becoming a significant means of understanding and transforming the environment, given recent technology advances in the Internet of Things (IoT). Current environmental sensing projects typically deploy commodity Sensors, which are known to be unreliable and prone to producing noisy and erroneous data. Unfortunately, the accuracy of current cleaning techniques based on mean or median prediction is unsatisfactory. In this paper, we propose a cleaning method based on incrementally adjusted Individual Sensor reliabilities, called influence mean cleaning (IMC). By incrementally adjusting Sensor reliabilities, our approach can discover latent Sensor reliability values in a data stream and improve reliability-weighted prediction even in a Sensor network with changing conditions. The experimental results, based on both synthetic and real datasets, show that our approach achieves higher accuracy than the mean- and median-based approaches after some initial adjustment iterations.
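    IMC's exact update rule is not given in the abstract; the stream-style sketch below only captures the general idea of incrementally adjusted reliabilities: at each time step the cleaned value is a reliability-weighted mean, and each Sensor's reliability is nudged toward an agreement score based on its deviation from that consensus. The learning rate, agreement function and simulated fault are all assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    n_sensors, alpha = 6, 0.1           # alpha: hypothetical learning rate
    reliability = np.full(n_sensors, 0.5)

    for t in range(200):
        truth = 20.0 + 0.01 * t                       # slowly drifting signal
        x = truth + rng.normal(0, 0.3, n_sensors)
        x[2] += rng.normal(8, 4)                      # Sensor 2 is faulty

        cleaned = np.average(x, weights=reliability)  # reliability-weighted prediction
        agreement = np.exp(-np.abs(x - cleaned))      # 1 near consensus, ~0 far away
        reliability = (1 - alpha) * reliability + alpha * agreement

    print(reliability.round(2))  # Sensor 2 ends with a much lower reliability
    ```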
