The Experts below are selected from a list of 105,600 Experts worldwide, ranked by the ideXlab platform.
Ken Takeuchi - One of the best experts on this subject based on the ideXlab platform.
-
Over 10-times high-speed, energy-efficient 3D TSV-integrated hybrid ReRAM/MLC NAND SSD by intelligent Data Fragmentation suppression
2013 18th Asia and South Pacific Design Automation Conference (ASP-DAC), 2013. Co-Authors: Chao Sun, Hiroki Fujii, Kousuke Miyaji, Koh Johguchi, Kazuhide Higuchi, Ken Takeuchi. Abstract: A 3D through-silicon-via (TSV)-integrated hybrid ReRAM/multi-level-cell (MLC) NAND solid-state drive (SSD) architecture is proposed with a NAND-like interface (I/F) and a sector-access overwrite policy for the ReRAM. Furthermore, intelligent Data management algorithms are proposed to suppress Data Fragmentation and excess usage of the MLC NAND. As a result, an 11-times performance increase, a 6.9-times endurance enhancement and a 93% write-energy reduction are achieved. Both the ReRAM write and read latency should be less than 3 μs to obtain these improvements. The required endurance for the ReRAM is 10^5 cycles.
-
x11 performance increase, x6.9 endurance enhancement, 93% energy reduction of 3D TSV-integrated hybrid ReRAM/MLC NAND SSDs by Data Fragmentation suppression
2012 Symposium on VLSI Circuits (VLSIC), 2012. Co-Authors: Hiroki Fujii, Chao Sun, Kousuke Miyaji, Koh Johguchi, Kazuhide Higuchi, Ken Takeuchi. Abstract: A 3D through-silicon-via (TSV)-integrated hybrid ReRAM/multi-level-cell (MLC) NAND solid-state drive (SSD) architecture is proposed for PC, server and smartphone applications. A NAND-like interface (I/F) and a sector-access overwrite policy are proposed for the ReRAM. Furthermore, intelligent Data management algorithms are proposed that suppress Data Fragmentation and excess usage of the MLC NAND by storing hot Data in the ReRAM. As a result, an 11-times performance increase, a 6.9-times endurance enhancement and a 93% write-energy reduction are achieved compared with a conventional MLC NAND SSD. Both the ReRAM write and read latency should be less than 3 μs to obtain these improvements. The required endurance for the ReRAM is 10^5 cycles. The 3D TSV interconnects reduce energy consumption by 68%.
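The hot-Data placement that drives these gains can be illustrated with a short sketch: frequently overwritten sectors stay in byte-overwritable ReRAM, so small random updates no longer fragment MLC NAND pages. This is a minimal illustration; the threshold, class and method names are assumptions, not the paper's flash translation layer.

```python
# Minimal sketch of hot/cold Data routing in a hybrid ReRAM/MLC NAND SSD.
# The hotness threshold and all names are illustrative assumptions.
from collections import defaultdict

HOT_THRESHOLD = 4  # writes before a sector counts as "hot" (assumed value)

class HybridFTL:
    """Toy flash-translation layer: hot sectors -> ReRAM, cold -> MLC NAND."""

    def __init__(self):
        self.reram = {}                        # in-place overwritable store
        self.nand_log = []                     # append-only page log (no overwrite)
        self.write_count = defaultdict(int)

    def write_sector(self, lba, data):
        self.write_count[lba] += 1
        if self.write_count[lba] >= HOT_THRESHOLD or lba in self.reram:
            self.reram[lba] = data             # overwrite in place: no Fragmentation
        else:
            self.nand_log.append((lba, data))  # stale copies become NAND garbage

ftl = HybridFTL()
for _ in range(5):
    ftl.write_sector(42, b"hot metadata")      # repeatedly updated sector
ftl.write_sector(7, b"cold bulk data")
print(len(ftl.reram), len(ftl.nand_log))       # the hot sector ends up in ReRAM
```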
-
Ferroelectric (Fe)-NAND flash memory with non-volatile page buffer for Data center application enterprise solid-state drives (SSD)
Symposium on VLSI Circuits, 2009. Co-Authors: Teruyoshi Hatanaka, Ryoji Yajima, Takeshi Horiuchi, Shouyu Wang, Xizhen Zhang, Mitsue Takahashi, Shigeki Sakai, Ken Takeuchi. Abstract: A ferroelectric (Fe)-NAND flash memory with a non-volatile (NV) page buffer is proposed. The Data Fragmentation in random writes is removed by introducing a batch-write algorithm; as a result, the SSD performance can double. The NV page buffer realizes power-outage-immune, highly reliable operation. With a low program/erase voltage of 6 V and a high endurance of 100 million cycles, the proposed Fe-NAND is well suited for highly reliable, high-speed, low-power enterprise SSDs in Data center applications.
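The batch-write idea admits a compact illustration: random sector writes accumulate in the (non-volatile) page buffer and are flushed one full page at a time, so scattered sectors do not turn into many partial page programs. A minimal sketch under assumed page and sector sizes, not the paper's buffer management:

```python
# Sketch of batching random sector writes in a non-volatile page buffer and
# flushing them as whole pages. Sizes and names are illustrative assumptions.
SECTORS_PER_PAGE = 4  # assumed sectors per Fe-NAND page

class NVPageBuffer:
    def __init__(self):
        self.pending = []          # survives power loss in the real NV buffer
        self.pages_programmed = 0

    def write_sector(self, lba, data):
        self.pending.append((lba, data))
        if len(self.pending) == SECTORS_PER_PAGE:
            self.flush()

    def flush(self):
        # One full-page program instead of SECTORS_PER_PAGE partial writes
        self.pages_programmed += 1
        self.pending.clear()

buf = NVPageBuffer()
for lba in [9, 2, 57, 31]:         # scattered random writes
    buf.write_sector(lba, b"x")
print(buf.pages_programmed)        # 1 page program covers 4 sector writes
```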
Christopher G. Chute - One of the best experts on this subject based on the ideXlab platform.
-
Impact of Data Fragmentation across healthcare centers on the accuracy of a high-throughput clinical phenotyping algorithm for specifying subjects with type 2 diabetes mellitus
Journal of the American Medical Informatics Association (JAMIA), 2012. Co-Authors: Wei-qi Wei, Cynthia L. Leibson, Jeanine E. Ransom, Abel N. Kho, Pedro J. Caraballo, High Seng Chai, Barbara P. Yawn, Jennifer A. Pacheco, Christopher G. Chute. Abstract: Objective: To evaluate Data Fragmentation across healthcare centers with regard to the accuracy of a high-throughput clinical phenotyping (HTCP) algorithm developed to differentiate (1) patients with type 2 diabetes mellitus (T2DM) and (2) patients with no diabetes. Materials and methods: This population-based study identified all Olmsted County, Minnesota residents in 2007. We used provider-linked electronic medical record Data from the two healthcare centers that provide >95% of all care to County residents (i.e., Olmsted Medical Center and Mayo Clinic in Rochester, Minnesota, USA). Subjects were limited to residents with one or more encounters from January 1, 2006 through December 31, 2007 at both healthcare centers. DM-relevant Data on diagnoses, laboratory results, and medications from both centers were obtained for this period. The algorithm was first executed using Data from both centers (i.e., the gold standard) and then using Data from the Mayo Clinic alone. Positive predictive values and false-negative rates were calculated, and the McNemar test was used to compare the categorization from the Mayo Clinic Data alone with the gold standard. Age and sex were compared between true-positive and false-negative subjects with T2DM. Statistical significance was accepted as p<0.05. Results: With Data from both medical centers, 765 subjects with T2DM (4256 non-DM subjects) were identified. When single-center Data were used, 252 T2DM subjects (1573 non-DM subjects) were missed, and an additional 27 false-positive T2DM subjects (215 false-positive non-DM subjects) were identified. The positive predictive values and false-negative rates were 95.0% (513/540) and 32.9% (252/765), respectively, for T2DM subjects, and 92.6% (2683/2898) and 37.0% (1573/4256), respectively, for non-DM subjects. The age and sex distributions differed between true-positive (mean age 62.1; 45% female) and false-negative (mean age 65.0; 56.0% female) T2DM subjects. Conclusion: The findings show that applying an HTCP algorithm using Data from a single medical center contributes to misclassification. These findings should be considered carefully by researchers when developing and executing HTCP algorithms.
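The accuracy figures quoted above are simple ratios and can be checked directly from the counts in the abstract; a minimal sketch (function names are mine, not the paper's):

```python
# Reproducing the accuracy figures reported in the abstract from its counts.
def ppv(true_pos, all_pos_calls):   # positive predictive value
    return true_pos / all_pos_calls

def fnr(missed, all_true):          # false-negative rate
    return missed / all_true

print(f"T2DM   PPV: {ppv(513, 540):.1%}   FNR: {fnr(252, 765):.1%}")
print(f"non-DM PPV: {ppv(2683, 2898):.1%} FNR: {fnr(1573, 4256):.1%}")
# T2DM   PPV: 95.0%   FNR: 32.9%
# non-DM PPV: 92.6%   FNR: 37.0%
```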
Hyogon Kim - One of the best experts on this subject based on the ideXlab platform.
-
Data Fragmentation Scheme in IEEE 802.15.4 Wireless Sensor Networks
2007 IEEE 65th Vehicular Technology Conference (VTC2007-Spring), 2007. Co-Authors: Jongwon Yoon, Hyogon Kim. Abstract: The IEEE 802.15.4 medium access control (MAC) protocol is designed for low-Data-rate, short-distance and low-power communication applications such as wireless sensor networks (WSNs). However, in the standard 802.15.4 MAC, if the number of backoff periods remaining in the current superframe is not enough to complete the Data transmission procedure, a sensor node holds the transmission until the next superframe. When two or more sensor nodes hold their Data transmissions and restart the transmission procedure simultaneously in the next superframe, Data packets collide and channel utilization is wasted. The MAC design is therefore inadequate for high-contention environments such as densely deployed sensor networks. In this paper, we propose a Data Fragmentation scheme that increases channel utilization and avoids these otherwise inevitable collisions. The proposed scheme outperforms the standard IEEE 802.15.4 MAC in terms of collision probability and aggregate throughput, and it can be adopted in the standard IEEE 802.15.4 MAC without any additional message types.
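A minimal sketch of the Fragmentation decision described above, assuming a hypothetical payload capacity per backoff period (the real 802.15.4 timing tables are not reproduced here):

```python
# Sketch of the Fragmentation decision: if the backoff periods left in the
# current superframe cannot carry the whole frame, send the fragment that
# fits now instead of deferring the entire frame to the next superframe,
# where deferred nodes would restart simultaneously and collide. The
# bytes-per-backoff-period constant is an illustrative assumption.
BYTES_PER_BACKOFF_PERIOD = 10  # assumed payload carried per backoff period

def schedule(frame: bytes, remaining_backoff_periods: int):
    """Return (fragment_to_send_now, remainder_for_next_superframe)."""
    capacity = remaining_backoff_periods * BYTES_PER_BACKOFF_PERIOD
    if capacity <= 0:
        return b"", frame                      # nothing fits; defer everything
    if len(frame) <= capacity:
        return frame, b""                      # whole frame fits; no Fragmentation
    return frame[:capacity], frame[capacity:]  # fragment to use the idle periods

now, later = schedule(b"A" * 35, remaining_backoff_periods=2)
print(len(now), len(later))  # 20 bytes sent now, 15 deferred
```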
Hyontai Sug - One of the best experts on this subject based on the ideXlab platform.
-
Using reliable short rules to avoid unnecessary tests in decision trees
Lecture Notes in Computer Science (MICAI), 2006. Co-Authors: Hyontai Sug. Abstract: It is known that in decision trees the reliability of the lower branches is worse than that of the upper branches due to the Data Fragmentation problem. As a result, unnecessary tests of attributes may be performed, because a decision tree may require tests that are not best for some of the Data objects. To compensate for this weak point of decision trees, using reliable short rules together with the decision tree is suggested, where the short rules come from a limited application of association-rule-finding algorithms. Experiments show that the method can not only generate more reliable decisions but also save test costs by using the short rules.
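A toy sketch of the proposed combination: consult the reliable short rules first and fall back to the decision tree only when none fires. The rule format and confidence cutoff are illustrative assumptions, not the paper's exact procedure:

```python
# Toy sketch: reliable short association rules are checked before the tree,
# avoiding the tree's less reliable lower branches (Data Fragmentation).
# Rule data and the confidence cutoff are illustrative assumptions.
MIN_CONFIDENCE = 0.95  # assumed reliability cutoff for a short rule

# (antecedent attribute-value pairs, predicted class, confidence)
short_rules = [
    ({"outlook": "overcast"}, "play", 0.97),
    ({"humidity": "high", "windy": True}, "no_play", 0.96),
]

def classify(instance, tree_classify):
    for antecedent, label, confidence in short_rules:
        if confidence >= MIN_CONFIDENCE and all(
            instance.get(attr) == value for attr, value in antecedent.items()
        ):
            return label            # reliable short rule: no further tests needed
    return tree_classify(instance)  # fall back to the full decision tree

print(classify({"outlook": "overcast", "humidity": "high"},
               tree_classify=lambda x: "no_play"))  # -> "play", after one test
```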
Cynthia L. Leibson - One of the best experts on this subject based on the ideXlab platform.
-
Impact of Data Fragmentation across healthcare centers on the accuracy of a high-throughput clinical phenotyping algorithm for specifying subjects with type 2 diabetes mellitus
Journal of the American Medical Informatics Association (JAMIA), 2012. Co-Authors: Wei-qi Wei, Cynthia L. Leibson, Jeanine E. Ransom, Abel N. Kho, Pedro J. Caraballo, High Seng Chai, Barbara P. Yawn, Jennifer A. Pacheco, Christopher G. Chute. Abstract: Objective: To evaluate Data Fragmentation across healthcare centers with regard to the accuracy of a high-throughput clinical phenotyping (HTCP) algorithm developed to differentiate (1) patients with type 2 diabetes mellitus (T2DM) and (2) patients with no diabetes. Materials and methods: This population-based study identified all Olmsted County, Minnesota residents in 2007. We used provider-linked electronic medical record Data from the two healthcare centers that provide >95% of all care to County residents (i.e., Olmsted Medical Center and Mayo Clinic in Rochester, Minnesota, USA). Subjects were limited to residents with one or more encounters from January 1, 2006 through December 31, 2007 at both healthcare centers. DM-relevant Data on diagnoses, laboratory results, and medications from both centers were obtained for this period. The algorithm was first executed using Data from both centers (i.e., the gold standard) and then using Data from the Mayo Clinic alone. Positive predictive values and false-negative rates were calculated, and the McNemar test was used to compare the categorization from the Mayo Clinic Data alone with the gold standard. Age and sex were compared between true-positive and false-negative subjects with T2DM. Statistical significance was accepted as p<0.05. Results: With Data from both medical centers, 765 subjects with T2DM (4256 non-DM subjects) were identified. When single-center Data were used, 252 T2DM subjects (1573 non-DM subjects) were missed, and an additional 27 false-positive T2DM subjects (215 false-positive non-DM subjects) were identified. The positive predictive values and false-negative rates were 95.0% (513/540) and 32.9% (252/765), respectively, for T2DM subjects, and 92.6% (2683/2898) and 37.0% (1573/4256), respectively, for non-DM subjects. The age and sex distributions differed between true-positive (mean age 62.1; 45% female) and false-negative (mean age 65.0; 56.0% female) T2DM subjects. Conclusion: The findings show that applying an HTCP algorithm using Data from a single medical center contributes to misclassification. These findings should be considered carefully by researchers when developing and executing HTCP algorithms.
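The McNemar comparison mentioned in the abstract operates on the discordant pairs between the two classifications; a minimal sketch of the exact (binomial) form follows. The discordant counts are borrowed from the abstract's T2DM figures (252 missed, 27 spurious) purely as an illustration, so the resulting p-value is not necessarily the paper's:

```python
# Minimal sketch of the exact McNemar test for paired classifications
# (two-center gold standard vs single-center Data). Discordant counts
# are illustrative, taken loosely from the abstract's T2DM figures.
from math import comb

def mcnemar_exact(b: int, c: int) -> float:
    """Two-sided exact McNemar p-value for discordant counts b and c."""
    n, k = b + c, min(b, c)
    p_one_sided = sum(comb(n, i) for i in range(k + 1)) / 2 ** n
    return min(1.0, 2 * p_one_sided)

print(f"p = {mcnemar_exact(252, 27):.3g}")  # far below the 0.05 cutoff
```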