Unix Operating System

14,000,000 Leading Edge Experts on the ideXlab platform

Scan Science and Technology

Contact Leading Edge Experts & Companies

The Experts below are selected from a list of 7851 Experts worldwide ranked by ideXlab platform

Diomidis Spinellis - One of the best experts on this subject based on the ideXlab platform.

  • A repository of Unix history and evolution
    Empirical Software Engineering, 2017
    Co-Authors: Diomidis Spinellis
    Abstract:

    The history and evolution of the Unix operating system is made available as a revision-management repository, covering the period from its inception in 1972 as a five-thousand-line kernel to 2016 as a widely used 27-million-line system. The 1.1 GB repository contains 496 thousand commits and 2,523 branch merges. The repository employs the commonly used Git version control system for its storage, and is hosted on the popular GitHub archive. It has been created by synthesizing, with custom software, 24 snapshots of systems developed at Bell Labs, the University of California at Berkeley, and the 386BSD team, two legacy repositories, and the modern repository of the open-source FreeBSD system. In total, 973 individual contributors are identified, the early ones through primary research. The data set can be used for empirical research in software engineering, information systems, and software archaeology.
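    A repository of this kind can be explored with ordinary Git tooling. The sketch below shows how one might count commits and branch merges from `git log --format='%H %P'` output, where each line lists a commit hash followed by its parent hashes; the sample log text is invented for illustration, not taken from the actual repository.

    ```python
    # Count commits and merges from `git log --format='%H %P'` output.
    # A commit with two or more parents is a merge. The sample log
    # below is invented; in practice the text would come from running
    # git log inside a clone of the history repository.

    def count_commits_and_merges(log_text: str) -> tuple[int, int]:
        commits = 0
        merges = 0
        for line in log_text.strip().splitlines():
            hashes = line.split()
            if not hashes:
                continue
            commits += 1
            if len(hashes) >= 3:  # commit hash plus two or more parents
                merges += 1
        return commits, merges

    sample_log = """\
    aaa111 bbb222
    bbb222 ccc333 ddd444
    ccc333
    ddd444 ccc333
    """

    print(count_commits_and_merges(sample_log))  # (4, 1)
    ```

    The same parent-counting idea scales to the full repository, where merge commits mark the points at which the synthesized development branches were joined.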

  • The Evolution of C Programming Practices: A Study of the Unix Operating System 1973-2015
    2016 IEEE ACM 38th International Conference on Software Engineering (ICSE), 2016
    Co-Authors: Diomidis Spinellis, Panagiotis Louridas, Maria Kechagia
    Abstract:

    Tracking long-term progress in engineering and applied science allows us to take stock of things we have achieved, appreciate the factors that led to them, and set realistic goals for where we want to go. We formulate seven hypotheses associated with the long-term evolution of C programming in the Unix operating system, and examine them by extracting, aggregating, and synthesising metrics from 66 snapshots obtained from a synthetic software configuration management repository covering a period of four decades. We found that, over the years, developers of the Unix operating system appear to have evolved their coding style in tandem with advancements in hardware technology, promoted modularity to tame rising complexity, adopted valuable new language features, allowed compilers to allocate registers on their behalf, and reached broad agreement regarding code formatting. The progress we have observed appears to be slowing or even reversing, prompting the need for new sources of innovation to be discovered and followed.
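    One of the practices studied, letting compilers allocate registers, would show up in such metrics as a decline of C's `register` storage-class specifier. The sketch below computes a hypothetical per-snapshot density of that keyword; the two code snippets are invented examples, not actual Unix sources.

    ```python
    import re

    # Illustrative metric: occurrences of C's `register` keyword per
    # 1000 identifier-like tokens. The early/modern snippets below are
    # invented for demonstration, not drawn from the Unix code base.

    def register_density(c_source: str) -> float:
        tokens = re.findall(r"[A-Za-z_][A-Za-z0-9_]*", c_source)
        if not tokens:
            return 0.0
        hits = sum(1 for t in tokens if t == "register")
        return 1000.0 * hits / len(tokens)

    early  = "f(ap) char *ap; { register int c; register char *p = ap; }"
    modern = "int f(char *ap) { int c; char *p = ap; return c; }"

    print(register_density(early) > register_density(modern))  # True
    ```

    Aggregated over all 66 snapshots, a metric like this yields the time series against which the paper's hypotheses can be tested.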

  • An Exploratory Study on the Evolution of C Programming in the Unix Operating System
    2015 ACM IEEE International Symposium on Empirical Software Engineering and Measurement (ESEM), 2015
    Co-Authors: Diomidis Spinellis, Panagiotis Louridas, Maria Kechagia
    Abstract:

    Context: Numerous factors drive long-term progress in programming practices. Goal: We study the evolution of C programming in the Unix operating system. Method: We extract, aggregate, and synthesize metrics from 66 snapshots obtained from an artificial software configuration management repository tracking the evolution of the Unix operating system over four decades. Results: C language programming practices appear to evolve over long time periods; our study identified some continuous trends with highly significant coefficients of determination. Many trends point toward increasing code quality through adherence to numerous programming guidelines, while some others indicate adoption that has reached maturity. In the area of commenting, progress appears to have stalled. Conclusions: Studying the long-term evolution of programming practices identifies areas where progress has been achieved along an expected path, as well as cases where there is room for improvement.
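    The "coefficients of determination" mentioned above come from fitting a trend line to a metric measured across snapshots. A minimal sketch of that procedure, with invented data points, fits an ordinary least-squares line and reports R²:

    ```python
    # Fit a linear trend to a per-snapshot metric and compute the
    # coefficient of determination (R^2). The years and metric values
    # below are invented for illustration, not the paper's data.

    def linear_r2(xs, ys):
        n = len(xs)
        mx = sum(xs) / n
        my = sum(ys) / n
        sxx = sum((x - mx) ** 2 for x in xs)
        sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        b = sxy / sxx                # slope
        a = my - b * mx              # intercept
        ss_res = sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))
        ss_tot = sum((y - my) ** 2 for y in ys)
        return 1.0 - ss_res / ss_tot

    years  = [1973, 1983, 1993, 2003, 2013]
    metric = [0.10, 0.18, 0.27, 0.35, 0.44]  # e.g. fraction of code following a guideline

    print(round(linear_r2(years, metric), 3))  # 0.999
    ```

    An R² close to 1 indicates the continuous, highly significant trends the study reports; a metric whose R² is low has no clear linear trend over the period.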

Maria Kechagia - One of the best experts on this subject based on the ideXlab platform.

  • The Evolution of C Programming Practices: A Study of the Unix Operating System 1973-2015
    2016 IEEE ACM 38th International Conference on Software Engineering (ICSE), 2016
    Co-Authors: Diomidis Spinellis, Panagiotis Louridas, Maria Kechagia
    Abstract:

    Tracking long-term progress in engineering and applied science allows us to take stock of things we have achieved, appreciate the factors that led to them, and set realistic goals for where we want to go. We formulate seven hypotheses associated with the long-term evolution of C programming in the Unix operating system, and examine them by extracting, aggregating, and synthesising metrics from 66 snapshots obtained from a synthetic software configuration management repository covering a period of four decades. We found that, over the years, developers of the Unix operating system appear to have evolved their coding style in tandem with advancements in hardware technology, promoted modularity to tame rising complexity, adopted valuable new language features, allowed compilers to allocate registers on their behalf, and reached broad agreement regarding code formatting. The progress we have observed appears to be slowing or even reversing, prompting the need for new sources of innovation to be discovered and followed.

  • An Exploratory Study on the Evolution of C Programming in the Unix Operating System
    2015 ACM IEEE International Symposium on Empirical Software Engineering and Measurement (ESEM), 2015
    Co-Authors: Diomidis Spinellis, Panagiotis Louridas, Maria Kechagia
    Abstract:

    Context: Numerous factors drive long-term progress in programming practices. Goal: We study the evolution of C programming in the Unix operating system. Method: We extract, aggregate, and synthesize metrics from 66 snapshots obtained from an artificial software configuration management repository tracking the evolution of the Unix operating system over four decades. Results: C language programming practices appear to evolve over long time periods; our study identified some continuous trends with highly significant coefficients of determination. Many trends point toward increasing code quality through adherence to numerous programming guidelines, while some others indicate adoption that has reached maturity. In the area of commenting, progress appears to have stalled. Conclusions: Studying the long-term evolution of programming practices identifies areas where progress has been achieved along an expected path, as well as cases where there is room for improvement.

P. L. Pradhan - One of the best experts on this subject based on the ideXlab platform.

  • Proposed isomorphic graph model for risk assessment on a Unix operating system
    International Journal of Risk and Contingency Management archive, 2013
    Co-Authors: P. K. Patra, P. L. Pradhan
    Abstract:

    Control and risk are two sides of the same coin. Risk assessment is the process of identifying uncertainties, vulnerabilities, and threats to operating system resources in order to achieve business objectives; risk evaluation involves deciding what countermeasures to take to reduce uncertainty to the lowest level of risk. Control is arguably the most important aspect of communications security, and is becoming increasingly important as a basic building block of system security. The Advanced Encryption Standard (AES) is a primary method of protecting system resources. Control is inversely proportional to risk (C = K/R), while control is directly proportional to the quality of the standard; AES-based control is therefore expected to reduce risk as well as improve the information-security standard, and control is directly proportional to risk mitigation, which in turn is directly proportional to the standard. This paper contributes an optimization method that aims to determine the optimal cost to invest in security methods, models, and mechanisms, deciding on the measured components of operating system resources, i.e., processor, memory, and encryption. Furthermore, a method and mechanism that optimize cost, time, and resources is expected to optimize system risk. The proposed model updates the values of the processor, memory, and encryption key dynamically, according to business requirements and the availability of technology and resources, aiming to minimize risk and maximize performance. In this study the researchers develop an isomorphic graph model for optimizing risk in the Unix operating system.
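    The stated relation C = K/R, with K the quality-of-standard constant and R the risk, can be illustrated numerically. The constant and the risk values below are hypothetical, chosen only to show the inverse proportionality the abstract asserts:

    ```python
    # Illustrative model of the abstract's relation C = K / R:
    # control C is inversely proportional to risk R and directly
    # proportional to the quality-of-standard constant K.
    # K and the sample risk values are hypothetical.

    def control_level(k: float, risk: float) -> float:
        if risk <= 0:
            raise ValueError("risk must be positive")
        return k / risk

    K = 10.0
    for risk in (10.0, 5.0, 2.0, 1.0):
        print(risk, control_level(K, risk))
    # Halving the risk doubles the implied control level:
    # 10.0 -> 1.0, 5.0 -> 2.0, 2.0 -> 5.0, 1.0 -> 10.0
    ```

    In the paper's framing, the optimization problem is then to choose where (processor, memory, encryption) to spend a fixed budget so that this implied control is maximized.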

  • Automated Control for Risk Assessment on Unix Operating System - I
    International Journal of Security Privacy and Trust Management, 2013
    Co-Authors: P. K. Patra, P. L. Pradhan
    Abstract:

    Control and risk are two sides of the same coin. Risk management is the process of identifying vulnerabilities and threats to operating system resources that affect business objectives, and deciding what countermeasures to take to reduce risk to the lowest level. The increased use of computer and communications systems by IT industries has increased the risk of theft of proprietary information. Cryptographic control (encryption) is a primary method of protecting system resources. Automated control is arguably the most important aspect of communications security, and is becoming increasingly important as a basic building block of computer security. Automated control is inversely proportional to risk, while control is directly proportional to the quality of the standard (S). Automated control provides accountability for individuals who access sensitive information in applications, system software, servers, and networks. This accountability is accomplished through access control mechanisms that require identification, authentication, authorization, non-repudiation, availability, reliability, and integrity, supported by the audit function. We develop a Java-based automated control for risk optimization on the Unix operating system.
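    The accountability chain described above, where every access decision is both authorized and recorded for audit, can be sketched minimally. This is an invented illustration, not the paper's Java implementation; the users, resources, and permission table are assumptions.

    ```python
    # Minimal sketch of audited access control: each request is checked
    # against an authorization table, and every decision (allow or deny)
    # is appended to an audit trail supporting non-repudiation.
    # Users, resources, and permissions are invented for illustration.

    audit_log: list[tuple[str, str, bool]] = []

    PERMISSIONS = {
        ("alice", "/etc/shadow"): True,   # hypothetical administrator
        ("bob", "/etc/shadow"): False,    # hypothetical unprivileged user
    }

    def access(user: str, resource: str) -> bool:
        allowed = PERMISSIONS.get((user, resource), False)  # default deny
        audit_log.append((user, resource, allowed))
        return allowed

    access("alice", "/etc/shadow")
    access("bob", "/etc/shadow")
    print(audit_log)
    ```

    Note the default-deny lookup: an unknown (user, resource) pair is refused but still audited, so the trail covers failed attempts as well as granted ones.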

  • Hardening of Unix Operating System
    2001
    Co-Authors: P. K. Patra, P. L. Pradhan
    Abstract:

    Operating system hardening is the process of addressing security weaknesses in operating systems by applying the latest patches, hotfixes, and updates, and by following specific procedures and policies to reduce attacks and system downtime. Hardening is not a one-time activity; it is an ongoing task to mitigate risk and sustain a high quality of computing. To build a secure production server, we must remove unwanted devices, fix misconfigurations, disallow default settings, enhance the current configuration, develop new system programs, and apply new security patches before moving to the production environment. Hardening of the operating system should support high integrity, reliability, availability, privacy, scalability, and confidentiality at the lowest level of risk, to achieve the highest benefits from the organization's critical IT infrastructure. Safeguarding information and protecting the integrity of networks and systems are vital to business. IT security professionals in many companies have established policies applicable to their entire organization, but it may be up to the individual departments that manage the systems to implement security in accordance with those policies; security professionals recognize the need for flexibility in implementation, due to each department's unique requirements. Hardening an operating system involves removing all non-essential tools, utilities, and other system administration options, any of which could ease an attacker's path into the system. Following this, the hardening process ensures that all appropriate security features are activated and configured correctly; 'out of the box' systems are typically set up for ease of access, with open access to the administrator account. Some vendors have now recognized that a market exists for OS-hardened systems. This thesis focuses in particular on hardening the Unix Sun Solaris operating system.
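    The hardening steps described above can be sketched as a baseline audit: compare a host's configuration against a hardening baseline and report deviations. The baseline keys and the sample host configuration below are hypothetical examples, not Solaris-specific settings.

    ```python
    # Sketch of a hardening audit: a baseline encodes the desired
    # settings (services removed, defaults disallowed, patches applied),
    # and the audit reports every setting that deviates from it.
    # Baseline keys and the sample configuration are invented.

    BASELINE = {
        "telnet_enabled": False,    # remove non-essential services
        "root_ssh_login": False,    # no direct administrator access
        "default_password": False,  # disallow default settings
        "patches_current": True,    # latest patches applied
    }

    def audit(config: dict) -> list[str]:
        """Return the names of settings that deviate from the baseline."""
        return [key for key, want in BASELINE.items()
                if config.get(key) != want]

    sample_host = {
        "telnet_enabled": True,     # deviation
        "root_ssh_login": False,
        "default_password": False,
        "patches_current": False,   # deviation
    }

    print(sorted(audit(sample_host)))  # ['patches_current', 'telnet_enabled']
    ```

    Because hardening is ongoing rather than one-time, an audit like this would be rerun after every patch cycle or configuration change, not just before the server first enters production.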

Panagiotis Louridas - One of the best experts on this subject based on the ideXlab platform.

  • The Evolution of C Programming Practices: A Study of the Unix Operating System 1973-2015
    2016 IEEE ACM 38th International Conference on Software Engineering (ICSE), 2016
    Co-Authors: Diomidis Spinellis, Panagiotis Louridas, Maria Kechagia
    Abstract:

    Tracking long-term progress in engineering and applied science allows us to take stock of things we have achieved, appreciate the factors that led to them, and set realistic goals for where we want to go. We formulate seven hypotheses associated with the long-term evolution of C programming in the Unix operating system, and examine them by extracting, aggregating, and synthesising metrics from 66 snapshots obtained from a synthetic software configuration management repository covering a period of four decades. We found that, over the years, developers of the Unix operating system appear to have evolved their coding style in tandem with advancements in hardware technology, promoted modularity to tame rising complexity, adopted valuable new language features, allowed compilers to allocate registers on their behalf, and reached broad agreement regarding code formatting. The progress we have observed appears to be slowing or even reversing, prompting the need for new sources of innovation to be discovered and followed.

  • An Exploratory Study on the Evolution of C Programming in the Unix Operating System
    2015 ACM IEEE International Symposium on Empirical Software Engineering and Measurement (ESEM), 2015
    Co-Authors: Diomidis Spinellis, Panagiotis Louridas, Maria Kechagia
    Abstract:

    Context: Numerous factors drive long-term progress in programming practices. Goal: We study the evolution of C programming in the Unix operating system. Method: We extract, aggregate, and synthesize metrics from 66 snapshots obtained from an artificial software configuration management repository tracking the evolution of the Unix operating system over four decades. Results: C language programming practices appear to evolve over long time periods; our study identified some continuous trends with highly significant coefficients of determination. Many trends point toward increasing code quality through adherence to numerous programming guidelines, while some others indicate adoption that has reached maturity. In the area of commenting, progress appears to have stalled. Conclusions: Studying the long-term evolution of programming practices identifies areas where progress has been achieved along an expected path, as well as cases where there is room for improvement.

P. K. Patra - One of the best experts on this subject based on the ideXlab platform.

  • Proposed isomorphic graph model for risk assessment on a Unix operating system
    International Journal of Risk and Contingency Management archive, 2013
    Co-Authors: P. K. Patra, P. L. Pradhan
    Abstract:

    Control and risk are two sides of the same coin. Risk assessment is the process of identifying uncertainties, vulnerabilities, and threats to operating system resources in order to achieve business objectives; risk evaluation involves deciding what countermeasures to take to reduce uncertainty to the lowest level of risk. Control is arguably the most important aspect of communications security, and is becoming increasingly important as a basic building block of system security. The Advanced Encryption Standard (AES) is a primary method of protecting system resources. Control is inversely proportional to risk (C = K/R), while control is directly proportional to the quality of the standard; AES-based control is therefore expected to reduce risk as well as improve the information-security standard, and control is directly proportional to risk mitigation, which in turn is directly proportional to the standard. This paper contributes an optimization method that aims to determine the optimal cost to invest in security methods, models, and mechanisms, deciding on the measured components of operating system resources, i.e., processor, memory, and encryption. Furthermore, a method and mechanism that optimize cost, time, and resources is expected to optimize system risk. The proposed model updates the values of the processor, memory, and encryption key dynamically, according to business requirements and the availability of technology and resources, aiming to minimize risk and maximize performance. In this study the researchers develop an isomorphic graph model for optimizing risk in the Unix operating system.

  • Automated Control for Risk Assessment on Unix Operating System - I
    International Journal of Security Privacy and Trust Management, 2013
    Co-Authors: P. K. Patra, P. L. Pradhan
    Abstract:

    Control and risk are two sides of the same coin. Risk management is the process of identifying vulnerabilities and threats to operating system resources that affect business objectives, and deciding what countermeasures to take to reduce risk to the lowest level. The increased use of computer and communications systems by IT industries has increased the risk of theft of proprietary information. Cryptographic control (encryption) is a primary method of protecting system resources. Automated control is arguably the most important aspect of communications security, and is becoming increasingly important as a basic building block of computer security. Automated control is inversely proportional to risk, while control is directly proportional to the quality of the standard (S). Automated control provides accountability for individuals who access sensitive information in applications, system software, servers, and networks. This accountability is accomplished through access control mechanisms that require identification, authentication, authorization, non-repudiation, availability, reliability, and integrity, supported by the audit function. We develop a Java-based automated control for risk optimization on the Unix operating system.

  • Hardening of Unix Operating System
    2001
    Co-Authors: P. K. Patra, P. L. Pradhan
    Abstract:

    Operating system hardening is the process of addressing security weaknesses in operating systems by applying the latest patches, hotfixes, and updates, and by following specific procedures and policies to reduce attacks and system downtime. Hardening is not a one-time activity; it is an ongoing task to mitigate risk and sustain a high quality of computing. To build a secure production server, we must remove unwanted devices, fix misconfigurations, disallow default settings, enhance the current configuration, develop new system programs, and apply new security patches before moving to the production environment. Hardening of the operating system should support high integrity, reliability, availability, privacy, scalability, and confidentiality at the lowest level of risk, to achieve the highest benefits from the organization's critical IT infrastructure. Safeguarding information and protecting the integrity of networks and systems are vital to business. IT security professionals in many companies have established policies applicable to their entire organization, but it may be up to the individual departments that manage the systems to implement security in accordance with those policies; security professionals recognize the need for flexibility in implementation, due to each department's unique requirements. Hardening an operating system involves removing all non-essential tools, utilities, and other system administration options, any of which could ease an attacker's path into the system. Following this, the hardening process ensures that all appropriate security features are activated and configured correctly; 'out of the box' systems are typically set up for ease of access, with open access to the administrator account. Some vendors have now recognized that a market exists for OS-hardened systems. This thesis focuses in particular on hardening the Unix Sun Solaris operating system.