Corporate Firewall

14,000,000 Leading Edge Experts on the ideXlab platform

Scan Science and Technology

Contact Leading Edge Experts & Companies

The Experts below are selected from a list of 489 Experts worldwide ranked by ideXlab platform

Jim Euchner - One of the best experts on this subject based on the ideXlab platform.

  • The Uses and Risks of Open Innovation: User Innovation Is a More Radical Approach to Open Innovation That Can Challenge the Fundamental Assumptions about the Nature of Business and the Role of the Business Model
    Research-technology Management, 2013
    Co-Authors: Jim Euchner
    Abstract:

Open innovation and user innovation are often conflated. In this article, excerpted from a chapter that first appeared in Chance and Intent, edited by David Bodde and Caron St. John, Jim Euchner discusses the continuum between open innovation (what he calls "open-boundary innovation") and user innovation (or "open-source innovation"). Open innovation is a broad term. It encompasses everything from Procter & Gamble's Connect + Develop program to major open-source software initiatives and an increasing number of variants in between. The approaches all have in common the notion that innovation can be accelerated if companies break down the traditional boundaries of the corporation so that "valuable ideas can come from inside or outside the company, and can go to market from inside or outside the company, as well" (Chesbrough 2003). They differ in the degree to which this opening up happens within the traditional business model or moves beyond it and begins to shape the notion of the corporation itself. Current approaches to open innovation can be grouped into two general categories: open-boundary innovation and open-source innovation (Euchner 2010). Open-boundary innovation describes initiatives, like those advocated by Chesbrough, that are designed to source new technology and concepts broadly, seeking the seeds of the next innovation both within and outside of the Corporate Firewall (Slowinski and Sagal 2010). Control of the innovation process itself remains within the firm, which defines priorities, chooses how to source the innovations to support them, selects providers, and integrates resulting developments into its product roadmap. Open-boundary innovation stretches the role of R&D in important ways, but it operates within the current management paradigm. Open-source innovation, by contrast, is a more radical model that challenges fundamental assumptions about the nature of the business.
At its roots, it locates the source of much innovation in the collective knowledge and motivation of users. In an open-source initiative, a large, and largely anonymous, community of users and innovators not only generates ideas but develops products, as well. Because community members are the source of innovation, they also govern its process and benefit from its results. It is users, acting both individually and as a community, who decide what gets worked on. Open innovation in this context means open governance and open direction, and so moves beyond the current business model. Open source is increasingly important in the development of everything from software to prosthetics, from sporting equipment to bioengineering. Although it emerges from similar roots, it is based on an entirely different management paradigm than open-boundary innovation. But whatever risks such new paradigms might hold, established companies ignore open-source innovation at their peril. Economic research indicates that open-source innovation may dominate corporate innovation in a steadily increasing number of fields as the costs of communication and collaboration continue to fall (Baldwin and von Hippel 2009). One of the central differences between open-boundary and open-source innovation concerns the understanding and management of intellectual property (IP). In open-boundary innovation, control of IP remains a critical part of the management model. In the open-source model, there is no owned IP. Anyone can freely access, use, modify, and build upon the base IP. Open-boundary innovation is important because it can accelerate the process of innovation within firms. Open-source innovation is important for more fundamental reasons: 1. It can radically change the economics of innovation by redistributing its costs and benefits. 2. It can shift the basis for competitive advantage in an industry by creating competing business models in some circumstances. 3. …

  • Two Flavors of Open Innovation
    Research-Technology Management, 2010
    Co-Authors: Jim Euchner
    Abstract:

Since Henry Chesbrough published Open Innovation (2003), the paradigm he described has been a subject of great interest and experimentation in corporations. Chesbrough defined open innovation as breaking down the boundaries of the corporation so that "valuable ideas can come from inside or outside the company and can go to market from inside or outside the company, as well." He contrasted this open paradigm with the more traditional closed innovation paradigm based on the captive R&D laboratory. Chesbrough's work encouraged companies to create porous innovation pipelines and to become more aggressive about licensing, working with start-up companies, spinning out concepts that don't fit with the core business, and partnering with other organizations to produce innovations. These approaches have created increased value for firms as diverse as P&G and GE, but they may be only the start of the redefinition of innovation. The emergence of open-source intellectual property (IP) and online communities for innovation and customer input is forcing continued rethinking. Open innovation approaches are designed to source new technology and concepts broadly, seeking the seeds of the next innovation both within and outside of the Corporate Firewall (see, for example, Slowinski et al. 2009). Such initiatives are often supported by companies like Innocentive or Gen3 Partners, which help to frame the problem, connect the firm with external sources of expertise, and manage the resulting IP. Control of the IP is a critical part of the management model. Similarly, control of the innovation process itself remains with the firm, which defines priorities, chooses how to source them, selects providers, and integrates them into its product roadmap. Open innovation stretches the role of R&D in important ways, but it operates within the current management paradigm.
Open-source innovation, on the other hand, redefines the corporation itself. Two critical factors distinguish the approaches: the treatment of intellectual property and control of the direction of innovation. Open-source innovation is a more radical model that is increasingly important in the development of everything from software to sports equipment. Economic research indicates that it may soon dominate corporate innovation in a steadily increasing number of fields. It is best known today in software development, where open-source software projects such as Linux and Apache are both communities and platforms that enable users to develop and share the code they need. In the open-source software model, there is no owned IP. Anyone can access, use, and modify the code. A large, and largely anonymous, crowd contributes to the development of the software. Although there are governance structures for deciding which code is incorporated into which release of the software, it is users, acting both individually and as a community, who decide what gets worked on. The users, therefore, dictate the direction of the product. Open innovation in this context means open governance, open IP, open direction. Open-source innovation requires three large changes in corporate innovation thinking, each of which is difficult. First, it requires that firms take a modified view of IP, trading patent control for other sources of competitive advantage (speed, customer intimacy, voluntary contributions to the product). …

Avishai Wool - One of the best experts on this subject based on the ideXlab platform.

  • trends in Firewall configuration errors measuring the holes in swiss cheese
    IEEE Internet Computing, 2010
    Co-Authors: Avishai Wool
    Abstract:

The first quantitative evaluation of the quality of Corporate Firewall configurations appeared in 2004, based on Check Point Firewall-1 rule sets. In general, that survey indicated that Corporate Firewalls often enforced poorly written rule sets. This article revisits the first survey. In addition to being larger, the current study includes configurations from two major vendors. It also introduces a Firewall complexity measure. The study's findings validate the 2004 study's main observations: Firewalls are (still) poorly configured, and a rule set's complexity is (still) positively correlated with the number of detected configuration errors. However, unlike the 2004 study, the current study doesn't suggest that later software versions have fewer errors.

  • Trends in Firewall Configuration Errors -- Measuring the Holes in Swiss Cheese
    2010
    Co-Authors: Avishai Wool
    Abstract:

Security experts generally agree that Corporate Firewalls often enforce poorly written rule sets. This article revisits a 2004 survey of Corporate Firewall configurations that quantified the extent of this issue. In addition to being much larger, the current study includes configurations from two major vendors. It also introduces a new Firewall complexity measure that applies to both types of Firewalls. The study’s findings validate the 2004 study’s main observations: Firewalls are (still) poorly configured, and a rule set’s complexity is (still) positively correlated with the number of detected configuration errors. However, unlike the 2004 study, the current study doesn’t suggest that later software versions have fewer errors.

  • Firewall Configuration Errors Revisited
    arXiv: Cryptography and Security, 2009
    Co-Authors: Avishai Wool
    Abstract:

The first quantitative evaluation of the quality of Corporate Firewall configurations appeared in 2004, based on Check Point Firewall-1 rule-sets. In general, that survey indicated that Corporate Firewalls were often enforcing poorly written rule-sets containing many mistakes. The goal of this work is to revisit the first survey. The current study is much larger. Moreover, for the first time, the study includes configurations from two major vendors. The study also introduces a novel "Firewall Complexity" (FC) measure that applies to both types of Firewalls. The findings of the current study indeed validate the 2004 study's main observations: Firewalls are (still) poorly configured, and a rule-set's complexity is (still) positively correlated with the number of detected risk items. Thus we can conclude that, for well-configured Firewalls, "small is (still) beautiful". However, unlike the 2004 study, we see no significant indication that later software versions have fewer errors (for either vendor).

  • Firewall Configuration Errors Revisited
    2009
    Co-Authors: Avishai Wool
    Abstract:

Practically every corporation that is connected to the Internet uses Firewalls as the first line of its cyber-defense. However, the protection that these Firewalls provide is only as good as the policy they are configured to implement. The first quantitative evaluation of the quality of Corporate Firewall configurations appeared in 2004, based on Check Point Firewall-1 rule-sets. In general, that survey indicated that Corporate Firewalls were often enforcing poorly written rule-sets containing many errors. One important finding was that high rule-set complexity was positively correlated with the number of detected configuration errors. Another finding was an indication that rule-sets from later software versions had slightly fewer errors. The goal of this work is to revisit the first survey and to test whether its findings remain valid. The current study is much larger and is based on newer data, collected from Firewalls running later software versions. Furthermore, for the first time, the study includes configurations from two major vendors: both Check Point Firewalls and Cisco PIX Firewalls. Finally, the study considers three times as many possible configuration errors, consisting of 36 vendor-neutral errors instead of the 12 used in the 2004 study. In order to compare the complexity of configurations from different vendors, this work also introduces a novel "Firewall Complexity" (FC) measure that applies to both vendors' Firewalls.
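The studies above report that rule-set complexity is positively correlated with the number of configuration errors. The abstracts do not reproduce the exact FC formula, so the sketch below is only an illustration of the idea: a toy score, in the spirit of the 2004 study, that grows with the number of rules, the number of network objects they reference, and the number of interface pairs the firewall mediates between. The function name and the specific weighting are hypothetical.

```python
# Illustrative sketch only: the papers define a vendor-neutral
# "Firewall Complexity" (FC) measure, but the exact formula is not
# given in these abstracts. This toy score captures the intuition
# that complexity grows with rule count, object count, and the
# number of interface pairs traffic can cross.

def complexity_score(num_rules: int, num_objects: int, num_interfaces: int) -> int:
    """Toy rule-set complexity score (hypothetical, for illustration)."""
    # Each unordered pair of interfaces is a potential traffic path
    # the policy must account for.
    interface_pairs = num_interfaces * (num_interfaces - 1) // 2
    return num_rules + num_objects + interface_pairs

# A small DMZ firewall versus a sprawling corporate one:
small = complexity_score(num_rules=20, num_objects=15, num_interfaces=3)    # 20 + 15 + 3  = 38
large = complexity_score(num_rules=600, num_objects=400, num_interfaces=6)  # 600 + 400 + 15 = 1015
```

Under the studies' "small is (still) beautiful" finding, the configuration with the higher score would be expected to contain more detected configuration errors.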

Wu Chou - One of the best experts on this subject based on the ideXlab platform.

  • IEEE CLOUD - SaaS Integration for Software Cloud
    2010 IEEE 3rd International Conference on Cloud Computing, 2010
    Co-Authors: Feng Liu, Weiping Guo, Zhi Qiang Zhao, Wu Chou
    Abstract:

Software as a Service (SaaS) has been adopted at a fast pace for applications and services on software clouds. However, the success of SaaS in the software cloud cannot obscure the integration challenges faced by developers and enterprise infrastructure IT. Among those challenges, Firewall/NAT traversal and security issues often pose a serious bottleneck, as enterprises may not be entirely comfortable running mission-critical applications outside the Corporate Firewall. On the other hand, SaaS applications in the cloud need to access enterprise on-premise applications for data exchange and on-premise services. The current approaches of opening special pin-holes on the Firewall or using dedicated VPNs have a number of limitations and drawbacks. This paper presents a Proxy-based Firewall/NAT traversal solution for SaaS integration (PASS). It allows SaaS applications to integrate with on-premise applications without Firewall reconfiguration, while maintaining the security of on-premise applications. In addition, this approach is platform and application independent, making the SaaS integration seamless. Moreover, PASS is consistent with the enterprise web browsing infrastructure, and it requires little or no change to enterprise Firewall/NAT configurations. In this paper, we present the architecture of PASS and address SaaS integration challenges in the software cloud, such as security/Firewall, performance, and scalability. An experimental study based on our implemented system shows that PASS is a promising approach to resolving Firewall/NAT traversal for SaaS integration with on-premise services.
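The key move behind proxy-based traversal of this kind is that the component inside the corporate firewall initiates an outbound connection to the cloud proxy, and the cloud side then reuses that channel to deliver requests inward, so no inbound pin-hole or VPN is needed. The abstract does not describe the PASS protocol itself, so the sketch below is a generic illustration of that reverse-connection pattern; all names are hypothetical, and in-process queues stand in for the network channel.

```python
import queue

# Generic sketch of the reverse-connection idea behind proxy-based
# Firewall/NAT traversal (NOT the PASS protocol itself; names are
# hypothetical). The on-premise agent opens an *outbound* channel to
# the cloud proxy; since the connection originates inside the
# corporate firewall, it passes through like ordinary web traffic.
# The cloud proxy then pushes SaaS requests down that channel.

requests_to_agent = queue.Queue()    # cloud -> agent, over the agent's outbound channel
responses_to_cloud = queue.Queue()   # agent -> cloud, same channel in reverse

def on_premise_service(path: str) -> str:
    """Stand-in for an internal enterprise application behind the firewall."""
    return f"internal data for {path}"

def agent_poll_once() -> None:
    """On-premise agent: pull one relayed request and answer it locally."""
    path = requests_to_agent.get()
    responses_to_cloud.put(on_premise_service(path))

def cloud_proxy_fetch(path: str) -> str:
    """Cloud proxy: relay a SaaS request through the agent's open channel."""
    requests_to_agent.put(path)
    agent_poll_once()  # in reality the agent runs continuously behind the firewall
    return responses_to_cloud.get()

print(cloud_proxy_fetch("/hr/records"))  # -> internal data for /hr/records
```

Because the agent only ever makes outbound requests, this pattern is compatible with typical enterprise web-browsing infrastructure, which is the property the abstract highlights for PASS.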

Feng Liu - One of the best experts on this subject based on the ideXlab platform.

  • IEEE CLOUD - SaaS Integration for Software Cloud
    2010 IEEE 3rd International Conference on Cloud Computing, 2010
    Co-Authors: Feng Liu, Weiping Guo, Zhi Qiang Zhao, Wu Chou
    Abstract:

Software as a Service (SaaS) has been adopted at a fast pace for applications and services on software clouds. However, the success of SaaS in the software cloud cannot obscure the integration challenges faced by developers and enterprise infrastructure IT. Among those challenges, Firewall/NAT traversal and security issues often pose a serious bottleneck, as enterprises may not be entirely comfortable running mission-critical applications outside the Corporate Firewall. On the other hand, SaaS applications in the cloud need to access enterprise on-premise applications for data exchange and on-premise services. The current approaches of opening special pin-holes on the Firewall or using dedicated VPNs have a number of limitations and drawbacks. This paper presents a Proxy-based Firewall/NAT traversal solution for SaaS integration (PASS). It allows SaaS applications to integrate with on-premise applications without Firewall reconfiguration, while maintaining the security of on-premise applications. In addition, this approach is platform and application independent, making the SaaS integration seamless. Moreover, PASS is consistent with the enterprise web browsing infrastructure, and it requires little or no change to enterprise Firewall/NAT configurations. In this paper, we present the architecture of PASS and address SaaS integration challenges in the software cloud, such as security/Firewall, performance, and scalability. An experimental study based on our implemented system shows that PASS is a promising approach to resolving Firewall/NAT traversal for SaaS integration with on-premise services.

Grant Fritchey - One of the best experts on this subject based on the ideXlab platform.

  • SQL Server Private Cloud
    Beginning SQL Server 2012 Administration, 2012
    Co-Authors: Robert E. Walters, Grant Fritchey
    Abstract:

Today, customers are heavily embracing virtualization technology. In the previous chapter, we discussed the various cloud deployments, including Infrastructure as a Service (IaaS). IaaS gives IT the ability to quickly spin up an operating system environment in a fraction of the time it would normally take to procure new hardware and install an operating system and applications. Private cloud computing shares many characteristics with public cloud computing. For one thing, both are elastic. By “elastic,” we are referring to the capability to quickly and easily spin up and tear down new operating system environments. The big difference between public and private clouds is that private clouds operate within a Corporate Firewall and sometimes leverage existing hardware. Private clouds are shielded from the theoretical insecurities of the Internet. For this reason, many companies are quicker to adopt a private cloud infrastructure. In fact, according to the online article “Sizing of the Datacenter” located at www.ctoedge.com/content/sizing-state-data-center?slide=13, the Association of Data Management Professionals (www.afcom.com) estimates that 70% of customers are planning or currently implementing a private cloud-based solution.