Complex Mathematics

14,000,000 Leading Edge Experts on the ideXlab platform

Scan Science and Technology

Contact Leading Edge Experts & Companies

The Experts below are selected from a list of 261 Experts worldwide ranked by the ideXlab platform

Dave Crewe - One of the best experts on this subject based on the ideXlab platform.

L Urankar - One of the best experts on this subject based on the ideXlab platform.

  • Common compact analytical formulas for computation of geometry integrals on a basic cartesian sub-domain in boundary and volume integral methods
    Engineering Analysis With Boundary Elements, 1990
    Co-Authors: L Urankar
    Abstract:

    In both the boundary element method (BEM) and the volume integral method (VIM), the discretization of the material medium and the ansatz for the source function always lead to corresponding integrals on the basic sub-domain. These types of integrals are classified. It is shown that these integrals on a basic cartesian element (volume or boundary element) consist of the same few simple contour integrals multiplied by the model (ansatz) dependent coefficients of the source fields. These contour integrals are pure geometry integrals and involve an algebraic combination of only seven line integrals on the basic cartesian sub-domain. The line integrals consist of simple transcendental functions. Their computation can be performed analytically in a compact package of subroutines. As this computation depends only on the geometry, it can be effectively used to analyse a very wide range of field problems involving either vector or scalar fields and potentials, since the Complex Mathematics is relegated mainly to the subroutines of the package. Moreover, the analytical integration package can be easily incorporated into existing BIM/BEM and VIM algorithms, and thus permits time-saving computations. Some examples of practical applications are given.
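The reduction described above bottoms out in line integrals that evaluate to simple transcendental functions. As a minimal illustrative sketch (not Urankar's actual subroutine package), the edge integral of the 1/r kernel along a cartesian edge has the closed form ∫₀^L dt/√(t² + d²) = asinh(L/d), which the snippet below checks against brute-force quadrature; the function names and parameters are hypothetical.

```python
import math

def edge_integral_analytic(L, d):
    """Closed form of the geometry integral ∫_0^L dt / sqrt(t^2 + d^2).

    Evaluates to a single simple transcendental function, asinh(L/d),
    which is the kind of term the analytical subroutine package returns.
    """
    return math.asinh(L / d)

def edge_integral_numeric(L, d, n=200_000):
    """Trapezoidal-rule check of the same integral, for validation only."""
    h = L / n
    s = 0.5 * (1.0 / d + 1.0 / math.sqrt(L * L + d * d))
    for i in range(1, n):
        t = i * h
        s += 1.0 / math.sqrt(t * t + d * d)
    return s * h

print(edge_integral_analytic(2.0, 0.5))
```

Because the analytic form costs one transcendental-function call instead of a quadrature loop, relegating such integrals to a subroutine package yields the time savings the abstract describes.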

  • Common compact analytical formulas for computation of geometry integrals on a basic conic sub-domain in boundary and volume integral methods
    Archiv für Elektrotechnik, 1990
    Co-Authors: L Urankar
    Abstract:

    In both the boundary element (BEM) and the volume integral method (VIM), the discretization of the material medium and the ansatz for the source function always lead to corresponding integrals on the basic sub-domain. The types of these integrals are classified. It is shown that these integrals on a basic conic element (volume or boundary element) consist of the same few simple contour integrals multiplied by the model (ansatz) dependent coefficients of the source fields. These contour integrals are pure geometry integrals and involve an algebraic combination of only ten line integrals on the basic conic sub-domain. The line integrals consist of elliptic integrals of three kinds and/or simple transcendental functions. Their computation can be performed analytically in a compact package of subroutines. As this computation depends only on the geometry, it can be effectively used to analyse a very wide range of field problems involving either vector or scalar fields and potentials, since the Complex Mathematics is relegated mainly to the subroutines of the package. Moreover, the analytical integration package can be easily incorporated into existing BIM/BEM and VIM algorithms, and thus permits time-saving computations. Some examples of practical applications are given. Keywords: magnetic field computation; volume integral method; boundary integral method (BIM); boundary element method (BEM); magnetization; potentials.
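For the conic elements, the line integrals involve elliptic integrals rather than only elementary transcendentals. As an illustrative sketch only (the paper's actual subroutines are not reproduced here, and the function names are hypothetical), the complete elliptic integral of the first kind can be evaluated in a few lines via the arithmetic-geometric mean, K(k) = π / (2 · AGM(1, √(1 − k²))), which illustrates why such integrals are cheap enough to hide inside a compact subroutine package.

```python
import math

def agm(a, b, tol=1e-15):
    """Arithmetic-geometric mean of a and b; converges quadratically."""
    while abs(a - b) > tol * a:
        a, b = 0.5 * (a + b), math.sqrt(a * b)
    return 0.5 * (a + b)

def ellipk(k):
    """Complete elliptic integral of the first kind K(k), modulus convention,
    computed from the classical identity K(k) = pi / (2 * AGM(1, sqrt(1 - k^2)))."""
    return math.pi / (2.0 * agm(1.0, math.sqrt(1.0 - k * k)))

print(ellipk(0.0))  # K(0) = pi/2
```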

Patricia Melin - One of the best experts on this subject based on the ideXlab platform.

  • Comparative study of particle swarm optimization variants in Complex Mathematics functions
    Studies in Computational Intelligence, 2015
    Co-Authors: Juan Carlos Vazquez, Fevrier Valdez, Patricia Melin
    Abstract:

    © Springer International Publishing Switzerland 2015. Particle Swarm Optimization (PSO) is an evolutionary computation technique based on the social behaviors of bird flocking and fish schooling; it is a biologically inspired computational search and optimization method. Since it was first introduced by Kennedy and Eberhart ("A new optimizer using particle swarm theory", pp. 39–43, 1995 [1]), several variants of the original PSO have been developed to improve the speed of convergence, improve the quality of the solutions found, avoid getting trapped in local optima, and so on. This paper focuses on comparing different inertia weight approaches, such as constant, random adjustment, linearly decreasing, nonlinearly decreasing, and fuzzy particle swarm optimization. We use a set of four mathematical functions, widely used in this field of study, to validate our approach.
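The inertia weight schedules compared above can each be written as a one-line function of the iteration counter. The sketch below is hedged: the ranges used (w decreasing from 0.9 to 0.4, random w uniform on [0.5, 1.0]) are common choices in the PSO literature, not necessarily the authors' exact values, and the function names are hypothetical.

```python
import random

def w_constant(t, t_max, w=0.7):
    # Constant inertia weight, independent of the iteration t.
    return w

def w_random(t, t_max, rng=random):
    # Random adjustment: w drawn uniformly from [0.5, 1.0] each iteration.
    return 0.5 + rng.random() / 2.0

def w_linear(t, t_max, w_start=0.9, w_end=0.4):
    # Linearly decreasing from w_start at t=0 to w_end at t=t_max.
    return w_start - (w_start - w_end) * t / t_max

def w_nonlinear(t, t_max, w_start=0.9, w_end=0.4, p=2.0):
    # Nonlinear (power-law) decrease: steeper early, flatter late for p > 1.
    return w_end + (w_start - w_end) * (1.0 - t / t_max) ** p
```

A large early w favors exploration of the search space and a small late w favors exploitation, which is the trade-off the decreasing schedules encode.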

  • Comparative study of particle swarm optimization variants in Complex Mathematics functions
    Hybrid Intelligent Systems, 2013
    Co-Authors: Juan Carlos Vazquez, Fevrier Valdez, Patricia Melin
    Abstract:

    Particle Swarm Optimization (PSO) is an evolutionary computation technique based on the social behaviors of bird flocking and fish schooling; it is a biologically inspired computational search and optimization method. Since it was first introduced by Kennedy and Eberhart [7] in 1995, several variants of the original PSO have been developed to improve the speed of convergence, improve the quality of the solutions found, avoid getting trapped in local optima, and so on. This paper focuses on comparing different PSO variants, such as the full model, cognitive-only, social-only, inertia weight, and constriction factor. We use a set of four mathematical functions, widely used in this field of study, to validate our approach.
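The full-model velocity update (inertia plus cognitive plus social terms) can be sketched in a few lines. This is a minimal illustration on the sphere benchmark, not the authors' implementation; the parameter values (w = 0.7, c1 = c2 = 1.5, 30 particles) are common defaults from the literature and are assumptions here.

```python
import random

def sphere(x):
    # Classic benchmark function: global minimum 0 at the origin.
    return sum(xi * xi for xi in x)

def pso(f, dim=2, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=1):
    """Full-model PSO: v <- w*v + c1*r1*(pbest - x) + c2*r2*(gbest - x)."""
    rng = random.Random(seed)
    pos = [[rng.uniform(-5.0, 5.0) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                  # personal best positions
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]  # global best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            v = f(pos[i])
            if v < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], v
                if v < gbest_val:
                    gbest, gbest_val = pos[i][:], v
    return gbest, gbest_val

best, val = pso(sphere)
print(best, val)
```

Setting c1 = 0 gives the social-only variant and c2 = 0 the cognitive-only variant, which is how the compared variants relate to the full model above.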

Hanyue Zheng - One of the best experts on this subject based on the ideXlab platform.

  • APSEC - A Domain Experts Centric Approach to Formal Requirements Modeling and V&V of Embedded Control Software
    2019 26th Asia-Pacific Software Engineering Conference (APSEC), 2019
    Co-Authors: Weikai Miao, Yihao Huang, Jincao Feng, Hanyue Zheng
    Abstract:

    Formal methods are a promising solution for precise software requirements modeling and V&V (Validation and Verification). However, domain experts struggle to use formal notations grounded in Complex Mathematics to describe their domain-specific software requirements precisely. Meanwhile, the lack of systematic engineering methodologies that effectively encompass precise requirements modeling and rigorous requirements V&V still makes the application of formal methods in industry a big challenge. To tackle this challenge, we present a domain-experts-centric approach to formal requirements modeling and V&V in the domain of embedded control software. The major advancements of the approach are: 1) a domain-specific and systematic engineering approach to formal requirements specification construction and 2) a scenario-based requirements validation and verification technique. Specifically, the approach offers a domain-specific template for formal specification construction through a three-step specification evolution process. For formal requirements V&V, diagrams are derived from the formal specification, and the scenarios of concern to domain experts can be checked against these diagrams. These modeling and V&V technologies are coherently incorporated in the approach and fully automated by a supporting tool. We have applied the approach to real software projects of our industrial partners. The experimental results show that it significantly facilitates formal modeling and V&V in industry.

Michael Negnevitsky - One of the best experts on this subject based on the ideXlab platform.

  • Artificial intelligence: A guide to intelligent systems
    Artificial Intelligence, 2011
    Co-Authors: Michael Negnevitsky
    Abstract:

    3rd ed. "Artificial Intelligence is one of the most rapidly evolving subjects within the computing/engineering curriculum, with an emphasis on creating practical applications from hybrid techniques. Despite this, the traditional textbooks continue to expect mathematical and programming expertise beyond the scope of current undergraduates and focus on areas not relevant to many of today's courses. Negnevitsky shows students how to build intelligent systems drawing on techniques from knowledge-based systems, neural networks, fuzzy systems, evolutionary computation and now also data mining. The principles behind these techniques are explained without resorting to Complex Mathematics, showing how the various techniques are implemented, when they are useful and when they are not. No particular programming language is assumed and the book does not tie itself to any of the software tools available. However, available tools and their uses will be described and program examples will be given in MATLAB. The lack of assumed prior knowledge makes this book ideal for any introductory courses in artificial intelligence or intelligent systems design, while the contemporary coverage means more advanced students will benefit by discovering the latest state-of-the-art techniques."--Publisher's website. Introduction to knowledge -- Rule-based expert systems -- Uncertainty management in rule-based expert systems -- Fuzzy expert systems -- Frame-based expert systems -- Artificial neural networks -- Evolutionary computation -- Hybrid intelligent systems -- Knowledge engineering -- Data mining and knowledge discovery.

  • Uncertainty management in rule-based expert systems
    Artificial intelligence - A guide to intelligent systems, 2002
    Co-Authors: Michael Negnevitsky
    Abstract:

    Artificial Intelligence is one of the most rapidly evolving subjects within the computing/engineering curriculum, with an emphasis on creating practical applications from hybrid techniques. Despite this, the traditional textbooks continue to expect mathematical and programming expertise beyond the scope of current undergraduates and focus on areas not relevant to many of today's courses. Negnevitsky shows students how to build intelligent systems drawing on techniques from knowledge-based systems, neural networks, fuzzy systems, evolutionary computation and now also intelligent agents. The principles behind these techniques are explained without resorting to Complex Mathematics, showing how the various techniques are implemented, when they are useful and when they are not. No particular programming language is assumed and the book does not tie itself to any of the software tools available. However, available tools and their uses will be described and program examples will be given in Java. The lack of assumed prior knowledge makes this book ideal for any introductory courses in artificial intelligence or intelligent systems design, while the contemporary coverage means more advanced students will benefit by discovering the latest state-of-the-art techniques.

  • Frame-based expert systems
    Artificial intelligence - A guide to intelligent systems, 2002
    Co-Authors: Michael Negnevitsky
    Abstract:

    Artificial Intelligence is one of the most rapidly evolving subjects within the computing/engineering curriculum, with an emphasis on creating practical applications from hybrid techniques. Despite this, the traditional textbooks continue to expect mathematical and programming expertise beyond the scope of current undergraduates and focus on areas not relevant to many of today's courses. Negnevitsky shows students how to build intelligent systems drawing on techniques from knowledge-based systems, neural networks, fuzzy systems, evolutionary computation and now also intelligent agents. The principles behind these techniques are explained without resorting to Complex Mathematics, showing how the various techniques are implemented, when they are useful and when they are not. No particular programming language is assumed and the book does not tie itself to any of the software tools available. However, available tools and their uses will be described and program examples will be given in Java. The lack of assumed prior knowledge makes this book ideal for any introductory courses in artificial intelligence or intelligent systems design, while the contemporary coverage means more advanced students will benefit by discovering the latest state-of-the-art techniques.