Model-driven engineering (MDE), in general, and Domain-Specific Languages (DSLs), in particular, are increasingly being used to manage the complexity of developing applications in various domains.
Although many DSL benefits are qualitative, quantitative demonstrations of those benefits are needed to simplify comparison and evaluation. This paper describes how we conducted a productivity analysis for the Distributed Quality-of-Service (QoS) Modeling Language (DQML).
Our analysis shows (1) the significant productivity gain using DQML compared with alternative methods when configuring application entities and (2) the viability of quantitative productivity metrics for DSLs.
DISTRIBUTED QOS MODELING LANGUAGE
The Distributed QoS Modeling Language (DQML) is a DSL that addresses key inherent and accidental complexities of ensuring semantically compatible QoS policy configurations for publish/subscribe (pub/sub) middleware. DQML initially focused on QoS policy configurations for the Data Distribution Service (DDS), a pub/sub middleware standard defined by the Object Management Group and summarized in Sidebar 1, though the approach can be applied to other pub/sub technologies. DQML was developed using the Generic Modeling Environment (GME), a metaprogrammable environment for developing DSLs. This section provides an overview of DQML's structure and functionality.
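To make "semantically compatible QoS policy configurations" concrete, the sketch below illustrates the request/offered (RxO) compatibility rule DDS applies to the deadline QoS policy: a DataWriter's offered deadline period must be no longer than a DataReader's requested period, or the endpoints will not communicate. This is an illustrative Python sketch, not DQML itself; the class and function names are hypothetical.

```python
# Illustrative sketch of DDS request/offered (RxO) compatibility for the
# deadline QoS policy. Names here are hypothetical, not DQML's actual API.
from dataclasses import dataclass

@dataclass
class DeadlineQos:
    period_sec: float  # update period in seconds

def deadline_compatible(offered: DeadlineQos, requested: DeadlineQos) -> bool:
    """RxO check: the writer must offer a deadline at least as strict
    (i.e., a period no longer) than the reader requests."""
    return offered.period_sec <= requested.period_sec

# A writer updating every 1s satisfies a reader that needs data every 2s...
print(deadline_compatible(DeadlineQos(1.0), DeadlineQos(2.0)))  # True
# ...but not a reader that needs data at least every 0.5s.
print(deadline_compatible(DeadlineQos(1.0), DeadlineQos(0.5)))  # False
```

Catching such incompatibilities at modeling time, rather than at deployment, is the kind of accidental complexity DQML is designed to eliminate.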
A. Structure of the DQML Metamodel:
The DQML metamodel constrains the possible set of QoS policy configuration models that can be generated. The metamodel includes all 22 QoS policy types defined by DDS, as well as the DDS entity types that can have QoS policies associated with them.
Along with the entities described in Sidebar 1, the metamodel also includes support for domain participants, which create DDS entities within a particular domain, and domain participant factories, which are used to generate domain participants.
B. Functionality of DQML:
DQML allows users to incorporate an arbitrary number of DDS entity instances from the seven entity types supported (e.g., any number of data readers), as shown in Figure 1. DQML also allows users to specify an arbitrary number of DDS QoS policy instances (e.g., any number of deadline QoS policies). All DDS QoS policy parameters are supported along with the appropriate ranges of parameter values, as well as the default values.
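The sketch below suggests, under stated assumptions, what a DQML model captures: an arbitrary number of DDS entity instances, QoS policy instances initialized to the DDS-defined defaults (for deadline, the default period is effectively infinite), and associations between them. The class names are illustrative, not DQML's actual metamodel elements.

```python
# Hypothetical sketch of the kind of model DQML captures: entity instances,
# policy instances with DDS defaults, and entity-policy associations.
from dataclasses import dataclass, field

@dataclass
class DeadlinePolicy:
    # DDS defines a default for every policy parameter; for deadline,
    # the default period is effectively infinite.
    period_sec: float = float("inf")

@dataclass
class DataReader:
    name: str
    deadline: DeadlinePolicy = field(default_factory=DeadlinePolicy)

# An arbitrary number of entity and policy instances can be modeled:
readers = [DataReader(f"reader_{i}") for i in range(3)]
readers[0].deadline = DeadlinePolicy(period_sec=2.0)

print(readers[0].deadline.period_sec)  # 2.0 (explicitly associated policy)
print(readers[1].deadline.period_sec)  # inf (DDS default)
```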
DQML CASE STUDY: DDS BENCHMARKING ENVIRONMENT (DBE)
At least five different implementations of DDS are available, each with its own set of strengths and market discriminators. A systematic benchmarking environment is needed to objectively evaluate the QoS of these implementations. Such evaluations can also help guide the addition of new features to the DDS standard as it evolves.
DSL PRODUCTIVITY ANALYSIS
This section provides a taxonomy of approaches to developing quantitative productivity analysis for a DSL. It also presents a productivity analysis for DQML that evaluates implementing QoS configurations for the DBE.
A. Productivity Analysis Approach:
When analyzing productivity gains for a given DSL, analysts can employ several different types of strategies, such as
- Design development effort
- Implementation development effort
- Design quality
- Required developer experience
- Solution exploration
B. DQML Productivity Analysis:
Below we analyze the effect on productivity and the break-even point of using DQML as opposed to manually implementing QoS policy configurations for DBE. Although configurations can be designed using various methods, as outlined in previous work, manual implementation applies equally to these other design solutions, since they provide no guidance for implementation.
- Interpreter development
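A break-even analysis of this kind can be sketched as follows: the DSL's one-time cost (such as interpreter development) is amortized over configurations that are each cheaper to produce with the DSL than by hand. All numbers in the example are illustrative assumptions, not measurements from the DQML study.

```python
# Hedged sketch of a DSL break-even calculation. Cost units (e.g., hours)
# and the example figures are illustrative assumptions only.
import math

def break_even_configs(dsl_fixed_cost: float,
                       manual_cost_per_config: float,
                       dsl_cost_per_config: float) -> int:
    """Smallest number of configurations at which the DSL's total cost
    (fixed + per-config) is no greater than the total manual cost."""
    savings_per_config = manual_cost_per_config - dsl_cost_per_config
    if savings_per_config <= 0:
        raise ValueError("DSL never breaks even at these per-config costs")
    return math.ceil(dsl_fixed_cost / savings_per_config)

# E.g., a 100-hour interpreter effort, 5 hours per manual configuration,
# and 1 hour per DSL-generated configuration:
print(break_even_configs(100, 5, 1))  # 25
```

Past the break-even point every additional configuration is pure savings, which is why a conservative (minimal) scenario makes the analysis easy to extrapolate.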
CONCLUDING REMARKS
Although MDE and DSLs have become increasingly popular, quantitative evidence is needed to support the evaluation of DSLs. This paper described several approaches to quantitatively evaluating DSLs via productivity analysis. We applied one of these approaches to a case study involving the Distributed QoS Modeling Language (DQML). The following is a summary of the lessons learned from applying productivity analysis to DQML:
- Trade-offs and the break-even point for DSLs must be clearly understood and communicated. There are pros and cons to any technical approach, including DSLs. DSLs may not be appropriate for every case, and these cases must be evaluated to provide balanced and objective analysis.
- The context for DSL productivity analysis needs to be well defined. Broad generalizations that a DSL is "X" times better than some other technology are not particularly helpful for comparison and evaluation. A representative case study can provide a concrete context for productivity analysis.
- Provide analysis for as minimal or conservative a scenario as possible. Using a minimal scenario in productivity analysis allows developers to extrapolate to larger scenarios where DSL use will be justified.
DQML is available as open-source software and can be downloaded in GME's XML format along with supporting files from www.dre.vanderbilt.edu/~jhoffert/DQML/DQML.zip.
Source: Vanderbilt University
Authors: Joe Hoffert | Douglas C. Schmidt | Aniruddha Gokhale