The Data Optimization Module, also known as PARROT, is a tool within Thermo-Calc Console Mode used to optimize model parameters during the development of CALPHAD databases.
At Thermo-Calc Software, we develop our databases using the CALPHAD methodology, a four-step process that relies on the rigorous capture and assessment of thermodynamic, kinetic, and other property data for material systems. Once the data is captured and assessed, the chosen model parameters are optimized to fit it using the PARROT Module, which is included in the Console Mode of both Thermo-Calc and the Diffusion Module (DICTRA).
Figure 1. Our databases are developed using the proven CALPHAD methodology, as shown in the graphic. The PARROT Module, included in Thermo-Calc, is used mainly during the Optimization step, but is also important in the Assessment step.
PARROT is an integrated module of both Thermo-Calc and the Diffusion Module (DICTRA). Its objective is to provide functionality for the optimization of thermodynamic, kinetic, or, more recently, thermophysical model parameters. The optimization is performed by fitting model parameters to large numbers of experimental observations of quantities describing equilibrium states or dynamic processes in multicomponent heterogeneous systems. PARROT also lets the user interactively enter and modify phase descriptions, model connections, basic thermodynamic parameters, and so on.
Free Pure Elements Database for Compatibility
For different thermodynamic assessments to be compatible with each other, the Scientific Group Thermodata Europe (SGTE) has established a set of reference data for pure elements (i.e., unary systems), which it recommends for all new assessments. The SGTE Pure Element Database is provided for free with Thermo-Calc, where it is referred to as PURE5. Taking the data for pure elements from this database also reduces the number of optimizing parameters: only binary and higher-order parameters then need to be determined using PARROT.
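To give a concrete picture of what those binary parameters are, the following is a minimal Python sketch of a Redlich-Kister expansion, the standard CALPHAD form for the excess Gibbs energy of a binary solution phase. The parameter values here are invented for illustration; in a real assessment they are exactly the kind of quantities left to be optimized in PARROT once the unary data is fixed by the SGTE database.

```python
def excess_gibbs(x_b, params):
    """Molar excess Gibbs energy (J/mol) of a binary A-B phase from a
    Redlich-Kister expansion:

        G_ex = x_A * x_B * sum_v L_v * (x_A - x_B)**v

    params is the list [L0, L1, ...] of binary interaction parameters.
    """
    x_a = 1.0 - x_b
    return x_a * x_b * sum(L * (x_a - x_b) ** v for v, L in enumerate(params))

# Example with made-up regular (L0) and subregular (L1) parameters.
# At the equiatomic composition the L1 term vanishes, so this gives
# 0.25 * L0 = -5000.0 J/mol.
print(excess_gibbs(0.5, [-20000.0, 5000.0]))  # -5000.0
```

In practice each L_v may itself be temperature dependent (e.g. a + b*T), which simply adds more coefficients to the optimization.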
When working on database development, after assigning models to phases, as described on the Assessment of Thermodynamic Data page, the undetermined (i.e., optimizing) parameters of the models are fitted to the input data captured in the first step. The selected input data (experimental data and first-principles calculation results) is written in POLY-3 syntax in a so-called POP file. This file is read into the PARROT Module, which optimizes the model parameters against this input data. The optimization demands extensive human judgement at different stages, mainly because the optimizing parameters of all phases should be consistent with each other. In other words, the modeling task is a multi-objective optimization with constraints, analogous to training multiple models in a machine learning context.
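The fitting step can be sketched in a few lines of Python. Because the excess Gibbs energy is linear in the Redlich-Kister parameters, a weighted least-squares fit of L0 and L1 to a set of (composition, measured G_ex) observations reduces to solving 2x2 normal equations. This is only a toy stand-in for what PARROT does (PARROT handles many data types, constraints, and temperature-dependent parameters), and all data below are synthetic:

```python
def fit_binary_params(data, weights=None):
    """Fit L0, L1 in G_ex = xA*xB*(L0 + L1*(xA - xB)) by weighted least
    squares. data is a list of (x_b, measured_g_ex); returns (L0, L1)."""
    if weights is None:
        weights = [1.0] * len(data)
    # Accumulate the 2x2 normal equations; the two "basis functions" are
    # c1 = xA*xB (multiplies L0) and c2 = xA*xB*(xA - xB) (multiplies L1).
    s11 = s12 = s22 = b1 = b2 = 0.0
    for (x_b, g), w in zip(data, weights):
        x_a = 1.0 - x_b
        c1 = x_a * x_b
        c2 = c1 * (x_a - x_b)
        s11 += w * c1 * c1
        s12 += w * c1 * c2
        s22 += w * c2 * c2
        b1 += w * c1 * g
        b2 += w * c2 * g
    det = s11 * s22 - s12 * s12
    return ((s22 * b1 - s12 * b2) / det, (s11 * b2 - s12 * b1) / det)

# Synthetic "POP-file" observations generated from L0=-20000, L1=5000.
obs = [(x, (1 - x) * x * (-20000.0 + 5000.0 * (1 - 2 * x)))
       for x in (0.2, 0.4, 0.6, 0.8)]
L0, L1 = fit_binary_params(obs)
print(round(L0), round(L1))  # recovers -20000 5000
```

The weights play the same role as the experimental uncertainties assigned to each equilibrium in a POP file: data the assessor trusts more pulls harder on the fit.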
Traceability is important in any CALPHAD-type project, but especially when the objective is to develop a multicomponent database. It is crucial to keep track of every change or addition made. Using a version control system is highly recommended. It is also recommended that detailed descriptions/comments are made at every change and update.
All files used in the assessment should be stored in a way that is traceable in case a re-assessment is needed, for instance because new experimental information becomes available. These files are of many different types: setup files (text files containing model information), POP files (text files containing the selected input data), PAR files (the binary files where optimization is performed), EXP files (text files for graphic comparison with experimental data), TDB files (text files with the current state of the database under optimization), various other result files, and more.
Once the optimization is complete, the Gibbs energy functions with their optimized free parameters are stored in a text file, or so-called database, with a format that is readable by Thermo-Calc. The database can now be loaded into Thermo-Calc and used to make predictions.
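For illustration, a parameter stored in such a database file (the TDB format) looks roughly like the fragment below. The phase constitution and numerical values are invented; only the general line structure (temperature range, expression in T, terminating `!`) reflects the format:

```
 PHASE LIQUID % 1 1.0 !
 CONSTITUENT LIQUID :A,B: !
 PARAMETER G(LIQUID,A,B;0) 298.15 -20000+5*T; 6000 N !
```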
Validation and Consistency
The final step in developing a CALPHAD database is to validate its predictions against experimental results. The first stage of validation comes with the optimization itself: when the fit to binary, ternary, and sometimes quaternary experimental data is good enough, the result is considered validated. What “good enough” means in this context is up to the humans involved, but there are tools in PARROT that help with this judgment. However, good descriptions of binary and ternary sub-systems are not always enough.
When developing a multicomponent database for a specific alloy system (Fe/Steels, Ni-base Superalloys, Al-, Ti-, Mg- alloys, and so on), validation from commercial and other multicomponent alloys is of great importance. If the agreement with real multicomponent commercial alloys is not good for key data points, a re-optimization of one or more lower order systems is the only way to rectify this. A systematic evaluation of the agreement between calculations and experiments for real multicomponent alloys can also give good guidance on areas to improve and where more experimental information is desirable.
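A systematic evaluation of this kind usually comes down to a simple agreement metric over many alloys. As a sketch, the root-mean-square deviation between calculated and measured solvus temperatures could be computed as below; all temperatures are invented:

```python
import math

def rms_deviation(pairs):
    """Root-mean-square deviation between calculated and measured values,
    given a list of (calculated, measured) pairs."""
    return math.sqrt(sum((c - m) ** 2 for c, m in pairs) / len(pairs))

# (calculated K, measured K) for four hypothetical multicomponent alloys
solvus = [(1423.0, 1430.0), (1385.0, 1379.0),
          (1412.0, 1415.0), (1398.0, 1391.0)]
print(round(rms_deviation(solvus), 1))  # 6.0
```

Tracking such a metric per alloy family highlights which lower-order systems pull the predictions off and where new experiments would help most.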
It is recommended that such validation is performed after every update of the database to ensure continuing good agreement between experiments and calculations. All of our major thermodynamic and properties databases include a collection of validation examples, which you can find on their respective pages in the database section. You can also read the Ni-based Superalloys Database Examples Collection as an example.
Figure 2. Calculated precipitate solvus temperature for various Ni-base superalloys compared with literature data. The calculation is included in the collection of examples used to validate our Ni-based superalloys database, TCNI.
The content of the above paragraphs is written with the evaluation of the thermodynamics of alloy systems in mind. Most of it is, however, equally valid for the optimization of mobility parameters and other additional properties such as molar volume, viscosity, thermal conductivity, and electrical resistivity.
For the optimization and assessment of mobility (diffusion) parameters, there is a special version of PARROT called DIC_PARROT that should be used. It is available to users who have a license for the Diffusion Module (DICTRA).
For more information about PARROT, consult the Data Optimization User Guide for Thermo-Calc and the Diffusion Module (DICTRA).