2. Activation Energy Analysis (KAS) from Simulated TGA¶
In this example, the conversion data is generated by leveraging the modelling capabilities of FireSciPy.
An example with real experimental data is available separately.
This example is also available as a Jupyter Notebook.
Theoretical Background¶
The conversion rate \(\frac{d\alpha}{dt}\) of a single-step pyrolysis reaction can be described as:

\[\frac{d\alpha}{dt} = k(T)\, f(\alpha) \qquad (1)\]
It depends on the rate constant \(k(T)\) and the reaction model \(f(\alpha)\). Commonly, the rate constant is expressed as an Arrhenius equation (2), where \(A\) is the pre-exponential factor, \(E\) the activation energy and \(R\) the gas constant:

\[k(T) = A \exp\left(-\frac{E}{R\,T}\right) \qquad (2)\]
Together, \(A\), \(E\) and \(f(\alpha)\) form the kinetic triplet.
Kinetics computations aim to deduce the elements of this triplet from experimental data. Here, the focus is on estimating the activation energy \(E\) using an isoconversional method. These methods rely on the observation that, at a given level of conversion, the reaction rate depends only on the temperature. Taking the logarithmic derivative of the reaction rate at constant conversion, with \(k(T)\) from equation (2), leads to:

\[\left[\frac{\partial \ln\left(\frac{d\alpha}{dt}\right)}{\partial T^{-1}}\right]_{\alpha} = -\frac{E_{\alpha}}{R} \qquad (3)\]
At constant conversion, the reaction model \(f(\alpha)\) becomes a constant and drops out of the equation. This is why isoconversional methods are often called “model-free”. However, this does not mean that \(f(\alpha)\) can be ignored, it remains part of the kinetic triplet.
Under a constant heating rate program, there is no analytical solution and the temperature integral must be approximated. Data from multiple heating rates can be used to determine the temperature \(T_{\alpha}\) at a given conversion. One such approximation is the Kissinger–Akahira–Sunose (KAS) method (4), with \(B=2\) and \(C=1\):

\[\ln\left(\frac{\beta}{T_{\alpha}^{B}}\right) = \mathrm{Const} - C \cdot \frac{E_{\alpha}}{R\,T_{\alpha}} \qquad (4)\]
It has higher accuracy than other well-known methods like the Ozawa–Flynn–Wall (OFW) method. By default, FireSciPy uses the improved form of (4) from Starink, with \(B = 1.92\) and \(C = 1.0008\).
More details are available in the recommendations provided by the International Confederation for Thermal Analysis and Calorimetry (ICTAC) Kinetics Committee.
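The core of a KAS-type estimation, a linear fit of \(\ln(\beta/T^{B})\) against \(1/T\) at a fixed conversion level whose slope yields \(E_{\alpha}\), can be sketched with plain NumPy. The heating rates and temperatures below are hypothetical illustrative values, not data from this example:

```python
import numpy as np

R = 8.314            # Gas constant in J/(mol K)
B, C = 1.92, 1.0008  # Starink parameters (B=2, C=1 gives classical KAS)

# Hypothetical temperatures (K) reached at one fixed conversion level
# for a set of nominal heating rates (K/min) -- illustrative values only
betas = np.array([1.0, 3.0, 10.0, 30.0])
T_alpha = np.array([590.0, 610.0, 635.0, 660.0])

# Linear fit of ln(beta / T^B) against 1/T; the slope is -C * E / R
y = np.log(betas / T_alpha**B)
x = 1.0 / T_alpha
slope, intercept = np.polyfit(x, y, 1)

# Recover the activation energy estimate from the slope
E_est = -slope * R / C  # J/mol
```

With real data, one such fit is performed per conversion level, producing the curve \(E_{\alpha}(\alpha)\) shown later in this example.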
Synthetic TGA Data and KAS Estimation¶
FireSciPy provides functionalities to conduct reaction kinetics computations. In this example, the activation energy \(E\) is determined using the Kissinger–Akahira–Sunose (KAS) isoconversional method. Synthetic TGA data is used for this example, see also the section on modelling.
First, the necessary Python packages are imported, the plot colours are defined, and basic plot settings are configured.
# Import necessary packages
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
import FireSciPy as fsp

# Order of plot colors
plt_colors = ["tab:blue", "tab:orange", "tab:green",
              "tab:red", "tab:purple", "tab:brown",
              "tab:pink", "tab:gray", "tab:olive",
              "tab:cyan"]

# Global settings for plotting
plt.rcParams.update({
    'axes.axisbelow': True,     # Keep grid behind plots
    'figure.autolayout': True,  # Equivalent to calling tight_layout()
    'axes.facecolor': 'white',  # Prevents transparent background
    'grid.alpha': 0.6,          # Makes gridlines more readable
    'font.size': 12             # Set global font size
})
Then the basic parameters of the pyrolysis reaction are defined. The parameters are taken from Vyazovkin, Advanced Isoconversional Method, 1997.
# Reaction constants
A = 10**10 / 60 # 1/s
E = 125.4 * 1000 # J/mol
alpha0 = 1e-12 # Avoid exactly zero to prevent numerical issues (e.g., division by zero)
FireSciPy introduces an internal data structure for the pyrolysis computations. The goal of this data structure is to keep all the data organised. Furthermore, the different methods of FireSciPy are aware of the structure and can thus easily call the data they need and store their own results.
In this particular example, the reading and processing of experimental data is not discussed. Data series will be generated from modelling, therefore some steps are skipped. The generated data will be placed manually in the data structure in the locations where the processed experimental data would end up. A dedicated example with experimental data is available too.
Below, the skeleton of the data structure is initialized. It also allows tracking of metadata such as the investigated material, the experimental device used, and the person or institution performing the experiments.
The signal argument is used to distinguish between different types of data recorded in micro-scale experiments — for example, mass, heat flow, or heat release rate. This information enables FireSciPy to process different data types in a unified and consistent way.
# Initialise data structure for the kinetics assessment
data_structure = fsp.pyrolysis.kinetics.initialize_investigation_skeleton(
    material="Unobtainium",
    investigator="John Doe, Miskatonic University",
    instrument="ACME TGA 9000",
    date="Stardate: 42.69",
    notes="It has a colour out of space",
    signal={"name": "Mass", "unit": "mg"})
Basic settings for the temperature programs are defined below. For this example, multiple heating rates are chosen and stored in a Python dictionary, which makes it easy to keep track of the temperature programs.
Keep in mind that, as per the ICTAC recommendations, the range of heating rates should span at least a factor of 10 from the lowest to the highest. At the very least, 3 heating rates should be used for isoconversional computations; 5 or more are better. Basic isoconversional methods like KAS are also built on the assumption that the heating rate is perfectly linear. This assumption is captured here, and the heating rates are thus labelled “nominal”.
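These recommendations can be expressed as a small sanity check. The helper below is not part of FireSciPy, only an illustration:

```python
# Illustrative sanity check of a heating-rate set against the
# ICTAC recommendations (not a FireSciPy function)
def check_heating_rates(rates):
    """Return whether at least 3 rates are given and whether the
    highest rate is at least 10 times the lowest."""
    rates = sorted(rates)
    enough_rates = len(rates) >= 3
    wide_enough_span = rates[-1] / rates[0] >= 10
    return enough_rates, wide_enough_span

# The heating rates used in this example (K/min) satisfy both criteria
ok_count, ok_span = check_heating_rates([1, 3, 10, 15, 30, 45, 60])
```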
The temperature programs all start at \(300\ K\) and end at \(750\ K\). Means are provided to adjust the sampling frequency (temperature spacing \(\Delta T\)). A few different resolutions are predefined below and can easily be commented in and out to experiment with.
# Nominal heating rates
heating_rates = {
    "1Kmin": 1,
    "3Kmin": 3,
    "10Kmin": 10,
    "15Kmin": 15,
    "30Kmin": 30,
    "45Kmin": 45,
    "60Kmin": 60
}
# Data for temperature program (Kelvin)
T_start = 300
T_end = 750
# Set resolution for the temperature spacing (ΔT)
# resolution_factor = 0.5 # ΔT = 1.0 K
resolution_factor = 1 # ΔT = 0.5 K
# resolution_factor = 2 # ΔT = 0.25 K
# resolution_factor = 5 # ΔT = 0.1 K
# Number of points to be used in the arrays to be generated below
n_points = int(900 * resolution_factor) + 1
In the following code block, the individual temperature programs are created based on the nominal heating rates. This is accomplished with the function fsp.pyrolysis.modeling.create_linear_temp_program().

Looping over the above dictionary, a temperature program is created for each nominal heating rate. The time and temperature data series of each program are then extracted. They are used in the pyrolysis solver fsp.pyrolysis.modeling.solve_kinetics(), together with the n-th order reaction model. The computed conversion (\(\alpha\)) is added to the temperature program DataFrame, and the column headers of the DataFrame are renamed to match the expected values. The nominal values of the heating rates are stored in the data structure, as well as the DataFrame. This information would typically come from the experiments and would then be added differently, see the other example.
# Model the conversion and store it in the data structure
for hr_label in heating_rates:
    # Compute model heating rates
    beta = heating_rates[hr_label]
    hr_model = fsp.pyrolysis.modeling.create_linear_temp_program(
        start_temp=T_start,
        end_temp=T_end,
        beta=beta,
        beta_unit="K/min",
        steps=n_points)

    # Get temperature program
    t_array = fsp.utils.series_to_numpy(hr_model["Time"])
    T_array = fsp.utils.series_to_numpy(hr_model["Temperature"])

    # Get an overview of the temperature resolution
    ΔT = T_array[1] - T_array[0]
    print(f"Temperature resolution ({hr_label}): ΔT = {ΔT} K")

    # Compute conversion for decelerating reaction (n-th order)
    t_sol, alpha_sol = fsp.pyrolysis.modeling.solve_kinetics(
        t_array=t_array,
        T_array=T_array,
        A=A,
        E=E,
        alpha0=alpha0,
        R=fsp.constants.GAS_CONSTANT,
        reaction_model='nth_order',
        model_params={'n': 1.0}
    )

    # Convert to DataFrame
    hr_model = pd.DataFrame(hr_model)
    # Add new column with conversion data
    hr_model["alpha"] = pd.Series(alpha_sol, index=hr_model.index)
    # Rename column headers to match expected values
    hr_model.rename(columns={
        'Temperature': 'Temperature_Avg',
        'alpha': 'Alpha'},
        inplace=True)

    # Ensure nested structure to store data, because model
    # data is added manually here in this example and no data
    # processing from experimental TGA curves takes place
    keys = ["experiments", "TGA", "constant_heating_rate", hr_label]
    heating_rate = fsp.pyrolysis.kinetics.ensure_nested_dict(data_structure, keys)

    # Add nominal heating rates
    heating_rate["set_value"] = {"value": beta, "unit": "K/min"}

    # Add the modelled conversion data to the data structure
    heating_rate["conversion"] = hr_model
Next, the desired conversion fractions need to be specified. They are necessary to ensure that the linear fit of the KAS method compares appropriate temperatures for a given level of conversion. During the experiments, the data is recorded with a frequency that can be adjusted at the device. The chances that individual data points match up across many heating rates, and for all desired conversion levels, are very slim. Thus, interpolation of the input data is necessary. This interpolation is conducted with fsp.pyrolysis.kinetics.compute_conversion_levels(). As parameters, the data structure, the desired points for the analysis, and the setup need to be provided. There is also an option to compute the fractions for selected temperature programs or for all that are available.

Commonly, the conversion fractions range between \(0.05 < \alpha < 0.95\) or \(0.1 < \alpha < 0.9\). The reason is that towards the ends, changes in conversion are small over long times while the experimental noise is comparatively large, which leads to spurious results. However, the ends should not be rejected out of hand: the user needs to assess how significant the fluctuations are, so as not to neglect, for example, initial reactions. In this example, a wider range is chosen: \(0.01 < \alpha < 0.99\). Since the input data comes from modelling, the noise is low for the provided settings. It will be highlighted below that the noise nevertheless increases towards the ends.
# Define the conversion fractions at which to evaluate the activation energy
# Note: commonly, they range between (0.05 < α < 0.95) or (0.1 < α < 0.9)
# conversion_levels = np.linspace(0.05, 0.95, 37) # Δα = 0.025
conversion_levels = np.linspace(0.01, 0.99, 99) # Δα = 0.01
# Compute conversion fractions across all "experiments"
fsp.pyrolysis.kinetics.compute_conversion_levels(
    data_structure,
    desired_levels=conversion_levels,
    setup="constant_heating_rate",
    condition="all")
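Conceptually, this step interpolates the temperature at the desired conversion levels from the sampled \(\alpha(T)\) curve of each heating rate. A minimal sketch of the idea with np.interp, using a made-up monotonic conversion curve rather than FireSciPy's actual implementation:

```python
import numpy as np

# Discretely sampled, monotonically increasing conversion curve
# (made-up shape for illustration only)
T = np.linspace(500.0, 700.0, 201)              # sample temperatures (K)
alpha = 1.0 - np.exp(-((T - 500.0) / 80.0)**2)  # illustrative alpha(T)

# Desired conversion levels
levels = np.array([0.1, 0.5, 0.9])

# np.interp expects increasing x-coordinates, here the conversion values,
# and returns the interpolated temperatures at the desired levels
T_at_levels = np.interp(levels, alpha, T)
```

Repeating this for every heating rate yields, per conversion level, the set of \((1/T_{\alpha},\, \ln(\beta/T_{\alpha}^{B}))\) points that enter the KAS fit.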
The isoconversional activation energy can now be computed using the KAS implementation fsp.pyrolysis.kinetics.compute_Ea_KAS(). It simply needs to be provided with the data structure; all the necessary information should now be in the correct place.
By default, the KAS method implemented in FireSciPy uses the Starink improvement with the parameters \(B=1.92\) and \(C=1.0008\). For the classical KAS method, they can be changed to \(B=2\) and \(C=1\) (see the theoretical background above).
# Compute the activation energy using the KAS method
fsp.pyrolysis.kinetics.compute_Ea_KAS(data_structure, B=1.92, C=1.0008)
Finally, the results can be plotted: the activation energy as a function of conversion. The first and last \(5\%\) of the conversion range are highlighted in gray. Even with model data as input, it is observable that the noise increases towards the ends. Feel free to experiment with different sampling rates to see how they affect the result.
fig, ax = plt.subplots()
# Get the Ea and convert to kJ/mol.
Ea_results_KAS = data_structure["experiments"]["TGA"]["Ea_results_KAS"]
Ea = Ea_results_KAS["Ea"]/1000
Ea_avg = np.average(Ea)
# Plot the Ea against conversion.
conv = Ea_results_KAS["Conversion"]
plt.scatter(conv,
Ea,
marker=".", s=42,
facecolors="none",
edgecolors="tab:blue",
label=f"KAS, ΔT = {ΔT} K"
)
# Plot target value, i.e. model input
plt.plot([0,1], [125.4, 125.4],
color="black", linestyle="--",
label="Target, E$_a$=125.40 kJ/mol"
)
# Shaded areas to indicate first/last 5 % to be typically cut off
x_min = -0.025
x_max = 1.025
ax.axvspan(x_min, 0.05, color='gray', alpha=0.3)
ax.axvspan(0.95, x_max, color='gray', alpha=0.3)
# Plot meta data.
plt.title(f"Activation Energy (KAS), E$_a$={Ea_avg:.2f} kJ/mol (avg.)")
plt.xlabel("Conversion ($\\alpha$) / -")
plt.ylabel("Activation Energy (E$_a$) / kJ/mol")
plt.xlim(left=x_min, right=x_max)
plt.ylim(bottom=125.175, top=125.525)
# plt.ylim(bottom=123.75, top=127.25)
plt.tight_layout()
plt.legend(loc="upper center")
plt.grid()
# # Save image.
# plot_label = f"Ea_Estimate_KAS.png"
# plot_path = os.path.join(plot_label)
# plt.savefig(plot_path, dpi=320, bbox_inches='tight', facecolor='w')
Tracking of Intermediate Steps and Basic Statistics¶
FireSciPy keeps track of the intermediate steps during the kinetic analysis. Intermediate results are stored in the data structure and can be accessed to check the computation or share condensed results with others.
As an example, the conversion fractions for the different heating rates can easily be plotted as demonstrated below.
# Pre-select data sets for convenience
heating_rate_data = data_structure["experiments"]["TGA"]["constant_heating_rate"]
# Go over all available heating rate data
for hr_label in heating_rate_data:
    # Get conversion fractions data series
    conv_frac_data = heating_rate_data[hr_label]["conversion_fractions"]
    temp = conv_frac_data["Temperature_Avg"]
    alpha = conv_frac_data["Alpha"]

    # Plot conversion against sample temperature
    plt.plot(temp, alpha,
             linestyle='none', marker=".", ms=3,
             label=f"{hr_label[:-4]} K/min")
# Plot meta data
plt.title("Sampling Rate of Conversion Fractions")
plt.xlabel("Sample Temperature / K")
plt.ylabel("Conversion ($\\alpha$) / -")
plt.xlim(left=480, right=730)
plt.tight_layout()
plt.legend()
plt.grid()
# # Save image.
# plot_label = f"ConversionFractions_KAS.png"
# plot_path = os.path.join(plot_label)
# plt.savefig(plot_path, dpi=320, bbox_inches='tight', facecolor='w')
FireSciPy also keeps track of the individual fits conducted for the conversion fractions by the KAS method. This includes the parameters of the linear fit for each conversion level, as well as the respective coordinates involved in the fit. These intermediate results are stored together with the primary results of the activation energy, as shown below.
# Get the results of the activation energy
Ea_KAS = data_structure["experiments"]['TGA']["Ea_results_KAS"]
# Check results
Ea_KAS.head()
This information can be used to visualise the KAS process. Below, the data points and their respective fits are shown for a selection of conversion levels.

Note: Using the Starink improvement, the linear fit takes place in the space of \(\ln\left(\frac{\beta}{T^{B}}\right)\) against \(\frac{1}{T}\), with \(B=1.92\). If a different value of \(B\) is chosen (see above), the space changes and the user has to adjust the axis label accordingly.
# Pre-select data sets for convenience
Ea_KAS = data_structure["experiments"]['TGA']["Ea_results_KAS"]
# Define settings for the plots of the fits
marker_size = 42
fit_line = ":"
fit_alpha = 0.6
fit_color = "black"
# Choose conversion levels for the plot
conversion_levels = [0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9]
# Initialise data collection
conv_temps = np.zeros((len(conversion_levels), len(heating_rates)))
conv_levels = np.zeros((len(conversion_levels), len(heating_rates)))
for conv_idx, level in enumerate(conversion_levels):
    level_idx = np.abs(Ea_KAS["Conversion"] - level).argmin()

    # Get the data points of all seven heating rates for this conversion level
    conv_temps[conv_idx,:] = np.asarray(Ea_KAS.loc[:,"x1":"x7"].iloc[level_idx])
    conv_levels[conv_idx,:] = np.asarray(Ea_KAS.loc[:,"y1":"y7"].iloc[level_idx])

    # Indicate fits
    m_fit = Ea_KAS["m_fit"].iloc[level_idx]
    b_fit = Ea_KAS["b_fit"].iloc[level_idx]
    x_fit = [conv_temps[conv_idx,:][0], conv_temps[conv_idx,:][-1]]
    y_fit = [fsp.utils.linear_model(x_fit[0], m_fit, b_fit),
             fsp.utils.linear_model(x_fit[-1], m_fit, b_fit)]
    plt.plot(x_fit, y_fit,
             linestyle=fit_line,
             alpha=fit_alpha,
             color=fit_color)
# Plot data points by heating rate
for idx in range(len(conv_temps.T)):
    # Get colour for data series, i.e. heating rate
    plot_colour = plt_colors[idx]

    # Plot data points
    plt.scatter(
        conv_temps.T[idx],
        conv_levels.T[idx],
        marker='o', s=marker_size,
        facecolors='none',
        edgecolors=plot_colour)
# Plot meta data.
plt.title("Assess Linear Fits of KAS Method")
plt.xlabel("1/T")
plt.ylabel("ln($\\beta$/T$^{1.92}$)")
plt.xlim(left=0.00141, right=0.00194)
plt.ylim(bottom=-12.6, top=-7.9)
plt.tight_layout()
# plt.legend()
plt.grid()
# # Save image.
# plot_label = f"Ea_Estimate_KAS_Fit.png"
# plot_path = os.path.join(plot_label)
# plt.savefig(plot_path, dpi=320, bbox_inches='tight', facecolor='w')
FireSciPy provides limited means of statistics to assess the quality of the linear regression fits during the KAS computation. This information is stored together with the primary and intermediate results of the activation energy computation.
Specifically, the coefficient of determination (\(R^2\)) and the root mean square error (\(RMSE\)) are available. They can be plotted together as follows:
# Pre-select data sets for convenience
Ea_KAS = data_structure["experiments"]['TGA']["Ea_results_KAS"]
# Reduce number of data points by using every n-th
nth = 2
markersize = 42
fig, ax = plt.subplots()
# Plot R² statistics
x_data = np.asarray(Ea_KAS["Conversion"])[::nth]
y_data = np.asarray(Ea_KAS["R_squared"])[::nth]
plt.scatter(x_data,
y_data,
marker='.', s=markersize,
facecolors='none',
edgecolors='tab:blue',
label=f"R² avg.: {np.average(y_data):.6f}")
# Plot RMSE statistics
y_data = np.asarray(Ea_KAS["RMSE"])[::nth]
plt.scatter(x_data,
y_data,
marker='.', s=markersize,
facecolors='none',
edgecolors='tab:orange',
label=f"RMSE avg.: {np.average(y_data):.6f}")
# Shaded areas to indicate first/last 5 % to be typically cut off
x_min = -0.025
x_max = 1.025
ax.axvspan(x_min, 0.05, color='gray', alpha=0.3, label="5%")
ax.axvspan(0.95, x_max, color='gray', alpha=0.3)
# Plot meta data.
plt.title("Statistics of the KAS Fits")
plt.xlabel("Conversion, $\\alpha$ / -")
plt.ylabel('Arbitrary Units / -')
plt.xlim(left=-0.025, right=1.025)
plt.ylim(bottom=-0.05, top=1.05)
plt.tight_layout()
plt.legend()
plt.grid()
# # Save image.
# plot_label = "Ea_Estimate_KAS_Statistics.png"
# plot_path = os.path.join(plot_label)
# plt.savefig(plot_path, dpi=320, bbox_inches='tight', facecolor='w')
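The statistics shown above follow the standard definitions for a linear fit. A minimal sketch of how \(R^2\) and \(RMSE\) can be computed for a fitted line (not the FireSciPy internals):

```python
import numpy as np

def fit_statistics(x, y, m, b):
    """R^2 and RMSE of the linear model y = m*x + b
    (standard definitions, not the FireSciPy implementation)."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    residuals = y - (m * x + b)
    ss_res = np.sum(residuals**2)                # residual sum of squares
    ss_tot = np.sum((y - np.mean(y))**2)         # total sum of squares
    r_squared = 1.0 - ss_res / ss_tot
    rmse = np.sqrt(np.mean(residuals**2))
    return r_squared, rmse

# Example: a nearly perfect linear relationship gives R² close to 1
r2, rmse = fit_statistics([0, 1, 2, 3], [0.1, 1.0, 2.1, 2.9], 0.96, 0.07)
```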
For this example, the results are a bit boring. With experimental data, the statistics are more useful.
For more details, see the FireSciPy documentation.