The CatBoost function get_all_params() returns the values of all training parameters, including those that the user did not explicitly specify. After training a model without setting any parameters, you can call this function on the model, and it will return all training parameters together with their values.
From CatBoost's documentation:
If the value of a parameter is not explicitly specified, it is set to the default value. In some cases, these default values change dynamically depending on dataset properties and values of user-defined parameters. For example, the default learning rate changes in classification mode depending on the number of iterations and the dataset size. This method returns the values of all parameters, including those calculated during the training.
Here is an example showing how to use get_all_params():
from catboost import CatBoostClassifier
import numpy as np
# train data/labels
np.random.seed(7)
train_data = np.reshape(np.random.random(1000), (50, 20))
train_labels = np.round(np.random.random(50))
# train the model
model = CatBoostClassifier(verbose=False)
model.fit(train_data, train_labels)
# get list of parameters
print(model.get_all_params())
The above code prints output similar to the following (exact values may vary across CatBoost versions):
{'nan_mode': 'Min', 'eval_metric': 'Logloss', 'iterations': 1000, 'sampling_frequency': 'PerTree', 'leaf_estimation_method': 'Newton', 'grow_policy': 'SymmetricTree', 'penalties_coefficient': 1, 'boosting_type': 'Plain', 'model_shrink_mode': 'Constant', 'feature_border_type': 'GreedyLogSum', 'bayesian_matrix_reg': 0.10000000149011612, 'l2_leaf_reg': 3, 'random_strength': 1, 'rsm': 1, 'boost_from_average': False, 'model_size_reg': 0.5, 'pool_metainfo_options': {'tags': {}}, 'subsample': 1, 'use_best_model': False, 'class_names': [0, 1], 'random_seed': 0, 'depth': 6, 'posterior_sampling': False, 'border_count': 254, 'classes_count': 0, 'auto_class_weights': 'None', 'sparse_features_conflict_fraction': 0, 'leaf_estimation_backtracking': 'AnyImprovement', 'best_model_min_trees': 1, 'model_shrink_rate': 0, 'min_data_in_leaf': 1, 'loss_function': 'Logloss', 'learning_rate': 0.0028669999446719885, 'score_function': 'Cosine', 'task_type': 'CPU', 'leaf_estimation_iterations': 10, 'bootstrap_type': 'MVS', 'max_leaves': 64}