ADS Capstone Chronicles Revised


0.82}, achieving a cross-validation score of approximately 0.89.

Neural network: Hyperparameters including the activation function (activation), regularization parameter (alpha), hidden layer sizes (hidden_layer_sizes), learning rate (learning_rate), and solver (solver) were optimized using RandomizedSearchCV. The optimal parameters were {'activation': 'relu', 'alpha': 0.0003, 'hidden_layer_sizes': (100, 100), 'learning_rate': 'adaptive', 'learning_rate_init': 0.01, 'solver': 'sgd'}, resulting in a cross-validation score of about 0.91.

Quadratic Discriminant Analysis: Hyperparameters including the regularization parameter (reg_param), covariance storage (store_covariance), and tolerance (tol) were optimized using GridSearchCV. The best configuration was {'reg_param': 1.0, 'store_covariance': True, 'tol': 0.0001}, achieving a cross-validation score of approximately 0.63.

Bagging Classifier: This classifier used RandomizedSearchCV to tune its hyperparameters. The hyperparameters selected for tuning were the number of estimators (n_estimators), the maximum samples used to train each base estimator (max_samples), the maximum features considered when looking for the best splits (max_features), whether bootstrap samples were used when building the base estimators (bootstrap), whether features were drawn with replacement during selection (bootstrap_features), the maximum depth of the base estimator (estimator__max_depth), and the minimum number of samples required to split an internal node (estimator__min_samples_split). The optimal configuration found was {'n_estimators': 20, 'max_samples': 1.0, 'max_features': 1.0, 'estimator__min_samples_split': 5, 'estimator__max_depth': None, 'bootstrap_features': False, 'bootstrap': True}, which led to a cross-validation score of 0.91.

Stochastic Gradient Descent: For the SGD classifier, hyperparameters were optimized using GridSearchCV. The key parameters adjusted include the regularization strength (alpha), the maximum number of iterations (max_iter), the penalty type (penalty), and the tolerance for the stopping criterion (tol). The optimal configuration found was {'alpha': 0.01, 'max_iter': 20000, 'penalty': 'l1', 'tol': 1e-05}, which achieved a cross-validation score of approximately 0.74.

AdaBoost: The hyperparameters for the AdaBoost model, including the number of estimators (n_estimators), learning rate (learning_rate), and the maximum depth of the base estimator (estimator__max_depth), were optimized using GridSearchCV. The optimal configuration was {'n_estimators': 180,
