This thesis explores how machine learning techniques can be used for medium-term prediction of domestic hot water (DHW) loads. A total of six models have been trained, tuned and validated on DHW consumption data from the Varmtvann2030 data set, which contains consumption data for four apartment buildings, four hotels and four nursing homes. The six models comprise one Prophet model and one XGBoost model for each of the three building types. XGBoost is a gradient boosting algorithm that builds an ensemble of regression trees to minimize the prediction error. Prophet is an additive model consisting of a trend component, a Fourier series that captures seasonality in the data, and a holiday component that adjusts for holidays. The theory behind the two models is explained in detail in this thesis. Predictions on unseen test data are performed for all six models, and the results are presented and compared across the three building categories.
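The additive structure of the Prophet model described above can be summarized by its standard formulation (following Taylor and Letham):

```latex
y(t) = g(t) + s(t) + h(t) + \varepsilon_t
```

where $g(t)$ is the trend component, $s(t)$ the seasonal component modeled by a Fourier series, $h(t)$ the holiday component, and $\varepsilon_t$ an error term.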
The correlation between the DHW load, the building area and the number of units in the building is discussed and investigated through decomposition of the XGBoost and Prophet models. A set of hyperparameters was tuned for the Prophet models, both manually and through cross-validation. These hyperparameters regulate the fitting of the trend and seasonality components of the Prophet model.
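The tuning procedure can be sketched as an exhaustive grid search over the two Prophet hyperparameters that control trend and seasonality flexibility (`changepoint_prior_scale` and `seasonality_prior_scale`). This is a simplified illustration, not the thesis's actual code: the candidate values are hypothetical, and the scoring function is a placeholder for a full cross-validation run.

```python
from itertools import product

# Hypothetical candidate values for the two Prophet hyperparameters.
PARAM_GRID = {
    "changepoint_prior_scale": [0.001, 0.01, 0.1, 0.5],  # trend flexibility
    "seasonality_prior_scale": [0.01, 0.1, 1.0, 10.0],   # seasonality flexibility
}

def evaluate(params):
    """Placeholder for the cross-validated error of a Prophet model.

    In a real workflow this would fit a model on training folds and
    return, e.g., the mean MAPE on held-out folds (Prophet ships a
    cross_validation utility for this). Here it is a dummy score so the
    sketch is self-contained.
    """
    return (abs(params["changepoint_prior_scale"] - 0.05)
            + abs(params["seasonality_prior_scale"] - 1.0))

def grid_search(grid, score_fn):
    """Return the parameter combination with the lowest score."""
    keys = list(grid)
    best_params, best_score = None, float("inf")
    for values in product(*(grid[k] for k in keys)):
        params = dict(zip(keys, values))
        score = score_fn(params)
        if score < best_score:
            best_params, best_score = params, score
    return best_params, best_score

best, _ = grid_search(PARAM_GRID, evaluate)
```

The same loop structure applies whether the score comes from a dummy function, as here, or from a genuine cross-validation over the consumption data.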
The apartment building DHW load was predicted by a Prophet model and an XGBoost model with Mean Absolute Percentage Errors (MAPE) of ≈ 32% and ≈ 30%, respectively. None of the models made to predict the hotel DHW load achieved MAPE values below 100%, but the Normalized Root Mean Squared Error (NRMSE) values of 0.49 for the Prophet prediction and 0.47 for the XGBoost prediction show that the predictions are not as far off as the MAPE values imply. The MAPE is heavily inflated by low true consumption values, as this error metric divides the prediction error at each time step by the true value. Nevertheless, no satisfactory predictions were obtained for the hotel DHW load. The nursing home DHW load was most accurately predicted by the XGBoost model, with a MAPE of 37% and an NRMSE of 0.27.
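The sensitivity of MAPE to small true values can be made concrete with a toy calculation. The sketch below uses invented numbers, and the NRMSE normalization shown (dividing the RMSE by the mean of the true values) is one common convention among several; the thesis's exact convention is not restated here.

```python
def mape(y_true, y_pred):
    """Mean Absolute Percentage Error: mean of |error| / |true value|, in %.

    A single time step with a near-zero true value can dominate the
    average, which is how a prediction can show a MAPE above 100% while
    its NRMSE remains moderate.
    """
    return sum(abs(t - p) / abs(t) for t, p in zip(y_true, y_pred)) / len(y_true) * 100

def nrmse(y_true, y_pred):
    """Root Mean Squared Error normalized by the mean of the true values
    (assumed convention; normalizing by the range is another option)."""
    mse = sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)
    return mse ** 0.5 / (sum(y_true) / len(y_true))

# Toy DHW load series: every prediction is off by exactly 1 unit,
# but the low-consumption step (0.5) inflates MAPE disproportionately.
y_true = [10.0, 0.5, 8.0, 12.0]
y_pred = [9.0, 1.5, 7.0, 11.0]
print(f"MAPE  = {mape(y_true, y_pred):.1f}%")   # the 0.5 step alone contributes 200%/4
print(f"NRMSE = {nrmse(y_true, y_pred):.2f}")
```

Here all four absolute errors are identical, yet the one low-consumption time step drives the MAPE far above what the NRMSE would suggest, mirroring the hotel results.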