library(reticulate)
use_python("/Users/gregorycrooks/opt/anaconda3/envs/r-reticulate/bin/python")
The aim of this project is to analyze real estate parameters and examine how they affect the house price per unit area. Alongside an exploratory data analysis with comprehensive data visualization, various non-linear machine learning techniques were implemented in the modelling process. Lastly, this project includes an executive summary of results geared towards non-technical audiences.
import matplotlib.pyplot as plt
import numpy as np
import pandas as pd
import seaborn as sns
Our aim is to analyze real estate parameters and see how they affect the house price per unit area. The data was collected from Sindian Dist., New Taipei City, in Taiwan. Considering the vast fluctuations in real estate due to COVID-19, it is interesting for investors to gain additional insight into which factors influence house prices. This can provide additional value given that real estate prices depend heavily on the neighborhood's location, the ability to commute, and proximity to shops. Linear regression, polynomial regression, and neural networks will be used to analyze the dataset and conduct our subsequent analysis.
Our initial exploration of the data shows a total of 8 columns. Given that our research question is centered around house prices, we established "Y house price of unit area" as our dependent variable. Setting aside the row index ('No'), the 6 others (i.e. 'X1 transaction date', 'X2 house age', 'X3 distance to the nearest MRT station', 'X4 number of convenience stores', 'X5 latitude', 'X6 longitude') will be the initial explanatory variables. There is a total of 414 observations.
df = pd.read_csv('/Users/gregorycrooks/Desktop/Real estate.csv', na_values='?')
df.shape
## (414, 8)
df.describe()
## No ... Y house price of unit area
## count 414.000000 ... 414.000000
## mean 207.500000 ... 37.980193
## std 119.655756 ... 13.606488
## min 1.000000 ... 7.600000
## 25% 104.250000 ... 27.700000
## 50% 207.500000 ... 38.450000
## 75% 310.750000 ... 46.600000
## max 414.000000 ... 117.500000
##
## [8 rows x 8 columns]
print(df.isnull().sum())
## No 0
## X1 transaction date 0
## X2 house age 0
## X3 distance to the nearest MRT station 0
## X4 number of convenience stores 0
## X5 latitude 0
## X6 longitude 0
## Y house price of unit area 0
## dtype: int64
To better analyze our data, we look at the column types (int or float) and notice that no substantial data cleaning will be required, given that the data is already in an appropriate numeric format. We also verify whether there are any null values within the dataset. Every column contains 0 NaN values, which means that no additional wrangling is required. Subsequently, we print a random sample of our dataset to further examine the state of the variables, but no particular anomaly is detected.
pd.set_option('display.max_rows', df.shape[0]+1)
del df['No']
print(df.dtypes)
## X1 transaction date float64
## X2 house age float64
## X3 distance to the nearest MRT station float64
## X4 number of convenience stores int64
## X5 latitude float64
## X6 longitude float64
## Y house price of unit area float64
## dtype: object
df.sample(10)
## X1 transaction date ... Y house price of unit area
## 343 2013.000 ... 46.6
## 34 2012.750 ... 55.1
## 305 2013.083 ... 55.0
## 277 2013.417 ... 27.7
## 19 2012.667 ... 47.7
## 36 2012.917 ... 22.9
## 73 2013.167 ... 20.0
## 139 2012.667 ... 42.5
## 274 2013.167 ... 41.0
## 366 2012.750 ... 24.8
##
## [10 rows x 7 columns]
To make sure that the data is accurate, we also look for outliers and remove variables which are irrelevant.
We notice that there are many outliers for the 'X3 distance to the nearest MRT station' variable. However, no particular anomaly is detected in this variable, given that it is not unrealistic for a real estate property to be far away from an MRT station (even 5-6 km or so). Given their geographical nature, we kept longitude and latitude since they might bring insightful information to our analysis. Indeed, house prices are heavily dependent on location, which might be an insightful indicator for our data.
We have a total of 6 independent variables for our analysis of the house price per unit area: transaction date, house age, distance to the nearest MRT station, number of convenience stores, longitude, and latitude.
plt.figure(figsize=(5, 4), dpi=100)  # separate figure so the boxplots do not overlap
sns.boxplot(x = df['X3 distance to the nearest MRT station']).set(xlabel=''
, title = 'Boxplot 1: Distance to the nearest MRT station')
plt.figure(figsize=(5, 4), dpi=100)
sns.boxplot(x = df['Y house price of unit area']).set(xlabel=''
, title = 'Boxplot 2: House price of unit area')
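As a rough cross-check on the boxplots, one can count how many points the standard 1.5 × IQR rule would flag; a minimal sketch, assuming df is loaded as above:
col = df['X3 distance to the nearest MRT station']
q1, q3 = col.quantile(0.25), col.quantile(0.75)  # quartile boundaries
iqr = q3 - q1
outliers = col[(col < q1 - 1.5 * iqr) | (col > q3 + 1.5 * iqr)]
print("{} of {} observations flagged as outliers".format(len(outliers), len(col)))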
For further statistical analysis, we look at the correlation and multicollinearity between variables by using the heatmap function from the seaborn package. We found a moderately strong positive correlation between the price of unit area and the longitude, the latitude, and the number of convenience stores. We also find a strong negative correlation between the distance to the nearest MRT station and the house price of unit area. Finally, we find that the distance to the nearest MRT station has a very strong negative correlation with the longitude, and a strong negative correlation with the number of convenience stores as well as the latitude. Nevertheless, there are some limitations to the heat map. Since latitude and longitude are interrelated geographical measures, their individual values do not provide much information. As such, our scatterplots will display how both of these combined can provide insightful information with regards to geographical clusters.
plt.figure(figsize=(5, 4), dpi=100)
ax = sns.heatmap(df.corr(), annot=True,cmap='winter')
ax.set_title('Figure 1: Heatmap displaying correlation between variables',
pad = 20, fontsize = 10)
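To read the strongest relationships off the heat map more directly, the correlations with the target can also be printed, sorted by magnitude; a small sketch reusing df from above:
corr = df.corr()['Y house price of unit area'].drop('Y house price of unit area')
# order the correlations by absolute strength, keeping their signs
print(corr.reindex(corr.abs().sort_values(ascending=False).index))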
To analyze the distribution of our dependent variable, we plot a histogram, which shows a mildly right-skewed distribution. We notice that the most frequent price per unit area is between 35 and 45.
# displot is figure-level, so it creates its own figure
ax = sns.displot(df['Y house price of unit area'], kde=True, bins=20, aspect=2).set(xlabel = 'Price of unit area',
                 title = "Histogram 1: Distribution of dependent variable")
In accordance with the correlation analysis found in the heat map, we are interested in further examining variables which show a significant correlation with house prices. As such, we create scatterplots to analyze their relationships.
Scatterplot 1 follows the results displayed in the heat map between the price of unit area and the house age per transaction date (the min, 1st quartile, median, 3rd quartile, and max being displayed as interval values in the graph). Although no significant correlation is found in the scatterplot, we can visualise a very mild increase in price if the transaction date is between 2013.4 and 2013.6. While 12 months is not enough to establish this phenomenon, external factors such as inflation or changes in the housing market could explain it.
plt.figure(figsize=(5, 4), dpi=100)
ax = sns.scatterplot(data=df, y=df['Y house price of unit area'], x=df['X1 transaction date'] , hue= 'X2 house age', palette="rocket")
ax.set(xlabel = 'Transaction date', ylabel = "House price of unit area")
ax.set_title("Scatterplot 1: Relationship between house price and house age for each house purchased",
pad = 20, fontsize = 10)
plt.legend(title='House age', bbox_to_anchor=(1.05, 1), loc=2, borderaxespad=0.,
fontsize = 10)
Scatterplot 2 shows the relationship between the house price of unit area, the number of convenience stores, and the distance to the nearest MRT station. The graph coincides with the correlation analysis from the heat map, given that a closer distance to the nearest MRT station correlates with more convenience stores and a higher house price of unit area. More specifically, the graph shows that houses within roughly 2,500 meters of the nearest MRT station tend to have a significantly higher price of unit area and 1 or more convenience stores. Beyond a distance of approximately 2,500 meters from the nearest MRT station, little to no convenience stores can be found.
plt.figure(figsize=(5, 4), dpi=100)
ax = sns.scatterplot(data=df, y=df['Y house price of unit area'], x=df['X3 distance to the nearest MRT station']
, hue= 'X4 number of convenience stores', palette="rocket")
ax.set(xlabel = 'Distance to the nearest MRT station', ylabel = "House price of unit area")
ax.set_title("Scatterplot 2: Relationship between house price, number of convenience stores, and distance to the nearest MRT station",
pad = 20, fontsize = 10)
plt.legend(title='Convenience stores', bbox_to_anchor=(1.05, 1), loc=2, borderaxespad=0.,
fontsize = 10)
Scatterplot 3 inspects whether the distance to the nearest MRT station concurs with more frequent transactions. This graph confirms that it does, given that the transaction data (regardless of the date of purchase) is clustered around houses whose distance to the nearest MRT station is 2,500 meters or less, even though this also concurs with a higher house price of unit area. This suggests that buyers tend to buy houses which are easier to commute from, even if the price is higher.
plt.figure(figsize=(5, 4), dpi=100)
ax = sns.scatterplot(data=df, y=df['Y house price of unit area'], x=df['X3 distance to the nearest MRT station'] , hue= 'X1 transaction date', palette="rocket")
ax.set(xlabel = 'Distance to the nearest MRT station', ylabel = "House price of unit area")
ax.set_title("Scatterplot 3: Relationship between house price, transaction date, and distance to the nearest MRT station",
pad = 20, fontsize = 10)
plt.legend(title='Transaction date', bbox_to_anchor=(1.05, 1), loc=2, borderaxespad=0.,
fontsize = 10)
Scatterplot 4 examines how location can impact price. Upon seeing how latitude and longitude interrelate, we noticed a geographical cluster located around a longitude of 121.54 and a latitude between 24.97 and 24.98. The closer to this location, the more common it is to find houses with a price of unit area ranging between 60 and 100. On the outskirts of this geographical concentration, house prices depreciate. This means that real estate properties are more highly valued in this neighborhood, and indicates that it is central within the Sindian district.
plt.figure(figsize=(5, 4), dpi=100)
ax = sns.scatterplot(data=df, y=df['X5 latitude'], x=df['X6 longitude'] , hue= 'Y house price of unit area', palette="rocket")
ax.set(xlabel = 'Longitude', ylabel = "Latitude")
ax.set_title("Scatterplot 4: Relationship between geographical location and house price",
pad = 20, fontsize = 10)
plt.legend(title='House price of unit area', bbox_to_anchor=(1.05, 1), loc=2, borderaxespad=0.,
fontsize = 10)
In scatterplot 5, we analyzed the relationship between the frequency of transactions and location. The data shows that the most commonly purchased houses are located in the neighborhood with the highest price of unit area. This further supports our findings from scatterplot 4 regarding this neighborhood being a central location. Indeed, the vast majority of buyers want houses located in this neighborhood, even if the price tends to be higher.
plt.figure(figsize=(5, 4), dpi=100)
ax = sns.scatterplot(data=df, y=df['X5 latitude'], x=df['X6 longitude'] , hue= 'X1 transaction date', palette="rocket")
ax.set(xlabel = 'Longitude', ylabel = "Latitude")
ax.set_title("Scatterplot 5: Relationship between geographical location and transaction date",
pad = 20, fontsize = 10)
plt.legend(title='Transaction date', bbox_to_anchor=(1.05, 1), loc=2, borderaxespad=0.,
fontsize = 10)
Scatterplot 6 shows the number of convenience stores in different neighborhoods of the Sindian Dist. of New Taipei City, Taiwan. In concordance with scatterplots 4 and 5, this graph indicates that the location with the highest number of house purchases and the higher price of unit area also has the highest number of convenience stores. Neighborhoods on the outskirts of the city center tend to have no convenience stores and are therefore less in demand than those with more convenience stores.
plt.figure(figsize=(5, 4), dpi=100)
ax = sns.scatterplot(data=df, y=df['X5 latitude'], x=df['X6 longitude'] , hue= 'X4 number of convenience stores', palette="rocket")
ax.set(xlabel = 'Longitude', ylabel = "Latitude")
ax.set_title("Scatterplot 6: Relationship between geographical location and number of convenience stores",
pad = 20, fontsize = 10)
plt.legend(title='Number of convenience stores', bbox_to_anchor=(1.05, 1), loc=2, borderaxespad=0.,
fontsize = 10)
Scatterplot 7 shows that the city center concentrates most of the oldest houses. While some houses in the same neighborhood have been built very recently, the median house age is approximately 16 years. That is, of the houses around 40 years old, a much higher proportion is centered around this location, and it is much more common to find houses built in the last 16 years outside of the city center. This may imply that it is a residential neighborhood in which most of the residents are families.
plt.figure(figsize=(5, 4), dpi=100)
ax = sns.scatterplot(data=df, y=df['X5 latitude'], x=df['X6 longitude'] , hue= 'X2 house age', palette="rocket")
ax.set(xlabel = 'Longitude', ylabel = "Latitude")
ax.set_title("Scatterplot 7: Relationship between geographical location and house age",
pad = 20, fontsize = 10)
plt.legend(title='House age', bbox_to_anchor=(1.05, 1), loc=2, borderaxespad=0.,
fontsize = 10)
The 8th scatterplot displays the distance to the nearest MRT station for different geographical locations. The graph strongly indicates that most of the houses in the city center are close to the nearest MRT station. This also explains why the demand and price of unit area are higher there, given that it is easier to commute to and from this neighborhood. Indeed, the vast majority of houses are within 1,000 meters of an MRT station, whereas houses outside of the central location stray increasingly far from the nearest station. For instance, residents located in the area around longitude 121.48 and latitude 24.96 are roughly 6 km away from the nearest station, which means much longer commutes.
plt.figure(figsize=(5, 4), dpi=100)
ax = sns.scatterplot(data=df, y=df['X5 latitude'], x=df['X6 longitude'], hue= 'X3 distance to the nearest MRT station', palette="rocket" )
ax.set(xlabel = 'Longitude', ylabel = "Latitude")
ax.set_title("Scatterplot 8: Relationship between geographical location and distance to the nearest MRT station",
pad = 20, fontsize = 10)
plt.legend(title='Distance to the nearest MRT station', bbox_to_anchor=(1.05, 1), loc=2, borderaxespad=0.,
fontsize = 10)
Lastly, it is necessary to point out issues of collinearity. Results strongly suggest that the closer a house is to an MRT station, the more convenience stores can be found nearby. Of these two explanatory variables, the distance to the nearest MRT station has the stronger relationship with price. As such, we did not include the number of convenience stores in our modelling. We did not include latitude and longitude either, given that they are strongly interrelated. This leaves 3 explanatory variables: transaction date, house age, and distance to the nearest MRT station.
Our first step to detect overfitting is to split the data into a training and a testing set (⅔ to ⅓), which we then pre-process by standardizing the features: removing the mean and scaling to unit variance. We also set a seed to ensure reproducibility. Moreover, to assess the robustness of our models, we use the MAE (mean absolute error) as a metric. The MAE is the average absolute difference between the prediction for an observation and the true value of that observation.
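For concreteness, a tiny illustration of the metric on made-up numbers (the values here are purely illustrative):
import numpy as np
predictions, truth = np.array([3.0, 5.0]), np.array([1.0, 6.0])
print(np.mean(np.abs(predictions - truth)))  # (|3 - 1| + |5 - 6|) / 2 = 1.5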
df_changed = df.drop(columns = ["X4 number of convenience stores",
"X5 latitude",
"X6 longitude",
'Y house price of unit area'])
X_values = df_changed.values
X_values
## array([[2012.917 , 32. , 84.87882],
## [2012.917 , 19.5 , 306.5947 ],
## [2013.583 , 13.3 , 561.9845 ],
## ...,
## [2013.25 , 18.8 , 390.9696 ],
## [2013. , 8.1 , 104.8101 ],
## [2013.5 , 6.5 , 90.45606]])
y = df['Y house price of unit area'].values
from sklearn.model_selection import train_test_split
X_train_raw, X_test_raw, y_train, y_test = train_test_split(X_values, y, test_size=0.33, random_state=2022)
# standard-scale the features; this keeps the linear models and the neural network well behaved
from sklearn.preprocessing import StandardScaler
SS = StandardScaler()
SS.fit(X_train_raw)
## StandardScaler()
X_train = SS.transform(X_train_raw)
X_test = SS.transform(X_test_raw)
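As a quick sanity check (not part of the original pipeline), the scaled training features should now have roughly zero mean and unit variance:
print(X_train.mean(axis=0).round(3))  # approximately [0. 0. 0.]
print(X_train.std(axis=0).round(3))   # approximately [1. 1. 1.]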
Firstly, we ran a standard linear regression (i.e. a degree-1 polynomial). This sets a baseline to compare with polynomial regression. We first notice that there is no substantial overfitting in our results, given the small difference between the training and testing errors. We find that the baseline mean absolute error with standard linear regression is about 7.4 on the test set.
from sklearn.linear_model import LinearRegression
model = LinearRegression(fit_intercept=True)
model.fit(X_train, y_train)
## LinearRegression()
def mae(predictions, y_test):
return np.mean(np.abs(predictions-y_test))
LR_train_mae = mae(model.predict(X_train), y_train)
LR_test_mae = mae(model.predict(X_test), y_test)
print("LR train mae: {}".format(LR_train_mae))
## LR train mae: 6.521571756338549
print("LR test mae: {}".format(LR_test_mae))
## LR test mae: 7.384596893522332
We then create polynomial features and train a linear regression on those polynomial features (i.e. polynomial regression). We trial a number of different degrees. The results show that a degree of 3 has the lowest test error, with a test mean absolute error of about 5.75. This polynomial degree also shows the least overfitting, having the smallest gap between the training and testing errors.
from sklearn.linear_model import Ridge
from sklearn.preprocessing import PolynomialFeatures
n_values = range(1, 8)
train_errors = []
test_errors = []
for n in n_values:
poly = PolynomialFeatures(degree=n, interaction_only = False)
X_poly_train = poly.fit_transform(X_train)
X_poly_test = poly.transform(X_test)
model = LinearRegression(fit_intercept=True)
model.fit(X_poly_train, y_train)
LR_poly_train_mae = mae(model.predict(X_poly_train), y_train)
LR_poly_test_mae = mae(model.predict(X_poly_test), y_test)
train_errors.append(LR_poly_train_mae)
test_errors.append(LR_poly_test_mae)
print("N {}".format(n))
print("LR train mae: {}".format(LR_poly_train_mae))
print("LR test mae: {}".format(LR_poly_test_mae))
print("")
## LinearRegression()
## N 1
## LR train mae: 6.521571756338549
## LR test mae: 7.384596893522332
##
## LinearRegression()
## N 2
## LR train mae: 5.4197054384186245
## LR test mae: 6.142148548896064
##
## LinearRegression()
## N 3
## LR train mae: 5.257224893526242
## LR test mae: 5.746138475538864
##
## LinearRegression()
## N 4
## LR train mae: 5.078420090412917
## LR test mae: 6.072010237420974
##
## LinearRegression()
## N 5
## LR train mae: 4.593417187102522
## LR test mae: 7.629315126992115
##
## LinearRegression()
## N 6
## LR train mae: 10.218626607626353
## LR test mae: 26.1658203125
##
## LinearRegression()
## N 7
## LR train mae: 9.570644954309566
## LR test mae: 42.298952511975365
plt.figure()
plt.title("Graph 3: Polynomial degree against train and test error")
plt.plot(n_values, train_errors, label="train_mae")
plt.plot(n_values, test_errors, label="test_mae")
plt.xlabel("Polynomial degree")
plt.ylabel("Mean absolute error")
plt.legend()
This shows that a polynomial degree of N=3 is the best (it has the lowest mean absolute error on the test set). As N goes above 3, the test error starts to increase whilst the training error continues to fall (for N=4 and 5), which means the model starts to overfit; at N=6 and 7 the fit becomes unstable and both errors blow up.
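Since the errors for each degree were recorded in the loop above, the best degree can also be selected programmatically rather than read off the graph:
best_n = n_values[int(np.argmin(test_errors))]  # index of the smallest test error
print("Best polynomial degree: {}".format(best_n))            # 3
print("Test MAE at best degree: {}".format(min(test_errors)))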
from sklearn.linear_model import Ridge
for a in np.linspace(0.01, 0.1, 10):
model = Ridge(alpha=a)
model.fit(X_train, y_train)
LR_train_mae = mae(model.predict(X_train), y_train)
LR_test_mae = mae(model.predict(X_test), y_test)
print("alpha {}".format(a))
print("Ridge train mae: {}".format(LR_train_mae))
print("Ridge test mae: {}".format(LR_test_mae))
print("")
## Ridge(alpha=0.01)
## alpha 0.01
## Ridge train mae: 6.521596612634986
## Ridge test mae: 7.384567435651112
##
## Ridge(alpha=0.020000000000000004)
## alpha 0.020000000000000004
## Ridge train mae: 6.52162146722416
## Ridge test mae: 7.384537979824193
##
## Ridge(alpha=0.030000000000000006)
## alpha 0.030000000000000006
## Ridge train mae: 6.521646320106251
## Ridge test mae: 7.38450852604136
##
## Ridge(alpha=0.04000000000000001)
## alpha 0.04000000000000001
## Ridge train mae: 6.52167117128143
## Ridge test mae: 7.384479074302395
##
## Ridge(alpha=0.05000000000000001)
## alpha 0.05000000000000001
## Ridge train mae: 6.5216960207498795
## Ridge test mae: 7.384449624607088
##
## Ridge(alpha=0.06000000000000001)
## alpha 0.06000000000000001
## Ridge train mae: 6.521720868511773
## Ridge test mae: 7.3844201769552145
##
## Ridge(alpha=0.07)
## alpha 0.07
## Ridge train mae: 6.521745714567286
## Ridge test mae: 7.384390731346564
##
## Ridge(alpha=0.08)
## alpha 0.08
## Ridge train mae: 6.521770558916597
## Ridge test mae: 7.38436128778092
##
## Ridge(alpha=0.09000000000000001)
## alpha 0.09000000000000001
## Ridge train mae: 6.521795401559881
## Ridge test mae: 7.384331846258067
##
## Ridge(alpha=0.1)
## alpha 0.1
## Ridge train mae: 6.521820242497316
## Ridge test mae: 7.3843024067777865
Ridge regression shrinks the regression coefficients so that variables with a minor contribution to the outcome have their coefficients close to zero. The shrinkage is achieved by penalizing the regression model with a penalty term proportional to the sum of squared coefficients.
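Concretely, ridge minimizes ||y - X*beta||^2 + alpha * ||beta||^2. A minimal sketch of the closed-form solution (intercept ignored for simplicity; this is illustrative only, not how sklearn's Ridge is invoked above):
def ridge_closed_form(X, y, alpha):
    # Ridge normal equations: beta = (X^T X + alpha * I)^(-1) X^T y
    n_features = X.shape[1]
    return np.linalg.solve(X.T @ X + alpha * np.eye(n_features), X.T @ y)
print(ridge_closed_form(X_train, y_train - y_train.mean(), alpha=0.05))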
for n in range(1, 8):
poly = PolynomialFeatures(degree=n, interaction_only = False)
X_poly_train = poly.fit_transform(X_train)
X_poly_test = poly.transform(X_test)
for a in np.linspace(0.01, 0.1, 10):
model = Ridge(alpha=a)
model.fit(X_poly_train, y_train)
LR_train_mae = mae(model.predict(X_poly_train), y_train)
LR_test_mae = mae(model.predict(X_poly_test), y_test)
print("n: {}, alpha {}".format(n, a))
print("Ridge train mae: {}".format(LR_train_mae))
print("Ridge test mae: {}".format(LR_test_mae))
print("")
## Ridge(alpha=0.01)
## n: 1, alpha 0.01
## Ridge train mae: 6.521596612634987
## Ridge test mae: 7.384567435651111
##
## Ridge(alpha=0.020000000000000004)
## n: 1, alpha 0.020000000000000004
## Ridge train mae: 6.52162146722416
## Ridge test mae: 7.384537979824193
##
## Ridge(alpha=0.030000000000000006)
## n: 1, alpha 0.030000000000000006
## Ridge train mae: 6.521646320106251
## Ridge test mae: 7.38450852604136
##
## Ridge(alpha=0.04000000000000001)
## n: 1, alpha 0.04000000000000001
## Ridge train mae: 6.52167117128143
## Ridge test mae: 7.384479074302395
##
## Ridge(alpha=0.05000000000000001)
## n: 1, alpha 0.05000000000000001
## Ridge train mae: 6.5216960207498795
## Ridge test mae: 7.384449624607087
##
## Ridge(alpha=0.06000000000000001)
## n: 1, alpha 0.06000000000000001
## Ridge train mae: 6.521720868511773
## Ridge test mae: 7.384420176955215
##
## Ridge(alpha=0.07)
## n: 1, alpha 0.07
## Ridge train mae: 6.5217457145672855
## Ridge test mae: 7.384390731346564
##
## Ridge(alpha=0.08)
## n: 1, alpha 0.08
## Ridge train mae: 6.521770558916597
## Ridge test mae: 7.384361287780919
##
## Ridge(alpha=0.09000000000000001)
## n: 1, alpha 0.09000000000000001
## Ridge train mae: 6.521795401559881
## Ridge test mae: 7.384331846258067
##
## Ridge(alpha=0.1)
## n: 1, alpha 0.1
## Ridge train mae: 6.521820242497316
## Ridge test mae: 7.3843024067777865
##
## Ridge(alpha=0.01)
## n: 2, alpha 0.01
## Ridge train mae: 5.419788015892516
## Ridge test mae: 6.14219237087354
##
## Ridge(alpha=0.020000000000000004)
## n: 2, alpha 0.020000000000000004
## Ridge train mae: 5.4198705641308536
## Ridge test mae: 6.142246438154977
##
## Ridge(alpha=0.030000000000000006)
## n: 2, alpha 0.030000000000000006
## Ridge train mae: 5.41995308314937
## Ridge test mae: 6.142316141301987
##
## Ridge(alpha=0.04000000000000001)
## n: 2, alpha 0.04000000000000001
## Ridge train mae: 5.420035572963773
## Ridge test mae: 6.142385805767498
##
## Ridge(alpha=0.05000000000000001)
## n: 2, alpha 0.05000000000000001
## Ridge train mae: 5.420118033589764
## Ridge test mae: 6.14245543157578
##
## Ridge(alpha=0.06000000000000001)
## n: 2, alpha 0.06000000000000001
## Ridge train mae: 5.420200465043034
## Ridge test mae: 6.142525018751079
##
## Ridge(alpha=0.07)
## n: 2, alpha 0.07
## Ridge train mae: 5.4202828673392585
## Ridge test mae: 6.142594567317626
##
## Ridge(alpha=0.08)
## n: 2, alpha 0.08
## Ridge train mae: 5.420365240494108
## Ridge test mae: 6.142664077299633
##
## Ridge(alpha=0.09000000000000001)
## n: 2, alpha 0.09000000000000001
## Ridge train mae: 5.420447584523234
## Ridge test mae: 6.142733548721289
##
## Ridge(alpha=0.1)
## n: 2, alpha 0.1
## Ridge train mae: 5.420529899442285
## Ridge test mae: 6.142802981606769
##
## Ridge(alpha=0.01)
## n: 3, alpha 0.01
## Ridge train mae: 5.257306460235128
## Ridge test mae: 5.746328030735457
##
## Ridge(alpha=0.020000000000000004)
## n: 3, alpha 0.020000000000000004
## Ridge train mae: 5.2573879459838935
## Ridge test mae: 5.74651728936748
##
## Ridge(alpha=0.030000000000000006)
## n: 3, alpha 0.030000000000000006
## Ridge train mae: 5.257469350923302
## Ridge test mae: 5.746706252073681
##
## Ridge(alpha=0.04000000000000001)
## n: 3, alpha 0.04000000000000001
## Ridge train mae: 5.25755067520373
## Ridge test mae: 5.746894919491035
##
## Ridge(alpha=0.05000000000000001)
## n: 3, alpha 0.05000000000000001
## Ridge train mae: 5.257631918975165
## Ridge test mae: 5.747083292254784
##
## Ridge(alpha=0.06000000000000001)
## n: 3, alpha 0.06000000000000001
## Ridge train mae: 5.2577130823871885
## Ridge test mae: 5.747271370998428
##
## Ridge(alpha=0.07)
## n: 3, alpha 0.07
## Ridge train mae: 5.257794165589003
## Ridge test mae: 5.747459156353747
##
## Ridge(alpha=0.08)
## n: 3, alpha 0.08
## Ridge train mae: 5.257875168729412
## Ridge test mae: 5.747646648950769
##
## Ridge(alpha=0.09000000000000001)
## n: 3, alpha 0.09000000000000001
## Ridge train mae: 5.2579560919568396
## Ridge test mae: 5.747833849417803
##
## Ridge(alpha=0.1)
## n: 3, alpha 0.1
## Ridge train mae: 5.258036935419317
## Ridge test mae: 5.748020758381457
##
## Ridge(alpha=0.01)
## n: 4, alpha 0.01
## Ridge train mae: 5.078453229869225
## Ridge test mae: 6.07166585425305
##
## Ridge(alpha=0.020000000000000004)
## n: 4, alpha 0.020000000000000004
## Ridge train mae: 5.078486728809414
## Ridge test mae: 6.071324703217347
##
## Ridge(alpha=0.030000000000000006)
## n: 4, alpha 0.030000000000000006
## Ridge train mae: 5.078520581945106
## Ridge test mae: 6.070986746705667
##
## Ridge(alpha=0.04000000000000001)
## n: 4, alpha 0.04000000000000001
## Ridge train mae: 5.078554784064771
## Ridge test mae: 6.070651947652118
##
## Ridge(alpha=0.05000000000000001)
## n: 4, alpha 0.05000000000000001
## Ridge train mae: 5.078589330032319
## Ridge test mae: 6.0703202695232665
##
## Ridge(alpha=0.06000000000000001)
## n: 4, alpha 0.06000000000000001
## Ridge train mae: 5.078624214785893
## Ridge test mae: 6.069991676309093
##
## Ridge(alpha=0.07)
## n: 4, alpha 0.07
## Ridge train mae: 5.078659433336498
## Ridge test mae: 6.069666132513707
##
## Ridge(alpha=0.08)
## n: 4, alpha 0.08
## Ridge train mae: 5.07869498076683
## Ridge test mae: 6.069343603146454
##
## Ridge(alpha=0.09000000000000001)
## n: 4, alpha 0.09000000000000001
## Ridge train mae: 5.078730852229972
## Ridge test mae: 6.069024053713063
##
## Ridge(alpha=0.1)
## n: 4, alpha 0.1
## Ridge train mae: 5.078767042948241
## Ridge test mae: 6.068707450207112
##
## Ridge(alpha=0.01)
## n: 5, alpha 0.01
## Ridge train mae: 4.592942321154879
## Ridge test mae: 7.6263872681781395
##
## Ridge(alpha=0.020000000000000004)
## n: 5, alpha 0.020000000000000004
## Ridge train mae: 4.592472239791051
## Ridge test mae: 7.623392592210273
##
## Ridge(alpha=0.030000000000000006)
## n: 5, alpha 0.030000000000000006
## Ridge train mae: 4.592006794810789
## Ridge test mae: 7.620335412965674
##
## Ridge(alpha=0.04000000000000001)
## n: 5, alpha 0.04000000000000001
## Ridge train mae: 4.591545846385613
## Ridge test mae: 7.617219781492997
##
## Ridge(alpha=0.05000000000000001)
## n: 5, alpha 0.05000000000000001
## Ridge train mae: 4.591089262458251
## Ridge test mae: 7.61404950461583
##
## Ridge(alpha=0.06000000000000001)
## n: 5, alpha 0.06000000000000001
## Ridge train mae: 4.590675166559448
## Ridge test mae: 7.610923137786759
##
## Ridge(alpha=0.07)
## n: 5, alpha 0.07
## Ridge train mae: 4.590284071959667
## Ridge test mae: 7.60782720216512
##
## Ridge(alpha=0.08)
## n: 5, alpha 0.08
## Ridge train mae: 4.589894872552797
## Ridge test mae: 7.604685114387745
##
## Ridge(alpha=0.09000000000000001)
## n: 5, alpha 0.09000000000000001
## Ridge train mae: 4.58950755824834
## Ridge test mae: 7.601499903464217
##
## Ridge(alpha=0.1)
## n: 5, alpha 0.1
## Ridge train mae: 4.589122119093563
## Ridge test mae: 7.598274424900914
##
## Ridge(alpha=0.01)
## n: 6, alpha 0.01
## Ridge train mae: 4.372854140732388
## Ridge test mae: 13.578451730706403
##
## Ridge(alpha=0.020000000000000004)
## n: 6, alpha 0.020000000000000004
## Ridge train mae: 4.361406078796805
## Ridge test mae: 13.063643280873213
##
## Ridge(alpha=0.030000000000000006)
## n: 6, alpha 0.030000000000000006
## Ridge train mae: 4.351787912179385
## Ridge test mae: 12.606954306842633
##
## Ridge(alpha=0.04000000000000001)
## n: 6, alpha 0.04000000000000001
## Ridge train mae: 4.343233339562473
## Ridge test mae: 12.199491457494283
##
## Ridge(alpha=0.05000000000000001)
## n: 6, alpha 0.05000000000000001
## Ridge train mae: 4.335503318596305
## Ridge test mae: 11.833373501412852
##
## Ridge(alpha=0.06000000000000001)
## n: 6, alpha 0.06000000000000001
## Ridge train mae: 4.32974017655789
## Ridge test mae: 11.50287180855749
##
## Ridge(alpha=0.07)
## n: 6, alpha 0.07
## Ridge train mae: 4.325716639441944
## Ridge test mae: 11.203357982585137
##
## Ridge(alpha=0.08)
## n: 6, alpha 0.08
## Ridge train mae: 4.322743599071541
## Ridge test mae: 10.932097376706288
##
## Ridge(alpha=0.09000000000000001)
## n: 6, alpha 0.09000000000000001
## Ridge train mae: 4.320829259063197
## Ridge test mae: 10.855601985120046
##
## Ridge(alpha=0.1)
## n: 6, alpha 0.1
## Ridge train mae: 4.3191742804859405
## Ridge test mae: 10.844328497651988
##
## Ridge(alpha=0.01)
## n: 7, alpha 0.01
## Ridge train mae: 3.6695634828980355
## Ridge test mae: 32.971535138420286
##
## Ridge(alpha=0.020000000000000004)
## n: 7, alpha 0.020000000000000004
## Ridge train mae: 3.6519236204065595
## Ridge test mae: 30.344671517825283
##
## Ridge(alpha=0.030000000000000006)
## n: 7, alpha 0.030000000000000006
## Ridge train mae: 3.6499653510706715
## Ridge test mae: 28.488463369577268
##
## Ridge(alpha=0.04000000000000001)
## n: 7, alpha 0.04000000000000001
## Ridge train mae: 3.65595485104245
## Ridge test mae: 27.13309705562507
##
## Ridge(alpha=0.05000000000000001)
## n: 7, alpha 0.05000000000000001
## Ridge train mae: 3.661490497062608
## Ridge test mae: 26.151214938327325
##
## Ridge(alpha=0.06000000000000001)
## n: 7, alpha 0.06000000000000001
## Ridge train mae: 3.6664852845603
## Ridge test mae: 25.360421728811534
##
## Ridge(alpha=0.07)
## n: 7, alpha 0.07
## Ridge train mae: 3.6723092910229034
## Ridge test mae: 24.705760260224125
##
## Ridge(alpha=0.08)
## n: 7, alpha 0.08
## Ridge train mae: 3.6797223955442577
## Ridge test mae: 24.149637403823164
##
## Ridge(alpha=0.09000000000000001)
## n: 7, alpha 0.09000000000000001
## Ridge train mae: 3.686285498881524
## Ridge test mae: 23.668271292252104
##
## Ridge(alpha=0.1)
## n: 7, alpha 0.1
## Ridge train mae: 3.6924817648873645
## Ridge test mae: 23.244456914649188
n=3
poly = PolynomialFeatures(degree=n, interaction_only = False)
X_poly_train = poly.fit_transform(X_train)
X_poly_test = poly.transform(X_test)
train_mae = []
test_mae = []
a_values = np.linspace(0.01, 0.1, 10)
for a in a_values:
model = Ridge(alpha=a)
model.fit(X_poly_train, y_train)
LR_train_mae = mae(model.predict(X_poly_train), y_train)
LR_test_mae = mae(model.predict(X_poly_test), y_test)
train_mae.append(LR_train_mae)
test_mae.append(LR_test_mae)
#print("n: {}, alpha {}".format(n, a))
#print("Ridge train mae: {}".format(LR_train_mae))
#print("Ridge test mae: {}".format(LR_test_mae))
#print("")
## Ridge(alpha=0.01)
## Ridge(alpha=0.020000000000000004)
## Ridge(alpha=0.030000000000000006)
## Ridge(alpha=0.04000000000000001)
## Ridge(alpha=0.05000000000000001)
## Ridge(alpha=0.06000000000000001)
## Ridge(alpha=0.07)
## Ridge(alpha=0.08)
## Ridge(alpha=0.09000000000000001)
## Ridge(alpha=0.1)
plt.plot(a_values, train_mae, label="train_mae")
plt.plot(a_values, test_mae, label="test_mae")
plt.xlabel("alpha")
plt.legend()
Results from ridge regression show that the tuning parameter does not make a substantial difference when using the polynomial of best fit. Indeed, ridge is most effective when there is a high number of predictors and high collinearity. In this case, the highly collinear variables were removed in the EDA, leaving only 3 predictors for our modelling process. As such, we will use neural networks to try to further improve our model.
Neural networks are extremely good at finding patterns in complex data such as images or sound. They learn their own non-linear features through gradient-based training (backpropagation). Whilst neural networks are known for working well on complex homogeneous data (images, sound, video), we are interested in seeing whether they work on a task such as this one (tabular data).
import keras
class Mul(keras.layers.Layer):
    """Custom layer that multiplies its inputs by a fixed constant."""
    def __init__(self, val):
        super().__init__()  # initialize the base Layer (omitting this breaks Keras)
        self.const = val
    def call(self, inputs):
        return inputs * self.const
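The Mul layer is not used in the model below (the commented-out Lambda there plays the same role). For reference, a hedged sketch of how it could rescale a sigmoid-bounded output into the price range; alt_model and the reuse of the 78.3 factor are illustrative only:
alt_model = keras.models.Sequential()
alt_model.add(keras.layers.Dense(10, input_dim=3, activation="relu"))
alt_model.add(keras.layers.Dense(1, activation="sigmoid"))  # output in (0, 1)
alt_model.add(Mul(78.3))  # scale factor borrowed from the commented-out Lambda below
alt_model.compile(loss="mse", optimizer="adam", metrics=["mae"])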
We trial two model architectures: architecture one has two hidden layers of 10 and 5 nodes; architecture two has one hidden layer of 3 nodes. Each architecture was trained for 100 epochs, using the Adam optimizer and a mean squared error loss (however, we continue to validate using MAE).
model = keras.models.Sequential()
model.add(keras.layers.Dense(10, input_dim=3, activation="relu"))
model.add(keras.layers.Dense(5, activation="relu"))
model.add(keras.layers.Dense(1, activation="relu"))
#model.add(keras.layers.Lambda(lambda x:x*78.3))
# A sigmoid output would put the network's output between 0 and 1; the commented-out
# Lambda would then multiply by the max expected value so the network outputs in the desired range.
model.compile(loss="mse", optimizer="adam", metrics=["mae"])
history = model.fit(X_train, y_train, epochs=100, batch_size=1, validation_data=(X_test, y_test))
## [Keras per-batch progress bars omitted; representative epoch-end metrics only.]
## Epoch 1/100:  loss 1623.02 - mae 37.99 - val_loss 1446.94 - val_mae 35.36
## Epoch 5/100:  loss 71.04 - mae 6.28 - val_loss 100.30 - val_mae 6.78
## Epoch 10/100: loss 61.72 - mae 5.47 - val_loss 92.13 - val_mae 6.31
## Epoch 20/100: loss 59.82 - mae 5.38 - val_loss 89.36 - val_mae 6.09
## Epoch 27/100: loss 58.59 - mae 5.30 - val_loss 90.52 - val_mae 6.15
## [Validation MAE settles around 6.0-6.2 after roughly epoch 10; log truncated mid-epoch 28.]
113/277 [===========>..................] - ETA: 0s - loss: 72.0104 - mae: 5.8125
146/277 [==============>...............] - ETA: 0s - loss: 67.3980 - mae: 5.6549
184/277 [==================>...........] - ETA: 0s - loss: 61.3175 - mae: 5.3768
221/277 [======================>.......] - ETA: 0s - loss: 61.2513 - mae: 5.2983
256/277 [==========================>...] - ETA: 0s - loss: 60.1071 - mae: 5.2829
277/277 [==============================] - 1s 2ms/step - loss: 58.8507 - mae: 5.2839 - val_loss: 88.5909 - val_mae: 6.0390
## Epoch 29/100
##
1/277 [..............................] - ETA: 0s - loss: 72.8576 - mae: 8.5357
40/277 [===>..........................] - ETA: 0s - loss: 29.8168 - mae: 4.3446
78/277 [=======>......................] - ETA: 0s - loss: 65.5703 - mae: 5.4785
110/277 [==========>...................] - ETA: 0s - loss: 60.8986 - mae: 5.4456
142/277 [==============>...............] - ETA: 0s - loss: 56.8951 - mae: 5.3133
177/277 [==================>...........] - ETA: 0s - loss: 55.3426 - mae: 5.1852
213/277 [======================>.......] - ETA: 0s - loss: 55.7008 - mae: 5.0816
247/277 [=========================>....] - ETA: 0s - loss: 55.2233 - mae: 5.1567
277/277 [==============================] - 1s 2ms/step - loss: 58.2525 - mae: 5.2989 - val_loss: 89.6183 - val_mae: 6.1004
## Epoch 30/100
##
1/277 [..............................] - ETA: 0s - loss: 92.8178 - mae: 9.6342
38/277 [===>..........................] - ETA: 0s - loss: 49.8727 - mae: 4.9221
69/277 [======>.......................] - ETA: 0s - loss: 48.5909 - mae: 5.0138
99/277 [=========>....................] - ETA: 0s - loss: 54.9908 - mae: 5.4460
131/277 [=============>................] - ETA: 0s - loss: 55.2166 - mae: 5.4156
171/277 [=================>............] - ETA: 0s - loss: 52.7007 - mae: 5.2902
205/277 [=====================>........] - ETA: 0s - loss: 48.4763 - mae: 5.0614
241/277 [=========================>....] - ETA: 0s - loss: 54.6512 - mae: 5.1771
277/277 [==============================] - 1s 2ms/step - loss: 58.5629 - mae: 5.2766 - val_loss: 87.7345 - val_mae: 5.9872
## Epoch 31/100
##
1/277 [..............................] - ETA: 0s - loss: 0.0438 - mae: 0.2092
38/277 [===>..........................] - ETA: 0s - loss: 67.3844 - mae: 5.2404
74/277 [=======>......................] - ETA: 0s - loss: 67.5295 - mae: 5.5562
113/277 [===========>..................] - ETA: 0s - loss: 59.8469 - mae: 5.4165
149/277 [===============>..............] - ETA: 0s - loss: 56.2368 - mae: 5.3624
185/277 [===================>..........] - ETA: 0s - loss: 53.9630 - mae: 5.2204
222/277 [=======================>......] - ETA: 0s - loss: 50.4345 - mae: 5.1128
256/277 [==========================>...] - ETA: 0s - loss: 55.5067 - mae: 5.2534
277/277 [==============================] - 1s 2ms/step - loss: 58.4518 - mae: 5.3031 - val_loss: 88.0219 - val_mae: 5.9975
## Epoch 32/100
##
1/277 [..............................] - ETA: 0s - loss: 17.7840 - mae: 4.2171
38/277 [===>..........................] - ETA: 0s - loss: 71.4815 - mae: 5.6563
76/277 [=======>......................] - ETA: 0s - loss: 55.4132 - mae: 5.1470
117/277 [===========>..................] - ETA: 0s - loss: 58.2147 - mae: 5.3634
153/277 [===============>..............] - ETA: 0s - loss: 55.0822 - mae: 5.1295
188/277 [===================>..........] - ETA: 0s - loss: 51.8501 - mae: 5.0225
227/277 [=======================>......] - ETA: 0s - loss: 56.1524 - mae: 5.2075
264/277 [===========================>..] - ETA: 0s - loss: 59.4908 - mae: 5.3261
277/277 [==============================] - 1s 2ms/step - loss: 58.1015 - mae: 5.2719 - val_loss: 87.7934 - val_mae: 5.9897
## Epoch 33/100
##
1/277 [..............................] - ETA: 0s - loss: 4.4875 - mae: 2.1184
38/277 [===>..........................] - ETA: 0s - loss: 77.2530 - mae: 5.9930
76/277 [=======>......................] - ETA: 0s - loss: 70.7140 - mae: 6.0912
113/277 [===========>..................] - ETA: 0s - loss: 63.5410 - mae: 5.8383
153/277 [===============>..............] - ETA: 0s - loss: 63.0702 - mae: 5.5884
189/277 [===================>..........] - ETA: 0s - loss: 57.5962 - mae: 5.2922
224/277 [=======================>......] - ETA: 0s - loss: 60.0948 - mae: 5.3641
264/277 [===========================>..] - ETA: 0s - loss: 58.0438 - mae: 5.2308
277/277 [==============================] - 1s 2ms/step - loss: 58.4764 - mae: 5.2832 - val_loss: 88.1504 - val_mae: 6.0189
## Epoch 34/100
##
1/277 [..............................] - ETA: 0s - loss: 77.0801 - mae: 8.7795
36/277 [==>...........................] - ETA: 0s - loss: 86.2519 - mae: 6.3250
80/277 [=======>......................] - ETA: 0s - loss: 64.8239 - mae: 5.4511
118/277 [===========>..................] - ETA: 0s - loss: 57.4465 - mae: 5.2454
154/277 [===============>..............] - ETA: 0s - loss: 53.9281 - mae: 5.1715
191/277 [===================>..........] - ETA: 0s - loss: 55.0015 - mae: 5.1016
224/277 [=======================>......] - ETA: 0s - loss: 54.4042 - mae: 5.1032
260/277 [===========================>..] - ETA: 0s - loss: 58.0165 - mae: 5.2565
277/277 [==============================] - 1s 2ms/step - loss: 57.3072 - mae: 5.2990 - val_loss: 87.6568 - val_mae: 5.9852
## Epoch 35/100
##
1/277 [..............................] - ETA: 0s - loss: 35.4942 - mae: 5.9577
37/277 [===>..........................] - ETA: 0s - loss: 44.8586 - mae: 5.1091
75/277 [=======>......................] - ETA: 0s - loss: 44.9240 - mae: 5.0881
113/277 [===========>..................] - ETA: 0s - loss: 52.9135 - mae: 5.1395
152/277 [===============>..............] - ETA: 0s - loss: 55.3328 - mae: 5.1605
190/277 [===================>..........] - ETA: 0s - loss: 55.6933 - mae: 5.0912
229/277 [=======================>......] - ETA: 0s - loss: 61.0110 - mae: 5.2694
266/277 [===========================>..] - ETA: 0s - loss: 58.6089 - mae: 5.2178
277/277 [==============================] - 1s 2ms/step - loss: 57.7776 - mae: 5.1971 - val_loss: 88.8243 - val_mae: 6.0579
## Epoch 36/100
##
1/277 [..............................] - ETA: 0s - loss: 134.4067 - mae: 11.5934
36/277 [==>...........................] - ETA: 0s - loss: 73.3329 - mae: 6.1095
77/277 [=======>......................] - ETA: 0s - loss: 67.1707 - mae: 6.0441
114/277 [===========>..................] - ETA: 0s - loss: 60.9414 - mae: 5.6028
147/277 [==============>...............] - ETA: 0s - loss: 69.7676 - mae: 5.7793
180/277 [==================>...........] - ETA: 0s - loss: 69.2524 - mae: 5.5959
211/277 [=====================>........] - ETA: 0s - loss: 64.6170 - mae: 5.4247
244/277 [=========================>....] - ETA: 0s - loss: 59.3909 - mae: 5.2357
277/277 [==============================] - 1s 2ms/step - loss: 57.9905 - mae: 5.2495 - val_loss: 87.6656 - val_mae: 5.9798
## Epoch 37/100
##
1/277 [..............................] - ETA: 0s - loss: 0.6557 - mae: 0.8097
32/277 [==>...........................] - ETA: 0s - loss: 77.1869 - mae: 5.7527
57/277 [=====>........................] - ETA: 0s - loss: 63.6444 - mae: 5.5380
85/277 [========>.....................] - ETA: 0s - loss: 60.2226 - mae: 5.5078
116/277 [===========>..................] - ETA: 0s - loss: 51.3104 - mae: 5.1392
146/277 [==============>...............] - ETA: 0s - loss: 48.3470 - mae: 4.9202
175/277 [=================>............] - ETA: 0s - loss: 48.9210 - mae: 4.9231
206/277 [=====================>........] - ETA: 0s - loss: 58.1464 - mae: 5.2607
236/277 [========================>.....] - ETA: 0s - loss: 60.2096 - mae: 5.2888
265/277 [===========================>..] - ETA: 0s - loss: 59.9013 - mae: 5.3601
277/277 [==============================] - 1s 2ms/step - loss: 57.8075 - mae: 5.2505 - val_loss: 87.8749 - val_mae: 5.9877
## Epoch 38/100
##
1/277 [..............................] - ETA: 0s - loss: 8.1752 - mae: 2.8592
34/277 [==>...........................] - ETA: 0s - loss: 36.2864 - mae: 4.7459
66/277 [======>.......................] - ETA: 0s - loss: 60.0328 - mae: 5.6246
95/277 [=========>....................] - ETA: 0s - loss: 62.9563 - mae: 5.4820
124/277 [============>.................] - ETA: 0s - loss: 64.5519 - mae: 5.5681
151/277 [===============>..............] - ETA: 0s - loss: 60.4609 - mae: 5.4119
180/277 [==================>...........] - ETA: 0s - loss: 63.6329 - mae: 5.4290
212/277 [=====================>........] - ETA: 0s - loss: 60.7366 - mae: 5.3604
242/277 [=========================>....] - ETA: 0s - loss: 58.1808 - mae: 5.2674
273/277 [============================>.] - ETA: 0s - loss: 57.5158 - mae: 5.2764
277/277 [==============================] - 1s 2ms/step - loss: 57.6881 - mae: 5.2946 - val_loss: 87.7492 - val_mae: 5.9794
## Epoch 39/100
##
1/277 [..............................] - ETA: 0s - loss: 1.7220 - mae: 1.3122
38/277 [===>..........................] - ETA: 0s - loss: 40.1984 - mae: 4.7020
76/277 [=======>......................] - ETA: 0s - loss: 61.6157 - mae: 5.3561
111/277 [===========>..................] - ETA: 0s - loss: 69.2233 - mae: 5.3600
148/277 [===============>..............] - ETA: 0s - loss: 61.1794 - mae: 5.2194
190/277 [===================>..........] - ETA: 0s - loss: 60.5657 - mae: 5.2752
230/277 [=======================>......] - ETA: 0s - loss: 59.2432 - mae: 5.2733
267/277 [===========================>..] - ETA: 0s - loss: 57.1532 - mae: 5.1882
277/277 [==============================] - 1s 2ms/step - loss: 57.5092 - mae: 5.2359 - val_loss: 88.6470 - val_mae: 6.0484
## Epoch 40/100
##
1/277 [..............................] - ETA: 0s - loss: 17.6898 - mae: 4.2059
33/277 [==>...........................] - ETA: 0s - loss: 57.0082 - mae: 5.2721
70/277 [======>.......................] - ETA: 0s - loss: 48.9822 - mae: 4.9874
105/277 [==========>...................] - ETA: 0s - loss: 68.0196 - mae: 5.4906
141/277 [==============>...............] - ETA: 0s - loss: 60.2327 - mae: 5.2526
176/277 [==================>...........] - ETA: 0s - loss: 60.3131 - mae: 5.2668
201/277 [====================>.........] - ETA: 0s - loss: 59.4042 - mae: 5.3354
229/277 [=======================>......] - ETA: 0s - loss: 56.8381 - mae: 5.2320
263/277 [===========================>..] - ETA: 0s - loss: 58.9363 - mae: 5.3113
277/277 [==============================] - 1s 2ms/step - loss: 57.4515 - mae: 5.2653 - val_loss: 88.2045 - val_mae: 6.0193
## Epoch 41/100
##
1/277 [..............................] - ETA: 0s - loss: 11.2698 - mae: 3.3570
40/277 [===>..........................] - ETA: 0s - loss: 54.1507 - mae: 5.3540
77/277 [=======>......................] - ETA: 0s - loss: 58.4281 - mae: 4.9964
113/277 [===========>..................] - ETA: 0s - loss: 65.1351 - mae: 5.3062
152/277 [===============>..............] - ETA: 0s - loss: 57.1780 - mae: 4.9819
195/277 [====================>.........] - ETA: 0s - loss: 57.8602 - mae: 5.2243
231/277 [========================>.....] - ETA: 0s - loss: 62.7489 - mae: 5.5072
266/277 [===========================>..] - ETA: 0s - loss: 59.2836 - mae: 5.3499
277/277 [==============================] - 1s 2ms/step - loss: 57.4513 - mae: 5.2501 - val_loss: 87.0614 - val_mae: 5.9509
## Epoch 42/100
##
1/277 [..............................] - ETA: 0s - loss: 9.7035 - mae: 3.1150
39/277 [===>..........................] - ETA: 0s - loss: 51.2894 - mae: 5.6680
77/277 [=======>......................] - ETA: 0s - loss: 80.4619 - mae: 6.3475
108/277 [==========>...................] - ETA: 0s - loss: 67.5197 - mae: 5.6747
138/277 [=============>................] - ETA: 0s - loss: 61.9778 - mae: 5.4573
174/277 [=================>............] - ETA: 0s - loss: 65.4230 - mae: 5.4584
208/277 [=====================>........] - ETA: 0s - loss: 60.8774 - mae: 5.3569
244/277 [=========================>....] - ETA: 0s - loss: 57.2851 - mae: 5.2572
275/277 [============================>.] - ETA: 0s - loss: 55.1560 - mae: 5.1726
277/277 [==============================] - 1s 2ms/step - loss: 57.5231 - mae: 5.2753 - val_loss: 87.5765 - val_mae: 5.9549
## Epoch 43/100
##
1/277 [..............................] - ETA: 0s - loss: 1176.3610 - mae: 34.2981
41/277 [===>..........................] - ETA: 0s - loss: 78.2336 - mae: 6.1417
77/277 [=======>......................] - ETA: 0s - loss: 68.5236 - mae: 5.9675
110/277 [==========>...................] - ETA: 0s - loss: 61.9956 - mae: 5.6389
149/277 [===============>..............] - ETA: 0s - loss: 58.2814 - mae: 5.4783
184/277 [==================>...........] - ETA: 0s - loss: 61.0736 - mae: 5.4994
220/277 [======================>.......] - ETA: 0s - loss: 64.0014 - mae: 5.4886
258/277 [==========================>...] - ETA: 0s - loss: 59.8799 - mae: 5.3561
277/277 [==============================] - 1s 2ms/step - loss: 57.4418 - mae: 5.2422 - val_loss: 87.7055 - val_mae: 5.9650
## Epoch 44/100
##
1/277 [..............................] - ETA: 0s - loss: 0.6025 - mae: 0.7762
41/277 [===>..........................] - ETA: 0s - loss: 45.9951 - mae: 4.8950
82/277 [=======>......................] - ETA: 0s - loss: 43.6092 - mae: 4.7256
118/277 [===========>..................] - ETA: 0s - loss: 53.7457 - mae: 4.9897
152/277 [===============>..............] - ETA: 0s - loss: 63.3803 - mae: 5.2121
190/277 [===================>..........] - ETA: 0s - loss: 60.8902 - mae: 5.2061
222/277 [=======================>......] - ETA: 0s - loss: 55.8021 - mae: 5.0481
251/277 [==========================>...] - ETA: 0s - loss: 55.6088 - mae: 5.0996
277/277 [==============================] - 1s 2ms/step - loss: 57.3952 - mae: 5.2363 - val_loss: 88.1010 - val_mae: 6.0025
## Epoch 45/100
##
1/277 [..............................] - ETA: 0s - loss: 0.9426 - mae: 0.9709
32/277 [==>...........................] - ETA: 0s - loss: 58.5807 - mae: 5.8696
54/277 [====>.........................] - ETA: 0s - loss: 47.9849 - mae: 5.0811
74/277 [=======>......................] - ETA: 0s - loss: 45.2415 - mae: 5.0153
96/277 [=========>....................] - ETA: 0s - loss: 59.1902 - mae: 5.4514
120/277 [===========>..................] - ETA: 0s - loss: 70.7545 - mae: 5.7870
144/277 [==============>...............] - ETA: 0s - loss: 63.0864 - mae: 5.4365
170/277 [=================>............] - ETA: 0s - loss: 62.5363 - mae: 5.4558
200/277 [====================>.........] - ETA: 0s - loss: 65.7890 - mae: 5.5768
228/277 [=======================>......] - ETA: 0s - loss: 61.4734 - mae: 5.4495
258/277 [==========================>...] - ETA: 0s - loss: 57.9451 - mae: 5.2820
277/277 [==============================] - 1s 3ms/step - loss: 57.2738 - mae: 5.2316 - val_loss: 87.9570 - val_mae: 5.9966
## Epoch 46/100
##
1/277 [..............................] - ETA: 0s - loss: 0.3957 - mae: 0.6291
31/277 [==>...........................] - ETA: 0s - loss: 33.5879 - mae: 3.9836
62/277 [=====>........................] - ETA: 0s - loss: 40.8066 - mae: 4.7937
91/277 [========>.....................] - ETA: 0s - loss: 54.6632 - mae: 5.1234
119/277 [===========>..................] - ETA: 0s - loss: 64.1405 - mae: 5.4765
149/277 [===============>..............] - ETA: 0s - loss: 55.3786 - mae: 5.0856
182/277 [==================>...........] - ETA: 0s - loss: 56.0819 - mae: 5.2335
213/277 [======================>.......] - ETA: 0s - loss: 54.9238 - mae: 5.2052
242/277 [=========================>....] - ETA: 0s - loss: 54.1862 - mae: 5.2005
272/277 [============================>.] - ETA: 0s - loss: 55.8447 - mae: 5.2046
277/277 [==============================] - 1s 2ms/step - loss: 57.2073 - mae: 5.2649 - val_loss: 87.4374 - val_mae: 5.9418
## Epoch 47/100
##
1/277 [..............................] - ETA: 0s - loss: 1.0836 - mae: 1.0409
30/277 [==>...........................] - ETA: 0s - loss: 50.7066 - mae: 5.1440
60/277 [=====>........................] - ETA: 0s - loss: 73.6922 - mae: 5.8286
90/277 [========>.....................] - ETA: 0s - loss: 78.5792 - mae: 6.0502
123/277 [============>.................] - ETA: 0s - loss: 78.7465 - mae: 5.9769
160/277 [================>.............] - ETA: 0s - loss: 68.3259 - mae: 5.5892
189/277 [===================>..........] - ETA: 0s - loss: 64.0287 - mae: 5.4208
216/277 [======================>.......] - ETA: 0s - loss: 61.7838 - mae: 5.3744
243/277 [=========================>....] - ETA: 0s - loss: 60.6352 - mae: 5.3943
273/277 [============================>.] - ETA: 0s - loss: 57.7953 - mae: 5.2982
277/277 [==============================] - 1s 2ms/step - loss: 57.1448 - mae: 5.2595 - val_loss: 87.7079 - val_mae: 5.9534
## Epoch 48/100
##
1/277 [..............................] - ETA: 0s - loss: 4.2698 - mae: 2.0664
36/277 [==>...........................] - ETA: 0s - loss: 30.3668 - mae: 4.2579
73/277 [======>.......................] - ETA: 0s - loss: 48.9670 - mae: 5.0408
113/277 [===========>..................] - ETA: 0s - loss: 71.6630 - mae: 5.7806
149/277 [===============>..............] - ETA: 0s - loss: 68.0546 - mae: 5.8177
184/277 [==================>...........] - ETA: 0s - loss: 62.9085 - mae: 5.6683
222/277 [=======================>......] - ETA: 0s - loss: 64.6344 - mae: 5.5765
258/277 [==========================>...] - ETA: 0s - loss: 58.3462 - mae: 5.2429
277/277 [==============================] - 1s 2ms/step - loss: 57.1967 - mae: 5.2285 - val_loss: 87.5092 - val_mae: 5.9743
## Epoch 49/100
##
1/277 [..............................] - ETA: 0s - loss: 5.2990 - mae: 2.3020
36/277 [==>...........................] - ETA: 0s - loss: 32.9564 - mae: 4.0380
71/277 [======>.......................] - ETA: 0s - loss: 47.6760 - mae: 4.8324
111/277 [===========>..................] - ETA: 0s - loss: 60.3016 - mae: 5.3745
146/277 [==============>...............] - ETA: 0s - loss: 59.0047 - mae: 5.3799
181/277 [==================>...........] - ETA: 0s - loss: 58.2967 - mae: 5.2758
221/277 [======================>.......] - ETA: 0s - loss: 57.0120 - mae: 5.2546
257/277 [==========================>...] - ETA: 0s - loss: 58.1452 - mae: 5.2433
277/277 [==============================] - 1s 2ms/step - loss: 56.9222 - mae: 5.2174 - val_loss: 87.0356 - val_mae: 5.9497
## Epoch 50/100
##
1/277 [..............................] - ETA: 0s - loss: 1315.5459 - mae: 36.2705
37/277 [===>..........................] - ETA: 0s - loss: 80.5443 - mae: 6.2939
71/277 [======>.......................] - ETA: 0s - loss: 65.8909 - mae: 5.7512
106/277 [==========>...................] - ETA: 0s - loss: 59.6626 - mae: 5.3121
140/277 [==============>...............] - ETA: 0s - loss: 57.2236 - mae: 5.2711
174/277 [=================>............] - ETA: 0s - loss: 53.2043 - mae: 5.0437
209/277 [=====================>........] - ETA: 0s - loss: 48.4101 - mae: 4.8428
245/277 [=========================>....] - ETA: 0s - loss: 49.2861 - mae: 4.9731
277/277 [==============================] - 1s 2ms/step - loss: 56.8748 - mae: 5.2285 - val_loss: 87.5985 - val_mae: 5.9933
## Epoch 51/100
##
1/277 [..............................] - ETA: 0s - loss: 4.9631 - mae: 2.2278
37/277 [===>..........................] - ETA: 0s - loss: 108.8634 - mae: 7.0766
72/277 [======>.......................] - ETA: 0s - loss: 76.8223 - mae: 6.1664
100/277 [=========>....................] - ETA: 0s - loss: 69.2379 - mae: 5.9951
123/277 [============>.................] - ETA: 0s - loss: 61.6896 - mae: 5.6591
156/277 [===============>..............] - ETA: 0s - loss: 60.6817 - mae: 5.5181
190/277 [===================>..........] - ETA: 0s - loss: 55.9716 - mae: 5.2425
222/277 [=======================>......] - ETA: 0s - loss: 57.5456 - mae: 5.2843
258/277 [==========================>...] - ETA: 0s - loss: 53.9121 - mae: 5.1830
277/277 [==============================] - 1s 2ms/step - loss: 56.7964 - mae: 5.2518 - val_loss: 88.7175 - val_mae: 6.0629
## Epoch 52/100
##
1/277 [..............................] - ETA: 0s - loss: 0.2574 - mae: 0.5073
31/277 [==>...........................] - ETA: 0s - loss: 69.2276 - mae: 5.7438
61/277 [=====>........................] - ETA: 0s - loss: 102.5825 - mae: 6.5023
93/277 [=========>....................] - ETA: 0s - loss: 82.9357 - mae: 5.7416
126/277 [============>.................] - ETA: 0s - loss: 77.7548 - mae: 5.7557
163/277 [================>.............] - ETA: 0s - loss: 72.8385 - mae: 5.7439
202/277 [====================>.........] - ETA: 0s - loss: 65.4831 - mae: 5.5432
239/277 [========================>.....] - ETA: 0s - loss: 59.0440 - mae: 5.2485
277/277 [==============================] - ETA: 0s - loss: 56.4294 - mae: 5.2293
277/277 [==============================] - 1s 2ms/step - loss: 56.4294 - mae: 5.2293 - val_loss: 87.1404 - val_mae: 5.9369
## Epoch 53/100
##
1/277 [..............................] - ETA: 0s - loss: 2.1208 - mae: 1.4563
41/277 [===>..........................] - ETA: 0s - loss: 42.4452 - mae: 4.8939
74/277 [=======>......................] - ETA: 0s - loss: 41.0402 - mae: 4.7868
111/277 [===========>..................] - ETA: 0s - loss: 38.6623 - mae: 4.8461
153/277 [===============>..............] - ETA: 0s - loss: 47.3578 - mae: 4.9643
191/277 [===================>..........] - ETA: 0s - loss: 48.4630 - mae: 5.0033
229/277 [=======================>......] - ETA: 0s - loss: 48.7854 - mae: 4.9522
269/277 [============================>.] - ETA: 0s - loss: 56.7688 - mae: 5.1865
277/277 [==============================] - 1s 2ms/step - loss: 57.3551 - mae: 5.2138 - val_loss: 86.8590 - val_mae: 5.9227
## Epoch 54/100
##
1/277 [..............................] - ETA: 0s - loss: 28.5087 - mae: 5.3394
39/277 [===>..........................] - ETA: 0s - loss: 84.2244 - mae: 5.8083
76/277 [=======>......................] - ETA: 0s - loss: 74.4817 - mae: 5.7482
112/277 [===========>..................] - ETA: 0s - loss: 67.2509 - mae: 5.7381
149/277 [===============>..............] - ETA: 0s - loss: 57.3397 - mae: 5.3447
185/277 [===================>..........] - ETA: 0s - loss: 61.8517 - mae: 5.5226
221/277 [======================>.......] - ETA: 0s - loss: 59.8668 - mae: 5.3388
263/277 [===========================>..] - ETA: 0s - loss: 57.8517 - mae: 5.2388
277/277 [==============================] - 1s 2ms/step - loss: 56.8211 - mae: 5.2041 - val_loss: 86.9620 - val_mae: 5.9470
## Epoch 55/100
##
1/277 [..............................] - ETA: 0s - loss: 265.9478 - mae: 16.3079
41/277 [===>..........................] - ETA: 0s - loss: 32.4446 - mae: 4.2118
76/277 [=======>......................] - ETA: 0s - loss: 47.2654 - mae: 4.8178
111/277 [===========>..................] - ETA: 0s - loss: 55.7798 - mae: 5.3151
152/277 [===============>..............] - ETA: 0s - loss: 62.0218 - mae: 5.4214
189/277 [===================>..........] - ETA: 0s - loss: 57.7103 - mae: 5.3007
223/277 [=======================>......] - ETA: 0s - loss: 56.8266 - mae: 5.3135
260/277 [===========================>..] - ETA: 0s - loss: 59.0778 - mae: 5.3408
277/277 [==============================] - 1s 2ms/step - loss: 56.9622 - mae: 5.2548 - val_loss: 86.9486 - val_mae: 5.9324
## Epoch 56/100
##
1/277 [..............................] - ETA: 0s - loss: 1.3848 - mae: 1.1768
29/277 [==>...........................] - ETA: 0s - loss: 43.1497 - mae: 4.9997
57/277 [=====>........................] - ETA: 0s - loss: 33.9924 - mae: 4.5429
85/277 [========>.....................] - ETA: 0s - loss: 36.0229 - mae: 4.7829
113/277 [===========>..................] - ETA: 0s - loss: 42.3805 - mae: 5.0688
146/277 [==============>...............] - ETA: 0s - loss: 44.9885 - mae: 5.0713
188/277 [===================>..........] - ETA: 0s - loss: 48.2924 - mae: 4.9537
226/277 [=======================>......] - ETA: 0s - loss: 52.0743 - mae: 5.1170
264/277 [===========================>..] - ETA: 0s - loss: 58.6451 - mae: 5.3132
277/277 [==============================] - 1s 2ms/step - loss: 56.8219 - mae: 5.2121 - val_loss: 87.2013 - val_mae: 5.9749
## Epoch 57/100
##
1/277 [..............................] - ETA: 0s - loss: 6.9844 - mae: 2.6428
37/277 [===>..........................] - ETA: 0s - loss: 51.9718 - mae: 4.7963
75/277 [=======>......................] - ETA: 0s - loss: 52.2554 - mae: 4.7682
108/277 [==========>...................] - ETA: 0s - loss: 46.9857 - mae: 4.7017
142/277 [==============>...............] - ETA: 0s - loss: 48.8607 - mae: 5.0328
178/277 [==================>...........] - ETA: 0s - loss: 47.4135 - mae: 5.0118
212/277 [=====================>........] - ETA: 0s - loss: 54.3890 - mae: 5.2244
243/277 [=========================>....] - ETA: 0s - loss: 57.3801 - mae: 5.2156
272/277 [============================>.] - ETA: 0s - loss: 57.0109 - mae: 5.2220
277/277 [==============================] - 1s 2ms/step - loss: 56.8858 - mae: 5.2290 - val_loss: 86.9916 - val_mae: 5.9312
## Epoch 58/100
##
1/277 [..............................] - ETA: 0s - loss: 1.3519 - mae: 1.1627
32/277 [==>...........................] - ETA: 0s - loss: 58.4041 - mae: 5.5341
67/277 [======>.......................] - ETA: 0s - loss: 54.4853 - mae: 5.4887
104/277 [==========>...................] - ETA: 0s - loss: 60.0420 - mae: 5.4426
140/277 [==============>...............] - ETA: 0s - loss: 59.8065 - mae: 5.3554
176/277 [==================>...........] - ETA: 0s - loss: 56.0165 - mae: 5.2587
211/277 [=====================>........] - ETA: 0s - loss: 54.9491 - mae: 5.2033
240/277 [========================>.....] - ETA: 0s - loss: 58.9216 - mae: 5.2942
268/277 [============================>.] - ETA: 0s - loss: 56.9971 - mae: 5.1846
277/277 [==============================] - 1s 2ms/step - loss: 56.7490 - mae: 5.2138 - val_loss: 87.1137 - val_mae: 5.9591
## Epoch 59/100
##
1/277 [..............................] - ETA: 0s - loss: 17.0087 - mae: 4.1242
31/277 [==>...........................] - ETA: 0s - loss: 70.7701 - mae: 5.7022
62/277 [=====>........................] - ETA: 0s - loss: 69.5896 - mae: 5.8660
95/277 [=========>....................] - ETA: 0s - loss: 64.9976 - mae: 5.7331
126/277 [============>.................] - ETA: 0s - loss: 61.4049 - mae: 5.3999
158/277 [================>.............] - ETA: 0s - loss: 68.9643 - mae: 5.4831
188/277 [===================>..........] - ETA: 0s - loss: 63.5562 - mae: 5.3362
207/277 [=====================>........] - ETA: 0s - loss: 61.1886 - mae: 5.2921
227/277 [=======================>......] - ETA: 0s - loss: 58.5898 - mae: 5.2292
250/277 [==========================>...] - ETA: 0s - loss: 58.3933 - mae: 5.2749
277/277 [==============================] - ETA: 0s - loss: 56.5469 - mae: 5.2540
277/277 [==============================] - 1s 2ms/step - loss: 56.5469 - mae: 5.2540 - val_loss: 87.7696 - val_mae: 5.9697
## Epoch 60/100
##
1/277 [..............................] - ETA: 0s - loss: 23.6274 - mae: 4.8608
29/277 [==>...........................] - ETA: 0s - loss: 36.3173 - mae: 4.5905
58/277 [=====>........................] - ETA: 0s - loss: 64.2490 - mae: 5.3835
93/277 [=========>....................] - ETA: 0s - loss: 49.6535 - mae: 4.6993
128/277 [============>.................] - ETA: 0s - loss: 51.7118 - mae: 4.8466
168/277 [=================>............] - ETA: 0s - loss: 60.9307 - mae: 5.2886
208/277 [=====================>........] - ETA: 0s - loss: 60.9622 - mae: 5.4208
241/277 [=========================>....] - ETA: 0s - loss: 58.1299 - mae: 5.2366
269/277 [============================>.] - ETA: 0s - loss: 56.8301 - mae: 5.1893
277/277 [==============================] - 1s 2ms/step - loss: 56.4947 - mae: 5.1745 - val_loss: 86.9334 - val_mae: 5.9511
## Epoch 61/100
##
1/277 [..............................] - ETA: 0s - loss: 64.2592 - mae: 8.0162
36/277 [==>...........................] - ETA: 0s - loss: 22.2756 - mae: 3.5150
71/277 [======>.......................] - ETA: 0s - loss: 71.1273 - mae: 5.3632
103/277 [==========>...................] - ETA: 0s - loss: 66.3262 - mae: 5.1937
136/277 [=============>................] - ETA: 0s - loss: 57.5419 - mae: 5.0021
174/277 [=================>............] - ETA: 0s - loss: 59.8636 - mae: 5.3055
210/277 [=====================>........] - ETA: 0s - loss: 57.1817 - mae: 5.0939
247/277 [=========================>....] - ETA: 0s - loss: 56.9289 - mae: 5.2198
277/277 [==============================] - 1s 2ms/step - loss: 57.0253 - mae: 5.2476 - val_loss: 87.1181 - val_mae: 5.9372
## Epoch 62/100
##
1/277 [..............................] - ETA: 0s - loss: 54.8759 - mae: 7.4078
38/277 [===>..........................] - ETA: 0s - loss: 123.6667 - mae: 7.2089
76/277 [=======>......................] - ETA: 0s - loss: 87.2842 - mae: 6.2886
116/277 [===========>..................] - ETA: 0s - loss: 66.4762 - mae: 5.5100
153/277 [===============>..............] - ETA: 0s - loss: 61.2599 - mae: 5.4852
192/277 [===================>..........] - ETA: 0s - loss: 59.2901 - mae: 5.3215
231/277 [========================>.....] - ETA: 0s - loss: 56.4780 - mae: 5.1462
267/277 [===========================>..] - ETA: 0s - loss: 56.8461 - mae: 5.2216
277/277 [==============================] - 1s 2ms/step - loss: 56.9072 - mae: 5.2492 - val_loss: 88.2166 - val_mae: 6.0333
## Epoch 63/100
##
1/277 [..............................] - ETA: 0s - loss: 60.5800 - mae: 7.7833
39/277 [===>..........................] - ETA: 0s - loss: 54.5563 - mae: 4.8669
75/277 [=======>......................] - ETA: 0s - loss: 49.8065 - mae: 4.7256
115/277 [===========>..................] - ETA: 0s - loss: 51.2773 - mae: 4.9508
153/277 [===============>..............] - ETA: 0s - loss: 59.9294 - mae: 5.2191
187/277 [===================>..........] - ETA: 0s - loss: 66.0444 - mae: 5.4265
221/277 [======================>.......] - ETA: 0s - loss: 60.7590 - mae: 5.3047
243/277 [=========================>....] - ETA: 0s - loss: 57.8017 - mae: 5.2070
269/277 [============================>.] - ETA: 0s - loss: 57.5379 - mae: 5.2267
277/277 [==============================] - 1s 2ms/step - loss: 57.0174 - mae: 5.2231 - val_loss: 87.3554 - val_mae: 5.9603
## Epoch 64/100
##
1/277 [..............................] - ETA: 0s - loss: 0.3662 - mae: 0.6052
25/277 [=>............................] - ETA: 0s - loss: 66.6618 - mae: 6.2446
53/277 [====>.........................] - ETA: 0s - loss: 46.6132 - mae: 5.0786
83/277 [=======>......................] - ETA: 0s - loss: 37.0920 - mae: 4.5588
113/277 [===========>..................] - ETA: 0s - loss: 37.6642 - mae: 4.5787
144/277 [==============>...............] - ETA: 0s - loss: 38.4916 - mae: 4.6229
171/277 [=================>............] - ETA: 0s - loss: 49.2565 - mae: 4.8999
196/277 [====================>.........] - ETA: 0s - loss: 51.5091 - mae: 5.0578
224/277 [=======================>......] - ETA: 0s - loss: 54.9516 - mae: 5.2105
255/277 [==========================>...] - ETA: 0s - loss: 56.9348 - mae: 5.1838
277/277 [==============================] - 1s 3ms/step - loss: 56.3386 - mae: 5.1955 - val_loss: 87.5961 - val_mae: 5.9796
## Epoch 65/100
##
1/277 [..............................] - ETA: 0s - loss: 0.1004 - mae: 0.3169
25/277 [=>............................] - ETA: 0s - loss: 89.3571 - mae: 5.6464
49/277 [====>.........................] - ETA: 0s - loss: 95.1130 - mae: 6.0392
77/277 [=======>......................] - ETA: 0s - loss: 77.0982 - mae: 5.8199
102/277 [==========>...................] - ETA: 0s - loss: 77.3197 - mae: 5.8605
127/277 [============>.................] - ETA: 0s - loss: 79.3247 - mae: 6.0580
154/277 [===============>..............] - ETA: 0s - loss: 71.5094 - mae: 5.7948
183/277 [==================>...........] - ETA: 0s - loss: 69.7361 - mae: 5.6154
210/277 [=====================>........] - ETA: 0s - loss: 65.2752 - mae: 5.4623
237/277 [========================>.....] - ETA: 0s - loss: 62.2963 - mae: 5.4493
262/277 [===========================>..] - ETA: 0s - loss: 58.4535 - mae: 5.2820
277/277 [==============================] - 1s 3ms/step - loss: 56.4768 - mae: 5.2084 - val_loss: 86.7283 - val_mae: 5.9187
## Epoch 66/100
##
1/277 [..............................] - ETA: 0s - loss: 135.8277 - mae: 11.6545
35/277 [==>...........................] - ETA: 0s - loss: 47.6471 - mae: 5.2540
67/277 [======>.......................] - ETA: 0s - loss: 47.1827 - mae: 5.3232
95/277 [=========>....................] - ETA: 0s - loss: 56.9516 - mae: 5.7439
130/277 [=============>................] - ETA: 0s - loss: 58.3023 - mae: 5.4499
167/277 [=================>............] - ETA: 0s - loss: 53.7259 - mae: 5.2619
202/277 [====================>.........] - ETA: 0s - loss: 58.7869 - mae: 5.3660
234/277 [========================>.....] - ETA: 0s - loss: 54.9654 - mae: 5.1285
260/277 [===========================>..] - ETA: 0s - loss: 56.6571 - mae: 5.1720
277/277 [==============================] - 1s 2ms/step - loss: 57.0677 - mae: 5.2031 - val_loss: 87.3194 - val_mae: 5.9707
## Epoch 67/100
##
1/277 [..............................] - ETA: 0s - loss: 17.5864 - mae: 4.1936
31/277 [==>...........................] - ETA: 0s - loss: 46.5364 - mae: 4.3005
60/277 [=====>........................] - ETA: 0s - loss: 46.1346 - mae: 4.7059
78/277 [=======>......................] - ETA: 0s - loss: 49.0744 - mae: 5.0611
93/277 [=========>....................] - ETA: 0s - loss: 46.0926 - mae: 4.9109
109/277 [==========>...................] - ETA: 0s - loss: 54.5777 - mae: 5.1468
125/277 [============>.................] - ETA: 0s - loss: 55.8392 - mae: 5.1715
146/277 [==============>...............] - ETA: 0s - loss: 53.5374 - mae: 5.1201
170/277 [=================>............] - ETA: 0s - loss: 49.1939 - mae: 4.8805
196/277 [====================>.........] - ETA: 0s - loss: 48.8462 - mae: 4.8226
219/277 [======================>.......] - ETA: 0s - loss: 46.3437 - mae: 4.7246
246/277 [=========================>....] - ETA: 0s - loss: 55.1400 - mae: 5.1123
275/277 [============================>.] - ETA: 0s - loss: 55.0821 - mae: 5.1283
277/277 [==============================] - 1s 3ms/step - loss: 56.3971 - mae: 5.2025 - val_loss: 87.3357 - val_mae: 5.9398
## Epoch 68/100
##
1/277 [..............................] - ETA: 0s - loss: 1.2364 - mae: 1.1119
32/277 [==>...........................] - ETA: 0s - loss: 47.6557 - mae: 5.6957
68/277 [======>.......................] - ETA: 0s - loss: 54.9459 - mae: 5.7608
102/277 [==========>...................] - ETA: 0s - loss: 60.9561 - mae: 5.6299
136/277 [=============>................] - ETA: 0s - loss: 52.3271 - mae: 5.1705
164/277 [================>.............] - ETA: 0s - loss: 54.4770 - mae: 5.2984
189/277 [===================>..........] - ETA: 0s - loss: 59.0057 - mae: 5.4389
218/277 [======================>.......] - ETA: 0s - loss: 57.6317 - mae: 5.3048
256/277 [==========================>...] - ETA: 0s - loss: 55.8124 - mae: 5.2002
277/277 [==============================] - 1s 2ms/step - loss: 56.7936 - mae: 5.2103 - val_loss: 86.6269 - val_mae: 5.9133
## Epoch 69/100
##
1/277 [..............................] - ETA: 0s - loss: 1.3160 - mae: 1.1472
40/277 [===>..........................] - ETA: 0s - loss: 70.8765 - mae: 5.6519
78/277 [=======>......................] - ETA: 0s - loss: 53.5099 - mae: 5.0608
109/277 [==========>...................] - ETA: 0s - loss: 53.6070 - mae: 5.0938
129/277 [============>.................] - ETA: 0s - loss: 49.8411 - mae: 4.9348
150/277 [===============>..............] - ETA: 0s - loss: 55.8135 - mae: 5.0906
171/277 [=================>............] - ETA: 0s - loss: 55.0332 - mae: 5.0857
192/277 [===================>..........] - ETA: 0s - loss: 53.9870 - mae: 5.0734
212/277 [=====================>........] - ETA: 0s - loss: 56.6599 - mae: 5.2064
233/277 [========================>.....] - ETA: 0s - loss: 55.1160 - mae: 5.1720
255/277 [==========================>...] - ETA: 0s - loss: 55.8853 - mae: 5.2642
277/277 [==============================] - ETA: 0s - loss: 56.7869 - mae: 5.2437
277/277 [==============================] - 1s 3ms/step - loss: 56.7869 - mae: 5.2437 - val_loss: 87.0639 - val_mae: 5.9311
## Epoch 70/100
##
1/277 [..............................] - ETA: 0s - loss: 67.5942 - mae: 8.2216
19/277 [=>............................] - ETA: 0s - loss: 99.0065 - mae: 6.8037
38/277 [===>..........................] - ETA: 0s - loss: 69.6356 - mae: 5.8503
59/277 [=====>........................] - ETA: 0s - loss: 50.4829 - mae: 4.8750
77/277 [=======>......................] - ETA: 0s - loss: 61.0709 - mae: 5.2281
94/277 [=========>....................] - ETA: 0s - loss: 62.5336 - mae: 5.3876
113/277 [===========>..................] - ETA: 0s - loss: 63.9863 - mae: 5.4784
129/277 [============>.................] - ETA: 0s - loss: 62.3070 - mae: 5.4657
140/277 [==============>...............] - ETA: 0s - loss: 59.4244 - mae: 5.3122
152/277 [===============>..............] - ETA: 0s - loss: 59.6233 - mae: 5.3122
165/277 [================>.............] - ETA: 0s - loss: 58.6410 - mae: 5.3098
181/277 [==================>...........] - ETA: 0s - loss: 56.9167 - mae: 5.2627
196/277 [====================>.........] - ETA: 0s - loss: 55.1842 - mae: 5.2032
213/277 [======================>.......] - ETA: 0s - loss: 53.5415 - mae: 5.1614
236/277 [========================>.....] - ETA: 0s - loss: 51.5526 - mae: 5.0566
264/277 [===========================>..] - ETA: 0s - loss: 55.7036 - mae: 5.1900
277/277 [==============================] - 1s 4ms/step - loss: 56.0338 - mae: 5.2218 - val_loss: 87.2926 - val_mae: 5.9257
## Epoch 71/100
##
1/277 [..............................] - ETA: 0s - loss: 29.2003 - mae: 5.4037
17/277 [>.............................] - ETA: 0s - loss: 213.5534 - mae: 9.7684
33/277 [==>...........................] - ETA: 0s - loss: 137.3250 - mae: 7.9741
55/277 [====>.........................] - ETA: 0s - loss: 92.8613 - mae: 6.3913
73/277 [======>.......................] - ETA: 0s - loss: 80.5989 - mae: 6.0792
90/277 [========>.....................] - ETA: 0s - loss: 77.6902 - mae: 6.0774
110/277 [==========>...................] - ETA: 0s - loss: 70.6421 - mae: 5.7387
138/277 [=============>................] - ETA: 0s - loss: 62.9820 - mae: 5.4984
169/277 [=================>............] - ETA: 0s - loss: 57.8607 - mae: 5.2397
204/277 [=====================>........] - ETA: 0s - loss: 59.5522 - mae: 5.2668
240/277 [========================>.....] - ETA: 0s - loss: 58.4821 - mae: 5.2329
271/277 [============================>.] - ETA: 0s - loss: 57.0533 - mae: 5.2374
277/277 [==============================] - 1s 4ms/step - loss: 56.6575 - mae: 5.2256 - val_loss: 86.2501 - val_mae: 5.8989
## Epoch 72/100
##
1/277 [..............................] - ETA: 0s - loss: 11.4958 - mae: 3.3905
16/277 [>.............................] - ETA: 0s - loss: 112.6207 - mae: 7.4655
36/277 [==>...........................] - ETA: 0s - loss: 101.8548 - mae: 7.1242
49/277 [====>.........................] - ETA: 0s - loss: 106.4128 - mae: 6.5922
65/277 [======>.......................] - ETA: 0s - loss: 90.0511 - mae: 6.1491
77/277 [=======>......................] - ETA: 0s - loss: 83.1521 - mae: 5.9800
92/277 [========>.....................] - ETA: 0s - loss: 78.7729 - mae: 5.8485
104/277 [==========>...................] - ETA: 0s - loss: 73.6024 - mae: 5.7447
124/277 [============>.................] - ETA: 0s - loss: 72.2609 - mae: 5.7127
146/277 [==============>...............] - ETA: 0s - loss: 68.7622 - mae: 5.6421
163/277 [================>.............] - ETA: 0s - loss: 66.1087 - mae: 5.5233
188/277 [===================>..........] - ETA: 0s - loss: 64.5747 - mae: 5.4911
206/277 [=====================>........] - ETA: 0s - loss: 61.9449 - mae: 5.4383
220/277 [======================>.......] - ETA: 0s - loss: 60.7889 - mae: 5.4066
233/277 [========================>.....] - ETA: 0s - loss: 59.0661 - mae: 5.3380
246/277 [=========================>....] - ETA: 0s - loss: 57.3829 - mae: 5.2678
258/277 [==========================>...] - ETA: 0s - loss: 57.6428 - mae: 5.2685
272/277 [============================>.] - ETA: 0s - loss: 57.0718 - mae: 5.2677
277/277 [==============================] - 1s 4ms/step - loss: 56.3537 - mae: 5.2373 - val_loss: 86.7351 - val_mae: 5.9268
## Epoch 73/100
##
1/277 [..............................] - ETA: 0s - loss: 2.6686 - mae: 1.6336
20/277 [=>............................] - ETA: 0s - loss: 22.0287 - mae: 2.7322
34/277 [==>...........................] - ETA: 0s - loss: 32.8169 - mae: 3.7681
46/277 [===>..........................] - ETA: 0s - loss: 38.1917 - mae: 4.0683
58/277 [=====>........................] - ETA: 0s - loss: 34.7648 - mae: 4.0240
72/277 [======>.......................] - ETA: 0s - loss: 43.1875 - mae: 4.3458
88/277 [========>.....................] - ETA: 0s - loss: 47.3236 - mae: 4.6543
109/277 [==========>...................] - ETA: 0s - loss: 58.9019 - mae: 5.0108
135/277 [=============>................] - ETA: 0s - loss: 58.7972 - mae: 5.1351
170/277 [=================>............] - ETA: 0s - loss: 52.3750 - mae: 4.9491
205/277 [=====================>........] - ETA: 0s - loss: 51.5026 - mae: 4.9923
237/277 [========================>.....] - ETA: 0s - loss: 54.1076 - mae: 5.1008
273/277 [============================>.] - ETA: 0s - loss: 55.7810 - mae: 5.2118
277/277 [==============================] - 1s 3ms/step - loss: 56.3585 - mae: 5.2352 - val_loss: 87.1710 - val_mae: 5.9783
## Epoch 74/100
##
1/277 [..............................] - ETA: 0s - loss: 22.3514 - mae: 4.7277
23/277 [=>............................] - ETA: 0s - loss: 112.7986 - mae: 7.3221
36/277 [==>...........................] - ETA: 0s - loss: 91.4521 - mae: 6.6866
48/277 [====>.........................] - ETA: 0s - loss: 71.3514 - mae: 5.7338
60/277 [=====>........................] - ETA: 0s - loss: 69.8270 - mae: 5.8131
74/277 [=======>......................] - ETA: 0s - loss: 63.0433 - mae: 5.6066
94/277 [=========>....................] - ETA: 0s - loss: 60.6803 - mae: 5.5014
115/277 [===========>..................] - ETA: 0s - loss: 51.9927 - mae: 5.0321
133/277 [=============>................] - ETA: 0s - loss: 62.1836 - mae: 5.3204
148/277 [===============>..............] - ETA: 0s - loss: 59.5036 - mae: 5.2430
166/277 [================>.............] - ETA: 0s - loss: 62.3913 - mae: 5.3422
181/277 [==================>...........] - ETA: 0s - loss: 58.4282 - mae: 5.1549
194/277 [====================>.........] - ETA: 0s - loss: 57.9755 - mae: 5.1891
210/277 [=====================>........] - ETA: 0s - loss: 55.1741 - mae: 5.0612
227/277 [=======================>......] - ETA: 0s - loss: 55.6973 - mae: 5.1643
241/277 [=========================>....] - ETA: 0s - loss: 55.7645 - mae: 5.1634
259/277 [===========================>..] - ETA: 0s - loss: 55.4063 - mae: 5.1465
275/277 [============================>.] - ETA: 0s - loss: 56.2100 - mae: 5.1767
277/277 [==============================] - 1s 4ms/step - loss: 56.5460 - mae: 5.2106 - val_loss: 87.7814 - val_mae: 6.0089
## Epoch 75/100
##
1/277 [..............................] - ETA: 0s - loss: 13.2050 - mae: 3.6339
38/277 [===>..........................] - ETA: 0s - loss: 23.5574 - mae: 3.9109
65/277 [======>.......................] - ETA: 0s - loss: 26.9354 - mae: 4.0109
82/277 [=======>......................] - ETA: 0s - loss: 44.2991 - mae: 4.8842
97/277 [=========>....................] - ETA: 0s - loss: 42.5889 - mae: 4.8817
110/277 [==========>...................] - ETA: 0s - loss: 41.1333 - mae: 4.8277
124/277 [============>.................] - ETA: 0s - loss: 60.7729 - mae: 5.2155
136/277 [=============>................] - ETA: 0s - loss: 57.3848 - mae: 5.0885
155/277 [===============>..............] - ETA: 0s - loss: 52.5129 - mae: 4.8747
179/277 [==================>...........] - ETA: 0s - loss: 53.1016 - mae: 5.0084
208/277 [=====================>........] - ETA: 0s - loss: 54.9321 - mae: 5.1343
234/277 [========================>.....] - ETA: 0s - loss: 54.7208 - mae: 5.1341
250/277 [==========================>...] - ETA: 0s - loss: 56.8057 - mae: 5.2653
266/277 [===========================>..] - ETA: 0s - loss: 58.2102 - mae: 5.3316
277/277 [==============================] - 1s 4ms/step - loss: 56.1757 - mae: 5.2029 - val_loss: 87.1217 - val_mae: 5.9713
## Epoch 76/100
##
1/277 [..............................] - ETA: 0s - loss: 0.1492 - mae: 0.3863
42/277 [===>..........................] - ETA: 0s - loss: 31.2273 - mae: 4.6125
72/277 [======>.......................] - ETA: 0s - loss: 58.9307 - mae: 5.5409
89/277 [========>.....................] - ETA: 0s - loss: 77.9550 - mae: 6.2242
104/277 [==========>...................] - ETA: 0s - loss: 68.2444 - mae: 5.7185
119/277 [===========>..................] - ETA: 0s - loss: 63.4996 - mae: 5.5738
132/277 [=============>................] - ETA: 0s - loss: 60.1077 - mae: 5.4219
144/277 [==============>...............] - ETA: 0s - loss: 56.2409 - mae: 5.1966
156/277 [===============>..............] - ETA: 0s - loss: 54.8184 - mae: 5.1552
168/277 [=================>............] - ETA: 0s - loss: 56.6717 - mae: 5.2531
178/277 [==================>...........] - ETA: 0s - loss: 56.0208 - mae: 5.1869
197/277 [====================>.........] - ETA: 0s - loss: 57.1343 - mae: 5.2008
219/277 [======================>.......] - ETA: 0s - loss: 55.3982 - mae: 5.1303
235/277 [========================>.....] - ETA: 0s - loss: 55.3196 - mae: 5.1567
246/277 [=========================>....] - ETA: 0s - loss: 56.2233 - mae: 5.2139
256/277 [==========================>...] - ETA: 0s - loss: 55.5307 - mae: 5.2011
268/277 [============================>.] - ETA: 0s - loss: 56.9953 - mae: 5.2535
277/277 [==============================] - 1s 5ms/step - loss: 55.8565 - mae: 5.2058 - val_loss: 89.5742 - val_mae: 6.1115
## Epoch 77/100
##
1/277 [..............................] - ETA: 1s - loss: 10.3258 - mae: 3.2134
19/277 [=>............................] - ETA: 0s - loss: 36.3146 - mae: 4.7017
34/277 [==>...........................] - ETA: 0s - loss: 35.0449 - mae: 4.5980
49/277 [====>.........................] - ETA: 0s - loss: 27.0618 - mae: 3.7876
61/277 [=====>........................] - ETA: 0s - loss: 31.3720 - mae: 4.0549
74/277 [=======>......................] - ETA: 0s - loss: 52.6049 - mae: 4.7994
87/277 [========>.....................] - ETA: 0s - loss: 53.0764 - mae: 4.9373
100/277 [=========>....................] - ETA: 0s - loss: 52.2528 - mae: 4.9577
115/277 [===========>..................] - ETA: 0s - loss: 53.4707 - mae: 5.0749
129/277 [============>.................] - ETA: 0s - loss: 55.1689 - mae: 5.1719
143/277 [==============>...............] - ETA: 0s - loss: 62.9704 - mae: 5.5194
157/277 [================>.............] - ETA: 0s - loss: 64.4696 - mae: 5.6135
172/277 [=================>............] - ETA: 0s - loss: 61.3896 - mae: 5.5246
188/277 [===================>..........] - ETA: 0s - loss: 58.4241 - mae: 5.3689
201/277 [====================>.........] - ETA: 0s - loss: 58.4727 - mae: 5.3682
215/277 [======================>.......] - ETA: 0s - loss: 56.4478 - mae: 5.3055
229/277 [=======================>......] - ETA: 0s - loss: 54.6642 - mae: 5.2026
242/277 [=========================>....] - ETA: 0s - loss: 52.8139 - mae: 5.1266
253/277 [==========================>...] - ETA: 0s - loss: 53.3068 - mae: 5.1860
271/277 [============================>.] - ETA: 0s - loss: 55.1266 - mae: 5.1309
277/277 [==============================] - 1s 5ms/step - loss: 56.7368 - mae: 5.2297 - val_loss: 86.6123 - val_mae: 5.8984
## Epoch 78/100
##
1/277 [..............................] - ETA: 1s - loss: 0.0222 - mae: 0.1490
13/277 [>.............................] - ETA: 1s - loss: 21.5137 - mae: 3.6953
30/277 [==>...........................] - ETA: 0s - loss: 44.1686 - mae: 4.6427
47/277 [====>.........................] - ETA: 0s - loss: 46.1804 - mae: 4.8634
70/277 [======>.......................] - ETA: 0s - loss: 62.1383 - mae: 5.1632
93/277 [=========>....................] - ETA: 0s - loss: 68.9010 - mae: 5.5621
122/277 [============>.................] - ETA: 0s - loss: 58.9312 - mae: 5.2132
153/277 [===============>..............] - ETA: 0s - loss: 56.2576 - mae: 5.1449
187/277 [===================>..........] - ETA: 0s - loss: 55.0676 - mae: 5.0567
225/277 [=======================>......] - ETA: 0s - loss: 50.6174 - mae: 4.9162
257/277 [==========================>...] - ETA: 0s - loss: 54.4179 - mae: 5.0909
277/277 [==============================] - 1s 3ms/step - loss: 56.2142 - mae: 5.2268 - val_loss: 86.8251 - val_mae: 5.9389
## Epoch 79/100
##
1/277 [..............................] - ETA: 0s - loss: 123.8315 - mae: 11.1280
29/277 [==>...........................] - ETA: 0s - loss: 35.2272 - mae: 5.0895
56/277 [=====>........................] - ETA: 0s - loss: 37.9400 - mae: 4.8617
85/277 [========>.....................] - ETA: 0s - loss: 59.2085 - mae: 5.8684
119/277 [===========>..................] - ETA: 0s - loss: 49.9182 - mae: 5.2855
149/277 [===============>..............] - ETA: 0s - loss: 61.4967 - mae: 5.6288
179/277 [==================>...........] - ETA: 0s - loss: 65.2936 - mae: 5.5401
212/277 [=====================>........] - ETA: 0s - loss: 61.7315 - mae: 5.4153
242/277 [=========================>....] - ETA: 0s - loss: 58.9283 - mae: 5.3316
275/277 [============================>.] - ETA: 0s - loss: 56.7429 - mae: 5.2228
277/277 [==============================] - 1s 2ms/step - loss: 56.3550 - mae: 5.1977 - val_loss: 86.9132 - val_mae: 5.9070
## Epoch 80/100
##
1/277 [..............................] - ETA: 0s - loss: 11.7044 - mae: 3.4212
30/277 [==>...........................] - ETA: 0s - loss: 57.2939 - mae: 5.0661
48/277 [====>.........................] - ETA: 0s - loss: 55.1931 - mae: 5.1912
64/277 [=====>........................] - ETA: 0s - loss: 51.1874 - mae: 5.1681
81/277 [=======>......................] - ETA: 0s - loss: 46.5616 - mae: 4.9405
## 277/277 [==============================] - 1s 4ms/step - loss: 56.2299 - mae: 5.1879 - val_loss: 87.3156 - val_mae: 5.9458
## Epoch 81/100
## 277/277 [==============================] - 1s 5ms/step - loss: 56.1793 - mae: 5.2231 - val_loss: 86.9822 - val_mae: 5.9165
## Epoch 82/100
## 277/277 [==============================] - 1s 4ms/step - loss: 56.5667 - mae: 5.2103 - val_loss: 86.6120 - val_mae: 5.9036
## Epoch 83/100
## 277/277 [==============================] - 1s 3ms/step - loss: 56.3855 - mae: 5.2263 - val_loss: 86.1420 - val_mae: 5.8919
## Epoch 84/100
## 277/277 [==============================] - 1s 4ms/step - loss: 56.2037 - mae: 5.2093 - val_loss: 86.3886 - val_mae: 5.9203
## Epoch 85/100
## 277/277 [==============================] - 1s 5ms/step - loss: 56.3305 - mae: 5.2510 - val_loss: 86.6412 - val_mae: 5.8994
## Epoch 86/100
## 277/277 [==============================] - 1s 5ms/step - loss: 56.2357 - mae: 5.1894 - val_loss: 86.8056 - val_mae: 5.9417
## Epoch 87/100
## 277/277 [==============================] - 1s 5ms/step - loss: 56.0014 - mae: 5.2030 - val_loss: 86.1145 - val_mae: 5.8970
## Epoch 88/100
## 277/277 [==============================] - 1s 5ms/step - loss: 56.4053 - mae: 5.2099 - val_loss: 85.8990 - val_mae: 5.8743
## Epoch 89/100
## 277/277 [==============================] - 1s 3ms/step - loss: 56.4184 - mae: 5.2032 - val_loss: 86.2723 - val_mae: 5.9145
## Epoch 90/100
## 277/277 [==============================] - 1s 2ms/step - loss: 56.5349 - mae: 5.2488 - val_loss: 86.2865 - val_mae: 5.8811
## Epoch 91/100
## 277/277 [==============================] - 1s 3ms/step - loss: 56.5046 - mae: 5.1849 - val_loss: 86.0123 - val_mae: 5.8917
## Epoch 92/100
## 277/277 [==============================] - 1s 2ms/step - loss: 56.0248 - mae: 5.2143 - val_loss: 86.3465 - val_mae: 5.9065
## Epoch 93/100
## 277/277 [==============================] - 1s 2ms/step - loss: 56.3440 - mae: 5.1933 - val_loss: 85.9479 - val_mae: 5.8978
## Epoch 94/100
## 277/277 [==============================] - 1s 2ms/step - loss: 56.0687 - mae: 5.1769 - val_loss: 85.9863 - val_mae: 5.8881
## Epoch 95/100
## 277/277 [==============================] - 1s 2ms/step - loss: 55.8384 - mae: 5.1535 - val_loss: 85.4000 - val_mae: 5.8393
## Epoch 96/100
## 277/277 [==============================] - 1s 2ms/step - loss: 56.0756 - mae: 5.1798 - val_loss: 85.6500 - val_mae: 5.8877
## Epoch 97/100
## 277/277 [==============================] - 1s 2ms/step - loss: 55.5398 - mae: 5.2243 - val_loss: 87.3205 - val_mae: 5.9867
## Epoch 98/100
## 277/277 [==============================] - 1s 2ms/step - loss: 56.3822 - mae: 5.2044 - val_loss: 86.7429 - val_mae: 5.9352
## Epoch 99/100
## 277/277 [==============================] - 1s 3ms/step - loss: 55.9688 - mae: 5.2002 - val_loss: 86.2894 - val_mae: 5.8698
## Epoch 100/100
## 277/277 [==============================] - 1s 3ms/step - loss: 55.7764 - mae: 5.1819 - val_loss: 86.2569 - val_mae: 5.9141
Although the neural network achieved essentially the same result as the 3rd-degree polynomial, the polynomial regression is far preferable in this case. Occam's razor favours the simplest model that accounts for the data, and the regression is by far the simpler of the two (Domingos, 1999). Regression is also much faster to compute (although both take on the order of seconds for a dataset of this size). Plain linear regression has the further advantage that its weights are interpretable: one can read off the average change in house price per unit change in a feature, which makes the model easy to explain to a non-expert. That advantage is largely lost once polynomial features are introduced, since a non-expert cannot be expected to interpret them. There is therefore a trade-off between model accuracy and model simplicity, and the best course of action depends on the specific use case. If we were advising an estate agent looking for a house-price prediction tool, we would recommend linear regression: its MAE is low, and it uses simple, interpretable features.
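For reference, a pipeline along the following lines reproduces this kind of degree-3 polynomial regression; it is a minimal sketch assuming the df loaded earlier, and the chosen split and random seed are illustrative rather than the exact code used in this analysis.
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_absolute_error
# Illustrative split; the proportions here are an assumption.
X = df.drop(columns=['Y house price of unit area'])
y = df['Y house price of unit area']
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.33, random_state=0)
# Degree-3 polynomial features feeding an ordinary least-squares fit.
poly_model = make_pipeline(PolynomialFeatures(degree=3), LinearRegression())
poly_model.fit(X_train, y_train)
print(mean_absolute_error(y_test, poly_model.predict(X_test)))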
Neural networks might have been the better alternative had more data been available. A network starts from random weights and requires substantial iteration over many examples to optimise them, and our 414 observations fall far short of the large datasets on which deep learning typically excels. Neural networks also carry a significant computational cost, which is unattractive in a business scenario. A potentially better approach to explore would be regression splines, which split the data into segments, fit a separate low-degree polynomial within each segment, and join the pieces smoothly at knot points, so the fit can adjust locally wherever the curvature changes; a sketch follows below.
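As an illustration of that idea, the following sketch fits a cubic spline basis on the distance-to-MRT feature, assuming scikit-learn >= 1.0 (which provides SplineTransformer); the knot count and choice of a single feature are assumptions made for the example, not part of the original analysis.
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import SplineTransformer
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_absolute_error
# Cubic spline basis: piecewise cubics joined smoothly at the knots,
# then combined by an ordinary least-squares fit.
X_mrt = df[['X3 distance to the nearest MRT station']]
y = df['Y house price of unit area']
spline_model = make_pipeline(SplineTransformer(degree=3, n_knots=6), LinearRegression())
spline_model.fit(X_mrt, y)
print(mean_absolute_error(y, spline_model.predict(X_mrt)))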
The problem we set out to address was identifying the most important drivers of house prices in Sindian District, Taiwan, and using those factors to predict the house price per unit area, a tool we hope investors can use. The analysis also reveals which part of the district is in highest demand among local buyers. To that end, we examined how house age, ability to commute, and location relate to house price. We found a central area of the city that concentrates most of the purchases, the highest prices, the majority of the MRT stations, and the oldest houses. This suggests it is principally a residential area consisting mostly of families, and consequently the most attractive region within the district in which to invest.
Subsequently, we created a model that predicts the house price per unit area by multiplying each feature by a learned weight and adding an intercept:
Predicted price per unit area = w1 × (distance to MRT) + w2 × (transaction date) + w3 × (house age) + w4.
Investors can plug the numbers w1–w4 into this simple equation to obtain a prediction of the price per unit area. The equation has an average error of 7.4 (in units of price per unit area); multiplying this error by the number of unit areas in a property gives the expected error range for that property, which investors can weigh when deciding whether an investment is worth the risk.
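To make the arithmetic concrete, the snippet below wraps the equation in a function; the weight values shown are arbitrary placeholders for illustration, not the fitted values from this analysis.
# Placeholder weights -- substitute the fitted values for w1..w4.
w1, w2, w3, w4 = -0.01, 5.0, -0.3, 50.0
def predict_price_per_unit_area(distance_to_mrt, transaction_date, house_age):
    # The linear equation from the text: one term per feature plus an intercept.
    return distance_to_mrt * w1 + transaction_date * w2 + house_age * w3 + w4
# The prediction carries the model's average error of 7.4 per unit area.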