In this post, we will look at the classification of a two-dimensional space using a linear perceptron. We will
use the familiar “Two Moons” classification problem. Along the way, we will observe how the classification changes as we
apply various transformations to the two moons.
The Architecture of the Neural Net
The net is a single Linear Layer followed by a Logistic Sigmoid.
I have included the Logistic Sigmoid to smooth out the output.
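As a minimal sketch (the name sketchNet and the sample point are illustrative, not taken from the code below), the net squashes a linear function of the input through the logistic sigmoid, so its output always lies between 0 and 1:
(* Minimal sketch of the architecture: LinearLayer[1] computes w.{x, y} + b and
   LogisticSigmoid squashes that value into (0, 1). NetInitialize assigns random
   weights so the net can be evaluated before any training. *)
sketchNet = NetInitialize@NetChain[{LinearLayer[1], LogisticSigmoid}, "Input" -> 2];
sketchNet[{0.3, -0.5}] (* a length-1 vector with a value between 0 and 1 *)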
Trained Nets
The following shows the training set and the trained net's perception of
the two-dimensional space. The configuration name is also listed at the top of each image.
Code
I have used Wolfram Mathematica 12.0 for this experiment.
Feel free to change the configurations; a sketch of how to add one of your own follows the listing.
(* Net definition: a single LinearLayer followed by a LogisticSigmoid; NetGraph displays the architecture graph *)
NetGraph@NetChain[{LinearLayer[1], LogisticSigmoid}]
Module[
{configurations = <||>, topKeys},
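(* Each configuration maps a name to a pair of pure-function predicates
   ("Upper" and "Lower") that carve the two moons out of the square [-1, 1]^2 *)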
configurations["NoGapNoOffset"] = <|
"Upper" -> (0.6 <= Norm[#] <= 0.8 && #[[2]] > 0 &),
"Lower" -> (0.6 <= Norm[#] <= 0.8 && #[[2]] < 0 &)|>;
configurations["GapOffset"] = <|
"Upper" -> (0.6 <= Norm[# + {0.1, 0}] <= 0.8 && #[[2]] > 0 &),
"Lower" -> (0.6 <= Norm[# - {0.1, 0}] <= 0.8 && #[[2]] < 0 &)|>;
configurations["HighGapNoOffset"] = <|
"Upper" -> (0.6 <= Norm[# + {0, -0.1}] <= 0.8 && #[[2]] > 0.1 &),
"Lower" -> (0.6 <= Norm[# - {0, -0.1}] <= 0.8 && #[[2]] < -0.1 &)|>;
configurations["HighGapOffset"] = <|
"Upper" -> (0.6 <= Norm[# + {0.1, -0.1}] <= 0.8 && #[[2]] > 0.1 &),
"Lower" -> (0.6 <= Norm[# - {0.1, -0.1}] <= 0.8 && #[[2]] < -0.1 &)|>;
configurations["NegativeGapNoOffset"] = <|
"Upper" -> (0.6 <= Norm[# + {0, 0.1}] <= 0.8 && #[[2]] > -0.1 &),
"Lower" -> (0.6 <= Norm[# - {0, 0.1}] <= 0.8 && #[[2]] < 0.1 &)|>;
configurations["NegativeGapOffset"] = <|
"Upper" -> (0.6 <= Norm[# + {-0.1, 0.1}] <= 0.8 && #[[2]] > -0.1 &),
"Lower" -> (0.6 <= Norm[# - {-0.1, 0.1}] <= 0.8 && #[[2]] < 0.1 &)|>;
configurations["NoGapRotation"] = <|
"Upper" -> (0.6 <= Norm[#] <= 0.8 && ((0.5 #[[1]] + #[[2]]) > 0) &),
"Lower" -> (0.6 <= Norm[#] <= 0.8 && ((0.5 #[[1]] + #[[2]]) < 0) &)|>;
configurations["GapRotation"] = <|
"Upper" -> (0.6 <= Norm[#] <= 0.8 && ((0.5 #[[1]] + #[[2]]) > 0) &),
"Lower" -> (0.6 <= Norm[#] <= 0.8 && ((0.5 #[[1]] + #[[2]]) < 0) &)|>;
configurations["GapRotation"] = <|
"Upper" -> (0.6 <= Norm[#] <= 0.8 && ((0.5 #[[1]] + #[[2]]) > 0.1) &),
"Lower" -> (0.6 <= Norm[#] <= 0.8 && ((0.5 #[[1]] + #[[2]]) < -0.1) &)|>;
topKeys = Keys[configurations];
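(* For each configuration: sample points, build the two moons, train a net,
   and export the training set next to the trained net's view of the space *)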
Riffle[
MapThread[
Module[{allData, upperMoon, lowerMoon, trainedNet,
preTrainedGraphics, trainedGraphics, combinedGraphics},
(* Sample candidate points uniformly from the square [-1, 1]^2 *)
allData = RandomReal[{-1, 1}, {15000, 2}];
(* #1 and #2 are the "Upper" and "Lower" predicates supplied by MapThread below *)
upperMoon = Select[allData, #1];
lowerMoon = Select[allData, #2];
preTrainedGraphics =
 Graphics[{{PointSize@0.005, Opacity@0.5, Darker@Green, Point[upperMoon]},
   {PointSize@0.005, Opacity@0.5, Red, Point[lowerMoon]}},
  AspectRatio -> 1, PlotRange -> {{-1, 1}, {-1, 1}}, ImageSize -> 300,
  PlotLabel -> "Training Set"];
trainedNet =
NetTrain[NetChain[{LinearLayer[1], LogisticSigmoid}],
Join @@ {(# -> 1) & /@ upperMoon, (# -> 0) & /@ lowerMoon}];
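(* Evaluate the trained net on a regular grid and color each grid point by
   blending from red (output 0) to green (output 1) *)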
trainedGraphics =
With[{blends = Join @@
Table[{Blend[{Red, Green}, trainedNet[{x, y}, None][[1]]], PointSize@0.005,
Point[{x, y}]},
{x, -1, 1, 0.025},
{y, -1, 1, 0.025}
]
},
Graphics[
{
{PointSize@0.005, Opacity@0.5, Darker@Green, Point[upperMoon]},
{PointSize@0.005, Opacity@0.5, Red, Point[lowerMoon]}, blends},
ImageSize -> 300,
AspectRatio -> 1,
PlotRange -> {{-1, 1}, {-1, 1}},
PlotLabel -> "TrainedNet's ViewOfSpace"]
];
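(* Label the pair of plots with the configuration name (#3) and export it as a PNG *)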
combinedGraphics = Labeled[Framed[GraphicsRow[{preTrainedGraphics, trainedGraphics}]], #3, Top];
Export[#3 <> ".png", combinedGraphics, ImageSize -> 700, ImageResolution -> 1000]
] &
, {configurations[#]["Upper"] & /@ topKeys, configurations[#]["Lower"] & /@ topKeys, topKeys}
], "\n\n"]
] // Column
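To add your own configuration, add another pair of pure-function predicates inside the Module, next to the existing ones. As a hypothetical sketch (the name wideGapOffset and the 0.2 offset are my own, not part of the code above), a configuration is simply an association of an "Upper" and a "Lower" predicate over points of the square:
(* Hypothetical extra configuration: the same annulus of radii 0.6-0.8, with the
   two half-moons offset horizontally by 0.2 instead of 0.1 *)
wideGapOffset = <|
  "Upper" -> (0.6 <= Norm[# + {0.2, 0}] <= 0.8 && #[[2]] > 0 &),
  "Lower" -> (0.6 <= Norm[# - {0.2, 0}] <= 0.8 && #[[2]] < 0 &)|>;
wideGapOffset["Upper"][{0., 0.7}] (* True: this point lies on the upper moon *)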