[Local Link Removed for Guests] wrote: [Local Link Removed for Guests]Mon Apr 15, 2024 12:36 pm
I wonder if changing to two-dimensional weights will make feedforward and backprop a bit faster.
If the index into the weights is calculated in the top-level interpreted BASIC code, it might take a little more time than
if it were calculated in the compiled Annex code.
I guess that I'll have to try it and see.
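To illustrate the trade-off (in Python rather than Annex BASIC, and with made-up layer sizes), the flat layout makes the interpreted code compute the offset itself, while the 2-D layout pushes that arithmetic down into the compiled runtime:

```python
# Hypothetical layer sizes, not from the actual network.
n_in, n_out = 3, 5

flat = [0.0] * (n_in * n_out)               # 1-D weight array
grid = [[0.0] * n_out for _ in range(n_in)]  # 2-D weight array

# Fill both layouts with the same values.
for i in range(n_in):
    for j in range(n_out):
        flat[i * n_out + j] = i * 10 + j
        grid[i][j] = i * 10 + j

def get_flat(i, j):
    # The interpreter evaluates the index expression on every access.
    return flat[i * n_out + j]

def get_2d(i, j):
    # The runtime resolves the two subscripts in compiled code.
    return grid[i][j]
```

Whether the 2-D version wins depends on how the interpreter implements multi-dimensional subscripting, which is exactly what the timing test would show.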
For working out the indexes I've moved away from nested For/Next loops that calculate it.
Instead, I'm now using nested Do/Loop and increment an index counter in the inner loop and reset it in the outer loop.
Doing that avoids several multiplications during each loop.
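The counter-bookkeeping idea, translated into Python for-loops for illustration (the loop keyword is beside the point; the sizes are hypothetical), replaces the per-access multiply with a running index:

```python
n_neurons, n_in = 3, 5                 # hypothetical layer sizes
weights = list(range(n_neurons * n_in))

# Version 1: compute the index with a multiply on every access.
multiplied = []
for i in range(n_neurons):
    for j in range(n_in):
        multiplied.append(weights[i * n_in + j])

# Version 2: keep a running counter, incremented in the inner loop.
incremented = []
w = 0
for i in range(n_neurons):
    for j in range(n_in):
        incremented.append(weights[w])
        w += 1
```

Both traversals visit the weights in the same order; the second just trades an index expression for an increment, which is cheaper in interpreted code.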
Interestingly, I had a 2x2x7 NN which was working really well with just 3 patterns to learn.
When I increased the pattern count, it maxed out with a total error of ~0.25.
Thinking the issue was neuron count, I increased it to 3x5x7 and then got an array subscript error.
Turns out that I'd forgotten to rename the arrays after a cut-n-paste and was updating the input weights with the hidden weight deltas.
The weird thing was how well it performed.
After correcting that fault, things just got worse. Even with a higher neuron count and an overnight run, it just keeps trending towards an average of the outputs.
I had this problem earlier when I first started which turned out to be an error in the derivative calculation but I can't find the cause this time.
For a single pattern, it gets all outputs spot on to better than 6 decimal places, but with more than 3 patterns I only get an average.
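Since a derivative bug caused the same symptom before, a numerical gradient check is one way to corner it: compare the hand-coded analytic gradient against a central-difference estimate. A minimal sketch for a single sigmoid weight (not the actual network code, just the checking technique):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def loss(w, x, t):
    # Squared error of a one-weight sigmoid "network".
    y = sigmoid(w * x)
    return 0.5 * (y - t) ** 2

def analytic_grad(w, x, t):
    # dL/dw = (y - t) * y * (1 - y) * x -- the sigmoid derivative term
    # is where such bugs typically hide.
    y = sigmoid(w * x)
    return (y - t) * y * (1.0 - y) * x

def numeric_grad(w, x, t, eps=1e-6):
    # Central difference: no derivative formula needed, so it catches
    # mistakes in the analytic version.
    return (loss(w + eps, x, t) - loss(w - eps, x, t)) / (2.0 * eps)
```

If the two disagree for some weight, that weight's backprop path has the bug; if they agree everywhere yet training still collapses to the output average, the problem is more likely the learning rate, the targets, or the network capacity.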
I may well try it out in VB6 so that I can send all weights and neuron outputs to screen to see what's going on.
Doing that with WLOG would be a bit too painful.