somenameforme 3 days ago
That trick can be explained fairly easily. All you have to do is repeatedly add a digit no larger than 9 to each of four small running tallies: one each for the thousands, hundreds, tens, and ones places. At the end you recombine those tallies, adjusting for overflow.

A simple example: 1234 + 5678 + 9012 gives 15 thousands, 8 hundreds, 11 tens, 14 ones. Now adjust for overflow from the smallest place to the largest: 4 ones (carry 1 to the tens), 2 tens (carry 1 to the hundreds), 9 hundreds, 15 thousands. Final result: 15,924.

Notably, the final 'adjustment' phase does not need to happen within the 300ms window, so all he's demonstrating is the ability to repeatedly add digits 0-9 to four small numbers every 300ms. That's certainly an achievement, and one that would require a lot of training, but nothing beyond that. You can also see this in the video from his timing: the numbers start at 22 seconds and he finishes at 59 seconds, so roughly 30 seconds on the numbers (100 numbers at 0.3 seconds each) and the remaining ~7 seconds to enter his answer.
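To make that concrete, here is a minimal Python sketch of the running-tally method described above (the function name and structure are my own illustration of the arithmetic, not a claim about how the performer actually does it mentally):

    def tally_sum(numbers):
        # One small running tally per decimal place; each incoming number
        # only adds a digit 0-9 to each tally.
        thousands = hundreds = tens = ones = 0
        for n in numbers:
            thousands += n // 1000
            hundreds += (n // 100) % 10
            tens += (n // 10) % 10
            ones += n % 10

        # Final adjustment phase: propagate carries from the ones up to the thousands.
        tens += ones // 10
        ones %= 10
        hundreds += tens // 10
        tens %= 10
        thousands += hundreds // 10
        hundreds %= 10
        return thousands * 1000 + hundreds * 100 + tens * 10 + ones

    print(tally_sum([1234, 5678, 9012]))  # 15924

The point above is visible in the code: the per-step work inside the loop is just four single-digit additions, and all of the carry handling is deferred to the cheap adjustment at the end.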
codehotter 3 days ago | parent
I appreciate that you're demystifying this, but you're downplaying the difficulty of keeping four tallies with perfect accuracy for 100 steps while processing a new input every 300ms. That places demands on working memory and parallel processing that probably exceed the capabilities of our linguistic systems. The claim is not that the algorithm is complicated, but that abacus training helps with the execution by recruiting visuospatial brain areas instead. Your argument is like saying all training methods for track athletes are equally effective because running is simply putting one foot in front of the other quickly.