Seismologist Susan Hough has suggested that a magnitude 10 quake may represent a very approximate upper limit on what the Earth's tectonic zones are capable of, the result of the largest known continuous belt of faults (along the Pacific coast of the Americas) rupturing together.[17] A study at Tohoku University in Japan found that a magnitude 10 earthquake was theoretically possible if a combined 3,000 kilometres (1,900 mi) of faults from the Japan Trench to the Kuril–Kamchatka Trench ruptured together and moved by 60 metres (200 ft) (or if a similarly large-scale rupture occurred elsewhere). Such an earthquake would cause ground motion lasting up to an hour, with tsunamis hitting shores while the ground was still shaking, and would probably be a 1-in-10,000-year event.[18]
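For a rough sanity check on those numbers, you can back out the implied magnitude with the standard moment-magnitude relation, Mw = (log10(M0) − 9.1)/1.5, where the seismic moment M0 is rigidity × rupture area × slip. A minimal sketch; the fault width and rigidity below are placeholder assumptions of mine, not values from the study:

```python
import math

# Moment-magnitude sanity check on the Tohoku University scenario.
rigidity = 3.0e10   # Pa, a typical crustal value (assumed)
length = 3_000e3    # m, Japan Trench through Kuril-Kamchatka Trench
width = 200e3       # m, assumed down-dip width of the rupture
slip = 60.0         # m, from the scenario above

M0 = rigidity * (length * width) * slip   # seismic moment, N*m
Mw = (math.log10(M0) - 9.1) / 1.5
print(f"Mw ~ {Mw:.1f}")                   # ~ 10.0 with these assumptions
```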
So a 10 would be something like California finally going walkabout. Physically possible, but the sheer amount of movement a chunk of crust would need to undergo is so unlikely that a scale measuring that or above isn't needed.
I've even heard that nine-point-something is pretty much the practical limit, because rock just can't store enough strain energy to get beyond ten; it ruptures and produces an earthquake before it hits that mark.
However, if you get energy into the system from outside, it's very possible to cross that line. The dinosaur asteroid supposedly resulted in a quake of up to 11 on the Richter scale.
So… Is it likely? No. But the scale doesn’t end at 10.
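And that's the point: magnitude is just a logarithm of energy, so nothing in the math stops at 10. Each whole step is about 31.6× the radiated energy. A quick sketch using the standard Gutenberg–Richter energy relation, log10(E) = 1.5·M + 4.8 (E in joules):

```python
# Radiated seismic energy per magnitude, via the Gutenberg-Richter relation.
def radiated_energy_joules(M: float) -> float:
    return 10 ** (1.5 * M + 4.8)

for M in (9, 10, 11):
    print(M, f"{radiated_energy_joules(M):.1e} J")

# Each magnitude step is a fixed factor of 10**1.5 in energy.
print(radiated_energy_joules(11) / radiated_energy_joules(10))  # ~ 31.6
```

The scale has no built-in ceiling; the ceiling is how much strain the crust can actually store.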
Treasure your badly-scanned papers from 1980, and be thankful you didn’t have to do historical research by sorting through bad scans from the 1980s of printouts of microfilm archives (yes, instead of scanning the microfilm) of photos of the original documents that were photographed in 1961 at a 45° angle by a lazy archivist who used the cheapest film he could get his hands on. And the scans have blotches that make some pages literally unreadable because the microfilms were allowed to sit exposed to moisture for 25 years before being digitized. No I’m not bitter and my collegiate education wasn’t a waste, not one bit of either.
Dude on the right is correct that perturbed gradient descent with threshold functions and backprop feedback was implemented before most of us were born.
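For anyone curious what that decades-old recipe looks like in practice, here's a minimal sketch of a tiny network trained by backprop. I'm using a sigmoid as a smooth stand-in for a hard threshold so the gradient exists, and the layer sizes, learning rate, and XOR toy data are my own illustrative choices, not anything from the meme:

```python
import numpy as np

rng = np.random.default_rng(42)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# XOR: the classic toy problem a single threshold unit can't solve.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# One hidden layer; sizes and learning rate are arbitrary choices.
w1 = rng.normal(size=(2, 4)); b1 = np.zeros(4)
w2 = rng.normal(size=(4, 1)); b2 = np.zeros(1)
lr = 1.0

for _ in range(10_000):
    # Forward pass.
    h = sigmoid(X @ w1 + b1)
    out = sigmoid(h @ w2 + b2)
    # Backward pass: chain rule on squared error, layer by layer.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ w2.T) * h * (1 - h)
    # Plain gradient descent on every parameter.
    w2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(axis=0)
    w1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(axis=0)

print(out.round(2).ravel())  # should approach [0, 1, 1, 0]
```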
The current boom is an embarrassingly parallel task meeting an architecture designed to run that kind of task.
I think the usage implies it’s so easy to parallelize that any competent programmer should be embarrassed if they weren’t running it in parallel. Whereas many classes of problems can be extremely complex or impossible to parallelize, and running them sequentially would be perfectly acceptable.
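In code terms, "embarrassingly parallel" just means every task is fully independent, so fanning the work out is the entire parallelization effort. A minimal sketch; the worker function and inputs here are made-up stand-ins for any independent computation:

```python
import math
from multiprocessing import Pool

# Hypothetical embarrassingly parallel job: each input is processed
# completely independently, with no shared state or communication.
def expensive_and_independent(n: int) -> float:
    return sum(math.sqrt(i) for i in range(n))

if __name__ == "__main__":
    jobs = [1_000_000 + i for i in range(16)]

    # Sequential version: perfectly correct, just slower.
    serial = [expensive_and_independent(n) for n in jobs]

    # Parallel version: the same map, fanned out across processes.
    # Because no task depends on another, this one call is all it takes.
    with Pool() as pool:
        parallel = pool.map(expensive_and_independent, jobs)

    assert serial == parallel
```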
Plus, organizations outside the FAANGs have hit critical mass on data that's actually useful for large-scale multiple-correlation analyses, and data-as-a-service platforms are making it all seem sexier to management in those organizations.