• 0 Posts
  • 5 Comments
Joined 2 years ago
Cake day: July 15th, 2023

  • Yeah, this one took me a while to wrap my head around and intuitively “get it”. I first learned it was true from that MythBusters episode where they correct their past mistakes… and even they had thought that two cars hitting head-on would each receive the same energy as hitting a stationary wall at the sum of their speeds. Viewers corrected them in letters, and then they experimentally verified it.

    And even seeing the experimental verification, it still took me a while to really get it. The opposite speeds cancel out, making you go from your speed to zero. Same as if you hit a brick wall at that speed.

    Let’s say the two cars are each going 50 mph (or kph, whatever unit you want). 50 − 50 = 0: you go from 50 to zero, the same as hitting a brick wall at 50. It’s the difference between your initial and final speed that matters, not the sum of the two cars’ speeds.
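    The comparison above can be sketched as a quick kinetic-energy check. This is a minimal back-of-the-envelope sketch assuming two identical cars; the mass and speed values are illustrative, not from the comment.

```python
# Sketch: energy absorbed per car in a head-on crash vs. hitting a rigid wall.
# Assumes two identical cars; values are illustrative.

def kinetic_energy(mass_kg, speed_ms):
    """Classical kinetic energy: E = 1/2 * m * v^2."""
    return 0.5 * mass_kg * speed_ms ** 2

mass = 1500.0   # kg (assumed)
speed = 22.0    # m/s, roughly 50 mph

# Head-on, identical cars: by symmetry each car stops at the midpoint,
# so each car's change in speed is just its own speed.
energy_per_car_head_on = kinetic_energy(mass, speed)

# Hitting a rigid wall at the same speed: same change in speed, same energy.
energy_wall = kinetic_energy(mass, speed)

print(energy_per_car_head_on == energy_wall)  # True

# The head-on crash has twice the total energy, but it is split between
# two cars, so each car absorbs the same amount as in the wall case.
total_head_on = 2 * energy_per_car_head_on
```

    The doubled total energy is where the intuition goes wrong: it exists, but it is shared between two crumpling cars instead of dumped into one.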



  • I feel like “passing it through a statistical model”, while absolutely true on a technical implementation level, doesn’t get to the heart of what it is doing in a way that people understand. It’s using the math terms, potentially deliberately, to obfuscate and make it seem simpler than it is. It’s like reducing it to “it just predicts the next word”. Technically true, but I could implement a black-box next-word predictor by sticking a real person in the black box and asking them to predict the next word, and it’d still meet that description.

    The statistical model seems to be building some sort of conceptual grid of word relationships that approximates something very much like actually understanding what the words mean, and how the words are used semantically, with some random noise thrown into the mix in just the right amounts to generate some surprises that look very much like creativity.
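    For contrast with what LLMs actually learn, here is what the bare “it just predicts the next word” reduction looks like as code: a toy bigram counter. This is a hedged sketch with a made-up corpus, nothing like a real LLM, which is exactly the point.

```python
# Toy bigram next-word predictor: the most literal reading of
# "it just predicts the next word". Corpus is illustrative.
from collections import Counter, defaultdict

corpus = "the horse runs and the horse jumps and the rider falls".split()

# Count which word follows which.
followers = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    followers[prev][nxt] += 1

def predict_next(word):
    """Return the most frequent follower of `word` in the corpus."""
    counts = followers.get(word)
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("the"))  # "horse" (seen twice, vs "rider" once)
print(predict_next("and"))  # "the"
```

    Both this table and an LLM satisfy the description “predicts the next word”, which shows how little that description constrains what is going on inside.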

    Decades before LLMs were a thing, the Zompist wrote a nice essay on the Chinese room thought experiment that I think provides some useful conceptual models: http://zompist.com/searle.html

    Searle’s own proposed rule (“Take a squiggle-squiggle sign from basket number one…”) depends for its effectiveness on xenophobia. Apparently computers are as baffled at Chinese characters as most Westerners are; the implication is that all they can do is shuffle them around as wholes, or put them in boxes, or replace one with another, or at best chop them up into smaller squiggles. But pointers change everything. Shouldn’t Searle’s confidence be shaken if he encountered this rule?

    If you see 马, write down horse.

    If the man in the CR encountered enough such rules, could it really be maintained that he didn’t understand any Chinese?

    Now, this particular rule still is, in a sense, “symbol manipulation”; it’s exchanging a Chinese symbol for an English one. But it suggests the power of pointers, which allow the computer to switch levels. It can move from analyzing Chinese brushstrokes to analyzing English words… or to anything else the programmer specifies: a manual on horse training, perhaps.

    Searle is arguing from a false picture of what computers do. Computers aren’t restricted to turning 马 into “horse”; they can also relate “horse” to pictures of horses, or a database of facts about horses, or code to allow a robot to ride a horse. We may or may not be willing to describe this as semantics, but it sure as hell isn’t “syntax”.
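    The “pointers change everything” point can be sketched in a few lines: instead of shuffling an opaque squiggle around, a program can route a symbol to many linked representations and switch levels between them. The dictionary entries and file path below are hypothetical, purely for illustration.

```python
# Sketch: a symbol as a pointer into linked representations, not an
# opaque squiggle. All names and data here are illustrative.

knowledge = {
    "马": {
        "english": "horse",
        "facts": ["is a herbivore", "can be ridden", "runs fast"],
        "image": "images/horse.png",  # hypothetical path
    }
}

def respond(symbol):
    entry = knowledge.get(symbol)
    if entry is None:
        return "unknown symbol"
    # Switch levels: from the Chinese character to an English word,
    # then on to facts about the thing the word refers to.
    word = entry["english"]
    return f"{symbol} means '{word}'; a {word} {entry['facts'][0]} ..."

print(respond("马"))  # 马 means 'horse'; a horse is a herbivore ...
```

    Whether or not one calls this semantics, it is already more than shuffling whole symbols between baskets.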


  • Mine was just all repeated digits of whatever hour. 1:11, 2:22, 3:33, 4:44, 5:55, 11:11 all “counted” in my mind when I was entering university, and it happened so freaking often it was really weirding me out. It seemed like any time I glanced at a clock without other intention, it would be one of those times. There were probably times I looked at a clock normally, but of course confirmation bias reinforces things. Still, it really did seem far more often than you’d expect. My bet is that my inner clock was prompting me to look at those times because I got an adrenaline or dopamine spike or something, so my subconscious got trained into finding them.
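    As a baseline for “more often than you’d expect”: a quick sketch of how often a purely random glance at a 12-hour clock should land on one of those six times, each of which lasts one minute. This assumes glances uniformly distributed over the clock cycle, which is exactly what the trained-subconscious explanation would violate.

```python
# Probability that a uniformly random glance at a 12-hour clock lands
# on one of the six repeated-digit times, each lasting one minute.

special = {"1:11", "2:22", "3:33", "4:44", "5:55", "11:11"}
minutes_per_cycle = 12 * 60  # 720 minutes on a 12-hour clock

p = len(special) / minutes_per_cycle
print(f"{p:.4f}")  # 0.0083 -> under 1% of truly random glances
```

    So if it happens on anything like a regular basis, either confirmation bias or a cue to look at just the right moment is doing real work.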