
Going to pieces

Discussion in 'Forum Games' started by Tiaric, Aug 9, 2017.

  1. Tiaric

    Tiaric Active Member Claimed IQ: 130+

    Couldn't think where else to post this.

    So, the question: If you hear the sounds of something falling apart, are you listening to entropy? If you watch something fall apart, are you seeing entropy?

  2. Nico

    Nico Active Member IQ: 140+

    Entropy is just a concept. Therefore you cannot see, hear, or feel entropy.
  3. marom

    marom Well-Known Member Claimed IQ: 140+

    If you knock a cup off a table and shatter it, the cup will not come back together. Even if you repair the cup, it will not be exactly the same; the cracks will still be visible. Entropy is an increase in disorder, and signs of that are constant in the Universe; therefore, entropy can be observed. Astronomers observe increasing entropy constantly.
    AguirresPriest likes this.
  4. Nico

    Nico Active Member IQ: 140+

    Yes, you can observe increasing entropy. Still, it is just a concept. I can also observe that particles behave as the given wave function tells them, but I can never directly observe the wave function itself, because the wave function is just a concept. You can also use the concept of entropy in information theory, which makes it clear that it is a concept. If something falls apart, I hear sound waves, which carry part of the energy that is "lost" during the process of falling apart. Because I know that entropy can be interpreted as a measure of how much disorder there is, I can say that the entropy has increased, but I did not perceive the entropy directly.
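    A quick sketch of what "entropy in information theory" means, in Python (the function name is my own, not anything from this thread): Shannon entropy measures the uncertainty of a probability distribution, in bits.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy, in bits, of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A uniform distribution over 4 outcomes is maximally uncertain: 2 bits.
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # → 2.0

# A sharply peaked distribution is predictable, so its entropy is low.
print(shannon_entropy([0.97, 0.01, 0.01, 0.01]))  # ≈ 0.24
```

    This is the sense in which entropy here quantifies uncertainty rather than being something you perceive directly.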
    AguirresPriest likes this.
  5. Tiaric

    Tiaric Active Member Claimed IQ: 130+

    Yes, I've seen entropy described that way. It also means that the concept of entropy depends on how you define boundaries: there must be a finite set of probabilities, i.e. a probability distribution. An infinite distribution is nonsensical, so the distribution must be finite, and so is any system that has a probability distribution (of energies) defined on it.
    An irreversible process has occurred, but isn't that because you defined a "whole" state for the cup, and you then attempted to correct some errors in that state, errors that occurred because "some information was irreversibly erased"?
    I think entropy is an irreversible erasure of information--what the information is, is defined by you, so the erasure and the entropy must be too.
    AguirresPriest and Nico like this.
  6. Tiaric

    Tiaric Active Member Claimed IQ: 130+

    Since it's the games subforum:

    Screenshot from 2017-08-10 01-12-47.png

    Which image has more entropy, and entropy of what?
  7. marom

    marom Well-Known Member Claimed IQ: 140+

    That depends on which image is closer to the cube's original state. It's so easy to pick the one on the left, but the cube could have started out like the one on the right. It's only our aesthetic sense that declares the one on the left to be more ordered.
    Last edited: Aug 9, 2017
    AguirresPriest likes this.
  8. Tiaric

    Tiaric Active Member Claimed IQ: 130+

    Before I look at my question, and whether it's even meaningful, I'd like to explore the notion that entropy can be characterized as energy which can't be used by a system for work.

    This is sometimes explained by the fact that most engines have an operating temperature range, and that warming up is a loss of energy which then can't do work.

    But thermodynamic entropy doesn't have units of joules; it has units of joules per kelvin. Hence thermodynamic entropy has units of energy divided by units of energy per particle (the numerator is the total system energy, the denominator is the mean energy per particle). So entropy is essentially dimensionless. Entropy can also be defined as S = k ln(W), where W is the number of microstates, so the physical units depend on Boltzmann's constant, k.
    Again, k has units of joules per kelvin. Notice there is no number of particles, no molar measurement.
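    As a sketch of the units argument (the formula and constant are standard; the function name is my own): dividing Boltzmann's S = k ln W by k leaves a pure number, which is the sense in which entropy is "essentially dimensionless".

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant, in joules per kelvin

def boltzmann_entropy(W):
    """S = k ln W, for W equally likely microstates."""
    return k_B * math.log(W)

# One two-state system (W = 2): S/k is the dimensionless entropy, ln 2.
S = boltzmann_entropy(2)
print(S / k_B)  # → 0.693... (= ln 2)
```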

    On to my question about the 2x2x2 Rubik's cubes: it doesn't make sense to say either image has an entropy on its own; you have to compare the images. If you choose the left-hand image to be the solution state, then the right-hand image has more entropy (disorder?). In particular, unless the moves that take the cube from one state to the other have been recorded, they must be unknown or forgotten (i.e. irreversibly erased). The entropy measure is therefore algorithmic.

    So marom has it almost right; the choice of which image represents a more disordered, or randomized, state is a matter of aesthetics. Any algorithm that takes the cube from this chosen state to the other state will have an uncertain (but finite) number of 'moves' in it. This is the algorithmic entropy.
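    For scale, a side calculation (standard counting for the 2x2x2 "pocket cube", not something from the thread): the cube has 3,674,160 reachable states, so pinning down a completely unknown scramble takes about 22 bits; the log of the number of candidate states is an upper bound on the uncertainty being discussed here.

```python
import math

# Count reachable 2x2x2 states: fix one corner in place to mod out
# whole-cube rotations; the remaining 7 corners can be permuted freely
# (7!) and 6 of their orientations chosen freely (3^6, the last is forced).
positions = math.factorial(7) * 3**6
print(positions)             # → 3674160
print(math.log2(positions))  # ≈ 21.8 bits to specify one state exactly
```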

    A footnote: when you rotate a layer of a Rubik's cube, it makes a noise or generates some sound energy (I've seen this called "cube chuckle"). Since the entropy is (the logarithm of) the number of moves you make to "solve" the cube, the amount of sound energy is a measure of ... what?

    Another footnote: Sometimes I unintentionally overcorrect mistakes, resulting in another mistake (I just edited the last para. again).

    I'd like to go over what I posted, though. There is a number N of moves that separates the cubes. You can reduce this to N - 2 immediately, by recognizing that there is no uncertainty when there is only one move between two states. So you can posit the existence of two states, each one move away from one of the two cube images, and each a partial solution.

    N is unknown, but hopefully not too large. The entropy is actually the uncertainty (how is it measured, though?) in the size of N, not the log of N. Is probability involved? I can't see why thermodynamics would be needed--you don't have temperature or pressure, you have position and orientation of 'parts' which have cubic symmetry, bounded physically by a structure with . . . cubic symmetry.
    You can also consider that solving (right to left, image-wise) is effectively erasing some information, by restoring an overall, or global, symmetry.
    Last edited: Aug 9, 2017
  9. extr3me

    extr3me Well-Known Member Claimed IQ: 130+

    I know it's counterintuitive, but the left one just gives me the creeps; no matter what you do, you are going to mess up its perfect composure.
    And no, I'm not listening to entropy, I'm listening to whatever is breaking. Sometimes it can also be a good thing: imagine you are trapped and you use an explosive to shatter your way out. I'm seeing entropy in people when they listen to TV nonsense :ROFLMAO:
  10. Tiaric

    Tiaric Active Member Claimed IQ: 130+

    Actually I've said some things that aren't really true.

    The cube, in whatever state, doesn't have entropy, the entropy is in the 'permutation distance' between two states (one is the chosen solved state, but it needn't be, it could be any state you decide is a partial solution, i.e. closer to the solution than where you started).

    I can permute the cube from the solved state with a completely deterministic algorithm, remembering what I did, but when you look at the result you see randomness. Then the entropy is all yours; I don't have any uncertainty. Hence randomness or disorder is not entropy.

    This is because randomness is subjective; look at all the 16-bit binary strings: the ones that look random have no recognizable pattern in them, but that's all.
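    One crude way to make "looks random" concrete (my own illustration, using run counts; plenty of other pattern measures exist): call a 16-bit string "patterned" if it has at most four runs of identical bits, like 0000111100000000. Such strings turn out to be rare; almost every string is "random-looking".

```python
from math import comb

n = 16

def strings_with_runs(r):
    # A length-n binary string with exactly r runs is fixed by its
    # starting bit (2 choices) and by where the r-1 run boundaries
    # fall among the n-1 gaps between positions.
    return 2 * comb(n - 1, r - 1)

total = 2**n
patterned = sum(strings_with_runs(r) for r in range(1, 5))  # at most 4 runs
print(patterned, total)   # → 1152 65536
print(patterned / total)  # under 2% of all strings
```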
  11. Tiaric

    Tiaric Active Member Claimed IQ: 130+

    Information, according to physics, can't be destroyed (or created). Erasing information means an irreversible transformation, such that the information is "lost to the environment"--hence is not physically destroyed.

    But that means the amount of information in the universe must be fixed, although the amount in any given context (a binary computer, a brain, etc.) need not be if erasure is included (and of course it is, because information is only stable for a limited time; irreversible errors occur). An error in a store of information is a random loss, requiring correction if possible. Correction depends critically on how the information is encoded. So all the information in the universe is eventually transformed, or re-encoded if you will, by "random processes".

    Otherwise, information is transformed by deterministic algorithms, some (all?) of which depend on information being erased.

    If you could count, by listening, the number of rotations a cube-solver makes, that's information. The sounds dissipate, but you've written some information down (or just remembered it). Of course, you could also build a device that does the same thing. The number of moves, however, doesn't tell you anything about what the moves are--you still have an entropy, if you define the actual moves as information (which is relevant to your context).

    So if you see some article saying "A coin has one bit of entropy", that's just wrong. A coin has one bit of information (or rather it gives you that much), the entropy is the (your) uncertainty of the state of this bit, and that's it. The number of bits and the log (base 2) of the number of states coincide, but they are not equivalent--entropy is the absence of information (loss of predictability), in an additive form (the logarithm to a chosen base, but you're free to choose the base).
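    The coin claim can be sketched directly (the function name is my own): the entropy is your uncertainty about the bit, and it shrinks to zero as the coin becomes predictable, even though a toss still yields one bit of information.

```python
import math

def coin_entropy(p):
    """Entropy, in bits, of a coin that lands heads with probability p."""
    if p in (0.0, 1.0):
        return 0.0  # no uncertainty at all
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

print(coin_entropy(0.5))  # → 1.0: a fair coin is maximally unpredictable
print(coin_entropy(0.9))  # ≈ 0.47: a biased coin is easier to predict
print(coin_entropy(1.0))  # → 0.0: a two-headed coin tells you nothing new
```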
    Last edited: Aug 14, 2017 at 3:35 AM
  12. marom

    marom Well-Known Member Claimed IQ: 140+

    One of my little hobbies is information theory. You've used the broadest possible definition of information in your post. I do not necessarily disagree w/you. However, it still remains that entropy describes the process of change in any given system, whether the system is small and local or the Universe itself ... I'm not a physicist, nor have I made any close study of the subject; no, my subject is mathematics. Still, of course, I cannot help but be interested in physics, at least to the point where I have my own favorite physicist, Dr Lawrence Krauss. I think that you should watch some of his lectures on YouTube. I recommend "Our Miserable Future," which details Krauss' understanding of entropy. Full disclosure, of course: Krauss has replaced the late Christopher Hitchens as one of the "Four Horsemen of the Apocalypse," the others being Daniel Dennett, Richard Dawkins, and Sam Harris, atheism's outspoken braintrust.
  13. albinoblanke

    albinoblanke Member Claimed IQ: 140+

    eh.. I don't like entropy. You have to assume there is an ordered state before entropy can take place. I rather think that entropy is taking place all the time instead of just after an ordered state. If a piece of paper is an ordered state then what was the tree that the paper is made of? or the carbon that makes up the tree.. bleh.. Or is an ordered state just a state of entropy that is used as a starting point? iewl.. I don't know :X3:. I would appreciate it a lot if somebody explained these things to me because there must be some logical explanation for this.:thumbsup:
  14. marom

    marom Well-Known Member Claimed IQ: 140+

    The "ordered state" is the initial state of whatever system you are talking about - and this can be quite arbitrary. A sheet of paper might be viewed as an entropic state, if one begins w/the tree; however, if one begins w/the piece of paper, ignoring its antecedent states, then whatever happens to the paper would be its own entropic states. The reason that the Universe as a whole is deemed a singular entropic entity is that it is believed that there was no antecedent state for the Universe: it and everything it contains - including time itself - came into being at the Big Bang.
    albinoblanke likes this.
  15. albinoblanke

    albinoblanke Member Claimed IQ: 140+

    If points of origin continue forever, like there is always something before something, wouldn't that mean that there is always entropy and an ordered state doesn't exist, unless you 'imagine' an ordered state of something? If we found evidence of 'something' before the Big Bang, wouldn't that just mean that the Big Bang is an entropic state of 'something'? The Big Bang sounds so lacking; there must be something before it! Thank you for the info.:thumbsup:
  16. marom

    marom Well-Known Member Claimed IQ: 140+

    Because our minds have evolved to be pattern-finding machines (very helpful from an evolutionary standpoint), we find it hard to conceive of an effect w/o a cause; however, this great strength of the human mind is also a trap because it leaves out the very real possibility of some effects that do not have causes. Eastern philosophy is much better adapted to cope w/this startling possibility and accepts that the Universe itself might have required no cause - that the Universe, as they say, "just is" ... Yes, I know perfectly well that such an argument is falling into the hands of theists, who say that God required no cause. So be it. My reason for not believing in God has nothing to do w/the possibility of God's existence - which I acknowledge - but w/my own failure to see any convincing evidence of that existence. "My Mommy told me that there was a God" is just not good enough as far as I am concerned, and I, of course, discount scripture as worthless.
  17. albinoblanke

    albinoblanke Member Claimed IQ: 140+

    The problem is what comes out of nothing. If there is no cause for, say, the Universe, then what is the reason that the Universe just happens to be the thing that has no cause? Things don't shape themselves; they have to be affected by something. Either there are other realities (infinitely many of them) and each of them realizes every possible thing that can exist without a cause, or nothing really exists, making our reality an illusion, including the Universe. Or, one more logical answer: the Universe doesn't exist without cause and is caused by something, and that thing is caused by something, and so on. One of these three must be true in order for everything to function properly. Even god or any intelligent designer would probably scratch his/her/its head if he/she/it wondered why he/she/it just happens to be the thing with no cause. (<--Huge humanlike assumption, I know, but you get the point) :thumbsup:
  18. Tiaric

    Tiaric Active Member Claimed IQ: 130+

    About order and disorder.

    These are things you, as an observer/experimenter, choose, or rather you choose the states.
    That's an obvious thing about a Rubik's cube: if the ordered state is chosen to be the same as the 'factory' state, i.e. what it looks like in the original packaging, then you can randomly apply some set of rotations and get a 'randomised' state which is no longer ordered.
    Screenshot from 2017-08-11 13-40-35.png
    Alternatively you say the opposite is true, and that the disordered looking state is actually a word or a symbol, it has an ordering. Then what you do by "solving the cube" is erase this word, it dissipates irreversibly (but not spontaneously) if you forget the moves.

    What about a set of sequential coin tosses? It's unlikely you will see a pattern repeat; it's much more likely you'll see an unordered sequence. But the coin isn't random, the sequence is; the cube isn't random, the word you generate is.
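    A minimal sketch of that last point (the sequences below are my own examples): a fair coin assigns every specific sequence the same probability, so "patterned" versus "random-looking" is a judgment about the outcome string, not about the process that produced it.

```python
def sequence_probability(seq):
    """Probability of one exact sequence of independent fair coin tosses."""
    return 0.5 ** len(seq)

patterned = "HHHHHHHHHH"       # looks ordered
random_looking = "HTTHHTHTTH"  # looks random

# Both specific sequences are exactly equally likely.
print(sequence_probability(patterned))       # → 0.0009765625
print(sequence_probability(random_looking))  # → 0.0009765625
```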
    Last edited: Aug 16, 2017 at 7:56 AM
