If I could point to some paper that had my name down as contributing a 10,000th part to one detector in the Large Hadron Collider I would be over the moon.
As it is, I'm mentioned in the middle of the credits on a couple of obscure papers that nobody will ever read, just because I set up some communications or embedded system for their projects. I rather wish they had not put my name on them.
We're all putting ourselves "out there" ... even in this forum. Bits and pieces of code here, begged and borrowed pieces of schematic there, etc.
Every now and then you might get a decent hit. ...Wait until you're in a small room with a few Feds and they're debriefing you on what you can and can't say just prior to a presentation you're giving at a security conference. That's always interesting.
Math. It contains truths that are distinct from the physical world. The square root of two would be one of those.
Having been puzzling over the square root of two since high school, I'm still not totally sure if it exists or not. Even in the rarefied world of mathematical musing. Sure, we have a speculation that there is some number that when squared will result in 2. That is, x * x = 2.
But we can never write that number, x, down. We can never produce it as an actual result. We can't even do simple things like ask what the last digit of x is.
At best mathematicians seem to prove the existence of the square root of 2 by defining a process that you can perform that will provably converge on some number which, if you could square it, would yield 2. That's all fine. In maths such a process does not take any time. You just assume it's done and move on. It also skips over the issue of actually squaring the result of that process.
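To make that concrete, here is a throwaway Python sketch of one such process, bisection over exact rationals (just one choice of process, nothing special about it, and the function name is my own). The bounds are always rational, their squares straddle 2 at every step, and the loop never actually "arrives":

```python
from fractions import Fraction

def bracket_sqrt2(steps):
    # Exact rational bounds with lo*lo < 2 < hi*hi at every step.
    lo, hi = Fraction(1), Fraction(2)
    for _ in range(steps):
        mid = (lo + hi) / 2
        if mid * mid < 2:
            lo = mid
        else:
            hi = mid
    return lo, hi

lo, hi = bracket_sqrt2(30)
print(float(lo), float(hi))            # ever tighter rational bounds...
print(float(lo * lo), float(hi * hi))  # ...whose squares straddle 2 but never hit it
```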
Some mathematicians like to avoid such numbers and stick to the rationals. Like Norman Wildberger. If you'd like a good read on the existence of root two there is this from the Maths Dept of the University of Cambridge: https://www.dpmms.cam.ac.uk/~wtg10/roottwo.html
Sometimes I think this is all brilliant. Other times I'm somehow not satisfied with it.
So, information itself must be embodied by something, but that which it represents doesn't have to be. My head hurts!
That's my point. If the thing which is being represented doesn't have to be embodied by something, indeed cannot be, how can we claim the thing exists? All we can claim is that if root 2 existed then squaring it would result in 2.
Aside: Did I once point out here that 1 + 2 + 3 + 4 + 5 + 6... to infinity = -1/12?
My head hurts too now. I don't think I'll be getting any sleep tonight.
Having been puzzling over the square root of two since high school, I'm still not totally sure if it exists or not. Even in the rarefied world of mathematical musing. Sure, we have a speculation that there is some number that when squared will result in 2. That is, x * x = 2.
But we can never write that number, x, down.
Sure we can. Change the base of your number system from ten to the square root of 2, and it's just 10. And 2 is just 10*10 = 100. Of course, now other numbers that were rational in base 10 no longer are in the new system. Irrationality isn't a function of quantity, per se, just in how we represent numbers.
I think what you are suggesting is equivalent to making a right-angled triangle with a hypotenuse of length 1 and the other two sides of length one over the square root of two. As you say, it basically moves the problem elsewhere!
I must point out though that irrationality is a lot deeper than our number representation.
We have integers, 1, 2, 3 .....
We have rational numbers. Rationals can be expressed as p/q where both p and q are integers. 1/2, 2/3, and so on.
We have irrational numbers. The Irrationals cannot be expressed as an integer divided by an integer.
Now the deep part...
It turns out that there is an infinite number of integers. There is also an infinite number of rationals. The weird part here is that you can count all the rationals with integers, that is to say that the infinity of rationals and the infinity of integers are the same size.
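Just to make that counting claim concrete, here is a quick Python sketch of the usual zig-zag walk over p/q (my own toy code, nothing official). Every positive rational gets a finite position in the list, which is all "countable" means:

```python
from fractions import Fraction

def first_rationals(limit):
    # Walk the grid of p/q along anti-diagonals p + q = 2, 3, 4, ...
    # skipping repeats like 2/4 == 1/2, so every positive rational
    # appears exactly once at some finite step.
    seen, out, s = set(), [], 2
    while len(out) < limit:
        for p in range(1, s):
            f = Fraction(p, s - p)
            if f not in seen:
                seen.add(f)
                out.append(f)
                if len(out) == limit:
                    break
        s += 1
    return out

print([str(f) for f in first_rationals(15)])
# ['1', '1/2', '2', '1/3', '3', '1/4', '2/3', '3/2', '4', ...]
```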
But, weirder and weirder, you cannot count the reals with integers. There are infinitely more reals than integers or rationals!
Thank you Georg Cantor for that extra headache inducing thought.
So, this whole square root of two thing is much deeper than our choice of number representation.
There are infinitely more reals than integers or rationals!
It's just the difference between digital and analog!
BTW, I just finished watching the four-part series, "The Story of Maths" from BBC. Highly recommended! If you want to learn more about Cantor's enumeration of infinities, see this:
Yes indeed. Only I'm now pretty much convinced nature is not analogue. Everything is quantized. Certainly in electronics you either have an electron or you don't.
Cantor's diagonal method and his alephs and the Hilbert Hotel have been giving me a headache for decades now. I think I would sleep better at the Bates Motel.
I had obligations today and wanted to touch my P2 before I came back.
It is really nice to have a decent conversation... and I only barely skimmed page one, but I think I need to clarify a little.
I love science. I am in no way a physicist, but I have brushed up against some really good ones. So, I do have that kind of experience to rely on and I have given the subject a fair amount of time.
I think we are all largely agreeing about almost everything... even when it seems like we aren't.
Many years ago, I was getting pleasantly toasted with my favorite scientist, Bob, at a very nice bar in Urbana... and I said, "you know this whole business of physical laws is really bothering me." and so we started ... we listed every physical law that we knew and proceeded to explain why it was wrong. He would do a few, I would do a few and before long, there weren't any laws left standing.
He asked me to give a lecture on the subject to his ESR class and I did!!!
Bob's lesson for the students was that we use physical theories as a context to explore nature,
but don't mistake any of them for being true... because they aren't. They are constructs that help us make progress. Bob taught his students this because that is what he was taught by his mentor, an historically important physical chemist.
We didn't destroy physics in our drunken haze... the actual reality is that every physical law has a "domain" of truth. So long as this is understood, a law provides good insight into what can be expected within that domain, but step outside of that domain and use that same law, and you are asking for trouble.
I have two major problems with GR.
1) The "twins paradox" if that is what I described in my first post.
What I described isn't a paradox to me. To me it is an inherent contradiction in the thought process that is encapsulated in the theory. A theory that contradicts its own postulates is a problem for me.
2) E=mc^2.... which looks to my mind like E=mv^2... where v becomes c. This is a serious issue.
I talked to Bob about this as well... E=mv^2 is the equation for kinetic energy. Bob chimed in at that point in the conversation that sometimes it is written E=mv^2/2:)
This, to me, indicates that Einstein actually believed that energy is conserved. I wish Newton had been there to explain to Einstein why this simply cannot be true.
Here is the reason:
Except for specialized cases, you can conserve momentum or you can conserve energy... but you cannot conserve both energy and momentum at the same time (and by same time... I actually mean the same experiment or physical context) Nature forces you to pick one. If you pick the wrong one, things get screwed up and if everyone believes you, they can stay screwed up for a very long time.
I tend to think that the conservation of momentum is closer to the vast majority of cases... overwhelmingly ... essentially almost always. If I were to pick a principle to be at the center... it certainly wouldn't be based upon energy. Most of the time it isn't conserved. You cannot have a physical theory without conservation... it leaks stuff all over the place. It isn't actually a physical theory. Conservation is one of those first principles. If you don't have it, you don't have a physical theory. You have a religion.
The deeper truth is that at some level even the concept of momentum kind of loses its meaning... don't ask me exactly where or how... and you can't ask Bob... he's dead. But somewhere deep down in the ethernet... even momentum cannot be used as a meaningful construct. I seem to remember that at some point you have no choice but to use something else, "spin?" I think that's what it was.
But of course, spin isn't actually spin in any conventional way, at least that's what Bob said:)
I don't have a link, but there is a video of Feynman's first public presentation of quantum electrodynamics, which was in New Zealand. Someone here will know exactly where to find it. It is a classic. Listen to the questions... they were not very happy with our hero :)
You have to remember, everything changed after the end of the War in the Pacific. Physics was in a strange place... having information and theory that could (in the wrong hands) destroy the world.
I view Feynman's work as the best publicly available work that won't get us all killed.
But that doesn't mean that it is not also a poor reflection of clearer levels of thought that are still not ready for public consumption.
For a brief time, I employed a lady physicist, who was at the same time doing passivation work for the Navy. She was pretty blunt about the assertion I just made :) I hired her because her hubby had done some work using radio signals to stimulate retinal responses... he could make his lady see all sorts of things. Once I had hired him and finally understood exactly what he was doing... I lost interest.
The title of Feynman's book might be a statement of fact: https://en.wikipedia.org/wiki/Surely_You%27re_Joking,_Mr._Feynman!
Rjo_, there is a playful aspect to Feynman that suggests he never really did let on all his thoughts. Guarded, but also a tease, like he was just daring people to ask a bit better questions.
I'm not sure I can qualify the quality of thought.
I am sure I don't care, content to travel, experience, and enjoy. It's a big place we live in, and it's full of wonders and toys. I like to do things that make me happy, and money, and it's all quite good enough.
Maybe, I can't care as you suggest too.
Feynman has the most value, to me, in how he made the method a part of him, yet somehow managed to also cultivate that young sense of wonder about it all. He lost a part of that in Los Alamos. When one looks, it's there, plain as day, and I'm not speaking to the other, obvious and difficult loss of his wife.
Let's just say I don't doubt you.
I had a similar sort of experience with a primate researcher. I asked one of those kinds of questions. Innocent really, just idle curiosity. Had no idea. Just unlucky enough to hit close to a mark.
I didn't let her finish, nor did she press. I'm happy for that.
The concept of thermodynamic entropy preceded information theory. Originally the construct was intended to represent disorder... the eventual movement of any state toward randomness. At the time the "world view" was that disorder was natural and to be expected... that highly ordered states, such as life, were possible only because they were allowed by the effects of statistical probability... anything however unlikely can and probably will eventually occur, but no matter because eventually entropy wins. That was the thermodynamic explanation of life and the dim view of our future. It was a law (probably still is :).
Then guys like Shannon came along and demonstrated that almost the exact same statistics... minus the energies... could be used to explain the transmission of information... in which case information is maximized when all probabilities are equal...
So you have a problem... the same statistics, one describing the end of the Universe as we know and the other describing our conversations on the telephone... minus the energy terms of course:)
So information is entropy. Entropy is disorder. Order is disorder. Where is the problem?
Here is the Nobel prize I missed because of a pregnant immuno-therapist:)
If you look at long-term immunity... it is mostly the job of an immunoglobulin called IgG.
IgG has two binding spots... they look like scrunched-up stubby arms. They are all the same size and not very big.
If you look at the binding sites, they are built up of amino acids... it's a protein. Since you have a fixed-size protein, there is a limit to the number of combinations of amino acids which are possible. By treating each unique type as its own species, information statistics can be easily applied.
That means that we should be able to measure the total amount of immunologic information that is being used up at any time. When an infection or tumor occurs, the overall heterogeneity of immunoglobulins drops... more is produced of selected species. The overall information can go up, because there is the same amount of immunoglobulin of the old type and more of the new specific type, or total information can go down, because the immune system is fatigued in some way.
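If it helps, here is a back-of-envelope Python sketch of what I mean. All the numbers here (the site length, the species names and the counts) are invented purely for illustration, not real immunology data:

```python
import math
from collections import Counter

SITE_LENGTH = 15   # hypothetical number of amino-acid positions in a binding site
print("possible species:", 20 ** SITE_LENGTH)   # 20 amino acids per position: a huge but fixed limit

def repertoire_information(counts):
    # Shannon information of the repertoire, in bits, treating each unique
    # binding-site sequence as its own "species".
    total = sum(counts.values())
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

healthy  = Counter(A=100, B=100, C=100, D=100)   # even mix: information at its maximum
infected = Counter(A=100, B=100, C=100, D=700)   # one species expanded: heterogeneity drops

print("healthy :", repertoire_information(healthy), "bits")
print("infected:", repertoire_information(infected), "bits")
```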
We aren't doing it. If you carried the idea forward a little, you could figure out who was going to catch the next virus.
The technology has been available for at least 15 years.
It used to be possible for individuals to do work like this... today, it is almost impossible.
@all: fun, thought-provoking thread. Thanks.
Oh yes, I have had many such toasted debates with many Bobs over the decades. Some actual physicist Bobs and many layman Bobs. Seems everybody likes to ponder the nature of reality; the lack of a deep appreciation of the mathematics of physics is always a problem. For me as well, despite having studied physics for 4 years and been up to my eyeballs in quantum mechanics, Bose-Einstein statistics and so on at the time.
"every physical law has a "domain" of truth
Quite so.
As a simple example, around here we use Ohm's law a lot, V = I * R, to calculate voltages and currents in resistors. That's nice and easy and demonstrably works.
Until you notice that it's not exactly correct: at higher currents the voltage is higher than predicted for a given current. Something else is going on. At higher currents more power is dissipated, the temperature of the resistor rises, and that changes its resistance.
At the extreme the resistor explodes, our Ohm's law is totally busted, something else is going on there!
At the low end, if we are looking at really low currents we will find a lot of noise coming out of our resistor. Our Ohm's law says nothing about that. There is something else going on. It's thermal noise and such.
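To put some toy numbers on that (only a sketch: the resistor value, temperature coefficient and bandwidth are all invented):

```python
import math

R0    = 1000.0    # nominal resistance, ohms
ALPHA = 0.004     # assumed temperature coefficient per degree C

def voltage(current_amps, temp_rise_c=0.0):
    # Plain Ohm's law, optionally with the resistor warmed up by its own dissipation.
    r = R0 * (1 + ALPHA * temp_rise_c)
    return current_amps * r

print(voltage(0.001))                # 1 mA, cold: 1.0 V, Ohm's law as taught
print(voltage(0.1, temp_rise_c=50))  # 100 mA with 50 C of self-heating: 120 V, not 100 V

# And at the low end, the thermal (Johnson) noise that Ohm's law says nothing about:
k, T, B = 1.380649e-23, 300.0, 1e4   # Boltzmann constant, room temperature, 10 kHz bandwidth
print(math.sqrt(4 * k * T * R0 * B), "V rms of noise")   # a few hundred nanovolts
```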
Similarly Newtonian mechanics breaks down at really high speeds, or in high gravitational fields, or at the very small scale. Einstein's General Relativity takes care of the first two. We need quantum mechanics for the latter.
General Relativity also breaks down at the extremes.
None of this makes our Ohm's law or Newtonian mechanics wrong, it's just that there are other effects going on that they don't take into account.
For those that subscribe to the big bang theory and the singularity before time and space, yet cannot fathom how so much mass can exist now from such a small entity, I have my own perception.
The singularity is still very small but also very fast, and occupies all space at the same time due to being very fast. It's also why I and everything else exist physically, yet my perception is merely being tuned to a particular frequency or bunch of frequencies that I can react with.
Although the particle is very fast, there are areas where the tuning isn't perfectly aligned and doesn't interact in so far as I, or for that matter "everyone else", are concerned. These are areas where there are voids or "Dark Matter".
The exclamation marks are intended for reality purposes, i.e. do we actually exist? Is reality real? Or are we just interactions of energy?
I believe I am typing on a keyboard and posting on a forum but am I??
The "twins paradox" if that is what I described in my first post. What I described isn't a paradox to me. To me it is an inherent contradiction in the thought process that is encapsulated in the theory
That is exactly what a paradox is, a "contradiction in the thought process that is encapsulated in the theory"
The Twins Paradox comes out of special relativity. In SR the twins are travelling at constant speed. They never get back together. They see each other ageing at different rates in an apparently contradictory manner. There is no contradiction here because they never get back together to be in the same place and same frame of reference to compare their ages. The contradiction does not matter, it does not happen in physical reality.
If the twins do get back together then at least one of them has been changing his inertial reference frame. He was going away, then he was coming back. Take all that into account and we have an accurate prediction of which twin will have aged most when they get back together.
I think a lot of this "paradox" comes about because relativity predicts that the rate at which a clock ticks depends on the speed of the observer. But we still tend to look at the results with a Newtonian idea of a universal time that is the same everywhere and then get very confused.
This has been demonstrated many times. Using atomic clock "twins" flying around the world on airliners. Our GPS system relies on General Relativity to take into account the different ageing rates of the satellite "twins". Otherwise the accuracy of GPS would be terrible.
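For anyone who wants a feel for the numbers, here is a small Python sketch of the bookkeeping, with a made-up trip at 0.8c and the acceleration phases ignored:

```python
import math

c = 299_792_458.0        # speed of light, m/s
v = 0.8 * c              # assumed cruise speed for the travelling twin
earth_years = 10.0       # round-trip duration as measured on Earth

gamma = 1.0 / math.sqrt(1.0 - (v / c) ** 2)
traveller_years = earth_years / gamma

print("gamma:", gamma)                     # about 1.67
print("traveller ages:", traveller_years)  # about 6 years against Earth's 10
```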
E=mc^2.... which looks to my mind like E=mv^2... where v becomes c. This is a serious issue.
I talked to Bob about this as well... E=mv^2 is the equation for kinetic energy. Bob chimed in at that point in the conversation that sometimes it is written E=mv^2/2:)
Wait a minute, there is something wrong here:
The kinetic energy of a moving body is E = mv²/2 NOT E = mv²
The equivalence of mass and energy, E = mc², is something else. Let's not go there... yet.
Bob must have been well toasted.
This, to me, indicates that Einstein actually believed that energy is conserved. I wish Newton had been there to explain to Einstein why this simply cannot be true.
I'm not sure what you mean there. Energy is conserved. Newton and Einstein would both agree on that. As would every other physicist.
Except of course Einstein showed the equivalence of mass and energy, E = mc². Still energy is conserved, sometimes in the form of mass.
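A quick numeric aside may help show why mixing those two formulas up matters; the values here are just for illustration:

```python
m = 1.0               # kg
v = 30.0              # m/s, roughly motorway speed
c = 299_792_458.0     # m/s

kinetic = 0.5 * m * v ** 2    # Newtonian kinetic energy, E = mv^2/2
rest    = m * c ** 2          # mass-energy equivalence, E = mc^2

print(kinetic, "J of kinetic energy")   # 450 J
print(rest, "J locked up as mass")      # about 9e16 J, some fourteen orders of magnitude more
```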
Except for specialized cases, you can conserve momentum or you can conserve energy... but you cannot conserve both energy and momentum at the same time (and by same time... I actually mean the same experiment or physical context) Nature forces you to pick one.
Wait another minute. Something is very wrong here.
Both energy and momentum are conserved. At the same time. In the same system.
Ten minutes playing with a Newton's Cradle or playing billiards will convince you of that.
Consider this:
If you slam two balls into the stationary balls of a Newton's Cradle then two balls come flying out the other end at the same speed. Slam three in and three come out. And so on. Well, why is it not that only one ball comes flying out the other end at some higher speed, conserving momentum, or energy, but not both? Or why not more balls coming out than went in, at lower speeds? Well, it's because to make the right prediction you have to assume conservation of energy and momentum at the same time. Then you see that the only thing that can happen is two in two out, three in three out, etc.
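Here is a little Python check of exactly that argument, with the ball masses set to 1 and the incoming speed to v = 1 (an idealized, frictionless sketch):

```python
def totals(speeds):
    # Total momentum and kinetic energy for balls of mass 1.
    p = sum(speeds)
    e = sum(0.5 * s ** 2 for s in speeds)
    return p, e

v = 1.0
print("two in at v:          ", totals([v, v]))           # (2.0, 1.0) -- the reference
print("two out at v:         ", totals([v, v]))           # momentum and energy both conserved
print("one out at 2v:        ", totals([2 * v]))          # momentum ok, energy doubled
print("one out at sqrt(2) v: ", totals([2 ** 0.5 * v]))   # energy ok, momentum short
print("four out at v/2:      ", totals([v / 2] * 4))      # momentum ok, energy halved
```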
Conservation is one of those first principles. If you don't have it, you don't have a physical theory. You have a religion.
You could put it like that. That's why all of physics adheres to conservation laws. Energy, momentum, information etc.
...even momentum cannot be used as a meaningful construct. I seem to remember that at some point you have no choice but to use something else, "spin?" I think that's what it was.
Well, yes. Now we get into the wonderful world of quantum mechanics. All those conservation laws still apply. But now you have to take into account the quantized nature of everything...
Shannon's theory of information:
H = -Σ p(i) log2 p(i), summed for i from 1 to N,
where p(i) is the frequency of some occurrence.
And please be serious... no making fun of me. I'm a little sensitive about this.
Why mix bases? Why take a base 10 statistic... and mix it with base 2?
I don't get it... why isn't the frequency expressed as a base two statistic?
In that equation p(i) is the probability of occurrence of the i-th possible value of the source symbol, as Wikipedia puts it. p(i) is therefore a number from zero to one. It really does not matter what number base we use for that number. Why would it? It's just a fraction of 1 however we represent it.
So why use log base 2 you ask?
No particular reason. Except we can define information as the number of yes/no questions required to determine the state of a thing. Each of those yes/no questions is a "bit" of information. That is the smallest unit of information we can think of. All of this is familiar to computer users.
Well, yes/no "bits" can be represented as 0 and 1. Which fits with the binary number system. That naturally leads to using log to the base 2.
The end result of the equation as you wrote it, using log base 2, gives a result in bits per symbol. There is no reason you cannot use natural logs, base e, in which case your result is in "nats" per symbol. Or you can use log base 256, in which case the result is in "bytes" per symbol.
All you are doing by changing the base of the log is changing the units of the result. A simple scaling, multiplication, can turn any one of those results into any other.
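A two-minute Python check of that scaling claim, using an arbitrary example distribution:

```python
import math

p = [0.5, 0.25, 0.125, 0.125]   # any probability distribution will do

def entropy(probs, base):
    return -sum(x * math.log(x, base) for x in probs if x > 0)

bits     = entropy(p, 2)        # 1.75 bits per symbol
nats     = entropy(p, math.e)   # the same quantity in nats
in_bytes = entropy(p, 256)      # and in bytes

print(bits, nats, in_bytes)
print(nats / bits, math.log(2))  # the fixed conversion factor between bits and nats
print(bits / in_bytes)           # 8.0, since a byte is 8 bits
```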
Somebody once wrote that "Thermodynamics owes more to the steam engine than the steam engine does to thermodynamics"
When they were developing the steam engine they had a problem. At one end you had energy in the form of hot steam under pressure. At the other end you had work being done by a rotating shaft. Problem was that the energy coming out was less than the energy going in. That's expensive. How to make work cheaper?
That problem leads to wondering where the missing energy went. It seems to have disappeared somewhere. But we like energy to be conserved. Well it's gone into heat along the way, it has become hidden from us. Entropy is a measure of that hidden stuff.
...anything however unlikely can and probably will eventually occur,...
There is a thought experiment about this that I quite like:
Take a bucket and half fill it with white sand. Now fill the other half with red sand. Now, we can imagine that we start swapping those white sand grains among each other and swapping those red sand grains among each other. There are billions and billions of ways to rearrange the grains like this, swapping red for red and white for white. No matter what we do we still have a bucket half filled with white and half filled with red.
Now, take a stick and start stirring up the sand. Soon enough we will have what looks like a bucket of pink sand, as red grains have moved in to the white half and vice versa. We can continue stirring for a very long time and we will always have pink sand.
Why is this? Because there are even more billions and billions of ways of rearranging the sand grains to get what looks like pink if we don't care what colours we are swapping around. Far more than just stirring up the red by itself or the white by itself as we did before.
Ergo, the probability of ever getting back to what looks like a half red, half white, bucket of sand is vanishingly small.
The pink sand situation has a higher entropy than the half white, half red, situation.
It's not that it can never happen, it's just really unlikely.
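For the curious, here is the counting behind that, done in Python for a comically small bucket of only 2 × 20 distinguishable grains (a real bucket is unimaginably worse):

```python
from math import comb, factorial

N = 20   # red grains, with the same number of white grains

separated = factorial(N) * factorial(N)   # red shuffled among red, white among white
any_mix   = factorial(2 * N)              # every grain free to go anywhere in the bucket

print("separated arrangements:", separated)
print("all arrangements:      ", any_mix)
print("chance a random shuffle looks separated: 1 in", comb(2 * N, N))   # about 1 in 1.4e11
```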
In modern times physicists like to say "Entropy almost always increases"
...that highly ordered states, such as life, were possible only because they were allowed by the effects of statistical probability.
The idea that ordered states like life are somehow working against the principle of entropy always increasing is oft quoted.
The thing to remember is that, whilst a life form may look like it is highly ordered and contrary to entropy increasing, the mistake is to be just looking at the life form. One has to consider the system as a whole. That life form is consuming energy; it's increasing the entropy of its environment. Nothing says you can't decrease the entropy of a given volume of stuff. Like when you put your beer in the fridge. But you do so at the expense of increasing entropy elsewhere.
In my younger days I always thought gravity was what decreased entropy somehow. Gravity makes ordered systems by sucking the heavy stuff to the bottom and letting the light stuff float on top. We end up with ordered layers of different stuff.
Then I found Hawking agreed. He said that ultimately gravity sucks everything around into a black hole and then all the information/entropy you have is lost. A black hole is simply specified by its mass, charge and spin. Not much information there, is there?
Then I found Susskind showed that a black hole can indeed have entropy, it has a temperature. Information is not lost.
Grrr...I don't know...
Man this is a long physics session! I need a beer.
One of you mentioned the LHC, the computers they use are all running Linux. Now do you want them to include the names of individual penguins (the computers) each time they file papers on their discoveries?
Yes. And the name of every single taxpayer who has contributed to the billions of Euros it all cost.
It's quite amazing really. After a few decades and billions of Euros and millions of man-hours we end up with the biggest single machine on Earth. Then comes the presentation of the evidence for the existence of the Higgs Boson. A graph on a PowerPoint slide with a barely noticeable bump where a bump would not be if there were no Higgs. Everyone cheers, Higgs and the old guys are reduced to tears. I'm looking at that tiny bump on the graph and wondering WTF?!
I am not complaining, mind. I firmly believe we should all be prepared to throw a few dollars each every year into such fundamental research.
Over the years since then I slowly get some dim understanding of what they have been up to. It's all good stuff.
One of you mentioned the LHC, the computers they use are all running Linux. Now do you want them to include the names of individual penguins (the computers) each time they file papers on their discoveries?
I think that was me, and no, of course not. But in something like the LHC, or LIGO, since that's where this started, the physics wouldn't happen without all the technology. If the technology is commodity stuff, like generic computers, then there's no reason at all to mention "the penguins". But in those projects, the technology is being pushed beyond anything that previously existed, and the teams that created the detectors, interferometers, etc. have perhaps been as critical to the success as the physicists. Shouldn't their names be on the paper as well?
Thanks everybody. Absolutely wonderful conversation. I still haven't regained myself from my viruses... wish I had a tad more energy.
Heater,
"Then I found Susskind showed that a black hole can indeed have entropy, it has a temperature. Information is not lost."
It is lost... because it is not available to us.
I suspect that large-scale plasmas are self-organizing, where a self-organizing system is defined as any system that consumes information. That is large-scale, consumer-based consumption :)
Beau,
Every time you post... you keep me up at night thinking about what you just said. Shame on you.
One of the many reasons that I took up electronics as a hobby was the idea of being able to play with electrons... I thought that if I learned enough, I could engage in some garage-scale experiments about some ill-defined fundamental question. At this stage, I don't think I will ever know enough about electronics to get close... but I know where you guys are and that might be enough.
There was a conversation about numbers that aren't numbers and I just have to chime in.
Math needs to stop calling things circles, points, lines, squares or triangles. In math the use of any of these nouns requires the reader to postulate the impossible... something infinitely small. If we have learned anything from CERN it should be that there is always something smaller :) Why do we have fifth graders considering the infinitely small?
What is the harm in calling something "circular" or "linear" rather than a "circle" or a "line"... Those kids know about adjectives. And then fully explaining why we don't call them circles or lines anymore. What is the harm?
The minute you do that, you can then define Pi as a real number and talk about ways of actually determining it. Every circular structure is not actually a circle; it actually does have a circumference and a radius (or a series of them), and Pi actually can be computed.
None of these things ever exist except in the minds of confused mathematicians.
We should also stop using the equal sign. No thing ever equals any other thing exactly. When I was in first grade, I completely stopped listening to my teacher (whom I loved) the first time she explained the equal sign. I remember thinking to myself...
Honest to God... on my mother's grave... this is true:
"If that is how you people are using numbers, no wonder the world is so screwed up." All because she said that the equals sign meant that whatever was on one side of the equal sign was exactly the same as what was on the other side... And I could easily see that this was not true. The two sides are always different. I knew what she was trying to say, but I also knew what she was saying and I found what she did say to be absolutely abhorrent.
Why risk it... get rid of the equal sign... never talk about equality... always talk about transforms.
Easy peazy, problem solved:) True mathematics is born.
What you postulate is a carpenter's dream, but mathematicians are not carpenters. I have no problem with abstract concepts dealing with the impossibly small -- or large -- nor with teaching those concepts to students. I grew up learning those concepts as part of the "modern math" curriculum of the late '50s and early '60s. It was as exciting back then as it should be today.
There's nothing wrong with teaching math as an academic subject, rather than as a vocational one, which you seem to be proposing.
Re: the conversation about the square root of two.
I am on thin ice here... what I say above I have thought about for years. The square root of two has always bothered me, but I never had a thought about what should actually be done about it.
If this is wrong, I blame Phil. What Phil demonstrated is something most of us have thought about... it just isn't in the front part of our brains most of the time.
There really is no need for the square root of two.
No-one ever measures anything and reports back that the measurement is the square root of 2.
Taking Phil's example... a right-angled triangular object can only have a hypotenuse that measures the square root of two if you have chosen the wrong unit :) So, in True Math we could have a rule... If your answer involves the square root of two, you have either measured the wrong side or used the wrong unit... it doesn't matter if it is true. It simply gets rid of a number, which we really don't like. In any case you get no credit for the answer.