Isn't there a max filepath length? Or does find not ever deal with that and just deal in terms of building its own stack of inodes or something like that?
That’s what PATH_MAX is. It’s the size of the buffer used for a single path string passed to a system call - commonly 4096 bytes. You can’t navigate directly to a path longer than that, but you can still reach deeper files by chaining relative paths (up to 4096 bytes at a time).
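A minimal Python sketch of that trick (directory name and hop count are arbitrary): descend with short relative hops, roughly what find's traversal machinery does internally, so no single syscall ever sees a path anywhere near PATH_MAX:

```python
import os
import tempfile

# Build a tree whose total path length exceeds a typical PATH_MAX
# (4096 bytes) by descending one short relative hop at a time.
root = tempfile.mkdtemp()
os.chdir(root)

segment = "d" * 200              # each hop is far below PATH_MAX
hops = 30                        # 30 * 201 bytes > 4096 in total
for _ in range(hops):
    os.mkdir(segment)            # relative path: ~200 bytes
    os.chdir(segment)            # never passes a long absolute path

# The logical absolute path is now longer than PATH_MAX, even though
# no single syscall argument ever was.
depth = len(root) + hops * (len(segment) + 1)
print(depth > 4096)              # True
```

Note that `os.getcwd()` can fail once you're this deep, which is exactly why tools that must handle arbitrary depth work in relative terms.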
One of my sisters had four boys (and no girls) and during summers they would drive her crazy with their boredom. When they were about ages 8-14 one summer she said: go in the back yard and see how big of a hole you can dig.
Wide-eyed they said: really? She said yes, dig as much as you want, but the only rule is it all gets filled in before school starts in the fall. 30 years later they say it was the best summer ever. Every day they were working on it and all of their friends would come by and help dig and plan what development would come next.
How deep did they get? Hope she kept an eye on it; unsupported holes get dangerous quickly. People underestimate how much weight is in soil, and just how dangerous that amount of weight can be once the sides give out and it starts moving.
It was more sprawling than deep. It was a series of trenches connecting "rooms". I know they also had "water features" at some points, but the water would soak into the ground pretty quickly then be a mess for a few days, so they didn't do that.
No collapses happened and everyone is still alive. :-)
Happy that all's well that ends well, but for any parents considering this, trench collapses have killed hundreds of workers in just the past decade. Anything deeper than a couple feet might be a hazard that needs to be mitigated.
Also depends on the local geology and how tenacious the kids are. I couldn't dig to dangerous depths as a kid even if I wanted to because the soil got way too rocky to dig through before 3 feet down.
Why not lean into it instead of becoming a wet blanket? Just look at the trench every few hours or so, and if it gets too deep, tell them about shoring and help them set some up.
He ended a critical commentary by suggesting that the author he was responding to should think more critically about the topic rather than repeating falsehoods because "they set off the tuning fork in the loins of your own dogmatism."
> "they set off the tuning fork in the loins of your own dogmatism."
Eh... I don't know. To me, that sounds very AI-ish.
Claude is very good -- at times -- at coming up with flowery metaphoric language... if you tell it to. That one is so over-the-top that I'd edit it out.
Put something like this in your prompt and have it revise something:
"Make this read like Jim Thompson crossed with Thomas Harris, filtered through a paperback rack at a truck stop circa 1967. Make it gritty, efficient, and darkly comedic. Don't shy away from suggesting more elegant words or syntax. (For instance, Robert Howard -- Conan -- and H.P. Lovecraft were definitely pulp, but they had a sophisticated vocabulary.) I really want some purple prose and overwrought metaphors."
Occasionally you'll get some gems. Claude is much better than ChatGPT at this kinda stuff. The BEST ones are the ever-growing NSFW models populating huggingface.
In short, do the posts on OpenClawForum all sound alike? Of course.
Just like all the webpages circa 2000 looked alike. The uniformity wasn't because of HTML... rather it was because few people were using HTML to its full potential.
I'm learning to like 'em more, along with every other human idiosyncrasy. Besides, it makes a kind of sense, the idea of some resonance occurring in one's gusset. Timber timbre. Flangent thrumming.
I thought it was more creative than sloppy. Don't forget that many ordinary phrases were once jarring mixed imagery. To "wear your heart on your sleeve" was coined by Shakespeare; we still use it because it "stuck" due to its unorthodox phrasing.
If you like your prose to be anodyne, then maybe you like what AI produces.
The Barnes & Noble Bookstore (at least the two or three I have been to in the past 10 years) has a single queue. Fry's Electronics did it that way. The self-pay corral at HEB (a huge Texas grocery chain) with about 14 check-out stations does it that way. The Academy Sporting Goods store near me does it that way. The Austin Bergstrom Airport security gates are that way.
I agree that many places have a queue for each register, but the other way isn't entirely rare.
Here is one fact that seems, to me, pretty convincing that there is another layer underneath what we know.
The charge of the electron is -1 and the proton's is +1. The magnitudes have been experimentally measured to be the same out to 12 digits or so, just with opposite sign. However, there is no theory of why this is -- they are simply measured, and that is it.
It beggars belief that these just happen to be exactly (as far as we can measure) the same magnitude. There almost certainly is a lower level mechanism which explains why they are exactly the same but opposite.
The hint from quantum field theory (and things like lattice gauge theory) is that charge emerges from interesting topological states/defects of the underlying field (by "interesting topological shapes" I mean - imagine a vortex in the shape of a ring/doughnut). It's kind of a topological property of a state of the photonic field, if you will - something like a winding number (which has to be an integer). Electric charge is a kind of "defect" or "kink" in the photonic field, while color charge (quarks) are defects in the strong-force field, etc.
When an electron-positron pair is formed from a vacuum, we get all sorts of interesting geometry which I struggle to grasp or picture clearly. I understand that the fact that these are fermions with spin 1/2 can similarly be explained as localized defects in a field of particles with integer spin (possibly a feature of the exact same "defect" as the charge itself, in the photonic field, which is what defines an electron as an electron).
EDIT:
> However, there are no theories why this is -- they are simply measured and that is it.
My take is that there _are_ accepted hypotheses for this, but solving the equations (of e.g. the standard model, in full 3D space) to a precision suitable to compare to experimental data is currently entirely impractical (at least for some things like absolute masses - though I think there are predictions of ratios etc that work out between theory and measurement - sorry not a specialist in high-energy physics, had more exposure to low-energy quantum topological defects).
> something like a winding number (which has to be an integer). Electric charge is a kind of "defect" or "kink" in the photonic field, while color charge (quarks) are defects in the strong-force field, etc.
Right, but then you have the questions of 1) why do leptons have (a multiple of) the same fundamental unit as quarks, and 2) why does that multiple equal the number of quarks in a baryon, so that protons have a charge of exactly the same magnitude as electrons?
I mean, I guess you could say that charge comes from (or is) the coupling of the quark/lepton field to the electromagnetic field, and therefore if it's something that's quantized on the electromagnetic side of that, then quarks and leptons would have the same scale. I'm not sure that's the real answer, much less that it's proven. (But it might be - it's a long time since my physics degree...)
Technically, the charge of a proton can be derived from its constituent 2 up quarks and 1 down quark, which have charges 2/3 and -1/3 respectively. I'm not aware of any deeper reason why these should be simple fractional ratios of the charge of the electron; however, I'm not sure there needs to be one. If you believe the stack of turtles ends somewhere, you have to accept there will eventually be (hopefully simple) coincidences between certain fundamental values, no?
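That bookkeeping is easy to check; taking the quark charges quoted above as given, the uud and udd combinations come out to exactly +1 and 0:

```python
from fractions import Fraction as F

up, down = F(2, 3), F(-1, 3)    # quark charges quoted above

proton = 2 * up + 1 * down      # uud
neutron = 1 * up + 2 * down     # udd
print(proton, neutron)          # 1 0
```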
There does appear to be a deeper reason, but it's really not well understood.
Consistent quantum field theories involving chiral fermions (such as the Standard Model) are relatively rare: the charges have to satisfy a set of polynomial relationships with the inspiring name "gauge anomaly cancellation conditions". If these conditions aren't satisfied, the mathematical model will fail pretty spectacularly. It won't be unitary, can't couple consistently to gravity, won't allow high and low energy behavior to decouple, and so on.
For the Standard Model, the anomaly cancellation conditions imply that the sum of electric charges within a generation must vanish, which they do:
3 colors of quark * (up charge 2/3 + down charge -1/3) + electron charge -1 + neutrino charge 0 = 0.
So, there's something quite special about the charge assignments in the Standard Model. They're nowhere near as arbitrary as they could be a priori.
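That per-generation sum can be verified with exact rational arithmetic (charge values as given above):

```python
from fractions import Fraction as F

colors = 3                       # quarks come in 3 colors
generation = [
    colors * F(2, 3),            # up quark
    colors * F(-1, 3),           # down quark
    F(-1),                       # electron
    F(0),                        # neutrino
]
print(sum(generation))           # 0
```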
Historically, this has been taken as a hint that the Standard Model should come from a simpler "grand unified" model. Particle accelerators and cosmology have turned up at best circumstantial evidence for these so far. To me, it's one of the great mysteries.
So they have to cancel, or we don't have a universe? ("Have to" not because we need electrical neutrality for large-scale matter - though we do need that - but because you can't build a quantum field that doesn't explode in various ways without it.)
There's always some risk of confusing the model with the reality, but yeah, if you have chiral fermions interacting through gauge fields and gravity, the charges have to satisfy all of the anomaly cancellation conditions (there's about half a dozen) or the model will be inconsistent.
I'm aware of the charge coming from quarks, but my point remains.
> you have to accept there will eventually be (hopefully simple) coincidences between certain fundamental values, no?
When the probability of coincidence is epsilon, then, no. Right now they are the same to 12 digits, but that undersells it, because that is just the trailing digits. There is nothing which says the leading digits must be the same, eg, one could be 10^30 times bigger than the other. Are you still going to just shrug and say "coincidence?"
That there are 26 fundamental constants and this one is just exactly the same is untenable.
I think I agree with you. It could be just a matter of static bias or some other fairly simple mechanism to explain why these numbers are the same.
Imagine an object made of only red marbles as the 'base state'. Now you somehow manage to remove one red marble: you're at -1. You add a red marble and you're at +1. It doesn't require any other marbles. Then you go and measure the charge of a marble and you end up at some 12-digit number. The one state will show negative that 12-digit number, the other will show positive that 12-digit number.
Assigning charge as being the property of a proton or an electron rather than one of their equivalent constituent components is probably a mistake.
If you imagine the universe is made of random real fundamental constants rather than random integer fundamental constants, then indeed there's no reason to expect such collisions. But if our universe starts from discrete foundations, then there may be no more satisfying explanation to this than there is to the question of, say, why the survival threshold and the reproduction threshold in Conway's Game of Life both involve the number 3. That's just how that universe is defined.
Why do you assume the two have to be small integers? There is nothing currently in physics which would disallow the electron to be -1 and the proton to be +1234567891011213141516171819. The fact they are both of magnitude 1 is a huge coincidence.
I'm not assuming they have to be small integers—I'm saying that if the universe is built on discrete rather than continuous foundations, then small integers and coincidences at the bottom-turtle theory-of-everything become much less surprising. You're treating the space of possible charge values as if it's the reals, or at least some enormous range, but I consider that unlikely.
Consider: in every known case where we have found a deeper layer of explanation for a "coincidence" in physics, the explanation involved some symmetry or conservation law that constrained the values to a small discrete set. The quark model took seemingly arbitrary coincidences and revealed them as consequences of a restrictive structure. auntienomen's point about anomaly cancellation is also exactly this kind of thing. The smallness of the set in question isn't forced, but it is plausible.
But I actually think we're agreeing more than you realize. You're saying "this can't be a coincidence, there must be a deeper reason." I'm saying the deeper reason might bottom out at "the consistent discrete structures are sparse and this is one of them," which is a real explanation, but it might not have the form of yet another dynamical layer underneath.
It's simple to say "Ah well, it's sparse," but that doesn't mean anything and doesn't explain anything.
Symmetries are equivalent to a conserved quantity. They exist because something else is invariant with respect to some transformation and vice versa. We didn't discover arbitrary constraints we found a conserved quantity & the implied symmetry.
"There are integers" and "the numbers should be small" are nothing like what normally does the work. They aren't symmetries. At most they're from some anthropic argument about collections of universes being more or less likely, which is its own rabbit hole that most people stay away from.
Perhaps only visible matter is made up of particles with these exactly matching charges? If they did not match, they would not stay in equilibrium, and would not be so easily found.
You seem to be contradicting yourself, having already said:
>I'm aware of the charge coming from quarks
So it's not +huge_number because the number of quarks involved is small. Sure we still don't understand the exact reason, but it's hardly as surprising that, uh, charge is quantized...
Well yes, but the coincidence that quarks have charges that are multiples of another particle's - one not made up of quarks - should raise your brow, shouldn't it?
Like, we could accept coincidences if it were turtles all the way down, but here we see a stack of turtles and a stack of crocodiles, and we are asking why they have similar characteristics even though they are so different.
Whence your confidence? As they say in math, "There aren't enough small numbers to meet the many demands made of them." If we assume the turtle stack ends, and it ends simply (i.e. with small numbers), some of those numbers may wind up looking alike. Even more so if you find anthropic arguments convincing, or if you consider sampling bias (which may be what you mean by, "in stable particles that like to hang out together").
Which makes every constant fair game. Currently, we don't have a good process for explaining multiple universes beyond divine preference. Hence we're left with the notion that a random number just happened to settle on exactly mirrored whole values.
This is "expected" from theory, because all particles seem to be just various aspects of the "same things" that obey a fairly simple algebra.
For example, pair production is:
photon + photon = electron + (-)electron
You can take that diagram, rotate it in spacetime, and you have the direct equivalent, which is electrons changing paths by exchanging a photon:
electron + photon = electron - photon
There are similar formulas for beta decay, which is:
proton = neutron + electron + (-)neutrino
You can also "rotate" this diagram, or any other Feynman diagram. This very, very strongly hints that the fundamental particles aren't actually fundamental in some sense.
The precise why of this algebra is the big question! People are chipping away at it, and there's been slow but steady progress.
One of the "best" approaches I've seen is "The Harari-Shupe preon model and nonrelativistic quantum phase space"[1] by Piotr Zenczykowski, which makes the claim that just like how Schrodinger "solved" the quantum wave equation in 3D space by using complex numbers, it's possible to solve a slightly extended version of the same equation in 6D phase space, yielding matrices that have properties that match the Harari-Shupe preon model. The preon model claims that fundamental particles are further subdivided into preons, the "charges" of which neatly add up to the observed zoo of particle charges, and a simple additive algebra over these charges matches Feynman diagrams. The preon model has issues with particle masses and binding energies, but Piotr's work neatly sidesteps that issue by claiming that the preons aren't "particles" as such, but just mathematical properties of these matrices.
I put "best" in quotes above because there isn't anything remotely like a widely accepted theory for this yet, just a few clever people throwing ideas at the wall to see what sticks.
> This is "expected" from theory, because all particles seem to be just various aspects of the "same things" that obey a fairly simple algebra.
But again, this is just observation, and it is consistent with the charges we measure (again, just observation). It doesn't explain why these rules must behave as they do.
> This very, very strongly hints that the fundamental particles aren't actually fundamental in some sense.
This is exactly what I am suggesting in my original comment: this "coincidence" is not a coincidence but falls out from some deeper, shared mechanism.
Sure, but that's fundamental to observing the universe from the inside. We can't ever be sure of anything other than our observations because we can't step outside our universe to look at its source code.
> It doesn't explain why these rules must behave as they do.
Not yet! Once we have a theory of everything (TOE), or just a better model of fundamental particles, we may have a satisfactory explanation.
For example, if the theory ends up being something vaguely like Wolfram's "Ruliad", then we may be able to point at some aspect of very trivial mathematical rules and say: "the electron and proton charges pop out of that naturally, it's the only way it can be, nothing else makes sense".
We can of course never be totally certain, but that type of answer may be both good enough and the best we can do.
As soon as charge is quantized, this will happen. In any quantization scheme you will have some smallest charge. There are particles with charge +2 (the Delta++, for example), but... anything that can decay while preserving quantum numbers will decay, so you end up with protons in the end. (OK, the quarks have fractional charge, but that's not really relevant at the scales where we care about QED.)
If the question is, why is quantum mechanics the correct theory? Well, I guess that's how our universe works...
One argument (while unsatisfying) is there are trillions of possible configurations, but ours is the one that happened to work which is why we're here to observe it. Changing any of them even a little bit would result in an empty universe.
There’s a name for that: the Anthropic principle. And it is deeply unsatisfying as an explanation.
And does it even apply here? If the charge on the electron differed from the charge on the proton at just the 12th decimal place, would that actually prevent complex life from forming? Citation needed for that one.
I agree with OP. The unexplained symmetry points to a deeper level.
I was born into this world at a certain point in time. I look around, and I see an environment compatible with me: air, water, food, gravity, time, space. How deep does this go? Why am I not an ant or a bacterium?
That's some interesting/wacky stuff, but there has been more research to improve those calculations - like deriving the electron charge and magnetic moment.
Personally I like the idea that a proton is somehow literally an electron and 3 up quarks (a neutron gets 2 electrons and 3 up quarks). I am not a physicist though, so I'm sure there are reasons they "know" this is not the case.
I find it fascinating that some physicists say wave functions are somehow "real" and then we've got Jacob Barandes saying you don't even need wave functions to do the computations of QM:
https://www.youtube.com/watch?v=7oWip00iXbo
IMHO there is a lot of exploration to be done without particle accelerators.
How do you explain that electrons have a rest mass, but photons don't (otherwise photons couldn't move with the speed of light according to special relativity)?
Because what we see as a photon is the one boson left without a partner among the four electroweak bosons that exist prior to electroweak symmetry breaking. That's how the other three get their mass, while the photon stays massless.
An interesting early theory of gravity was: "What if opposite charges attract slightly more strongly than identical charges repel each other?"
If you tally up the forces, the difference is a residual attraction that can model gravity. It was rejected on various experimental and theoretical grounds, but it goes to show that if things don't cancel out exactly then the result can still leave a universe that would appear normal to us.
Agreed (well, assuming the delta is more than a small fraction of a percent or whatever). But this is begging the question. If they are really independent then the vast, overwhelming fraction of all possible universes simply wouldn't have matter. Ours does have matter, so it makes our universe exceedingly unlikely. I find it far more parsimonious to assume they are connected by an undiscovered (and perhaps never to be discovered) mechanism.
Some lean on the multiverse and the anthropic principle to explain it, but that is far less parsimonious.
Also note that the proton is not an elementary particle so it is really a question of "are the various quarks really 1/3, 2/3 of an electron charge".
Crackpots have found thousands of formulas that try to explain the ratio of the proton mass to the electron mass, but there is no expectation of a simple relationship between those masses, since the proton mass is the sum of all sorts of terms.
Crackpots are downstream of the "physics community" awarding cultural cachet to certain types of questions (those with affordances they don't necessarily "deserve") but not others.
(I use quotes because those are emergent concepts)
Same as "hacker community" deciding that AI is worth FOMO'ing about
Well, I'm not sure I believe that "hierarchy problems" in HEP are real, but I do think the nature of the neutrino mass is interesting (we know it has a mass so it is a something and not a nothing) as is the nature of dark matter, the matter-antimatter asymmetry, and the non-observation of proton decay. That article has nothing to say about non-accelerator "big science" in HEP such as
As for the "hacker community" I think AI is really controversial. I think other people find the endless spam of slop articles about AI more offensive than I do. It's obvious that these are struggling to make it off the "new/" page. The ones that offend me are the wanna-be celebrity software managers [1] who think we care what they think about delivering software that almost works.
[1] sorry, I liked DHH's industry-changing vision behind Ruby on Rails, but his pronouncements about software management were always trash. You might make the case that Graham worked with a lot of startups so his essays might have had some transferable experience, but they didn't. Atwood and Spolsky, likewise. Carmack is the one exception; he's a genius
For a given calculation on given hardware, the 100th digit of a floating point decimal can be replicated every time. But that digit is basically just noise, and has no influence on the 1st digit.
In other words: There can be multiple "layers" of linked states, but that doesn't necessarily mean the lower layers "create" the higher layers, or vice versa.
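A quick illustration of both halves of that claim (variable names are mine): repeating the identical computation reproduces every bit, while merely reordering the terms perturbs only the trailing digits:

```python
import math
import random

random.seed(0)
xs = [random.random() for _ in range(10_000)]

# Identical calculation, identical order: bit-for-bit reproducible.
a = sum(xs)
b = sum(xs)
print(a == b)                    # True

# Reorder the terms: rounding noise lands in different low bits,
# but the leading digits are untouched.
c = sum(sorted(xs))
print(math.isclose(a, c, rel_tol=1e-9))  # True: only low digits can differ
```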
Or why the quarks that make up protons and neutrons have fractional charges, with +1 protons mixing two +2/3 up quarks and one -1/3 down quark, while the neutral neutron is one up quark and two down quarks. And where are all the other quarks in all of this, busy tending bar?
They have fractional charges because that is how we happen to measure charge. If our unit of charge had been set when we knew about quarks, we would have chosen those as fundamental, and the charge of the electron would instead be -3.
Now, the ratios between these charges appear to be fundamental. But the presence of fractions is arbitrary.
> If our unit of charge had been set when we knew about quarks, we would have chosen those as fundamental, and the charge of the electron would instead be -3.
Actually, I doubt it. Because of their color charge, quarks can never be found in an unbound state but instead in various kinds of hadrons. The ways that quarks combine cause all hadrons to end up with an integer charge, with the ⅔ and -⅓ charges on various quarks merely being ways to make them come out to resulting integer charges.
Isn’t charge quantized? Observable isolated charges are quantized in units of e. You can call it -3 and +3 but that just changes the relative value for the quanta. The interesting question is still why the positive and neutral particles are nonelementary particles made up of quarks with a fraction of e, the math made possible only by including negatively charged ones (and yet electrons are elementary particles).
Well OK then! Let's tell all the physicists they can close up shop now. They might not have realized it, but they're done. All their little "theories" and "experiments" and what not have taken them as far as they can go.
> Let's tell all the physicists they can close up shop now.
Yes, that's part of the plan. I mean, not to all the physicists, just to those whose work doesn't bring in results anymore, and it hasn't for 30 to 40 years now. At some point they (said physicists) have to stop their work and ask themselves what it is that they're doing, because judging by their results it doesn't seem like they're doing much, while consuming a lot of resources (which could have been better spent elsewhere).
We're already in the realm of virtual particles, instantaneous collapse, fields with abstract geometric shape and no material reality, wave particle duality, quantized energy etc. The project of physics was to discover what the universe was made of. None of these things can answer that. If intelligibility was the goal, we lost that. So in an important sense, they might as well have closed up shop. If you're interested in the specific value of a certain property to the nth decimal place, there is work to do, but if you're interested in the workings of the universe in a fundamentally intelligible sense, that project is over with. What they're doing now is making doodles around mathematical abstractions that fit the data and presenting those as discoveries.
By observing the discrepancies between theories we are accessing those layers. Whether we can access them with instruments is a different matter but with our minds we apparently can.
Around 2001 I was working at Broadcom's networking division in San Jose. The switch chip we were working on (10Gbps x 8 ports) was understaffed and we hired a contractor at $120/hr to do verification of the design. He was pretty young, but he came across as confident and capable, and every weekly meeting he was reporting good progress.
Unfortunately we weren't reviewing his work, just trusting his reports, as we were overworked getting our own parts done. After three months of this I said to the project lead: something smells wrong, because he hasn't filed a single bug against my design yet.
So we looked at his code, lots of files and lots of code written, all of it plumbing and test case generation, but he hadn't built the model of the chip's behavior. At the heart of it was a function which was something like:
bool verify_pins(...) {
    return true;
}
We asked him what was going on, and he said he was in over his head and had been putting off the hard part. Every morning he lied to himself that that was the day he was finally going to start building the model for the DUT. His shame seemed genuine. My boss said: we aren't paying you for the last pay period, just go away and we won't sue you.
My boss and I literally slept at work for a month, my boss building the model and fixing other TB bugs while I addressed the RTL bugs in the DUT as he found them.
Bible scholar Dan McClellan is on YouTube and does short videos rebutting popular YouTube/TikTok videos that make claims that aren't historical. Dan has said that the four names were not assigned to the texts until the second half of the 2nd century, probably around 180 CE or so. That leaves 80-100 years where the books were in circulation before the naming convention was established.
The subject of authorship comes up frequently so he has addressed it a few times, but here is a short (under 7 minute) video. It isn't just an assertion, he gives reasons why he makes these claims:
There’s a big difference between the gospels not being cited by name directly, and not having a name. For example, the Gospels often cite Isaiah without using his name - just lifting direct quotes.
And there are allusions to apostolic naming in things like Justin Martyr's First Apology, Ch. 67 (155 CE, dating largely from its being co-addressed to Marcus Aurelius):
“the memoirs of the apostles or the writings of the prophets are read, as long as time permits”
"Data over dogma." Dr. Dan McClellan is an engaging source for historically accurate interpretations and understandings of the Bible. I encourage others to check out his content.
I wonder how he would feel about a competitor putting a flock-like camera outside his house so that anyone who wants to can learn whenever any car, perhaps his car, enters or leaves his home driveway.
Would he be happy with this, or would he become a "terrorist" by objecting?
This sounds like how curses did things - a 1980s technology.
On the other hand, if the guy in the video ran his app over a remote connection with limited bandwidth, diffing would probably perform better. I have a one Gbps google fiber connection to my job but at times my vpn bandwidth can choke down to a couple hundred kbps and sometimes worse.
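The curses idea in miniature (function and variable names are made up for illustration): keep the previously painted frame and ship only the rows that changed, which is exactly what makes it win on a thin pipe:

```python
def diff_frame(old, new):
    """Return (row, text) pairs for rows that differ between frames."""
    return [(row, n) for row, (o, n) in enumerate(zip(old, new)) if o != n]

old = ["status: idle ", "cpu: 12%", "mem: 40%"]
new = ["status: busy ", "cpu: 12%", "mem: 41%"]
print(diff_frame(old, new))      # [(0, 'status: busy '), (2, 'mem: 41%')]
```

Over a choked VPN, sending two short row updates instead of a full-screen redraw is the whole ballgame.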