Once that 10x developer velocity from AI kicks in, I'm sure GitHub stability improves. Did you know AI finally makes it economical to fix all the little bugs?
I suppose they're unsanitary in the same way all animals that aren't humans are: they don't wash their hands? Cats don't strike me as particularly dirty creatures. They're not exactly clean and well groomed by nature, but no animal really is.
No. If I sent you $100 you'd probably send that $100 back, since you didn't expect it and have no reason to accept money from me. If I now go to a third party and don't tell them about how I sent you the money, I have a legitimate transfer receipt for you sending me $100.
It's a fraud both on the third party, to whom I have provided incomplete information, and on you, who have become an unwitting accomplice in my scam, at least from the point of view of the third party.
That's what they meant by "round trip" transactions? Literally sending them a check and waiting for them to return it? And no other business relationship? And then lying about it using the received check?
That must be one of the least helpful news sentences I've read in a while. I thought it just meant they were seller-financing all their customers by loaning them the money, and had gotten zero real revenue.
Well in this case it looks like it was just regular double-sided fraud. They opened bank accounts in fake names and bought their own product to boost revenue. Much less interesting.
> an associate of Chidambaran, who previously worked as an iLearning vice president, incorporated and opened bank accounts in the names of several purported iLearning customers. Over the course of several years, the defendants transmitted millions of dollars from iLearning to an account controlled by this individual. This individual then sent those funds to other accounts he controlled in the names of other entities, before ultimately sending the money back to iLearning. The aggregate value of these round-trip transactions exceeded $144 million.
Eh, I remember the myriad of both "Torture Bin Laden" as well as "Torture George Bush" flash games in the early/mid 2000s. I think it's very on brand for indie developers.
When I attended university (almost a decade ago I guess, time flies) we didn't have a single exam on the computer. All exams were on paper or oral, most were without notes too. Computer science does not require computers.
> NASA spending has created a huge pile of technologies that we use day to day
We're a little too early to know if that's the case here too. I do foresee a chance of a reality where AI is a dead end, but after which we have a ton of cheap GPU compute lying around, which we all rush to somehow convert into useful compute (by emulating CPUs or translating traditional algorithms into GPU-oriented ones or whatever).
If all AI progress somehow immediately halted, the models that have currently been built will still have more economic impact than the Internet.
Not least because the slower the frontier advances, the cheaper ASICs get on a relative basis, and therefore the cheaper tokens at the frontier get.
We have a massive scaffolding capability overhang, give it ten years to diffuse and most industries will be radically different.
Again, all of this is obvious if you spend 1k hours with the current crop, this isn’t making any capability gain forecasts.
Just for a dumb example, there is a great ChatGPT agent for Instacart: you can share a photo of your handwritten shopping list and it will add everything to your cart. Following through on the obvious product conclusions of this capability (for every grocery vendor's app, integrating with your fridge, learning your personal preferences for brands, recipe recommendation systems, logistics integrations with your forecasted/scheduled demand, etc.) is, I contend, going to be equivalent in engineering effort and impact to the move from brick and mortar to online stores.
I feel a lot of people in tech have this incuriously deterministic attitude about LLMs right now… previous <expensive capital project> revolutionized the world, therefore LLMs will! Despite there really being nothing to show for it so far other than writing rote code being a bit easier, and still requiring active babysitting by someone who knows what they are doing.
They’re already far more useful than that, and I suspect harness engineering alone could add another OOM of productivity, without any underlying change in the models available today.
You have to agree that it's totally possible that none of those things you are envisioning getting built out actually end up working as products, right?
AI (LLM) progress would stop, and then everything people try to do with those last and most capable models would end up uninteresting or at least temporary. That's the world I'm calling a "dead end".
No matter how unlikely you think that is, you have to agree that it's at least possible, right?
> then everything people try to do with those last and most capable models would end up uninteresting
I believe that some of my made up examples won’t end up getting built, but my point is that there is _so much_ low hanging fruit like this.
Of course, anything is _possible_, but let’s talk likelihood.
In my forecast the possible worlds where progress stops and then the existing models don’t end up making anything interesting are almost exclusively scenarios like “Taiwan was invaded, TSMC fabs were destroyed, and somehow we deleted existing datacenters’ installed capacity too” or “neo-Luddites take over globally and ban GPUs”, all of this gives sub-1% likelihood.
You can imagine 5-10% likelihood worlds where the growth rate of new chips dramatically decreases for a decade due to a single black-swan event like Taiwan getting glassed, but that’s a temporary setback not a permanent blocker.
Again, I’m just looking at all the things that can obviously be built now, and just haven’t made it to the top of the list yet. I’m extremely confident that this todo list is already long enough that “this all fizzles to nothing” is basically excluded.
I think if model progress stops then everyone investing in ASI takes a big haircut, but the long-term stock market progression will look a lot like the internet after the dot com boom, ie the bloodbath ends up looking like a small blip in the rear view mirror.
I guess, a question for you - how do you think about coding agents? Don’t they already show AI is going to do more than “end up uninteresting”?
Coding agents are interesting, but in my opinion also many worlds away from what they're being sold as. They can be helpful and a moderate efficiency gain, if you know where to use them and you're careful to not fall into one of their many traps where they end up being a massive cost and efficiency loss down the line. They're helpful tools, but they're slow, expensive, and unreliable -- in order of decreasing likelihood that that's going to change in a big way.
I find it interesting that you chose the shopping list and fridge examples, because my view on the whole LLM hype is that 99% of it is a solution looking for a problem, and shopping and the fridge are historically such a commonly advertised area for technologies desperately looking for an actual use case. I don't think fridge content management and shopping plans are actual pain points in most people's lives. It's not something people would see a benefit in no longer having to do manually. And it's an area with a very low tolerance for systemic unreliability. The guy needed eggs to bake his cake, but the AI got him Eggos instead -- et voilà, another person who thinks this whole "smart" technology is shit and won't deal with it anymore.
And so it goes with most AI use cases I've seen so far. In my view the only thing they're good at is fuzzy search. Coding agents are helpful, but in the end, their secret sauce is just that: fuzzy search.
Can fuzzy search be helpful? Yes, even very helpful! "Bigger than the Internet" helpful? I think not.
> Of course, anything is _possible_, but let’s talk likelihood.
The problem with talking likelihood is that it's an interpretation game. I understand you think it's wholly unlikely that it all fizzles out, I could read that from your first post. I hope it's also clear that I do think it's likely.
That's the point where we have to just agree to disagree. We have no rapport. I have no reason to trust your judgment, and neither do you mine.
However I do feel a lot of this comes down to facts about the world now, eg whether Claude Opus is doing anything interesting, which are in principle places where you could provide some evidence or ideas, along the lines of the detail that I gave you.
My read so far is you are just saying “maybe it fizzles out” which is not going to persuade anyone who disagrees. Sure, “maybe”, especially if you don’t put probabilities on anything; that statement is not falsifiable.
> The problem with talking likelihood is that it's an interpretation game
I am open to updating my model in response to a causal argument, if you care to give more detail. I view likelihoods as the only way to make these sorts of conversations concrete enough that anyone could hope to update each other’s model.
Even if chatbot LLMs stop at their current capability, there's a whole ecosystem of scientific language models (in drug discovery, chemistry, materials design, etc.) and engineering language models (software, chip design, etc.) that are very valuable in their fields.
And even if chatbot LLMs seem to be a dead end, they and other machine learning algorithms will be happy to use the data centers to create/discover a lot of stuff.
AI progress may fizzle out, but everything it produced so far would still be there. Models are just big bags of floats -- once trained, they're around forever (well, at least until someone deletes them), and the same is true of the harnesses they run in (they're just programs).
But AI proliferation is not stopping soon, because we've not picked up even the low hanging fruits just yet. Again, even if no new SOTA models were to be trained after today, there's years if not decades of R&D work into how to best use the ones we have - how to harness the big ones, where to embed the small ones, and of course, more fundamental exploration of the latent spaces and how they formed, to inform information sciences, cognitive sciences, and perhaps even philosophy.
And if that runs out or there is an Anti AI Revolution, we can still run those weather models and route planners on the chips once occupied by LLMs - just don't tell the proles that those too are AI, or it's guillotine o'clock again.
> there's years if not decades of R&D work into how to best use the ones we have - how to harness the big ones, where to embed the small ones, and of course, more fundamental exploration of the latent spaces and how they formed, to inform information sciences, cognitive sciences, and perhaps even philosophy.
I think my sense of "dead end" would entail none of those directions panning out into anything interesting. You would "explore the latent spaces" only to find nothing of value. Embedding the LLM models wouldn't end up doing anything useful for whatever reason, and philosophy would continue on without any change.
What will happen is that new buzzwords will be invented, and a new fad will take its place. And we will be stuck with the short end of the stick again. You can hope, but shit doesn't really get cheaper for us common folk, ever. :/
I think there is little chance it's a "dead end"; it's here to stay. But LLMs seem to have hit the diminishing-returns curve already, despite what investors might think, and so far none of the big providers actually makes money on all that investment.
I think for many, if LLMs and AI only improve marginally in the next 5-10 years, it is effectively a dead end. The capital expenditure necessitates that AI do something exponentially more valuable than what it does now.
I think we are saying the same thing. I just think the pullback on AI will be dramatic unless something amazing happens very soon.
I just don’t see it. Both professionally and personally I’m producing so much more now. Back burner projects that weren’t worth months of my time are easily worth a few hours and $20 or whatever.
You’re probably already experienced at your job and using AI to enhance that, or at least using that experience to keep the AI results clean. That’s something you or a company would want to pay for but it has to be a lot more than today’s prices to make it profitable. Companies want to get more out of you, or get a better price/performance ratio (an AI that delivers cheaper than the equivalent human).
But current-gen AIs are like eternal juniors, never quite ready to operate independently, never learning to become the expert that you are; they are practically frozen in time at the capabilities gained during training. Yet these LLMs have replaced the first few rungs of the ladder, so human juniors have a canyon to jump if they want the same progression you had. I'm seeing inexperienced people just using AI like a magic 8 ball. "The AI said whatever". [0] LLMs are smart and cheap enough to undercut human juniors, especially in the hands of a senior. But they're too dumb to ever become a senior. Where's the big money in that? What company wants to pay for an "eternal juniors" workforce, when whatever they save on payroll goes to procuring the external seniors they're no longer producing internally?
So I’m not too sure a generation of people who have to compete against the LLMs from day 1 will really be producing “so much more” of value later on. Maybe a select few will. Without a big jump in model quality we might see “always junior” LLMs without seniors to enhance. This is not sustainable.
And you enhancing your carpentry skills for your free time isn’t what pays for the datacenters and some CEO’s fat paycheck.
[0] I hire trainees/interns every year, and pore through hundreds of CVs and interviews for this. The quality of a significant portion of them has gone way down in the past years, coinciding with LLMs gaining popularity.
This is thoroughly debunked at this point. The frontier labs are profitable on the tokens they serve. They are negative when you bake in the training costs for the next generation.
So what. Fluctuations over a year or two are meaningless. Do you really believe that the constant-dollar price of an LLM token will be higher in 20 years?
I can see a world where energy costs rise at a rate faster than overall inflation, or are a leading indicator. In that scenario then yes I could see LLM token costs going up.
Lol are people like you going to be enough to support the large revenues? Nope.
A firm that sees rising operating expenses but not enough increase in revenue will start to cut back on LLM spending and become very frugal (e.g. rationing).
> Wasn't their previous attempt at running vending machines unprofitable?
If we are talking about the one at that newspaper, it wasn't just unprofitable. The "customers" made it give away products for free. It was ordering them PlayStations.
As entertainment it was fun, but as a business or proof of intelligence or Turing test, it was an abject failure.
That's a terrible argument on the face of it. "They can't make any energy, but also they make so much energy they can't use it all".
I actually live in Denmark, and we can produce solar energy just fine. My dad installed rooftop solar 10 years ago, and that thing has covered 90% of his electricity usage since then. It's still producing at around 85% capacity too.
Umm, so we still have to build enough traditional (and, ideally, dispatchable) generation capacity to make sure we can cover our electricity needs during those periods in winter where it's very cloudy and it's not windy?
"Cloudy and still weather has caused Great Britain’s renewable energy output to fall to near zero this week"
"Britain’s wind power output fell to just above zero on Wednesday, which, combined with the cold, dark weather, caused the market price for electricity to climb to almost £250 per megawatt-hour at auction, or almost seven times the average price before the pandemic"
"The sudden drop-off in renewable energy due to dull windless winter weather, known as dunkelflaute in German, has also forced the system operator to pay gas power stations more than £500/MWh to run on Wednesday evening when household demand is expected to reach its peak.
The weather conditions – the third dunkelflaute of the winter so far – left Britain’s electricity grid reliant on gas-fired power stations. They accounted for more than 70% of power generation at points on Wednesday."
In New England it works fine and we plan around 3 hours of production per day during the winter months. Not sure what Denmark's latitude is, but 7 hours of production is not needed.
We have solar in Finland as well, like everyone else. Yes, it's useless in winter. Yes, the expansion has slowed down, because there is no storage and limited export options.
The Nordic power market is a mess, and it's not because solar doesn't work in winter but because the grid needs massive investments on all levels and nobody wants to be left holding the bill for it.
Electrification? Sure, I'll buy an EV when the _local_ grid operator makes sure my lights don't flicker when the neighbor uses an angle grinder. The last update was that they plan to replace the old transformer station from the 60's "when it breaks".
Local generation? Can't get rid of the excess generation if I wanted to.
Is Denmark's power grid expansion still geared at selling Swedish electricity to the Germans?
Sweden? No internal transfer capacity so their consumers have constant high prices while power is exported cheaply.
Indeed, and large parts of the reason has nothing to do with geography. The same applies to Denmark and the rest of the Nordics.
Obviously solar will be decreasingly useful as you get further to the pole, but the Nordics aren't worse off than Alaska or Canada in that regard, and both do solar to some extent AFAIK.
It has a lot to do with geographic latitude and weather patterns. The amount of electric output per amount of solar installed strongly affects the profitability of a solar installation (if you don't count government subsidies).
And summer isn't when you need the power anyway, so it's very inefficient, since northern winters have barely any sunlight at all; solar output is close to 0 then. In warmer countries you want power in the summer for AC during the day, so there it matches usage, but in northern countries solar isn't very useful at all.
> but the Nordics aren't worse off than Alaska or Canada in that regard
The Nordics are much further north than most of Canada: most Canadians live further south than Paris, Paris is a lot further south than even Denmark, and Denmark is much further south than Finland.
Are you saying it's cloudy for four months straight?
And the panels are still making power during the winter.
A detailed chart would be nice but a good starting point to imagine is 60-70 days that average 50% solar power and the rest of the year is full solar power minus a couple particularly bad days.
Edit: In winter, in denmark, the amount of sunlight you get per square meter of flat ground is absolutely awful. But the amount of sunlight you can catch on a highly tilted solar panel is still pretty good, about half of the average output. So if you space them properly and overbuild based on the average, the 90% number isn't crazy.
> Are you saying it's cloudy for four months straight?
No. But it's cloudy most of the time for four months straight (on average there are only 200 hours of sun between November and February in Copenhagen. Yes, you read that right: not even 2 hours per day on average!).
> A detailed chart would be nice but a good starting point to imagine is 60-70 days that average 50% solar power and the rest of the year is full solar power minus a couple particularly bad days
That's an insane assumption! An average of 50% solar power during the day is the upper bound of what you can expect in the middle of the Nevada desert! (Because, you know, the sun rises and falls during the day; it's never going to give full power for all of daytime.) And because it's night half of the time on average, even in Nevada you end up with load factors around 25%! (Go check the figures!)
In winter in Denmark, the situation is obviously far worse! A 2-5% load factor is to be expected depending on the weather. (Just check the live data: we're in April, it's 11am and solar panels are delivering just 10% of their power right now https://app.electricitymaps.com/map/zone/DK-DK1/live/fifteen... )
> the 90% number isn't crazy.
If this number doesn't sound crazy to you, it's just because you're completely off in terms of the orders of magnitude involved. 90% is likely achievable in the southern US with great effort, lots of storage for night times and significant over-paneling, but it's pure science fiction in Denmark.
No no no no, that line has nothing to do with load factor. I'm talking about half the kilowatts for the house coming from solar, and half coming from the grid.
> Just check the live data
There's no way those panels are optimally angled and out of shade if they're making that little. Are those panels installed in rows on the ground? Rows that are pretty close to each other? Panels on a roof, the steeper the better, will see a much higher load factor in winter.
In other words, a home rooftop install will do much better in winter than a standard commercial install. That's a mixture of chance and optimizing for different things.
A thought experiment: You have one big solar panel mounted very high, with a multi-axis aiming system that points it directly at the sun. Do you think the amount of power you can make is going to be that far off a linear relationship with the number of hours of daylight?
> No no no no, that line has nothing to do with load factor. I'm talking about half the kilowatts for the house coming from solar, and half coming from the grid.
Assuming consumption isn't correlated with sun hours, these are equivalent unless you over-panel. With a load factor of 5%, you need to over-panel 10x to achieve 50% of your energy supply (in fact it's more complex than this and you'd need even more than that, but it's an OK simplifying assumption).
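To make that arithmetic concrete, here's a minimal sketch (the function name and numbers are mine, purely illustrative):

```python
def nameplate_needed(avg_demand_kw, target_share, load_factor):
    """Nameplate solar capacity (kW) needed so that
    capacity * load_factor covers target_share of average demand."""
    return avg_demand_kw * target_share / load_factor

# 1 kW of average household demand, 50% solar share, 5% winter load factor:
print(nameplate_needed(1.0, 0.50, 0.05))  # 10.0 kW, i.e. the 10x over-paneling
```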
> There's no way those panels are optimally angled and out of shade if they're making that little
Those are commercial solar farms, optimally angled under the constraint that the cost must be reasonable.
> A thought experiment: You have one big solar panel mounted very high, with a multi-axis aiming system that points it directly at the sun.
Do you have an idea of how much it would cost?! With Materials + installation + maintenance, such a mechanism would dwarf the price of the panels. There's a reason we don't deploy those at scale in practice …
> Do you think the amount of power you can make is going to be that far off a linear relationship with the number of hours of daylight?
In a country where 80% of the winter is cloudy, it's going to be very far, yes. The 10% power happening right now is because it's cloudy (light clouds, no rain, but still). It peaked at 40% in recent days with proper sun, but it happened only a handful of times in the entire winter.
I don't think so? If your Nevada desert load factor is 25%, then we're talking about it dropping to 12% or something. Unless I'm not understanding the way you're using those numbers.
> unless you over-panel
Some amount of over-paneling would be perfectly fine here. Not 10x, agreed.
> Those are commercial solar farms, optimally angled under the constraint that the cost must be reasonable.
They're optimized mostly for total power output, which affects things. And they don't have a free house to be mounted on.
They're also not trying very hard to avoid shade. The commercial plant has to buy land for every panel, while a house has much more land than panels. That's a massive difference. When the sun is near the horizon, you want your rows of panels to be very far apart or at different heights. Which means:
A commercial solar plant like one pictured in the article will have each panel shade most of the next row's panel when the sun is very low. To stop this effect, you need to put the rows super far apart, or put them at different heights (like on a roof). This means a home install could have 4x as much light hit each panel in the depths of winter.
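The row-shading geometry is easy to check with a rough 2-D sketch (my own simplification: sun due south, flat ground, identical rows):

```python
import math

def min_sun_elevation_no_shade(panel_len_m, tilt_deg, row_pitch_m):
    """Lowest solar elevation (degrees) at which the row in front
    stops shading the row behind it. 2-D geometry: the shadow cast
    by the raised edge must fit into the gap between rows."""
    tilt = math.radians(tilt_deg)
    top_height = panel_len_m * math.sin(tilt)   # height of the raised edge
    footprint = panel_len_m * math.cos(tilt)    # horizontal extent of a row
    gap = row_pitch_m - footprint               # clear ground between rows
    if gap <= 0:
        return 90.0  # rows touch: shaded at any elevation
    return math.degrees(math.atan2(top_height, gap))

# 2 m panels at 30 degrees tilt, rows 4 m apart:
print(min_sun_elevation_no_shade(2.0, 30.0, 4.0))  # ~23.8 degrees
```

With a midwinter sun that peaks around 10 degrees, rows spaced like that shade each other all day, while a single roof plane never shades itself.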
> Do you have an idea of how much it would cost?!
It's a thought experiment. Don't worry about the cost of tracking. Because it turns out, a 60 degree angle that completely avoids shade is just as good. The key is avoiding shade. Commercial plants do not avoid shade. Rooftop installs do avoid shade (they won't be quite as tilted, but they'll still have a huge advantage). If you have a nice big yard you can also avoid shade.
> The 10% load factor happening right now is because it's cloudy (light clouds, no rain, but still). It peaked at 40% in recent days with proper sun, but it happened only a handful of times in the entire winter.
I think you didn't go through the full implications of this.
It's mid-April. If it's cloudy this far from the depths of winter, that means needing more panels is much more of a year-round thing. Which means a household array needs to be bigger as a baseline. Which means it can tolerate more losses in the winter.
The thing that would make 90% unreasonable is the difference between winter and non-winter power output. If spring and/or fall also require lots of panels, then 90% gets more realistic because expanding the system saves money for more months of the year.
> And they don't have a free house to be mounted on.
Rooftop solar is more expensive than solar farms. There's nothing free in putting a solar panel on a roof. (Which is a pity because it means that if your country doesn't have a desert, the economically optimal way of installing solar panels is deforestation, but that's the world we live in…).
> Because it turns out, a 60 degree angle that completely avoids shade is just as good
Not at all…
The sun isn't just going up and down, you know, it also arcs from east to west…
> They're also not trying very hard to avoid shade. […] When the sun is near the horizon, you want your rows of panels to be very far apart or at different heights.
> A commercial solar plant like one pictured in the article will have each panel shade most of the next row's panel when the sun is very low.
I'm sorry but this is utter bullshit. The commercial plants do avoid shade as much as possible because shade destroys efficiency (one cell being shaded cripples the output of the entire row…).
They don't care about shade when the sun is low because when the sun is low the incidence angle is terrible in the first place. You want your average panel directed south (or north in the southern hemisphere), when the sun is low, it's going to be completely in the East or completely in the West, and you care about the cosine of your incidence angle, which means the output is going to be near zero even without any shade whatsoever.
> It's mid-april. If it's cloudy this far from the depths of winter, that means needing more panels is much more of a year-round thing.
Of course clouds are a year-round thing, what do you think… But sunny days are still much more frequent in summer.
> Which means a household array needs to be bigger as a baseline
Yes, but that's over-paneling…
> The thing that would make 90% unreasonable is the difference between winter and non-winter power output. If spring and/or fall also require lots of panels, then 90% gets more realistic because expanding the system saves money for more months of the year.
Sigh… Over-paneling 10x isn't going to be more worth it just because in spring and fall you need 5x. That's a nonsensical argument…
I'm sorry but you obviously have no idea about any of these things. I can only invite you to inform yourself better at this point, because you're just piling up crazy takes on top of crazy takes here.
> The sun isn't just going up and down you know it also circles from east to West…
Over a narrow range in winter. You get good coverage from pointing very south and avoiding shade.
> I'm sorry but this is utter bullshit. The commercial plants do avoid shade as much as possible because shade destroy efficiency
They do not avoid it "as much as possible". The panels are shading each other in that very photo, and that photo wasn't taken at the crack of dawn.
It's basic trigonometry. Narrow spacing needs the sun to get pretty high before shading stops. A roof install never shades itself. The difference matters.
> They don't care about shade when the sun is low because when the sun is low the incidence angle is terrible in the first place.
Wrong answer. Those panels are plenty tilted for low incidence sunlight. The ones in front will make plenty of power in the winter. But the ones behind them won't.
The limiter is the price of land. If land was free I guarantee they would spread them out more.
And a home install doesn't have this specific issue.
> Yes, but that's over-paneling…
No it's not! If you need it for most of the year it's not "over"!
> Sigh… Over-paneling 10x isn't going to be more worth it just because in spring and winter you need 5x. That's a nonsensical argument…
If you need 5x or more for half the year, you calculated "x" wrong. Your math is what's nonsense here.
> They do not avoid it "as much as possible". The panels are shading each other in that very photo
You haven't linked the photo…
> It's basic trigonometry. Narrow spacing needs the sun to get pretty high before shading stops.
Of course it's “basic trigonometry”… It doesn't matter if the panels are shaded when the incidence angle is high anyway!
> The limiter is the price of land. If land was free I guarantee they would spread them out more.
They wouldn't; they'd just put more panels on a bigger surface. And again, industrial actors are maximizing the economic output they can make. Whatever decision you take at your level, it's going to be more expensive than what they are doing, even if more efficient.
> No it's not! If you need it for most of the year it's not "over"
Yes it is… By definition you are over-paneling if your peak production is higher than what you use. This threshold is important because the cost calculations only work when you haven't passed it yet!
> If you need 5x or more for half the year, you calculated "x" wrong. Your math is what's nonsense here.
X is the value for which the cost/MWh makes sense. The further you get from there, the bigger the fraction of power that goes unexploited and the higher the cost per unit of useful electricity rises.
I didn't invent these concepts or these calculations, those are standards when talking about solar.
They don't build them in deserts that far north, do they?
I got this "fixation" by doing the math to figure out why panels do so badly when there's still seven and a half hours of daylight.
The insolation per square meter of ground is very low when the sun is near the horizon. But the insolation of a flat surface at 60 degrees of tilt is still pretty good. If you avoid shade.
Please tell me you have no disagreements with that. It's basic math.
So as you said, with basic panels "one cell being shaded cripples the output of the entire row". Normal commercial installs don't try to capture the morning sun. But in the middle of winter in Denmark the "morning" sun is basically all you have access to.
You said "They don't care about shade when the sun is low because when the sun is low the incidence angle is terrible in the first place."
If you tilt really far and avoid shade, you counteract the bad incidence angle. A single square meter of panel can absorb the light that would have hit 6 square meters of ground.
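That gain from tilting checks out under idealized assumptions (direct beam only, panel azimuth aligned with the sun; my own back-of-envelope sketch, not a real irradiance model):

```python
import math

def beam_ratio(sun_elev_deg, tilt_deg):
    """Direct-beam power caught by 1 m^2 of panel tilted toward the
    sun, divided by what 1 m^2 of flat ground receives."""
    # The panel normal sits (90 - tilt) degrees above the horizon, so the
    # incidence angle is the gap between that and the sun's elevation.
    incidence = math.radians(90 - tilt_deg - sun_elev_deg)
    panel = math.cos(incidence)
    ground = math.sin(math.radians(sun_elev_deg))
    return panel / ground

# Midwinter sun at 10 degrees, panel tilted 60 degrees:
print(beam_ratio(10.0, 60.0))  # ~5.4
```

So a steep panel really does intercept roughly the light that would have fallen on 5-6 square meters of ground when the sun is that low.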
> They don't build them in deserts that far north, do they?
I'm not aware of any desert in Denmark…
But you say the design is driven by lack of space; then why do they use the same design in deserts?! That's my question. Denmark doesn't have much free space, but Sweden does, and land is cheap in many places there, yet the Swedes don't design their plants differently.
> Normal commercial installs don't try to capture the morning sun. But in the middle of winter in Denmark the "morning" sun is basically all you have access to.
And yet you insist commercial plants don't do that? Why? Are they stupid?
> If you tilt really far and avoid shade, you counteract the bad incidence angle.
Only the vertical angle, not the horizontal one… And again it makes no sense to optimize for winter morning sun when there are only 2 hours of sunlight per day on average during winter…
You could set up a football field of your perfectly optimized morning-sun solar panels, plus the same for evening sun, and you'd still have failed to power a house for the full month between January and February when the sun often doesn't show up once, and in that time span you've already exceeded the 10% non-solar budget in that mental exercise…
Right! A lack of places with such super free land that also have horrible sun angles.
> But you say the design is driven by lack of space, then why do they use the same design in deserts?!
The desert builds don't have to deal with the same horrible sun angles.
But whatever, I might be wrong on what they would do with free land. That was a guess, I admit it. That guess was to illustrate my point about angles. It's not a critical part of my argument.
It's a fact that solar panels on a roof avoid the shading problem, while a normal commercial layout does not. Pure mathematics.
> Only the vertical angle, not the horizontal one…
The horizontal angle doesn't change very much. If you tilt 50-60 degrees facing south (the year-round optimal angle for Denmark) you will get a significant amount of sun no matter the season if you avoid shade. Winter sun is less than average, but it's close to 50%, not 5%.
> And again it makes no sense to optimize for winter morning sun when there's only 2 hours of sunlight per day in average during winter…
That's so close to understanding my argument!
Commercial plants don't bother. They're not optimal for winter. But if you build on a slanted roof you get that optimization for free. So a home install actually becomes better than a commercial install for this specific use case.
But it's not 2 hours of significant light, it's more than that. Clouds don't make the sun useless.
Yes the range reduces in winter especially when you go north, but you still get at least 45° of incidence angle in the best case scenario.
> Winter sun is less than average but it's close to 50%, not 5%.
How can it be 50% when the sun is beyond the horizon for 17 hours straight?! For some reason you obsess over shade but disregard the most important one: the shade caused by the Earth moving in front of the sun (also called "night")…
> That's so close to understanding my argument!
> Commercial plants don't bother. They're not optimal for winter
I see what you mean, but plants optimize for electricity value, not raw output, and electricity is more expensive in winter. If they could get good yields in that period, they would actually make more money than they do by selling excess electricity in summer…
> But it's not 2 hours of significant light, it's more than that. Clouds don't make the sun useless.
For regular solar panels, they pretty much do, especially in the north (because the cloud layer is effectively much thicker due to the high sunlight incidence angle). Amorphous panels have better performance in these scenarios but it's still far from good, especially if you tilt them heavily to face the sun as these panels need to be facing the sky to get as much diffuse daylight as possible.
As a result, the sunny hours, rare as they are, are going to dwarf the others in electricity production.
But if you believe you can sustain 90% of your electricity consumption from solar in Denmark, go ahead, I'm not going to convince you otherwise and I'll have no guilt if you lose your shirt in the process.
> How can it be 50% when the sun is beyond the horizon for 17 hours straight?!
50% of the average. The average being a day with something like 12 hours of sunlight. Sorry to be unclear.
> I see what you mean, but plants optimize for electricity value, not rough output, and electricity is more expensive in winter, if they could get good yields at that period, they would actually make more money than the one they get by selling excess electricity in summer…
One important factor is that they're not optimizing for power per panel. Panels are pretty cheap, and filling the land with panels makes sense as an overall decision.
Let me reframe things. For a commercial plant it's not that they could get significantly more power in deep winter, it's that they could get the same power with 20% as many panels. But spreading panels out that far would be worse the rest of the year.
Many home installs can get that "spreading" for free.
So to redo my claim from earlier, if there was a magic button to put 50 feet between each row of panels with no downside, I strongly bet commercial installs would pay to press it. And it should take the winter output up from "useless" to "bad".
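The "magic button" is also just geometry: the row pitch needed to avoid row-to-row shading grows rapidly as the sun drops. A minimal sketch, with the 2 m panel length and 40° tilt being assumed illustrative values:

```python
import math

def row_spacing_to_avoid_shading(panel_length_m, tilt_deg, sun_elevation_deg):
    """Minimum row pitch (front edge to front edge) so one tilted row
    doesn't shade the next, for a sun directly facing the rows.
    Pure geometry; sun azimuth and diffuse light are ignored."""
    top_height = panel_length_m * math.sin(math.radians(tilt_deg))
    row_depth = panel_length_m * math.cos(math.radians(tilt_deg))
    shadow = top_height / math.tan(math.radians(sun_elevation_deg))
    return row_depth + shadow

# Assumed: 2 m panels at 40° tilt.
print(round(row_spacing_to_avoid_shading(2.0, 40, 30), 1))  # ≈ 3.8 m, higher sun
print(round(row_spacing_to_avoid_shading(2.0, 40, 10), 1))  # ≈ 8.8 m, Danish winter sun
```

Under these assumptions, keeping rows shade-free at a 10° winter sun takes more than double the land of a layout sized for a 30° sun, which is exactly the spacing a commercial plant won't pay for but a lone rooftop gets for free.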
It's possible I'm still severely underestimating the clouds. But when there is light, there's this interesting advantage small/widely-spaced installs get in winter. Or rather, they have a much smaller disadvantage.
It does not ignore the word. It subverts it, and that's the point. It's the systems equivalent of "death of the author", which holds that once a work is written, the author's intent loses relevance and the work must be examined on its own. The author's opinion of or relationship to the work carries no more weight than any other person's.
That's not "true" in any demonstrable sense, but it can be a useful form of analysis. As it is with "purpose of a system"
This is not how people outside of cybernetics use POSIWID. From context it does not appear to be how SlinkyOnStairs was using it either.
I think it's also trying to be too cute. The first two definitions of purpose on Wiktionary[A]:
1. The end for which something is done, is made or exists.
2. Function, role.
People (uselessly) talking about the purpose of a system are often referring to #1, while POSIWID uses it to mean #2. The real point of POSIWID is that only definition #2 matters. POSIWID is a terrible phrase not because it is wrong, but because it is an equivocation -- I suspect that Beer intended it as a pun, but the difference between the two readings is whether one gets the joke. POSIWID gets used incorrectly because people don't get the joke.
> From context it does not appear to be how SlinkyOnStairs was using it either.
The exact definition of "purpose" doesn't matter much here.
The particular version of the heuristic used here is that the stated purpose and the actual purpose often differ. POSIWID being the observation that the actual purpose is reflected by the outcomes of the system, because if that isn't the case the system gets changed.
Thus, the observation about AI benchmarks. AI companies have had years now to stop using unreliable benchmarks as advertising material. There's been years of piece after piece about the problems with these benchmarks. And yet the AI marketing continues as is.
> POSIWID being the observation that the actual purpose is reflected by the outcomes of the system, because if that isn't the case the system gets changed.
I fundamentally disagree with this, and it seems to differ from how other proponents of POSIWID in this thread view POSIWID.
It also seems trivially false; systems are dynamic. What was the purpose of the system just before it was changed because people didn't like the outcomes?
I'd go further and say this is also the cybernetics equivalent of the religious teachings about humans, specifically the whole "judge by one's deeds, not by one's words" thing. So it's not like it's a novel idea.
Also worth remembering that most systems POSIWID is said about, and in fact ~all important systems affecting people, are not designed in the first place. Market forces and social, political, even organizational dynamics are not designed top-down; they're emergent, and bottom-up wishes and intentions do not necessarily carry over to the system at large.
> But in general, identifiers have little currency outside the system that generated them
That's clearly wrong, because if it were true we wouldn't be able to identify anything. Identifiers are only useful insofar as some external party assigns a meaning to the identifier. Two systems MUST pick a common identifier to discuss a person. They MUST pick an identifier to discuss a technical field. They MUST even pick an identifier to discuss a technical protocol.
Identifiers are everywhere. They'll usually be translated into something internal at the edge of a system, but I bet the PNR is too.
Most identifiers are for use within a system, and are not intended to be GUIDs, semantically. My name is Paul Davis, which functions well enough as an identifier within my lived community, but is pretty useless as a "real" identifier for me, which is why other entities I interact with want my birthday, or social security number, or passport or ...
One can cheat on this for airline systems by taking the position that "the system" is the aggregation of all the different systems, not any one of them.