4/2/2025 at 1:53:34 AM
Just as we have weather forecasting and climate models, we need, and should have, good fine-grained computational models of complex systems such as the cell and the global economy. We should be able to have whole-economy simulations give reasonable predictions in response to natural events and lever-pulling such as:
- higher progressive tax rates
- central bank interest rate moves
- local tariffs and sanctions
- shipping blockades / blockages
- regional war
- extreme weather events
- earthquakes
- regional epidemics
- giving poor people cash grants
- free higher education
- science research grants
- skilled immigration / emigration
But of course, this would require something like a rich country providing grants for applied, cross-disciplinary research over many years.
It might even lead to insights that prevent the semi-regular boom-and-bust cycles we have experienced over the past 100 years.
by jgord
4/2/2025 at 4:43:12 AM
> we do need and should have good fine-grain computational models of complex systems such [...] the global economy.

Many years ago, when 'social graphs' were still a hot area to do research in, I started building a simulation of the equivalent of a small medieval village.
What became quickly apparent is that you didn't just need interactions between any two individuals like classical social graphs talked about, but between any number of arbitrary groups of individuals. Otherwise something as simple as an extended family couldn't be modeled.
That meant that instead of being able to use a matrix as the fundamental data structure, you'd need a tensor of rank N, where N is the number of people in the economy. Just to see how intractable this is: if the village had 20 people, the traditional matrix approach would need 20^2 = 400 weights to model interactions. With the tensor approach you need 20^20, about 1e26.
In short: it's impossible to have fine-grained simulations of complex societies. The best we can do is drastic oversimplifications that give us _some_ predictive power.
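The arithmetic behind those counts is quick to verify (a throwaway sketch; N = 20 villagers as in the example above):

```python
# Cost of modeling interactions among N = 20 villagers.
N = 20

pairwise = N ** 2   # classical social graph: one weight per ordered pair of people
subsets = 2 ** N    # one weight per arbitrary group (subset) of people
n_way = N ** N      # one weight per ordered N-tuple (the rank-N tensor view)

print(pairwise)     # 400
print(subsets)      # 1048576 (~1.05 million)
print(n_way)        # 104857600000000000000000000 (~1.05e26)
```

Which of the last two counts applies depends on whether a "group" is an unordered subset (2^N) or the model keeps one slot per person per tensor axis (N^N); the 1e26 figure corresponds to the latter.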
by noosphr
4/2/2025 at 6:02:01 AM
A set of size N=20 has 2^N ≈ 1.05 million subsets in total. You must have had other degrees of freedom to bump that to 1e26.

by abdullahkhalids
4/2/2025 at 6:03:31 AM
That's only for two-way relationships; you need N-way relationships, which is N^N.

by noosphr
4/2/2025 at 6:22:26 AM
Wouldn't you be able to quickly prune away invalid/unlikely interactions? Maybe have some cutoff based on proximity of members, or something?

by wordpad
4/2/2025 at 6:31:28 AM
Yes, the resulting tensor is incredibly sparse, but still too large to ever be practical for anything but a theoretical upper limit on the complexity of a model.

The issue is that while you can rule out pretty much all possible interactions for a specific case, you have no way of knowing ahead of time where an interaction could pop up unexpectedly with a huge impact.

For the medieval theme: the leader of the village may be a cousin of the king, which is a very distant but very strong interaction.
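One way to make that sparsity concrete (my sketch, not the original simulation; every name and weight below is made up for illustration): store only the groups that actually exist, keyed by an unordered set of members, and treat every other cell of the implicit rank-N tensor as zero.

```python
# Sparse representation of group interactions: only realized groups are stored,
# keyed by the (unordered) set of members. The dense rank-N tensor is implicit
# and almost entirely zero.
interactions: dict[frozenset, float] = {}

def add_group(members: set, weight: float) -> None:
    """Record an interaction weight for an arbitrary group of people."""
    interactions[frozenset(members)] = weight

# Hypothetical village: an extended family, a market day, and the distant
# village-leader/king link from the comment above.
add_group({"miller", "miller_wife", "miller_son"}, 0.9)
add_group({"miller", "smith", "baker", "priest"}, 0.3)
add_group({"village_leader", "king"}, 0.8)

def weight(members: set) -> float:
    """Look up a group's weight; unrealized groups implicitly have weight 0."""
    return interactions.get(frozenset(members), 0.0)

print(weight({"village_leader", "king"}))  # 0.8
print(weight({"miller", "king"}))          # 0.0 (never realized)
```

The catch is exactly the one described above: the sparse map only contains interactions you already know about, so a surprise entry (the king's cousin) is invisible until it matters.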
by noosphr
4/2/2025 at 7:07:37 AM
How many people do you talk to in a day?

by exe34
4/2/2025 at 8:17:43 AM
This is a perfect example of the limitations of old-school social graphs. The number of people you talk to is just the first-order effect. What about all the people that people in your company talked to? That has a non-zero economic impact on you. Similarly for any other company that your company talks to, and so on and so on.

by noosphr
4/2/2025 at 9:16:46 AM
That still scales linearly with the number of people, not quadratically.

by exe34
4/2/2025 at 12:39:09 PM
It scales super-exponentially, since each person is a member of an arbitrary number of groups, their actions have some non-zero impact on every other member of those groups, and so on through each individual those members have interacted with.

by noosphr
4/2/2025 at 9:02:44 PM
If that were the case, numerical weather prediction wouldn't work. You can add up the impact of each neighbour.

by exe34
4/2/2025 at 10:48:29 PM
Humans are not mindless molecules whose only interactions are with the people right next to them.by noosphr
4/3/2025 at 10:44:56 AM
You're allowed to move the numbers that represent the people in the matrix multiplication such that the interactions are close by.by exe34
4/2/2025 at 2:07:51 AM
We already have tons of those models. None of them are perfect.

And they never will be.

Could they be better? Yes.

The problem is, you won't really know they're better until after the fact. And even then, you'll never be sure how much better. They're always bound to fail catastrophically at some point. Etc.
by onlyrealcuzzo
4/2/2025 at 7:30:23 AM
In fairness, we don't really have such models, at least not anymore.

We used to; that's what "macro" economics was about. The models were crude, but they did the job for a while: we used them for more than 30 years between the '40s and the late '70s, with great success (no economic crisis for the longest stretch since the industrial revolution).

But they conflicted with the idea that economists had of their job, as these models didn't include any of the classical-economics credo that economists had been worshipping for almost two centuries (markets, competition, scarcity, supply and demand, etc.). So people started building "micro-founded" macro models that tried to reconcile the empirical models that worked with the ideological principles of classical economics. But you can't have good models if you design them to match an ideological paradigm.

And then, as you said, it failed catastrophically: the oil shock came, and suddenly the models' forecasts became useless for a while. At that very moment, everyone opposed to using models to justify state intervention was thrilled, and the era of short-term economic engineering based on models was dead.
by littlestymaar
4/2/2025 at 7:40:43 AM
You've stumbled upon chaos theory (https://en.m.wikipedia.org/wiki/Chaos_theory), which studies chaotic systems (characterised by very high sensitivity to initial conditions - see weather prediction, the double pendulum, etc.).

Some problems are so sensitive to initial conditions that solutions are not prescriptive like regular physics: variability at the 20th decimal of your initial variables will induce massive output differences. Lorenz's discovery of this is interesting, as he was working on weather modelling, so it's a clear example of the issues with chaotic systems. He was running simulations of weather systems with multiple fixed initial variables (temperature, wind speed, etc.) and seeing how the system progressed over a few hours. After a typo on a very distant decimal of a single parameter, he realised the system was modelling the complete opposite of what he had seen in the previous run (I think it was forecasting a typhoon where it used to say sunny day), even while using values that would be "equal" within the precision of the measuring equipment. And that's to say nothing of getting clean, precise enough data for such models, which is practically impossible (see the observer effect, among other causes). Garbage in, garbage out.

All this to say that problems in this sphere quickly become intractable, and it's impossible to model precisely how they evolve over time.

I can recommend James Gleick's Chaos: Making a New Science for an overview for the layperson.
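The sensitivity is easy to reproduce without a weather model. A minimal sketch using the logistic map x -> 4x(1-x) (a standard textbook chaotic system, standing in for Lorenz's actual equations): two starting points that agree to eleven decimal places end up completely decorrelated within a few dozen iterations.

```python
def logistic_orbit(x0: float, steps: int) -> list:
    """Iterate the chaotic logistic map x -> 4x(1-x) and return the orbit."""
    orbit = [x0]
    for _ in range(steps):
        orbit.append(4.0 * orbit[-1] * (1.0 - orbit[-1]))
    return orbit

a = logistic_orbit(0.3, 80)
b = logistic_orbit(0.3 + 1e-12, 80)  # differs only at the 12th decimal

print(abs(a[1] - b[1]))                       # still ~1e-12 after one step
print(max(abs(x - y) for x, y in zip(a, b)))  # order 1: the orbits fully diverge
```

The separation roughly doubles each iteration (the map's Lyapunov exponent is ln 2), so a 1e-12 discrepancy saturates to order-1 differences within about 40 steps - the same mechanism as Lorenz's typo.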
by dmbche
4/2/2025 at 2:44:02 AM
There's a whole discipline which does nearly that, though they do not use this style of agent-based model.

Generally, agent-based models have numerous parameters which can take many values (endowments, preferences), and the models don't themselves give any guidance about how to set them. Theory can give limited guidance (e.g., that function is concave, this parameter is negative). Sometimes we have experimental data, though its generalizability beyond the lab is uncertain.

What you want to do to create a scientific macroeconomics is to work backwards from the data you see in the economy (aggregate consumption, investment, etc.), from what you know about how the aggregates were generated (via the behavior of a lot of individual agents), and from an equilibrium assumption, to recover the parameters.
If you know the parameters of the model you assume, you can then simulate interesting counterfactuals. (And yes you assume the model - a “full” model including “all” of the individual endowments and parameters you can think of is completely intractable. You have to simplify.)
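As a toy illustration of that "work backwards from the aggregates" step (my example, not the parent's; all numbers are invented): under CRRA utility, the equilibrium consumption Euler equation 1 = beta * R * g^(-gamma) pins down the discount factor beta once you observe gross consumption growth g and the gross real interest rate R, and assume a risk-aversion coefficient gamma.

```python
# Toy calibration: recover the discount factor beta from aggregate data
# using the equilibrium Euler equation 1 = beta * R * g**(-gamma),
# where g is gross consumption growth and R the gross real interest rate.
def recover_beta(growth: float, R: float, gamma: float) -> float:
    return growth ** gamma / R

# Invented aggregates: 2% consumption growth, 5% real rate, risk aversion 2.
beta = recover_beta(growth=1.02, R=1.05, gamma=2.0)
print(round(beta, 4))  # 0.9909
```

Real calibration exercises do this with many parameters and moments at once, but the logic is the same: observed aggregates plus an equilibrium condition imply the deep parameters, which can then be used to simulate counterfactuals.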
You’ll never get that out of the author’s computer game.
If you want references to the macro literature it’s enormous and I can provide them.
by huitzitziltzin
4/2/2025 at 7:14:42 PM
Would you happen to know some key search phrases to get started in the contemporary modeling literature?

by nxobject
4/3/2025 at 1:27:30 AM
Sure, it's just macro.

Ljungqvist and Sargent is a standard grad-level text.
Acemoglu has one with a growth focus.
Miao is another one.
Stachurski and Sargent is one focused on computational issues.
Stokey Lucas and Prescott is the math but I would skip that.
Dejong and Dave covers macroeconometrics.
by huitzitziltzin
4/3/2025 at 6:15:06 AM
Thank you!

by nxobject
4/2/2025 at 7:41:00 AM
> and an equilibrium assumption to recover the parameters

That part makes no sense, though: there isn't an equilibrium and there can never be one, because economies are a chaotic system. One of the key problems of economic modeling is that it used mathematical tools that aren't suited for the domain.

You can't treat an economy as a steam engine. Walras was a trained engineer in the 19th century, so I can excuse him for making this approximation, but I can't excuse anyone still following his course more than a century later.
by littlestymaar
4/3/2025 at 1:33:52 AM
I don't agree that it doesn't make sense. It's a good approximation a lot of the time.

Also: from your comment I'm pretty sure you don't have the background (correct me if that's wrong), and so don't know what it really means in practice to make an equilibrium assumption in a macro model: markets clear, on average people have reasonable beliefs about the evolution of aggregate variables, firms maximize expected profits. That's all pretty harmless.
I definitely don’t agree that we are “using mathematical tools which aren’t suited for that.” We aren’t treating the economy “like a steam engine.” The entire revolution in macro from the 1970’s on involves optimizing agents. There is no useful analogy to a steam engine.
I suspect you have been reading criticism by people who are misinformed about what macro actually is and how it is practiced.
by huitzitziltzin
4/3/2025 at 7:12:56 AM
> I don't agree that it doesn't make sense. It's a good approximation a lot of the time.

So is a weather forecast that says "the weather will be sunny tomorrow" in Miami: true most of the time, yet utterly meaningless.
> Also: from your comment I’m pretty sure you don’t have the background (correct me if that’s wrong)
I have to admit my DSGE class was more than 10 years ago at this point, but I still vividly remember the hypotheses being hilarious (spherical-cow-in-a-vacuum level), even though now I would need to spend a bit of time digging back in to be able to write down exactly why.
by littlestymaar
4/2/2025 at 11:05:57 PM
Issue of the Commons.

Weather models are good because if we know about it, the weather doesn't care and doesn't change what it is going to do.
Anyone who has an accurate financial model is keeping it to themselves.
Anyone who has an accurate financial model and makes it public... invalidates their model, as everyone takes that information and plans to take advantage of it accordingly.
by _carbyau_
4/2/2025 at 4:04:52 AM
While your last point is certainly an ideal to aspire to, something tells me that the powers that be would not actually want to get rid of booms and busts, because ultimately that is where a lot of the “wealth” for those high up is created. You don't really need complex models to solve the problem of some humans being really, really greedy, driving markets to overheat and ending in catastrophic failure.

by mym1990
4/2/2025 at 5:21:59 PM
Booms and busts emerge naturally from market dynamics, even when everyone is acting fairly and reasonably. You don't need to add sneaky, greedy people pulling the strings from behind a curtain.

by sdwr
4/2/2025 at 2:21:45 AM
> we do need and should have good fine-grain computational models of complex systems such as the cell .. and the global economy.

Thanks to the pioneering work done by physicists, we realized we could simulate dimension-reduced versions of reality instead. We call them statistics and differential equations :)

Stack enough of them together and you get something called "deep learning". Large-scale, national-lab, supercomputer-type numerical simulations are for your grandparents (these days you can probably take shortcuts and run that sort of born-secret computation in a neural net that is much more compute-efficient than the typical supercomputer).
by Onavo
4/2/2025 at 2:37:15 AM
If you had such a model, you could arbitrage between Polymarket bets on wars and stock prices. There's not much incentive to release such a model publicly.

by jjmarr