A series of Excel errors, including a failure to copy a formula down to the whole column, led many governments to adopt a policy of economic austerity. The spreadsheet had demonstrated that, historically, countries that adopted austerity came out of recession faster. Once the errors in the spreadsheet were fixed, it actually proved the opposite. But by then the damage was done.
Edit: It was UMass grad students who spotted the spreadsheet errors by these Harvard/IMF heavyweights :)
Reinhart and Rogoff kindly provided us with the working spreadsheet from the RR analysis. With the working spreadsheet, we were able to approximate closely the published RR results. While using RR’s working spreadsheet, we identified coding errors, selective exclusion of available data, and unconventional weighting of summary statistics.
In addition to their Excel errors, their analysis required excluding Australia, Canada, and New Zealand. Once these countries were included in the analysis, their argument fell apart. To me, it's clear this was a case of very selective researcher degrees of freedom deployed to support austerity. Why anyone would take Reinhart or Rogoff seriously after this farce is beyond my comprehension.
> their analysis required excluding Australia, Canada, and New Zealand. Once these countries were included in the analysis, their argument fell apart
Unsure about Oz and Canada. But New Zealand was excluded due to gaps in R&R's data at the time of their first paper's publication. They "fully integrated the New Zealand data back to the early 1800s", as well as data for "every major high debt episode for advanced countries since 1800", for their 2012 paper [1].
The effect size that Herndon et al. found is "growth at high debt levels" being "a little more than half of the growth rate at the lowest levels." R&R's 2012 paper finds an even more muted result: "2.4% for high debt versus 3.5% for below 90%."
In case it's useful to anyone: Google Sheets' LAMBDA [1] and MAP [2] functions have prevented a ton of fill-down issues for me. Plus there's the ability to use a whole column without specifying the number of rows (e.g. "A1:B" instead of "A1:B1000").
Those functions + a little bit of custom Apps Script have helped me (not very technical) get pretty far in building maintainable data pipelines, reporting, etc. in gsheets.
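For example (a minimal sketch; MAP and LAMBDA are real Sheets functions, but the column layout here is made up), a single formula in one cell can transform all of column A with no fill-down, skipping blank rows, so there's no stale half-filled formula column to forget about:

```
=MAP(A2:A, LAMBDA(x, IF(x = "", "", x * 1.1)))
```

Because the open-ended range "A2:A" covers the whole column, new rows are picked up automatically.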
I think Excel Tables make a ton of sense, especially if you're using Power Query or other built-in data connections to populate the sheet.
But for Google Sheets, I've yet to find something as flexible and maintainable as map/lambda -- especially when I'm doing things that are pretty... egregiously hacky, but better suited to gsheets than Excel (I still prefer Google's realtime collaboration to Microsoft's).
Two of those three sound intentional: selective exclusion of data and unconventional weighting. The "coding errors" may also have been intentional. I would suggest more scrutiny of the authors and their motives before dismissing this as "Excel errors, whoops".
I think it's more likely that if that paper had never been published the government would have advertised a different pretext for the decision they wished to make anyway.
True, but the paper reduced opposition. I was surprised by the paper at the time but, given the authors, considered it a credible analysis. I was quite relieved when the error was discovered.
> The authors show our accidental omission has a fairly marginal effect on the 0-90% buckets in figure 2. However, it leads to a notable change in the average growth rate for the over 90% debt group.
That was a way for parties to save face after investing in Greece. They just wanted their money back without having to admit the investments were not of the quality stated. Luckily, all the stereotypes about the lazy south were handy.
It's the same with corruption.
The Greek people know about their corruption; Germans mostly ignore theirs, hence the positions in the corruption index. For instance, they completely ignored who bribed the Greek companies and politicians.
> Germany was and is still fooled by it. That is why Greece was treated the way it was after the Lehman crash in 2008.
With or without that study, Greece would have been treated this way. Saving money is so deeply ingrained in German culture that none of the recent German finance ministers can set aside their microeconomic views and adopt macroeconomic ones.
This goes way back to the end of the 1920s, when the German government decided to save money during the Great Depression.
Also, in Berlin there was a museum about saving money [1].
I have done excellent work in O365 Excel that is later muddied by concurrent/other editors.
Half the time I'm correcting cells where a user navigated to the wrong one and replaced the formula. What's more frustrating is when I develop a BETTER formula to use for the entire column, but there's no easy way to replace it all and I actually want to preserve previously calculated values. (I preserve values for historical accuracy, even if the formula then was not-great.) Makes me think Excel needs git-like version control for rows/records, and the ability to query things like last-modified of a cell, etc.
I have copied everything in the column out to a temporary spreadsheet, then pasted-by-value to put it back once the correct formula is set throughout. That's tedious and error-prone, and sometimes I lose the dumb historic coloring that previous editors wanted (I preserve what I can), and as the maintainer I log why things changed. Am I doing this wrong?
There's no easy way to spot holes/changes in formulas as you scan down a column. You can use Show Formulas (on the Formulas tab), but if the formulas are largely the same (start the same but are very long), you're unlikely to spot the difference. This could be thousands of rows to scroll through, or the last hundred you're concerned with. Seems like there should be a better way.
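One workaround is to audit the file from outside Excel. Here's a rough sketch (assuming the openpyxl library; the normalization of row numbers is deliberately crude and the file/column arguments are hypothetical) that flags rows whose cell breaks the column's formula pattern:

```python
import re
from openpyxl import load_workbook

def formula_outliers(path, sheet, col):
    """Return row numbers in `col` that break the column's formula pattern."""
    ws = load_workbook(path)[sheet]  # default data_only=False keeps formula strings
    expected, outliers = None, []
    for row in range(2, ws.max_row + 1):  # assume row 1 is a header
        value = ws.cell(row=row, column=col).value
        if isinstance(value, str) and value.startswith("="):
            # crude normalization: treat every number as a row reference
            pattern = re.sub(r"\d+", "{r}", value)
            if expected is None:
                expected = pattern  # first formula defines the pattern
            elif pattern != expected:
                outliers.append(row)
        else:
            # hard-coded value or blank in what should be a formula column
            outliers.append(row)
    return outliers
```

It won't catch everything (formulas with differing numeric constants normalize to different patterns), but it turns "scroll through thousands of rows" into a short list of rows to inspect.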
I want an easy table protection option to require that 'this formula will be the only formula in this column'. Table protection is so lacking. You can't protect a column to say "only computed values exist here". You protect the column, and it prevents users from entering a new row/record, making a mistake, and deleting the row to try again. We train folks: If you mess up the next row, just delete the entire row and attempt to add it back. The computed columns/values will be there for you. Protecting a column makes this impossible.
Online Excel is advancing... but I want too much. I feel like there's been low-hanging fruit for years, and it's no wonder all these alternatives are good enough to replace Office.
IMHO, instead of ramming a square peg into a round hole, one might consider switching from Excel to, e.g., Jupyter notebooks.
Keep the raw data separate from the calculations/visualizations and get a more trackable environment.
You can also split out a bunch of functions as separate scripts in a git repo and share them between various notebooks/people.
Calculate md5 checksums of the raw data files. If values in rows can be modified, also compute hashes (xxhash?) per row. That way, even with the lowest common denominator (text CSVs), you know whether the inputs changed and where.
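A minimal sketch of that idea using only the standard library (xxhash is a faster third-party alternative; the file layout is hypothetical): one checksum for the whole file, plus one hash per CSV row so a later diff pinpoints exactly which rows changed.

```python
import csv
import hashlib

def file_md5(path):
    """Checksum of the whole raw-data file, read in chunks."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def row_hashes(path):
    """One hash per CSV row; joining fields on a separator avoids
    'a,bc' vs 'ab,c' colliding."""
    with open(path, newline="") as f:
        return [hashlib.md5("\x1f".join(row).encode()).hexdigest()
                for row in csv.reader(f)]
```

Comparing two runs' `row_hashes` lists tells you which rows of the input drifted, even if the files live outside any version control.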
For larger data consider storing it in parquet format and using duckdb.
Assuming you must produce Excel output: calculate what's needed from the raw data in notebooks/scripts, then write the Excel file with openpyxl.
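The last step can look something like this (a minimal sketch assuming openpyxl is installed; the column names and data are made up): all computation happens upstream in Python, and the .xlsx is a write-only artifact, so there are no formulas in the workbook to fill down or get wrong.

```python
from openpyxl import Workbook

def write_report(rows, path):
    """Write precomputed rows to a fresh workbook; no formulas involved."""
    wb = Workbook()
    ws = wb.active
    ws.title = "report"
    ws.append(["country", "debt_ratio", "growth"])  # header row
    for r in rows:
        ws.append(r)
    wb.save(path)

write_report([("NZ", 0.92, 2.4), ("CA", 0.85, 3.1)], "report.xlsx")
```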
> A series of Excel errors, including a failure to copy a formula down to the whole column, led many governments to adopt a policy of economic austerity.
I can't help thinking that if a single published paper can lead to a national policy of austerity then the problem is not with the paper ...
Sounds more like an excuse to never reform the country, to me. By that logic, you shouldn't do the necessary reforms in bad times. And they aren't done in good times either, so they never happen.
The Wikipedia link you cite disagrees that it "proved the opposite":
> Further papers … which were not found to contain similar errors, reached conclusions similar to the initial paper, though with much lower impact on GDP growth.
De facto, austerity tends to be used to argue for cuts to social programs and reduced investment in common goods or to argue for privatization of common goods.
This tends to benefit the top 0.1%: The living situation of the majority is worsened and their bargaining position for employment is weakened. Privatization leads to private profits that overwhelmingly go to the top 0.1%, and of course there's a direct power aspect to it as well. If you literally own the regional electricity provider, for example, that gives you direct political power that goes beyond the mere profit you can make from this ownership.
Technically, it doesn't have to be that way. For example, austerity could theoretically be used to argue for much higher top-end marginal income taxes and capital gains taxes as well as wealth taxes. But political constellations being what they are, that's usually not how it plays out.
When productivity is low and not growing much, disproportionately high public spending can lead into a death spiral. Case in point: Britain in the '70s, or Venezuela nowadays. There's no need to swing to extremes, though austerity policies during a global crisis are probably one of the worst things you can do if other options are available.
Death spirals are fine if you can make sure you're not personally impacted by them. In fact, it makes it much cheaper to buy out large swathes of the dying/dead economy, privatize failing public services, and drive down labor costs. You don't even have to bother reviving the economy afterwards if you can get a quick exit by selling to a greater fool.
> Death spirals are fine if you can make sure you're not personally impacted by them
They can also be prevented with hard but necessary reforms accompanied by significant spending cuts. Public spending alone can't prop up the economy, and outside of raw resources and infrastructure, government-owned companies tend to be very uncompetitive.
I'm sorry, are you genuinely trying to argue for austerity ("hard but necessary reforms accompanied by significant spending cuts") in a thread that literally started with someone posting information debunking the scientific basis for claims that austerity works? https://news.ycombinator.com/item?id=36199042
Also, I'm not sure what your link seeks to demonstrate. The article says British Leyland was nationalised in 1975, but its history section starts out stating that it was formed in 1968 by the merger of BMH and LMC, and that BMH was "perilously close to collapse". The article shifts a lot of the blame for its failure onto strikes and the company being too poorly organized to withstand them, but the continuous failure of BMH following the merger is a running theme. The reason it was nationalised is also suggested to be that it was facing bankruptcy while representing almost the entirety of the UK automobile market.
Honestly, if anything, BL's history reads like an example of aggressive mergers/buy-outs ruining an industry through monopolization, with the taxpayer ending up having to foot the bill to bail out the company so half the economy doesn't collapse with it. If the LMC-BMH merger hadn't gone through, BMH would have failed on its own instead of growing into an integral part of BL and taking the entire company with it, to the point that the government had to step in and keep it on life support long enough to start selling parts off for scrap.
> I'm sorry, are you genuinely trying to argue for austerity ("hard but necessary reforms accompanied by significant spending cuts") in a thread that literally started with someone posting information
I'm sorry, are you genuinely implying that there are one-size-fits-all approaches in economics that universally work regardless of the circumstances? And I'm pretty sure you're either misinterpreting the article or fundamentally misunderstanding how statistical models work.
> scientific basis for claims that austerity works
It does work to a degree in certain cases. Or at least, in certain situations, it is the least bad option.
> The article shifts a lot of the blame for its failure on strikes and the company being too poorly organized to withstand them but the continuous failure of BMH following the merger is a running theme
Yes, British businesses weren't competitive because of continuous government interference and dysfunctional yet too-influential trade unions. The 1968 merger was an outcome of government and trade union meddling: Leyland was doing fine on its own, and there was no good reason for it to merge with BMH (which was on the brink of bankruptcy). As a result, both eventually went under.
> Honestly, if anything BL's history reads like an example of aggressive mergers/buy-outs ruining an industry
Yes but these mergers/buy-outs were imposed by the government...
Your points are entirely orthogonal to mine. There may well be cases where net government spending must be reduced -- but that is identical to increasing net government income.
https://peri.umass.edu/fileadmin/pdf/working_papers/working_...
https://en.wikipedia.org/wiki/Growth_in_a_Time_of_Debt