What is the best way to deal with complexity?

For the serious discussion of weighty matters and worldly issues. No off-topic posts allowed.

Moderators: Azrael, Moderators General, Prelates

quantropy
Posts: 192
Joined: Tue Apr 01, 2008 6:55 pm UTC
Contact:

What is the best way to deal with complexity?

Postby quantropy » Tue Aug 20, 2013 11:12 am UTC

I came across the paper Cancer: A Computational Disease That AI Can Cure, in which the authors argue that dealing with cancer is essentially dealing with complexity, and that artificial intelligence is the best tool for this. I have no argument with the first point, but is artificial intelligence the best way to deal with complex systems? If not, then what is? I can think of three options for dealing with complexity. I'll take writing an operating system as my example.

1: Artificial Intelligence.
Pros: Computing power is cheap, and artificial intelligence is progressing all of the time.
Cons: (i) Artificial intelligence has had so many false starts - there have been plenty of examples of programs which write other programs, but none have been powerful enough to replace humans.
(ii) Why do we need to replace humans anyway - there are plenty of people on the planet.

2: Big organisation (e.g. Microsoft).
Pros: Allocates sufficient resources to deal with the problem in hand.
Cons: (i) Big organisations tend to take on a life of their own, dealing with internal matters rather than the problem in hand.
(ii) This requires splitting up the problem into modules, and it may well not split up in this way (that is part of being complex).

3: Human Expert.
It would seem to make sense for one person to understand a system in sufficient detail to be able to tackle problems related to that system. Now complex systems may be difficult for one person to get to grips with, but you can think of such an expert as having assistants for particular tasks (in programming this is called a chief programmer team). Also, the expert can have access to powerful software, used as a tool rather than as something which is trying to replace a human.

The thing is, though, that the 'Human Expert' model tends to drift into the 'Big organisation' model. Bill Gates moved away from programming into management and then away from that into philanthropy; Linux is named after one person, but is not managed by him. Is this because the 'Big organisation' model is actually better - a sign of maturation of the subject of interest - or is this drift something that should be fought against? The 1970s 'War on Cancer' (= big organisation) didn't defeat cancer, but it can be argued that it did lead to progress in many different areas. Could it have achieved better results if tackled differently?

User avatar
Zamfir
I built a novelty castle, the irony was lost on some.
Posts: 7507
Joined: Wed Aug 27, 2008 2:43 pm UTC
Location: Nederland

Re: What is the best way to deal with complexity?

Postby Zamfir » Tue Aug 20, 2013 11:43 am UTC

It's pretty much intrinsic to complexity that there is no straightforward approach to it. It's the word we use for situations that defy easy, algorithm-style simplification. Such situations call for an ad-hoc approach, based on the specifics of the case.

In other words, there is no one-size-fits-all answer to operating system development, cancer research, life, the universe, and everything. Not even 42.

User avatar
Azrael
CATS. CATS ARE NICE.
Posts: 6491
Joined: Thu Apr 26, 2007 1:16 am UTC
Location: Boston

Re: What is the best way to deal with complexity?

Postby Azrael » Tue Aug 20, 2013 1:12 pm UTC

An AI-based approach and large organizations are fairly similar, as both apply lots of computational power to break complexities down into smaller matters and solve them -- the human-powered one is just more widely distributed. I'm typically skeptical of claims that a sufficiently advanced AI can solve such problems, since no sufficiently advanced AI exists yet.

The human expert/visionary model really doesn't get there with any repeatability. Take Apple, a prime contemporary example: They were successful by minimizing complexity, and once the leader was gone (and they always go one way or another) they are (comparatively) struggling. Nor is there any guarantee that for every complex problem there is a sufficient expert/visionary available to step up.

Tyndmyr
Posts: 11443
Joined: Wed Jul 25, 2012 8:38 pm UTC

Re: What is the best way to deal with complexity?

Postby Tyndmyr » Tue Aug 20, 2013 3:25 pm UTC

Break it down into smaller problems.

The agents involved in it are mostly irrelevant. In any case, complexity is solved by subdividing the big issue into smaller subsets, each of which is less complex. Repeat as necessary until the problem is solvable.
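In code terms it's the same idea as any divide-and-conquer algorithm. A toy sketch in Python (sorting a list, obviously nothing domain-specific - just to show the shape of "split, solve, combine"):

def merge_sort(items):
    """Divide and conquer: split the problem until the pieces are
    trivially solvable, then combine the partial solutions."""
    if len(items) <= 1:               # small enough to solve directly
        return list(items)
    mid = len(items) // 2
    left = merge_sort(items[:mid])    # solve each half independently
    right = merge_sort(items[mid:])
    # combine the two solved sub-problems
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

print(merge_sort([5, 2, 9, 1, 7]))    # [1, 2, 5, 7, 9]

The hard part in the real world is, of course, finding a split where the pieces really are independent.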

leady
Posts: 1592
Joined: Mon Jun 18, 2012 12:28 pm UTC

Re: What is the best way to deal with complexity?

Postby leady » Tue Aug 20, 2013 4:08 pm UTC

Also, don't forget approximation.

The travelling salesman problem is horribly complex, but very good approximations to the perfect solution are pretty quick to compute.
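e.g. a rough nearest-neighbour sketch in Python (the cities are randomly made up, and this heuristic is just one of many) - nowhere near optimal in the worst case, but it spits out a decent tour almost instantly:

import math
import random

random.seed(0)
cities = [(random.random(), random.random()) for _ in range(200)]

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def nearest_neighbour_tour(cities):
    """Greedy approximation: always hop to the closest unvisited city.
    Not optimal, but fast and usually not far off."""
    unvisited = set(range(1, len(cities)))
    tour = [0]
    while unvisited:
        last = tour[-1]
        nxt = min(unvisited, key=lambda c: dist(cities[last], cities[c]))
        unvisited.remove(nxt)
        tour.append(nxt)
    return tour

tour = nearest_neighbour_tour(cities)
length = sum(dist(cities[tour[i]], cities[tour[(i + 1) % len(tour)]])
             for i in range(len(tour)))
print(f"approximate tour length: {length:.3f}")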

User avatar
idobox
Posts: 1591
Joined: Wed Apr 02, 2008 8:54 pm UTC
Location: Marseille, France

Re: What is the best way to deal with complexity?

Postby idobox » Tue Aug 20, 2013 4:28 pm UTC

We have used big organizations and human experts to try to cure cancer, with mixed results. It makes sense to try a new method.

Asking oncologists to enter symptoms and other patient data (gender, weight, other conditions) into a dedicated database, along with the treatment and results, would provide us with a massive amount of data. Human experts could use it, but so could AI or expert programs that would then make suggestions.
The biggest issue with AI is that it often learns by trial and error, and errors can be fatal in this case, so you need an expert in the loop. But having a computer system pull up the medical records of the most similar patients could be a huge help.

For example, imagine a patient with big toe cancer (BTC). He is male, 136 kg, and has diabetes. The typical protocol for BTC is poisonor at 3 g/kg of body mass and killallex at 1 g/kg, but the practitioner enters all the data and finds that over the past 10 years there were 42 cases of BTC+diabetes, and that patients who didn't get killallex had better survival rates, especially obese ones, because the others were more likely to suffer kidney failure. The system also finds that gender is not significant. Now the practitioner can decide to still give killallex, give a lower dose, or replace it with tumor-b-gone.
If he decides to use ray therapy, he could also find out that combining it with tumor-b-gone in other types of cancer reduces the symptoms of diabetes.
With all this information, much more than is usually available to any doctor, he can make better decisions, and share the results of rays + poisonor and tumor-b-gone therapy on diabetic male BTC patients - information that today would be lost because there is only a single patient.
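To make that concrete, the kind of query I have in mind might look something like this rough Python/pandas sketch (every column name, drug and number here is invented):

import pandas as pd

# Hypothetical registry of past cases; in reality this would be the
# database the oncologists fill in.
records = pd.DataFrame([
    # cancer, diabetes, weight_kg, got_killallex, survived_5y
    ("BTC", True, 140, True, False),
    ("BTC", True, 132, False, True),
    ("BTC", True, 95, True, True),
    ("BTC", False, 80, True, True),
    # ... thousands more rows in practice
], columns=["cancer", "diabetes", "weight_kg", "got_killallex", "survived_5y"])

# Pull out the cases most similar to the new patient: same cancer,
# same comorbidity, comparable weight.
similar = records[(records.cancer == "BTC")
                  & records.diabetes
                  & (records.weight_kg > 120)]

# Compare 5-year survival with and without killallex in that subgroup.
print(similar.groupby("got_killallex")["survived_5y"].agg(["count", "mean"]))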

Because of the massive amount of data, you would need an AI to parse it and decide what is relevant.
If there is no answer, there is no question. If there is no solution, there is no problem.

Waffles to space = 100% pure WIN.

elasto
Posts: 3568
Joined: Mon May 10, 2010 1:53 am UTC

Re: What is the best way to deal with complexity?

Postby elasto » Wed Aug 21, 2013 12:40 am UTC

I haven't read the original article so Watson may have already been mentioned there. If not this is highly relevant:

IBM's Watson -- the language-fluent computer that beat the best human champions at a game of the US TV show Jeopardy! -- is being turned into a tool for medical diagnosis. Its ability to absorb and analyse vast quantities of data is, IBM claims, better than that of human doctors, and its deployment through the cloud could also reduce healthcare costs.

In the first stage of a planned wider deployment, IBM's business agreement with the Memorial Sloan-Kettering Cancer Center in New York and American private healthcare company Wellpoint will see Watson available for rent to any hospital or clinic that wants to get its opinion on matters relating to oncology. Not only that, but it'll suggest the most affordable way of paying for it in America's excessively complex healthcare market. The hope is that it will improve diagnoses while reducing their costs at the same time.

Two years ago, IBM announced that Watson had "learned" the same amount of knowledge as the average second-year medical student. For the last year, IBM, Sloan-Kettering and Wellpoint have been working to teach Watson how to understand and accumulate complicated peer-reviewed medical knowledge relating to oncology. That's just lung, prostate and breast cancers to begin with, with others to come in the next few years. Watson's ingestion of more than 600,000 pieces of medical evidence, more than two million pages from medical journals and the further ability to search through up to 1.5 million patient records for further information gives it a breadth of knowledge no human doctor can match.

According to Sloan-Kettering, only around 20 percent of the knowledge that human doctors use when diagnosing patients and deciding on treatments relies on trial-based evidence. It would take at least 160 hours of reading a week just to keep up with new medical knowledge as it's published, let alone consider its relevance or apply it practically. Watson's ability to absorb this information faster than any human should, in theory, fix a flaw in the current healthcare model. Wellpoint's Samuel Nessbaum has claimed that, in tests, Watson's successful diagnosis rate for lung cancer is 90 percent, compared to 50 percent for human doctors.

Sloan-Kettering's Dr Larry Norton said: "What Watson is going to enable us to do is take that wisdom and put it in a way that people who don't have that much experience in any individual disease can have a wise counsellor at their side at all times and use the intelligence and wisdom of the most experienced people to help guide decisions."

The attraction for Wellpoint in all this is that Watson should also reduce budgetary waste -- it claims that 30 percent of the $2.3 trillion (£1.46 trillion) spent on healthcare in the United States each year is wasted. Watson here becomes a tool for what's known as "utilisation management" -- management-speak for "working out how to do something the cheapest way possible".

Wellpoint's statement said: "Natural language processing leverages unstructured data, such as text-based treatment requests. Eighty percent of the world's total data is unstructured, and using traditional computing to handle it would consume a great deal of time and resources in the utilisation management process. The project also takes an early step into cognitive systems by enabling Watson to co-evolve with treatment guidelines, policies and medical best practices. The system has the ability to improve iteratively as payers and providers use it." In other words, Watson will get better the more it's used, both in working out how to cure people and how to cure them more cheaply.

When Watson was first devised, it (or is it "he"?) ran across several large machines at IBM's headquarters, but recently its physical size has been reduced hugely while its processing speed has been increased by 240 percent. The idea now is that hospitals, clinics and individual doctors can rent time with Watson over the cloud -- sending it information on a patient will, after seconds (or at most minutes), return a series of suggested treatment options. Crucially, a doctor can submit a query in standard English -- Watson can parse natural language, and doesn't rely on standardised inputs, giving it more practical flexibility.

Watson's previous claim to fame came from it winning a special game of US gameshow Jeopardy! in 2011. For those unfamiliar, Jeopardy!'s format works like this: the answers are revealed on the gameboard and the contestants must phrase their responses as questions. Thus, for the clue "the ancient Lion of Nimrod went missing from this city's national museum in 2003" the correct reply is "what is Baghdad?". Clues are often based on puns or other word tricks, and while it's not quite on the level of a cryptic crossword, it's certainly the kind of linguistic challenge that would fox most language-literate computers.

Watson's ability to parse texts and grasp the underlying rules has had its drawbacks, though, as revealed last month when IBM research scientist Eric Brown admitted that he had tried giving Watson the Urban Dictionary as a dataset. While Watson was able to understand some of the, er, colourful slang that fills the site's pages, it also failed to understand the difference between polite and offensive speech. Watson's memory of the Urban Dictionary had to (regrettably) be wiped.

BattleMoose
Posts: 1993
Joined: Tue Nov 13, 2007 8:42 am UTC

Re: What is the best way to deal with complexity?

Postby BattleMoose » Wed Aug 21, 2013 6:18 am UTC

Global Circulation Models are stupidly complex, and on average around 500,000 lines of code long, in Fortran.

And the problem itself is very complex too, demanding some fairly severe simplifications and assumptions in order to be practical, and requiring a whole range of experts with very different expertise to understand the problems, make the assumptions and write the code. It is far beyond the skill of any one person. Nor is there any AI that could possibly produce a GCM, nor is there reason to expect that there will be any time soon.

The only way to handle a problem like this is to break it down into very, very small parts that fall within the expertise of individual experts or groups of experts. Take the oceans, for example: the ocean people write and manage the code that governs the behaviour of the oceans, taking as given the inputs handed to them from the atmosphere, which a different group of experts manages; both groups need to accept the inputs given by the other and hand back the outputs from their own parts. (And there will be groups that manage specific processes within both the oceans and the atmosphere and the vegetation and and and...)

Essentially, confine your problem and accept the inputs that are given to you. Model the aspect you are responsible for and hand your outputs to the other relevant processes that depend on them. It's a very reductionist approach and has its problems, but it is an effective means of managing a system that is too complex for one expert.
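In code, the whole trick is agreeing on the interfaces between the groups. A cartoon version in Python (the "physics" and the variable names are placeholders; a real GCM is half a million lines of Fortran):

class Ocean:
    """The ocean group only sees the fluxes the atmosphere hands over."""
    def __init__(self):
        self.sea_surface_temp = 15.0   # degrees C, placeholder initial state

    def step(self, heat_flux_from_atmosphere):
        # Toy physics: nudge SST by the incoming heat flux.
        self.sea_surface_temp += 0.01 * heat_flux_from_atmosphere
        return {"sea_surface_temp": self.sea_surface_temp}

class Atmosphere:
    """The atmosphere group treats the ocean's outputs as given."""
    def __init__(self):
        self.air_temp = 14.0

    def step(self, ocean_outputs):
        # Toy physics: relax air temperature towards the sea surface.
        self.air_temp += 0.1 * (ocean_outputs["sea_surface_temp"] - self.air_temp)
        heat_flux = self.air_temp - ocean_outputs["sea_surface_temp"]
        return {"heat_flux": heat_flux}

# The coupler is the only place that knows about both modules.
ocean, atmosphere = Ocean(), Atmosphere()
ocean_out = {"sea_surface_temp": ocean.sea_surface_temp}
for step in range(10):
    atmos_out = atmosphere.step(ocean_out)
    ocean_out = ocean.step(atmos_out["heat_flux"])
print(ocean_out, atmosphere.air_temp)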

The same approach can be applied to building a motherboard, assuming you are responsible for a specific aspect of the design and are dependent on the work of others. Or a nuclear power plant, which requires turbine experts, nuclear physicists and electrical engineers... each person or small group responsible for one small part of the bigger problem.

Not sure if cancer is the best example - I don't know much about it, but I'm under the impression that it's actually very many different complex problems rather than one big problem.

qetzal
Posts: 855
Joined: Thu May 01, 2008 12:54 pm UTC

Re: What is the best way to deal with complexity?

Postby qetzal » Wed Aug 21, 2013 2:56 pm UTC

idobox wrote:We have used big organizations and human experts to try to cure cancer, with mixed results. It makes sense to try a new method.

Asking oncologists to enter symptoms and other patient data (gender, weight, other conditions) into a dedicated database, along with the treatment and results, would provide us with a massive amount of data. Human experts could use it, but so could AI or expert programs that would then make suggestions.
The biggest issue with AI is that it often learns by trial and error, and errors can be fatal in this case, so you need an expert in the loop. But having a computer system pull up the medical records of the most similar patients could be a huge help.

For example, imagine a patient with big toe cancer (BTC). He is male, 136 kg, and has diabetes. The typical protocol for BTC is poisonor at 3 g/kg of body mass and killallex at 1 g/kg, but the practitioner enters all the data and finds that over the past 10 years there were 42 cases of BTC+diabetes, and that patients who didn't get killallex had better survival rates, especially obese ones, because the others were more likely to suffer kidney failure. The system also finds that gender is not significant. Now the practitioner can decide to still give killallex, give a lower dose, or replace it with tumor-b-gone.
If he decides to use ray therapy, he could also find out that combining it with tumor-b-gone in other types of cancer reduces the symptoms of diabetes.
With all this information, much more than is usually available to any doctor, he can make better decisions, and share the results of rays + poisonor and tumor-b-gone therapy on diabetic male BTC patients - information that today would be lost because there is only a single patient.

Because of the massive amount of data, you would need an AI to parse it and decide what is relevant.


It would be interesting to see what this approach turns up, but I can see some obvious implementation problems. Would oncologists enter the needed data with sufficient diligence? Would they even know all the right things to enter? Would incomplete records be a problem? And perhaps the biggest, how would we sift through the enormous number of spurious correlations this would surely generate? Throw in enough parameters, and any model can predict past results perfectly: males with BTC that weigh 130-147 kg, have diabetes, live in Iowa, have 3 boys, 1 dog, and never owned a cat always die within 2 years if given killallex (n=2), but always survive at least 7 years if given only poisonor (n=1).

I imagine such a system would be most useful if it could identify correlations that hadn't previously been noticed, but that actually fit with known cancer biology upon further reflection. Such correlations would almost certainly require additional directed study to confirm, but perhaps they would point us in useful directions that we'd otherwise have missed?
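To illustrate the "enough parameters can fit anything" problem, here's a quick Python sketch on purely random data (all numbers invented): the model reproduces the past cases perfectly and is still useless on new ones.

import numpy as np

rng = np.random.default_rng(0)

n_patients, n_parameters = 30, 200                 # far more parameters than patients
X = rng.normal(size=(n_patients, n_parameters))    # random "patient features"
y = rng.integers(0, 2, size=n_patients)            # random "outcomes"

# Least-squares fit: with 200 free parameters and only 30 cases it can
# reproduce the training outcomes essentially exactly.
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
train_pred = (X @ coef > 0.5).astype(int)
print("accuracy on past cases:", (train_pred == y).mean())   # ~1.0

# But on fresh random patients it does no better than guessing.
X_new = rng.normal(size=(1000, n_parameters))
y_new = rng.integers(0, 2, size=1000)
new_pred = (X_new @ coef > 0.5).astype(int)
print("accuracy on new cases:", (new_pred == y_new).mean())  # ~0.5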

User avatar
Cleverbeans
Posts: 1378
Joined: Wed Mar 26, 2008 1:16 pm UTC

Re: What is the best way to deal with complexity?

Postby Cleverbeans » Wed Aug 21, 2013 4:26 pm UTC

qetzal wrote:Would oncologists enter the needed data with sufficient diligence? Would they even know all the right things to enter? Would incomplete records be a problem?

Incomplete data is a barrier regardless of the methods used to study it, so these concerns would be identical even if a human were trying to solve the problem. Naturally, sufficient, complete data is best.

And perhaps the biggest, how would we sift through the enormous number of spurious correlations this would surely generate?

I believe you're looking for ANOVA here, which should allow you to differentiate which variables have the most impact on the model. If you have some mathematics background you can learn more about these models by checking out Stanford Engineering Everywhere's Machine Learning and Convex Optimization courses for a more detailed explanation of how the AI actually works on these problems. They even use cancer data as part of the coursework.
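For what it's worth, a minimal one-way ANOVA looks something like this in Python with scipy (the treatment groups and survival numbers are invented):

from scipy import stats

# Hypothetical survival times (years) under three treatments.
poisonor_only = [6.1, 5.8, 7.2, 6.5, 5.9]
poisonor_killallex = [4.2, 3.9, 5.1, 4.4, 4.0]
tumor_b_gone = [6.0, 6.4, 5.7, 6.8, 6.2]

# One-way ANOVA: is at least one group mean different from the others?
f_stat, p_value = stats.f_oneway(poisonor_only, poisonor_killallex, tumor_b_gone)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
# A small p-value says the treatment variable explains a real chunk of the
# variance; it doesn't by itself tell you which treatment is better.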
"Labor is prior to, and independent of, capital. Capital is only the fruit of labor, and could never have existed if labor had not first existed. Labor is the superior of capital, and deserves much the higher consideration." - Abraham Lincoln

qetzal
Posts: 855
Joined: Thu May 01, 2008 12:54 pm UTC

Re: What is the best way to deal with complexity?

Postby qetzal » Wed Aug 21, 2013 5:27 pm UTC

Sure, incomplete data is always a problem. But I think it's a different kind of problem when you're looking for correlations in a database such as idobox is proposing, versus when you're pursuing conventional research approaches to finding cancer cures.

As for ways to identify important variables, I know there are relevant approaches, but it's not an area I'm familiar with. That said, I suspect it would be a pretty significant problem. For example, a relatively hot area these days is to look for correlations between DNA markers and disease - so-called genome-wide association studies. If you find that a certain marker correlates with a given disease, it may mean that a gene associated with the marker is involved in the disease. My understanding is that such studies turn up lots of correlations, and that it's not at all simple to determine which correlations are meaningful and which are spurious. And that's in a case where the researchers can ensure relatively complete data for every subject involved.
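From what I gather, the usual first line of defence is a multiple-testing correction - something like this Benjamini-Hochberg sketch in Python (the p-values are invented) - though that only controls the false discovery rate; it doesn't tell you which of the surviving hits are biologically real:

def benjamini_hochberg(p_values, fdr=0.05):
    """Return the indices of tests judged significant while controlling
    the false discovery rate at the given level."""
    m = len(p_values)
    order = sorted(range(m), key=lambda i: p_values[i])
    threshold_rank = -1
    for rank, idx in enumerate(order, start=1):
        if p_values[idx] <= rank / m * fdr:
            threshold_rank = rank
    return sorted(order[:threshold_rank]) if threshold_rank > 0 else []

# Say a scan of 10 markers produced these p-values (made up):
p = [0.001, 0.009, 0.012, 0.04, 0.049, 0.2, 0.35, 0.5, 0.7, 0.9]
print(benjamini_hochberg(p))   # only the strongest few survive correction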

I suspect it would be a much bigger problem for idobox's approach. But I admit I'm basically guessing here, so I won't push the point further.

Tyndmyr
Posts: 11443
Joined: Wed Jul 25, 2012 8:38 pm UTC

Re: What is the best way to deal with complexity?

Postby Tyndmyr » Wed Aug 21, 2013 6:35 pm UTC

idobox wrote:Because of the massive amount of data, you would need an AI to parse and decide what is relevant.


You don't really need an AI to handle data parsing. We do that all the time.

quantropy
Posts: 192
Joined: Tue Apr 01, 2008 6:55 pm UTC
Contact:

Re: What is the best way to deal with complexity?

Postby quantropy » Thu Aug 22, 2013 10:56 am UTC

On further thought, people deal with complex systems all the time - it's called engineering. But engineering depends on two concepts - redundancy and modularity. Sometimes this is fine, and over time you can squeeze out some of the redundancy; sometimes these concepts cause problems. To look at some of the examples so far:

Cancer
The problem here is that you need to kill cancerous cells but not normal cells, so you can't have that much redundancy. We have drugs, but they are not 100% effective - there is no silver bullet (and thinking in terms of AI looks too much like wishing for a silver bullet). Over time we have learned to increase the level of treatment without killing the patient.
I'm thinking: what if we could get away from modular thinking and consider multiple treatments - 90% poisonor and 10% killallex, combined with a strict exercise regime and a fresh cup of really hot tea every 2 hours? We would need a lot of computing power to be able to predict that this might work, but an AI is unlikely to find it.
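To give a flavour of the brute-force search I mean, a sketch in Python - with the obvious caveat that the "response model" below is a pure placeholder; the real difficulty is having any model that can score a combination at all:

import itertools

# Candidate ingredients of a combined treatment (names from the toy
# example above; the scoring function below is pure invention).
drug_mixes = [(p, 100 - p) for p in range(0, 101, 10)]   # % poisonor / % killallex
exercise_options = [False, True]
hot_tea_options = [False, True]

def predicted_response(poisonor_pct, killallex_pct, does_exercise, drinks_tea):
    """Placeholder standing in for whatever simulation or statistical
    model would actually predict patient outcome for a combination."""
    score = 0.6 * poisonor_pct / 100 + 0.2 * killallex_pct / 100
    score += 0.15 if does_exercise else 0.0
    score += 0.05 if drinks_tea else 0.0
    return score

best = max(itertools.product(drug_mixes, exercise_options, hot_tea_options),
           key=lambda c: predicted_response(c[0][0], c[0][1], c[1], c[2]))
print("best combination found:", best)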

Business
People do run successful businesses, and to some degree this is due to the ability to manage complex systems, but businesses also need a large degree of redundancy (i.e. profit margin) - if you start off with just enough money to cover your costs, you're likely to go bust at the first hurdle. Over time competition squeezes the profit margin and the product becomes a commodity. As businesses get larger, they also have to become more modular.
What we seem to do badly is running economies - like running a business, but with more feedback loops. Of course, when one person has been given the power to run an economy it's generally turned out badly.

Software and Computers
Software design is all about modularity (and some degree of redundancy), but it does have its disadvantages - it means that a program can affect areas you don't expect it to - malware.
Regarding Apple Computer, Steve Jobs's success was largely as a businessman, but I get the impression that the design of the original computers was largely done by one person - Steve Wozniak.

Global Weather and Climate
We've been reluctant to go in for purposeful geoengineering so far. As for understanding the weather and climate, I'm not convinced that the reductionist approach is best. There are too many interactions between Oceans and Atmosphere and Vegetation, and it needs someone to be able to put them all together - to say that this interaction is important, but that interaction can be ignored.
Of course if you want to be able to use this, you need some way of making sure that such a person is using best judgement rather than justifying prior biases - I'm not sure how to achieve that.

BattleMoose
Posts: 1993
Joined: Tue Nov 13, 2007 8:42 am UTC

Re: What is the best way to deal with complexity?

Postby BattleMoose » Fri Aug 23, 2013 6:42 am UTC

quantropy wrote:
Global Weather and Climate
We've been reluctant to go in for purposeful geoengineering so far.


Main reasons we have avoided geoengineering up till now include:
1. Not emitting will always be more effective than geoengineering, in safety, economics and efficacy.
2. Researching geoengineering will give the public the perception that we can fix the problem after we have created it.
3. We have no way of evaluating the consequences of geoengineering.


As for understanding the weather and climate, I'm not convinced that the reductionist approach is best. There are too many interactions between Oceans and Atmosphere and Vegetation, and it needs someone to be able to put them all together - to say that this interaction is important, but that interaction can be ignored.
Of course if you want to be able to use this, you need some way of making sure that such a person is using best judgement rather than justifying prior biases - I'm not sure how to achieve that.


No such person exists. Nor is it likely that such a person will ever exist or could exist. Nor is there any means of producing such a person. No one has the capacity to acquire 4 different PhDs and 8 postdocs across 4 disciplines, nor is there any institution that would ever fund such an endeavour.

The interactions cannot be ignored, they are critical to how our climate responds. And this is what makes it a complex problem.

Tyndmyr
Posts: 11443
Joined: Wed Jul 25, 2012 8:38 pm UTC

Re: What is the best way to deal with complexity?

Postby Tyndmyr » Fri Aug 23, 2013 11:38 am UTC

The *always* aspect of 1 seems questionable. I mean, sure, geoengineering isn't practical now, but one day, it might well be. Probably in a *very* distant future.

elasto
Posts: 3568
Joined: Mon May 10, 2010 1:53 am UTC

Re: What is the best way to deal with complexity?

Postby elasto » Fri Aug 23, 2013 1:04 pm UTC

BattleMoose wrote:Main reasons we have avoided geoengineering up till now include:
1. Not emitting will always be more effective than geoengineering, in safety, economics and efficacy.
2. Researching geoengineering will give the public the perception that we can fix the problem after we have created it.
3. We have no way of evaluating the consequences of geoengineering.

However, geoengineering has some key advantages over reducing carbon emissions that mean I think it's a dead cert it will be tried a couple of decades hence:

Firstly, reducing carbon emissions has to be done collectively or else it doesn't work - and I predict there will never be any meaningful agreement on who should reduce and by how much: The West is responsible for the vast majority of emissions currently up there, but India, China, Brazil and so on will be responsible for a big chunk of future emissions. Geoengineering, however, can be carried out by a single nation - with or without the consent of the rest of the world.

Secondly, reducing carbon emissions is an extremely hard sell domestically. No voter wants to hear that his country needs to switch from cheap, dirty sources of energy to expensive, clean ones and his quality of life needs to go down to pay for it. They're even less likely to agree to it unless everyone else is signed up too - which won't occur. Regardless of the scientific truth of the matter, geoengineering can be sold as a silver bullet: a way of having your coke and eating it. (See what I did there!!1 ^^)

However, geoengineering's strength in that first regard is also its weakness. Let's paint a hypothetical - perhaps somewhat unlikely - but let's run with it for now.

Let's say that in thirty years or so, the effects of three degrees of climate change on the US are painfully clear: An increase in hurricanes, tornadoes and the occasional massive flooding is costing well over $100Bn a year. Moreover, increasing shortages of freshwater and loss of farmland has been pronounced, resulting in significant lobbying from farming factions and others that 'something must be done, and quickly!'

The effect on Russia has been quite the opposite though: Areas previously quite hostile have now become relatively conducive to farming, which is booming. The Northeast Passage is now permanently ice-free, a huge boon to Russian and Chinese shipping, and massive oil finds now accessible in the Arctic region promise to bring in more than a trillion dollars to the Russian coffers.

The US may decide to unilaterally engage in geoengineering to attempt to restore the old climatic balance, and Russia might vociferously oppose it. The world could be brought to the brink of a new world war...*

(*Actually, I don't think the major powers will ever again engage in warfare. The world is too interdependent economically. But it could cause serious tensions the like of which we haven't seen in a century.)

morriswalters
Posts: 7073
Joined: Thu Jun 03, 2010 12:21 am UTC

Re: What is the best way to deal with complexity?

Postby morriswalters » Fri Aug 23, 2013 1:12 pm UTC

You seemed to ignore three.

qetzal
Posts: 855
Joined: Thu May 01, 2008 12:54 pm UTC

Re: What is the best way to deal with complexity?

Postby qetzal » Fri Aug 23, 2013 1:46 pm UTC

If we truly have no way to evaluate the consequences of geoengineering, then we also have no way to evaluate the consequences of not geoengineering. In which case, #3 has no bearing on the question, no?

BattleMoose
Posts: 1993
Joined: Tue Nov 13, 2007 8:42 am UTC

Re: What is the best way to deal with complexity?

Postby BattleMoose » Fri Aug 23, 2013 2:12 pm UTC

qetzal wrote:If we truly have no way to evaluate the consequences of geoengineering, then we also have no way to evaluate the consequences of not geoengineering. In which case, #3 has no bearing on the question, no?


If you don't or cannot understand the consequences of an action, you really shouldn't be doing it, because you might end up harming people. This applies equally to geoengineering and emitting greenhouse gases. It's generally called not being a jerk.

morriswalters
Posts: 7073
Joined: Thu Jun 03, 2010 12:21 am UTC

Re: What is the best way to deal with complexity?

Postby morriswalters » Fri Aug 23, 2013 3:37 pm UTC

qetzal wrote:If we truly have no way to evaluate the consequences of geoengineering, then we also have no way to evaluate the consequences of not geoengineering. In which case, #3 has no bearing on the question, no?
You do understand that the addition of CO2 to the atmosphere in the quantities that we do is in effect a test of geoengineering. So are you now suggesting we double down and pile ignorance on top of ignorance?

elasto
Posts: 3568
Joined: Mon May 10, 2010 1:53 am UTC

Re: What is the best way to deal with complexity?

Postby elasto » Fri Aug 23, 2013 5:31 pm UTC

morriswalters wrote:You seemed to ignore three.

Who's that? Me?

First, as I say, it's more a political issue than a scientific one. Politicians frequently follow the logic: "We must do something. This is something. We must do this!" So it can happen with or without a sound scientific basis. All it will take is things getting bad enough. The same is really not true for reducing CO2 emissions, for the reasons I already outlined.

Second, it is possible to do limited testing of geoengineering solutions. For example, aerosols sprayed high into the atmosphere is where I'd put my money if I had to. Every time a volcano erupts we have a geoengineering 'test' of this kind. They can be released on a small scale to measure the impact, they can slowly be ramped up over the course of, say, a decade, and if they have unacceptable side-effects they can be left to fall out of circulation after only a couple of years.

Third, it doesn't have to be a perfect solution - it only has to be better than what we have then to be worth doing. It's like how giving someone with cancer a carcinogenic compound that makes all their hair fall out is still a good thing to do if the cancer will otherwise kill them.

Right now we mostly just have predictions, so it's absolutely not worth taking a risk with any of this. But if the bad stuff has happened (millions displaced due to rising tides; millions affected by increased hurricanes and tornadoes; millions in famine due to increased desertification etc.) then doing nothing at all will not be politically feasible - and by that time simply reducing CO2 emissions will not help since, according to some at least, it has a half-life of more than a century. So trialing some geoengineering solution that we know will go away on its own if unexpected side-effects turn up seems the smart thing to do.

Tyndmyr
Posts: 11443
Joined: Wed Jul 25, 2012 8:38 pm UTC

Re: What is the best way to deal with complexity?

Postby Tyndmyr » Fri Aug 23, 2013 6:30 pm UTC

BattleMoose wrote:
qetzal wrote:If we truly have no way to evaluate the consequences of geoengineering, then we also have no way to evaluate the consequences of not geoengineering. In which case, #3 has no bearing on the question, no?


If you don't or cannot understand the consequences of an action, you really shouldn't be doing it, because you might end up harming people. This applies equally to geoengineering and emitting greenhouse gases. It's generally called not being a jerk.


Complete knowledge is probably impossible. We make decisions based on incomplete information all the time, and in fact, we have to do so. As mentioned, our current state of affairs is geoengineering as well.

Once a problem is identified (and we can safely say we've done that), it's important to get all the options on the table and examine them as best we can. Sure, at some point we've got to go with a plan of action, even without complete info, but jumping to a specific conclusion as to our best actions needlessly limits the information we have to use.

Limiting CO2 emissions may not be a solution. In fact, I think it's quite likely that it wouldn't be optimal as a sole solution, but will be part of a hybrid approach. I also am not a great fan of limiting research solely because the public may perceive it incorrectly. Sure, that will happen...but that happens with a TON of research, some of which is very useful.

BattleMoose
Posts: 1993
Joined: Tue Nov 13, 2007 8:42 am UTC

Re: What is the best way to deal with complexity?

Postby BattleMoose » Fri Aug 23, 2013 6:38 pm UTC

Tyndmyr wrote:
BattleMoose wrote:
qetzal wrote:If we truly have no way to evaluate the consequences of geoengineering, then we also have no way to evaluate the consequences of not geoengineering. In which case, #3 has no bearing on the question, no?


If you don't or cannot understand the consequences of an action, you really shouldn't be doing it, because you might end up harming people. This applies equally to geoengineering and emitting greenhouse gases. It's generally called not being a jerk.


Complete knowledge is probably impossible. We make decisions based on incomplete information all the time, and in fact, we have to do so. As mentioned, our current state of affairs is geoengineering as well.


You've got the precautionary principle all mucked up. I cannot say much more than I already did. If you have reason to believe an act could be harmful, you shouldn't be doing it.

http://en.wikipedia.org/wiki/Precautionary_principle

morriswalters
Posts: 7073
Joined: Thu Jun 03, 2010 12:21 am UTC

Re: What is the best way to deal with complexity?

Postby morriswalters » Fri Aug 23, 2013 6:53 pm UTC

If complete knowledge is impossible, a greater understanding isn't. And the specific questions we can answer about climate change are relatively rare. The tools to do this are brand new. We are just now acquiring enough computing power to do the things that we need to do. We don't understand climate well enough at this point. Given that, "first do no harm".

Tyndmyr wrote:I also am not a great fan of limiting research solely because the public may perceive it incorrectly. Sure, that will happen...but that happens with a TON of research, some of which is very useful.
Research is one thing, geoengineering is something else. Historically we have not been real bright about the areas we research and how we go about it. We tend to do everything and hope for the best. If we foul this up there is no line of retreat. No safe haven. The world is fundamentally different than how it was a hundred years ago.

