Introduction

In this essay, it will be argued that Hansen's predictions were not wrong; he simply did not have the amount of climate data that we have today. Possible sources of uncertainty that the NASA model did not include, or did not yet fully understand, are examined. Cloud processes, which both raise albedo and trap radiation close to the Earth's surface, are still not well understood, and NASA's climate models could easily have over- or underestimated their effect. Another source is the interaction between the carbon cycle, known as a slow feedback, and the climate. The Earth acts as a carbon sink, absorbing CO2; however, recent research suggests that as temperatures rise this process reverses, resulting in emissions of CO2 from soils and oceans. Furthermore, permafrost has not been included in many GCMs. Permafrost is a slow feedback which contains high levels of methane, a greenhouse gas twenty-three times more potent than carbon dioxide. While this may not have a large effect on near-term projections, those for 2100 and beyond will be skewed by this uncertainty.

Background

On June 23, 1988, James Hansen testified before the United States Senate that there would be a 0.4 degree increase in global temperature by 1997. Unable to predict exact future greenhouse gas emissions, Hansen chose three scenarios to model: A, B and C. Scenario A assumed exponential growth in greenhouse gas emissions. Many critics of Hansen point to Scenario A to try to discredit his model and predictions. However, this was only one of his scenarios, and focusing solely on one aspect is both unscientific and highly damaging to the cause of climate change awareness. What Hansen actually predicted was a range within which future temperature change could fall, based on three scenarios.
Hansen's Scenario C corresponded most closely to actual temperature increases (see Appendix A). As with any model of our climate, especially an early one, a large degree of uncertainty is involved. Some of this uncertainty comes from cloud feedbacks and land surface changes, and some from future greenhouse gas emissions, which Hansen could not have accurately included in his model. Hansen's model assumed that climate sensitivity to a doubling of atmospheric CO2 was 4.2 degrees Celsius, the upper bound of the accepted range. The IPCC estimates that climate sensitivity is likely to be between 1.5 and 4.5 °C (Climate Change 2013: The Physical Science Basis). Had Hansen used a lower value within this range, he would have predicted the rate of global surface temperature change far more accurately. This does not mean that NASA's climate models were wrong, but simply that modelling the future climate involves so many variables that we cannot reliably predict it.

Discussion

According to the IPCC (IPCC, 2013), there is very high confidence that uncertainties in cloud processes explain much of the spread in modelled climate sensitivity. Clouds reflect incoming solar radiation back into space and also trap outgoing radiation, so they have a significant effect on the total radiation balance. Cloud feedbacks are still not fully understood, despite being one of the main sources of variability in GCMs. Bony (2005) suggests that variability in models can be partially explained by an under-representation of marine boundary layer clouds in tropical regions: the sensitivity of these clouds to changes in temperature is being underestimated, leading to variability in models and climate projections. However, this effect was seen only in tropical coastal regions; models in temperate climates showed no significant variation from the mean.
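The practical impact of this spread in assumed sensitivity can be illustrated with the standard approximation that warming scales with the logarithm of the CO2 concentration ratio. This is a minimal back-of-envelope sketch, not Hansen's actual model; the 50% concentration increase used below is an illustrative input, not a figure from the cited studies.

```python
import math

def equilibrium_warming(conc_ratio, sensitivity):
    """Equilibrium warming (deg C) for a given CO2 concentration ratio,
    assuming warming scales with the log of concentration: one
    'sensitivity' of warming per doubling of CO2."""
    return sensitivity * math.log2(conc_ratio)

# Spread in projected warming for an illustrative 50% rise in CO2,
# across the IPCC likely range (1.5-4.5 C per doubling) and
# Hansen's assumed 4.2 C per doubling.
for s in (1.5, 4.2, 4.5):
    print(f"{s:.1f} C/doubling -> {equilibrium_warming(1.5, s):.2f} C of warming")
```

The same concentration pathway thus yields roughly a threefold spread in projected warming across the sensitivity range, which is why the assumed sensitivity dominates the uncertainty in a projection.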
Returning to climate modelling, NASA's early models did not have the level of data and understanding of cloud processes that we have today. This would have led to a larger range of uncertainty in Hansen's projection of global mean temperatures. Since 1988 we have come extremely far in terms of computing power, allowing us to run far more climate simulations in much less time. In the near future we will see exascale computing, around 1000 times faster than current technologies, which will allow us to narrow the range of uncertainty in our projections of future climate change.

The Earth acts as a carbon 'sink'. This works in several ways: trees take carbon dioxide from the air, fixing it as biomass through photosynthesis, and the ocean does something similar through phytoplankton. The process of storing carbon in the ground is called carbon sequestration; the carbon remains locked away until it is mined and burned as fossil fuel, releasing it back into the atmosphere. As a result, temperatures rise, which then reverses the 'sink' effect (Jones, 2003). Cox et al. (2000) estimate that the 'sink' effect will last until about 2050, but that by 2100 global soils will be a net source of CO2. As temperatures rise, the rate of decomposition of organic material in the soil increases, causing it to emit CO2 (Melillo, 2002). On the other hand, some scientists argue that increased nitrogen in the soil will promote plant growth, offsetting some of the carbon entering the atmosphere (Sokolov et al., 2008). However, the extent of this offset is still not clear. One reason NASA's general circulation model might have overestimated temperature increases is that it did not take this negative feedback into account. By including climate-related changes to ocean and soil CO2 sequestration, we can reduce the level of uncertainty.
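The temperature dependence of soil decomposition described above is commonly modelled with a Q10 relationship, in which the respiration rate multiplies by a fixed factor for every 10 °C of warming. The sketch below uses a Q10 of 2.0 purely for illustration; neither the value nor the function is taken from the cited studies.

```python
def soil_respiration(warming_c, r0=1.0, q10=2.0):
    """Heterotrophic soil respiration relative to a baseline rate r0,
    assuming the rate multiplies by q10 for every 10 C of warming
    (standard Q10 formulation; parameter values are illustrative)."""
    return r0 * q10 ** (warming_c / 10.0)

# How decomposition (and hence soil CO2 efflux) grows with warming.
for dt in (0, 2, 4):
    print(f"+{dt} C warming -> respiration x{soil_respiration(dt):.2f}")
```

Because the relationship is multiplicative, even a few degrees of warming raises soil CO2 efflux appreciably, which is the mechanism behind the projected sink-to-source transition.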
In the future, satellites will advance to the stage at which they can measure all global greenhouse gas emissions, whether anthropogenic or naturally occurring. Through this, we will be able to place more trust in climate projection models, as they will have complete data about the processes occurring. Permafrost is possibly one of the largest sources of positive carbon feedback, and will contribute to models underestimating the extent of carbon emissions in the future (Burke, Jones and Koven, 2013). As the Earth heats up, a disproportionate amount of radiative forcing will occur at the North Pole (see Appendix B). Carbon dioxide (CO2) and methane (CH4) are stored in the permafrost covering the polar regions, trapped under layers of ice and snow. Methane is twenty-three times more potent a greenhouse gas than carbon dioxide, and currently accounts for 7% of all greenhouse gas emissions in the UK (Strickland, 2009). Permafrost is not currently included in many Earth System Models, which has caused great concern among scientists as to how to raise awareness of this issue with policymakers (Schaefer et al., 2014). As technology advances, so does our understanding of these processes: with greater resolution we can see the finer details of what is affecting the climate. For example, improved satellite imagery has been helping scientists map the rate of polar ice change. Recently, scientists located 91 subglacial volcanoes below the West Antarctic Ice Sheet (van Wyk de Vries, Bingham and Hein, 2017). By understanding the rate of energy flow from these volcanoes, we can improve our models of ice sheet degradation and reduce uncertainty in our climate projections. We have touched on some sources of uncertainty within climate projections; next we will look at whether we can trust climate projections in the future. The coming years will be critical for improving our understanding of the climate and developing improved climate models.
The global scientific community needs to come together and pool its resources to create a world climate research facility (Shukla et al., 2009). Such a facility could improve climate predictions through large investment in computing power to supplement RCMs and increase resolution. Exascale computing is currently under development, with China hoping to have a working prototype by the end of 2018. With this increase in computing power, GCMs can run more simulations faster, allowing us to reduce uncertainty. On the other hand, with more data and simulations we may not reduce uncertainty at all, and may in fact increase it (Knutti and Sedláček, 2012). More accurate data will enable us to better understand global climate processes and thereby improve our GCMs. Satellites such as NOAA's GOES-R, which allows near real-time weather monitoring, let us watch storms, lightning and wildfires progress with incredible accuracy. The same goes for GOSAT satellites, which allow us to measure greenhouse gas emissions from space. By knowing the energy balance better, we can better project radiative forcing and its effects on Earth. On balance, technology will in future allow us to place more trust in climate projections; however, this depends on the international community committing the funding and know-how needed to solve the problems currently faced.

Conclusion

NASA's 1988 projections were right in that they estimated a future warming of the planet that was not due to internal variability, and this in itself was strong scientific evidence in support of climate change theory. Hansen's climate predictions were not necessarily 'wrong'; there was simply too much uncertainty in his projections for the public to accept them. This is a problem that has plagued climate modelling from its inception: as there is a variety of climate models, each with a different method of estimating variables, there is bound to be uncertainty.
To reduce this uncertainty, and to build trust in climate modelling in the future, it is suggested that a single global climate model be developed. This would benefit society by providing a common goal to work towards and by reducing fragmentation in the scientific process. In the years to come, big data and our ability to process it will be an invaluable tool in predicting climate change. Advanced satellites and a greater understanding of the processes that underlie climate change will enable scientists to close the gap of uncertainty in projections. On the other hand, scientists have argued that as we learn more about our climate and the number of variables that affect it, uncertainty will increase, as we will have to take all of them into account.