10 February 2017, Carbon Brief, Guest post: Why NOAA updates its sea surface temperature record. The National Oceanic and Atmospheric Administration (NOAA) is one of a number of climate agencies that piece together global temperature from thousands of measurements taken each year across the world’s land and oceans. Last weekend, an article in the Mail on Sunday sparked interest in the way NOAA constructs its temperature record. The claims in the article, widely rebutted shortly after its publication, focused on the most recent version of NOAA’s sea surface temperature (SST) record. I have been involved in the development of this dataset since 2011, and it is due to be updated again shortly. However, an early draft of the journal paper about this update appears to have been circulated amongst the media without the permission of the authors (including myself). I have, therefore, decided to make some observations here in a personal capacity that may help make better sense of how the dataset is produced and what it shows.

Peer review

The status of the new version of our dataset – nominally labelled “ERSSTv5” – is that we have submitted a paper to a journal, and it is undergoing peer review. As is the academic norm, the authors wish to respect the review process and did not give their permission for the draft to be shared with the media. To be clear, the authors have expressly not given permission for the draft to be used or quoted in the media. To be equally clear, at this juncture, copyright of the paper remains with the authors. The peer review process on the paper has only just begun; it is therefore premature and incorrect to analyse the dataset in detail before the paper is published. Peer review is an essential step towards eventual acceptance of any new research. It is highly unusual – and, in my view, undesirable – to discuss the specifics of submitted manuscripts in a public manner before this process has concluded.
Peer review will, inevitably, point out ideas that will serve to strengthen any given analysis. I shall, therefore, not be discussing specific scientific aspects of the draft paper, and shall refer to it below only to the barest extent required. If and when the paper is published, that will be the appropriate time to discuss its findings in depth, and I would be delighted to do so. Read More here
1 February 2017, Renew Economy, Eight reasons why Dr Finkel is great news for Australia’s energy future. Our electricity grid looks likely to progress more systematically to a cleaner, more secure future thanks to Australia’s Chief Scientist, Dr Alan Finkel, being brought in to lead the analysis and policy recommendations. For those who could not make Tuesday night’s 2.5-hour session in Adelaide with him, here are some of the key comments made by him and his team:
1. Dr Finkel and SA’s Chief Scientist Leanna Read both see the grid becoming 100% renewable powered as the end point.
2. Dr Finkel is walking the talk: all electricity at his home is sourced from green power and he drives an electric car.
3. He and his team will shortly travel to other leading renewable energy regions with few grid interconnections (Texas and Ireland) to share best practices relevant to SA; to high-penetration locations committed to further rapid transitions to distributed renewables (California, New York, Denmark, France, the UK and Germany); and to meet GE and Siemens, who are leading in creating distributed grid systems, controllers and grid storage. Read More here
31 January 2017, Climate News Network, Video demand drives up global CO2 emissions. Sitting back and watching your favourite streamed TV series may seem harmless enough – but video demand is leaving a hefty carbon footprint. LONDON, 31 January, 2017 – The internet is fast becoming a major source of global carbon emissions – and the main cause is video demand, the increasing popularity of “real time” streamed video content. Video streaming to internet-enabled TVs, game consoles and mobile devices already accounts for more than 60% of all data traffic – and the latest forecasts suggest this will rise to more than 80% by 2020. Increasingly, viewers across the world are watching films and TV series in real time through subscriptions to Netflix or Amazon, while social media platforms such as Facebook and Twitter are offering more and more streamed video content for free. This is driving a dizzying increase in the amount of information that needs to be stored and transmitted by power-hungry data centres. Up until 2003 the world had accumulated a total of five exabytes – five billion gigabytes – of stored digital content. By 2015 that amount was being consumed every two days, as annual consumption reached 870 exabytes. As more video is streamed and more of the world’s population goes online, annual data traffic is forecast to reach 2,300 exabytes by 2019. Read More here
11 January 2017, The Conversation, Getting a scientific message across means taking human nature into account. We humans have collectively accumulated a lot of science knowledge. We’ve developed vaccines that can eradicate some of the most devastating diseases. We’ve engineered bridges and cities and the internet. We’ve created massive metal vehicles that rise tens of thousands of feet and then safely set down on the other side of the globe. And this is just the tip of the iceberg (which, by the way, we’ve discovered is melting). While this shared knowledge is impressive, it’s not distributed evenly. Not even close. On too many important issues, science has reached a consensus that the public has not. Scientists and the media need to communicate more science and communicate it better. Good communication ensures that scientific progress benefits society, bolsters democracy, weakens the potency of fake news and misinformation, and fulfils researchers’ responsibility to engage with the public. Such beliefs have motivated training programs, workshops and a research agenda from the National Academies of Sciences, Engineering, and Medicine on learning more about science communication. A resounding question remains for science communicators: what can we do better? A common intuition is that the main goal of science communication is to present facts; once people encounter those facts, they will think and behave accordingly. The National Academies’ recent report refers to this as the “deficit model”. But in reality, just knowing facts doesn’t guarantee that one’s opinions and behaviours will be consistent with them. For example, many people “know” that recycling is beneficial but still throw plastic bottles in the trash. Read More here