Friday, 13 December 2013

Replacing CFL bulbs with LEDs

I replaced the traditional incandescent bulbs in my flat some years ago and it was a no-brainer.  The compact fluorescent lights (CFLs) were coming down rapidly in price, were available in all the sizes I needed and of course consumed far less energy than the heated filament bulbs they replaced.  I was happy to be saving energy and cutting my electricity bills.  All was good with the world, even though some of the bulbs took a while to "warm up" before they gave full output. I could live with that in most places.

Scroll forward to 2013 and I've now replaced a few of the first CFL bulbs with another new technology: mains powered LED bulbs.  I needed to replace some of the older CFLs that had failed over the 8 or 9 years I'd had them, and Tesco (of all people) had new 4W LED bulbs on special offer at around £8 per bulb.  Now that's still a lot more than CFLs (which were being given away at one point and which you can quite happily get in pound shops these days).  However I wanted to try them out as I have four downlighters in the kitchen that are used quite a bit and where I could really do with the crisper, bluer, more immediate light from the LEDs.

I thus spent £32 on replacing 4 x 14W CFLs with 4 x 4W LEDs (compared to the 4 x 100W reflectors that were there when I moved in over a decade ago!).  I liked the light and there's no "warming up" period (or at least none that I can detect).  Whilst the LED bulbs were more expensive, the energy savings from having them on for a few hours per day on a regular basis mean that I should be financially in front after 5 years or so, and the LED bulbs have a much longer predicted lifespan.
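Just to sanity check that payback estimate, here's a quick sketch of the arithmetic. The £32 outlay and the bulb wattages are from above; the hours of use per day and the ~15p/kWh unit price are my assumptions, not figures from my actual bills:

```python
# Hypothetical payback calculation for swapping 4 x 14W CFLs for 4 x 4W LEDs.
def payback_years(outlay_gbp, old_watts, new_watts, hours_per_day,
                  price_per_kwh_gbp):
    """Years until the energy savings repay the purchase price."""
    saved_kwh_per_year = (old_watts - new_watts) / 1000.0 * hours_per_day * 365
    saved_gbp_per_year = saved_kwh_per_year * price_per_kwh_gbp
    return outlay_gbp / saved_gbp_per_year

# 4 bulbs dropping from 14W to 4W each, ~3 hours/day, ~15p per kWh
years = payback_years(32.0, 4 * 14, 4 * 4, 3, 0.15)
print(round(years, 1))  # roughly 5 years, matching the estimate above
```

Tweak the hours per day and unit price to match your own usage and tariff; the payback period scales accordingly.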

With that good experience under my belt, I started to wonder if there were other bulbs I should consider replacing in my flat.  The top contenders are the lounge standard lamp that is on for a few hours every evening, my bedside light (ditto) and the outside light above my front door (which could really do with a brighter light with no warm up whilst I'm trying to go up or down my iron staircase). As a result of discussions at the Footpaths Group at Loughborough University I started to wonder what the embodied energy of the various bulbs is, and that's where things got a lot more complicated.

Embodied energy is the energy required to actually make and distribute the product. The more complex the processing required to make a product and/or the more raw materials requiring high energy processing, the higher the embodied energy is.  The US Government funded an analysis of the embodied energy in different types of bulbs:

  • Incandescent bulbs require an average of 42MJ per 20 million lumen hours,
  • CFLs require an average of 170MJ per 20 million lumen hours,
  • Current LEDs require an average of 343MJ per 20 million lumen hours.
The "MJ per 20 million lumen hours" might seem a rather odd set of units, but it is basically used to standardise the amount of energy (MJ - megajoules) over a fixed light output (the "per 20 million lumen hours") irrespective of the actual light output of the individual bulbs.  This lets you compare the embodied energy required to replace, for example, a single very bright incandescent bulb with several CFLs or LEDs with lower light outputs (in lumens).

Now this looks bad for LEDs but the same report also shows the energy used actually lighting the bulbs. The traditional bulb consumes 15,100 MJ per 20 million lumen hours, the CFLs use up 3780 MJ per 20 million lumen hours and current LEDs sip just 3540 MJ per 20 million lumen hours.  This confirms that replacing traditional incandescent bulbs with either CFLs or LEDs is a good thing to do.  Even though both CFLs and LEDs have higher embodied energy, they consume about a fifth of the energy required to provide the standardised light output, and the energy used in lighting the incandescent bulbs dwarfs the embodied energy.

The tricky call is replacing CFLs with current LEDs though.  Whilst the energy used in powering the LEDs is 240MJ less per 20 million lumen hours, they require 173MJ more to make.  So they do win out in the end energy-wise, but the break-even point comes rather late in their life.
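To make that trade-off concrete, here's a small sketch using the report's figures quoted above (170MJ vs 343MJ embodied, 3780MJ vs 3540MJ in use, all per 20 million lumen hours). The interpretation of break-even as a fraction of the standardised light output is mine:

```python
# When does an LED's lower running energy repay its extra embodied energy,
# compared to the CFL it replaces?
STANDARD_LMH = 20e6  # the report's standardised 20 million lumen hours

def break_even_lumen_hours(extra_embodied_mj, usage_saving_mj):
    """Lumen hours of light output needed before the LED's extra
    embodied energy is repaid by its lower running energy."""
    return extra_embodied_mj / usage_saving_mj * STANDARD_LMH

# 343 - 170 = 173MJ extra to make; 3780 - 3540 = 240MJ saved in use
lmh = break_even_lumen_hours(343 - 170, 3780 - 3540)
print(round(lmh / STANDARD_LMH, 2))  # ~0.72 of the standardised output
```

In other words, on these numbers the LED only pulls ahead of the CFL after delivering roughly 72% of the standardised 20 million lumen hours of light, which is why the break-even comes so late.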

I'm probably still going to replace my high use CFLs with LEDs.  However, waiting a while may help: mains powered LEDs are still a relatively young technology and the energy use, embodied energy and prices are all likely to fall over the next few years.  My existing CFLs are still pretty good, so it's probably best to wait until they break, and then make the switch to LEDs.

Saturday, 30 November 2013

LibCampUK 13 - the highlights for me

I've just spent the day in the brilliant new Library of Birmingham building at the LibCampUK13 "unconference". This is a meeting of librarians, library staff, library consultants, library suppliers and even a few of us lowly techies allowed out into the light from our basements. It being an "unconference", the agenda is created on the fly on the day from things that the attendees want to talk about. It's not about presenting papers or holding workshops - it's about sitting around in huddles sharing problems, telling stories and swapping ideas. It's manic, chaotic and absolutely fantastic. It being full of librarians, there was also plenty of cake on hand (including a competition won by a lady who had made a superb Moomin cake).

This blog post serves as a memory jogger for some of the useful things I came across during the day. There was so, so, so much more than I can record here, as there were five or so parallel threads going on. You were encouraged to "vote with your feet" and move between threads during each session if you got bored or wanted more variety, but I found all the groups I attended engaging and so stayed put!

My first session was on social media use in libraries. This was a really popular topic - so much so that there were two separate groups discussing it simultaneously. In our group I learnt some really useful tips from the social media aware folk (including a lady who works for a pub chain rather than a library service!). For example, tweeting events in your local area that you don't run but which might be of interest to your followers both gets your tweets retweeted more widely (and thus attracts new followers) and encourages the event organisers to follow you and retweet some of your messages by way of return. Location based searches can be useful as well - see who is talking about topics you are interested in within your geographic area and then target their conversations with your own replies. Social media analytics are of interest to many: can they be sure that the effort that they put into social media interactions is actually reaching the desired target markets? Keeping up with the various social media services is also an issue: although Twitter and Facebook are the biggies, Pinterest, LibraryThing and Tumblr are also used (hardly anyone seems to care about Google Plus though!). Different social media are used by different groups, and even age isn't a sure-fire targeting mechanism (some schools say students are into Tumblr as the latest thing, but another school librarian said her students viewed it as a bit last year and were all over Instagram now). Some sites have issues with some (or all!) social media being blocked - useful to bear in mind if you're trying to reach certain groups (schools and the NHS especially).

After grabbing a quick coffee and trying to grab a charge on my laptop and phone, I headed upstairs to the Open Archives session. It was already in full swing when I crept in and I was surprised to hear that some educational institutions still seem to undervalue open archives or data repositories as a way of spreading the message about their research (despite being happy to send data to folk who do manage to seek them out). RCUK mandates on archiving the data supporting publications and on Gold/Green journal article publishing with institutional repositories are helping, as are some JISC initiatives. There's lots of active work in this area though, so it's a "hot topic" at the moment. One chap said he was from FE and that there was a large and mostly untapped market for repositories in FE colleges. There was also some discussion of open data and the benefit it provides for "mash ups" (especially if library folk are hacking on open source code).

Next was lunch, followed by three more sessions. The first afternoon session I attended was on digitisation. Some interesting work is being done on private digitisation initiatives, especially for things like maps (which the chap who pitched and initiated the session was really into). There was some discussion on the plus and minus aspects of things like Google Books: I was on the +ve side as we've used it in LORLS reading lists and it's been really useful and popular, but there were some complaints over the lack of transparency on the quality of the OCR behind the page scans (which I can understand, as that's an expensive thing to do and probably isn't Google's primary aim at the moment). I heard about a group of heritage conservation volunteers called NADFAS who had helped digitise and preserve works in some special libraries (though it needs librarian input to ensure that metadata about digital objects is captured).

Back in the main theatre, the middle session of the afternoon I dropped into was about gadgets, a topic close to my techie heart. Most folk in the group held up smart phones or tablets and said lots of their users had them. There was some talk about managers buying "iPads" without any real idea of how they would be loaned out or to whom. Some sites have issues with setting up shared tablets as the software eco-systems on them don't really encourage it, whereas others (ones using "bump-in-the-wire" wireless portals for network authentication) have fewer issues. I asked if other sites had any great solutions to students trailing power leads everywhere (as policies and telling them off don't really work): one chap said that at his site even the new sofas had power sockets in the arms and all the tables had power sockets too. I managed to also slip in a mention that NFC capable phones/tablets can also pick up RFID tags, which seemed to interest several folk!

I then stayed put for the last session of the day, which in my group was on open source. The immediately useful take home for me was Library Box, a content hosting wifi hotspot that I'd not come across before. Great for providing educational/library resources in "pop up" environments, especially where there isn't decent wifi or 3G coverage (e.g. book events in public parks). There was some discussion about appropriate open source software for managing small library catalogues: I suggested Koha, but one of the facilitators suggested it was too complex for really small libraries and that she'd made good use of Drupal with cataloguing extensions. Another chap was looking for suggestions for school data repositories - DSpace and EPrints were mentioned, but again they may be too complex to set up and maintain, whereas a CMS like WordPress might work fine and be more familiar to school teachers.

So some great stuff there, and so much more in the other sessions I didn't get to (and probably in the bar after the meeting which I didn't go to either). LibCamp is definitely on my list to attend again next year... assuming they let Shambrarians like me slip in again!

Wednesday, 27 November 2013

Goodreads, Perl and Net::OAuth::Simple

Part of my day job is developing and gluing together library systems.  This week I've been making a start on doing some of this "gluing" by prototyping some code that will hopefully link our LORLS reading list management system with the Goodreads social book reading site.  Now most of our LORLS code is written in either Perl or JavaScript; I tend to write the back end Perl stuff that talks to our databases and my partner in crime Jason Cooper writes the delightful, user friendly front ends in JavaScript.  This means that I needed to get a way for a Perl CGI script to take some ISBNs and then use them to populate a shelf in Goodreads. The first prototype doesn't have to look pretty - indeed my code may well end up being a LORLS API call that does the heavy lifting for some nice pretty JavaScript that Jason is far better at producing than I am!

Luckily, Goodreads has a really well thought out API, so I lunged straight in. They use OAuth 1.0 to authenticate requests to some of the API calls (mostly the ones concerned with updating data, which is exactly what I was up to) so I started looking for a Perl OAuth 1.0 module on CPAN. There's some choice out there! OAuth 1.0 has been round the block for a while so it appears that multiple authors have had a go at making supporting libraries with varying amounts of success and complexity.
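As an aside, the signing work that these libraries all do for you is conceptually quite simple. Here's a rough sketch of OAuth 1.0's HMAC-SHA1 signature (in Python rather than Perl, purely to keep it short and dependency-free); the consumer key, token, nonce and timestamp below are made-up values for illustration, not real Goodreads credentials:

```python
# A minimal sketch of what an OAuth 1.0 library does when it signs a
# request: build a "base string" from the method, URL and sorted,
# percent-encoded parameters, then HMAC-SHA1 it with the two secrets
# (see RFC 5849 for the full details this glosses over).
import base64
import hashlib
import hmac
from urllib.parse import quote

def oauth1_signature(method, url, params, consumer_secret, token_secret):
    # 1. Percent-encode keys and values, sort them, join as k=v&k=v...
    norm = "&".join(
        f"{quote(k, safe='')}={quote(v, safe='')}"
        for k, v in sorted(params.items())
    )
    # 2. Base string: METHOD & encoded-URL & encoded-parameter-string.
    base = "&".join([method.upper(), quote(url, safe=""), quote(norm, safe="")])
    # 3. Signing key: consumer secret and token secret joined by '&'.
    key = f"{quote(consumer_secret, safe='')}&{quote(token_secret, safe='')}"
    digest = hmac.new(key.encode(), base.encode(), hashlib.sha1).digest()
    return base64.b64encode(digest).decode()

sig = oauth1_signature(
    "GET", "https://www.goodreads.com/api/auth_user",
    {"oauth_consumer_key": "key", "oauth_nonce": "abc123",
     "oauth_timestamp": "1386000000", "oauth_token": "token",
     "oauth_signature_method": "HMAC-SHA1", "oauth_version": "1.0"},
    "consumer-secret", "token-secret")
print(sig)  # a 28 character base64 encoded HMAC-SHA1 signature
```

The signature then travels in the Authorization header (or as an oauth_signature parameter), which is exactly what the Perl module below is assembling for us.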

So in the spirit of being super helpful, I thought I'd share with you the prototype code that I knocked up today.  It's far, far, far from production ready and there are probably loads of security holes that you'll need to plug.  However it does demonstrate how to do OAuth 1.0 using the Net::OAuth::Simple Perl module and how to do both GET and POST style (view and update) Goodreads API calls.  It's also a great way for me to remember what the heck I did when I next need to use OAuth calls!

First off we have a new Perl module that I called Goodreads.pm.  It's a subclass of the Net::OAuth::Simple module that sets things up to talk to Goodreads and provides a few convenience functions. It's obviously massively stolen from the example in the Net::OAuth::Simple perldoc that comes with the module.


package Goodreads;

use strict;
use Carp;
use URI;
use HTTP::Request;
use Net::OAuth::ProtectedResourceRequest;
use base qw(Net::OAuth::Simple);

sub new {
    my $class  = shift;
    my %tokens = @_;

    return $class->SUPER::new( tokens => \%tokens,
                               protocol_version => '1.0',
                               return_undef_on_error => 1,
                               urls   => {
                                   # The Goodreads OAuth endpoints (the URLs
                                   # were lost from the original post)
                                   authorization_url => 'https://www.goodreads.com/oauth/authorize',
                                   request_token_url => 'https://www.goodreads.com/oauth/request_token',
                                   access_token_url  => 'https://www.goodreads.com/oauth/access_token',
                               },
                             );
}

sub view_restricted_resource {
    my $self = shift;
    my $url  = shift;
    return $self->make_restricted_request($url, 'GET');
}

sub update_restricted_resource {
    my $self         = shift;
    my $url          = shift;
    my %extra_params = @_;
    return $self->make_restricted_request($url, 'POST', %extra_params);
}

sub make_restricted_request {
    my $self = shift;
    croak $Net::OAuth::Simple::UNAUTHORIZED unless $self->authorized;

    my( $url, $method, %extras ) = @_;

    my $uri   = URI->new( $url );
    my %query = $uri->query_form;
    $uri->query_form( {} );

    $method = lc $method;

    my $content_body = delete $extras{ContentBody};
    my $content_type = delete $extras{ContentType};

    my $request = Net::OAuth::ProtectedResourceRequest->new(
        consumer_key     => $self->consumer_key,
        consumer_secret  => $self->consumer_secret,
        request_url      => $uri,
        request_method   => uc( $method ),
        signature_method => $self->signature_method,
        protocol_version => $self->oauth_1_0a ?
                                   Net::OAuth::PROTOCOL_VERSION_1_0A :
                                   Net::OAuth::PROTOCOL_VERSION_1_0,
        timestamp        => time,
        nonce            => $self->_nonce,
        token            => $self->access_token,
        token_secret     => $self->access_token_secret,
        extra_params     => { %query, %extras },
    );
    $request->sign;
    die "COULDN'T VERIFY! Check OAuth parameters.\n"
        unless $request->verify;

    my $request_url = URI->new( $url );

    my $req = HTTP::Request->new(uc($method) => $request_url);
    $req->header('Authorization' => $request->to_authorization_header);
    if ($content_body) {
        $req->content_type($content_type);
        $req->content_length(length $content_body);
        $req->content($content_body);
    }

    my $response = $self->{browser}->request($req);
    return $response;
}

1;

Next we have the actual CGI script that makes use of this module. This shows how to call the Goodreads module (and thus Net::OAuth::Simple) and then do the Goodreads API calls:


use strict;
use CGI;
use CGI::Cookie;
use Goodreads;
use XML::Mini::Document;
use Data::Dumper;

my %tokens;
$tokens{'consumer_key'} =  'YOUR_CONSUMER_KEY_GOES_IN_HERE';
$tokens{'consumer_secret'} = 'YOUR_CONSUMER_SECRET_GOES_IN_HERE';

my $q = new CGI;
my %cookies = fetch CGI::Cookie;

if($cookies{'at'}) {
    $tokens{'access_token'} = $cookies{'at'}->value;
}
if($cookies{'ats'}) {
    $tokens{'access_token_secret'} = $cookies{'ats'}->value;
}

if($q->param('isbns')) {
    $cookies{'isbns'} = $q->param('isbns');
}

my $oauth_token = undef;
if($q->param('authorize') == 1 && $q->param('oauth_token')) {
    $oauth_token = $q->param('oauth_token');
} elsif(defined $q->param('authorize') && !$q->param('authorize')) {
    print $q->header,
        $q->h1('Not authorized to use Goodreads'),
        $q->p('This user does not allow us to use Goodreads');
    exit;
}

my $app = Goodreads->new(%tokens);

unless ($app->consumer_key && $app->consumer_secret) {
    die "You must go and get a consumer key and secret from Goodreads\n";
}

if ($oauth_token) {
    if(!$app->authorized) {
        GetOAuthAccessTokens();
    }
    StartInjection();
} else {
    # The callback should be this script's own URL (the value was lost
    # from the original post)
    my $url = $app->get_authorization_url(callback => '');
    my @cookies;
    foreach my $name (qw(request_token request_token_secret)) {
        my $cookie = $q->cookie(-name => $name, -value => $app->$name);
        push @cookies, $cookie;
    }
    push @cookies, $q->cookie(-name => 'isbns',
                              -value => $cookies{'isbns'} || '');
#    print $q->redirect($url);
    print $q->header(-cookie => \@cookies,
                     -status => '302 Moved',
                     -location => $url);
}


sub GetOAuthAccessTokens {
    foreach my $name (qw(request_token request_token_secret)) {
        my $value = $q->cookie($name);
        $app->$name($value);
    }
    ($tokens{'access_token'},
     $tokens{'access_token_secret'}) =
        $app->request_access_token(
            # Again, the callback should be this script's own URL
            callback => '',
        );
}

sub StartInjection {
    my $at_cookie = new CGI::Cookie(-name => 'at',
                                    -value => $tokens{'access_token'});
    my $ats_cookie = new CGI::Cookie(-name => 'ats',
                                     -value => $tokens{'access_token_secret'});
    my $isbns_cookie = new CGI::Cookie(-name => 'isbns',
                                       -value => '');
    print $q->header(-cookie=>[$at_cookie,$ats_cookie,$isbns_cookie]);
    print $q->start_html;

    my $user_id = GetUserId();
    if($user_id) {
        my $shelf_id = LoughboroughShelf(user_id => $user_id);
        if($shelf_id) {
            my $isbns = ref($cookies{'isbns'}) ? $cookies{'isbns'}->value
                                               : ($cookies{'isbns'} || '');
            print $q->p("Got ISBNs list of $isbns");
            AddBooksToShelf(shelf_id => $shelf_id,
                            isbns => $isbns);
        }
    }
    print $q->end_html;
}

sub GetUserId {
    my $user_id = 0;
    my $response = $app->view_restricted_resource(
        'https://www.goodreads.com/api/auth_user');
    if($response->content) {
        my $xml = XML::Mini::Document->new();
        $xml->parse($response->content);
        my $user_xml = $xml->toHash();
        $user_id = $user_xml->{'GoodreadsResponse'}->{'user'}->{'id'};
    }
    return $user_id;
}

sub LoughboroughShelf {
    my $params;
    %{$params} = @_;

    my $shelf_id = 0;
    my $user_id = $params->{'user_id'} || return $shelf_id;
    my $response = $app->view_restricted_resource(
        'https://www.goodreads.com/shelf/list.xml?key='
        . $tokens{'consumer_key'} . '&user_id=' . $user_id);
    if($response->content) {
        my $xml = XML::Mini::Document->new();
        $xml->parse($response->content);
        my $shelf_xml = $xml->toHash();
        foreach my $this_shelf (@{$shelf_xml->{'GoodreadsResponse'}->{'shelves'}->{'user_shelf'}}) {
            if($this_shelf->{'name'} eq 'loughborough-wishlist') {
                $shelf_id = $this_shelf->{'id'}->{'-content'};
            }
        }
        if(!$shelf_id) {
            $shelf_id = MakeLoughboroughShelf(user_id => $user_id);
        }
    }
    print $q->p("Returning shelf id of $shelf_id");
    return $shelf_id;
}

sub MakeLoughboroughShelf {
    my $params;
    %{$params} = @_;

    my $shelf_id = 0;
    my $user_id = $params->{'user_id'} || return $shelf_id;

    my $response = $app->update_restricted_resource(
        'https://www.goodreads.com/user_shelves.xml?user_shelf[name]=loughborough-wishlist');
    if($response->content) {
        my $xml = XML::Mini::Document->new();
        $xml->parse($response->content);
        my $shelf_xml = $xml->toHash();
        $shelf_id = $shelf_xml->{'user_shelf'}->{'id'}->{'-content'};
        print $q->p("Shelf hash: ".Dumper($shelf_xml));
    }
    return $shelf_id;
}

sub AddBooksToShelf {
    my $params;
    %{$params} = @_;

    my $shelf_id = $params->{'shelf_id'} || return;
    my $isbns = $params->{'isbns'} || return;
    foreach my $isbn (split(',',$isbns)) {
        my $response = $app->view_restricted_resource(
            'https://www.goodreads.com/book/isbn_to_id?key='
            . $tokens{'consumer_key'} . '&isbn=' . $isbn);
        if($response->content) {
            my $book_id = $response->content;
            print $q->p("Adding book ID for ISBN $isbn is $book_id");
            $response = $app->update_restricted_resource(
                'https://www.goodreads.com/shelf/add_to_shelf.xml?name=loughborough-wishlist&book_id='
                . $book_id);
        }
    }
}

You'll obviously need to get a developer consumer key and secret from the Goodreads site and pop them into the variables at the start of the script (no, I'm not sharing mine with you!). The real work is done by the StartInjection() subroutine and the subordinate subroutines that it then calls once the OAuth process has been completed. By this point we've got an access token and its associated secret, so we can act as whichever user has allowed us to connect to Goodreads as them. The code will find this user's Goodreads ID, see if they have a bookshelf called "loughborough-wishlist" (creating it if they don't) and then add any books that Goodreads knows about with the given ISBN(s). You'd call this CGI script with a URL that passes a comma separated list of ISBNs in its isbns parameter.

Anyway, there's a "works for me" simple example of talking to Goodreads from Perl using OAuth 1.0. There's plenty of development work left in turning this into production level code (it needs to be made more secure for a start off, and the access tokens and secret could be cached in a file or database for reuse in subsequent sessions) but I hope some folk find this useful.

Friday, 22 November 2013

Post nuclear differences

In the wake of the Fukushima Daiichi nuclear power station disaster in early 2011, both Japan and Germany changed their stance on nuclear power generation. Some nuclear plants were shut down immediately and the rest will most likely be decommissioned within the next decade.  Both countries appeared to have widespread public support for this radical change in their national energy policy.
Solar city, science park, Gelsenkirchen. Source: Green Baroque Ins. Flickr under Creative Commons CC BY-NC 2.0 Licence

The effect these decisions have had on the two nations' responses to climate change is interesting.  Germany was already well into building out its solar & wind based renewable generation capacity before the earthquake and tsunami wrecked the reactors in Japan. With widespread community involvement in the investment in renewables, and a helpful financial and regulatory environment provided by the German authorities, they've been able to carry this forward.  They've still got fossil fuels in their mix, but they do still seem to be on track to meet their carbon emissions targets.  Germany already had a strong anti-nuclear movement and was planning on phasing out nuclear by 2036 anyway, so this event really accelerated that timetable.

Japan, on the other hand, has just announced during the UN COP19 climate change talks that it is going to have to substantially reduce its existing emissions reduction target.  It is building renewable generating capacity of course - practically every developed nation is.  However, it will not be enough to cope with losing all the nuclear generating capacity that is being removed, which prior to March 2011 contributed around 30% of the country's total generating capacity.  Japan is having to turn to increasing use of imported fossil fuels such as oil, coal and gas to supply its electricity.

I was wondering what we could learn from the different outcomes arising from what, at first, appear to be very similar decisions.  Germany had something of a head start as they were already aggressively building solar PV & wind generators. But they also have a geographic advantage over Japan: Germany is bigger, with more land available.  Solar & wind both have low "energy density", so to get a decent amount generated you need a lot of them covering lots of roofs & land. Japan is a relatively crowded country, so space is at a much higher premium.  As a comparison, Japan is 39th in the population density league table whereas Germany is 58th.  Japan does have potential for many gigawatts of renewable power generation though, as it has ample space in the seas around it for large scale off-shore wind farms.  There does appear to be a need to encourage more community involvement and investment in on-shore renewables though.
Fukushima Unit 4 with cranes working on stabilizing the site. Source: IAEA Imagebank under Creative Commons CC BY-NC-ND 2.0 Licence

Both nations have to bear the costs of decommissioning their nuclear infrastructure, which will most likely be a long and expensive task.  Again Germany has an advantage - it only had 17 nuclear power stations operating prior to March 2011, whereas Japan had over 50.  Germany was also already well into decommissioning quite a few reactors, especially from the former East Germany.

Japan also has the expense and difficulty of cleaning up Fukushima itself to deal with. That's going to be a big drain on the resources of both its owner Tepco and the Japanese government. The clean up may well be competing for the funds, people and time required to ramp up construction of renewables, even though those renewables are part of the solution to the overall problem. Indeed, one wonders if the exclusion zone around Fukushima might well end up being a good place to site renewables, with their relatively low maintenance requirements (so fewer people need to spend time in the potentially more radioactive areas).  At least they might provide some economic payback to the people whose land is otherwise now worthless.

Japan's economy has taken some serious blows over the last few years, which also puts them at a disadvantage against Germany. Germany is the economic power house of the EU, and so it can afford to invest in the capital cost of renewables. Indeed it's a positive cycle for the Germans: the more renewables they invest in, the more insulated they are from fossil fuel price rises, which improves their competitiveness and increases their income, part of which they can invest in more renewables.  Japan has the opposite problem - its emergency switch to large scale fossil fuels to replace the nuclear power stations is costing the Japanese power companies an extra 3.6 trillion yen in 2013 over their costs in 2010, before the disaster.  Things are likely to get worse for Japan before they start to get better.

There is one overarching "take home message" I pick up from the two countries' different reactions to their changes in energy policy.  The sooner a nation starts to make a large scale switch to distributed renewable power generation, the better placed it is likely to be to deal with sudden, external changes in traditional centralized power generation.  In this case it was the rapid removal of all nuclear capacity, but in the future who knows what it will be?  Gas pipelines cut off as part of national sabre rattling? Wars leading to rapid rises in global oil prices? Coal shipments disrupted by industrial unrest?  All of these could affect national grids that rely too heavily on one particular fuel source, especially if that fuel is controlled by others.  We all need to be investing in clean, distributed energy generation to make our nations, towns, cities and communities more resilient in the face of these unexpected changes.

Saturday, 9 November 2013

Replacing Green Levies with Brown Ones

There's been much talk in the UK media and within political circles recently about the costs associated with so called "green levies".  These are additional costs added to energy bills to help fund climate change mitigations such as increased power production from renewable sources, carbon reduction strategies and the all important energy efficiency measures for low income and vulnerable groups in society.  The principle is that the more energy you use, the more you end up paying to help turn that energy generation into a cleaner, greener form and help poorer folk save energy.

Now some people are campaigning that these levies need to be reduced or removed completely and/or moved to general taxation.  The rising energy prices from the Big Six energy companies are hitting the "hard working" people of the UK and some politicians are sensing a quick, popular vote winner in appearing to do something to cut these bills.  Moving some of the green funding measures to general taxation is probably the most progressive option as it moves the cost towards those who can afford to pay more, even if they themselves have already reduced their energy demands. Of course that might be rather unpopular with people in power who tend to pay more of such taxes, and there is a bit of recent history of socially responsible tax payer funded schemes facing the axe.

But what if these green levies are removed completely and we succeed in stalling UK plc's green economy?  Not only will that affect quite a lot of jobs (many in the private sector - that bit of the market place that is supposed to be pulling us out of the economic doldrums), but it will also mean less investment being made in climate change mitigation technologies, and we'll end up putting a lot more CO2 into the atmosphere as a result.  We may well find that it becomes impossible to meet our legally binding targets on carbon emissions, which may have some direct economic costs if we're fined, or if foreign competitors manage to lock the "dirty man of Europe" out of future deals.

One argument put forward by those wishing to remove the green levies is that they don't think that climate change, global warming, call it what you will, is a man-made or even man-influenced effect.  To them it doesn't really matter how much CO2 or other greenhouse gases we emit, the climate will just do its own thing, and actually it will just fluctuate a bit and really we can all just carry on with business as usual.  Ignore the "ecomentalists" and get on with a continuation of 20th Century life into a bright, energy guzzling future.

Unfortunately people with such views currently seem to hold some of the reins of power in the UK, so there's a distinct possibility that at least some of the climate change mitigation funding may be lost.  If that does come to pass, I'd like to propose something to replace them: "brown levies".  Such levies will not be used to fund climate change mitigation strategies but instead fund climate change adaptation strategies. For example such things as building better sea defences, increasing the use of permeable paving systems in urban areas to reduce flooding, covering more of the UK countryside with polytunnels or glasshouses to reduce weather event effects on agriculture, etc, etc.

Some of that is happening now, but at a relatively low level, so the brown levy wouldn't need to be large to start with, but we do need to fund it.  At the moment what funding there is comes from disparate sources such as water bills, council taxes and general taxation, but it is hidden away rather than splashed all over the front pages.  Let's bring it out into the light as a nice, visible set of costs, in the same way that the green levies have been brought centre stage by having them bundled together in energy bills.  That way people can see what they have to pay to adapt to climate change.  What goes into the levies could be given to one of Parliament's climate change committees to look after, or be debated every year in the House.

If the climate change deniers are right, the brown levies will stay small, and possibly even reduce as the climate swings naturally back to its late 19th/early 20th century state.  Nothing will need to be decided on by Parliament and everyone gets to laugh and point at members of the Green Party, Friends of the Earth and Transition Towns.  The worst that will happen is that we'll have funded some useful short term environmental protections in coastal towns and on flood plains, which will have reduced their insurance costs and protected some local industries.  The sort of thing we've been doing for years.

Of course if those folk from the Green Party, Friends of the Earth and Transition Towns are right about man-made climate change, and the climate deniers in power right now do manage to wreck the current green levies funding climate change mitigation strategies, then those brown levies will have to go up over time.  And up.  And up.  Adapting to global climate change, even with the moderate changes we're likely to see in the UK, is likely to be very expensive.  Possibly more expensive than the cost of mitigating climate change in the first place.  And of course you'll still be paying for the higher priced fossil fuels themselves, as UK plc won't have been made more energy efficient or built out its low carbon power generation sufficiently.  We may well be less competitive with some of our neighbours, who will be more self-reliant thanks to locally sourced power, so those increased costs will come at the same time as reduced trading incomes.  Oh, and there are those targets we won't have met to deal with as well.

So there's the glove slapped down to the climate change deniers in power trying to reduce green levies: put your (and everyone else's) money where your mouths are and agree to introduce legally binding climate adaptation brown levies if you remove climate mitigation green levies.  What do you, in your world view, have to lose after all?

Sunday, 20 October 2013

Nuclear reactor design validation and space industries

I was chewing the fat over nuclear reactor designs with a chum on Twitter when something struck me.  One of the big issues in producing new nuclear designs is finding somewhere to build a prototype to validate the design.  There are lots of interesting, and potentially much safer and cleaner, fission reactor designs being mooted, but most of them stay as computer simulations or paper designs because it's increasingly difficult to get regulatory approval to build the reactors.

This is understandable in a way: few countries want to make it easy for companies to build dodgy designs that could leak radioactivity into the environment.  This means that reactor designs usually need to be carefully approved, which is very, very time consuming and thus very, very costly.

It also plays into the hands of the incumbent reactor manufacturers.  Their designs are often evolutions of older, well understood reactors and the companies know the ropes of regulatory approval.  The regulators also know the companies and people involved.  These same companies also have a vested interest in not allowing some of the GenIV designs to be rapidly developed, as they make quite a bit of their income from selling fuel rods for existing fission reactor designs, which some of the newer designs do away with (most notably pebble bed reactors and molten salt reactors using liquid fuels).

So onto my wacky idea: why not do initial validation of new reactor designs in space?  There are some distinct advantages:

a) If the design fails, a leak isn't going to be an ecological disaster.  If you put a reactor on the Moon or on an artificial satellite outside of Low Earth Orbit (LEO), reactor leaks aren't going to get back into Earth's biosphere terribly easily.

b) Space access charges are falling.  Companies like Elon Musk's SpaceX are reducing the price of getting mass into orbit.  At the moment most of this is targeted at getting equipment and people into LEO (for example supplying the ISS), but the basic launcher technology developments are ultimately aimed at getting mass to Mars.  A commercial funding opportunity part of the way along that road would help the commercial space launch industry as well as the nuclear industry.

c) Whilst getting the equipment into space is expensive, it might be offset against the cost of the massively constructed containment buildings that are often required on Earth but wouldn't be required for the validation reactors in space.  Who cares if you've got minimal containment on the reactor if there's no biosphere to pollute?

d) Advanced robotics mean that you might well not need to send people up with these reactors, so you don't even have to worry about worker contamination/decontamination.

e) No extra nuclear waste will be generated on Earth, which is good seeing as Governments are still flapping about what to do with the stuff we've already generated.  If anyone is worried, provide the ability to fire the waste into the Sun!

There are downsides of course (probably lots of them considering this is just recording one of my brain farts!):

a) Getting radioactive material into space is nearly as tricky from a regulatory point of view as building the reactors.  Folk don't want radioisotopes being blown up all over the sky for some reason.  Radiothermal generators have been sent out on spacecraft though, so it's not insurmountable, especially if falling space access charges mean that you can split your fuel load into a number of small consignments spread over many launches (so one launch failure doesn't mean a full reactor load of fuel being exposed to the biosphere).  Fuel containment and escape options may also help, especially as we have experience of equipment that has survived rocket explosions in the past.

b) Lack of gravity may affect some reactor designs.  This would require artificial gravity to be provided (simulated by rotating the reactor to give a 1g acceleration).  Reactor designs that don't suffer from this may of course be of interest for space exploration applications themselves.

c) Validation of a reactor design in space will still then require national regulators to accept the results.  This is unproven and may be just as costly/long winded as getting the prototypes approved for Earth bound deployment.

d) Hostile environment: whilst space doesn't have a biosphere to pollute, it does have micro-meteorites, difficult thermal gradients (very cold in shadow, boiling in the Sun), solar wind, etc, etc to deal with.  Lots of engineering fun to be had!
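On the fuel-splitting point in (a), the risk arithmetic is easy to sketch.  The numbers below are entirely made up for illustration (the failure probability and fuel load are not real launch statistics):

```python
# Illustrative only: p is an assumed per-launch failure probability,
# fuel_kg an assumed total reactor fuel load -- both hypothetical.
p, fuel_kg = 0.02, 1000.0

for n in (1, 5, 20):
    worst_single_release = fuel_kg / n        # cap on any one accident
    p_any_failure = 1 - (1 - p) ** n          # chance of losing at least one consignment
    expected_loss = p * fuel_kg               # identical for every n: splitting doesn't change it
    print(f"{n:>2} launches: worst single release {worst_single_release:6.1f} kg, "
          f"P(any failure) {p_any_failure:.3f}, expected loss {expected_loss:.0f} kg")
```

The expected total loss is unchanged by splitting, but the worst credible single-event release drops from a full reactor load to one consignment, which is the property that matters for the containment and regulatory argument.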
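On the artificial gravity point in (b), the required spin rate follows from setting the centripetal acceleration equal to g (a = ω²r, so ω = √(g/r)).  A quick sketch, with the radii picked purely for illustration:

```python
import math

def spin_rate_rpm(radius_m, g=9.81):
    """Rotation rate (rpm) at which centripetal acceleration omega^2 * r equals g."""
    omega = math.sqrt(g / radius_m)            # required angular speed in rad/s
    return omega * 60.0 / (2.0 * math.pi)      # convert rad/s to revolutions per minute

for r in (2.0, 10.0, 50.0):
    print(f"radius {r:5.1f} m -> {spin_rate_rpm(r):5.1f} rpm")
```

Small radii demand fast spin (over 20 rpm at a 2 m radius), which is hard on plumbing, bearings and instrumentation - another reason designs insensitive to gravity would be doubly interesting.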

Well, just an idea.  Worth kicking out there for comment and thoughts though!

Saturday, 14 September 2013

Allotment waiting lists & community gardens

In many parts of the country there are long waiting lists for allotment plots. In some large towns and cities these waiting lists might stretch for years. People often complain that councils should "do something" about them and often mention getting groups together to demand more space for allotments. Unfortunately few councils are in the position to make new land and what property holdings they do have are now under pressure to return a decent income, especially with all the deep cuts in their budgets recently.

At the same time we have community gardens struggling to keep going due to a lack of volunteers.  These shared growing spaces are often set up to provide garden beds for those with no access to their own backyards, but they can also give beds to schools that lack their own gardens for the children to grow in. Community gardens should really be a place to help build community spirit & cohesion, but it seems many of them struggle to recruit enough volunteers interested and/or knowledgeable in gardening to sustain their development.

So here's an idea for councils: take your allotment waiting list and point people on it to their local community garden. Folk who take up the challenge and help at the community garden get to move more rapidly up the waiting list. This has a number of benefits:

* councils are seen to be "doing something" about allotment waiting lists,

* community gardens are likely to get a steady flow of volunteer gardeners, some of whom might well stick around even after getting their own allotment plot,

* newcomers to allotments & gardening get to sample what the work is like, the ups and downs of growing, & pick up hints and tips before they take up their own plot. Some folk like the idea of gardening more than they like the actual work, and so this might "weed them out" before a precious allotment plot is allocated to them,

* links will be forged between the community gardens & allotment sites, which may help with things like Tool Banks, harvest festivals, In Bloom schemes, etc.

Now some people might not want to do shared gardening and just want their own private plot. That's fine - they can just stay on the council's allotment waiting list. They might be leapfrogged by some more sociable, community minded folk, but they might also find some folk drop out of the list when they discover the time & effort commitment gardening imposes.

Sunday, 7 April 2013

Indestructible chickweed (or what's going to be drought tolerant this summer)

Today the Spring sunshine has finally shone and we've had a bit of warmth for gardeners in much of the UK.  And about time too - it's been a cold, dark Spring, following hot on the heels of a cold, dark Winter and a wet, dark Summer and Autumn in 2012.

Back in Autumn 2012 I sorted out the pots on my entrance balcony (yes, I'm a gardener with a first floor flat and no garden!), mostly to get rid of last year's annuals and pop in some Spring bulbs (which are now flowering by the way, no thanks to the local cats who've used all of my pots, tubs and troughs as toileting facilities for the last five months).  I brought in one home-made self-watering pot (made out of an old squash bottle) and put it on the window sill.

And then promptly forgot about it.  Until just before Xmas, when I noticed the compost had a little green shoot showing.  I'd no idea what it was, so I gave it a good drowning, which meant it would have enough water to last until I got back home after my Xmas visits and could hopefully find out what it was.

When 2013 found me back in my flat I discovered that what I was growing wasn't some lovely self-sown annual flower worthy of the Higgledy Garden flower beds, but blinking chickweed.  Oh well, never mind.  I promptly ignored the pot again, intending to empty it out and reuse it in the Spring when I started to sow veggies and herbs in the flat again (yeah I know - I'm lazy and slovenly).

Well, Spring, as I said, is finally dragging itself into being, and so I was tidying up the pots on the north-facing windowsill.  And look at my little chickweed now:

Lush and green, eh?  Little flowers, trailing stems, the very picture of healthy plant life!

But wait: I've not watered that pot since Xmas.  I know it's a self-watering pot, but that just means that it can last a week or two when filled up, not four months.  The compost in it is old and exhausted and bone dry! And yet little Miss Chickweed seems to be doing fine thank you very much.  Amazing hardiness.  I can only assume it's nipping across the hallway to the downstairs loo in the middle of the night and having a quick drink in there.

Now I know it's been wet and cold and dark since forever (or at least it feels that way), but no doubt later in the year we might well be complaining about the heat and drought (well, we can wish at least).  We'll be looking at our wilting veg plots and empty rainwater butts, wondering what edible plant can cope with such drought.  With heat.  With cold.  With full sun.  With overcast darkness. With, er, more or less anything that you can throw at it.

I give you chickweed: the future of vegetable gardening in the Climate Change world.

Wednesday, 20 February 2013

Vegetarians and buffets

I've just been to an event at our local town hall that included a buffet lunch.  There wasn't any option on the booking form to indicate "vegetarian" or "vegan", so I sort of assumed that the buffet would include a good smattering of well labelled veggie food at the very least.  I was wrong: there were piles of pork pies and unlabelled pastries and sandwiches.  I had to rely on the old veggie tactic of asking omnivores whether they thought that this or that might be veggie.  This resulted in a few solid little cheese tartlets and a cheese and pickle sandwich.  Good job I'm not a strict vegan or I'd have been completely stuffed.

Now I can sort of understand people self-catering for family parties who aren't used to veggies doing something like this.  However this was being laid on by professional caterers.  Surely it's part of their professional skills to ensure that veggies, who after all make up a fair percentage of the UK population these days, are suitably catered for?

In fact why do folk serve non-veggie food at buffets at all?  I've never found a meat eater who can't eat food that happens to contain no meat.  Indeed most are quite happy to indulge in veggie foods, and on occasion I've been to buffets where the veggie options have run out far more rapidly than the number of veggies at the event would indicate.  When you think about it, just providing a range of veggie, or better yet vegan, finger food would cover all bases and leave everyone happy.  I've seen some absolutely fabulous vegan food served at buffets, pop-up kitchens or as street food at events.  I can't believe everyone tucking into those was a committed vegan, yet they didn't seem to be raising any complaints.

So come on event organisers and professional caterers: get your acts together and make sure there's well labelled food that everyone can eat.  If nothing else you'll be less likely to get a "rubbish buffet" response on the feedback form from the veggies you've invited.

Monday, 11 February 2013

Decentralised energy generation - a lesson for nuclear?

In the UK, as in many other countries, there's been rapid growth over the last five or so years in the amount of electrical power generated by small "micro-generation" installations.  Some of these, such as domestic solar PV set ups, are small and cheap enough to be paid for by individuals.  Other, usually somewhat larger, schemes (up to a couple of megawatts) are owned by community groups of one form or another.  Most of these micro-generation projects have been based on renewable "alternative" power sources such as solar, wind or hydro.  Millions of pounds are being invested in micro-generation systems - people see small, localised energy generation as a way of gaining some measure of control over, and involvement in, the energy they use.

Whilst I think decentralised power is a great idea, and I'm fully behind getting as many renewables on the grid as possible, we're still going to need something with a higher energy density that can provide energy when the sun doth not shine, nor the wind bloweth.  Grid level storage is part of the solution for this issue, as are grid interconnections between countries.  Of course baseload generation has traditionally been the domain of the large scale coal, gas and to some extent nuclear power stations.

Now coal is a no-no for future power stations - "clean coal" seems to be a pipedream, with carbon capture and storage still very much in the experimental stage and the economics looking decidedly shaky.  That's without considering the "carbon cost" of getting the coal out of the ground and shipping thousands of tons of it around the planet between the mines and the power stations.  Of course the UK doesn't produce much coal any more, so those "coal miles" are now seriously long journeys across the world.

George Osborne and chums seem intent on making a second "dash for gas", either by getting the UK hooked on volatile foreign imports or kickstarting an onshore gas boom based on hydraulic fracturing.  Gas has a far lower carbon footprint than coal, which is good, but, as the Americans seem to be rapidly discovering, fracking can be very environmentally damaging, and short well productivity lifespans can make the investment regime look more like a Ponzi scheme.  We're probably going to have gas in the UK Grid mix for many decades to come, but it's probably worth seeing if we can minimise it to handling rapid on/off load following applications.

Nuclear was seen by many, including surprisingly quite a few influential environmental champions, as the great white hope for large scale, low carbon energy.  UK governments of several hues have pinned their hopes on it as well, but as we've seen over the last few years the plans have been somewhat derailed with many reactor vendors pulling out of schemes.  Only EDF still seem to be in the race, and they'll probably only stay the course if they get underwriting guarantees from the Government and/or some buy in from the Chinese.

Part of the reason why nuclear has stumbled is the vast costs attached to the current Generation III+ designs.  Part of this is a result of the regulatory and insurance landscapes surrounding nuclear power, but part of it is that these nuclear power stations will be huge, centralised power generators, each delivering somewhere between a few hundred megawatts and a gigawatt into the National Grid.  Getting this sort of investment is tricky, especially in a recessionary period.

It's also a liability for the Grid.  The anti-wind crowd constantly moan about what might happen to the Grid if the wind doesn't blow and the turbines suddenly stop, but the turbines are geographically spread around the UK and weather forecasting can predict wind speeds in different areas fairly accurately out to a day or so.  If a turbine fails unexpectedly the Grid might lose a couple of megawatts - a relatively easy loss to deal with.  On the other hand if a gigawatt centralised nuclear station suddenly goes offline, the Grid management have their work cut out.  Decentralised generation can add resilience to the Grid.

Now I reckon there's another way forward for nuclear power: instead of looking at huge, monolithic GenIII+ designs that still need billions of pounds spent on development and construction, we should be turning to some of the innovative, smaller GenIV designs.  There are plenty of these designs floating around, and not all of them are just in the heads of the Internet's nuke geeks.  Indeed a number seem to have the backing of large companies (such as Toshiba's 4S) or governments (for example China's development of thorium reactors).

Let's imagine for a moment that one of these GenIV designs gets produced in a package that generates a few tens of megawatts and can be mass produced (maybe not in huge numbers, but with a production line at least capable of making a few hundred reactors).  This sort of power output puts it in between the community owned renewables and the large scale utility power stations.  If the vendors can get the price point down to £20-50 million per reactor then that puts them in the range of a large community energy project (Westmill solar PV farm for example managed to raise several million pounds in a community share issue in 2012 - I know because I've got some of those shares!).  It certainly makes them attractive for commercial generators as well - one fracking company recently raised over £20 million in a share listing just on the hint that there may be some potential future profits in fracked gas.
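The crude arithmetic on those numbers is worth doing.  Everything here is hypothetical - the price points and outputs are just the illustrative ranges from the paragraph above, not vendor quotes:

```python
# Hypothetical GenIV packages: (cost in £ millions, output in MW),
# taken from the illustrative £20-50M / "few tens of MW" ranges above.
candidates = [(20, 20), (50, 40)]

for cost_m, mw in candidates:
    # Capital cost per kilowatt of generating capacity
    gbp_per_kw = cost_m * 1_000_000 / (mw * 1_000)
    print(f"£{cost_m}M for {mw} MW -> £{gbp_per_kw:,.0f} per kW of capacity")
```

On those assumed figures a community share issue raising a few million pounds wouldn't buy a whole unit, but it could fund a meaningful stake in one - so ownership would more plausibly look like a consortium or co-op of communities than a single village scheme.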

Sealed reactors limit proliferation and radioactive contamination vectors, and many of the GenIV designs are designed from the ground up to be inherently walk-away safe.  Designs that burn up an appreciable fraction of their fuel load also limit nuclear waste issues (and the waste we've already got could be viewed as fuel in some design options, but that's another topic).

Some folk are opposed to nuclear reactors of any design, so this idea isn't likely to be popular with them.  However having the option of a community sized reactor with a 5-10 year interval between refuellings would give community energy schemes another avenue.  If nothing else you could then offer the anti-wind community the option of the neighbourhood underground nuke station...

Friday, 25 January 2013

Snow, ice, Big Society and Transition

Last Friday it started snowing in the English Midlands.  It snowed for most of Friday and into Saturday, but by mid-afternoon Saturday the white stuff had stopped falling, so I nipped out with the shovel and some grit and cleared our drive, the public pavement along the front of the house, a path to the front door of the little old lady next door and our neighbour's path and drive (as he works in the theatre and was just leaving when I started snow clearing, so I assumed it would be an icy death trap by the time he got home as temperatures fell).

Now I've been away from here since Sunday (when it was snowing again), so I sort of assumed that when I came back, the rest of the folk in the road would have cleared their drives and paths.  A couple had but for the most part the snow had just been trampled down and turned to a sheet of ice.  As I went out I saw one of the other elderly ladies who lives just down the road struggling along the compacted, icy snow with her walking stick.

When I got back, I got the shovel back out and cleared all the public path from our house up the road to her house... four or so houses' worth of path clearing.  It took me about 30 minutes and she came out to say thanks.  No thanks were needed - this is what being in a community should be about: everyone mucking in, doing their bit and helping each other out.

If the other able bodied folk around us had just spent 20 minutes doing the bit of path outside their houses, this would all have been cleared at the start of the week.  As it was, they've driven their cars over the path in and out of their drives, seemingly oblivious to the state of the pavements.

Now some people would say, "but it's the council's job to clear the paths".  Well I've got news for them: the council aren't clearing the paths out here.  They've got their work cut out keeping the main roads and town centres clear - they don't even regularly grit the road surface on side streets like ours.  I also suspect that many of these same people will be the ones who complain about the cost of council services and the amount they pay in council tax.  If they want a temporary army employed at short notice to clear side streets and pavements then they'd have to be prepared to pay for it.  I'd guess that a 20 minute workout shovelling just the bit of pavement in front of their house themselves would work out a bit cheaper.

Other people claim that if they clear the public path in front of their house then they'd be liable if someone fell, or that the law doesn't let them clear public spaces.  The latter is nonsense: there is no law preventing you from clearing snow and ice from public spaces.  It is also very unlikely that you would face any legal liability for clearing public paths if someone fell, as long as you were careful doing it and used common sense to ensure that you did not make the pavement more dangerous than before (so for example don't pour hot water on snow to melt it - eventually it will freeze and turn into black ice).  It's also worth remembering that pedestrians using areas affected by snow and ice have a responsibility to be careful themselves - you're helping them but they also have to watch out for themselves.

So what's this rant got to do with Big Society and Transition Towns?  Well, Big Society was supposed to be about people in local communities stepping up and taking responsibility for parts of their locality.  Doing some of the things that the armchair experts rant about, that Government and councils can no longer afford in the recession.

Clearing ice and snow would seem to fit nicely into the Big Society ideal, and if our road (and many like it) are anything to go by, the Big Society needs a kick up the pants.  It's not like icy pavements are some esoteric concept that people find difficult to grasp, or that they can't see how volunteering a bit of their time could help.  It's a simple thing that is easy to deal with and makes lives easier and safer in your community.  It can even save society as a whole real money - cleared pavements make it easier for people to get to work or school, and will reduce NHS costs by reducing injuries from slipping over.

Transition Towns are part of the Big Society (whether they like it or not!) and are working to build community spirit and improve local resilience.  A simple task like clearing pavement ice isn't as "sexy" as planning out a community garden or getting some solar panels installed, yet it seems to be a good metric of how community minded a locality is.  If a community can't do something that obvious voluntarily without anyone standing over them, then they aren't likely to step up for the more onerous tasks that we'll need to work on together in a future of energy constraints and climate change.  Think of it as a social barometer: community cleared paths demonstrate a social responsibility that we'll increasingly need in the 21st century.

So out of the comfy armchairs folks and get those paths cleared!