Tuesday 8 December 2015

Using Microsoft Graph API from a daemon process (in Perl on Linux!)

At work, I've been asked to look at how easy (or difficult) it will be to give folk using Microsoft Office365 access to the student timetables.  We already provide a feed into our Google Apps for Education accounts for the students, and this has been very popular.  However, The Powers That Be have had a funny five minutes and decided to move future students to Office365 (no, none of us can work out why either!).

Now in the past we've bodged up a means of staff seeing timetable information in Office365 (as they've been stuck in there for some years now) using iCal files.  This works, but:

  • requires the user to make an active decision to register and then paste the link into their Office365 Web Access (OWA) calendar setup
  • means that Microsoft's servers whack our servers every 4 hours to update this iCal link (which is OK for a small subset of the staff, but would be less fun for our servers when 15000+ students start to hit them).

So I've been looking at the exciting new Microsoft Graph API that they released a week or two back. Actually it's been lurking in beta for a while, but v1.0 appeared more or less as soon as we started to look at how to do this, and that is supposed to be the first general release for production use.  According to some Microsoft folk we talked to, this API is going to be the way of the future, so it's what we started to look at using (as opposed to the older SOAP-based Exchange Web Services API).

The Graph API uses RESTful calls, JSON, and OAuth2.0, so it looks pretty sane.  That was a surprise for me: as a Linux hacker I'm used to Microsoft stuff looking awful from the start.  Much of the documentation for the API's OAuth2.0 flows assumes that you're going to be writing a web-delivered app.  In this case, the app interacts with the user to get them to log into Azure Active Directory and then delegate rights to your code to do specified things as them.  That isn't much use for a daemon process, which is what we want, but luckily Microsoft also implement the "client credentials" flow in OAuth2.0.  This means that you can get client credentials (a client_id and secret) for your daemon application and then use the API to swap those credentials for a bearer access token that can access any user's data in your Office365 tenancy, limited to the scopes set by the admins.

One initial stumbling block for me was that I'm not normally an AD admin in our tenancy, and it looks like you need to be in order to assign application authorization scopes to the app in the Azure management console (luckily our AD guys were OK with giving me admin access to a test tenancy where I could break things to my heart's content whilst working out how this all works).  Still, this is very similar to Google Apps, where you give access scopes to a service account that can then act as other users.
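To give a flavour of what that token exchange looks like from Perl, here's a minimal sketch using LWP::UserAgent and JSON against the Azure AD token endpoint.  The tenant name, client_id and client secret below are placeholders, and the error handling is kept to a bare minimum:

    #!/usr/bin/perl
    # Sketch: swap client credentials for a Graph API bearer token.
    # The tenant, client_id and secret are placeholders - use your own.
    use strict;
    use warnings;
    use LWP::UserAgent;
    use JSON qw(decode_json);

    my $tenant        = 'example.onmicrosoft.com';
    my $client_id     = '00000000-0000-0000-0000-000000000000';
    my $client_secret = 'not-a-real-secret';

    my $ua  = LWP::UserAgent->new;
    my $res = $ua->post(
        "https://login.microsoftonline.com/$tenant/oauth2/token",
        {
            grant_type    => 'client_credentials',
            client_id     => $client_id,
            client_secret => $client_secret,
            resource      => 'https://graph.microsoft.com',
        },
    );
    die 'Token request failed: ' . $res->status_line unless $res->is_success;

    my $token = decode_json( $res->decoded_content )->{access_token};
    print 'Got a bearer token of length ', length($token), "\n";

Every Graph call after that just needs the token passed in an Authorization header.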

Just to make things a bit trickier, my end of the Graph API calls is coming from Perl scripts sitting on a Linux box.  I'm a Perl hacker, and I've already written Perl modules to wrap up some of the Google APIs in the past, so this isn't overly concerning to me.  Indeed, one of the selling points of the Microsoft Graph API's RESTful, OData standards basis is that it's pretty much language and platform agnostic.  It's just as happy talking to a Perl script on a Linux box as it is to a C# program on a Windows server.
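As a quick illustration of that, here's what fetching a user's directory entry looks like from Perl once you've got a bearer token via the client credentials flow above (the user principal name here is made up):

    # Sketch: a simple Graph API GET from Perl using the bearer token.
    use strict;
    use warnings;
    use LWP::UserAgent;
    use JSON qw(decode_json);

    my $token = '...';                                      # token from the client credentials flow
    my $user  = 'some.student@example.onmicrosoft.com';     # made-up UPN

    my $ua  = LWP::UserAgent->new;
    my $res = $ua->get(
        "https://graph.microsoft.com/v1.0/users/$user",
        'Authorization' => "Bearer $token",
        'Accept'        => 'application/json',
    );
    die 'Graph call failed: ' . $res->status_line unless $res->is_success;

    my $data = decode_json( $res->decoded_content );
    print "Display name: $data->{displayName}\n";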

Being version 1.0, the current Microsoft Graph API has a few oddities. I'm not sure why these didn't get fixed in the beta period - maybe the fact that it was a beta put off normal Microsoft developers from using it (us Open Source folk are used to using alpha and beta releases in production, as we've got the code to fix things if they go wrong!).

For example, let's say you want to use Graph to add a member to a Unified Group.  Unified Groups are an exciting new type of group that can have calendars, files and conversations associated with them (they have nothing to do with local AD groups, existing security or mail enabled groups, calendar groups or distribution groups.  Microsoft really need to stop using the word "group" for new collections of things!). That's easy: there's a documented RESTful call for adding members.  Similarly, you can list the members of the group. Great - those all work a treat. Now how do you remove members from the group?  Ah.  There doesn't seem to be an API call for that.  Or at least if there is, it's not currently documented, or it's not in the same place as the add/list member calls. I've flagged it up on Stackoverflow, so hopefully someone will either point me in the right direction or fix the API/documentation. That would be a bit of a show stopper for us though - students are flighty, jittery types who do tend to jump around the modules they are studying, so we need to be able to add and remove them from the groups easily (and preferably without them getting an email every time this happens).
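For reference, the documented "add member" call looks roughly like this from Perl.  The group and user IDs (and the token) are placeholders, and it assumes the app has been granted the relevant group write scope by the admins:

    # Sketch: add a user to a Unified Group via the documented add-member call.
    use strict;
    use warnings;
    use LWP::UserAgent;
    use JSON qw(encode_json);

    my $token    = '...';                                     # bearer token
    my $group_id = '11111111-1111-1111-1111-111111111111';    # made-up group ID
    my $user_id  = '22222222-2222-2222-2222-222222222222';    # made-up user ID

    my $ua  = LWP::UserAgent->new;
    my $res = $ua->post(
        "https://graph.microsoft.com/v1.0/groups/$group_id/members/" . '$ref',
        'Authorization' => "Bearer $token",
        'Content-Type'  => 'application/json',
        Content => encode_json(
            { '@odata.id' => "https://graph.microsoft.com/v1.0/users/$user_id" }
        ),
    );
    die 'Add member failed: ' . $res->status_line unless $res->is_success;

Listing members is just a GET to the same /members URL (without the $ref bit).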

On the flip side, Microsoft have said that Unified Groups have some interesting new features, such as being able to have calendars attached directly to them. That sounds like just what we need: we can have a Unified Group for each module's timetable, add the students (and staff teaching it) to this group (possibly by adding the existing local AD module groups into the Azure AD and then adding those groups as members of the Unified Groups) and then fill the group with calendar events for all the lectures, labs, seminars and tutorials. Unfortunately this doesn't seem to work at the moment... the API documentation seems to indicate it should, but I and others are getting errors saying that the Unified Groups don't have a mailbox.  The odd thing is that I can use the Graph API to add events to an individual user's calendar and, if I set the attendees to be the group, it does appear in the group calendar in OWA.  That behaviour is... odd.  But then it could be because I'm not getting how Microsoft intend Office365 calendars to work.  I'm used to Google's calendars - ACLs in Google calendar land actually seem to work fine for sharing calendars, although you do have to make quite a few API calls for large numbers of students when setting them up (which Microsoft's Unified Groups would do away with if they worked).
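For what it's worth, that workaround - creating the event in one user's calendar and inviting the group - looks something like this from Perl.  It's a sketch: the UPN, group address, token and event details are all made up:

    # Sketch: add a timetable event to a user's calendar, inviting the Unified Group.
    use strict;
    use warnings;
    use LWP::UserAgent;
    use JSON qw(encode_json);

    my $token = '...';                                      # bearer token
    my $upn   = 'some.student@example.onmicrosoft.com';     # made-up user
    my $ua    = LWP::UserAgent->new;

    my $event = {
        subject   => 'COMP101 Lecture',                     # made-up module
        start     => { dateTime => '2015-12-14T09:00:00', timeZone => 'Europe/London' },
        end       => { dateTime => '2015-12-14T10:00:00', timeZone => 'Europe/London' },
        location  => { displayName => 'Lecture Theatre 1' },
        attendees => [
            {
                emailAddress => {
                    address => 'comp101-group@example.onmicrosoft.com',  # made-up group address
                    name    => 'COMP101 Unified Group',
                },
                type => 'required',
            },
        ],
    };

    my $res = $ua->post(
        "https://graph.microsoft.com/v1.0/users/$upn/events",
        'Authorization' => "Bearer $token",
        'Content-Type'  => 'application/json',
        Content         => encode_json($event),
    );
    die 'Event creation failed: ' . $res->status_line unless $res->is_success;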

Anyway, I'll keep plugging away at it.  I just hope I don't accidentally turn into the department's Microsoft Graph API "expert".  That would be embarrassing for a Linux hacker!

UPDATE: According to Marek Rycharski from Microsoft on Stackoverflow, it turns out that the Graph API can't (yet) handle calendars on Unified Groups.  It's "on the roadmap", but there are no plans for implementing it in the near future.  Drat!  Still, at least I got told how to remove users from the Unified Groups, ready for when I do need it at some point in the distant future.
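In case it helps anyone else, my understanding is that removing a member boils down to a DELETE against the member's $ref URL on the group, something along these lines (IDs and token are placeholders again):

    # Sketch: remove a user from a Unified Group's membership.
    use strict;
    use warnings;
    use LWP::UserAgent;
    use HTTP::Request;

    my $token    = '...';                                     # bearer token
    my $group_id = '11111111-1111-1111-1111-111111111111';    # made-up group ID
    my $user_id  = '22222222-2222-2222-2222-222222222222';    # made-up user ID

    my $url = "https://graph.microsoft.com/v1.0/groups/$group_id/members/$user_id/" . '$ref';
    my $req = HTTP::Request->new( DELETE => $url );
    $req->header( Authorization => "Bearer $token" );

    my $res = LWP::UserAgent->new->request($req);
    die 'Remove member failed: ' . $res->status_line unless $res->is_success;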


Sunday 11 October 2015

Could graphene be used for super light space craft heat shields?

Last night I was watching an interesting video about a self-made billionaire who is spending 99% of his acquired fortune on funding innovations to tackle energy, water and health issues. I'd not come across Manoj Bhargava or Billions in Change before, and some of the ideas seem a bit "out there", although if they work out then he's producing some badly needed solutions to world problems.

One of the ideas he's funding is a clean energy concept of bringing up heat from deep in the earth to drive electricity generation without using fossil fuels.  Geothermal energy has of course been done for years - Iceland gets much of its heat and power that way.  However, the new twist that Manoj and his engineers and inventors have is to use graphene to shift the heat from deep down below to the surface.

Graphene is the atom-thick form of carbon that is the new wonder material for all sorts of applications. But Manoj claimed in the video that its heat transfer capability is tremendous - so much so that heat applied to one end of a graphene string will travel along to the other end, leaving the middle cool.  I'd not heard of this property before, but a quick Google search threw up this paper in which physicists have shown evidence of this marvellous heat transfer capability.

Of course there's a lot more research, not to mention engineering development, required before graphene heat pipes become a widely available thing.  But it got me wondering: assuming graphene does have this great heat transfer capability, could it be used in spacecraft heat shields? In effect I'm wondering if it would be possible to wrap a re-entry capsule in a matrix of graphene strings, with one end of each string at the bottom and the other at the top.  As the heat shield warmed up, could the heat from the bottom be conducted over the sides of the capsule and then radiated away from the top, at the other end of the graphene strings?

I've not heard of graphene being considered for this application before, but then I'm not intimately involved in spacecraft design.  I bet it's on Elon Musk's radar if it is a possibility (everything is on Elon's radar!).

Friday 28 August 2015

Wacky idea time: Nuclear powered ocean going freight islands?

Nuclear powered ocean going vessels have been around for decades.  As well as the well-known nuclear-powered submarines with their deadly payloads of nuclear weapons that can stay submerged for months at a time, there are also nuclear powered aircraft carriers and icebreakers out there.  Nuclear power plants for shipping are expensive but have the advantage of large power outputs, less time spent refuelling and low carbon footprints.

The latter point on carbon footprints made me wonder: onshore nuclear power stations can offer low carbon electricity outputs, but are now massively expensive to build, get mired in political objections left, right and centre, and are often unpopular with the local residents around proposed sites.  We need to find a way to deal with long lived nuclear waste from the legacy nuclear power stations. At the same time we need to find low carbon ways to ship bulk goods around.  And it would be great if we could get cheap, renewable replacements for existing liquid fossil fuels so that we could keep more fossil fuels in the ground.  What if we could find a way round all of those issues?

So, my quick brain fart for today: build very large, ocean going freight vessels that are nuclear powered.

By "very large" I mean bigger than the largest oil tankers available today by an order of magnitude - effectively floating metal islands that can plough across the oceans from continent to continent.  Obviously they'd be too big for most ports to handle and many people may object to a nuclear powered vessel turning up in their local harbour (unless they're used to the military ones already).  However what about if these giant vessels went just to the edge of territorial waters and unloaded onto smaller vessels?  Those smaller vessel would be normal sized, conventionally power container ships and tankers.

If you build something big enough, you could effectively include a dock inside the huge ship for normal vessels to go into, protected from rough seas. The loading/unloading of the smaller ships could even be done en route, which would mean that transshipment and handling time wouldn't be increased. Bringing boats inside a larger ship is already done: the US Navy have vessels that can take smaller boats inside for long distance transport, equipping and deployment.  Or if you're into sci-fi, it's like the James Bond baddie with the oil tanker that could swallow submarines. Paging Elon Musk on that one!

Manufacture of this mega-freighter would have to be modular so that existing ship yards could build them a section at a time.  Each completed section would be floated out of the dry dock and then joined up to other sections already held at sea.  That's the bit I'm really not sure about: how easy would it be to join up floating sections at sea? That would obviously require calm weather, but could the modules be designed to interlock easily, like some sort of giant floating Lego bricks?  I don't know - I'm not a shipwright or naval architect.  However large floating structures have been joined together in the past, so I don't think it's insurmountable.

These floating freight islands would have to be powered by nuclear reactors that provide propulsion power, "hotel" power to keep the crew (and maybe passengers) supplied with heat, light & electricity, and potentially enough "extra" power to drive Fischer-Tropsch style processes that turn sea water and air into liquid hydrocarbon fuels.  We already know that can work - the US Navy have been experimenting with making jet fuel this way for their nuclear powered aircraft carriers.  The synthetic hydrocarbon fuels could then be used to fuel the smaller servicing freighters and/or provide av-gas for helicopters or VTOL aircraft for the short hops to and from shore.

By using the nuclear reactors for the long haul ocean part of the freight journey we'd be reducing the carbon footprint of the goods.  The reactors will effectively live out their lives at sea, many miles from the nearest land. If the reactors are designed using the proposed Gen-IV designs they'll be "walk away safe", and something this massive would also be unlikely to leak radioactive material into the sea (I assume the reactors would be in the heart of the floating metal island, so there could be a lot of steel and concrete between them and the water).  Indeed this might be a great application for the various designs of modular reactors - don't build a ship with one 1GW pressurised water reactor, but instead fit ten modular 100MW molten salt reactors that can be swapped in and out for refuelling and replacement.  That would help with the economies of scale that modular reactor designs really need if they are going to be constructed on a production line to bring costs down and safety up.

Tsunamis and earthquakes wouldn't be an issue for these reactors, and if they're making enough synthetic fuel as a by-product of the reactor running they could even help provide low carbon fuels for import to the countries they visit.  Indeed if they moor up at a fixed offshore point, they could be hooked up to the Grid in that country by a relatively short undersea HVDC power cable.  It might transpire that some could even be nearly permanently moored like that, as a safer place to put nuclear capacity for the Grid's low carbon base load supply.

I wonder what the limitations of building these would be?  Cost is an obvious one: a nuclear submarine costs a couple of billion US dollars to make, and this would be something far larger.  Yet Governments and companies are already handling projects that cost many tens of billions - things like new build on-shore nuclear power stations, failing Carbon Capture and Sequestration (CCS) projects, high speed rail lines, etc.

The anti-nuclear lobby would probably object to this as it's another application of nuclear power, but if the Gen-IV designs could make use of legacy high level nuclear wastes, it might be palatable as a way of cleaning up wastes from previous generations of nuclear reactors (which is after all one of the anti-nuclear groups' major concerns). Also, there are already many nuclear reactors swimming around in the oceans, and there have been for decades.

I'm not sure what the legal position would be for nuclear reactors running on vessels in international waters that never actually enter territorial waters once launched. Would it be covered by the flag that the ship sails under? Could you pick a small country that doesn't have a huge amount of nuclear regulation red tape in order to make this viable?

Saturday 2 May 2015

Energy storage and nuclear power

A couple of days ago, Elon Musk, the billionaire serial business creator, fronted a product launch at one of his companies - Tesla Motors.  Tesla is renowned for building high quality electric cars, but this launch wasn't for a car: it was for batteries.  Elon was explaining how the battery technology originally developed for the Tesla cars will now be available to home owners and companies to provide electricity storage.  The news has naturally focused on the $3500 10kWh domestic battery pack, but I think the real killer is in the industrial, grid-scale storage that they are going to offer.  This is interesting enough for one unnamed utility to have already put their name down for 250MWh of capacity, and indeed it's the industrial/utility side that analysts see as the major market.

Which got me thinking: how much solar/wind/etc generation and Tesla storage could you buy for, oooh, say the cost of EDF's proposed Hinkley Point C nuclear power station?  The estimated cost of Hinkley Point C keeps going up, but let's use the original EDF £16bn figure here.  Hinkley Point C is a 3.2GW station, so we need to try to match that using renewables for £16bn ($24.22bn at current exchange rates).

Now first off I have to say that this is just me getting some ballpark figures: it's not an engineering analysis.  I just want to see if Tesla's batteries plus renewable generation can give us a stable base load power source to the National Grid that would look to the outside world as though a 3.2GW nuclear station were sitting there. To do this we'll need more than 3.2GW of renewable generation capacity: we not only have to match the nuclear station's peak output, but also fill the batteries so that we can supply power at night and on calm, overcast days.  Let's assume that we want three days' worth of energy in the batteries to cover these low generation periods to start with.

Now we don't know what Tesla's commercial utility-scale power packs are going to cost, but we do know that the residential 10kWh ones will cost $3500, or in other words $350 per kWh.  I would assume that economies of scale kick in when a utility is buying a huge number of batteries, which would reduce that $350 per kWh figure.  Let's say it knocks $50 off - yes, that's a wild, stab-in-the-dark guess, but it seems vaguely sensible and conservative.  So we need three days' worth of 3.2GW generation stored:

3 days x 24 hrs x 3.2GW = 230.4GWh = 230,400,000 kWh

At my estimated $300 per kWh that will cost:

$300 per kWh x 230,400,000 kWh = $69,120,000,000 = $69.12bn.

Ah, that's blown the $24.22bn budget already, and we haven't even paid for any of the renewable generation yet - this is just the cost of 3 days of Hinkley Point C sized output storage.

Let's plough on though, and see what the final number is.  For large scale renewable power generation, the costs are falling (i.e. going the other way to nuclear!).  Large scale wind turbines cost $1.5m-$2m per MW of output.  Large scale solar farms cost ~£1.6M ($2.42M) per MW, if I've read the slightly confusing Solar Trade Association report right.  Let's pick $2M per MW as a reasonable wet-finger guesstimate of cost for both wind and solar then.

We need to generate more than 3.2GW though: we want to match Hinkley Point C's output when we're generating at our peak and still have lots of excess generation capacity available to fill up the Tesla batteries for the periods when it's calm and dark.  Let's guess again and say that we need twice the generation capacity to do this.  We might need more, we might need less, but 6.4GW seems like a reasonable first guess.

At our $2m per MW estimate of renewable generation costs, this 6.4GW will cost $12.8bn.  Well, at least that bit is under the $24.22bn Hinkley Point C budget!

How much energy storage can we get for the $11.42bn difference?  At the $300 per kWh estimate we get:

$11.42bn / $300 per kWh =~ 38GWh

So that's about 12 hours' worth of storage if we're going to be sucking 3.2GW from the battery system. That's still not bad, but will it be enough to allow large scale solar and wind to challenge nuclear for base load power generation in a decarbonised Grid?  Some people think so, and the numbers will fall on the side of renewables+batteries if their cost trajectory keeps heading down whilst nuclear's costs keep rising.  It will be interesting to see how this plays out.
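For anyone who wants to play with the assumptions, here's the back-of-envelope sum above as a tiny Perl script.  All the figures are the guesses from this post, not real quotes:

    # Back-of-envelope: renewables + Tesla batteries vs a 3.2GW nuclear station.
    use strict;
    use warnings;

    my $budget_usd       = 24.22e9;    # Hinkley Point C's original GBP 16bn, converted
    my $target_output_gw = 3.2;        # the nuclear output we're trying to match
    my $generation_gw    = 6.4;        # twice the target, to charge the batteries too
    my $cost_per_mw_usd  = 2.0e6;      # guesstimate for wind/solar build cost per MW
    my $cost_per_kwh_usd = 300;        # guessed utility-scale Tesla battery price

    my $generation_cost = $generation_gw * 1_000 * $cost_per_mw_usd;       # GW -> MW
    my $storage_budget  = $budget_usd - $generation_cost;
    my $storage_gwh     = $storage_budget / $cost_per_kwh_usd / 1e6;       # kWh -> GWh
    my $storage_hours   = $storage_gwh / $target_output_gw;

    printf "Generation: \$%.1fbn, storage: %.0fGWh (about %.0f hours at %.1fGW)\n",
        $generation_cost / 1e9, $storage_gwh, $storage_hours, $target_output_gw;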



Friday 27 March 2015

Tooling up for repairs

This weekend sees the first Transition Stratford Repair Cafe take place.  For my sins I'm one of the volunteer "repairers".  The idea is pretty simple: there will be four or five repairers sitting in a room for a few hours, people come along with things that are broken that they would normally just throw into the landfill, and the repairers try to show them how to repair them.  If they manage, everyone is happy: the person doesn't need to spend money on a new replacement, they've potentially learnt a valuable new repair skill, the repairer has the good feeling of helping, and the Transition group as a whole are helping cut down on waste and the use of resources.  If the repairer can't repair it we can point people at local professional repairers (thus helping the local economy), or the item gets taken away by the person who brought it and it goes in the bin anyway (which is where it was heading in the first place).  All done in a friendly social space, and for free, though people are encouraged to buy tea & cake whilst waiting or make a donation to help cover the room hire costs.

I've seen a Repair Cafe running at first hand in Malvern, when a couple of us went last year for a spot of industrial espionage (and tea and cake - it is a cafe after all).  The one thing that strikes you is the variety of things people bring in: knives that need sharpening, children's sit-on battery cars, flat panel TVs, duff patio umbrellas, clothes that need a bit of stitching, etc, etc.  And that has led me to this afternoon's quandary: what tools and equipment should I take?

I've just spent three hours in my workshop trying to guess what might be useful.  It's tricky, because I don't really know how many people we'll get, nor what they'll bring.  It's sort of exciting: will they bring stuff we can actually repair, or will it all be things that have designed obsolescence built into them in a way that makes it very hard, or impossible, to repair?  Is it going to be digital electronics that needs tiny specialist screwdrivers and a fine soldering iron?  A lawn mower that needs its blades sharpening?  A chair that needs a leg gluing? Just to add to the sense of adventure, we'll only get 20-30 minutes to do each repair, as we might well have a queue of people waiting!

Anyway, here's a rough list of what I've finally decided to take.  I could immediately discount any decorating/building tools and heavy woodworking gear (I doubt I'd need the lathe or pillar drill, for example!), so it's mostly hand wood & metal working tools and simple electronics stuff. Thankfully one of the other repairers is doing the needlework, so I can also forget the sewing machines, threads, buttons, etc. Which means I'm currently taking:

  • Workmate (vital!)
  • Toolbox containing:
    • variety of fixed bit screwdrivers,
    • junior hacksaw,
    • large hacksaw,
    • interchangeable-bladed tenon/rip saw,
    • some G-clamps,
    • multimeter,
    • hand drill,
    • wood & metal drill bits,
    • PTFE tape,
    • pipe cutter,
    • a couple of hammers,
    • a variety of wood & metal files,
    • some pliers, 
    • some pipe wrenches,
    • masking tape,
    • pencils,
    • square,
  • A shopping trolley containing:
    • Box of various spanners,
    • Socket set,
    • iFixIt 54 bit electronics screwdriver set,
    • Set of general small screwdriver bits & driver,
    • Como mini drill,
    • Hand staple/nail gun,
    • Soldering iron, solder, etc,
    • Hand operated grinding wheel,
    • Small metal vice mounted on a workmate compatible base,
    • Wood glue,
    • 3-in-1 oil,
    • Superglue,
    • WD-40,
    • Component tray (for keeping screws, etc together when dismantling something),
    • Oil stone,
    • Needle files,
    • Sand and emery paper,
    • Some tins of nuts, screws, bolts, etc.

I'm umming and ahhing about taking my rechargeable drill/driver with me... it might be handy if something big with lots of screws turns up.  I've no oscilloscope or other electronics diagnostic tools, so that will limit what I can do with complex electronics (but that would probably take more than 20-30 minutes anyway... it can take that long just to get into some cases!).  I should possibly have more glues, tapes and clamps: at Malvern they had loads of these available.  Not sure what other "consumables" might be needed... fuses perhaps?

So now we just have to wait and see what turns up, and thus which of the things I've taken get used, and what I should have taken but didn't.  Hopefully with four or five repairers there we'll have a broad coverage of tools and consumables, so if I have forgotten something one of the others may have it (and vice versa). Fingers crossed, eh?

Saturday 14 February 2015

Insulate existing properties as part of building new ones?

In the UK we're looking to constantly build new housing stock, both to house a growing population and because the way we live now is different to 50 years ago (with more single occupancy and single family dwellings).  However, new build is still only a fraction of the UK houses available, with many existing ones having been built over the last 150 years to somewhat lower standards of insulation and energy efficiency.  Over the last 20 years Government grants and promotion have led to about 70% of properties with cavity walls having insulation added to them.

Unfortunately changes in the grant regime have led to a collapse in the insulation industry and left a relatively large number of homes still requiring more insulation and similar energy efficiency measures. Some of these are "difficult to treat" homes and ones requiring expensive solid wall insulation with long payback times. The Green Deal was supposed to help with these but it's been a bit of a damp squib, although the periodic Green Deal Home Improvement grants have been rapidly oversubscribed, demonstrating a demand for such help.

So here's an idea that just struck me: why not make it a requirement that developers of new build homes have to insulate a given number of existing homes in the locality (say within a 5-10 mile radius) for each new build home they erect? And I don't just mean pay into an anonymous pot: I mean actually have workers go into homes in the area and do the insulation whilst the new homes are being built (with mandatory fines if they don't - no weaselling out of it later by doing deals with friendly councillors!).  This would have a number of advantages:
  • Developers already need to engage with the local community during the planning cycle so they could find suitable properties and home owners in the locality at that stage, including them in the final planning application,
  • Local people would be getting direct, visible benefits from developments within their communities, rather than funds that might drift away elsewhere,
  • Making local homes, especially those belonging to the elderly, infirm or poor, more energy efficient would be great PR for the developers (and they often need great PR!),
  • Insulation will be pushed up the agenda for new builds as well: if a developer is going to have to talk about insulation during the planning stage and pay for insulation companies to do a load of existing homes, they'll be more likely to make use of the "economies of scale" and get the same people to put decent levels of insulation in the new homes rather than skimp with the minimum.

A similar idea could apply to developers of commercial or industrial space: they could be required to fund insulation and energy efficiency measures in existing local community and local government buildings. This would mean that the developments immediately start to make a positive difference to the amount of energy used by the local community, and potentially reduce costs and/or secure a longer term future for such services.

Linking new builds to the redevelopment of existing housing stock would help ensure that older properties don't become "second class citizens" alongside modern buildings, and would also mean that Governments wouldn't have to be directly involved in the funding of retrofit, which appears to result in "start-stop" bursty situations that wreck industries and confuse the public.