Saturday, 15 April 2017

My first play-through of Crusader Kings II

I picked up Crusader Kings II in a bundle in a Steam sale last year, having a good impression of the publisher Paradox Interactive - renowned for meticulous grand strategies. I had tried my best to play it twice before, but couldn't seem to get through the tutorial. Choosing an ideal spouse to build your medieval dynasty seemed a shuddering jump away from strategically redeploying Siberian divisions to stop an armoured breakthrough, as in Hearts of Iron from the same publisher. This was my first thought as I went into the tutorial for the third time - wanting a slow game which could be paused at will with a crying baby at night.

But this time I found it surprisingly enjoyable. The game is an endless Machiavellian climb to the top, usually over a pile of bodies. Every duke wants to become king, every king desires the emperor's throne and the emperor seeks to keep his vassals in place by dividing and ruling. These feudal courts were eerily reminiscent of Stalin's 20th-century government: he eliminated all his ambitious "vassals" and split the power, so that the remaining loyalists dared not oppose him.

Although the tutorial stated that it would be over when Alfonso of Leon died, after playing on to his heir and restarting the game, I still received a prompt to start the tutorial again. So, with the tutorial never registering as completed, and irked that I couldn't otherwise unlock achievements, I jumped in at the deep end and started an "Ironman" campaign. In this roguelike mode there is a single, automatically managed save file: any mistakes are permanent.

I started in 867 (no Charlemagne DLC) as the leader of Wessex, because my wife and I have recently been watching "The Last Kingdom", a period drama set against the backdrop of King Alfred the Great of Wessex stopping the Vikings. Without a full understanding of how wars or combat work, or of how the initial Viking "horde" spawns behave, I launched myself into a series of disastrous counterattacks which quickly depleted our war score and lost Northumbria.

In modern military textbooks, one of the principles of defence is an offensive spirit, and I was fully committed to using it in the defence of Wessex - and eventually England - from the Vikings. While they consolidated their spoils, and with Mercia as a buffer zone, I immediately set about fabricating claims to take Cornwall, a county in Wales and then counties in Mercia itself. This emboldened me to start taking on some of the weaker Vikings and soon to press forward to form the kingdom of England.

While I was consolidating England, the Vikings had settled and formed a large Scottish bloc under the Kingdom of Ireland, with a surprisingly strong and independent Wales in the mix. However, when the lords of Scotland briefly rebelled and split the country in two, I seized my chance to fight each half individually and take enough counties to form the Empire of Britannia. Now in a dominant position in the British Isles, I was unopposed in taking the remaining de jure counties one at a time, but greatly slowed down by the need for a 10-year truce after every county. This sped up greatly once I was able to usurp the kingdom of Ireland, fracturing the lands there and allowing me to press multiple wars simultaneously.

By 1066 I had most of the British Isles and set about googling a few things. I had previously attempted to press a courtier's claim to be king, but was exasperated to find that this merely installed him as an independent ruler. But then I read something which said that giving the claimant a landed title would make them your vassal when a claim was pressed. I set about testing this theory by giving out a county and installing a new king of Sweden under my banner. Now I could take whole countries at a time, but the downside was a Threat increase taking 20 years to dissipate.

But there was another alarming find for me online (since I had the Sunset Invasion DLC). In this alternate history, a great North American empire had captured a wayward Viking ship. This had caused a paradigm shift for them, leading eventually to the construction of a vast armada intent on taking Europe. This was in large part a move to balance out the Mongol invasions of the East with a huge (albeit horseless) horde in the West. Now my Stalinist expansion (sometimes called "map painting" online) had a true purpose, as I needed to force enough subjects under my banner to show a united front against these transatlantic hordes, whose human sacrifice was surely worse than vassalage to me.

Finally, my expansion had another unexpected effect: in order to create land holdings to give to claimants, I built cities (which give the highest monetary income) in sparsely populated places such as the Shetland Islands (where construction is cheapest). Cities in Crusader Kings II are miniature republics, and the lands taken when their mayor has a claim pressed become republics too. Thus France and parts of Germany under me entered the Renaissance as fledgling democracies.

The "Aztecs" (really a collection of Mesoamericans) arrived very early, first in Muslim Spain and then in Scotland in 1254 (when the expected arrival is usually around 1300). In my realm they fielded a force of over 100 thousand troops, but I now had marginally more, and importantly mine were concentrated. This led to a complete rout and the subsequent destruction of all their armies before they could capture a single holding. Without bothering to make peace, I immediately sailed to Spain and fractured their armies there too, then hunted their retreating force deep into France. For the sake of the relevant achievement, I didn't make peace until their army size dropped precisely to zero.

The Mongols, unfortunately, failed to make any progress in this play-through, though I understand the Horse Lords DLC might give them a bit more edge. This was a shame, because I was expecting them to arrive at the gates of Byzantium or somewhere else I would need to face them. My strategy for hiring a retinue, a standing army, was focused on beating the Mongols and their wily mobile horse archers.

I now needed a goal for the final two centuries of gameplay. I chose an overall strategy of restoring the old Roman Empire, for which I would need control of Byzantium and several other counties around the Mediterranean, such as Alexandria and Venice. In between taking France and succeeding in several Crusades, I made a good start on these prerequisites. Taking the title of Emperor of Byzantium, however, would require slicker manoeuvring. Since I couldn't vassalize another emperor, I would have to create an heir with a claim to the throne.

I had already passed Agnatic-Cognatic succession in my empire, allowing female rulers. Now I had managed, through several strategic marriages to Greeks, to have a female heir and a grandson with a claim to the throne of Byzantium. All was set for me to recreate the Roman Empire in two generations. But I snatched defeat from the jaws of victory in what would be my Great Mistake for this play-through. When my character's aged wife died, I thoughtlessly remarried the first princess I could find with high stats, so I wouldn't need to reduce my demesne size. But she quickly gave birth to a boy, who would sidestep my carefully planned lines of inheritance. In retrospect, I could have forced him to take the vows of priesthood, thereby disinheriting him. And what's more, I had also married off one of my daughters without thinking, so that the male heirs she had would later challenge me for succession.

From the Great Mistake came the Great Survivor of my dynasty. Pretty soon I found myself playing as a child on the throne, with the next in line to succeed me outside my dynasty. If I died now, before I had any direct heirs, it would be game over. It was a tense time when little Mærleswegn fell ill at age 8. But he recovered and, as soon as he came of age, married a buxom Polish princess. He soon had several sons to succeed him, enlarged his empire, earning the title "The Hammer", and even knocked off two of the pretenders, though by then this was unnecessary. But becoming the Byzantine - and therefore Roman - emperor was now out of the question before the game ended in 1453 and the Renaissance began.

The longest reigning Wessex monarch

It is clear that there is a lot to learn in this game even after a full play-through. Indeed, I never once made any tactical military choices, like manually assigning commanders or giving battle orders. Some things I did purely for the achievements: forming Wales for "United the Kingdoms", taking an opportune moment during a Threat cooldown to go on the shortest possible pilgrimage, and expanding into Arabia for "Protector of the Holy Places" when I had 30 spare years before my next attempt at Byzantium.



Overall a pleasantly challenging and very relevant game.

The Empire of Britannia at greatest extent

Typical empire summary at the end of the game

Achievements obtained during run:
  • The Marriage Game // Full House
  • Until Death Do Us Part
  • Crusader
  • United the Kingdoms
  • Protector of the Holy Places
  • Pilgrim
  • Paragon of Virtue // Exalted Among Men // Merchant Prince // Saint // Celebrity
  • And Stay Out!
  • Persistent Survivor
  • Divine Blood // Dragon Blood // Royal Blood
  • Turbulent Priest 
  • Dwarf Fortress
DLC (as the 5-year anniversary bundle): SoI, LoR, SI, TR, TOG, SoA, RoI

Tuesday, 28 February 2017

I'm having a kid. Here's why


Albert Einstein's father co-founded an electrical company. Had he decided to focus on his career over having children, his small-time company would likely have gone bankrupt or been bought out anyway; for the sake of that irrelevance, we would have lost one of the greatest minds of the 20th century.

We recently scanned my wife for lifeforms. Specifically this one, due to be born in about a month:

I have seen a fair number of articles in the press and online opinion pieces about people deciding not to have children, perhaps to pursue a career instead. Our Prime Minister got a free ticket to 10 Downing Street because of such a scandal. Others argue that the world is becoming overpopulated and resources ever more strained. In particular, the link between lower economic development, lower levels of education, higher birth rates and relatively high infant mortality is well understood. Perhaps it is a good idea to have fewer children in the developed world to offset this effect.

To me, this argument is like a block of flats whose heating has failed, leaving every flat freezing cold, where we set fire to one of the flats so that the average temperature is acceptable. Having huge variations in the birth rate between countries - or between different social groups in a given country (such as by level of education) - is bound to brew trouble. It would mean, for example, a huge rate of migration (brain drain and so on), or that child-bearers are under-represented in government (which has a high proportion of university graduates).

What's more, we in economically developed countries should set a precedent for the rest of the industrializing world. As increasing GDP leads to higher living standards, countries adopt many "Western" paradigms (such as the internet, where you are reading this blog). This applies to demographic trends as well. For example, as Thailand and Korea have become more industrialized, their birth rates have dropped below the replacement rate.

If the entire world were to reach the current Western level of socioeconomic development (which is broadly happening), and this correlates with a below-replacement birth rate (which is also the case), then the human race would eventually die an anguished death. Anguished, because the exponential drop in numbers would leave a proportionately large ageing population needing care. Saying that "children aren't for me" is like saying "a job in farming isn't for me" - if everyone had that opinion, we would go extinct.

In his farewell speech, Barack Obama talked about his accomplishments: ending the recession, giving 20 million Americans access to healthcare, achieving marriage equality - things important to him, whatever your personal political viewpoint - yet he cited raising his children as his proudest accomplishment.

It is worthwhile to have a large population. Whether there are 1 or 10 billion people on the planet, there will always be (for example) 2% working as farmers just to feed the rest, but the difference in the latter case is that there is a 10-times larger pool of talent to work on the space program or write music. Spending some extracurricular time to do childcare for 20 years and the associated career setback (if there even is any) is well worth the years of accomplishments from the children that result.

For a different viewpoint, here is Doug Stanhope:

Wednesday, 18 January 2017

Why I prefer "Star Wars" episodes I - III

(Though I must admit - I haven't seen The Force Awakens)

Naturally, contains spoilers

Edit: The US president's battle with judges in early 2017 makes the following line - and stark warning about limiting executive power - from Revenge of the Sith the most meaningful in the entire franchise:
"He controls the senate and all the courts" - Mace Windu's justification for the (morally correct) extra-judicial killing of Chancellor Palpatine

It's a sign that humans either enjoy repetition or lack imagination that most cultural narratives share common "tropes". I feel that the major shortcoming of the Star Wars series is its egregious use of the "death of millions is a statistic" trope (or nine million, in this case). With a Dunbar number of 150 individuals, humans cannot empathize with millions of others. Most fiction therefore focuses on just a few individual characters.

Unlike "Trek", the word "Wars" which follows "Star" in this case does not conjure images of discovery and self-fulfilment; it carries the heavy burden of death and devastation, and is not to be taken lightly. Moreover, wars are won by logistics and mass mobilization (corresponding to billions of soldiers in the setting of Star Wars) - not by the actions of a few individuals.

That is why I despair at the incestuous Star Wars films and their extended universe. A few members of one family decide the fate of a galaxy of billions; the death of a single character really is given more screen time than the destruction of an entire planet.

Perhaps most laughable of all is the fact that one of the most pervasive and iconic characters is the robot R2-D2, whose entire personality and psyche would have been endlessly copy-pasted on a production line. Think about it: with a naming scheme of two letters and two digits, fewer unique astromech droids (shown to be used in single-pilot fighters) could exist in an entire galaxy than the number of ground-attack aircraft produced on one planet (the Il-2 and related variants). And yet a single such robot participates in all the greatest historical events of its time.

Seen through this prism, I feel that the narrative of Episodes I-III is more robust than the others'. The characters feel truly embedded in the grandiose events unfolding around them, with the clash of vast armies. Rescuing a princess is a fine plot for a fairy tale, but wars are fought and won because of economic needs. (Trade agreements, anyone?) These films are best appreciated by a mature audience, as an allegory of the rise of fascism, perhaps also a critique of America in the early G. W. Bush years.

I have a similar problem with the film Pearl Harbor among others, which was a chance to use modern CGI and Hollywood actors to bring history to life. Instead, over half the film isn't even set in the titular military base, but focuses on a mere handful of characters spending most of their screen time in largely unrelated locations. How could a movie about such earth-shattering events have been reduced to the outcome of one love triangle? And, of course, the film cannot conclude with a defeat (at least The Empire Strikes Back may be commended in this regard), so it ends with the militarily irrelevant Doolittle Raid.

I would truly like to know if it's possible to have an alternative to this character-driven form of fiction. Certainly, it dates back at least to Homer, who in the Odyssey gives greater detail of the protagonist's big toe (well, pretty much) than his crew of thirty individuals with their own hopes and dreams.

In the meantime, I would suggest that writers follow the example of Tolstoy in War and Peace - by all means have a character who mans an artillery piece on the flank of a world-changing battle, but do not have them pole-vault the entire Grande Armée to kill Napoleon and single-handedly win the war.


Wednesday, 14 December 2016

Spectrograms: a ferocious example

A spectrogram is a great way to visualize the frequencies present in a given waveform, such as an audio recording, radio transmission or electronic signal. The spectral intensity of the signal is plotted (usually on a logarithmic scale) against time (x-axis) and frequency (y-axis).

Processing the data

The plot below shows the left and right (green overlaid on blue) channels from an audio recording. We want to decompose it at a series of time steps into its constituent frequencies.



The following python code is a fairly basic implementation of a function to make spectrograms. We parcel the input data into sections of a given width (N_fft), each of which will be Fourier transformed. It's useful to make sure that there is some overlap between these sections, so that the output is fairly smooth. Since most inputs are real (as in the examples above - music, speech etc.), we can do a purely real transform. The last thing we need is an envelope function, so that the edges of each section are not sharp, but instead the signal in each section smoothly decays to zero; in this case, we multiply by the Hamming function, which is commonly used for this purpose.

 from numpy import *
 def make_spectrogram(data,N_fft,N_overlap):  
      N_net=N_fft-N_overlap                              #Step between the starts of successive sections  
      total_iterations=(len(data)-N_fft)//N_net          #Number of whole sections that fit in the data  
      output=zeros((N_fft//2+1,total_iterations),complex)  
      for idx in range(0,total_iterations):  
           #Apply the Hamming envelope, then take a real-input FFT of the section  
           output[:,idx]=fft.rfft(data[idx*N_net:idx*N_net+N_fft]*hamming(N_fft))  
      return output  
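
As a quick sanity check of this function (my own synthetic test, not from the original post), we can feed it a pure tone and confirm that the energy lands in the expected frequency bin:

```python
from numpy import *

def make_spectrogram(data, N_fft, N_overlap):
    N_net = N_fft - N_overlap
    total_iterations = (len(data) - N_fft)//N_net
    output = zeros((N_fft//2 + 1, total_iterations), complex)
    for idx in range(total_iterations):
        output[:, idx] = fft.rfft(data[idx*N_net:idx*N_net + N_fft]*hamming(N_fft))
    return output

# One second of a pure 1 kHz tone sampled at 8 kHz
fs = 8000
t = arange(fs)/fs
tone = sin(2*pi*1000.0*t)

spec = abs(make_spectrogram(tone, N_fft=256, N_overlap=128))
# The bin spacing is fs/N_fft = 31.25 Hz, so the tone sits exactly in bin
# 1000/31.25 = 32 of every time slice
peak_bins = spec.argmax(axis=0)
print(peak_bins.min(), peak_bins.max())  # -> 32 32
```

For real work you may also want to cross-check against scipy.signal.spectrogram, which implements the same idea with more options (window choice, detrending, scaling).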


The spectrogram

After we apply the function to our nice data set (with an appropriate choice of the size and overlap of the sections to be Fourier transformed), we get the following plot:

There is the usual humdrum of noise close to the bottom, around 100 Hz, which corresponds to human speech and other background noise. But we can also see an intense spectrum of harmonics emerge and then rise and fall in frequency (usually called chirping) in two events at 20 and 21 seconds. The harmonics are either real constituents of the audio clip, or artefacts of sampling, or most likely both.

The input

In this case I took the raw signal from the following clip of delightful Twitch streamer Oddish (OddishPlaysGames @StephOddish) as she ambushes a Brightwing player in Heroes of the Storm. (By the way, I challenge any young reader to explain that sentence to someone over 50)



(High-res original here)

The full code

If you'd like to try this yourself, the following python script should work with only fairly basic libraries. You will need to have an appropriate .wav file, which I obtained by downloading the clip from Twitch and extracting the audio with ffmpeg, but any other sinusoidal signal will also do.

 from numpy import *  
 import matplotlib.pyplot as Plot  
 Plot.rc('text', usetex=True)     #To make the plot text look nicer  
 Plot.rc('font', family='serif')  
 import wave as wave  
 import struct  
   
 def make_spectrogram(data,N_fft,N_overlap):  
      N_net=N_fft-N_overlap  
      total_iterations=(len(data)-N_fft)//N_net            
      output=zeros((N_fft//2+1,total_iterations),complex)  
      for idx in range(0,total_iterations):  
           output[:,idx]=fft.rfft(data[idx*N_net:idx*N_net+N_fft]*hamming(N_fft))  
      return output  
   
 #Read in .wav file and convert to array of floats for each channel; .wav generated by the following command:  
 #ffmpeg -i Oddish.mp4 -ab 160k -ac 2 -ar 44100 -vn Oddish.wav  
 wave_file=wave.open("Oddish.wav",'r')  
 params=wave_file.getparams()  
 channels=params[0]  
 frame_rate=params[2]          #Frames per second; each frame holds one sample per channel  
 frame_number=params[3]  
 frames_raw=wave_file.readframes(frame_number)  
 time_window=[18.0,23.0]  
 frequency_window=[0,10000]  
 even_idx=list(range(0,frame_number*channels,2)); odd_idx=list(range(1,frame_number*channels,2))  
 samples=array(struct.unpack("%ih" % (frame_number*channels),frames_raw))     #All interleaved 16-bit samples  
 frames=[samples[even_idx],samples[odd_idx]]     #De-interleave into left and right channels  
   
 #Plot sinusoidal signal  
 Plot.plot(linspace(time_window[0],time_window[1],num=int(time_window[1]-time_window[0])*frame_rate),frames[0][int(time_window[0]*frame_rate):int(time_window[1]*frame_rate)])  
 Plot.plot(linspace(time_window[0],time_window[1],num=int(time_window[1]-time_window[0])*frame_rate),frames[1][int(time_window[0]*frame_rate):int(time_window[1]*frame_rate)])  
 Plot.xlabel(r'$t$ [s]',fontsize=28)  
 Plot.ylabel(r'Amplitude [a.u.]',fontsize=28)  
 Plot.savefig('OddishSignal.png', format='png', dpi=100,bbox_inches='tight',pad_inches=0.1)  
   
 #Create spectrogram  
 frequency_points=500  
 time_points=500  
 required_timepoints=(time_window[1]-time_window[0])*frame_rate  
 N_FFT=int(frame_rate/(frequency_window[1]-frequency_window[0])*frequency_points)  
 N_overlap=int((N_FFT*time_points-required_timepoints)/(time_points-1))  
 spectrogram=abs(make_spectrogram(frames[0][int(time_window[0]*frame_rate):int(time_window[1]*frame_rate)],N_FFT,N_overlap))  
   
 #Plot spectrogram  
 plot_time=linspace(time_window[0],time_window[1],len(spectrogram[0,:]))  
 plot_freq=linspace(frequency_window[0],frequency_window[1],len(spectrogram[:,0]))  
 fig=Plot.figure("Spectrogram")  
 ax=fig.gca()  
 colour_plot=ax.pcolor(plot_time,plot_freq/1000.0,log(spectrogram/amax(spectrogram)),vmin=-2.5*log(10), vmax=0,cmap='hot')  
 Plot.xlabel(r'$t$ [s]',fontsize=28)  
 Plot.ylabel(r'$f$ [kHz]',fontsize=28)  
 Plot.ylim(0,2)  
 Plot.xlim(time_window[0],time_window[1])  
 Plot.savefig('OddishSpectrogram.png', format='png', dpi=100,bbox_inches='tight',pad_inches=0.1)  
   
 Plot.show()  

Monday, 5 December 2016

Brent and Levenberg-Marquardt: the bread and butter algorithms for postgrads

There are two extremely useful numerical algorithms I end up using almost all the time when processing data. This was particularly true during my final-year undergraduate project and my PhD, when, like most people, I had raw data which had to be processed and conclusions drawn. Here I will outline how to use them in python (which I think is the fastest and most useful all-purpose language) with a specific example, explained at the end.

Levenberg-Marquardt: Curve fitting


It is often useful to fit an analytical function to some raw, messy data. In this case, a numerical algorithm iterates through different values of the input parameters until a mathematical function lies as close as possible to the data. The aim might be to extract some hypothesized physical parameters; for example, Planck discovered the constant named after him when fitting a function to black-body spectra.

In python, suppose we have two lists (of the same length) x and y. First, we define the function we think best describes the data, or one which might have the most scientific relevance. Here I have chosen a linear function, combined with a shifted hyperbolic tangent:

y(x) = m*x + c + B*tanh((x - x0)/w)

We want to use scipy's well-optimized curve_fit routine (which we import into our script) to fit this to our data. First, we define the right-hand side of the above equation as fit_function, where the first argument is always our independent variable (here it's x) and the others are fit parameters. We then provide this function and the data as arguments to curve_fit:

 from numpy import *; from scipy.optimize import curve_fit  
 def fit_function(x,m,c,B,x0,w):  
      return m*x+c+B*tanh((x-x0)/w)  
 p,errors=curve_fit(fit_function,x,y,p0=[10.0,10.0,10.0,20.0,10.0])  
 y_fit=fit_function(copy(x),p[0],p[1],p[2],p[3],p[4])  

We have made the algorithm's job a lot easier by specifying an initial guess p0 for the parameters. This is generally useful: otherwise the fit may converge slowly, or get stuck in a basin of attraction far from the optimal result. Our parameters are returned in an array p, along with a covariance matrix. In order to plot the result, we use these parameters to define y_fit, giving the following:


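As an aside (a synthetic sketch of my own, not part of the original analysis), a good way to convince yourself that curve_fit plus a sensible p0 is doing its job is to generate noisy data from known parameters and check that they are recovered:

```python
from numpy import *
from scipy.optimize import curve_fit

def fit_function(x, m, c, B, x0, w):
    return m*x + c + B*tanh((x - x0)/w)

# Fabricate data from known "true" parameters plus a little Gaussian noise
true_p = [0.05, 12.0, 5.0, 40.0, 8.0]
x = linspace(0.0, 66.0, 67)
random.seed(0)
y = fit_function(x, *true_p) + random.normal(0.0, 0.1, len(x))

p, cov = curve_fit(fit_function, x, y, p0=[0.1, 10.0, 10.0, 30.0, 10.0])
p_err = sqrt(diag(cov))   # one-sigma uncertainties on each parameter
print(p)                  # should land close to true_p
```

The square roots of the diagonal of the covariance matrix give the uncertainties on the fitted parameters, which are usually the first thing to quote alongside the fit.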
However, the linear coefficient m = -0.11 has come out negative. This may well be the best fit to our data, but suppose that we demand that it be positive (for physical reasons). We can then trick the algorithm by making our function return an absurd value if m is negative:

 def fit_function2(x,m,c,B,x0,w):  
      if m<0.0:  
           return ones((len(x)))*1E100  
      else:  
           return m*x+B*tanh((x-x0)/w)+c  

This does indeed give a positive (albeit small) value of m = 1.53E-8 and the following figure:


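If your scipy is reasonably recent (0.17 or later), there is a cleaner alternative to the sentinel-value trick: curve_fit accepts a bounds argument which constrains each parameter directly. A minimal sketch on synthetic data (the step-like data here is my own illustration, not the real data set):

```python
from numpy import *
from scipy.optimize import curve_fit

def fit_function(x, m, c, B, x0, w):
    return m*x + c + B*tanh((x - x0)/w)

# Illustrative data: a pure tanh step, so the unconstrained best-fit m is ~0
x = linspace(0.0, 66.0, 67)
y = 12.0 + 5.0*tanh((x - 40.0)/8.0)

# bounds=(lower, upper), one entry per parameter (m, c, B, x0, w):
# m is forced non-negative, and w is kept away from zero so the
# division inside tanh cannot blow up
p, cov = curve_fit(fit_function, x, y, p0=[0.1, 10.0, 10.0, 30.0, 10.0],
                   bounds=([0.0, -inf, -inf, -inf, 1e-3],
                           [inf, inf, inf, inf, inf]))
print(p[0] >= 0.0)  # -> True
```

Note that when bounds are supplied, scipy switches under the hood from Levenberg-Marquardt to a trust-region method, since plain Levenberg-Marquardt does not support constraints.
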
Brent: root finding

Now that we have an analytical model for our data, we can draw some conclusions from it. For example, suppose we want to predict the value of x for which y = 17.75. There is no analytic inverse function for such a mix of linear and hyperbolic tangent terms. To solve the equation numerically, we rearrange it so that we are finding the root of some expression, namely:

17.75 - fit_function2(x, m, c, B, x0, w) = 0

The most efficient generic algorithm for this was invented by Richard Brent. It combines the method of bisection with interpolation, meaning that it requires two bounds (a, b) on the root; for it to work, the expression should go to zero once and only once in this interval. Strictly, the algorithm looks for a change in the sign of a function, which can also happen when the function passes through infinity rather than zero, as for example 1/x does at x = 0.

Brent's algorithm is implemented in scipy in a function called brentq, which takes as its inputs a function and the two bounds. The function argument must itself take only a single input; this is a problem in our case, because we also need to supply all the parameters which we calculated in the previous section to our fit function. We get around this by using the lambda notation in python, which can wrap a function taking multiple variables so that it takes just one. Our implementation is as follows:

 from scipy.optimize import brentq  
 threshold_1775=brentq(lambda x: 17.75-fit_function2(x,p[0],p[1],p[2],p[3],p[4]), 10.0,30.0)  

We have checked the earlier graphs to get the bounds (10, 30) between which our result lies, and have also checked that y = 17.75 has only a single solution in this region (otherwise there would be multiple roots and the algorithm would fail). The result is x = 24.7.
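
To make the sign-change caveat concrete, here is a small self-contained sketch (my own illustration): brentq converges happily on a genuine root, but it will converge just as happily on a pole if that is where the sign change happens.

```python
from math import tan, pi
from scipy.optimize import brentq

# Well-posed: x**2 - 2 crosses zero exactly once in (0, 2)
root = brentq(lambda x: x**2 - 2.0, 0.0, 2.0)
print(root)  # -> 1.4142135623... (sqrt(2))

# Pitfall: tan(x) also changes sign on (1, 2), but through the pole at
# pi/2 rather than through a root; brentq converges to the pole, where
# tan is enormous, so always sanity-check f at the returned "root"
pole = brentq(tan, 1.0, 2.0)
print(abs(tan(pole)) > 1e3)  # -> True: the "root" is not actually a root
```

A one-line check of the residual at the returned value is cheap insurance against this failure mode.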

The data

In fact, the data shown corresponds to gender diversity in physics; here x+1950 is the year and y is the percentage of Physics A-levels awarded to women. Many thanks to @KarenLMasters (with the raw data available here [1] [2]) for providing the following graph:
 
I'm using it here as an example, without getting into a discussion of its societal and cultural meaning. Here is the curve fit we obtained with curve_fit:

 

Depressingly, as we see above, the best fit is a hyperbolic tangent (meaning that the percentage has "plateaued" at around 23%) with a negative or at best an insignificantly small linear trend on top. It is up to teachers, politicians and the rest of society to get rid of any linear transient and get B = 50%.

Extracting numerical data from screenshots

As the raw data was difficult to obtain, I ended up using a python script to extract the raw data from the JPEG figure itself, which is also an extremely useful trick for scientists.

The script takes a minimalist version of the graph in question, meaning that the image file must end at the axes of the graph - all white space and text has to be removed. I provide the script to extract the data from the following minimalist image RawData.png here:


The broad idea is to find the image co-ordinates of (in this case) any blue pixels and then transform to graph co-ordinates. In this case, given that there is one star per year, this allows the center of each star to be identified easily (rather than, say, its points).

 #Needed for PNG  
 from PIL import Image     #Pillow; on old installations this may be just "import Image"  
 from numpy import *  
 #Other stuff  
 import matplotlib.pyplot as Plot  
 from scipy.optimize import curve_fit  
 #~~# Graph data acquisition  
 imageFile="RawData.png"          #The image here has to be the axes only (no padding, no labels)  
 x_limits=array([1950.0,2020.0])     #Fill in these limits from the graph labels  
 y_limits=array([10.0,26.0])  
 im1=Image.open(imageFile)  
 rgb_im = im1.convert('RGB')  
 dimensions=rgb_im.size # 0 - x, 1 - y  
 X=linspace(x_limits[0],x_limits[1],dimensions[0])  
 Y=-ones((dimensions[0]))  
 for idx_x in range(0,dimensions[0]):  
      convergence=0  
      while convergence==0 and Y[idx_x]<dimensions[1]-1:  
           Y[idx_x]+=1  
           if rgb_im.getpixel((idx_x,int(Y[idx_x])))[2]>220 and rgb_im.getpixel((idx_x,int(Y[idx_x])))[0]<40:  
                convergence=1  
 for idx_y in range(0,dimensions[0]):  
      Y[idx_y]=(y_limits[1]-y_limits[0])*(dimensions[1]-Y[idx_y])/dimensions[1]+y_limits[0]  
 #**# Analysis  
 years=linspace(1951,2016,num=66)  
 percentage=zeros((len(years)))  
 def find_nearest(x,x_desired):  
      nearest_idx=0  
      nearest_difference=abs(x[0]-x_desired)  
      for idx in range(1,len(x)):  
           difference=abs(x[idx]-x_desired)  
           if difference<nearest_difference:  
                nearest_idx=idx  
                nearest_difference=difference  
      return nearest_idx  
 for idx in range(0,len(years)):  
      percentage[idx]=Y[find_nearest(X,years[idx])]  
 #Output  
 output_file=open('ExtractedData.txt', 'w')  
 for output_idx in range(0,len(years)):  
      output_file.write("%s %s\n" % (years[output_idx], percentage[output_idx]))  
 output_file.close()  
 Plot.figure(1)  
 Plot.plot(X,Y)  
 Plot.plot(years,percentage,'o')  
 Plot.show()  

References

Thanks again to Karen Masters for making me aware of the following data, accessed at the time of writing (6th December 2016):

[1] http://www.iop.org/policy/statistics/gender/page_67095.html

[2] http://www.gatsby.org.uk/uploads/education/reports/pdf/18-physics-in-schools-patterns-and-policies-august-2006.pdf

Sunday, 18 September 2016

Does a janitor really need to know Pythagoras's theorem?

There is a debate which comes up very often: is it worth teaching people any more than is strictly necessary, and is there any benefit to learning a wide range of subjects at school and university?

"Do you really need to learn about glaciers or the quadratic equation at school?" I will answer this question with another: "Does a soldier really need to do push-ups?" The motion of a push-up itself is near useless just about anywhere in life, military or otherwise, but it is a valuable physical exercise. In the same way, rigorous academic study is central to exercising the brain. Mathematics in particular is as important to develop logic and critical thinking as push-ups are to strong muscles.

In the modern world, no job is "brainless" (except, perhaps, for reality TV). There is so much automation and connectivity that it's impossible to be illiterate or innumerate in the workforce. Car mechanics increasingly need to be proficient with an oscilloscope; farmers are increasingly internet-connected. Economics aside, our democracy and society require informed individuals.

I think that this point reaches even further. There is often a question of whether university degrees are "worth it" when there are already so many graduates in the job market. Should our society invest in pumping out so many graduates? When every one of its members is responsible for taking part in its politics, looking after the environment, representing it internationally, I don't think it can afford not to.


Wednesday, 7 September 2016

Must a Civilization harness fusion power?


Chances are, if you're reading this, that you are aware of the (Sid Meier's) Civilization series of video games; the most recent of them has sold something like 20 million copies. It could be argued that they are a cultural icon. Many young people like me grew up on them.

In a game whose premise involves commanding a titular civilization from the age of triremes to jet fighters, there is a comforting commonality in the depiction of how technology progresses. Or, more specifically, its apex. Because, while you will inevitably encounter writing and gunpowder as time passes, nearly every game in the series finishes the story off with Nuclear Fusion.


What is most remarkable about it is that it is one of the very few speculative technologies to feature in the games. Sure, you can launch a space mission to colonize another star system, but almost everything else in the games mirrors the world up to the present day.

I think that the creators continue to include fusion in the story of civilization, because like me (and I hope, most of the players too) they are optimistic that humanity will inevitably generate vast amounts of energy from controlled fusion. Could a civilization afford not to?