Lately I've been spending some time reading up on research into how the nature of employment is changing with the increased computerization and automation of today's world, and in particular of tomorrow's. These developments bring immense increases in productivity and open up a new world of opportunities, but are employees keeping pace and updating their skill sets to take advantage of them? My personal opinion is no, which is what prompted me to look into the research on the matter.
Frey and Osborne's paper "The future of employment: how susceptible are jobs to computerisation?" (2013) brings some interesting aspects to the discussion, including a decent historical context. It starts by referencing how John Maynard Keynes is frequently cited for his prediction of widespread technological unemployment "due to our discovery of means of economising the use of labor outrunning the pace at which we can find new uses for labor" (Keynes, 1933). That was of course a different technological advancement than the one we're experiencing now, but it shows that the discussion is not new. In fact, it is nicely illustrated by the example of William Lee, who invented the stocking frame knitting machine in 1589 hoping it would relieve workers of hand-knitting. He met opposition from Queen Elizabeth I, who was more concerned with the impact on employment and refused to grant him a patent, claiming that "Thou aimest high, Master Lee. Consider thou what the invention could do to my poor subjects. It would assuredly bring to them ruin by depriving them of employment, thus making them beggars" (cited in Acemoglu and Robinson, 2012).
Has anything changed since the 16th century, or are we facing the same kind of social opposition to changing the status quo? How many, today, are willing to learn a programming language in order to interface with and utilize the tools of our time? As pointed out by Mokyr (1998): "Unless all individuals accept the 'verdict' of the market outcome, the decision whether to adopt an innovation is likely to be resisted by losers through non-market mechanisms and political activism". The Luddite riots of 1811–1816 were a later manifestation of this fear of technological change among workers, breaking out after Parliament revoked a 1551 law prohibiting the use of gig mills in the wool-finishing trade.
Today's challenges to labor markets are different in form, yet they resemble the historical ones to a great extent. These days the ability to communicate with a computer is, in my humble opinion, as vital as learning human languages, yet there are only a few pushes towards teaching programming languages alongside spoken ones. My hypothesis is that one reason for this is a lack of knowledge of programming in the adult population, and quite frankly of mathematics and logic in general, which naturally makes people uncomfortable requiring children to learn these subjects. Initiatives such as the UK's attempt to get kids coding through changes to the national curriculum, replacing ICT (Information and Communications Technology) with a new "computing" curriculum that includes coding lessons for children as young as five (September 2013), are therefore very welcome. As an article in The Guardian put it: "it seems many parents will be surprised when their children come home from school talking about algorithms, debugging and Boolean logic" and "It's about giving the next generation a chance to shape their world, not just be consumers in it".
The ability to shape my own day is one of the reasons why I'm personally interested in the world of open source. If I'm experiencing an issue while running an application, or if I want to extend it with new functionality, it is possible to do something about it when the source is available. Even more so, in a world that is increasingly complex and interconnected, basing communication on open standards enables participation by many parties across different operating systems and user interfaces.
At the same time, and increasingly so in the aftermath of Edward Snowden, I want the ability to see what happens with my data. Reading through the End User License Agreements (EULAs) of services being offered to consumers, I sometimes get truly scared. The latest explicit example was the music streaming service Spotify, which introduced new terms stating that in order to continue using the service I would have to confirm that I had obtained permission from all my contacts to share their personal information. Safe to say, I terminated that subscription.
There is an increasing gap between the knowledge required to understand the ramifications of the services being developed and the value of private information, and people's ability to recognize what is actually happening in an ever more connected world. As pointed out in two earlier posts, "Your Weakest Security Link? Your Children, or is it?" and "Some worries about mobile appliances and the Internet of Things", keeping up can actually be quite difficult, with the end result that individuals just drift along.
So what do you think? The next time you're feeling bored and inclined to put on a TV program or just lie back on the couch, why not pick up an online tutorial on SQL, the structured query language used to talk to most database systems? Or pick up a little Python or C, or for that matter C# if you live in a Windows-centric world. Or, as a general plea: read a book once in a while.
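If that sounds more daunting than it is, here is a minimal sketch of what talking to a database looks like, using Python's built-in sqlite3 module; the books table and its contents are made up purely for illustration:

```python
import sqlite3

# A throwaway in-memory database -- nothing to install or configure.
conn = sqlite3.connect(":memory:")

# A hypothetical table of books, just for illustration.
conn.execute("CREATE TABLE books (title TEXT, author TEXT, year INTEGER)")
conn.executemany(
    "INSERT INTO books VALUES (?, ?, ?)",
    [
        ("The General Theory of Employment, Interest and Money", "Keynes", 1936),
        ("Why Nations Fail", "Acemoglu and Robinson", 2012),
    ],
)

# The SQL part: ask the database a question and print the answers.
for title, year in conn.execute("SELECT title, year FROM books WHERE year > 2000"):
    print(f"{title} ({year})")

conn.close()
```

A handful of lines is enough to create a table, fill it with data, and query it, and the same SELECT statement works, with minor variations, against most database systems.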