This year’s study repeated previous OSS Watch surveys and was aimed at IT directors in FE and HE institutions. The survey was conducted online using SurveyMonkey.

The 2013 OSS Watch National Software Survey closely followed the design of the previous two surveys, conducted in 2008 and 2010, and therefore provides a good insight into the changes in the status of open and closed software from 2008-2013.

In 2010 the study included a “background survey” of self-selected respondents via the OSS Watch mailing lists. However, analysis of the results of that survey indicated that the main survey sample of IT directors was more appropriate for the study, and so the background survey was not undertaken for 2013.

Response rates

The survey was distributed using Jisc’s mailing system, reaching heads of IT in both higher education and post-16 education in the UK. The total number of institutions in the UK is 619, and there were 50 respondents, representing a response rate of around 8% of the sectors. This is quite low, and consistent with a general picture of “survey fatigue” in education.

For the purposes of analysis we excluded one response, as the respondent was not from the UK HE or FE sectors. 11 respondents only partially completed the survey; however, we have included the answers they did provide.

Of the remaining responses, 19 identified themselves as representing FE institutions, 17 were from HE, and 2 were from HE providers in FE. The remainder were a mix of Adult & Community Learning providers, Sixth Form Colleges and specialist adult learning institutions, which for the purposes of our analysis we have categorised under FE. This gave us a total of 32 respondents from organisations classified as “FE”, and 17 respondents from organisations classified as “HE”.

While the overall response rate is lower than for 2010, an analysis of the results of the questions on organisational responsibilities (see section 1) showed that the populations from which the respondents were drawn are comparable.

Comparisons with previous surveys

The 2010 survey report already normalised the results for comparison with the 2008 survey, for example by eliminating questions with very low response rates. As the 2013 survey is identical to the 2010 survey, the results are readily comparable. The main difference is in the composition of respondents: 65% of respondents in the 2013 survey were classified as FE, whereas in 2010 the proportion was 50%.

This means that comparisons between surveys sliced by sector are appropriate, but comparisons of all responses may not be, as there will be a skew towards FE in the 2013 results.
