Secure Browser for testing on Moodle using Chromebooks or Win/Mac

We recently set up a secure browser for our student chromebooks so that when they take quizzes/tests they can’t open any other tabs/windows or take screenshots.
The steps involved:

A. Turn on the Safe Browser Option in Moodle

Go to Site Administration->Development->Experimental and check the box


(You can also just search the administration settings for “Safe”.)
I don’t know in which version of Moodle this feature was added, but it is part of the core Moodle code, so you don’t need to worry about 3rd party plugins.

B. Create a Chromebook Kiosk app

Here is a link to a copy of my complete kiosk mode app.
To adapt it for your site you will need to download the folder and remove the “copy of” from all the names. Then edit a few of the files:

in the application.html file

  1. change the title to be the name of your site
  2. in the webview line change the src=“” address to be the address of your site

in background.js you could change the ‘id:’ field to be the name of your site (I don’t think this step will have an impact, but you might as well put your school name there)
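For reference, the manifest.json in a Chrome kiosk app of this kind generally looks something like the sketch below. The name and version here are placeholders, not the actual contents of my app:

```json
{
  "name": "Example School Secure Browser",
  "version": "1.0",
  "manifest_version": 2,
  "kiosk_enabled": true,
  "permissions": ["webview"],
  "app": {
    "background": {
      "scripts": ["background.js"]
    }
  }
}
```

The kiosk_enabled flag is what allows the app to be launched as a Single App Kiosk from the Chromebook login screen, and the webview permission is needed for the page that embeds your Moodle site.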

Then you will need to publish this app by zipping this folder and uploading it to the Chrome developer site.
I published my app as private to our school domain.

C. Push the kiosk app out to student Chromebooks and blacklist user-agent switching apps in the Chrome web store and/or install the Safe Exam Browser on Windows/Mac

  1. Go to Chrome Management > Device Settings > Kiosk Settings > Single App Kiosk, select Allow Single App Kiosk for devices in the organizational unit you select.
  2. Click Manage Kiosk Applications. In the dialog that appears select the exam kiosk app you want to use. You can search for it on the Chrome Web Store, or manually install it if you have the app ID and URL by selecting Specify a Custom App.
  3. Make sure the devices you want to administer the exam with are under the organizational unit you select for the kiosk app.
  4. Then student chromebooks will get an Apps menu on the login screen with your secure browser which opens to your moodle site.
  5. Now blacklist user-agent switching apps so tricky students can’t pretend to be the safe browser when they are not
  6. For Windows/Mac, install the Safe Exam Browser. You will then need to configure its settings to point to your site.

D. Done

Now teachers can set up a quiz with the Safe Browser option turned on in the quiz settings page under Extra Restrictions on Attempts. If students try to take the quiz while logged in normally, they will be told they need to use the Safe Exam Browser.
To use the safe exam browser kiosk app, students must log out of their chromebooks and look for an Apps menu in the bottom left menu bar. They then launch the secure browser and log into Moodle normally to take the quiz. Once they are done, they close the secure browser mode and then can log into the Chromebook normally to do other work.

A tale of 400 netbooks

Our district spent $140,000 on student devices this year. We could have bought 230 9.7″ screen iPads (estimate $600 for keyboard case and a few apps) but instead we bought 400 11.6″ screen netbooks ($350 for Acer Aspire One 722).

  • We bought netbooks because we saw that the web was powerful, we noticed that a lot of the web is words, and we discovered that it is easier to write words on a full, physical keyboard.
  • We bought netbooks because we saw that the web was powerful and we noticed that some of the most exciting tools for the web are very limited on the tablet (e.g. Google Docs real-time collaboration, comments, and presentations, and peer editing and originality report tools).
  • We bought netbooks because with Ubermix they boot quickly, are easy to manage, user friendly, and full of free educational apps.
  • We bought netbooks because while text is great, multimedia is important and we wanted multitrack audio editing (audacity), layered photoshop quality image editing (GIMP), and video editing (OpenShot) for free.
  • We bought netbooks because we wanted kids to be able to take a picture using the webcam and upload that picture to any site using a file manager (like our cool Moodle glossary of math terms).
  • We bought netbooks because we wanted kids to be able to print (using our existing printers and from home when they start taking them home).
  • We bought netbooks because Scratch is one of the best apps for students to create and share multimedia, animations, and games while learning the fundamentals of algorithmic thinking (programming, problem solving).
  • We bought netbooks because now that kids actually get to have a device that they keep, they can start to customize it, hack it, learn to program, and then restore the device when they mess up.
  • We bought netbooks because layout and design with a full office suite allows for some great professional work (like a 3 column brochure, a graph from a spreadsheet with a trend line, and a full featured presentation).
  • We bought netbooks because we didn’t want to tie our students’ education to a particular corporation, so that they would know they had options and choices when it comes to the technology they use, like LibreOffice and Firefox.

I have nothing against tablets and I have nothing against Apple. However, I looked at all these great things our students could do with netbooks and then came the kicker. We bought netbooks because we could give 170 more students hands-on access to all of the tech tools that will help transform their education. Maybe an iPad can do some things that a netbook can’t (and maybe a netbook running Ubermix can do some things an iPad can’t), but being able to give 70% more students access to technology is something that seems hard to argue against.

Can You Be Data Driven Without Statistics?

Have you ever seen a margin of error reported on a state test result or an error bar on a state test graph? Has anyone ever reported a p value, an R squared, a standard deviation, median, or any other statistical measure along with a test result? Frankly I can’t recall even seeing an average (mean) when state tests are discussed. If we are truly trying to be data driven in our decisions as educators and institutions, I believe we need to do some basic data analysis to understand this data or else we end up in a state of DRIP (Data Rich, Information Poor).

Scientists/statisticians will tell you that the result of a test is not a single, true number, but a number with an error margin (or confidence interval) around that number. So in political polls you will see 55%  +/- 3%, because we understand if we polled multiple times on the same day using the same polling method, we would get a variation in the end result. This is true for students taking tests as well, however the information about the amount of variance is unpublished. Why care? Because important decisions are based around whether or not test scores rise or fall. So a department in a school might be put under increased pressure if their scores fell by 5%. However, if the variance of the test is +/- 8% then a 5% change is insignificant. That is, it is not possible to say that the decrease in test scores is due to students actually knowing less or whether it is due to random chance and natural variation.

Error bars will increase with a smaller population size and with a wider range of results. So it is more difficult to make solid claims of change on a single class/department than an entire district and it is more difficult to make solid claims of change with a diverse group of abilities than on a group of students that have similar abilities.

Let’s look at an example of actual data, first without statistics, then with statistics. Here is a table and graph of test results:

                   Year 1   Year 2   Change
Far Below Basic        12       11       -1
Below Basic            21       27       +6
Basic                  62       48      -14
Proficient             36       52      +16
Advanced               11       14       +3

This data suggests some improvement in test scores based on the “squint” analysis technique, i.e. getting a general impression based on the amount of green. In fact if you average the test scores on a 0=Far Below Basic to 4=Advanced scale you do see an improvement from 2.09 to 2.20 which is a 5% improvement.

However, to truly state the facts we have to include some information about the variance. Based on the standard deviation and population size, the 90% confidence interval for these averages is 14%, which means that the numbers actually have to be 14% greater in year 2 to show a significant increase. Another way of thinking about confidence intervals is that we are 90% certain the test results are +/- 14% from the reported value. When running a t-test to see if the two averages are significantly different,  the p value is .35, with .1 or less being considered significant in psychology. Here is a graph showing the average test scores with 90% confidence intervals for error bars.
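If you want to check this sort of analysis yourself, the averages and confidence intervals can be reproduced with a few lines of code and no data warehouse. Here is a sketch in JavaScript; the 0–4 scoring scale comes from the post, but the normal (z = 1.645) approximation for the 90% interval is my assumption, so the exact numbers may differ slightly from a t-based spreadsheet calculation:

```javascript
// Recompute the table's averages and 90% confidence intervals.
// Assumes scores 0 = Far Below Basic through 4 = Advanced, and a
// z = 1.645 normal approximation for the 90% interval.
function weightedStats(counts) {
  const n = counts.reduce((a, b) => a + b, 0);                       // students tested
  const mean = counts.reduce((s, c, score) => s + c * score, 0) / n; // weighted average
  const ss = counts.reduce((s, c, score) => s + c * (score - mean) ** 2, 0);
  const sd = Math.sqrt(ss / (n - 1));                                // sample std deviation
  const ci90 = 1.645 * (sd / Math.sqrt(n));                          // 90% CI half-width
  return { n, mean, sd, ci90 };
}

const year1 = weightedStats([12, 21, 62, 36, 11]); // mean ≈ 2.09
const year2 = weightedStats([11, 27, 48, 52, 14]); // mean ≈ 2.20
```

On this data the Year 1 average comes out around 2.09 with a 90% half-width of about 0.14 points on the 0–4 scale, so a Year 2 average of 2.20 sits well inside Year 1's interval.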

The large amount of overlap shows that there is no measurable difference between the two years, yet this sort of analysis is not regularly done. Based on the few tests I have looked at for my school’s variance and sample size, it seems that around a 15% difference is a significant difference. While there are certainly nuances of statistics I don’t understand, I could easily accomplish this analysis based on a single college stats course and a spreadsheet program (LibreOffice Calc or Microsoft Excel). Instead of spending tens of thousands of dollars on data warehouse systems, schools should run their data through basic analysis and then only do further investigation on significant changes. Now that I know a 15% difference is the threshold for a real difference I can easily dismiss smaller variations as likely due to random noise. I would hope that administrators and board members would ask for this confidence interval for their schools and then use it as a filter before jumping to conclusions. I think it would surprise people at my school that even a 10% change from year to year is statistically insignificant. But that is the power of math: it reveals truths which are often counterintuitive. If we want to be data driven, we need to stop analyzing numbers with our gut and do the math.

PS- This example data does not show significant improvement in student learning, even given the assumption that a single 60 question multiple choice test is an accurate measure of student learning of a complex subject. If we start questioning the correlation between the results of the test and the actual student learning outcomes (success in college, career readiness, application of material learned in the real world), then we are even farther from being able to prove anything with these numbers.

Quickvisit/Walkthrough Form with Email Using Google Docs

Google Docs is a great way to create forms for doing classroom observations. However, the information in the form just goes to a spreadsheet, which isn’t very useful for providing teacher feedback. By copying a template in Google Docs you can have a quickvisit form which sends the teacher and observer an email with the information from the form in a nice human-readable format.

Step 1: Copy the template

  1. In google docs/drive choose Create->From Template
  2. Choose Public templates
  3. Search for quickvisit
  4. You should see CUSD Quickvisit Walkthrough with Email by Colin Matheson; click Use this template

Step 2: Add your staff emails

  1. Hover over the teacher dropdown and click the pencil icon that appears in the far right
  2. Enter the email addresses of your teachers (you have to do this one at a time)
  3. If that seems like too much work for now, you could change the Question type to Text and then type in the email address at the time of the walkthrough; however, by doing a little up-front work to enter emails in the form you will save time and prevent typing errors later
  4. You should also enter the emails of the main people doing the observations in the Observer question

Step 3: Turn on the script trigger

  1. Close the editing form window/tab and go back to your document list
  2. Click on the spreadsheet which is attached to the form (by default it is called “Copy of CUSD Quickvisit Walkthrough with Email”)
  3. Go to Tools->Script editor
  4. A new window/tab will open up, in that window choose Resources->Current script triggers
  5. A popup dialogue will appear which says “No triggers set up. Click here to add one now.” Click on that link
  6. Change the third menu from “On spreadsheet open” to “On form submit” and choose Save to save the script trigger
  7. A big box saying Authorization Required will appear, click Authorize
  8. You will then have to click Save again to actually have the trigger save
  9. Close the Script editor window

Step 4: Try it out

  1. Go to your form web address by choosing Form->Go to live form
  2. Enter in some practice data and see if the emails are sent

Advanced Customization

So let’s say you like the concept of this form, but would like to customize the questions, the email being sent, or use the email feature on an entirely different form. Some elements you can alter by just editing the form; others will require you to also edit the script that sends the emails. In order to get emails to work with another form that you have already written you will need to copy and edit the scripts. You can get to the scripts from the Tools->Script editor menu. The script is written in JavaScript, so if you have someone with a bit of web/programming experience they can help edit it for you, but you don’t need programming experience to customize the script. I have italicized all pieces of code in the text below to help them stand out.

Simple Edits: The dropdown menu answer options can be changed and the new answer options will show up in the email.

Medium Edits: If you want to change the text of a question, first you need to edit the form to change the question and answers. Then you need to go to Tools->Script editor to change the text in the email to reflect the changes you made. The email text code starts on line 21, but first we should understand line 7:

var dataRange = dataSheet.getRange(last,2,1,25);

This gets 25 columns, starting at column 2, from the last line of the spreadsheet. So if you have more or fewer questions you can alter those numbers. We don’t bother with column 1 because it is a timestamp. Later I load that info into a variable called array.

So starting on line 21 is the email:

emailTemplate =
"Hello, " + array[0][0]

Anything in quotes is actual text of the email. The array[0][0] means it takes the first bit of data from the spreadsheet (which is actually the second column). array[0][1] takes the info from the second question, array[0][13] takes info from the 14th question, etc. There is always a [0] before the actual number we are using, and the number is always in square brackets. If you add new questions to the Google form, they will be added to the last column of the spreadsheet, so first you need to change the number 25 on line 7 (var dataRange = dataSheet.getRange(last,2,1,25);) and increase it to reflect the number of questions you added (if you added two questions then you should change 25 to 27). Then you need to add some text to the email and pull the data from the spreadsheet, with array[0][25] to pull the first question you added and array[0][26] to pull the answer from the second question you added. If you delete questions from the form, Google Docs keeps them in the spreadsheet so you don’t have to alter any numbers in the array. You will just want to remove the reference to the deleted question from the email text.
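To make the indexing concrete, here is a tiny runnable sketch of what the spreadsheet data looks like once it is loaded and how the email text is assembled. The names and answers below are made up for illustration; in the real script the array comes from dataRange.getValues():

```javascript
// In the real script: var array = dataRange.getValues();
// getValues() returns an array of rows, so a single-row range looks
// like this (sample values are made up for illustration):
var array = [["Ms. Smith", "Room 12", "Direct instruction"]];

// Quoted strings are literal email text; array[0][i] pulls answers.
var emailTemplate =
  "Hello, " + array[0][0] +               // first question (spreadsheet column 2)
  "\nLocation: " + array[0][1] +          // second question (column 3)
  "\nStrategy observed: " + array[0][2];  // third question (column 4)
```

Because the whole range is one row, every lookup starts with [0]; only the second index changes as you move across the questions.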

Also of note is that on line 60 the script checks to see if a box “Email to teacher” was checked. This happens to be the 22nd column on my spreadsheet. If it is Yes, then an email is sent to the teacher. If not, then just an admin email is sent.

Advanced Edits: To use these scripts on an entirely new form, you can select the entire text of the script and copy the text. Then in your form, go to Tools->Script editor, create a new blank script, and copy the text in. You will then have to adjust all of the text in the email and the numbers in the array[0][change this number to refer to the question you want].

Let me know if you have other questions about the script or need help. I have responded to several emails and calls with people looking for specific help already, so feel free to contact me directly.

Why use Moodle, when you can use something shiny?

A fellow Moodler (Laurie Korte) asked for some talking points to convince her district to use Moodle as its primary collaboration tool.

In my district (and around the web) I often get folks saying “We should use X instead of Moodle” (Ning, Blogger, Edmodo, Facebook, Quia, etc.) and my general feeling is that while there are lots of strengths to each of those tools, they lack the ability to be the one-stop shop.

With Moodle we get one web address and one login that gives students and teachers ALL of the tools that would require dozens of other systems. Is Moodle the best blog tool, the best forum tool, the best wiki tool, the best quiz tool, the best survey tool, the best social networking tool? No. But it is the only web app I have found that has all of these in one spot.

Why is having all these tools in one place so important, and why is Moodle so good at meeting lots of needs?

  • Prevents login/password overload. Every new site is another web address, username, and password that must be created, managed and remembered. If the school has to create and manage those accounts, it becomes a lot of work. If the teachers and students have to manage/create the accounts, then most won’t bother. Moodle is good at getting login info from other databases and at sending login info to other systems. We have Moodle, Google Docs, and Mahara all tied to a single sign on with Moodle. Soon we will have the Moodle logins tied to our Active Directory accounts. (Edmodo and Ning won’t allow that kind of account management)
  • Prevents fragmentation of progress. One teacher uses Ning because it is really slick for social networking, another uses Quia for online quizzes. Can they share their success with each other? Only by learning an entirely new system and interface. In Moodle if one teacher uses Forums and another uses Quizzes they can share their strategies (or even copy the activity between their courses) and add new tools to their teaching with less “activation energy”. There will always be a few cutting edge teachers who can pick up new tools and experiment from year to year, but you shouldn’t try to make those tools “district standards”. As a district you want one system that you can be assured will meet all your teachers’ needs so you can move everybody forward.
  • Lack of consistency for students/parents. With one site, kids learn how to interact online and build on those skills year to year. This means the tech is more transparent. If every class/year kids have to learn a new system, there is more time wasted. (Yes I admit that learning new tech can be a waste of time if you do too much of it and don’t use it enough.)
  • Easy for teachers to grow into it. Moodle can start as just a static webpage. This gets a lot of teachers on board (even ones who aren’t on the Web 2.0 bandwagon). Then after a year or two they can add a forum here, a wiki there. After a while they have slowly integrated Web 2.0 into the classroom. Throw them into something like Ning and most teachers just won’t engage, because it is too much change.
  • Moodle is robust and expandable. Want to upload an Examview test bank? Want to allow 2 quiz attempts with a half hour delay? Want to monitor all student messages? Want to restrict a few students from messaging, but still allow them to take quizzes? Want to give one student the ability to moderate a forum? Want to give students an entire course to use for their project? No other tool I have found has so much power to customize. Also the plugin database full of 3rd party mods is awesome: I have added a nicer file upload system, a cool photo gallery, Google Apps integration, and site-wide message reporting. All for free and easy to set up.
  • Moodle can be used for professional development. Set up an anti bullying course and have teachers go through it. Do Blood Borne Pathogen certification quickly and easily on Moodle. Create a staff discussion forum. No need to have a separate log in for teachers to work with students and teachers to work together.
  • Finally, Moodle can be customized to look very nice.

So is Moodle right for your district/school? It may not be, but if you think about long-term, system-wide growth, Moodle looks very attractive.