Social Media and Digital Citizenship

We have an official school Twitter account on our website’s homepage, which re-tweeted a student’s post about the football game. If you clicked through to the student’s Twitter account, it was full of language inappropriate for school. Since I teach our students about digital citizenship, this got me thinking more deeply about the topic.

Current Thinking:
Social media is a powerful tool for connecting people, but there is no way to prevent exposure to content that a person might deem offensive other than to completely block the use of social media. Even with good intentions and reasonable diligence to prevent offensive content from being connected to school accounts, clicking through pages linked to our social media account could turn up content we do not control and that someone finds offensive. Every Twitter account of a business, school, public figure, or professional has to accept the fact that it shares the space with hateful speech, offensive jokes, and naked pictures.

For example, even without re-tweeting, a person could click on a Twitter account’s follower list and see who follows that account. Since people don’t control who follows their account, there might be an account (real or one of the millions of spambots) among the followers that posts offensive content, and it would take quite a bit of work to monitor and clean up a follower list.

Our students follow the school Twitter accounts, and their own Twitter pages will very likely have material which many people deem offensive, just as if we recorded their conversations in the halls we would hear some offensive speech.

However, students (and adults as well) have a hard time remembering that online interactions can be taken out of the context in which they were intended: late-night jokes between friends at home can be scrutinized in the principal’s office the next day. We have these discussions with students regularly, but it is a genuinely difficult line to walk: being social and engaged online while never posting something that someone in another context might find offensive.

Here are some factors that make it hard for people to keep their online behavior squeaky clean:

  1. The volume of social media is so huge that it is unlikely any particular thing a person posts will come back to hurt them. So people learn they can be edgy or offensive without any negative feedback.
  2. There are plenty of examples of people doing and saying whatever they want on social media and getting praise, so people learn that one way to get attention is to push the boundaries of what is appropriate.
  3. Social media is largely insular, in that most people are only posting for, and being actively read by, a handful of friends. So a behavior or style of speech (like using racial or gender slurs) can be reinforced in that small group setting without any negative feedback.
  4. Social media is fast, a particular topic quickly becomes irrelevant, and so people learn to post quickly so that they are involved in the topic of the moment.
  5. People’s internal thoughts are not as appropriate as their social behavior, and interacting with a phone or computer screen doesn’t provide the social feedback cues that come from speaking to another person. So, paradoxically, people’s public posts can contain content closer to the thoughts inside their minds than to the thoughts they normally share with others.

All of these factors work against the thoughtfulness that we encourage during our yearly digital citizenship presentations. I would love to hear your thoughts.

Secure Browser for testing on Moodle using Chromebooks or Win/Mac

We recently set up a secure browser for our student Chromebooks so that when they take quizzes/tests they can’t open any other tabs/windows or take screenshots.
The steps involved:

A. Turn on the Safe Browser Option in Moodle

Go to Site Administration->Development->Experimental and check the box


(You can also just search the administration settings for “Safe”.)
I don’t know in which version of Moodle this feature was added, but it is part of the core Moodle code, so you don’t need to worry about 3rd party plugins.

B. Create a Chromebook Kiosk app

Here is a link to a copy of my complete kiosk mode app.
To change it for your site you will need to download the folder and remove the “copy of” prefix from all the file names. Then edit a few of the files:

In the application.html file:

  1. change the title to be the name of your site
  2. in the webview line, change the src="" address to be the address of your site

In background.js, you could change the ‘id:’ field to be the name of your site (I don’t think this step will have an impact, but you might as well put your school name there).
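
For reference, the background.js of a Chrome kiosk app is typically only a few lines. Here is a minimal sketch of the general shape (the window id string below is just a placeholder, not the value from my app; application.html is the file that holds the webview pointing at your Moodle site):

// minimal sketch of a kiosk app background script
chrome.app.runtime.onLaunched.addListener(function () {
  chrome.app.window.create('application.html', {
    'id': 'yourSchoolSecureBrowser',  // cosmetic; put your school name here if you like
    'state': 'fullscreen'             // kiosk apps run full screen with no browser chrome
  });
});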

Then you will need to publish this app by zipping this folder and uploading it to the Chrome developer site.
I published my app as private to our school domain.

C. Push the kiosk app out to student Chromebooks and blacklist user-agent switching apps in the Chrome web store and/or install the Safe Exam Browser on Windows/Mac

  1. Go to Chrome Management > Device Settings > Kiosk Settings > Single App Kiosk, select Allow Single App Kiosk for devices in the organizational unit you select.
  2. Click Manage Kiosk Applications. In the dialog that appears select the exam kiosk app you want to use. You can search for it on the Chrome Web Store, or manually install it if you have the app ID and URL by selecting Specify a Custom App.
  3. Make sure the devices you want to administer the exam with are under the organizational unit you select for the kiosk app.
  4. Student Chromebooks will then get an Apps menu on the login screen with your secure browser, which opens to your Moodle site.
  5. Now blacklist user-agent switching extensions so that tricky students can’t pretend to be using the safe browser when they are not.
  6. For Windows/Mac, install the Safe Exam Browser client and configure its settings to point to your site.

D. Done

Now teachers can set up a quiz with the Safe Browser option turned on in the quiz settings page under Extra Restrictions on Attempts. If students try to take the quiz while logged in normally, they will be told they need to use the Safe Exam Browser.
To use the Safe Exam Browser kiosk app, students must log out of their Chromebooks and look for an Apps menu in the bottom left menu bar. They then launch the secure browser and log into Moodle normally to take the quiz. Once they are done, they close the secure browser and can then log into the Chromebook normally to do other work.

A tale of 400 netbooks

Our district spent $140,000 on student devices this year. We could have bought 230 iPads with 9.7″ screens (an estimated $600 each including a keyboard case and a few apps), but instead we bought 400 netbooks with 11.6″ screens ($350 each for the Acer Aspire One 722).

  • We bought netbooks because we saw that the web was powerful, we noticed that a lot of the web is words, and we discovered that it is easier to write words on a full, physical keyboard.
  • We bought netbooks because we saw that the web was powerful and we noticed that some of the most exciting tools for the web are very limited on a tablet (e.g. Google Docs real-time collaboration, comments, and presentations, and another site’s peer editing and originality report).
  • We bought netbooks because with Ubermix they boot quickly, are easy to manage, are user friendly, and come full of free educational apps.
  • We bought netbooks because while text is great, multimedia is important and we wanted multitrack audio editing (audacity), layered photoshop quality image editing (GIMP), and video editing (OpenShot) for free.
  • We bought netbooks because we wanted kids to be able to take a picture using the webcam and upload that picture to any site using a file manager (like our cool Moodle glossary of math terms).
  • We bought netbooks because we wanted kids to be able to print (using our existing printers and from home when they start taking them home).
  • We bought netbooks because Scratch is one of the best apps for students to create and share multimedia, animations, and games while learning the fundamentals of algorithmic thinking (programming, problem solving).
  • We bought netbooks because now that kids actually get to have a device that they keep, they can start to customize it, hack it, learn to program, and then restore the device when they mess up.
  • We bought netbooks because layout and design with a full office suite allows for some great professional work (like a 3 column brochure, a graph from a spreadsheet with a trend line, and a full featured presentation).
  • We bought netbooks because we didn’t want to tie our students’ education to a particular corporation, so that they would know that they have options and choices when it comes to the technology they use, like LibreOffice and Firefox.

I have nothing against tablets and I have nothing against Apple. However, I looked at all these great things our students could do with netbooks and then came the kicker. We bought netbooks because we could give 170 more students hands-on access to all of the tech tools that will help transform their education. Maybe an iPad can do some things that a netbook can’t (and maybe a netbook running Ubermix can do some things an iPad can’t), but being able to give 70% more students access to technology is something that seems hard to argue against.

Can You Be Data Driven Without Statistics?

Have you ever seen a margin of error reported on a state test result or an error bar on a state test graph? Has anyone ever reported a p value, an R squared, a standard deviation, median, or any other statistical measure along with a test result? Frankly I can’t recall even seeing an average (mean) when state tests are discussed. If we are truly trying to be data driven in our decisions as educators and institutions, I believe we need to do some basic data analysis to understand this data or else we end up in a state of DRIP (Data Rich, Information Poor).

Scientists/statisticians will tell you that the result of a test is not a single, true number, but a number with an error margin (or confidence interval) around it. So in political polls you will see 55% +/- 3%, because we understand that if we polled multiple times on the same day using the same polling method, we would get variation in the result. This is true for students taking tests as well; however, the information about the amount of variance is unpublished. Why care? Because important decisions are based on whether test scores rise or fall. So a department in a school might be put under increased pressure if their scores fell by 5%. However, if the variance of the test is +/- 8%, then a 5% change is insignificant. That is, it is not possible to say whether the decrease in test scores is due to students actually knowing less or due to random chance and natural variation.

Error bars grow with a smaller population size and with a wider range of results. So it is more difficult to make solid claims about change for a single class or department than for an entire district, and it is more difficult to make solid claims about change for a group of students with a wide range of abilities than for a group of students with similar abilities.
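
In formula form (this is standard statistics, nothing specific to state tests), the margin of error around an average is

\text{margin of error} = t^{*}\,\frac{s}{\sqrt{n}}

where s is the standard deviation of the scores, n is the number of students, and t* is the critical value for the confidence level you choose (about 1.645 for 90% confidence with a large n). A bigger n shrinks the error bar; a bigger spread s widens it.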

Let’s look at an example of actual data, first without statistics, then with statistics. Here is a table of test results showing the number of students at each performance level:

Performance level   Year 1   Year 2   Change
Far Below Basic       12       11       -1
Below Basic           21       27       +6
Basic                 62       48      -14
Proficient            36       52      +16
Advanced              11       14       +3

This data suggests some improvement in test scores based on the “squint” analysis technique, i.e. getting a general impression from the amount of green. In fact, if you average the test scores on a scale of 0 = Far Below Basic to 4 = Advanced, you do see an improvement from 2.09 to 2.20, which is a 5% improvement.
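
For anyone who wants to check the arithmetic, those two averages come from weighting each score (0 = Far Below Basic through 4 = Advanced) by the number of students in the table:

\bar{x}_{\text{Year 1}} = \frac{0(12) + 1(21) + 2(62) + 3(36) + 4(11)}{142} \approx 2.09
\bar{x}_{\text{Year 2}} = \frac{0(11) + 1(27) + 2(48) + 3(52) + 4(14)}{152} \approx 2.20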

However, to truly state the facts we have to include some information about the variance. Based on the standard deviation and population size, the 90% confidence interval for these averages is 14%, which means the year 2 numbers actually have to be 14% greater to show a significant increase. Another way of thinking about confidence intervals is that we are 90% certain the test results are +/- 14% from the reported value. When running a t-test to see if the two averages are significantly different, the p value is .35, with .1 or less being considered significant in psychology. Here is a graph showing the average test scores with 90% confidence intervals for error bars.
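
If you want to reproduce this kind of check yourself, a few lines of code (or the equivalent spreadsheet formulas) will do it. The JavaScript below is a rough sketch rather than the exact spreadsheet calculation behind the numbers above; it uses a simple normal approximation (z = 1.645) for the 90% margin of error, so its interval will not exactly match the 14% figure, but the averages come out to the same 2.09 and 2.20.

// Rough sketch: average score and 90% margin of error from the counts in the table above.
function summarize(counts) {
  // counts[score] = number of students at score 0 (Far Below Basic) through 4 (Advanced)
  var n = 0, sum = 0;
  for (var score = 0; score < counts.length; score++) {
    n += counts[score];
    sum += score * counts[score];
  }
  var mean = sum / n;
  var sumSquares = 0;
  for (var s = 0; s < counts.length; s++) {
    sumSquares += counts[s] * Math.pow(s - mean, 2);
  }
  var sd = Math.sqrt(sumSquares / (n - 1));        // sample standard deviation
  var marginOfError = 1.645 * sd / Math.sqrt(n);   // 90% confidence, normal approximation
  return { students: n, mean: mean, marginOfError: marginOfError };
}

console.log(summarize([12, 21, 62, 36, 11]));  // Year 1: mean comes out near 2.09
console.log(summarize([11, 27, 48, 52, 14]));  // Year 2: mean comes out near 2.20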

The large amount of overlap shows that there is no measurable difference between the two years, yet this sort of analysis is not regularly done. Based on the few tests I have looked at for my school’s variance and sample size, it seems that around a 15% difference is a significant difference. While there are certainly nuances of statistics I don’t understand, I could easily accomplish this analysis based on a single college stats course and a spreadsheet program (LibreOffice Calc or Microsoft Excel). Instead of spending tens of thousands of dollars on data warehouse systems, schools should run their data through basic analysis and then only do further investigation on significant changes. Now that I know a 15% difference is the threshold for a real difference, I can easily dismiss smaller variations as likely due to random noise. I would hope that administrators and board members would ask for this confidence interval for their schools and then use it as a filter before jumping to conclusions. I think it would surprise people at my school that even a 10% change from year to year is statistically insignificant. But that is the power of math: it reveals truths which are often counterintuitive. If we want to be data driven, we need to stop analyzing numbers with our gut and do the math.

PS- This example data does not show significant improvement in student learning, even given the assumption that a single 60 question multiple choice test is an accurate measure of student learning of a complex subject. If we start questioning the correlation between the results of the test and the actual student learning outcomes (success in college, career readiness, application of material learned in the real world), then we are even farther from being able to prove anything with these numbers.

Quickvisit/Walkthrough Form with Email Using Google Docs

Google Docs is a great way to create forms for doing classroom observations. However, the information from the form just goes to a spreadsheet, which isn’t very useful for providing teacher feedback. By copying a template in Google Docs you can have a quickvisit form which sends the teacher and observer an email with the information from the form in a nice human-readable format.

Step 1: Copy the template

  1. In Google Docs/Drive choose Create->From Template
  2. Choose Public templates
  3. Search for quickvisit
  4. You should see CUSD Quickvisit Walkthrough with Email by Colin Matheson; click Use this template

Step 2: Add your staff emails

  1. Hover over the teacher dropdown and click the pencil icon that appears at the far right
  2. Enter the email addresses of your teachers (you have to do this one at a time)
  3. If that seems like too much work for now, you could change the question type to Text and then type in the email address at the time of the walkthrough; however, by doing a little up-front work to enter emails in the form you will save time and prevent typing errors later
  4. You should also enter the emails of the main people doing the observations in the Observer question

Step 3: Turn on the script trigger

  1. Close the editing form window/tab and go back to your document list
  2. Click on the spreadsheet which is attached to the form (by default it is called “Copy of CUSD Quickvisit Walkthrough with Email”)
  3. Go to Tools->Script editor
  4. A new window/tab will open up, in that window choose Resources->Current script triggers
  5. A popup dialogue will appear which says “No triggers set up. Click here to add one now.” Click on that link
  6. Change the third menu from “On spreadsheet open” to “On form submit” and click Save to save the script trigger
  7. A big box saying Authorization Required will appear; click Authorize
  8. You will then have to click Save again to actually save the trigger
  9. Close the Script editor window

Step 4: Try it out

  1. Go to your form web address by choosing Form->Go to live form
  2. Enter in some practice data and see if the emails are sent

Advanced Customization

So let’s say you like the concept of this form, but would like to customize the questions, the email being sent, or use the email feature on an entirely different form. Some elements you can alter by just editing the form; others will require you to also edit the script that sends the emails. In order to get emails to work with another form that you have already written, you will need to copy and edit the scripts. You can get to the scripts from the Tools->Script editor menu. The script is written in JavaScript, so if you have someone with a bit of web/programming experience they can help edit it for you, but you don’t need programming experience to customize the script. I have italicized all pieces of code in the text below to help them stand out.

Simple Edits: The dropdown menu answer options can be changed and the new answer options will show up in the email.

Medium Edits: If you want to change the text of a question, first you need to edit the form to change the question and answers. Then you need to go to Tools->Script editor to change the text in the email to reflect the changes you made. The email text code starts on line 21, but first we should understand line 7:

var dataRange = dataSheet.getRange(last,2,1,25);

This grabs one row and 25 columns from the last line of the spreadsheet, starting at column 2. So if you have more or fewer questions you can alter those numbers. We don’t bother with column 1 because it is a time stamp. Later I load that info into a variable called array.
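
For orientation, the code around line 7 has roughly this shape (a sketch only; the function name is illustrative and the real template differs in its details):

function sendQuickvisitEmail() {
  var dataSheet = SpreadsheetApp.getActiveSpreadsheet().getSheets()[0];
  var last = dataSheet.getLastRow();                  // the row the form submission just added
  var dataRange = dataSheet.getRange(last, 2, 1, 25); // 1 row, 25 columns, starting at column 2
  var array = dataRange.getValues();                  // 2D array: array[0][0] is column 2, array[0][1] is column 3, ...
  // ...line 21 and onward build the email text from array[0][...]...
}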

So starting on line 21 is the email text:
emailTemplate =
"Hello, " + array[0][0]

Anything in quotes is actual text of the email. The array[0][0] means it takes the first bit of data from the spreadsheet (which is actually the second column). array[0][1] takes the info from the second question, array[0][13] takes the info from the 14th question, and so on. There is always a [0] before the actual number we are using, and the number is always in square brackets. If you add new questions to the Google form, they will be added to the last columns of the spreadsheet, so first you need to change the number 25 on line 7 (var dataRange = dataSheet.getRange(last,2,1,25);) and increase it to reflect the number of questions you added (if you added two questions then you should change 25 to 27). Then you need to add some text to the email and pull the data from the spreadsheet, with array[0][25] to pull the answer from the first question you added and array[0][26] to pull the answer from the second question you added. If you delete questions from the form, Google Docs keeps them in the spreadsheet, so you don’t have to alter any numbers in the array. You will want to remove the reference to the deleted question from the email text.

Also of note: on line 60 the script checks to see whether the “Email to teacher” box was checked. This happens to be the 22nd column in my spreadsheet. If it is Yes, then an email is sent to the teacher. If not, then just an admin email is sent.
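
Putting those pieces together, the email-building and sending logic looks roughly like this (a sketch, not the template’s exact code; the question labels, subject line, and which columns hold the teacher and observer addresses are illustrative, so check them against your own spreadsheet):

var emailTemplate =
  "Hello, " + array[0][0] + ",\n\n" +  // first question (spreadsheet column 2), e.g. the teacher
  "Observer: " + array[0][1] + "\n" +  // second question
  "Notes: " + array[0][13] + "\n";     // any other question, pulled by its index

MailApp.sendEmail(array[0][1], "Quickvisit feedback", emailTemplate);    // email the observer

// the line 60 check: "Email to teacher" is column 22, i.e. array[0][20]
if (array[0][20] == "Yes") {
  MailApp.sendEmail(array[0][0], "Quickvisit feedback", emailTemplate);  // also email the teacher
}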

Advanced Edits: To use these scripts on an entirely new form, you can select the entire text of the script and copy it. Then in your form, go to Tools->Script editor, create a new blank script, and paste the text in. You will then have to adjust all of the text in the email and the numbers in array[0][change this number to refer to the question you want].

Let me know if you have other questions about the script or need help. I have responded to several emails and calls with people looking for specific help already, so feel free to contact me directly.