Hello, and apologies for the long absence. You know how things can get busy…
Actually, I’ve got the best excuse of all for being away—I’ve been extra busy this year teaching the new AP Computer Science class. Organizing, developing materials for, and teaching that class has taken up just about every spare work moment I’ve had. I’m not unhappy about that at all—having the opportunity to work on any new course, and especially that one, is an exciting experience, and I’ve learned a lot of lessons this year, lessons that I’ll tell you about soon.
In the meantime, let’s talk briefly about Legacy vs. Transition.
Here, I’m referring to legacy in its modern digital sense: legacy software is software that is not the most current, but which is still supported to some extent, perhaps by virtue of the fact that it was very popular at one point, and its use is still widespread. (The adjective legacy may be extended to other uses as well, but the software context is a common one.)
Last year Microsoft announced the new version of its venerable Office suite, now called Office 365. Along with whatever features that new software includes, it also comes with a new licensing strategy. Under this new system, a one-time license to use the software is not purchased outright; rather, the user pays a monthly fee for the right to use the software. It’s a classic example of the software as a service model for software distribution, and it’s certainly within Microsoft’s rights to transition to such a model. Google has been doing it for years, and if Google doesn’t charge cash money for the service, I’ve certainly paid for their services in other ways (including my privacy every time I send or receive an email from someone with a GMail account).
After losing the ability to read ClarisWorks (and then AppleWorks) documents a number of years ago, I made the decision to transition to using Microsoft Office products, with the intention of avoiding the kind of data loss that comes from using products that have a shorter lifespan. I have gigabytes of Microsoft Office documents on my hard drive, from my own handouts, worksheets, tests, and letters of recommendation to documents that have been shared with me by practically every person with whom I have a professional relationship.
Microsoft’s new licensing plan, however, was just the impetus I needed to start thinking about transitioning to a new system. I still have the Microsoft Office suite on my computer and occasionally still edit legacy documents using those applications. New documents, however, are being created using LibreOffice, which is a serious attempt at providing free software to support the creation of OpenDocument (.odt) files.
You can see a screenshot of the LibreOffice word processing interface above, and the similarities between it and Word are such that you shouldn’t find yourself too disoriented.
LibreOffice includes translation features to bring Word documents (.doc and .docx) over, and how well your files will be translated depends in part on how hard you’ve pushed Word’s feature-packed capabilities—I haven’t explored those capabilities much yet.
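For migrating a whole folder of old files at once, LibreOffice also exposes its converter on the command line. This is a hypothetical sketch (the `converted` output folder name is mine, and the binary may be named `soffice` or `libreoffice` depending on the platform):

```shell
# Hypothetical batch conversion of legacy .doc/.docx files to
# OpenDocument format using LibreOffice's command-line mode; run
# from the folder containing the Word documents. Skips cleanly if
# LibreOffice isn't installed.
if command -v soffice >/dev/null 2>&1; then
    soffice --headless --convert-to odt --outdir converted ./*.docx
fi
```

Heavily formatted documents are the most likely to suffer in translation, so it’s worth spot-checking a few converted files before trusting a big batch run.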
For the moment, I’m just enjoying adapting myself to the new system, and creating a new series of documents that, going forward, won’t require an ongoing investment with the powers that be at Micro$oft.
In the grand scheme of things, worrying about the long-term viability of your electronic documents might not be something that you want to think about… but it merits some consideration. I have songs made with music-mixing software that I no longer have access to. (I have final mixes of the music, but the software itself no longer works; I am unable to create new mixes of the music.)
JPG graphics images, carefully edited and compressed fifteen years ago when dial-up connections were still a thing, look terrible on the high-resolution Retina Display of an iPad. (At least the colors of those images haven’t faded with time, which is more than I can say for the paper-based photos in an old photo album of mine.)
Hardware legacy is something to consider, too. I have FireWire hard drives, but my laptop doesn’t have a FireWire port. I have Zip disks from the 90s, too, and no Zip drive to put them into. Fortunately, I copied everything from those drives and disks onto a USB external hard drive a few years ago, when I still had access to machines that could read them. I was either smart or lucky to have anticipated the transitions that would have to happen down the road.
I don’t think there’s a one-size-fits-all answer to the challenges posed by aging software and hardware. Some people spend enormous amounts of time and energy making sure that they always have copies of everything digital—I tend to lean toward that end of the spectrum, as you might imagine—and others don’t want or need to keep anything of their digital life.
I don’t know any of those people, though, so I can’t really speak to that.
The holidays are no time to get any rest. Oh, no, there’s too much going on–parties, holiday shopping, out-of-town visitors–to actually get any down time. No, to actually get a chance to relax, you have to resort to more drastic measures… like getting sick.
That’s my genius plan, and it’s working just great.
While I’m sitting around waiting for my body’s defense mechanisms to do their thing, I’ll just include a quick year-end pointer here to one of Audrey Watters’s year-end Trend posts, this one on Computer Science in schools:
Despite the proliferation of these learn-to-code efforts, computer science is still not taught in the vast majority of K–12 schools, making home, college, after-school programs, and/or libraries places where students are more likely to be first exposed to the field.
There are many barriers to expanding CS education, least of which is that the curriculum is already pretty damn full. If we add more computer science, do we cut something else out? Or is CS simply another elective? To address this particular issue, the state of Washington did pass a bill this year that makes CS classes count as a math or science requirement towards high school graduation. Should computer science – specifically computer science – be required to graduate? In a Google Hangout in February, President Obama said that that “made sense.” In the UK, computing became part of the national curriculum.
She has a bit more to say on the subject, but her thoughts echo many of my own. Does everyone really need to “Learn to Code”? How important is Computer Science in the midst of an already bulging academic curriculum? How can educators and the tech industry best reach out inclusively to students on behalf of an industry that is not only famously non-inclusive, but downright hostile to some demographics?
It’s a problem that merits discussion at all levels, and there are certainly institutional responses that might be pursued. As I expand my role as a computer science educator I may even become involved in some of those—that’s certainly my intention.
In the meantime, I consider myself on the ground doing the front-line work without which nothing else matters. “For this assignment, students, we’re going to…”
If you’re not doing something cool with your computer science, well… what’s the point, really? ;)
Merry Christmas and Happy Holidays, everybody. See you in the New Year!
You may have heard about the Hour of Code this past week, a 5-day educational technology event sponsored by Code.org that is meant to inspire future generations of computer scientists and computational thinkers: by spending just an hour working on a computer science–related project—playing with a coding simulation, building a game, solving an algorithmic puzzle—students of any age will come away with a better understanding of computer science, and perhaps be inspired to study it further, either in school or on their own. As a computer science teacher, I’d had the event on my radar for a few months, and it sounded intriguing enough that I proposed it to our school directors, who were immediately excited about the possibilities.
Fast forward through two months, lots of meetings, some curriculum development, and a website, and I’m happy to report that Hour of Code was a rousing success at Poly. We decided early on to target the fifth and seventh grades at the school, and I decided early on to create a curriculum—part coding, part computational-thinking discussion—that would work with our students. It certainly helped that we have an entire Apple iMac computer lab, and that I was free to install a user-friendly text editor on those machines.
As I write this, we’ve finished working with the two classes of fifth graders, who thoroughly enjoyed the experience. We talked, we coded, and they walked away with an official and personalized Code.org Certificate of Completion as well as a printout of their code and corresponding Python turtle-graphics art. (Little Marco enjoyed the experience so much that he was quite put out when the lab had to be vacated before he’d put the finishing touches on his masterpiece. I learned later that the first thing he did when he got home from school that day was to plop down in front of the computer and finish his program.)
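For a flavor of what the fifth graders’ turtle-graphics programs involved, here’s a minimal sketch—hypothetical, not the actual worksheet code—of a polygon-drawing exercise. The pen can be any object with `forward()` and `right()` methods, which in class would be a `turtle.Turtle()`:

```python
# Hypothetical beginner exercise: walk a pen around a regular polygon.
# In class, pen would be a turtle.Turtle(); any object that provides
# forward() and right() methods will work.

def polygon_turn(sides):
    """Exterior angle (in degrees) the pen turns at each corner."""
    return 360 / sides

def draw_polygon(pen, sides, length):
    """Draw one regular polygon: repeat (go forward, turn right) per side."""
    for _ in range(sides):
        pen.forward(length)
        pen.right(polygon_turn(sides))
```

With the real module, `import turtle; draw_polygon(turtle.Turtle(), 4, 80)` draws a square, and looping `sides` from 3 up to 8 nests a triangle through an octagon—instant “masterpiece” material.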
Crucial to the success of the day was the support of a large number of people, including our division Ed Tech coordinators, our Director of IT, the teachers who gave us class time to work with their students, and three of my own Upper School students who came down to assist the younger students. We had teacher visitors from other schools in attendance as well, including a professor from Caltech’s Center for Advanced Computing Research. (I don’t think he was scouting our fifth graders for prospective students, but you never know…)
The participation of all these people was vital: advancing technology use in schools is not just about getting new hardware. As a gentle reminder of this fact, our seventh grade sessions—tentatively scheduled for this week—had to be postponed due to some scheduling conflicts. All is well, though, and we’ll be running a more sophisticated Hour of Code session—one that delves into recursion—with our seventh graders at the end of January.
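As a taste of where a recursion session can go, here’s one hypothetical sketch (not the actual January lesson): a Koch-curve segment defined in terms of four smaller copies of itself, expressed as turtle-style commands so the structure is visible even without a graphics window.

```python
# Hypothetical recursion demo: each Koch segment is four smaller Koch
# segments joined by turns. Returns a list of turtle-style
# (action, amount) pairs that a turtle could replay.

def koch_commands(length, depth):
    if depth == 0:
        return [("forward", length)]
    sub = koch_commands(length / 3, depth - 1)
    return (sub + [("left", 60)] + sub + [("right", 120)]
            + sub + [("left", 60)] + sub)
```

Feeding the commands to a turtle draws the curve; the fact that the command count grows as 4·L(n−1) + 3 makes a nice seventh-grade discussion point in its own right.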
For further information about Poly’s Hour of Code, including code examples, the presentation slides, or a zipped file containing both, see Polytechnic Hour of Code.
I blame it on the fact that I’m teaching a new course.
As I teach AP Computer Science—developing curriculum, assignments, and lessons for that class, and trying to figure out what works and what doesn’t—I make lots of mid-course adjustments. Not every assignment needs to be perfect, perhaps, but if I don’t finish addressing all the concerns in a given lesson, it’s hard to have very high expectations for the work that students will do that evening.
And in an AP course, time needs to be used wisely. I can’t afford to be expanding units when there’s a certain amount of material that must be covered by the end of the year.
Fortunately I’ve been able to leverage YouTube and GoToMeeting videoconferencing software to take up some of the slack while I get my act together. A 3-minute follow-up to a lesson, emailed to students, can help to proactively clear up a lot of confusion. Likewise, being available for online office hours, during which students can share their screens with me and we can debug their programs… that’s invaluable.
And although I’ve usually worked on the computer in the past, with a voiceover that describes what I’m doing, it’s often useful to “do a Khan” (as in Sal Khan, of Khan Academy), and just write some stuff out. I don’t have any evidence to back me up here, but my gut says that there’s an enormous cognitive benefit to developing things progressively, and by hand.
Here’s an example of a combination of drawing and computer analysis, done not for the AP Comp Sci class but in preparation for an Hour of Code unit that I’ll be using with some students. See what you think:
Do you see any advantage to demonstrating things in long form, as opposed to doing voiceovers with slides or computer displays?
We’ve just completed the first quarter of the school year, and I’m loving (and for the moment surviving) the opportunity to teach a new course: AP Computer Science.
I actually began my teaching career in 1986 as the instructor of a computer programming class, first using BASIC, and then Pascal, on IBM XTs–the original beige PC. This was well before you crazy kids had access to the InterWebs, but we loved our computing machines just the same.
So it’s funny, and fun, to be teaching Computer Science again, and it’s exciting to be participating in that daily experiment we call “teaching,” in which the instructor hypothesizes about what might be an effective tool or strategy for working with a class, tries it out, and then goes home to clean up the mess of those experiments that–wonderfully or tragically–failed.
I’m finding out that my students this year have a wider range of abilities than I’m used to seeing in the AP Physics class I teach. The possible reasons for that wide range don’t really matter; I’m there to teach the students who are in the class, meet them all wherever they are, and see what I can do to help guide them in learning the subject.
How do you actually do that, though? How, practically, do I provide instruction and lessons for a classroom full of students, some of whom are going home and writing their own Blackjack programs just for fun, while others are having profound difficulties applying concepts that they appeared to have understood well just the day before?
The act of providing these varying levels of support in a single class has earned the buzzphrase differentiated instruction, and here’s what I’ve developed for a typical lesson:
a whiteboard-based overview
whiteboard-based pseudocode
freestyle coding for advanced students
template-based support for intermediate students
solution-based support for students who need the most help
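To make the last three tiers concrete, here’s a hypothetical illustration for a single toy exercise—counting vowels—written in Python for brevity (the actual class materials aren’t reproduced here):

```python
# Hypothetical three-tier scaffold for one exercise: "count the vowels
# in a string." Not the actual class handout.

# Tier 1 -- freestyle: advanced students get only the spec:
#   "Write count_vowels(text) that returns the number of vowels in text."

# Tier 2 -- template: intermediate students get the structure with a gap:
def count_vowels(text):
    total = 0
    for ch in text.lower():
        # TODO: add 1 to total when ch is a vowel
        pass
    return total

# Tier 3 -- solution: students who need the most help get a worked
# version to trace, test, and then modify:
def count_vowels_solution(text):
    total = 0
    for ch in text.lower():
        if ch in "aeiou":
            total += 1
    return total
```

Everyone ends up writing the same function; what varies is how much of the path is already laid down for them.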
Wanna see it in action? Here’s a 7-minute documentary-style rundown, complete with footage of the kids hard at work.
At 34:20, Kevin starts talking about the challenges of dealing with Amazon.com / Amazon.co.uk licensing differences for electronic textbooks, with a separate Google account needed to manage each one. Even once he gets this solved, he’s still concerned that notes taken in the textbook for one country are stuck in one Amazon cloud, and inaccessible from another.
A math/computer science teacher outlines in gruesome detail his efforts to get a new “recording his class lessons” workflow going after the untimely death of his laptop. Sample entry:
I would love to drop the USB Mic too if I could figure out how to use the Droid’s Mic with this configuration. I would then be truly wireless! This new incarnation of the Kindle has an 8.9″ HD screen, dual WiFi, dual speakers, dual cores as well as a webcam and mic. I think there’s a version of Teamviewer, called Teamviewer for Meetings, that uses VOIP so I wouldn’t need a separate Mic. IDK if it’s free or cheap. I suppose I could go back to using a wireless lapel mic? Maybe I could use a BlueTooth Headset Mic? You see, my lapel mic disappeared after Hurricane Sandy destroyed the Math Building at my High School….
I am also experimenting with other Remote Desktop Protocol (RDP) apps such as SplashTop. I’m using Splashtop2 for Droid and SplashtopStreamer for Windows.
I’m focusing on Teamviewer and Splashtop as these Desktop streamers are available for both Windows and Linux and the client app is available for Droid. I usually have to use Windows whenever I’m on the road, say at a conference. However, I usually use Linux all day every day at the High School. Further, all my tablets are now Droids!
This article, referred to me by my friend Cindee, relates how one teacher, reflecting on frustrations encountered while teaching Python, eventually developed a technology-based workflow that allows him to give students better access to the materials covered in class. (More relevant to computer science classes than traditional subjects.)
It’s a kerfuffle all the way ’round, and everybody’s got something critical to say about the situation, from the large scale of the roll-out to the money involved, from the choice of device to the sloppy execution. Everybody except perhaps Audrey Watters, who says this is what we should be teaching kids to do anyway.
And for me: Google Saves the Day?
My own frustrations are perhaps minor compared with some of these, and I’d like to think they won’t cost 1 billion dollars to solve (the projected cost of LAUSD’s iPad program). One of my recent discoveries: Google Docs and Presentations, used by many teachers and students, don’t have a notifications option that will inform a document’s shared users when that file is edited. Google Spreadsheets offers this option, but Docs and Presentations don’t.
So my genius plan for conducting an ongoing conversation with colleagues via one of those documents hit a bit of a snag, and while there is a workaround–we wouldn’t be education technologists without our workarounds, would we?!–it shows again that trying to find a solution to some of these things is sometimes / often / usually harder than we’d like it to be.
The reality is that I’m grateful for Google’s shared documents, which are increasingly a cornerstone of many teachers’ workflows. It’s good enough that I almost don’t mind them mining my data so that they can more efficiently sell me ads.
Hang in there, people. We’ll get this figured out one of these days soon… :)
You’ve almost certainly heard of Maslow’s Hierarchy of Needs, which describes five levels of needs, in ascending order, that lead toward fully realizing one’s human potential.
Those needs are summarized in the triangle below, with an important addition at the very base of the pyramid, courtesy of the Internets.
It’s funny in part because it’s true, at least as far as educational technology is concerned: if you don’t have a wireless signal at your school that students can use to access the Internet, well… it’s going to be pretty hard for you to do anything technology-related.
Okay, maybe you need hardware—I’ll give you that. But hardware by itself doesn’t really cut it anymore. (Yes, I know you’re leaning back and thinking fondly of the days when we could give a kid a multimedia CD-ROM, point them towards a computer, and pretend that we were teaching them. Those days are over!)
And depending on your classroom setting, the hardware issue may already be solved: your students are in a 1-to-1 program, or a Bring Your Own Device program… or maybe you’ve got a critical mass of smartphones that some of your students already own. There are lots of ways this could work out.
And from there, it’s up to you, you and the students, what you want to do with this technology, and how you want to leverage it. Web-based research assignments? Shared Google Docs (either via Google Apps for Education or students’ private Google accounts) for students submitting cooperative work? Web pages? Mobile apps?
With apologies to Maslow, then, here is an updated Hierarchy of Needs for Educational Technologists. There are thousands of technology-facilitated things you can do in the classroom, but it all begins with a device and a connection to the Internet.
There are perhaps a few elements missing here: administrative support for new ideas, new hardware, or new software? And certainly professional development funding/time for inexperienced teachers is always needed.
What else have I missed? Or are these really the essentials that are needed for successful deployment of Educational Technology at a school?
At Laura Holmgren’s request, last spring I wrote what became the inaugural post at poly360.org, a blog for the independent school community in which I work.
I’m fortunate to work in a community where the topics covered in that post are actually part of ongoing, day-to-day discussions I get to have with other teachers and technologists.
I’m cross-posting the piece here.
The Intersection of Teaching, Learning, and Technology
Richard White – 360 Reflection
When I was nine years old I read Danny Dunn and the Homework Machine, a story in which Danny and his friends Joe and Irene program a computer to do their homework for them. At that time the personal computer was still a fantasy, but the possibility of being able to have a machine handle my academic chores–my learning–was absolutely intoxicating.
Fast-forward a few years: I’d gone from programming a mainframe in high school to majoring in Computer Science in college, and then from teaching computer programming in high school on IBM PCs (pre-Internet!) to teaching AP Physics in Berkeley. I’d re-discovered the book from my childhood–there’s my name on the inside, written in my mother’s neat cursive–and read again about Danny’s hard-earned lesson: that programming a computer is not a shortcut to learning. The last page of the book, though, opens up a new possibility:
“Danny had a strange, wild look in his eyes, and a faraway smile on his lips. ‘Listen–what about a teaching machine…?’”
I began investigating the possibilities of technology-enhanced programmed instruction. The learning process for an inspired student can be a pretty straightforward process: get exposed to something new, learn a little bit about it, and then use what you’ve learned to do something interesting. For some subjects, the process of presenting information and checking for understanding is ideally suited for a computer, and I wasn’t the only one who thought so. Programmed instruction in book form had existed for years, and computer-based math instructional methods were already being launched.
I was a month or so into developing my own programmed instruction when I began to realize that this system, whatever its benefits might be, also had the effect of isolating me from the very best part of my vocation: working with students to help them understand the world around them. Teaching content, exploring with students the process of interpreting content, and—perhaps most importantly—learning to develop strategies for dealing with new and unexpected situations all demand a dynamic, creative process that is the very heart and soul of my work. There was no way for me to write this stuff down, to program it, to “classroom flip” this aspect of my work.
That hasn’t kept me from leveraging technology where appropriate. The vast majority of my current curricular materials are online–lessons, labs, homework help, and practice tests–and students across the U.S. and abroad use these materials as a guide in their own learning. I am part of a global learning and teaching community, using technology that is faster, cheaper, and better than ever. We are actively exploring new ways that we can use that technology to improve education.
But at the heart of it all–sometimes just barely visible behind the iPads and the laptops, the email and the tweets, the websites and the Massive Open Online Courses–are students and teachers, working together, just as we always have.
And there is nothing that will be able to replace that.
The last couple of weeks I’ve been spending some time putting together informational videos—screencasts—to be used as part of my school’s Bring Your Own Device program which begins this Fall for ninth graders.
As teachers we all spend a certain amount of time preparing content for the courses we teach, and this is a little like that… only more so. I’m conservatively estimating that I put in ~2 hours of work per minute of video, based on writing the content (script and PowerPoint), creating and assembling resources (logos, other screen captures, etc.), recording the basic presentation, post-production editing (layering in the additional resources, removing out-takes), and uploading of video to YouTube.
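That estimate makes the budgeting arithmetic for a whole series easy to sketch (the 2-hours-per-minute figure is this post’s own rough number; the five 4-minute videos below are a hypothetical line-up):

```python
# Back-of-the-envelope screencast budgeting, using the rough estimate
# of ~2 hours of production work per finished minute of video.
HOURS_PER_VIDEO_MINUTE = 2

def production_hours(video_minutes):
    """Estimated total work hours for a video of the given length."""
    return video_minutes * HOURS_PER_VIDEO_MINUTE

# Five hypothetical 4-minute orientation videos come out to roughly
# a full 40-hour work week:
total = sum(production_hours(minutes) for minutes in [4, 4, 4, 4, 4])
```

Which is exactly why this kind of production is “a little like” ordinary lesson prep… only more so.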
I don’t envision that this is going to become a permanent part of my job, but I’ve enjoyed trying to become more proficient at the process.
Here’s the current line-up.
I’ve toyed with the idea of making a How to Make a Screencast video, but… how do you record yourself recording something? How do you screencapture yourself doing a screencapture? This is all very meta-….