Archiving Work
At the end of a semester, or the end of a school year, it's typically time to move on to other things: new classes, a new school year, or even a new school.
Before you go, consider making an archive of your work.
It's easy to make digital copies of digital work, and in Computer Science more than in most other subjects, you may find at some point in the future that you wish you had a copy of work you've done in the past.
Here are some quick ways to make archive copies of your stuff before you go.
- Creating a copy of your work from the course server onto your own computer
- You probably already have a folder on your computer with all of your work for our class in it. If not, make one of those first. I'll use compsci-2022 as an example:
$ mkdir compsci-2022
- cd into that folder...
$ cd compsci-2022
- ... and make folders to organize the information in there, in preparation for another collection of work that's coming in.
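The folder names are entirely up to you; as one sketch, run something like this inside compsci-2022 (labs, projects, and notes are just placeholder names):

```shell
# Inside compsci-2022: a few subfolders to sort the incoming work into.
# These names (labs, projects, notes) are just examples -- use your own.
mkdir -p labs projects notes
```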
- While still in that compsci-2022 folder on your computer, use the Terminal to download your home directory from the course server.
$ scp -rP 1030 studentID@crashwhite.polytechnic.org:~ ~/Desktop/compsci-2022
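Once scp finishes, it's worth a quick sanity check that the files actually arrived. A sketch, assuming the compsci-2022 folder sits on your Desktop as in the command above:

```shell
# (the mkdir -p line just makes this sketch self-contained;
#  skip it if the scp above already created the folder)
mkdir -p ~/Desktop/compsci-2022
# List the top level of the copy and see how much space it takes up
ls ~/Desktop/compsci-2022
du -sh ~/Desktop/compsci-2022
```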
- Grabbing a copy of the course website for future study
You'd be amazed how many requests I get from former students for copies of the information that we've covered in here. Here's how you can make your own copy of the website.
- Get wget installed on your machine. This command-line tool is what we're going to use to copy crashwhite.com/apcompsci from the internet onto your own machine so that you'll always have a copy of it!
wget doesn't come installed by default on most operating systems. This article has instructions on how to get wget installed for your machine.
- In the Terminal, run a wget command to download the website's contents. This is a good command to use:
$ wget -r -k -p -np --html-extension www.crashwhite.com/apcompsci/
In this command, -r tells wget to recurse down through the entire website, -k converts links in the documents so that the files are suitable for local viewing, -p gets the page requisites (images, stylesheets, etc.) for each page, -np tells wget not to follow links up into parent directories, and --html-extension appends a .html extension to downloaded HTML files (in newer versions of wget this option has been renamed --adjust-extension).
When I ran this command on the apcompsci website it took less than a minute to copy the whole website into a folder called www.crashwhite.com on my Desktop, and the site took up a little over 100 MB of space. Put that folder into your compsci-2022 directory and you've got pretty much a complete record of the work you did in here.
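That last move can be sketched like this, assuming both folders ended up on your Desktop (the mkdir -p line just makes the sketch self-contained; skip it if the folders already exist from the earlier steps):

```shell
# Make sure both folders exist for this sketch
mkdir -p ~/Desktop/www.crashwhite.com ~/Desktop/compsci-2022
# Tuck the downloaded site into your archive folder
mv ~/Desktop/www.crashwhite.com ~/Desktop/compsci-2022/
```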
The only thing left to do is make sure you have a backup of everything! :)
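One simple way to handle that backup is to compress the whole archive folder into a single file you can stash on a flash drive or in cloud storage. A sketch, assuming the folder lives on your Desktop (the mkdir -p line just makes the sketch run on its own):

```shell
mkdir -p ~/Desktop/compsci-2022   # no-op if the folder already exists
cd ~/Desktop
# Bundle and compress the whole folder into one archive file
tar -czf compsci-2022-backup.tar.gz compsci-2022
# Check that the backup file was created, and how big it is
ls -lh compsci-2022-backup.tar.gz
```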