Supermoon Surf Session

After an awesome September for fall surf and a mostly satisfying October, November brought a sort of doldrums of surf to southern Maine. With fewer sessions available, and decent swell seemingly rarer than a snow leopard, I have been hunting down 2′ waves in 35mph offshore winds just to get on something – or so it has seemed.

Just before dinner on Sunday, December 3, my friend and colleague Matthew called to ask if I was planning a session under the supermoon. Out grilling at the time, I could see the moon through the trees across my street. But I quickly dismissed him: it was already quite dark, and I was about to head in for the family Sunday dinner.

But over the course of dinner, the thought sat in the not-quite-back of my head. I hadn’t caught a really good wave in more than a week, and the moon was, as our president might say, “YUGE!”

At the end of dinner, I pulled up the Wells Beach cam to see what – if anything – I might see. Not much. I pulled up a swell, wind, and tide report, and things looked semi-promising. After checking in with the family, I threw gear in the car and a board on the roof before driving to Wells on a semi-hope for a 3′ wave. Was I ever rewarded.

It was hours after sunset when I arrived, of course, but the sea, sand, and sky were nicely lit by the supermoon. It was tough to gauge the size of the waves from the lot, but it was clear that I’d at least get on clean, 2′ waves while bathing in moonlight and ocean temps in the mid-40s.

Supermoon over Wells Beach, December 3, 2017

But it wasn’t 2′; it was a solid, consistent 3-4′! And clean. It was a bit tough to really judge position and time the waves. While the moon carved a long, runway-like swath of bright light across a narrow band of the ocean, outside of that band it was really tough to see the incoming swell. When I dialed it in, though, the near-dark drops were exhilarating. I caught perhaps half a dozen rights and at least as many lefts. Partly owing to the vision penalty, I often struggled to get the most from the waves, and there was a bit of a rip where I was set up, which pulled me too far outside over and over. But I also got my share of 10-second rides complete with cutbacks.

The hardest part of it all, I think, involved finding and maintaining trim position on the face. The supermoon lit the lip right up, leaving a really dark (practically black) face and trough, which made it really difficult to judge position on the face. Going left, especially, I found myself riding up the face (towards the lit lip) and even off the wave – even with some conscious efforts to avoid that problem. It was interesting – and weird.

I’ve been thinking about it, and I liken it to driving a car into a pronounced curve. If you look to the inside of the curve, you’ll drive right in and round nicely. If you let yourself look to the outside of the curve, you find yourself battling an outward drift that can actually be dangerous. I think my eyes were drawn to the light of the lip, and the board just followed.

December 3, 2017: 1 hour of night surfing under the supermoon at Wells Beach, Maine.

Fall Surf Sessions 2016

Crouching to get under the lip at Gooch’s in mid-October.

Autumn has been good for surf, which is a really good thing because the summer was just awful. Hermine produced some really clean head-high to overhead surf at the local break, and Matthew served up some decent sessions as well. I’ve been out in some of my biggest waves, and I’m mostly able to manage the drop, the turn, and the trim – even in lineups packed with surfers.

In between the big storms, we’ve had plenty of long period, midsize swell that’s just great for logging. In late September, I made a commitment to cross step. While it has really compromised the quality and length of many of my rides, I’ve used the relatively consistent 3- and 4-foot surf as an opportunity to get comfortable moving fore and aft. I still head to the nose too early (or too late), but I’m pretty comfortable cross stepping my way around the board. It’s still not pretty; I have a heavy-footed approach that I need to lighten up quite a bit. But I can sense the progress. I see it, feel it, and experience it in the ways I recognize what I need to do on the wave and quickly execute it.

Lil’ House on the Mousam

After three days on Lil’ House. From left: Roger, Michael and Nate.

Lil’ House comes to life! It’s taken a couple years of stop-and-go tinkering, but our lil’ house on the Mousam is now habitable. It’s certainly still a work-in-progress, but she’s turning into a cool little hangout in the woods.

When we bought our home, it came with this tiny, turn-of-the-century cottage on the back of the lot, in the woods. Mostly, the cottage was slowly rotting away and turning to mulch. Rotting trim and broken windows were letting rain and snow slowly decay the floorboards in the back left corner, and the cool brick chimney (boarded up) leaked, further compounding the problems. But it had charm.

Originally towed from some property in Wells sometime in the mid- or late-80s, the cottage sat precariously balanced on the two pressure-treated 4x4s used as a sled for transport. Under those beams were three old-style railroad ties. Back in 2013, Nate and I spent a day moving and leveling the cottage, then I used plastic sheeting to help slow the onslaught of water damage. But every time I went inside, I was turned away by the sheer magnitude of the interior project, the desire not to spend money on a tiny shack, and confusion about handling the built-in cabinetry. Do I try to salvage and reinstall it? Do I just toss it?

In early summer 2015, at a time when I needed to swing a hammer at something, I took a sledge hammer to the interior walls of Lil’ House and attempted to pull the cabinetry without damaging it. I half succeeded, put all the woodwork under a big tarp, loaded up all the debris for the dump, then let the house sit for another year.

Interior of Lil’ House. Gutted. Skylight openings cut.

Then in August, my son Nate, my father-in-law Roger, and I got a bit busy on the cottage. We stripped the moss-encrusted, leaking asphalt roof and found two cheap skylights and some shingles at the Habitat for Humanity ReStore. We cut two 2′x2′ holes in the roof, installed the skylights, and re-roofed Lil’ House. Nate took a small sledge to the chimney, which we removed.

Old windows are out, rotting wood is being cut away, old roof is on the tarp, skylight holes are in, and chimney hole (on left) is covered.

I had been collecting old, vertical sash windows from the side of the road for a couple years, knowing that I did NOT want to spend much money on this project. I mostly hoarded them in Lil’ House, waiting for the day when I would actually put some windows in. But salvaged windows aren’t uniform, and they typically don’t match.

We made the best of it by greatly expanding the glass area of the house. Including the skylights, there is easily more than 24 square feet of glass in this tiny, 10×9 cottage. We kept just one window – itself a replacement, vertical-sash frame from an old double-hung – and put in four additional, fairly large windows. Two open in, suspended from the rafters. Two open out. And one is fixed.

Skylights are in, shingles are on. All that’s left is the ridge cap.

I salvaged screen material and built fixed screens for the windows that open. Where Lil’ House was once a dark, dank shack with low ceilings, boarded up, missing, and broken windows, she is now a small room with tons of light, a breeze, and a view of the woods.

It made little sense to try to reinstall the built-in cabinet: it took up about 25% of the square footage and a good portion of the wood was damaged in the removal. Instead, I committed to re-using it where I could. It became exterior trim board, replacement sill plating and structural beams, and more.

And I decided to use the biggest “counter” piece to make a built-in, drop-down desk/table. While the desktop is way overbuilt, coming in around 50 pounds, it gives me a good feeling to keep it. It folds down and out of the way, helping to maximize the 90 square feet of floor space, but then it pops right back up to become a five-foot-long desktop.

Looking in on Lil’ House from the front door, floor view. Notice all the light.

Next came some wiring. There’s nothing fancy here because I’m delivering power via extension cord. I picked up a couple interior wall sconces and wired them together and to a plug. I even found a free front porch light that I mounted right outside.

I had picked up a stereo receiver/tuner and a CD changer from the transfer station Treasure Chest a couple months ago. And last year I picked up a couple of Realistic bookshelf speakers at Treasure Chest. I built a small shelf in the corner, above the desk, where I put the stereo system. I mounted the speakers up in the peak. This little cottage is coming along!

View from the back side, with outward swinging windows.

Much of the site is still a construction zone, with tools in boxes in the cottage and piles of debris scattered about. And she still looks just awful from the exterior because I need to get busy on some sanding and painting. She’s all trimmed up on the outside now, though I still need to replace a couple rotting trim boards around the doorway. But this Lil’ House is now a legitimate cottage. I could bring a cot out and easily use it as a three-season shelter.

Humanism’s Surf Wax

In a moment of wild enthusiasm, I decided to develop a surf wax brand: Humanism’s Surf Wax. It’s a great wax for New England, and it has a history dating back to the gods and titans.

Humanism's Surf Wax image.

The Legend of Humanism’s Surf Wax:

Invented by Hephaestus, god of fire and woodstoves, this wax helped Poseidon surf the ancient seas. Stolen from Olympus by Prometheus, the formula contributed to human flourishing for millennia. Its loss ushered in the Dark Ages. But fortune favors the curious, and Renaissance scholars found the lost formula while translating Homer’s third epic, Kymatistá. This wax – now forged in New Hampshire for those strong in spirit, free in thought, and sound in body – may have failed Icarus. But it is perfect for New England. Enjoy!

The back story involves a trip to a local surf shop and the discovery of custom surf wax branding options out of New Hampshire.

I reached out to Jim at Jimbo’s Surf Wax to get a case of “private label” Cold Water wax and got busy with a design. Along the way, I got a bit overzealous and ended up designing three distinct private labels (Lil’ Crippsy Surf Wax and Ma Em’s Noogis Surf Wax are the others) and getting Jimbo to put his special blend in my wrappers. Fun stuff.

To put it mildly, I won’t be wanting for wax anytime soon!

The inspiration for my “legend” is the legend that appears on every carton of Newman’s Roadside Virgin Lemonade, a great lemonade that pairs well with brewed tea in the Maine summer.

AAEEBL ePortfolio Conference – Boston 2013

Two undergraduate English majors (Lauren Levesque and James Muller), Cathrine Frank, and I joined Michael Smith at York College/CUNY on a panel at the 2013 AAEEBL ePortfolio conference in Boston. Our panel, “Who Owns the ePortfolio,” explored some of the tensions in ePortfolios when an institution is interested in assessment but wants students to embrace the value of ePortfolio for their development and digital identity.

Our English majors offered brief tours of the ways they are putting their ePortfolios to use. And with Michael’s support, we streamed the presentation to the Web at CUNY.is/LIVE, a phenomenal free streaming service available at the CUNY Academic Commons. And we recorded the broadcast to document our students’ presentations.

Bass and Eynon

Following our panel, I attended the keynote jointly delivered by Randy Bass (Georgetown University) and Bret Eynon (LaGuardia Community College/CUNY). Their central questions as they look to the future of higher education: How do we create an integrated learning experience for students across an increasingly disintegrated set of structures and contexts? How do we assess learning holistically? How do we demonstrate educational distinctiveness?

Bass and Eynon are interested in the contributions ePortfolio might make to the future. After introducing their FIPSE-supported project entitled “Catalyst for Learning,” they articulated a set of practices that seem to yield effective ePortfolio initiatives.

  • They function at the campus level, with departments, and with institutional stakeholders; successful projects work with all of these groups.
  • Pedagogy, professional development, assessment, technology, and scale must come together if ePortfolio is to make a difference.
  • Inquiry learning, reflection, and integration work iteratively in successful initiatives.

Their research (based on 24 campuses) shows that ePortfolio initiatives:

  • Advance learning success
  • Make learning visible
  • Catalyze institutional change

If the data support these conclusions – and they presented some of this data – they’re working in a space that meets a real need for the ePortfolio community.  Moving beyond testimony, individual spotlights, and even department-level assessment, the “Catalyst for Learning” project seems to speak to some of today’s hot-button institutional outcomes. Some of their data point to associations between ePortfolio implementation and retention, GPA, and even graduation rates.

But what might ePortfolio have to do with the challenges of higher education? What does this future look like?

  • MOOCs today look like a return to an instructor mode, an earlier teaching mode, but this will likely change
  • Endless pursuit of productivity, scale, efficiency, with quality often dropping out of the conversation
  • High failure rates in online learning environments may change, but this problem also points to opportunities

The discourse that emerges from these higher ed discussions is focused on data, scale, and personalizing learning through knowledge of individual learners’ behaviors.

As Bass and Eynon see it, three core principles seem to guide those involved in much of the higher education discussion.

  1. Technology is the only way to break the access, cost, quality conundrum
  2. Learning processes can be understood via data analysis
  3. Improving human learning depends on improvement in machine learning

The landscape in this future:

  • Taking Instructivism to Scale
  • Learning Paradigm on Analytics
  • Productivity Agenda

As Bass describes, MOOCs focus on the fact that a large part of education is generic/interchangeable. On the other side of a continuum is the local and identity-specific component of education. In between the generic and the local/experiential, Bass argues, we find the high-impact integrative curriculum. The challenge is that higher education seems not to recognize these three zones, making it difficult for institutions to make this change.

For Bass and Eynon, ePortfolio may be an “agent of an integrated learning culture through evidence of impact.” If this is to happen, Bass thinks we need Integrative Learning Analytics. We need “integrative learning” analytics – ways of evaluating integrative learning. But we also need integrative “learning analytics” – a bringing together of a range of learning analytics.

Their talk was a fascinating argument for the potential centrality of ePortfolio to an institution’s effort to meet the challenges of higher education: a bridge between instruction and learning, between productivity and quality, between granular learning/metrics and integrative learning/outcomes.

This is a heavy burden for ePortfolios. While I am an advocate for ePortfolios, I’m not yet convinced they can meet this challenge. Certainly, the larger forces of education commodification, standardized assessment, and the cost containment pressures on colleges and universities are not particularly conducive to some of the more exciting elements of ePortfolios.

Big “Fail” for the Hynes Convention Center Exhibit Hall

The conference venue gets a big “fail” on its family-unfriendly accommodations in the exhibit hall.  A conference attendee, presenter, and friend of mine with a toddler in a stroller was denied access to the lunch because it was held in the Exhibit Hall.  The hall, it turns out, would not permit individuals under 18. Insurance liability, according to security.

And they placed lunch at the back of this “childfree zone” in the Hynes Center, so my friend couldn’t actually get to the food or network with other attendees in a structured luncheon program that involved sitting at the tables focused on specific aspects of ePortfolio and technology.

Apparently, shameless promotion of products by vendors hawking tech wares is allowed, encouraged, and monetized by the Center. Every accommodation was made to ensure vendor access to adequate power, bandwidth, and presentation space. But a conference presenter with a paid registration (and a child in a stroller) cannot get in.  And a look at the swag given away by the vendors would suggest that the Exhibit Hall is actually a confectionery and toy shop: lots of hard candy, and even Peeps; buttons with fun pictures on them; wind-up dancing robots; funny little squeezable figurines with big hair; and more.

WTF.  It’s 2013. It’s the United States. And it’s Boston.  We’re not in the 1950s or in some backwater where people think women should be bound to the home. And it’s not the nineteenth century in which children should be seen and not heard. Is this an emerging trend? In the twenty-first century children should be neither seen nor heard?

One has to imagine the money saved by not buying the liability for children for three days ($50? $250?).  Juxtapose that economic reality with the cost of the electrical drops to support something like 100 technology vendors, including the placement of a leather-appointed, limo-like surveillance van right in the hall.

The overall conference venue was very welcoming, but the Exhibit Hall earns a big fail.

Omeka or WordPress. Omeka or…

I’m teaching an advanced humanities seminar called “Doing Humanities Digitally” this coming term. New course, new materials, and an open field in which to run. I’ve been working on some connections to pull in some TEI coding, some archival interactions, and even some collaborations with the natural sciences (think HASTAC). I have mostly settled on some work with the Digital Thoreau project, am toying with a foray into the Transcribe Bentham project, critical reviews of extant DH projects, a student-developed remediation of a prior humanities project (course paper), AND a digital exhibit project.

Omeka logo

I’ve been familiar with Omeka (developed by the CHNM at George Mason University) for a couple years now, and I’ve browsed through some of what the tool offers. I have always wondered whether it is really better/more powerful/more suitable/etc. than something like WordPress, a CMS/blog tool I use regularly and that is pretty easy to manipulate for a range of purposes.  This DH course, particularly the idea for a project that brings environmental science research together with some archival work, forced my hand and left me no choice but to get serious about evaluating Omeka for our purposes.

  1. First idea: Grab a free Omeka.net account and try out the tool. This couldn’t work because the free account doesn’t include the Geolocation plugin that enables users to locate items on a Google Map.  (There are some workarounds, I now know, but that’s a different story.)
  2. Second idea: Grab the open source Omeka CMS and install a test version within my domain, giving me complete control over both the front and back ends. Not wanting to spend money on a sandbox arrangement, I went with this second idea.

A Preliminary Review/Evaluation

Installation was not as easy as I had hoped. This isn’t Omeka’s fault, really. It had more to do with some PHP settings that I didn’t know how to tweak. But I would estimate it took about 12 hours over several days to actually get the CMS to behave in a way that enabled me to test the tool. WordPress, in contrast, is included in many webhost packages, making it a truly one-click installation. (The WordPress Network install is another matter.)

FWIW, here are the settings I needed to tweak on a Lunarpages-hosted installation:

  • php.ini – needed to comment out some “disable_functions” code in the file located in my public_html folder to get ImageMagick and Omeka communicating.
  • Path to ImageMagick isn’t so easy to locate on Lunarpages, but it’s here: /usr/local/bin/
  • I’m still sorting through a fileinfo module “warning” that Omeka spits out at the first install screen, but the CMS itself seems to work even with that problem/warning.
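
For the record, the php.ini change can be sketched roughly like this. This is a hypothetical fragment – the exact list of disabled functions will vary by host – but the idea is that Omeka shells out to ImageMagick through PHP’s exec-family functions, so those functions can’t appear in disable_functions:

```ini
; Hypothetical php.ini fragment (shared host, file in public_html).
; Before: exec and friends disabled, so Omeka cannot run ImageMagick.
; disable_functions = exec,passthru,shell_exec,proc_open

; After: the directive commented out (or at least "exec" removed):
disable_functions =
```

With that change in place, Omeka’s admin settings just need the ImageMagick directory path (/usr/local/bin on this host).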
The theme options are still pretty limited (Omeka is in 2.0.x right now), but that’s a CSS issue that shouldn’t stop one from using the tool. Any self-respecting library or museum using Omeka for real exhibit purposes would almost certainly get some custom styling on the site. I haven’t looked at the CSS yet, but I imagine it’s pretty easy to adjust. It’s probably comparable to WordPress in that regard.

Dublin Core baked into Omeka makes it scream “legit” DH tool. This feature alone puts it ahead of WordPress when considering a course whose goals include some engagement with DH standards for describing archival materials. In fact, the entire process of adding materials to Omeka foregrounds good description practice. It isn’t sexy, but the GUI effectively tells users to get their Dublin Core metadata in first. This isn’t required, but the order of presentation signals its importance.

I’m still very much toying with Omeka, but I think the ability to present multiple exhibits lends itself to a range of points of emphasis. WordPress could easily accomplish this using subpages in a nav structure, of course. But the exhibits model in Omeka also offers users ready-made templates for presenting pages within an exhibit. One could build a photo gallery, a mix of images and text, a text-only page.

Geolocation is available in Omeka. This is less exciting than it seems at first, really, since Omeka doesn’t import the geolocation metadata from images on upload, and it can’t handle KML/KMZ imports of GIS data to display in a map. But it is still pretty nice. I even managed to embed a custom Google Map into an exhibit page, effectively enabling one to pull GIS data loaded into a Google Map into an Omeka exhibit. Of course, it’s all running on embedded Google Maps, and WordPress handles that just fine.

I’m going to run with Omeka for the project, not because it’s better than WordPress (it isn’t – yet). I’m going to use Omeka so my students have the EXPERIENCE of using a hot new DH tool that is legitimate and still under active growth and development. I’m also going to use it so that I can help my colleagues consider including digital archival and exhibit projects in their own humanities courses. WordPress would be much, much easier for me. But why do the easy thing?

Acknowledgements

I have to thank my colleague Pam Morgan for a series of conversations that yielded the idea that would draw together some of the data she (along with her colleagues and students) has collected on the Saco River estuary, research on the uses of the Saco River, and historical photographs and documents at the McArthur Library in Biddeford, Maine. I must also thank Renee DesRoberts, the archivist at the library, for her willingness to really open up their collection of glass plate negatives for our project. I’m really looking forward to this project!

I also want to thank the tech support at Lunarpages and patrickmj from the Omeka Dev Team for offering me some help on the features and limits of the Geolocation plugin.

When Tech Woes Rain Down

The last week has been a tech nightmare.

My university switched its email/calendar provider in the middle of the semester and in the middle of the week. On Tuesday night, IT moved students, administrators, and faculty from Google Apps for Education to Microsoft Office 365. But the move to MS has been limited to email/calendar – for now. The rest of the band-aid comes off at some unspecified future date. After spending nearly a year communicating to the community that we’re moving to Google Apps for Education (from a woefully outdated and undersupported local solution that many still used), the leadership changed course almost instantly.

Impact on me: At least 10 hours of lost productivity as I worked to cobble together a unified solution that brings my mail and calendaring into my mail and calendar clients in a way that enables me to share my availability with my wife. Did I mention this happened in the middle of the week and in the middle of the term?

More important than the email/calendar insanity is the impact of the switch on my department’s student learning and assessment plans. After embracing Google Apps (because we’re a Google Apps university) and planning to launch a major ePortfolio initiative in Google Sites, the whiplash-quick pivot from Google left me holding a bag of, well, nothing. With a colleague requiring ePortfolio in our pilot course, I could NOT in good conscience move forward with a platform I knew would be deprecated once the MS migration was complete. (Goodbye Google Sites template for student ePortfolios!) Result: Two crazy days trying to decide on another solution.

Enter WordPress! After pricing a non-university hosting solution, I was able to work out an arrangement with the university to host student ePortfolios in a WordPress Network installation on site.  This is, frankly, the best solution for us because it keeps the project in a .edu domain, it is a platform with which I’m familiar, and it has real portability for students after they graduate. And WordPress is much more than a blog tool these days. We’re implementing a full-on CMS.  The whole thing would be ideal, but our rollout timeline is, unfortunately, quite compressed since we have ePortfolio running in a class right now!

As if this weren’t enough for a week, today I received a “vulnerable script” warning from my own web host. After some digging, it turns out that my archived WordPress-based course websites now need to be upgraded to close some security loopholes in 2.x versions of WordPress.  This looming upgrade headache and CSS update has me pondering the value of maintaining live, visible course archives in the first place. Nice.

Hopefully, this set of three significant headaches will mean the pox has moved on to someone else.

Google Docs: (Still) Not Ready for Primetime

Last year I embraced Google Docs for student peer review and document submission in composition courses. I’ve been pleased with the way the sharing of documents helps me better track and support the peer review process in my writing classes. Peer review reading/marking becomes homework, class time is spent on discussing revision options, and everyone involved can see the comments as they emerge. And the addition of offline commenting and access through Google Drive really enables me to do my own commenting even when I lack internet access. All good!

But I remain flummoxed by the print limitations embedded within the app. I cannot understand why Google has not implemented a “print with comments” feature for its tool. I worked around that problem in Spring 2012, but I really expected Google to add the feature by now. It’s amazing that Google Docs drives users through MS Word to make full use of a central feature of its tool.

How to print a Google Doc with comments?

It’s easy, if you have MS Word: Simply download the document as a .doc file, open in Word, and print. That’s great for a one-off print job.  Now, multiply that by 50-75 papers, the number of papers one might print if one teaches 2-3 sections of writing.

Here’s the workflow:

  1. Receive shared document from each student in the class.
  2. Move each document into a Google Drive folder to organize the class papers.
  3. Open each paper in Google Apps, make comments.
  4. Download paper to hard drive as .doc file.
  5. Open in MS Word.
  6. Print.
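
Steps 4-6 are the tedious part at 50-75 papers, and they can be scripted. As a minimal sketch (not my actual workflow): Google exposes an export URL pattern for each document, so a short Python script could pull every paper down as a .doc file in one pass. The URL pattern is real, but the document IDs and the OAuth token below are placeholders, and private documents will need valid credentials:

```python
import urllib.request

def export_url(doc_id, fmt="doc"):
    """Build the export URL for a Google Doc (download as .doc, .docx, etc.)."""
    return "https://docs.google.com/document/d/%s/export?format=%s" % (doc_id, fmt)

def download_papers(doc_ids, token, dest_dir="."):
    """Fetch each shared paper as a Word file; assumes a valid OAuth token."""
    for doc_id in doc_ids:
        req = urllib.request.Request(
            export_url(doc_id),
            headers={"Authorization": "Bearer " + token},  # placeholder auth
        )
        # Stream each exported document to disk, named by its ID.
        with urllib.request.urlopen(req) as resp:
            with open("%s/%s.doc" % (dest_dir, doc_id), "wb") as out:
                out.write(resp.read())
```

From there it’s one trip through Word’s print dialog per file rather than a download-open-print cycle per paper.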

Insane. I recently printed marked papers for two sections of composition and spent about an hour just getting the shared docs (with comments) to print. And if one uses Google Drive’s offline access, this workflow effectively means doubling the copies of the student paper on one’s hard drive.

Why Not Just Use MS Word?

Great question! I have several key reasons for avoiding MS Word:

  • Not every student has MS Word. (Yes, there are workarounds – Open Office, for example.)
  • I don’t like the idea of moving progressively more “marked up” papers around a peer group via email attachments. The cloud is key because of its simultaneous editing features and because all comments appear on one document.
  • The ease of “accepting changes” in Word bothers me.
  • I really like the versioning history in Google Docs.

The real question here is whether the cloud-based Office 365 tool would radically simplify my workflow here.  I don’t know, really, but reviews of that tool suggest that autosave may be problematic in Office 365.

Why Print Docs in the Cloud?

This is the $64,000 question.  Can’t we simply avoid the whole doc print thing here? After all, papers are submitted electronically, students comment electronically, and I comment electronically. Writers have ready access to the original, marked document. Why bother to print?

Read just about any study of screen vs. print reading and you’ll quickly find that we read more carefully when a document is printed. We skim web texts.  Make no mistake about it: A Google Doc may be akin to a printed paper in terms of content, but keep it on the screen and most of us fail to read as carefully.  In a writing class, where evaluation of claims, evidence, organization, and proofreading are central elements of the revision process, there’s real value in retaining the “print” output.

I have already ceded so much to the cloud by putting the drafting, peer review, and revision processes online. For some strange reason, I want to retain the easy ability to hand my students printed, marked comments on their final drafts.  We spend class time reading the comments, considering areas of strength and weakness, and focusing on the writing without the distractions of social media, the email inbox, etc.

Plea to Google: Activate the “print with comments” feature in Google Docs.  If I’m not mistaken, you used to have that feature in an earlier version of the tool. It’s essential in the Google Apps for Education suite!

Cutting the Cable

After living for more than 20 years without cable, I was “forced” to sign on with Time Warner when I relocated to Maine. I’m exaggerating about the lack of cable: I spent years with Dish Network, and later adopted AT&T’s uVerse service when it hit my neighborhood and bundling meant a savings of $50/month.

It’s funny, but I loved the Dish UI, and the uVerse UI was also reasonably decent. (Where Dish simply hid unsubscribed channels, uVerse gave me visual cues that showed channels I didn’t get.) TWC’s UI is just awful! They tease you by showing you the entire package of programs, and many of them are not really available without significant subscription charges. And, oddly, the entire system has a time lag in the UI – try scrolling through a channels menu and the system can’t keep up. This from a company that advertises super high speeds. (My guess is that Dish and AT&T download channel menus to an HDD, but TWC prefers to serve it all up over and over.)

Anyway, I’ve just completed my one-year “teaser” rate on the bundled TV, phone, and Internet service. (It’s really about the only game in town, or in the woods where I live.) When I called to secure a continued discount, I learned that I’d be getting the “step” rate: I still get a discount over full retail pricing, but it’s less than the teaser. This is like getting the second bag of dope at half price. Why the discount? To lock you in even longer, so you really feel the pain of loss (withdrawal?) should you cancel.

So we’re cutting the cable! Think cold turkey. OK, not quite. I’m implementing a hodgepodge setup for media. We’ll keep Netflix streaming and a DVD plan, something we’ve had for nearly a decade. I’ve spent a whopping $79 on a refurbished high-end Roku box, with the bonus of an HD Netflix stream that far exceeds the quality we’ve been getting through our Wii. And now I’m working up a set of channels on the Roku that will get us some of what we’ll lose by cutting cable. Plex is going to be an awesome way to stream our own media to the TV without putting a computer in the room. And I’m likely to subscribe to Hulu Plus to get some network programming. For the networks, I’m also toying with the OTA HD reception we can pull in from the Portland stations. That’s a project, but I can already see it working reasonably well.

We’ll end up saving about $40/month, so it will take a couple of months to recoup the Roku investment. And a decent HD antenna will run me close to $100. Of course, that’s all equipment I get to keep, unlike the cable box I rent for about $100/year.
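For the curious, the payback math sketches out quickly. The $40/month and $79 Roku figures are from my own bills; the $100 antenna is my rough estimate:

```python
# Back-of-the-envelope payback math for the cable-cutting gear.
monthly_savings = 40   # dollars/month saved by dropping cable TV
roku = 79              # refurbished Roku box (one-time cost)
antenna = 100          # rough estimate for a decent HD antenna (one-time cost)

roku_payback = roku / monthly_savings               # just the Roku: about 2 months
total_payback = (roku + antenna) / monthly_savings  # Roku + antenna: about 4.5 months

print(roku_payback, total_payback)  # prints: 1.975 4.475
```

So even with the antenna, the gear pays for itself before the next “teaser” rate would have expired.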

We’ll lose Disney, and the kids do watch 2-3 of their shows pretty regularly. And Cartoon Network is a favorite for the Clone Wars animated series. I’m working on solutions for those challenges, hoping that Plex, iTunes, and possibly Hulu Plus can assist. And I don’t have a DVR solution in the mix – yet.

We’re also talking about dropping the landline for another $30/month in savings. I still feel too old-fashioned to be without a home phone, but we’re already getting a VOIP phone setup through TWC. We don’t get the old phone system that works in power outages anyway, making the security of a landline a kind of mental fiction. There’s no reason I can’t implement a third-party VOIP solution at a fraction of the TWC price, and I’ve looked into it. My real hesitation: TWC has my number and won’t allow another VOIP provider to port it out. We’d need to get a new phone number! Local calling only is my fallback option here: save money, but keep the phone number. Our cell plan could easily handle our national calling, particularly if mostly restricted to cell-to-cell and evening calling.

Google Apps for Education – Migrating Email & Calendar

We’re a GroupWise institution that is also a Google Apps for Education school. Over the last two years, I have been forwarding GroupWise email and calendar data to a personal/professional Google account not associated with my school. On top of that, I actually like to use my mail and calendar clients to handle most of my reading and scheduling. Perhaps I’m old-fashioned, but I really like the offline access.

This has been quite clunky, mostly because it has meant managing multiple data streams, and not all of them clearly line up. And sometimes my outgoing mail gets sent “from” the wrong account, and I get what amounts to peanut butter in my chocolate. But I recently learned that we’re going to move off GroupWise and simply embrace Gmail and Google Calendar. Time to act!

Email

Adding the requisite Gmail account information to Mail has been fairly simple: I just added a new account. I’ll need to do a whole lot of pruning as I complete my migration, since I have effectively doubled my inbound email traffic in the meantime. (Not fun, but straightforward.)
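For anyone doing the same setup in Mail, Google’s standard Gmail IMAP settings look roughly like this – note that an Apps for Education domain may differ, and the address below is just a placeholder:

```
Incoming (IMAP): imap.gmail.com, port 993, SSL
Outgoing (SMTP): smtp.gmail.com, port 587, TLS
Username:        your full address (e.g., your.name@school.edu)
```

You may also need to enable IMAP access in the Gmail web settings before the client can connect.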

Calendars

I already use iCal to manage multiple calendars, and the family uses sharing to help keep the various work, child activities, and social events in what amounts to a single place. It seemed like it would be easy to set up the new Google calendar. Not so fast.

It turns out that you have to use the correct server information to get the Apps for Education calendar to play nice with iCal. IT couldn’t help me because iCal “isn’t supported at this time.” Google to the rescue. If you’re struggling to get your Google Calendar to talk to iCal, check out the instructions: http://edutraining.googleapps.com/Training-Home/module-3-calendar/chapter-9/3-2.

I’ve done this setup with a couple of other Google calendars, so I’m not sure why I couldn’t remember the “special” instructions.
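For the record, here’s the general shape of the CalDAV settings that have worked for me with other Google calendars. Treat these as an assumption rather than gospel – Google has changed its CalDAV details over the years, and the address below is a hypothetical placeholder:

```
Account type: CalDAV
Server:       https://www.google.com/calendar/dav/your.name@school.edu/user
Username:     your.name@school.edu  (your full Apps for Education address)
Password:     your Google account password
```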

Now it’s on to tweaking the forwarding options within GroupWise.