Joanne Dehoney, Director of E-Learning, and Victoria Gertis, Special Projects Manager at the Ohio State University, presented.
We began with everyone in the room listing three technologies (I picked geotagging, clickers, and wikis). As people around the room called out technologies, we all crossed these items off our lists.
- Web conferencing
- Personal learning environments
- Collaborative spaces
- Cellphones and LMS
- Mashups, IM
- Social Networking
- Web based exams
- Google Jockeying
- and so on
“Emerging is as emerging does.” Lots of people are dealing with the specifics of any of these technologies. The presenters aren’t worrying about whether something is emerging; they are simply looking at the specific projects that come to them over time.
- Tweak and iterate
- Build consultation model (correlates with reject)
Adopting means acquiring funding through a service improvement request (at Richmond these are called program improvement requests) to support the technology on an ongoing basis.
The evaluating technologies model (I need a chart!)
Project overview includes:
- Success Criteria
- Assumptions, Risks, Obstacles
Ohio State used this overview document to evaluate the request from three faculty in different departments who wanted to see how they could use Second Life in teaching. English Composition, Design, and Women’s Studies were the three departments.
Reject the project if these are not in place:
- Supports undergraduate or graduate instruction
- Conceivable technical / personnel / physical infrastructure
- Exploration feasible within the available budget, and a definite limit to the commitment (i.e. not an unfunded service)
Faculty are required to provide written input for the final report.
Next you want to explore: How does the technology work? What are other schools doing with it?
You then begin to plan, creating a project charter, which contains some elements from your overview document. You need to add scope – what won’t you do? Are you fitting the mission of your unit?
(I have to say at this point that the presenters were going through their material so quickly that if you, like me, didn’t get a copy of the handouts beforehand, there’s very little opportunity to keep up.)
They give faculty a document that lists benchmarks they would like faculty to consider as they evaluate the technology’s effect on teaching and learning. Impact assessment, support estimates, expenses, policy implications, and technical assessment are all covered, though only the first three are for faculty to consider.
Have faculty write a pilot report, including the research question they were pursuing, the project description, and the project methodology. New questions may also emerge in the report.
Once the faculty have filed their pilot report, it’s up to the instructional technology team to review the information and decide what recommendations to make to the rest of IT.
At OSU, the pieces are working for them, but lots of intervening factors come into play during the length of the project. Their Second Life project is their first full walk-through of the approach. They like the fact that they’re trying to hold onto the outcomes without worrying about the formats that people are choosing to submit.
Most projects are going to be in one of four categories (reject, tweak, consult, adopt).
Someone from UT Austin said that they get institutional buy-in before getting to the pilot stage (they have an island in Second Life).
Perhaps policy implications deserve a higher priority than OSU gave them – these should be in the exploration phase. Third parties especially trigger FERPA review and similar policies.
OSU has just this morning rejected Pachyderm – the software failed a faculty member three times.