We put the University of Sydney’s new Superlabs under the microscope.
Superlabs are working, in spite of their inherent limitations.
After all, no one is suggesting the best ‘mode’ of pedagogy is to address a vast, acoustically hostile space filled with 100-plus bemused and labcoat-bedecked new students and expect the experience to somehow be 10 shades of awesome. It’s not. Superlabs make it tough to engage students, and tough for academics to feel connected.
But the alternative – a bunch of smaller, disconnected labs across multiple campuses with considerably worse technology (and used far less frequently) – is an even less appealing prospect.
So university tech managers are turning something potentially terrible (superlabs) into a positive learning experience for staff and students. Hats off to our university tech managers!
The University of Sydney was an early Superlab adopter: its Charles Perkins Centre X-Lab picked up an AVIA award in 2014. That project set the technology template, which the University has been steadily improving without reinventing a wheel built on the expertise and experience of SMEs over many years, in particular Jason Wheatley and Paul Menon.
Nathan Ashmore is the University’s AV Design Manager and design lead for the new LEES (Life, Earth & Environmental Science) spaces, as he is for all of the University’s greenfield builds (with solid support from his colleagues in the ICT Client Design & Standards team). The LEES Building brings together a number of disciplines from the School of Life & Environmental Sciences, and its Superlabs allow multiple classes of dynamically defined sizes to run simultaneously, providing a significant ROI on the tech investment.
“Lab staff, space and equipment are all expensive, so consolidating these requirements into an efficient use of space is key. Overlay a dynamic and flexible audiovisual system into the built environment and you’ve got a suite of pedagogical tools for academics to communicate practical lab concepts to the scientists of tomorrow, simultaneously,” observes Ashmore.
The LEES Building packs three biological safety-certified super wet labs: two 144-seat PC1 spaces with three teaching stations each, and one 96-seat PC2 space with two teaching stations. The reduction in teaching stations this time around was largely due to three factors: typically larger class cohorts in the PC1 spaces; the very real concern of cross contamination in the PC2 space; and observations of the practical number of classes run in the original X-Lab. Across all spaces, class sizes can be defined from eight people all the way up to the class capacity.
PROGRAMMED TO EXCEL
As system integrator, Programmed Electrical Technologies (PET) supplied and installed the AV and network infrastructure in the LEES Building 1. Project Manager Mark Connerton describes what makes the project special from a PET perspective.
“We’ve been very impressed with the Panphonics SoundShower directional speakers. Audio reinforcement through these speakers allows students to listen to an instructor with minimal spill into an adjoining class.
“The fact we can send multiple SVSI streams to students concurrently, all viewed on a single PC monitor or larger display… well, 10 years ago this would have been unheard of. Now, it’s a game changer.
“The simple-to-operate user interface allows the directional audio and the video streaming to elegantly come together, making dynamically sized classes within the same facility a reality.
“We at PET believe this project exemplifies how audiovisual technology can be used to support a shift in tertiary laboratory teaching and leverage technology as a basis for innovative teaching practices.”
Programmed Electrical Technologies: programmed.com.au
VIDEO: MANY TO MANY
Ashmore used the X-Lab as the basis of the LEES Superlab design, which at its core is about allowing multiple video sources to be shown at once from multiple presenters.
He explains the video setup: “Each teaching station has an Extron DXP88 8×8 matrix that feeds 3x AMX SVSI N3132 encoders. They stream out to the computers at the student desks at a 1:1 ratio – 144 in the PC1 labs and 96 in the PC2 lab.”
The video sources include: a Wolfvision EYE14 visualiser/demo camera on a Yellowtec arm, and a resident PC, as well as an HDMI laptop input at each teaching station. There is a Panasonic AW-HE2 ‘face’ camera and a Mersive Solstice Pod2 for wireless and wired network sharing. A microscope input is also routed into the matrix, with an auxiliary feed directed into the resident PC via a Magewell USB Capture 2, allowing the use of a mouse for highlighting particular aspects of a specimen, as well as control of the microscope’s camera settings. A further development on the streaming-only approach was the deployment of five 70-inch displays, dynamically selectable throughout the lab using SVSI N1000 codecs.
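The many-to-many routing described above can be sketched in a few lines. This is an illustrative model only (the station, source and stream names are invented, and the real system is configured through the AMX/Extron control hardware, not Python): each teaching station’s matrix patches a source to one of its three encoders, and each student PC subscribes to a chosen stream.

```python
# Hypothetical sketch of the Superlab video routing model, not actual control code.
from dataclasses import dataclass, field


@dataclass
class TeachingStation:
    name: str
    sources: list                                         # e.g. visualiser, resident PC
    encoder_streams: list = field(default_factory=list)   # illustrative stream IDs

    def route(self, source: str, encoder_index: int) -> str:
        """Patch a matrix input through to one of the station's three encoders."""
        if source not in self.sources:
            raise ValueError(f"unknown source: {source}")
        stream_id = f"{self.name}-enc{encoder_index}"
        if stream_id not in self.encoder_streams:
            self.encoder_streams.append(stream_id)
        return stream_id


@dataclass
class StudentPC:
    seat: int
    subscribed: str = ""

    def subscribe(self, stream_id: str) -> None:
        # A student desk PC tunes into one of the concurrent streams
        self.subscribed = stream_id


# One PC1 lab: three teaching stations, 144 seats at a 1:1 PC ratio
stations = [
    TeachingStation(f"station{i}", ["visualiser", "resident-pc", "laptop", "face-cam"])
    for i in (1, 2, 3)
]
pcs = [StudentPC(seat) for seat in range(1, 145)]

# Station 1 sends its visualiser out on its first encoder; seats 1-48 tune in
stream = stations[0].route("visualiser", 0)
for pc in pcs[:48]:
    pc.subscribe(stream)
```

The point of the model is that switching happens in two layers, mirroring the hybrid design Ashmore describes: a baseband patch at the matrix, then a network subscription at each desk.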
Ashmore: “With the combination of low-latency (sub 50ms) video transport and large screens we can display lip sync’ed content from a live camera, generally via the Panasonic face camera. Many presenters are a bit shy about the face camera, but it can be useful if you’re lecturing to a full room in the first few weeks of semester — students can become familiar with your face. Academics can broadcast their face on the big screens, and then send other pieces of content to the student desktops via the H.264 streams.
“Having the large screens gave us a relatively real-time visual sync reference, addressing a limitation inherent in strictly H.264 streaming designs. In the X-Lab, where we could only stream content, there was a 250ms delay (variable, depending on the processor load on each individual PC), which meant you either had to delay the program audio to bring it into line with the network streams, or stream the audio over the network to each person’s computer and have them use headphones.”
Ashmore describes this hybrid approach of combining minimally compressed network video along with baseband video as one often favoured by the University of Sydney: “We could have achieved the same result entirely using baseband video with a giant matrix but that’s very costly and whenever we’re dealing with a dynamic switching requirement we will augment baseband video switching with network-based video.
“Equally, we don’t generally do fully network-based video as it often isn’t the right solution – we’re happy to take the hybrid approach where it makes sense. Using SVSI N1000 encoders means we don’t have discernible latency. And then with anything that’s less critical – a demonstration camera or showing slides, for example – we find that will be fine to put through the H.264 streams. That said, the system gives the users flexibility to use it as they see fit.”
Rounding out the video sources is the Mersive Solstice collaboration platform, which performs a number of pivotal content sharing roles, as Ashmore explains: “Perhaps our most conventional use of Solstice is as a network presentation gateway. We’ve imaged all the student PCs to include a link on the desktop to connect to the Solstice device for screen sharing. The system allows the academic to invite the student to share their work, as opposed to surreptitiously reviewing what they’re doing and then sharing it – a very significant cost saving and arguably the right thing to do by the students!
“Each lab includes two 80-inch Sharp interactive touchscreen displays. They provide a focus for huddle zones of up to 18 people and they’re generally used with the Sharp Pen Tools app for whiteboarding. As far as Solstice is concerned, it’s just another student computer, but it provides the ability to share whiteboarding with the entire lab session – it can be routed like any other source at the teaching desk, and the user experience of connecting a client device to the system is relatively consistent across use-cases.
“The third potential use of Solstice is a little more advanced. We had a request to put a camera in the fume hood — after all, it’s impractical for 144 students to crowd around a fume hood. Our solution is to use Solstice to mirror the academic’s phone camera. It’s quite low latency, especially when you broadcast it to the large displays, and it allows you the sort of flexibility a fixed camera simply couldn’t provide. It’s also a very cost effective solution, in that we were already spec’ing Solstice in the room. The lower resolution of screen mirroring is a welcome compromise for the flexibility offered, and it is rock solid once connected, thanks to the excellent wi-fi coverage within the space.”
Central to the audio setup, both in the LEES Superlabs and the original X-Lab, is the Panphonics N20 (flat panel, narrow dispersion ‘Audio Elements’) loudspeaker. “We settled on the technology after a lot of testing in the pilot phase of the Charles Perkins Centre X-Lab,” reports Ashmore.
“The original brief was for highly directional audio to specific seats without using headphones – the feeling was that headphones are an OH&S risk, and staff wanted the ability to get students’ attention regardless of whether they’re wearing headphones or not.
“We needed an electro-acoustic solution, which was certainly challenging. We did some testing of the Panphonics solution and found what we could achieve in the voice frequencies was far superior to all the other proposed solutions. Due to our initial success we’ve not seen any reason to change our approach for these Superlabs. The only difference was to increase the density of speakers, something recommended to us by Programmed Audio Design Manager, Tim Dogao.
“We also increased the number of Panphonics speakers to include one per teaching bench, which we didn’t have previously – for a bit of program fold back (the downside of having highly directional speakers is the presenter doesn’t get a clear idea of the program audio going through to the student zones).
“At the front end we did inform the University that this isn’t a rock ’n’ roll PA, it’s primarily for voice lift, and the settings are optimised for regular speaking voices to ensure minimum interference to other cohorts sharing the room.”
HONING THE ZONING
“Setting up the audio zones is simple thanks to the intuitive floorplan interface developed by AT Controls,” explains Ashmore. “At system startup, a setup page appears which allows the presenter to select the audio zones they wish to include in their session. We’ve further enhanced our previous designs by including selection of the large displays, expanding upon the dynamic nature of our laboratory learning environments. The zones are split into Banks, each a row of eight student seats. So that’s a lot of granularity.”
“Once the session is set up, the touch panel presents as identical to every other in the University. It has the exact same UI, providing familiarity and consistency, which is crucial to making the technology as seamless as possible, allowing academics to focus on their subject matter expertise rather than learning multiple systems. We felt this was important even in the most complex designs.
“There are two Shure ULX-D1 packs with Countryman EC2 headsets per teaching bench – not the cheapest mic but they provide excellent gain before feedback and very high speech transmission, which is the name of the game for the audio system.”
Rounding out the audio system is a four-channel Williams AV TX9 infrared distribution system – a student can simply tune into the channel their presenter is broadcasting on.
In a central AV/Comms room a QSC Core 110f provides routing and DSP, with one Panphonics amp channel per Bank of Panphonics speakers.
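The Bank-based zoning described above is straightforward to model. The sketch below is a hypothetical illustration only (the class names and helper functions are invented, not the actual AT Controls or QSC configuration): seats are partitioned into Banks of eight, each session claims a set of Banks, and a simple check ensures two simultaneous classes never claim the same speakers.

```python
# Illustrative model of Bank-based audio zoning, assuming a 144-seat PC1 lab.
BANK_SIZE = 8  # one Bank = one row of eight student seats


def seats_to_banks(seat_count: int) -> dict:
    """Partition seat numbers 1..seat_count into Banks of eight."""
    return {
        bank: list(range(bank * BANK_SIZE + 1, (bank + 1) * BANK_SIZE + 1))
        for bank in range(seat_count // BANK_SIZE)
    }


class Session:
    """One class session, claiming a set of Banks at the setup page."""

    def __init__(self, name: str, banks: set):
        self.name = name
        self.banks = banks

    def amp_channels(self) -> set:
        # One Panphonics amp channel per Bank of speakers, so the
        # channel set simply mirrors the claimed Banks in this model.
        return self.banks


def check_no_overlap(sessions: list) -> bool:
    """Two simultaneous classes must not claim the same Bank."""
    claimed = set()
    for s in sessions:
        if claimed & s.banks:
            return False
        claimed |= s.banks
    return True


banks = seats_to_banks(144)                         # 18 Banks in a PC1 lab
morning = Session("BIOL-prac", set(range(0, 9)))    # first 72 seats
afternoon = Session("GEOL-prac", set(range(9, 18))) # remaining 72 seats
```

With eight-seat Banks the setup page can carve a 144-seat lab into anything from a single eight-person group up to the full room, which is the granularity the article describes.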
WINNING OVER THE SCEPTICS
A big part of Ashmore’s job in delivering the new spaces was to ensure the move was as smooth as possible for the staff. An extensive schedule of training helped considerably, and staff were strongly encouraged to attend the deep dives into the tech – which paid off.
“Initially, there was a lot of apprehension about the technology,” reflects Ashmore. “Thankfully, some of the most vocal opponents are now some of our best power users.”
SHARP MAKES THE CUT
Ashmore: “The Sharp 80-inch interactive touchscreen displays are probably one of the most heavily used pieces of tech. It’s enormously satisfying to see staff, who may have initially been hesitant about the appearance of AV technology in their labs, now embracing it and becoming power users.
“I didn’t really have to train any of the staff in how to use the Sharp Pen Tools software – it’s fairly straightforward and they’ve just taken to it. Staff will generally write the notes up during the class, or they might take surveys, gather data through the class, write it all up on the whiteboard and then use the Solstice link on the desktop to share that back to the presentation, which they then broadcast to either the large displays or the streams to the student computers. We weren’t entirely sure how widespread the uptake would be, but sharing the ‘whiteboard’ happens every day. Sometimes it’s hard to take off your engineer hat and assess whether something is user friendly enough. Walking past the spaces and catching a glimpse of the hand-written formulae and collected data, distributed visually through the space, is probably the aspect that gives me the most satisfaction and positive reinforcement that the project team delivered true educational value to our students. Being able to look directly into the labs reinforces the project’s ethos of putting ‘science on display’.”
Nathan Ashmore (USYD)
Tim Dogao (Programmed Electrical Technologies)
Mark Connerton (PET)
Lucy Child (USYD)
Rizwan Muhammad (USYD)
Sam Gibson (USYD)
AMX SVSI video encoding
Extron matrix switches
Panphonics SoundShower speakers
Panasonic fixed installation cameras
Sharp interactive touchscreen displays
Mersive Solstice collaboration systems
Williams AV IR hearing augmentation
Shure ULX-D wireless
Countryman mic headsets
Wolfvision document cameras