8 Wheatley students win national praise

Richard Tedesco

What started as a classroom assignment for eight Wheatley School science research students ended in two farsighted projects earning honorable mention among 500 submissions in this year’s Toshiba ExploraVision competition, marking the first time Wheatley students have won recognition in the contest.

“I think it’s very good,” said Paul Paino, director of science research at Wheatley. “We’ve been entering this contest for a long time. We never made that list until now. This is a first for us.”

Five Wheatley student teams started working on the projects around Thanksgiving, according to Paino, who said preparing the submissions with the requisite abstracts, research papers and graphics for Toshiba ExploraVision was part of the plan. Each team worked on its project for two or three months to meet an early February submission date for the competition.

The two teams were notified several weeks ago that they had received the honorable mentions, which Paino said were well deserved.

“They’re great ideas,” he said.

Toshiba ExploraVision is open to students in grades K through 12, who work in groups of two to four to simulate real research and development teams, imagining how an existing technology could evolve and what it might be used for 20 years from now. Teams explore how their visions of technology could work and what breakthroughs are necessary to make their ideas a reality.

Past winners have projected future technologies ranging from self-cleaning toilets to a new method for treating diabetes.

The Wheatley students’ projects, which both used high-tech ocular technologies, convey the complexity of the problems that students in the competition contemplate – and what it takes to gain recognition for one brilliant futuristic notion over another.

One of the teams proposed that electro-oculography cameras that detect eye movement could eventually be used in automobiles as the central element of a system to determine whether a person attempting to operate a motor vehicle is under the influence of alcohol or drugs and incapable of driving safely.

“Saccadic eye movement decreases when you’re under the influence,” said Daniela Czerminski, one of two students who spoke on behalf of her four-person team.

The team was looking for a system to prevent drunk driving, and they reasoned that the sophisticated digital cameras could be rigged in autos to “read” people’s involuntary eye movements to detect someone who is inebriated. They considered the problem because, as sophomores, they’d all started driving recently.

“The first thing we thought of was drunk driving. They have Breathalyzers. But there’s an easier way around it,” Czerminski said.

They had considered another system that would extract blood samples from drivers, but it wasn’t fool-proof for all substances.

“We wanted to do finger-pricking, but that doesn’t detect everything,” said Gabrielle Pollack, one of Czerminski’s partners.

Their other partners in the project were Taylor Kaminsky and Allison Giller.

Czerminski said the one drawback of the technology is that people’s eyes naturally register slower movements just after they’ve awakened in the morning – and a driver could be wrongly flagged as under the influence by the sophisticated camera.

But that’s just one of the issues that awaits sorting out 20 years or so hence.

The other project, devised by sophomore Sukhveen Soni and juniors Zoraiz Arif, Charles Yu and Zohaib Shaikh, proposed a bionic contact lens that would enable direct virtual communication between people, as well as interfacing with companion technologies, including the Internet.

The bionic lens would be activated by gestural technology, enabling the user to view a virtual wrist watch, for example, by focusing on his or her hand so the computerized lens can “read” the gesture and perform the needed function. A micro-LED on the lens would enable it to produce holographic images of objects projected by a tiny camera mounted in it, according to Yu, who said the camera would transmit images of objects in its environment to the computer to process and back to the lens to project.

The activating gestures “would be programmed to the computer,” Arif said. “Again, this is projected into the future, when there will be enough computing power.”

It would be necessary to develop the micro-LEDs to decipher the gestures, Yu said.

“We’d have to cram it in with this technology, but it wouldn’t be adequate to our purposes,” said Shaikh, who noted that computer and plasma-screen development still lags behind what would be needed to combine the respective technologies into such a mechanism now.

But the students figure that human nature will provide sufficient motivation to enable the technology – and direct bionic lens communication between similarly equipped human beings.

“People want to do things as conveniently as they can,” Arif said.

Explaining the multi-layered nature of their prospective invention, Shaikh said the group was “looking really big” for a highly technical innovation, since the competition was being run by Toshiba.

The inspiration for their concept came from a pop culture icon, the robotic “Terminator” character played by Arnold Schwarzenegger, complete with a bionic eye that performed some of the same types of functions the Wheatley group has in mind.

“I thought it was cool,” said Yu.

And after the concept was translated into an abstract and an 11-page research paper with accompanying illustrations and submitted for the competition, the experts judging the Toshiba contest agreed.
