PERCEPT Featured in Easter Seals Crossroads Assistive Technology Update with Radio Talk Show Host Wade Wingler
5G Mobile Evolution Lab Director and PERCEPT Principal Investigator Professor Aura Ganz was a recent guest on an episode of the Easter Seals Crossroads Assistive Technology Update, along with 5G Lab Systems Engineer James Schafer. The show, hosted by Wade Wingler, reaches listeners in over 140 countries and is one of the most popular shows on assistive technology.
The show aired on April 4th and can be heard at this link: http://bit.ly/1q6dztZ.
More information on the show can be found at www.AssistiveTechnologyUpdate.com.
A transcript of the PERCEPT portion of the radio talk follows:
JAMES SCHAFER: Hi, this is James Schafer, and I'm the systems engineer in the 5G Mobile Evolution Labs at the University of Massachusetts, Amherst.
AURA GANZ: And hi, this is Aura Ganz. I'm a professor in the Electrical and Computer Engineering Department at the University of Massachusetts Amherst, and the principal investigator of the Percept Project, and this is your Assistive Technology Update.
WADE WINGLER: Today on Assistive Technology Update, I'm excited to be joined by Dr. Aura Ganz and James Schafer. They are in Massachusetts at the University of Massachusetts. We're going to talk today about a very interesting, and I think very innovative, project called the Percept Project. I guess I want to first introduce my guests. Dr. Ganz, James, are you there?
AURA GANZ: Yes.
JAMES SCHAFER: Hi there.
WADE WINGLER: Good, hey thank you so much you guys for taking time out of your afternoon to talk with me a little bit. I know you're excited about the Percept Project, and I am too. Are you guys doing okay today?
AURA GANZ: Yes, very well.
JAMES SCHAFER: Doing excellent.
WADE WINGLER: Good, so as I was preparing to talk with you guys today, I got to thinking, we're going to talk about a navigation system for people who are blind or visually impaired, and I remember it must have been probably 10 or 15 years ago, I was at one of the national conferences on assistive technology. I think it was Closing the Gap, perhaps. A friend of mine who is blind was wearing this big backpack that had a computer in it and had a big wire coming out of the backpack, and he was holding a joystick kind of thing under his arm and in his hand. It was one of the very early navigation systems. It was a GPS system designed specifically for people who are blind or visually impaired. I remember we went outside and we walked around, and it was pretty remarkable that it could figure out where he was, not exactly where he wanted to go, but where he was in relation to streets. It became very apparent very quickly that the thing only worked outside, where he had access to GPS signals. The moment we went back into the hotel, it stopped working. That was one of the early limitations in positioning systems for folks who are blind or visually impaired. But I think we're going to uncover today that you guys are working on a system that deals with some of those problems. Is that right?
AURA GANZ: Yes, of course.
WADE WINGLER: So tell us a little bit, Dr. Ganz, about Percept. Where did the idea come from and why is it important?
AURA GANZ: So Percept is an indoor navigation system for the blind and visually impaired. The blind person only needs a phone that carries the Percept application, and in the environment, or in the building, we deploy near field communication tags at specific landmarks such as doors, elevators, or stairs. When the visually impaired person enters the building, he or she will use a vision-free application and enter a specific destination such as room 309. By touching these tags that are deployed in the environment, the application will provide directions to the specific destination. I'm sure James will give you a little bit more explanation about how the system works, how the application works, but I want to talk to you about how this idea came about.
So I received a grant from the National Science Foundation some time in 2005, which funded me to develop something called animated spaces, or spaces that can actually talk to you, to the user, to the people who navigate through the environment. It was very clear to me that this technology could be very useful for people who don't have vision, or visually impaired people, and could guide them through the environment to their desired destination. We started to develop the Percept system using funding from the National Institutes of Health's National Eye Institute, which provided us funding to develop the first prototype of the Percept system.
WADE WINGLER: So, Dr. Ganz, I think a lot of my audience might know the answer to this question already, but tell me why indoor navigation is important for folks who are blind or visually impaired.
AURA GANZ: Independent navigation through unfamiliar indoor spaces is a very significant barrier for the visually impaired. Currently, visually impaired people need the assistance of a sighted person to get wherever they desire to go, such as to a mall, to work, or shopping. So providing a system that gives them independence is very important for their lives, like increasing their probability of getting employed. Currently, based on the statistics that I have, the unemployment rate in the visually impaired community is about 75 percent. We think this technology can give them the opportunity to travel to work, or travel wherever they need to go, independently, without depending on sighted people.
WADE WINGLER: That makes a whole lot of sense. Our show tends to be about technology, but really it's all about technology that fosters independence for folks with disabilities. This clearly does that. So why don't you, Dr. Ganz, or you, James, tell me some of the nitty-gritty about how this works. I know it involves a smartphone, I know it involves near field communication, but let's dive in a little deeper into that.
JAMES SCHAFER: Okay, so I first want to break Percept into three distinct parts. First, let's talk about the user. For the user, they are going to be experiencing Percept through a smartphone application. Now they are going to be using Percept in a building environment, and within the building, we are placing NFC tags, which is short for Near Field Communication. This is technology that is very similar to RFID: if you've ever used Fast Lane or heard of E-ZPass, or in malls, they put security tags inside items and an alert will go off if you go through that security checkpoint. This is a similar technology.
NFC in particular is a near-touch communication where the phone will come in close contact to the tag. Through that, we are able to associate a person's location. So what we have is a smartphone application that interacts with the building environment, and we are able to know the location of the person.
The final piece of the puzzle is the Percept server. What this does is digitally represent the building. Through that, we are also able to calculate a pathway from where the person is to where they would like to go, and translate it into sensory landmark-based navigation instructions that a visually impaired or blind person can follow.
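The server-side flow Schafer describes (a landmark graph, a path computation, and per-leg spoken directions) can be sketched in a few lines of Python. Everything below is illustrative: the landmark names, the hint strings, and the use of breadth-first search are assumptions for the sketch, not details of the actual Percept implementation.

```python
from collections import deque

# Toy model of the Percept server idea: the building is a graph of
# sensory landmarks, an NFC tag scan fixes the user's current node,
# and the server returns landmark-to-landmark spoken instructions.
GRAPH = {  # landmark -> {neighbor: spoken hint for that leg}
    "entrance": {"elevator": "walk forward, trailing the right wall"},
    "elevator": {"entrance": "walk forward, trailing the left wall",
                 "room_309": "exit right; the door is the third on the left"},
    "room_309": {"elevator": "exit the room and turn left"},
}

def shortest_path(start, goal):
    """Breadth-first search over the landmark graph."""
    queue, seen = deque([[start]]), {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in GRAPH[path[-1]]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

def instructions(start, goal):
    """Turn the landmark path into step-by-step spoken directions."""
    path = shortest_path(start, goal)
    return [GRAPH[a][b] for a, b in zip(path, path[1:])]

print(instructions("entrance", "room_309"))
```

Because each edge carries its own hint, the output is the "one sensory landmark to another" style of direction discussed next, rather than metric turn-by-turn guidance.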
WADE WINGLER: So kind of like turn-by-turn, but more like step-by-step with very tactile landmarks.
JAMES SCHAFER: Yes, so instead of turn-by-turn, it's, as you say, from one sensory landmark to another sensory landmark.
WADE WINGLER: That's cool. So is this designed to work with or replace traditional orientation and mobility systems?
AURA GANZ: No, it's designed to work with existing orientation and mobility tools. This is a very important guideline that we follow. The visually impaired person will have their cane or guide dog; we definitely don't replace those orientation and mobility tools. This was told to us by orientation and mobility specialists. Many projects have tried to replace those tools, but none of them was accepted by the visually impaired community.
WADE WINGLER: That makes sense. I think we've learned that it takes a lot of different tools. Any navigation system can tell you where you're going, but you need to know where the top of the staircase is as you're heading down so that you don't fall. James, I think you mentioned that you're using an Android phone. Is it tied to the Android platform at this point, or is it working on multiple platforms?
JAMES SCHAFER: Currently, right now, it's working just on the Android platform, but we do intend to develop it for the other mobile platforms as well, including the iPhone, which seems to be the most popular platform for the blind and visually impaired.
WADE WINGLER: So it requires the installation of NFC tags and a virtual map of the building. Where does it work, and how hard is it to map a building and set up the tags?
JAMES SCHAFER: So currently, one of the things that we are developing is another application that we call the O&M tool. What this tool allows us to do is indicate where the doorways, elevators, and other sensory landmarks are, and automatically this tool will stitch together the pathways throughout the building and provide navigation instructions. So really the goal and the emphasis is to create a system that is low-cost and that doesn't require a significant amount of engineering time to design all of the routes, or an orientation and mobility instructor to write each and every navigation instruction from every point in the building, but a tool where you can just indicate the landmarks and have Percept generate the navigation instructions itself.
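A minimal sketch of what such an authoring tool might automate: a specialist marks landmark positions on a floor plan, and the tool stitches nearby landmarks into a pathway graph with auto-generated, direction-based instructions. The coordinates, the linking threshold, and the instruction wording are all assumptions for illustration; they are not taken from the actual O&M tool.

```python
import math

# Landmarks marked on a hypothetical floor plan, in meters.
LANDMARKS = {"entrance": (0, 0), "elevator": (10, 0), "stairs": (10, 8)}
MAX_LINK_M = 12  # assumed: connect landmarks closer than this

def heading(a, b):
    """Compass-style direction of travel from point a to point b."""
    angle = math.degrees(math.atan2(b[1] - a[1], b[0] - a[0])) % 360
    return ["east", "north", "west", "south"][int((angle + 45) // 90) % 4]

def stitch(landmarks):
    """Auto-build the pathway graph and a hint for each leg."""
    graph = {}
    for src, p in landmarks.items():
        for dst, q in landmarks.items():
            if src != dst and math.dist(p, q) <= MAX_LINK_M:
                graph.setdefault(src, {})[dst] = f"head {heading(p, q)} toward the {dst}"
    return graph

print(stitch(LANDMARKS)["entrance"]["elevator"])
```

The point of the sketch is the labor saving Schafer mentions: the specialist supplies only the landmark positions, and every pathway and instruction falls out of the geometry automatically.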
WADE WINGLER: That's cool, and that sounds like a very interesting way to go about it.
AURA GANZ: I also would like to add that one of the significant benefits of our system is that in case you're not able to follow the instructions, you can get rerouted to the destination from anywhere in the building. It sometimes happened with our human subjects that they missed landmarks mentioned in the instructions we provide. However, it's not the end of the world. Using our technology, they can get rerouted to the destination from anywhere. It's sort of a dynamic system.
WADE WINGLER: So it's smart enough to know where you made the wrong turn and, using something like breadcrumbs or other technology, help you get back to where you can go the right way again.
AURA GANZ: Yes, so whenever you reach a landmark, which may not be the correct landmark, it doesn't matter; you can be rerouted to the destination from anywhere.
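The rerouting behavior Ganz describes follows naturally from the tag-based design: every NFC touch is an absolute position fix, so recovering from a missed landmark is simply re-planning from whichever tag was actually touched. The single-corridor layout and planner below are assumptions for illustration only.

```python
# Hypothetical single-hallway building, landmarks in walking order.
CORRIDOR = ["entrance", "elevator", "room_305", "room_309"]

def plan_route(current, destination):
    """Remaining landmark sequence along the hallway, in either direction."""
    i, j = CORRIDOR.index(current), CORRIDOR.index(destination)
    return CORRIDOR[i:j + 1] if i <= j else CORRIDOR[j:i + 1][::-1]

def on_tag_scanned(scanned, destination):
    # Whether or not `scanned` is the landmark the user was told to
    # reach, the next directions always start from where they really are.
    return plan_route(scanned, destination)

# A user bound for room_309 misses a landmark and touches the tag at
# room_305 instead; the route simply restarts from there.
print(on_tag_scanned("room_305", "room_309"))
```

Because the planner never needs the history of where the user has been, a wrong turn costs nothing but the walk: the next scanned tag restarts guidance from that point.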
WADE WINGLER: That's great. So talk to me a little bit about the future of the Percept Project. If an organization were interested in having this technology in their building, what would they do?
AURA GANZ: So currently we are in the process of fundraising for our next generation of Percept, which will enable us to generalize the solution to any building. This is where we currently are. What I mean by generalizing the solution to any building is that we need to write software that takes a blueprint of the building and is able to generate the instructions automatically. Obviously this will significantly lower the cost of deployment in any building in the country. So I am now in the phase of writing grants for the National Institutes of Health's National Eye Institute and other funding agencies. In addition to that, we need funding to write the application for the iPhone, as James mentioned; this is the most prevalent platform for the visually impaired. So this is currently where we are. I would also like to mention to your listeners that we are currently testing the technology in the Boston subway system, at the Arlington subway station, which I think is a very important development for this technology; it shows how useful it will be.
WADE WINGLER: I think a subway station is a great example of how this technology can be deployed. I can think of lots of public buildings and hospitals and museums and academic buildings and places where this would be very useful, not to mention retail establishments. As I think about this, I have lots of ideas about how this could be used, and I'm sure you do too. If somebody were interested in learning more about the technology, if somebody were listening and wanted to learn more or to invest in the work that you're doing, how would they reach out to you? How would they find out more about the Percept Project?
AURA GANZ: Please visit our website, which is percept.ecs.umass.edu, or contact me at firstname.lastname@example.org. The website I just mentioned has a description of the technology and the many press releases that have been issued about it. I think it's very informational, and we will be very happy to answer any questions that you may have.
WADE WINGLER: Excellent. Dr. Aura Ganz is the director of the 5G Mobile Evolution Lab, a professor at the University of Massachusetts Amherst, and the principal investigator of the Percept Project. James Schafer is the systems engineer and the project manager for the Percept Project. Dr. Ganz, James, thank you so much for taking some time out of your afternoon to talk with me today.
AURA GANZ: Thank you for the opportunity.
JAMES SCHAFER: Thank you so much, Wade. It was great talking to you.
WADE WINGLER: Do you have a question about assistive technology? Do you have a suggestion for someone we should interview on Assistive Technology Update? Call our listener line at 317-721-7124. Looking for show notes from today's show? Head on over to EasterSealstech.com. Shoot us a note on Twitter @INDATAProject, or check us out on Facebook. That was your Assistive Technology Update. I'm Wade Wingler with the INDATA Project at Easter Seals Crossroads in Indiana.