KIRK BEHNKE: Fantastic, thank you. Hi, everybody. Welcome to the Pennsylvania Tech Accelerator webinar for today. My name is Kirk Behnke, and I'm a contractor for the Pennsylvania Tech Accelerator being sponsored by Temple University. Our presentation topic for this month is Alternate Access and Switches. So that's our presentation for today at 4 o'clock.
This is the last session of our current series, which is called Tools for Employment, Independence and Socialization. This session is being recorded, and the video will include picture-in-picture ASL translation as well as a full transcript. You can access the video on our website at disabilities.temple.edu. And Caitlin is going to put a link and a couple of other resource links in the chat as well for your viewing and linkage pleasure. That website is where you can also register for our future webinars.
And I would like to announce that our next and final series will be on recreation and regulation, and that starts next month in October, and it goes until the end of December. TechOWL has a wonderful series of these 18 topics, and we have run these webinars since July of last year. We do have the recordings on our website as well. So each session is held twice on the third Thursday of the month. We offer it at 12:00 and at 4:00. Both sessions have the same information, so you can attend whichever one is most convenient for your schedule.
So the series is part of this Tech Accelerator Program, which is an initiative funded by the Office of Developmental Programs and the Office of Long-Term Living here in the Commonwealth. And this is made possible through the funding from the American Rescue Plan Act. And then you can learn more about the Technology Accelerator Program and the fantastic projects that we're still working on with the link in the chat. That information will also be sent to you in a follow-up email with the PowerPoint or slides and other materials from today's session.
So for those of you who are joining us for the first time, TechOWL-- which stands for Technology for Our Whole Lives-- is a federally designated Assistive Technology Act program that serves the Commonwealth of Pennsylvania. Every state and territory does have a state Tech Act project. And actually, Mike will talk a little bit about his in New Jersey. But each state and territory has some commonalities, such as having device lending libraries or DME reutilization programs.
Most of them also offer training and educational opportunities. So if you live in another state and you want to know more about the Tech Act projects, feel free to let us know and we can look up your local AT program so that you can take advantage of those services. OK, so without further ado, I'm going to hand it over to Mike Marotta, who is our guest speaker for today. And Mike, take it away.
MIKE MAROTTA: Thank you, Kirk. I appreciate it. Hello, everybody. Good afternoon. Give me one second to share my slides. There you go. I am excited to be here with you today. I'm looking forward to sharing this information about alternative access and switches. If you would like the slides, I know Caitlin's going to send things to you afterwards, so you'll get a follow-up with the link and also, I'm sure, the link to the recording.
But if you really want them right now-- and by the way, I would say get the slides, not just because they're mine and I would say get them. But get them either through the QR code on the screen. There is a Bitly on the screen, which is bit.ly/mmatp091825. Or if you're joining us here live, I just put the link in the chat. You can click on it and open it-- a bunch of different ways to get to it.
Like I was saying a second ago, I would encourage you to get the slides because I'm going to talk about a lot of things-- a lot of technology tools-- and there'll be pictures of them. And I've tried really hard to link out to where you could get those tools. So if you're interested in learning more about that, I've tried to add a bunch of links in the slide. So anytime there's an image in the slides, hopefully, unless I missed one or two, it will take you out to some website that you can get more information about that device.
I'll do my best to explain them as we're here, but I do recognize that we're kind of overviewing it right now. We're going to talk about a lot of things in an hour. We did this earlier today, and we talked about a lot of things, and it was exactly an hour. So I think we're going to run right up on the end as we go. But I'll try to tighten it up and move it along a little bit. But there's good information there.
But I think, like anything else, when you come to a webinar, a lot of it is an introduction of information. That is, you find something that's interesting to you, and then I would encourage you to go out and learn more about it-- find more information about it. So hopefully you got the link to the slides, and you have all that. Kirk mentioned one of my jobs. So I have several jobs in assistive technology. I am an AT specialist. Ultimately, I think it all boils down to that. I've been an AT specialist for 35 years now. So that's a long time.
Kirk and I made a joke before. Kirk and I worked together a very long time ago, too. We are seasoned, not old. So we're going to say seasoned. But I've been an assistive tech specialist for a very long time. As Kirk mentioned, I am the director of the New Jersey Assistive Technology Act Program. So similar to how Kirk described the activities that TechOWL does, we do the same activities in New Jersey. So if you're watching us from New Jersey, find us. We're at the ATAC Center at Disability Rights New Jersey. So you can check us out there.
I'm also an adjunct professor at a college here in New Jersey. And then I was lucky enough to author a couple of books about technology in education, and those are listed there. I'll give you my email address at the end so you can reach out if you want any more information or you want to chat about things. I always offer that up to people in webinars.
If you take a minute and process this information and you have a question or a thought, I know those thoughts never hit me when I'm in a webinar. They usually hit the second the webinar ends. If that happens to you, I feel your pain. Email me. I'm happy to chat with you about anything as we go forward. And definitely reach out to our friends there at TechOWL. They have some great information that they can share with you.
So what's our plan? Here's our afternoon plan. Talking about the idea of technology tools that allow us to access whatever it is we're trying to do. I mentioned earlier-- and I love this idea of this topic-- the concept here is the idea of tools for access. So assistive tech tools for access is one of these umbrella areas where it touches everything.
If we're talking about driving a wheelchair or using a computer, or using your phone, or using technology to read or write-- whatever it might be that you're using technology for-- access is part of that conversation. It always is. How are you going to do whatever you're trying to do on that technology? And that's where the access comes in. So we'll talk about that. We'll break it down into a couple of different categories, show some examples, talk about some ways you might use technology for that.
So we'll start with the idea of matching a person to the technology. If you've ever seen me talk, I will talk process first. I will always talk about the idea of how we find the right tool for someone. I don't want to start with just tools, because then we end up potentially looking at a bunch of tools that aren't even appropriate for us or the person that we're working with. So let's talk about how we figure that out first.
And then we'll branch out into the other areas of access. We'll talk about touch, voice. We'll look at keyboards and cursor control, so we'll look at mouse alternatives. We will talk about switches. And then my favorite topic, which is number 6, other-- just some other things that fit into this general idea of access but don't really fit into any of those other categories. So we'll hit those towards the end. As we go, I would encourage you, if you have ideas, thoughts, or questions, drop them in the chat. I'm watching the chat as we go.
If I see something that's in the chat that relates to what we're talking about in that moment, I'll probably just answer or comment then. If I see something that one of you posts and it's about something I know we're going to talk about, I'm going to hold off and I'll talk about your comment then. But I wanted to put that out there so no one feels like, well, wait a second, he answered that person. Why is he not answering me? I'll see if I can get it to fit somewhere in the context of our conversation. And then at the end, we'll catch up on anything we didn't answer as we went through. So that's our plan. Let's do it.
So let's start here because this is the most critical part to start in-- this idea of feature matching. How do we make sure that the technology the person has is the technology they need? We see instances where-- I've seen this both at the Tech Act and also as a consultant. I've seen this where people say, well, I got this assistive tech because someone else had it, and it worked really well for them. But I have it, and it stinks.
And my comment usually is, well, that tech doesn't stink. It just stinks for you. It's not the right tool for you. We can't lead with the technology, because a lot of times when we lead with the technology, we end up with the wrong thing. And so sometimes it's important to take a step back from this and say, what is it I'm trying to accomplish? And so I'm going to present to you here quickly this idea of the HAAT model, H-A-A-T.
The HAAT model breaks down the way we would approach things. And it has these three components. The human is the person. What are their skills and abilities, likes and dislikes? What are things they've tried in the past that have worked? What are things they've tried in the past that didn't work? Give me the context-- the idea of the person. What is the person about? And then the activity-- the activity is key here.
We are trying to find a piece of technology to meet a need the person has. What is the activity they're trying to accomplish? Then we'll be able to find the right technology. If we lead with technology, we then have to work backwards to, I have this thing-- this piece of technology-- and I wonder if it'll work here in this activity, and I wonder if it'll work for this person. We're working backwards.
So instead, always thinking about that person first, who they are, sometimes I'll say, what's that person's story? Just what are they about? What's happening there? Give me the details of that person and then move through it that way. If you're someone who works in education and assistive technology, you might have heard of the SETT framework-- another way to approach consideration of, what does the person need?
And again, that puts the person first, the technology last. I'm a big fan of that. And you'll learn quickly. I love technology, by the way. I'll put that out right here in the first slide. I love technology. I love the right technology is what I love. I've seen far too many examples of the wrong technology for people. So thinking about that, coming from a space of understanding the person and what it is they're trying to accomplish, will help us get a good tool match at the end because, ultimately, that's what we're doing.
We're doing something called the feature-match process. This model is pretty straightforward. On one side is the person: their needs, their abilities, their expectations, likes, dislikes-- all of those things I just mentioned on the one side, the individual. The other side of this coin is the technology. And the technology is really broken into three parts: how I input into the technology-- that's what we're going to talk about today, access. How do I get into the technology to complete my task? The processing is what that technology does for you. The output is what you get. What you're trying to accomplish is the output.
So we will focus our conversation today on that input part, which is the access piece. And we'll move through these choices as we go and look at some examples of that. But as long as we start from an idea that learning the person and their needs and wants first before we jump into tech is critical. I think I said enough of that. You get my point there. I could pull my soapbox out one more time, but I don't think I need to.
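The feature-match idea described above can be sketched as a toy script: list what the person needs on one side, list what each candidate tool offers on the other, and see which tool covers the most needs. This is purely illustrative, not a clinical assessment tool, and the needs and tool features below are hypothetical examples.

```python
# Toy sketch of the feature-match process: compare a person's access
# needs against the input features each candidate tool offers.
# All needs and tool features below are hypothetical examples.

person_needs = {"single-finger touch", "light-touch activation", "large targets"}

tools = {
    "Tablet with keyguard": {"single-finger touch", "light-touch activation",
                             "large targets"},
    "Standard trackpad":    {"fine motor control", "multi-finger gestures"},
    "Voice assistant":      {"consistent speech", "quiet environment"},
}

# Rank tools by how many of the person's needs their features cover.
for name, features in sorted(tools.items(),
                             key=lambda kv: len(person_needs & kv[1]),
                             reverse=True):
    matched = person_needs & features
    print(f"{name}: matches {len(matched)}/{len(person_needs)} needs")
```

The point of the sketch is the order of operations: the person's needs are written down first, and the technology is scored against them, never the other way around.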
So let's start with touch. I start here because I think in the context of technology now, this is where a lot of our tools live. We have a lot of tools with touchscreens. I mean, look around you just in front of you right now or maybe even the device you're sitting at watching this. It's probably a touchscreen device. And touchscreens are no longer only in very high end devices. I mean, we're talking about every single device.
I'm eyeballing my whole desk right now, and every single item on my desk has a touchscreen. My phone, my tablet, my computer, and my laptop all have touchscreens, so it doesn't matter what I have-- touch is an option for access. Now, as we dig deeper into access, our conversation has to go with those considerations we think about. Is touch the right tool for us? Is that the right access method?
And I guess it depends, right? These considerations become really important when we start thinking about these different areas of tech. What is it the person is able to do? Are they able to isolate a finger and directly select something on a screen? Are they able to do that? Do they have the functionality and the fine motor control to perhaps touch that? Do they have some of that functionality? And maybe we can use additional tools or customizations to allow them to do all of the activities they want to do.
When we think about that, those become part of our questions. What is the person trying to do? People will say sometimes to me when I talk with them, well, this person needs to use an iPad. OK, great. Can you guess what my first question might be? The first question is, why? Why are they using the iPad? What are they trying to do? Help me understand what they're trying to do, then let's talk about how they might access it. If it's simply the direct select, great.
Do they need access to apps? Do they need access to the menus within an app? Do they need access to different settings to control them? And what does that look like as far as the touch access? If you think about your phone, using your finger to isolate and tap on an app icon on the screen, those are a relatively good size. Maybe on your phone they're, like, a half inch, 5/8 of an inch square, or something like that.
You might be able to do that. But think about when you get in an app. How many times do you get in an app, and to change the settings of that, it's a dot in the top corner or something? And can the person do that? So what do I need them to have access to? Does that change the way they might use that device? One of my other considerations I always like to remind people is touch devices predominantly are not about pressure. It's not about pushing hard. It's about touch-- simply contact on the screen.
And so if someone has a very light touch, that's OK because as long as they're making contact, they're going to be able to select the things they want. However, when we start thinking about touch and using touchscreens, there are some things we need to think about. We probably do these a lot and don't even think about it. Think about all the times we use a gesture on a touchscreen device to do something-- to perform some action.
Do we drag things? Do we tap? Do we double tap? Do we tap and hold-- I'm just kind of going through some of them here-- a pinch or a spread to adjust the size of things? Those are all functions that someone might need to do on a touch device. Can they do that? Perhaps they have very good access with one pointing finger but lack the ability to use two fingers simultaneously to pinch or spread, or something along that line, or to hold their finger on the screen and slide, or slide and tap, any number of things.
So if that becomes an issue, we have to look at some of the settings. Can we customize our device? And luckily, if you're using an iPhone or an iPad, part of your accessibility settings is a whole section called Touch. And there is a function in there. It was previously called AssistiveTouch. Now it's just called Touch.
That allows you to mimic gestures by simply tapping an icon on the screen. So if I'm still able to tap, the device thinks I've done something else because of the built-in accessibility features. So these are the questions we have to ask: what does the person need to access? What do we need to help them customize their experience so they can do that?
This is my moment where I take a step out of our presentation for a second and I say, if you have never played with the accessibility settings in your phone or your tablet or your computer, I'm giving you a little homework. If you're here right now, you can't do the homework now. If you're watching this recording, I might say to you, pause it for a second and find your accessibility settings and just see what's there. There's a whole wealth of information there in our devices.
It doesn't matter what device you have. They all have something built in that is very powerful and can go a long way to providing an assistive technology support for a person to use in order to be successful. So look at those. I will push my soapbox back under my desk now, but that's really important to me. I think it's important to share. That's a great place to start is built-in accessibility features and looking at your touch devices in this case.
We talked about the idea that touch needs you, too. In theory-- this is my first air quote-- air quote, "in theory," I need to use my finger to point at the device and tap on it. For the majority of people, that's how they access a touch device. That doesn't mean that's the only way to access a touch device. There are styluses. There are pointers-- lots of options out there. I just got a free random pen in the mail today because they want me to buy a whole bunch of pens, and what's on the end of it but a stylus.
So we can get styluses anywhere. They come in all shapes and sizes. The conversation for this has to be with the individual. What does this person need? Are they going to be using a finger? Are they going to be using a head pointer? Are they going to be using something-- I work with someone who puts something on a cuff on their arm, and that's how they're tapping on a device. Great, that works. So we have to find the right stylus and pointer for that.
So looking at some of these: if you have a 3D printer, there are so many options out there on the web-- 3D-printing files to create styluses and various pointers. The one on the screen here is from Makers Making Change, called the 6th Finger-- that's what they call this-- or the Toe Stylus. So it's giving you another pointer. It's a stylus, but it has a bit of a ring at the end of it, so I could slide it on a finger and end up with another pointer option off of that finger.
Thinking of something like that, keyguards provide some boundaries for people when they're using touch devices, because we think about a touch device-- what is it? A piece of glass. And so how can I get some boundaries on there? A keyguard is a sheet of plastic with holes cut out that correspond to the items underneath. So it gives me some boundaries as I access apps and settings and things like that.
Here are some more 3D-printing things. Those are just three you've seen, but there are so many more out there. With any of these, you click on the picture, and it will take you out to that page on their website to print it. So if you have a 3D printer, you could be well on your way to printing these kinds of adaptive styluses for people.
We'll move on to voice. I put these two first because I think we're at a point now where these two are the most predominant access methods we're using. Like I said, most of our devices have touch. Most of our devices have voice access at some point now-- some kind of voice access, whether that is voice access for typing.
So on our computers, tablets, and smartphones, is that something that's providing me speech to text? So I say something, and the text shows up on the screen. Or is that something that is allowing me to speak to it, like an Alexa or something like that? Or even my phone, am I talking to it? And it is either responding back to me or performing some action. So two kinds of things that can happen here-- writing or doing something else.
And so all of our devices have speech now. What's interesting-- a couple of things are unique now that that's happened to us-- is that most of our devices, even though they have speech, it's not speech that we train anymore. We don't train it to know our voice. We just turn it on. And if our voice fits into its bank of what it thinks words sound like, it will produce the desired result.
The trick that we have sometimes is that if we're working with individuals whose speech is affected in some way, they are sometimes on the outside looking in for tools like this because it doesn't give us enough flexibility to recognize some voices. And so as good as speech has gotten and as advanced as it's gotten, that's almost the step backwards it's taken for us in assistive technology, because with the first voice systems that we had, you would train it to understand your voice. And it would learn over time how you say things.
So it didn't matter how I said a word. As long as I said a word consistently every time, it didn't matter. And so it's unfortunate that some of our tools have gotten away from that. There are some tools on the market that can help us with that. If someone has a voice that's what I would probably call a non-traditional speaking voice-- what this tool is not expecting to hear-- there are some different apps that can help you, where you train them. You train the app to recognize your voice.
There's one called Voiceitt-- Voice and then I-T-T; I believe there are two T's. That is a tool where you speak to the app, which you have trained to recognize your voice. It recognizes your voice and then speaks out loud in a computer-generated voice. So the theory would be, if I had my Alexa in front of me, I would speak directly to my iPad, it would speak out loud, and the Alexa would do something. So it's giving you an in-between, which allows for that customization.
Other warnings I would give you with speech-- not warnings. We'll call them considerations, not warnings. But microphones. Get a good microphone if this is what you're going to use. If you're buying a headset mic from the dollar store, you're not getting a good microphone. If you're using a laptop and you're expecting a good microphone, chances are high you probably don't have a good microphone. So think about a microphone.
Think about a quality microphone. Can you get something that's a headset or the person wears so that regardless of where they turn the microphone is picking up the speech consistently? Now, I've said that word twice. These systems like consistency. They want to expect to hear a word said the same way every time. So where you are in relation to the mic is really important.
The environment is really important. Quiet is obviously better. That doesn't mean you can't use it in a louder environment. It just means you need to be a little more aware. Your microphone needs to be a little better. It needs to be a little closer to you. Maybe you need to position yourself in a way, in this kind of louder space, to where you're limiting the noise as best as you can. So some of these things to think about with speech recognition now, really interesting as we go.
I mentioned the idea of these two ways we use speech. One is for writing. I share with you this picture from Kelly Fonner, which I love, which talks about the process of writing with your voice. I use this a lot because people will say things to me as an AT specialist like, that person talks all day long. They're going to do amazing with speech to text. And then I remind them that talking is not the same as writing with your voice.
Here's what you do when you write with your voice. Just look at the steps involved here. We'll go through them, but think about this idea. I have to think about what I'm going to say, and I have to compose it in my head. I do two steps before I've even said anything out loud-- think and compose, speak it out loud. The computer picks it up, or the device picks it up, and prints it on the screen for me. Then I have to read it in order to check it to make sure it's accurate.
If I also struggle with reading, there's another barrier that's happening now. So I might need to give that person another tool to help them perhaps read the text out loud so I can hear what the system wrote when I spoke. So now I might have two tools all of a sudden. So I think, and I compose. I speak it out loud. I read it. I check it. I edit the work. It doesn't always get it wrong. It also doesn't always get it right.
If I gave you an off-the-cuff guess, 95% accuracy is what people see, which sounds great. So out of every 100 words I say, it gets 95 of them right. Awesome. The interesting part here is, how long does it take me to fix the five words it got wrong? And what is my process to fix those five words? Am I able to do that independently? That becomes part of the solution here that we have to figure out for the person.
So then they revise their work and they fix it, and then they do this whole thing over again. One of the most cognitively demanding tools we have is speech to text because of these steps and how involved it is to get something-- to get quality out of it. I was going to say perfect, but as perfect as it can be, I suppose, to get what I wanted out of it.
I've given you some resources, just in case. I know we're talking about perhaps youth and adults today. While these resources are more geared-- two of them are more geared towards students in school, the concepts are the same. So moving from left to right, you have a podcast with OTs talking about using speech to text effectively.
You have in the center a self-paced web learning module from a group called OCALI that talks about speech to text. And then on the right, you have a training guide that would help you as a professional teach someone how to do speech to text effectively. Those are all linked under the pictures. So if you have the slides, those will be there. You can get to them. I'll show you the link to the slides at the end again. So we'll have it, so don't worry.
The other part of voice that we're talking about here-- we had the writing part, speech to text for writing-- now we have voice for the other things we do. And I'll take a little side trip into voice assistants right now-- personal assistants like the Alexa or the Apple HomePod, which is what the colorful items on the bottom are. Two different kinds of devices directed by speech. So I use speech to control them, and then they in turn do other things for me.
What are some of those other things? Here's a great example of an infographic of what most people ask their Alexas to do. And I don't know that there's any big shocks here-- listen to music, ask it a question, check the weather. Those seem pretty straightforward. That seems pretty accurate there. Set a timer. Set an alarm. Those are very typical things.
What becomes closer to our interest area here when we think about these devices as AT supports is a little further down-- not as used as some of these other features, but just as valuable. Control smart-home devices. Looking at or finding a recipe, cooking instructions, that kind of thing. So I think those are pieces. I was trying to see if there was something else, but no, those are the ones I want to hit on. Those become part of how we can use these tools to be more independent.
And so we see here this is another graph, which shows what people are using smart-home applications for. We see control and connectivity. That would just be connecting to the internet, controlling the device itself, home entertainment. Then we start to see some of the other pieces that we would look at-- comfort and lighting, security, and then at the very bottom, smart appliances. And I'd like to think that there's got to be a new graphic of this. I just can't find it from this group.
But I'm curious where we're at now. If I had to guess, I think our smart appliances usage is much higher up this list now. I think we're seeing a lot more manufacturers providing us examples of speech-enabled appliances, refrigerators, coffee makers, microwave ovens-- those kinds of things where I can speak to it, and it performs tasks for me. Again, consumer electronics equipment, but incredibly powerful for us as assistive technology tools as we go forward.
And so we look at some of this home automation. We look at things like, as we go from left to right across this slide, thermostats that are voice controlled, wireless smart plugs that connect to my devices that allow me to control other items in my house strictly with my voice. So even if I have, for lack of a better word, a dumb appliance-- so not a smart appliance but just something that just plugs right in the wall-- a fan. No disrespect to fans, but maybe the fan's not very smart. It just plugs in the wall.
But I want to control when that fan comes on and off. I can plug it into one of these wireless-enabled plugs and then connect my Alexa to that plug to control that item. And again, these are the kind of things that at one point were very expensive. And now, smart plugs, you could probably get a three pack of those for maybe $20. Control a few things with your Amazon Alexa.
I showed you the picture of the Amazon Alexa before. The smallest one with the screen-- that's the Show, the 5-inch Show-- is probably not even $50. So we have an opportunity to get things that are relatively inexpensive but set people up to be very, very successful and independent in their environment. The last two pictures are more of the appliances I mentioned, so microwaves that have Alexa built in, coffee makers that have Alexa built in. So I can simply announce to my items to do something for me, and they do it.
Controlling lights, we mentioned before, through plugs. There are lights themselves that are Alexa enabled, so there is a receiver built into the actual light bulb itself. So I can control lights with a voice assistant, turning them on and off, dimming them to whatever I need to do. The item on the right on this screen is a similar smart home tool. This works from an app. This is not necessarily through your Alexa, but I like throwing it in because I like the idea. This is a very physical tool, which I like. It's called the SwitchBot.
It's a small little square. It connects to an app that you put on a smartphone or a tablet. And when I hit the button on the app, a little finger extends out of this SwitchBot and turns on and off buttons and switches. So imagine-- the picture that they show is a great example. That person has a printer with an on/off button, but perhaps they can't turn the on/off button-- they can't use it. They can't physically access it.
So the SwitchBot gets taped to the printer, and when they hit the button on their device, the little pointer comes out and turns the button on. When they hit it again, the little pointer comes out and turns it off. So it's an extension of a finger. I used that yesterday, and somebody did not like that. They were like, ooh, we don't like this idea of a little finger coming out of it. But it's a little like an arm-- like that made it better, sorry. Should I call it a thing? I mean, I'm an AT person. I'm supposed to use better words than "thing." But yeah, a little thing comes out of it.
This person wants to be independent and in control. I consider curtains and blinds part of security-- I would think of them in that realm when I think about independence. So there's a couple of items that are Alexa enabled. That's the one on the right, which is some blinds. The one on the left is the SwitchBot company again, so the same device we saw before. This one is app enabled, and it mounts on a curtain rod. And when I ask it to open my curtains, the little rollers just roll my curtain across to open it. And when I'm ready to close it, I hit the button and the little rollers on the curtain rod just come across and close it again.
In my mind, that's fairly low tech. That's all about just putting batteries in it and using the app on your phone. I like that because it's a little easier sometimes for people. They don't feel that's as complex. But it's a good option for some people if they have curtains they need to open and close. Security for doors opening and closing-- we can get as high tech as a fingerprint-enabled, voice-controlled lock, which is on the right.
So that has pretty much everything-- fingerprint, Bluetooth. There is a key. There is a touchpad. It is voice enabled. So I can open it in all those ways. That might be an option for someone. Another option could be the device on the left. If your door has a traditional doorknob on it, this device simply snaps over your doorknob. And the device that snaps over the knob is voice enabled with Alexa.
So I tell the Alexa I want to open the door. And that piece that I've snapped over the doorknob actually spins and opens the door. So that one works, too-- a little less complex, a little maybe easier for people to use. And it doesn't require replacing everything. I simply snap that over the knob that is already there.
And then, finally, I think this is the last one. Security, as we go further-- video doorbells, Ring cameras, little security cameras. All of these can be fed through the Alexas with a screen, so the Alexa Show, and it becomes a bit of a hub for security. So someone can be independent and not open the door when they're not supposed to, and see who's there, and keep track of their surroundings. So really nice options-- also not expensive anymore.
It's funny. I think back to the late '90s or early 2000s when we would work to help someone design an environmental control system for their house. And you had to get a company in that specifically did this. And they had to plan it. And you were talking at the end maybe $10,000 to $15,000. Now I'm showing you devices that come right from Amazon.
I think the most expensive thing I just showed you was the actual fingerprint doorknob lock, which I think was about $300. The rest of these things, $100, $50, $40. You could put together a really quality system for somebody relatively inexpensively and build it over time so someone could be independent in their space.
All right, we'll continue on our discussion here on keyboards and mice because, although we live in a world of smartphones and tablets, there is still use for the keyboard. I have a keyboard right in front of me right now. There are still people that need those supports as access. And so what can we do to help them be more successful? Maybe it's low-tech, or even in some cases in this slide, no-tech solutions.
And for me, a no-tech solution is a large print letter sticker to put on my keyboard. That's no-tech. It doesn't require anything. Sure, I have to buy something, but there's no technology there. You can get those in different colors, so different contrasts, and just simply stick them on the keyboard. Maybe that's the solution someone needs. Maybe they need a keyguard like we talked about with the tablets before. So they need something that goes over their keyboard to help them isolate into the key they want to press.
Maybe they need a few more modifications, like the lady at the bottom. She's using a mouth pointer to type and to use her trackball for her mouse. Maybe that's a solution. There's simpler, low-tech solutions. The nice thing about the pointer at the bottom and even the keyguard is they are also 3D-printed things. If we have somebody that has a 3D printer, that's another opportunity to put it into action, creating assistive technology devices for us. So I'm going to keep plugging that into your brain-- talk to somebody that has a 3D printer.
Maybe we need an alternate keyboard of any kind. It's another area where, at one point, adapted keyboards were very expensive. But now a lot of gaming keyboards are very similar to adapted keyboards we might use. The keyboard on the far left there is a one-handed gaming keyboard. It's not a one-handed keyboard for people with disabilities. It's a one-handed gaming keyboard right off Amazon. I think it was $30.
So these are not-- suddenly we're not talking about very expensive things anymore, which is great because they're more mainstream. And we have this ability to go to places like Amazon and potentially purchase these. So we have that one on the left, which is one-handed. The item at the very top is also a one-handed keyboard. It's a different key layout, but the idea is I could place my hand in the center. And then because the keys fan out in front of my fingers, I would be able to access all the keys with just one hand.
Of course, with that, because it's an alternate layout, I need to learn that layout. So there's that. That has to fit into our plan here with someone if they're going to use this. The keyboard at the bottom center is called the BAT keyboard, like the animal, a bat. And that is a chording keyboard, similar to a court stenographer's machine. Letters are made by combinations of keystrokes.
So that picture there, they make a right-handed and a left-handed of that. That one is the right-handed keyboard. So I place my hand on the pad in the center. My thumb accesses the three keys at the bottom-- the blue, the black and the red-- and then my fingers are all placed on the four keys along the top. Every letter I want to create is a combination of keys.
So that is, again, another instance where I have to learn something new. And I've had people just call these strange-- they're weird, and I don't know if I want to use these. And I point out to people that if you're not familiar with this traditional keyboard that we use, that layout is just as strange as one of these. The difference being that if someone knows the traditional keyboard already, they may have to unlearn that keyboard in order to learn this new one. So that might be different.
But besides that, it's just one of those things that will get better and faster over time. As the person uses it, they will become more familiar with the keys. And they will slowly build up their speed with perhaps a more comfortable position for them to type or, in some cases, a safer position for them to type in, especially somebody who has repetitive stress injuries. We're constantly looking for ways. How do we get that person access to everything they need without causing any harm? So we see that with a lot of keyboard alternatives.
Maybe those keyboard alternatives are app based. So the two on the left are grouped together. They are the same, where you see groupings of letters. And it makes it easier for someone to navigate around. Color is thrown in there to help someone visually distinguish the letters within the colored groups, to make it easier for them to break that keyboard up. Or the keyboard on the right is a virtual keyboard, where the keyboard gets projected on any flat surface in front of you.
And so I could make anything my keyboard. I could make my desk my keyboard or my table. I could sit in my car and just put a book on my lap, and the keyboard would project there. And so I could just tap on the display that's been placed on the flat surface in front of me for that keyboard. That's a great example. I don't know the name of that one offhand, but that one is linked in the picture. That's a keyboard that at one point was hundreds of dollars. And now that's something I can buy on Amazon, and it was $49.
So I think there's really nice opportunities to find some of these options. And this feels like a good time to remind you what Kirk reminded you at the beginning. Think about the loan program. Don't forget about borrowing equipment, trying things out. These might look great when you see them. In the two minutes I show them on a slide right now, they might look like the answer to everything. And then you might get it home or get it to work and start using it and realize, oh no, I don't like this at all. But you might need to explore it in order to know that. So take advantage of a lending program. That's what they're there for-- to make sure that people make the right decisions for themselves with tech.
All right, we'll keep moving on. I'm keeping an eye on the time. We're OK, I think. We'll find out in a second. We have mouse control. So thinking about, is that person not going to be able to use a traditional mouse or a trackpad that might show up on a laptop? Are they unable to do that? So can we look at different kinds? Can we look at trackballs like the one in the center there, which is a handheld trackball where I use my thumb to move the cursor on the screen?
Can I use ergonomic mice that are more vertical? If you look at the mouse on the top right corner, that's what's called a vertical mouse. The idea being that, traditionally, if you want to put your hand in a neutral position-- a neutral position is the handshake position, your hand out in front of you-- that is the position your hand wants to be in. Now, how do we use a mouse or a keyboard? You take that same hand and you twist it.
Right away, without us even doing anything, you feel the twist in your wrist. So over time, that's what creates the potential for injury. So if we start thinking about how we can position the equipment so someone's hand is left in that neutral position, that's what a vertical mouse does. It keeps my hand in what they call the handshake position so that I can continue my task without that strain on my wrist. So thinking of different options, so many options out there for mice, so many for mouse alternatives.
Maybe we go one step above and we look at head control. So with this, there is some kind of camera on the top of whatever screen you are looking at for your item, whether it's your laptop, your desktop, a tablet, whatever it might be, the screen you're looking at. There's a camera at the top. That camera is tracking your movement in space. So right now, I'm moving left to right, just in the camera that you're watching me from.
But if it was connected to a head control for the cursor, my mouse on my screen would be moving and following me. And as I moved around, it would move around the screen. And then perhaps I'd set it up to where if I stayed still for a certain period of time, it would automatically click for me. It's called the dwell feature. So as long as you're staying still, it clicks. Those might be an option. So we're looking at more of this.
We're getting into the alternate access tools now that require training. They are much harder to use than you think they're going to be. Similar to speech to text, using a head control is surprisingly tiring because there's a lot of movement to get that cursor to go where you want it to. And keeping that in mind, these tend to be more expensive, as higher-tech solutions are. These I would definitely look to borrow, so I could try this out before I actually went and purchased one.
If you're using a Chromebook, Chromebook has something built into its accessibility features now called Face Control, which is head control. So it's using the camera built into your Chromebook to watch you as you move in space, and it moves your cursor. Where the face part comes in is that you can make gestures with your face. Perhaps I blink and it clicks on the screen somewhere. Maybe I open my mouth and it activates the dictation feature to allow me to talk. So I can use different facial gestures to do functions on the Chromebook. So that is in your accessibility settings, called Face Control.
And then eye gaze. Eye gaze, as we move up the chart here, this is now requiring less movement from me as a user because I don't have to move my head anymore. I'm simply moving my eyes to locations on the screen in front of me, and I'm being tracked by the camera that's pointed directly at me. The two items on the right are third-party devices you purchase. They are kind of expensive-- a couple thousand dollars potentially.
What we're now seeing, which is exciting-- picture on the left-- is the latest version of iOS that works on your iPhone. And they have a beta version of eye tracking now. So you can use the camera on your iPhone to watch your eyes, and it will move a cursor on the screen of your phone. Then when you stay still, a box will pop up and give you some options so that you can select things.
It's pretty good. It's a good start. I think as we see the next versions of the iOS software in our phones and iPads, it will get even better. It makes for a really nice option for people. Now, I might not need to get something that costs $3,000. Perhaps I can use something that's built into my iPad-- perhaps. I'm putting a big perhaps there. Maybe I would try it. I think the good part about it being one of the built-in features is that you'll be able to try it first. See if it works, get familiar with the way that technology works, and then you might find this was good, but it didn't do everything I needed. So maybe I need to go purchase something.
Or maybe I go to the lending center and explore another tool. But just the idea that this is now built into our phones, this is a really nice development for us as assistive technology enthusiasts, whatever we might be, why we're here today. This is a good sign because this means that technology is continuing to move in the right direction to provide us options as someone goes forward. And those options are being built into the tools we carry around with us all day, which is great. I love that.
All right, let's talk switches. As we think about access, and we look at alternate access where the individual may have limited mobility or limited ability to consistently select choices, then we look for switch access. So we look at external switches that we can position near the person and then use an interface to connect into the technology that we want to control, whether that's an AAC device, a computer, a tablet.
In the case of the young child playing with the toys on the top, that's connected into a device called a PowerLink. And then the PowerLink has a hairdryer plugged into it. And when the student hits the switch, the hairdryer blows the ball into the bowling pins and we, let's assume, get a strike. So that switch is the replacement for physically touching the hairdryer and turning it on. So as we think about switch considerations, lots of choices.
Another great example of borrowing things from a lending center. There's so many. What do they do? What are some of the considerations we have to think about? We have to think about where we're going to be potentially putting them. So where's the access site? What is the access method? How is someone going to use it to activate it?
I'm thinking of a person I work with now who's an adult with cerebral palsy, and he is incredibly hard on his switches, and he breaks them. And so as we start thinking about switches, the last time we got switches for him, we actually went to a manufacturing catalog. And we found switches that they use in different manufacturing sites for machinery. And it's a very heavy, metal switch, but it stands up to the hitting that he gives it. He uses his foot, so he's stomping on a switch all day. And so we looked at that as something that's very durable for him.
So think about the type of switch you're going to use, where it's going to be mounted, how you're going to mount it. The person is going to need it positioned alongside of their head, perhaps. What does that look like? How do you get it there? How do you make sure that it stays in that position? How do you make sure that the switch you're using is able to interface into the other devices this person needs? If this person uses an iPad and they're a single-switch user, you don't plug a switch into an iPad.
There has to be some in-between, maybe some kind of interface that happens in between. And the interface plugs into the iPad, and the switch plugs into the interface. Maybe you use a wireless switch that wirelessly connects. These are part of the questions and the considerations we have to think about. What are you trying to connect? How are you going to connect it? Where are you going to put it? Switches have probably the most considerations for us to really think about.
I love this picture on the right because it reminds us of all the places we could potentially put a switch. So consider that it's not just about someone's hand. It could be any spot on them. What we're looking for is consistent movement. As we do that, we're thinking about how much force someone's going to use. The gentleman I just spoke about before with his foot, he uses bunches of force. So he breaks switches all the time.
The amount of travel is how far the switch has to be pushed in before it connects, before it clicks, before it makes the connection to do something. Size and color and texture-- these are just all the things we have to think about. Does it give feedback? If you use a keyboard that is more flat and doesn't have physical keys that push down, don't you sometimes miss, like, oh, did I really push the key or not? Sometimes you need that feedback. Maybe it's physical of pushing it down. Maybe it's a visual feedback of a light, or auditory, a click, or a beep, something like that. So what are we doing to make sure the person knows the switch has been activated?
This is a great resource. The link is behind the picture here. It's a website that gives you some strategies of how to help people develop their switch skills. Switches tend to be a slower access method at first. But I've seen people be very successful with switches and be very quick at their use. So don't always assume that a switch user is going to be a slow user of technology. They won't. They will continue to get better, and those skills will continue to develop as you go. So there's a great resource for that.
All right, a couple of minutes. I got to tell you, this is exactly where I was on the first webinar, too, which is kind of weird. But here we are. Access-- other. Maybe video game access-- is the person looking to control some video games? We live in this great time where there are commercially available adapted video game controllers. No longer do you need a guy like me who's willing to take apart a game and wire it up and solder things into it. You don't need that anymore, which is great. You can purchase these items, and they are controllers that allow you to customize them for whatever somebody needs.
The picture in the center is the Xbox Adaptive Controller and all the pieces that can plug into it. The item at the top is the new PlayStation adapted controller, which is customizable. And all those little white pieces that are around that disk can be removed and plugged back in, so you can customize the way you use it. So I would definitely check those out if you want. If you have a gamer in your vicinity who wants to use an adapted controller, definitely check that out.
Remember these access methods we've been talking about-- any of these. Someone might use them for wheelchairs-- to drive a wheelchair. Great. Just remember that whatever it is that we're talking about as far as access still needs to be able to plug into a mobility device. So there might be a need to get a mobility device interface that these things plug into. The other thing I would remind you is, if you use a joystick on a wheelchair to control the wheelchair and something else, many times people will control a wheelchair and an augmentative communication device. Great.
Have a backup plan for when the wheelchair breaks. I've seen, unfortunately, times where someone's wheelchair breaks, and now they don't have a wheelchair and they don't have a way to speak, because they only had the one joystick that controlled both their wheelchair and their other electronics. So if you use your wheelchair joystick for your AAC device, have a separate joystick that plugs directly into the AAC device so that person is not out of both of those pieces of technology if one breaks. And I shouldn't even say if. I say when one breaks.
Finally, I show you one that is very new, very interesting. This is called the MouthPad by a group called Augmental. And what you are seeing is a molded retainer that fits into the top of your mouth. And you use your tongue to control whatever it is you're using it for-- a mouse on a screen, a wheelchair, whatever it might be. Your tongue is the access method through this little embedded retainer piece. Really cool. I just saw this at a conference last year.
There were items like this before-- Kirk and I were talking about it in the first session-- that were around for a little while and then went away. This is just back. And it's kind of interesting, and it's really very nice. The electronics are very nice on this. I won't show the video now, but if you get the slides, go in and watch the video. This is Keely. She's a student who uses this little joystick to drive her wheelchair but also uses the MouthPad to control her electronics, which is really cool to check out. So that gives you some homework to watch that video and check out her using the MouthPad.
And with that, I will put the slide up with the link. I will check the chat, but I didn't see anything come through. Questions or comments? I'll put the slides in there. There's the link to the slides one more time. If you want to reach out, there's the way to get me. Feel free. I'd be happy to chat with you about anything. You can tell I love talking, so sure, I'll talk with you. Why not? This will be great if you have questions and comments, for sure.
KIRK BEHNKE: Thanks so much, Mike, for that great presentation. We really do appreciate it. And if anybody has any other questions or comments, we're going to give you probably, like, 30 more seconds. Otherwise, we're going to end for the day. So thanks, every--