Transcript
CAITLIN MCKENNEY: Hey, welcome. Thank you all for joining us today for the March tech accelerator webinar. Our presentation topic this month is smart home safety and supporting digital choice. This session is being recorded, and the video will include picture-in-picture ASL interpretation as well as a full transcript. You'll be able to access that on our website at Disabilities.Temple.edu. I will put that link in the chat for everyone as well. And that website is also where you can register for future webinars.
TechOWL has a wonderful series of 18 topics that began in July of 2024 and will be running through December of 2025. They are held on the third Thursday of the month and offered at two times, 12:00 and 4:00. This series is part of the Tech Accelerator program, which is an initiative funded by the Office of Developmental Programs and the Office of Long-Term Living here in Pennsylvania. This project is made possible through funding from the American Rescue Plan Act.
The Pennsylvania Tech Accelerator project includes five major focus areas-- training and resources to build capacity for stakeholders, readiness evaluation to develop tools for successful adoption of remote supports and assistive tech, statewide assessment to plan and benchmark a technology first systems change, a provider survey to assess technology and awareness among providers, and then two technology summits, which we just held as live events in Philadelphia and Pittsburgh within the past two weeks. You can learn more about all of those projects on our website.
Without further ado, I would like to introduce our guest speaker for today. So Tiffany Wilson is the owner of Wilson Inclusive Solutions, and Tiffany has over 15 years of experience in disability services and has published work on the unintended harm of smart home devices as assistive technology. So thank you, Tiffany.
TIFFANY WILSON: Thank you. I just realized I left my slides from this morning, or this afternoon's presentation if you're in Pennsylvania. So now I have the slides up and ready to go. And I want to thank Caitlin. I want to thank the Institute on Disabilities at Temple University and also the AT Act program, TechOWL, that is also housed within the Institute on Disabilities, for having me here today to discuss the importance of digital autonomy. And the presentation is supporting digital choice-- smart home safety.
Through implementing large scale assistive technology initiatives, including Alaska's Federal AT program, I've witnessed how smart home technology is changing lives. However, I've also seen complex challenges that go along with internet-based and off-the-shelf technology used as assistive technology. These situations have often caused unintended harm, and we're going to talk about that today. These experiences and observations are outlined in various scenarios, and if we don't cover a specific scenario that you're concerned about, please wait for the question and answer session and hopefully, we can get your questions answered.
I believe that the successful implementation of smart home and internet-based assistive technology requires clear professional boundaries, ones that protect privacy while empowering genuine independence. Today, we will explore how to apply person-centered principles to digital environments, establish professional boundaries in digital environments, and connect end users with resources and supports for their digital independence. Throughout this presentation, I will refer to people receiving your support as end users.
This is to emphasize their role as active technology users with agency and choice. I'll also be using the term direct service provider or direct support professional in a general manner, meaning any provider of disability support services who is not an assistive technology assessor or within an assistive technology equipment distribution program. So this is more for our general audience, to talk about smart home safety for your participants, your clients, whatever term you use in your program. You can access the slides by going to ChangeAccess.com/Slides or by using the QR code.
Let's get into considerations in our communication framework. I gotta say the disclaimer. If I mention the name of a product or company, all of their trademarks and names are property of their owners, and mentioning them is not a specific product endorsement. Attendees, implement any strategies at your own risk and discretion, and also be aware that privacy regulations and policies vary by state, agency, and setting.
Today, our communication access framework is that all communication styles are important and welcome. We respect diverse perspectives and contributions. What may work in one environment may not work in another environment. Questions will be answered at the end of the presentation. Please direct all your questions to the Q&A section in Zoom. We'll try to grab questions from chat, but no promises as things move quickly. We will also have alternative ways for asking questions if text-based communication is not accessible to you, or if you prefer to use American Sign Language.
So let's talk about person-centered digital autonomy. Person-centered digital autonomy is a person-centered approach to digital choice. Just like we apply these principles in physical environments, care planning, and promote personal choice in person, we should also extend these same principles into the digital environment as we modernize with new technology and provide access with digital technologies.
To create this extension into the digital space, we first need to understand what self-determination means in a digital world. Simply put, digital self-determination means people should be able to use and control their own technology according to their own choices. This concept of digital self-determination was first coined by Dr. Jorn Lengsfeld. It's about ensuring individuals have the power to make informed decisions about the technology in their lives rather than having others control it for them.
For this reason, I created the digital choice framework. I built it on established and researched person-centered principles, and it focuses on one fundamental principle-- end users must drive their own technology decisions. True digital support means respecting end users as technology decision makers and never using technology for control or personal convenience at the expense of autonomy. This is important as we move into talking about boundaries and implementing smart home tools safely.
First, let's just talk about digital spaces and the types of boundaries we should have. Digital boundaries can be less visible than physical ones, but they're equally important. Just as there are clear lines we don't cross in someone's home, we need similar frameworks for digital environments. Professional boundaries in digital spaces protect both end users and service providers. These essential boundaries include account ownership.
End users should control their own accounts and login information. There should be professional separation. We should never use our personal devices, accounts, or phone numbers. Our phone numbers should not be used for multi-factor authentication for our clients. We should also have privacy boundaries. We should protect sensitive information and respect digital spaces. There should be financial separation, with clear boundaries in regard to purchases, subscriptions, and all the costs that come along with assistive technology acquisition.
We should understand there are limits to our roles in providing technical support or technical advice. We should connect end users to qualified or appointed tech specialists rather than trying to figure it out as a care provider. Know this-- even well-intentioned tech support can create significant risks, from dependency to privacy violations to financial entanglement and regulatory problems. I've seen it. Far from limiting support for end users, clear professional boundaries actually enhance support by preserving genuine autonomy and protecting everyone involved.
When we talk about privacy, we should also talk about HIPAA, the US Health care law, the Health Insurance Portability and Accountability Act. And a lot of us are bound by HIPAA, not all of us, but we should really understand and think about privacy boundaries when we're implementing smart home technology or assisting others with theirs. And it becomes very critical when we cause unintended harm with end users' private information. This technology often captures, stores, and transmits information that could contain protected health information.
It creates challenges and considerations that we didn't have to make in the past. Now that the technology is modernizing, we should consider how these devices send and receive data. Did you know that HIPAA identifies 18 data elements as Protected Health Information, or PHI? These data points include a name, a location, et cetera. It covers more than just the diagnosis. Just because you don't mention a diagnosis doesn't mean you can't inadvertently commit a privacy or HIPAA violation.
For privacy boundaries in a digital setting, I believe that PHI belongs to the end user, and they decide how it is used and transmitted in their smart home devices. Support staff should not have independent access to digital accounts for end users. And end users, or their legal guardians, not support staff, should control account setup and management.
One example was a well-meaning direct service provider who bought iPads for the participants in their program. They thought they were being helpful setting up all the Apple ID accounts for the iPads on behalf of their participants. But without consent, and without talking to their participants about account management or anything like that, they took protected health information from their agency intake form and created the Apple ID accounts.
So they took private information from a disability service intake registration form, protected by HIPAA, and created Apple ID accounts on their behalf. For them, it was a HIPAA violation and not good practice. As support environments become increasingly digital, we must think differently about privacy protection. Maintaining proper boundaries respects and protects everyone involved. When in doubt, we should always err on the side of protecting end user privacy and refer to our agency policies. It's better to be cautious than cause unintended harm.
Now that we've examined some specific boundaries and privacy considerations, let's summarize your role supporting digital choice. Our role as direct service providers is to champion digital autonomy, not to control technology decisions, and to empower others to learn to control their own technology and have self-determination. You are, or hopefully will be by the end of this course, digital choice advocates. And what that means is that you promote technological self-determination and privacy practices in assistive technology implementation.
Remember, our goal is not to prevent or limit technology access or adoption. It is to support it while maintaining appropriate professional boundaries. Let's talk about specific scenarios to illustrate these principles in action and give you more specific ideas. In the handouts, you'll find a tool called the digital choice decision pathway. For the sake of time, I'll summarize it for you.
The framework assists you in evaluating whether technology support requests from end users potentially cross professional boundaries. It guides you through steps with considerations about account creation, device sharing, financial handling, and information access. It's not comprehensive. It's a simple tool to help you find the right way to say "yes" while respecting boundaries and supporting digital autonomy.
Let's talk about how smart home devices are changing lives and providing access. Everyday tech is becoming more popular and affordable, making it easier for people to use off-the-shelf devices, like smart home speakers, as assistive technology instead of expensive specialized tools or specialized assistive technology. There are also cell phones, smart assistants, and tablets, and we're seeing more and more of these devices being distributed in assistive technology programs. Also, more and more people are able to afford them individually.
This is what's great about smart home technology. This slide shows a person controlling various smart home devices in their living room. Consider a person with physical mobility challenges and how they can control their home and environment with off-the-shelf devices instead of costly proprietary environmental controls. The example shows a robot vacuum, and the individual is controlling their lights, the temperature, and speakers.
And this slide shows more examples. Smart cameras can enable those with mobility challenges or limitations to see visitors. Tablets can do many things. They can control environments, provide vital communication apps. Smart assistants can support memory, time keeping, and even provide visual descriptions. Not only do these technologies offer convenient and affordable options, they can also create genuine autonomy and connection.
In the next section, we will explore common situations you might encounter and couple ideas on supporting end users who have smart home technology. These scenarios come from real life experience from various support environments. For each, we'll discuss the situation and potential responses. Again, if your situation is not discussed, feel free to put it in the Q&A or ask us at the end, and we'll see if we can get it covered.
Let's talk about account management. One of the most common boundary challenges involves managing accounts. For various reasons, end users often request or need assistance setting up accounts. And these days, there's an account for everything-- the device, the app, online services. And in my observation in Alaska, the most common reason people needed assistance was lack of digital literacy.
I've observed consistent challenges when support staff create accounts for end users. Ownership becomes unclear. Privacy violations happen. The end user becomes dependent on staff for access, and digital independence is compromised. This can be particularly concerning with accounts that require payment information, like an Amazon account that is required to use an Alexa device. Setting up these accounts for someone else creates financial and privacy boundary issues.
In one case, a well-meaning support professional created digital accounts for everyone they served, not the same one who used the protected health information. And they provided several different smart devices. When that staff member left the organization, several individuals lost access to their assistive technology because no one else knew their login information.
Let's talk about a scenario with account management. Maria received an Amazon Echo as a gift and asked, "can you just set it up for me? I don't understand technical things." What would you do? Think about your answer privately or drop it in the chat. I am locating-- there is my Zoom bar, so I can open my chat.
So A is go ahead and set it up using your Amazon account. B, create an account in her name. C, explain you can't set it up, but connect her with Amazon or other resources for digital literacy, or even possibly an AT program. D, tell her she shouldn't use technology she doesn't understand. What would you do? Feel free to throw it in the chat.
So I feel the best answer is C to connect Maria. And I'm seeing some C's coming in. Thank you for your responses in the chat. Connecting Maria with resources maintains boundaries while still providing support. A and B cross boundaries by controlling account creation, and D dismisses her autonomy and interest in technology.
Now let's talk about privacy boundaries. Smart home technologies, especially cameras and monitoring devices, they capture, store, and potentially share deeply personal information out to that device company server outside the home. Most smart devices store data in cloud services that are not HIPAA compliant.
End users can use these devices, of course, however they want. But as direct service providers, we need to be careful about using these smart home devices, encouraging their use, or setting up accounts because we may potentially violate HIPAA. These devices also capture voice recordings, activity patterns, even eye movements for VR headsets, and health data. Does anybody use the Apple Watch and the health data on it? I use it as assistive technology.
In the end, it's about end users maintaining control over what devices capture their information and how their data is used. They can choose to use it however they want. But it's not up to me to decide for them or use their protected health information. So remember, this data could sit on third party servers with limited to no privacy protection.
Let's talk about a scenario. "Robert's family wants to install a camera in his bedroom to quote, 'keep an eye on him' at night. They ask for your help to set it up. Robert hasn't expressed an opinion." What would you do? Feel free to drop it in the chat.
A, install the camera since family members requested it. They should know what's best, right? B, connect them with AT specialists or experts who can discuss different monitoring options or alternatives. C, install the camera, but position it to avoid capturing personal activities. And D, tell them it's not your problem. They need to figure it out themselves.
What would you do? Yes, we've got some B's. And yes, that is right, in my opinion. Connect the family with experts who can advise on appropriate monitoring options and talk about various tools. It may not even be a monitoring option that this person needs; there are other tools that could assist. This connects them with the experts and respects privacy while fulfilling your role as a resource connector.
And I love the comment in the chat about the family needing opportunity to learn about supports other than cameras. I wholeheartedly agree. They may not have been exposed to other options. A and C violate privacy boundaries by implementing surveillance without consent. And option D is dismissive.
And yes, this has happened, and even to the point of where an assisted living home considered using smart cameras to monitor their residents. Some had guardians, some did not. And of course, we educated them and discussed with them how, in this particular case, it was not appropriate, and how the assisted living home would be violating HIPAA with a smart camera because those recordings live on the cloud. There are more closed systems, and there are systems that may offer more strict privacy protections, but it's really important to connect them with the right resources.
Financial boundaries-- many smart devices require payment information, subscriptions, or ongoing purchases. Key points-- never share your payment information. Yes, it's happened. "Oh, I helped somebody download an app," or, "It's just $0.99, let me get that for you." You should maintain clear financial separation and be cautious about encouraging purchases that have long-term financial implications.
Are you encouraging someone to use their funds in a certain way? Even well-intentioned financial entanglements can lead to complicated situations that blur professional boundaries and potentially violate organizational policies. Let's consider the next scenario. James asks, "Can you just help me set up my account and figure out how to pay for apps on my iPad?" What would you do?
A, set up the account using his credit card information. B, connect him with appropriate resources or supports, like the Apple Store, for assistance creating the account or learning how to set one up. C, create the account, but tell him to add the payment later, so you don't touch his financial information. D, set it up using your card until he can add his own. Access is access, right? I'm seeing some B's.
Yes, yes, I believe that the best answer is B. Connecting James with resources maintains financial boundaries while providing support. It keeps payment information separate while ensuring James gets the assistance he needs. A and C cross boundaries by directly handling account and financial information, while D creates an inappropriate financial entanglement by using your own payment method.
Let's talk about tech support boundaries. As digital environments become more complex, there's often pressure to solve technical problems for individuals who are still learning about technology and increasing their digital literacy. However, we must recognize when these technical problems may be beyond our role or expertise.
The key principle-- know when to refer to specialists. Ensure information stays with the end user. Understand the limits of your technical role. Just because you understand something does not mean it is your role to assist technically.
In most situations, it can be best to connect folks with appropriate resources that increase their independence and their learning around managing their digital portfolios. This is often more empowering than implementing solutions yourself. Account ownership and digital portfolio management are fundamental aspects of digital autonomy and literacy.
Let's consider this scenario. Susan has forgotten her email password again. She asked, can you just create a new account for me and just keep the password? It'll help me out in case I forget it again. What would you do? A, create the account and store the passwords securely somewhere. B, create the account but have her store the password securely. C, recommend specific password solutions. D, connect her to her natural supports in her life or other resources.
Oh, the answers here. We've got D's coming in. Yes, I believe the best answer is D. Connecting Susan with digital literacy resources, AT specialists, other natural supports, or guardians maintains boundaries and respects her autonomy while still ensuring she gets proper assistance. A and B cross boundaries by creating and managing her account. C crosses a boundary by making specific technical recommendations that may not match her needs or abilities.
Also consider this-- many service providers and assistive technology assessors recommended the password management tool LastPass. Guess what? LastPass had a privacy breach in 2022. Could we potentially be held liable or accountable for recommending a tool that became insecure and experienced a privacy breach?
That is something that we should consider, and we should allow the other person to make their own informed choice about the appropriate tools they need. AT centers and other supports trained in that will be able to help the individual make that determination.
Which brings us to our next slide. You have a lot of excellent resources in Pennsylvania. It was really great to work with Caitlin on this, to put it together. In the handouts, you will find the full document.
And here, I'm just going to be brief, but the document has contact information and websites. Because I know it's difficult to watch someone struggle, and we do want to create digital autonomy and improve access for digital independence. And we can do that by connecting to resources.
So in southeast Pennsylvania, you do have your digital navigator programs. Their website isn't currently working, but you can connect with 211. There's Computer Reach in western Pennsylvania, online programs, and local libraries and community centers. And it's really important that individuals understand their technology and their accounts, and that we raise digital literacy. I would say for everyone, but I especially believe that if we're providing or supporting devices, digital literacy and autonomy are very important.
You also have your assistive technology resource centers through TechOWL across the state. TechOWL holds the federal assistive technology program funding that allows them to provide demonstrations, loans, and awareness about assistive technology for free statewide. So when in doubt, you could call your AT program, and they may be able to direct you to the right resource. They're also a great local support.
The technical support section does not have specific phone numbers or websites because they can vary by device. For instance, there's manufacturer-specific support for devices. If somebody needs assistance setting up an Amazon Alexa account, one option is having them call Amazon and have a customer service rep assist them with setting it up.
For instance, I attend assistive technology conferences, like CSUN in California. And all of these companies, like Amazon and Google, are there telling us about their accessible customer service options. Let's try them out. Let's help Amazon be better and more accessible. So let's work together to increase literacy on all sides. There are also authorized repair locations and specialized accessibility helplines.
Companies like Apple, Microsoft, Google, they have accessibility helplines. Some of them even have partnered with apps like Be My Eyes to also provide visual access or direct calling through Be My Eyes for support. The power of these resources isn't in handling technical issues yourself but in connecting end users with the right expertise to maintain their digital autonomy. This approach empowers independence while respecting professional boundaries.
In closing, please remember the goal isn't to avoid technology or not promote smart home technology or say that it's bad. The goal is to empower digital autonomy. By understanding these digital boundaries and having resources ready for referral, you become true digital choice advocates, and you champion digital autonomy.
Remember, unintended harm is still harm. Let's work together to promote and empower digital autonomy while reducing the potential harm to those we support. That brings me to the question slide. I'm going to describe the slide and the process, and then turn it over to Caitlin to manage the questions.
So the communication framework-- your communication and your questions are important to us. We offer two ways to ask questions, through the Zoom Q&A or you can ask it live, if you prefer an alternative to text-based communication, or if you would like to use American Sign Language. We'll rotate between these methods.
If you would like to ask a question live, please follow these steps. Use the Raise Hand tool and wait for approval. If you're unable to use the tool, please connect with Caitlin. Once you are approved, please turn on your camera to ensure lip reading access for others, if you're using your mouth to speak. Keep your question brief and allow time for processing. And after asking your question, please turn off your camera and microphone unless instructed otherwise.
Sometimes I might want to ask you more questions, so I'll ask you to stay on the line-- the Zoom, I guess it's not the line. You can find my contact information by using the QR code or going to the website, pxl.to/Tiffany. And if you would like a certificate of attendance for this session, please email Questions@ChangeAccess.com, and we'll be happy to send you one next week. Over to Caitlin.
CAITLIN MCKENNEY: Wonderful. Thank you, Tiffany. So I don't see any text-based questions coming in through the Q&A or the chat yet. Let's see, does anyone have a question that they would like to raise their hand to ask in person?
And we have a little bit of time here, so if y'all want to chat, I'm here. Tiffany, I have a question for you.
TIFFANY WILSON: OK.
CAITLIN MCKENNEY: We discussed a little bit earlier today the different policies that smart home speaker companies, like Amazon and Google, have regarding storage of data and figuring out when they're listening or not listening. And I know that some of those change relatively frequently. Amazon seems to be changing theirs right now. At some point in the next couple of weeks, sometime this month, they're going to change it so that you can no longer opt out of sending your requests to the cloud.
Is there a good way to keep on top of that stuff? How often do you recheck the privacy settings? Is it every time you install a device for someone? Or how do you keep track of those?
TIFFANY WILSON: That's a great question. And you're asking from the perspective of an assistive technology provider?
CAITLIN MCKENNEY: Right.
TIFFANY WILSON: OK. Great question. And so you're right. It sounds like you stay up on the news. I just read that article too. So with these devices, the companies are not necessarily bound by regulations in the way they implement their privacy practices.
And so with that, that means a lot of these devices are not secure. And in the past, the smart speaker companies we talked about, like Amazon, would say that they did not save the recordings and that the device would only be activated on the wake word. And we're learning in the news recently that these companies are changing their policies.
And because they're private companies that are selling us a product, they can decide their own privacy practices. And they are not medical devices. And so, to answer your question, Caitlin, what I normally do is try to follow any newsletters or notifications I can find from the company. There are some companies that, every time they have an update, will email out what the patch or update is.
You could check their current privacy policy and practices. I would also flip that and learn where these companies host their privacy practices, and whether they offer them in plain language. If they don't, can you take them and put them into plain language and understand them? And honestly, they could change it tomorrow.
They could tell us, or they could not tell us. And so instead of getting into the details of what parts are protected and what's not, it's best to just say that smart home devices are not secure and don't have robust privacy practices, and to encourage individuals to learn what that means. As an assistive technology provider, I have used plain language tools to explain privacy practices. But you also have to be careful interpreting these practices, because you might be interpreting them wrong.
So I know there's not a black and white answer, but I think it's more recognizing that smart home technology is not secure. Cybersecurity professionals will tell you that. And I think it's just a good rule of thumb that people know that. Then you explain that "not secure" means their information is not in a contained environment in their home. It goes out to the company and may live there for we don't know how long. And that is a risk that they take.
CAITLIN MCKENNEY: That's a fantastic answer. Thank you. Tiffany, I think seeing all of these different updates tempts me to get into the weeds with making sure that I know all the nitty gritty about it. And it's really much simpler to just make sure we all know that it's not perfect.
TIFFANY WILSON: Yes. And I did try to get in the weeds. And with each company changing, for varying reasons, there is a website I recently learned about-- I haven't used it, but I'll send it to you, Caitlin. And this company claims to take terms and conditions and put them in plain language.
It's a tool for people to use. I haven't used it and people would have to use it at their own risk. Although I do caution being careful because terms and conditions are potentially legal agreements. So if you're setting up an account on somebody's behalf, you may be potentially agreeing to something legally for them. Or if you decide to explain terms and conditions to someone, are you potentially explaining a legal document to them? And is that appropriate? So those are other privacy considerations for terms and conditions.
CAITLIN MCKENNEY: Does anyone else have a question or a scenario they wanted to share and get an opinion on? Well, you do have Tiffany's contact information here on the slide and in the link in the chat. You all have my contact information as well from the Zoom registration, and you're more than welcome to reach out to TechOWL if you're in Pennsylvania, or to your local AT Act program, if any questions arise for you later and you're looking to continue this conversation.
Thank you all so much for joining us today. And, Tiffany, I really appreciate your time here. Thank you so much for having this conversation with us and providing these resources.
TIFFANY WILSON: Thank you. And thank you Temple University and everyone for attending.
CAITLIN MCKENNEY: All right. Have a good evening.