Innovation happens in many ways: organically from the ground up, inspired by nature, out of think tanks or even from spontaneous conversations around the workplace. Innovation is often in aid of a business objective, and while that leads to some great leaps forward for society, other worthwhile needs can sometimes miss out.
Enter the Hack4Good hackathon.
This year, PwC Australia hosted and participated in Microsoft’s annual hackathon, which focused on using technology to improve people’s lives. In partnership with Social Ventures Australia (SVA), the event aimed to develop solutions for people with disabilities, including better ways to share and access information, to overcome the barriers that remote locations pose, and to communicate.
Key to any innovation is truly understanding the problem to be solved. In a hackathon environment, that means being able to quickly and efficiently get to grips with the issue – and the realities of those experiencing it.
In this instance, that understanding came from discussions with people with disabilities, organised by SVA. It meant the PwC team could address the chosen problem statement – ‘People with disabilities in remote locations have significant challenges connecting to people and services due to limited bandwidth’ – from the perspective of those with real insight.
From these discussions, it became clear that focusing on the problem as a technological limitation would not address the heart of the matter. Connection is an emotional need, not simply a structural one. Feeling connected is just as important as being connected, and for those with hearing or visual disabilities, a bad internet connection may not provide that.
Taking as a guiding principle that emotional communication is critical to feeling connected, the hackathon team developed their hypothesis: by collecting, interpreting and shrinking communication data, people with disabilities in remote locations with limited bandwidth could be empowered to feel connected.
Instead of tackling the problem from a business stance – thinking through efficiencies, cost and profit ramifications, all of which would take a great deal of prior knowledge and data analysis – the hackathon asked teams to adhere to the principle ‘solve for one, extend to many’.
In keeping with a design thinking approach, where the solution is designed from the customer up, the team created Jennie, a constructed user whose difficulties the technological ‘hack’ would address. Jennie, the team decided, would be a 55-year-old grandmother living in Far North Queensland.
With no car and limited bus services, three hours from the nearest big town, being able to connect with her kids and grandchildren was difficult for Jennie and contributed to a sense of social isolation. Hearing impaired, she struggled with the visual communication of video chatting with her grandkids – especially since her internet was unreliable and slow.
When the nuance of the problem is understood and the customer visualised, it’s much easier to get down to work creating a solution. In the Hack4Good event, the PwC team began doing this by putting themselves into Jennie’s shoes.
By combining user experience, data science and developer capabilities, they were able to envision a video chat ‘accessibility add-in’ built on Microsoft Cognitive Services AI tools. By using local computing to shrink what needs to be transmitted, the need for high-speed internet could be greatly reduced. For hearing impairment, video frame rates could feasibly be reduced from 30fps to 1fps so that sign language stays sharp on slow connections. For visual impairment, audio compression, voice-to-text and audio descriptions of visuals would add much-needed context.
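To make the local-computing idea concrete, here is a rough illustrative sketch – not the team’s actual code – of capturing webcam frames at around one frame per second, downscaling and compressing them before they ever touch the network. The frame rate, resolution and JPEG quality shown are assumptions chosen for illustration.

```python
# Illustrative sketch only: capture frames at ~1 fps and compress them locally
# so a clear (non-blurry) sign-language feed can travel over a slow connection.
# TARGET_FPS, the 320x240 resolution and JPEG_QUALITY are assumed values.
import time
import cv2

TARGET_FPS = 1       # assumed low frame rate for constrained bandwidth
JPEG_QUALITY = 60    # assumed compression level

def low_bandwidth_frames():
    capture = cv2.VideoCapture(0)   # default webcam
    try:
        while True:
            ok, frame = capture.read()
            if not ok:
                break
            # Downscale locally so far less data has to cross the network.
            small = cv2.resize(frame, (320, 240))
            encoded, jpeg = cv2.imencode(
                ".jpg", small, [cv2.IMWRITE_JPEG_QUALITY, JPEG_QUALITY])
            if encoded:
                yield jpeg.tobytes()   # hand the compressed frame to the sender
            time.sleep(1.0 / TARGET_FPS)
    finally:
        capture.release()
```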
With a slower video feed suited to low-bandwidth internet, Microsoft Cognitive Services could then layer on emotional and contextual enhancement: detecting additional information from facial features or vocal tones – for instance, whether the speaker was happy or sad – or describing the background environment, and showing that information to the listener alongside the video.
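A minimal sketch of how the emotion-detection step might look, calling the Cognitive Services Face API detect endpoint to return emotion scores for a captured frame. The endpoint and key are placeholders, and the emotion attribute reflects the 2017-era API, whose availability may have changed since.

```python
# Minimal sketch, assuming the 2017-era Face API with the "emotion" attribute.
# FACE_ENDPOINT and FACE_KEY are placeholders, not real credentials.
import requests

FACE_ENDPOINT = "https://YOUR_REGION.api.cognitive.microsoft.com/face/v1.0/detect"
FACE_KEY = "YOUR_SUBSCRIPTION_KEY"

def dominant_emotion(jpeg_bytes):
    """Return the strongest emotion label for the first detected face, or None."""
    response = requests.post(
        FACE_ENDPOINT,
        params={"returnFaceAttributes": "emotion"},
        headers={
            "Ocp-Apim-Subscription-Key": FACE_KEY,
            "Content-Type": "application/octet-stream",
        },
        data=jpeg_bytes,
    )
    response.raise_for_status()
    faces = response.json()
    if not faces:
        return None
    # e.g. {"happiness": 0.92, "sadness": 0.01, ...}
    emotions = faces[0]["faceAttributes"]["emotion"]
    return max(emotions, key=emotions.get)
```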
For Jennie, this means she can talk to her family with a slower but more reliable video feed, and with visual or auditory cues and text descriptions to help, have a greater sense of connection.
By the end of the hackathon, the PwC team were able to present a working proof of concept that captured faces at a low frame rate, analysed them to determine emotions, and presented those emotions as text over the video image as well as via sound.
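As a simple illustration of that presentation step – an assumption about the approach, not the hackathon code itself – the sketch below overlays the detected emotion as a caption on the decoded frame; a text-to-speech call could deliver the same label via sound.

```python
# Sketch of the presentation step: burn the detected emotion into the
# low-frame-rate video as a text caption. Layout and colours are arbitrary
# illustrative choices, not those of the hackathon proof of concept.
import cv2
import numpy as np

def caption_frame(jpeg_bytes, emotion_label):
    """Decode a compressed frame and overlay the emotion label as text."""
    frame = cv2.imdecode(np.frombuffer(jpeg_bytes, dtype=np.uint8),
                         cv2.IMREAD_COLOR)
    if emotion_label:
        cv2.putText(frame, "Speaker seems: " + emotion_label,
                    (10, 25), cv2.FONT_HERSHEY_SIMPLEX,
                    0.7, (255, 255, 255), 2)
    return frame
```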
Bringing the exercise full circle, after the concept was presented a very real person, Rocco, chatted with the team. Living with a visual impairment himself, he empathised with Jennie’s predicament, as he too found video chatting with his kids difficult. While he could hear them, he often struggled to understand what their faces were expressing, making connection difficult.
In just two days, the team had come up with a viable solution and a working AI-enabled prototype.
And that’s just the beginning. Future use of cognitive services could work directly inside video chat programs such as Microsoft’s Skype or Apple’s FaceTime via add-ins, recognise and interpret sign language, or use machine learning to do deeper analysis.
That’s good news for Jennie, great news for her real-life counterparts, and proof that super-fast innovation can make a huge impact.