Problem 34 – Communicating with the Deaf

Problem

Unless you know sign language, it can be very difficult to communicate with a deaf person, especially face to face while carrying out everyday tasks.

Possible Solution

As a deaf person will usually already know sign language, what is required is something that can translate sign language for the other person. This could be done by putting motion sensors on the deaf person's hands to detect their gestures. This information can then be relayed to the listener's phone via Bluetooth and read out using the phone's artificial intelligence system. If the hearing person listens through earphones, the experience becomes even more seamless.
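The gesture-to-speech pipeline described above could be sketched roughly as follows. This is a minimal illustration, not a real recogniser: the gesture codes, the phrase table, and the relay function are all invented for the example, and a real system would classify raw sensor data and use the phone's actual text-to-speech engine.

```python
# Hypothetical sketch: map recognised motion-sensor gesture codes to
# phrases, then hand the resulting text to the listener's phone.
# Gesture codes and phrases below are invented for illustration.

GESTURE_PHRASES = {
    "g_hello": "hello",
    "g_thanks": "thank you",
    "g_where_exit": "where is the exit",
}

def translate_gestures(gesture_codes):
    """Turn a sequence of recognised gesture codes into a sentence."""
    words = [GESTURE_PHRASES.get(code, "<unknown>") for code in gesture_codes]
    return " ".join(words)

def relay_to_phone(sentence):
    # In a real system this would travel over Bluetooth and be spoken
    # by the phone's text-to-speech engine; here we just print it.
    print(f"Phone says: {sentence}")

relay_to_phone(translate_gestures(["g_hello", "g_where_exit"]))
```

Unrecognised gestures fall back to a placeholder rather than failing, which is roughly how a real translator would need to degrade gracefully.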

Similarly, when the other person speaks, their phone's microphone can pick up the words and relay them as text to a screen on the deaf person's motion-sensing wearable.

Of course, there are potential problems with this solution. One disadvantage for the deaf person is that they have to look at their device while the other person is speaking. The translation might also be inaccurate, especially in noisy environments, and there is the potential for a time delay when sending information between devices. However, as artificial intelligence gets smarter, there should be fewer inaccuracies.

I think another solution could become more practical in the future, which involves augmented reality. This could allow a deaf person to have a conversation with someone else just as they normally would, except even people who don't understand sign language will be able to fully understand them. This will be thanks to the various sensors on AR devices, and their ability to display information in real-world environments.

As always, if you think you have a solution, or if you have a problem you would like me to try to solve, then please feel free to leave a comment.

Problem 28 – No Time to Plan

Problem

Sometimes, things happen quite spontaneously, or there might be things you forgot to consider. For example, you could suddenly decide to visit a friend one day and then realise you haven't got anything to eat. You could use the personal assistant on your phone to find places to eat, but it will only show you the places closest to you at that time, and they won't necessarily be places you like. A place you do like might be too far away, so a compromise has to be made.

Possible Solution

I think that artificial personal assistants can get a lot smarter, and that they are only in their infancy at the moment. One way they could improve is by gaining the ability to plan your entire day for you.

This could start by simply asking the user what their plans for the day are after they wake up, perhaps over breakfast. Over time it can learn patterns in the user's routine, for example working 9 to 5 on weekdays or doing the grocery shopping every Saturday.
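The pattern-learning step could, in its simplest form, just count how often each activity occurs on each weekday and take the most frequent one as the routine. This is a toy sketch with invented log entries; a real assistant would use far richer signals.

```python
from collections import Counter

# Hypothetical sketch: learn a weekly routine from a log of
# (weekday, activity) pairs by picking each day's most frequent activity.
# The log entries are invented for illustration.

def learn_routine(activity_log):
    counts = Counter(activity_log)
    routine = {}
    for (day, activity), n in counts.items():
        best = routine.get(day)
        if best is None or n > counts[(day, best)]:
            routine[day] = activity
    return routine

log = [
    ("Mon", "work 9-5"), ("Tue", "work 9-5"), ("Mon", "work 9-5"),
    ("Sat", "grocery shopping"), ("Sat", "grocery shopping"), ("Sat", "cinema"),
]
routine = learn_routine(log)
print(routine["Sat"])  # the most frequent Saturday activity
```

A frequency count like this is the bare minimum; it captures "grocery shopping every Saturday" but nothing about timing or context.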

It can then tell the user things like traffic information, the weather, and even possible places to eat or things to cook. In the evenings it could recommend films or TV series to watch. If the user does something spontaneous, they can either tell the assistant, or the assistant can detect that they are heading to a specific location and suggest places to eat along the way, or things to do when they get there.

Just telling the assistant that you plan to be at a specific location by a certain time would allow it to calculate exactly when to leave, accounting for other activities such as eating. It can then help you build a schedule that fits comfortably around what you want to do, while still letting you get done what you need to do.

Of course, there is the issue of privacy if the device is constantly listening and sending data to a server for analysis. Some people might also dislike the intrusiveness, or the dependence on technology we would develop if it decided everything we do.

As always, if you think you have a solution, or if you have a problem you would like me to try to solve, then please feel free to leave a comment.

Problem 21 – What to Wear?

Problem

Sometimes we just can't decide what to wear, and it can take a long time to lay our clothes out in front of us just to pick one outfit and put the rest back.

Possible Solution

If artificial intelligence could help us pick sets of clothes that match, and even learn our style, it could take away the need for us to decide what to wear. To do this, it will have to know what clothes we currently own. It could do so by taking a photo of each item we put into our wardrobe or drawers. It will then need an algorithm to determine which colours and styles go together.
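One very crude version of the colour-matching algorithm is to compare garment colours by their distance around the colour wheel. This is only an illustration: the threshold and the "similar or complementary hues match" rule are invented here, and a real system would learn matching from the user's actual style.

```python
import colorsys

# Hypothetical sketch: judge whether two garment colours "match" by hue
# distance on the colour wheel. The rule and tolerance are invented.

def hue(rgb):
    r, g, b = (c / 255 for c in rgb)
    return colorsys.rgb_to_hsv(r, g, b)[0]  # hue in [0, 1)

def colours_match(rgb_a, rgb_b, tolerance=0.12):
    d = abs(hue(rgb_a) - hue(rgb_b))
    d = min(d, 1 - d)  # hue wraps around the wheel
    # Count similar hues or near-complementary hues as a match.
    return d < tolerance or abs(d - 0.5) < tolerance

navy = (20, 30, 90)
sky_blue = (120, 160, 230)
green = (30, 200, 40)
print(colours_match(navy, sky_blue))  # similar hues
print(colours_match(navy, green))     # neither similar nor complementary
```

Real colour harmony is far subtler than hue distance (saturation, lightness, and context all matter), which is exactly why a learned model would be preferable.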

Once it has worked out how to match the clothes, it will need an interface to display the results when we want them. This could be done on a phone or tablet, but then we cannot see how an outfit looks on us. Another possibility is a mirror with a camera that tracks the user and overlays the clothes on the user's body in the mirror's display. The user could then use gesture controls to switch between outfits. The remaining problem is that the user still has to find the clothes after picking a set.

The user can also specify the occasion the outfit is for, and the AI can adjust its recommendations accordingly. It also means outfits can be picked ahead of time, with the user reminded when the time comes.
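Occasion-based filtering on top of the matched outfits could be as simple as tagging each outfit and filtering by tag. The outfits and tags here are invented examples:

```python
# Hypothetical sketch: filter pre-matched outfits by an occasion tag.
# Outfit names and occasion tags are invented for illustration.

OUTFITS = [
    {"name": "suit + oxfords", "occasions": {"formal", "work"}},
    {"name": "jeans + t-shirt", "occasions": {"casual"}},
    {"name": "dress + heels", "occasions": {"formal", "party"}},
]

def recommend(occasion):
    return [o["name"] for o in OUTFITS if occasion in o["occasions"]]

print(recommend("formal"))  # ['suit + oxfords', 'dress + heels']
```

Picking an outfit ahead of time then just means storing the chosen recommendation against a calendar event and surfacing it as a reminder.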

Clothes shopping could also be built into this if retailers allowed their clothes to be downloaded and tried on by users at home. The system could then work out the user's size and even order the clothes straight away.

As always, if you think you have a solution, or if you have a problem you would like me to try to solve, then please feel free to leave a comment.