Every two seconds, someone has a stroke. Stroke is the second leading cause of death worldwide and the leading cause of disability. Nearly 70% of those affected do not recognise the symptoms in time to get help within the lifesaving six-hour window. What they do have is a smartphone with a built-in face-recognition camera, a phone they use an average of 250 times a day. So why not use the latest face-recognition technology in the Apple iPhone to identify the early warning signs of a stroke? Every time the phone unlocks, the camera scans the user's facial features, reading up to 30,000 infrared dots projected onto a depth map of the face. Changes or abnormalities in the movement of over 50 different facial muscles can be recognised in real time. Major symptoms such as numbness or drooping on one side of the face immediately trigger a stroke warning test. If the user cannot pass the test and further symptoms such as impaired speech appear, the phone connects the user directly to a paramedic, who can then help and use the phone's GPS data to reach the patient as quickly as possible. With our idea "Apple Saving Smiles", we use face recognition beyond phone security and entertainment for the most valuable thing in our lives: our health. This technology could help thousands of people during a stroke by doing what they already do every day: using their phone!
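To illustrate how detecting one-sided facial drooping from tracked landmarks might work, here is a minimal sketch. The landmark coordinates, threshold, and function names are illustrative assumptions, not Apple's actual TrueDepth or ARKit pipeline:

```python
# Minimal sketch: flag one-sided facial droop by comparing mirrored
# left/right landmark pairs against the user's own symmetric baseline.
# Landmark layout, threshold, and names are hypothetical assumptions.

def droop_score(left_pts, right_pts):
    """Mean vertical offset between mirrored left/right landmark pairs.

    left_pts / right_pts: lists of (x, y) positions for corresponding
    landmarks on each side of the face (e.g. mouth corners, eyelids),
    in normalised image coordinates. A symmetric face scores near 0.
    """
    assert len(left_pts) == len(right_pts)
    diffs = [abs(l[1] - r[1]) for l, r in zip(left_pts, right_pts)]
    return sum(diffs) / len(diffs)

def should_trigger_warning(baseline, current, threshold=0.05):
    """Trigger the warning test when asymmetry grows well past baseline."""
    return (current - baseline) > threshold

# Example: two landmark pairs (mouth corner, eyelid) per side.
baseline = droop_score([(0.30, 0.60), (0.35, 0.45)],
                       [(0.70, 0.60), (0.65, 0.45)])  # symmetric face
drooped  = droop_score([(0.30, 0.68), (0.35, 0.50)],
                       [(0.70, 0.60), (0.65, 0.45)])  # left side sags
print(should_trigger_warning(baseline, drooped))
```

In practice the baseline would be learned from the user's many daily unlocks, so the system compares each face against that person's own normal rather than a population average.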
Miami Ad School – Europe
Note: The project descriptions are the students' original texts and have been left as submitted.