8 weeks ago I had an allergy attack that turned into bronchitis. Doc said my cough would linger for several weeks (as has happened in years past). My cough is very scary to anyone who hears it (like a fog horn). Mom says I have had it for 60+ years, since I was a baby, and even longtime friends and family have always been aware of and ‘concerned’ by my cough my entire life.
Fast forward to 4 weeks ago and I was around a family member who returned from an out of the country trip and was asymptomatic but tested positive so she chose to wear an N95 mask during our baby handoff meeting.
One week later I was still coughing and was scheduled to spend the day with a longtime, very dear (but high anxiety) friend. I decided to take a test so that I could show her I was negative to put her mind at ease and we could enjoy our girls day out - it had been around a year since I last saw her. To my surprise I tested positive, and I was completely asymptomatic the entire time.
Somewhere around day 10 after testing I noticed my iPhone face recognition started having issues and would only work about 80% of the time. Each day it was getting worse, and this morning it would not work at all. I rebooted my phone since it had been a while and rebooting fixes 90%+ of all problems, but the face recognition still would not work. I had to set it up again from scratch, and now it works 100% of the time.
Has anyone else noticed anything like this? I don’t understand how my face can “change” enough to make the facial recognition stop working.
Facial recognition algorithms supposedly use immutable things, starting with eye distance, mouth center and sides, chin center and cheekbones, as far as can be done while disregarding fat changes, puffiness or bloating. Supposedly they will do alright despite a black eye, but a change in relative puffiness (either direction, increase or decrease) greater than a certain percent will throw them off, as will asymmetric bold makeup. Assuming these are not your case, and assuming that the standard algorithms are being employed, there's nothing on or in your face that would mess that up on its own.
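To make the landmark idea concrete, here is a toy sketch of distance-based matching. This is NOT how Apple's Face ID actually works (that system uses an infrared depth map and a neural network); the landmark names, tolerance value, and coordinates below are all illustrative assumptions, just to show why scale doesn't matter but puffiness shifting a landmark does.

```python
from math import dist

def signature(landmarks):
    """Normalize pairwise landmark distances by the interocular
    distance, so the signature is scale-invariant (how far you
    hold the camera doesn't matter)."""
    eye_span = dist(landmarks["left_eye"], landmarks["right_eye"])
    keys = sorted(landmarks)
    return [dist(landmarks[a], landmarks[b]) / eye_span
            for i, a in enumerate(keys) for b in keys[i + 1:]]

def same_face(lm_a, lm_b, tolerance=0.05):
    """Match if every normalized distance agrees within `tolerance`."""
    return all(abs(x - y) <= tolerance
               for x, y in zip(signature(lm_a), signature(lm_b)))

# Hypothetical enrolled landmarks, (x, y) pixel coordinates:
enrolled = {"left_eye": (100, 120), "right_eye": (160, 120),
            "mouth_center": (130, 180), "chin": (130, 210)}

# Same face seen closer to the camera (every coordinate scaled 2x)
# still matches, because the ratios are unchanged:
closer = {k: (2 * x, 2 * y) for k, (x, y) in enrolled.items()}
print(same_face(enrolled, closer))  # -> True

# But puffiness that shifts one landmark breaks the ratios:
puffy = dict(enrolled, chin=(130, 240))
print(same_face(enrolled, puffy))  # -> False
```

The point of the sketch: a scale change preserves all the distance ratios, so the match survives, while even one landmark moving (swelling, bloating) changes several ratios at once and can push the comparison past the tolerance.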
Given the hot mess that is today's development environment in woke/slave labor profiteering organizations, however, I would heavily lean toward software changes and embedded incompetence as the Occam's Razor sort of explanation. Noting that the facial recognition systems were already known to be highly racist (California liberals at their best), the outsourcing of more development to cheaper rather than better devs, on top of more attempts at woke or purposeful commie confusion over determining race and male vs. female, I'm amazed if any of it works anymore.
Sorry to say it, but your best bet is probably to go back to a passcode unlock.
You seem to know a lot about it. Are Adam's apples included in face recognition? I wouldn't think so, but it would be helpful if they were .. because, well, you know. Do you know if things like a hat would mess it up at all? Or if the face is partially obscured? I remember seeing a video of a guy who said he showed up somewhere, I forget now, but it must have been some government office. He didn't have his ID and couldn't prove it was him, but he said it took them all of 14 seconds to scan his face and establish that he was who he said he was. He was floored by this and made a video about it. And I remember some athletes at the Olympics in China saying it was used extensively there. It's fascinating stuff, but kind of creepy, that's for sure.
Highly creepy, I agree. I know a little, but I don't know specifically what each company is doing right now, just what has fed into community knowledge up to a point. Adam's apples were not in the original list of things to rely on, as the whole thing started long ago, before the gender agenda was rushed to market. Doesn't mean they're not there now, but they'd be an add-on, which for software means it wouldn't be fundamental to the working of the algorithms.
It's very fast now, but you know that already. The military was concerned about beards and makeup disguises, and I don't know what they decided to do about those full-head latex masks, because only the eye distance would be clear (unless they can work out the nostril interiors or something). I would guess that most advances of the last decade or so would have been pattern recognition from machine learning (popular parlance: "AI"), guided by some people who have worked their way into the cushy gigs. Happily for us, these don't actually have to be the most inventive or creative people, but they do have expensive toys to play with, and face matching is essentially a brute-force exercise, so they have made and will continue to make headway.
Face gaiters and odd makeup may become more important than we would have thought, as would camera and data/wifi detectors, dare we consider blockers? For health reasons?
I just thought of the ubiquitous covid masks. Didn't even think of that before. I wonder if they hide enough of a face to obscure the person's identity. Probably the systems can recognize a person just from part of their face. I wouldn't be surprised. Kind of makes you wonder what will be happening 50 or 100 years from now.
Thank you for the detail. I just found the timing to be odd and after everything that has happened the past few years, I’m skeptical of coincidences. Just wondered if anyone had experienced anything similar and I get better answers here than any other place. Will assume it was a software issue even though I have not updated recently.