posted on Apr, 28 2024 @ 04:40 PM
a reply to: Therealbeverage
Technology can be used for a variety of purposes, some good, some bad. Explosives can blast rock to clear a foundation for a building, or they can be used to kill people.
My take on this study is that the authors are actually trying to push technology away from this direction, by pointing out the danger to privacy and encouraging legislation to protect facial recognition privacy, maybe something along the lines of how HIPAA works for medical information privacy.
But if this really works (and maybe Congress believes it does, though that doesn't mean it does), people will use facial recognition to advance their power. What do powerful people want? More power, and if facial recognition tech can help them get it by figuring out how to target their election ads or messages, why wouldn't they use it that way? They can still use AI in all the other ways you mentioned.
So the authors may have a political agenda for more data privacy. I haven't seen how robust the data is yet, but I did look at the images; did you see those? I can hardly tell the liberal and conservative sides apart.
Also, people change their voting patterns, from conservative to liberal or vice versa, and we don't expect their faces to change when they do, right? So the statistical significance can't mean it's a perfect predictor, and maybe it's not that great after all, but if they can use the research to push for better privacy laws anyway, that may not be a bad thing.
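To put a rough number on that last point, here's a quick hypothetical sketch (the sample size and accuracy are made up for illustration, not taken from the paper): a predictor that only guesses someone's politics right 60% of the time would still look massively "statistically significant" on a thousand faces, even though it's wrong four times out of ten.

```python
# Hypothetical illustration: modest accuracy can still be "highly significant"
# on a large sample. Numbers below are invented, not from the study.
import math

n = 1000          # assumed number of faces rated
correct = 600     # assumed correct guesses (60% accuracy)

accuracy = correct / n

# z-test against the null hypothesis of coin-flip (50%) accuracy
p0 = 0.5
z = (accuracy - p0) / math.sqrt(p0 * (1 - p0) / n)

# one-sided p-value from the standard normal survival function
p_value = 0.5 * math.erfc(z / math.sqrt(2))

print(f"accuracy = {accuracy:.0%}")   # 60%: still wrong 4 times out of 10
print(f"z        = {z:.1f}")          # about 6.3
print(f"p-value  = {p_value:.1e}")    # around 1e-10, extremely "significant"
```

So "significant" just means better than chance with high confidence, not that the thing can reliably tell who you vote for.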