Firewaller Posted November 16, 2022

Image via Intel

Intel is on the hunt for deepfakes, and its strategy for catching them involves... blood. As improbable as it may sound, the company offers solid reasoning for using our life source to detect fake impersonations with its new FakeCatcher.

Deepfakes are images and videos created to imitate another person by superimposing a face or voice – usually that of a celebrity or politician – onto a real person to bring these phony renders to life. As you can imagine, such technology has often landed in the wrong hands, with the FBI warning that malicious parties have been faking their way through job interviews to eventually obtain confidential information from companies. Some women have also been depicted in doctored footage that shows them in compromising positions, without their knowledge or consent. However, Intel points out that the technology can also be used for good, such as concealing one's identity.

FakeCatcher is a new system that studies blood flow, among other things, in videos of people to determine in real time whether they are genuine or altered. The tech firm describes using photoplethysmography (PPG) to map the changes in color in a person's veins, then feeding that data into a deep-learning model to judge whether the video is authentic. The program is built on Intel hardware and software, runs on web-based platforms, and relies on the company's third-generation Xeon Scalable processors. Intel also tapped OpenVINO to deploy optimized computer vision blocks and run AI models for face detection. FakeCatcher reportedly achieved a 96% accuracy rate when put to the test.

Have a look at the video below to watch how FakeCatcher works.

[via PC Gamer and AI Business, cover image via Intel]
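Intel has not published FakeCatcher's code, so as a rough illustration of the general idea the article describes – tracking subtle, pulse-driven color changes in skin over the frames of a video – here is a minimal sketch. The function name, the synthetic frames, and the fixed face bounding box are all assumptions for the demo; a real pipeline would use an actual face detector and a trained classifier downstream.

```python
import numpy as np

def extract_ppg_signal(frames, roi):
    """Extract a crude PPG-style signal: the mean green-channel
    intensity inside a face region of interest (ROI), one sample
    per video frame.

    frames: iterable of HxWx3 arrays (RGB)
    roi: (top, bottom, left, right) bounding box of the detected face
    """
    top, bottom, left, right = roi
    samples = []
    for frame in frames:
        # The green channel typically carries the strongest pulse component
        patch = frame[top:bottom, left:right, 1]
        samples.append(patch.mean())
    signal = np.asarray(samples, dtype=np.float64)
    # Detrend by subtracting the mean so the periodic pulse stands out
    return signal - signal.mean()

# Synthetic demo: 100 frames whose green channel pulses sinusoidally,
# mimicking the subtle blood-flow color changes seen in real skin.
rng = np.random.default_rng(0)
frames = []
for t in range(100):
    base = np.full((64, 64, 3), 120.0)
    # ~72 bpm pulse at an assumed 30 fps, confined to the "face" region
    base[16:48, 16:48, 1] += 2.0 * np.sin(2 * np.pi * 1.2 * t / 30)
    frames.append(np.clip(base + rng.normal(0, 0.1, base.shape), 0, 255))

signal = extract_ppg_signal(frames, (16, 48, 16, 48))
print(signal.shape)  # (100,)
```

In a deepfake detector along FakeCatcher's lines, a signal like this (and richer spatial PPG maps) would be fed to a deep-learning model that decides whether the physiological pattern looks like a real person or a synthetic face, which typically lacks a coherent pulse.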