I used the original ArcFace model and deepface's ArcFace to get identity embeddings of aligned images, but I got values on completely different scales.
The inputs are the same; the only difference between the two paths is the input shape.
Original ArcFace model: (batch, channel, height, width)
Deepface ArcFace model: (batch, height, width, channel)
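If the layout is really the only intended difference, the same tensor can be fed to both models just by reordering the axes. A minimal sketch (shapes here are illustrative, not tied to either model's actual input size):

```python
import torch

# Toy batch in the original ArcFace layout: (batch, channel, height, width)
x_nchw = torch.randn(1, 3, 224, 224)

# Reorder axes to the layout deepface's ArcFace expects: (batch, height, width, channel)
x_nhwc = x_nchw.permute(0, 2, 3, 1)

print(tuple(x_nhwc.shape))  # (1, 224, 224, 3)
```

Note that `permute` only changes the view of the axes; the pixel values themselves are untouched, so a layout swap alone cannot explain a scale difference in the embeddings.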
The aligned images are normalized as below.
import cv2
from PIL import Image
from torchvision import transforms

transforms_arcface = transforms.Compose([
    transforms.ColorJitter(0.2, 0.2, 0.2, 0.01),
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    # transforms.Normalize((0.485, 0.456, 0.406), (0.229, 0.224, 0.225))
    transforms.Normalize((0.5, 0.5, 0.5), (0.5, 0.5, 0.5))  # maps pixel values to [-1, 1]
])

Xs = cv2.imread(source_image_path)[:, :, ::-1]  # BGR -> RGB
Xs = Image.fromarray(Xs)
normalized_Xs = transforms_arcface(Xs)
I printed the maximum and minimum values of some output embeddings here.
# origin arcface
# maximum, minimum value of embedding
tensor(2.7417, device='cuda:2') tensor(-2.4630, device='cuda:2')
tensor(2.2528, device='cuda:1') tensor(-2.4806, device='cuda:1')
tensor(2.8164, device='cuda:0') tensor(-3.0586, device='cuda:0')
tensor(2.5641, device='cuda:2') tensor(-2.7087, device='cuda:2')
tensor(3.1357, device='cuda:1') tensor(-3.4846, device='cuda:1')
tensor(3.1438, device='cuda:0') tensor(-2.9450, device='cuda:0')
tensor(3.1075, device='cuda:0') tensor(2.4668, device='cuda:2')
# deepface arcface
# maximum, minimum value of embedding
tensor(0.3724, device='cuda:2') tensor(-0.4499, device='cuda:2')
tensor(0.4816, device='cuda:0') tensor(-0.6993, device='cuda:0')
tensor(0.5832, device='cuda:1') tensor(-0.5441, device='cuda:1')
tensor(0.4039, device='cuda:1') tensor(-0.4289, device='cuda:1')
tensor(0.3976, device='cuda:0') tensor(-0.3404, device='cuda:0')
tensor(0.6162, device='cuda:2') tensor(-0.4228, device='cuda:2')
tensor(0.3019, device='cuda:0') tensor(-0.4458, device='cuda:0')
As you can see, deepface's ArcFace produces values on a much smaller scale, so its embeddings are useless when I try to generate a face-swapped image from them.
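One thing worth noting: ArcFace embeddings are meant to be compared by cosine similarity, so a pure scale difference can be removed by L2-normalizing each embedding. A minimal sketch with random stand-in vectors (the scale factors below only mimic the two outputs above, they are not the real model outputs):

```python
import torch
import torch.nn.functional as F

# Stand-in embeddings with roughly the two scales observed in the post
emb_origin = torch.randn(1, 512) * 3.0    # origin-ArcFace-like scale
emb_deepface = torch.randn(1, 512) * 0.5  # deepface-ArcFace-like scale

# L2-normalize each embedding to unit length; cosine similarity is
# unaffected by this, but the raw value ranges become comparable.
emb_origin = F.normalize(emb_origin, dim=1)
emb_deepface = F.normalize(emb_deepface, dim=1)

print(emb_origin.norm(dim=1), emb_deepface.norm(dim=1))  # both ~1.0
```

If the two models were truly computing the same function, their normalized embeddings would match; if they still differ after normalization, the preprocessing or the weights differ, not just the scale.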
I want to swap the source image's identity onto the target image using the source embedding generated by deepface's ArcFace.
How can I get the same embedding as the original one?
To be clear, I'm trying to get an identity embedding from deepface's ArcFace model and generate a swapped-face image: while keeping the target image's attributes, the generator will change the identity part using the source's embedding (the one I got from deepface's ArcFace).
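For context, the pipeline I have in mind looks roughly like the sketch below. Everything here is a placeholder (the class name, layers, and embedding injection are hypothetical, not the actual generator):

```python
import torch
import torch.nn as nn

class SwapGenerator(nn.Module):
    """Toy stand-in for an attribute-preserving, identity-injecting generator."""
    def __init__(self, emb_dim=512):
        super().__init__()
        self.img_enc = nn.Conv2d(3, 8, 3, padding=1)  # attribute features from target
        self.id_proj = nn.Linear(emb_dim, 8)          # project identity embedding
        self.dec = nn.Conv2d(8, 3, 3, padding=1)      # decode back to an image

    def forward(self, target_img, source_id_emb):
        feat = self.img_enc(target_img)
        id_feat = self.id_proj(source_id_emb)[:, :, None, None]
        # Inject the source identity into the target's attribute features
        return self.dec(feat + id_feat)

gen = SwapGenerator()
target = torch.randn(1, 3, 224, 224)   # target image batch (NCHW)
source_emb = torch.randn(1, 512)       # identity embedding from ArcFace
swapped = gen(target, source_emb)
print(tuple(swapped.shape))  # (1, 3, 224, 224)
```

The real generator is of course much deeper; the point is only that it consumes the target image plus the source's identity embedding, which is why the embedding scale matters.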