Human Generated Data

Title

Untitled (portrait of a couple)

Date

c. 1940

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1429

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Hat 99.7
Clothing 99.7
Apparel 99.7
Person 99
Human 99
Person 97.2
Coat 73.1
Cap 69.4
Overcoat 68.8
People 63.5
Baseball Cap 58
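
Label-and-confidence pairs like those above are the kind of output Amazon Rekognition's DetectLabels call returns. The lines below are a minimal sketch, assuming a digitized scan saved as schweig_proof.jpg (a hypothetical filename, not part of this record) and AWS credentials already configured; they illustrate how such tags can be generated, not the pipeline actually used for this record.

import boto3

# Sketch: generate Rekognition-style label tags for a scanned print.
rekognition = boto3.client("rekognition")

with open("schweig_proof.jpg", "rb") as f:  # hypothetical local scan
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,
    MinConfidence=50,  # the record lists tags down to roughly 58% confidence
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")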

Clarifai
created on 2023-10-26

people 99.4
monochrome 99.4
portrait 98.1
man 96.7
two 96.5
woman 96.4
adult 96.2
sepia 95.8
actress 95
retro 93.9
nostalgia 93.6
wear 91.5
facial expression 88.9
group 88.6
three 87.8
veil 87.1
indoors 86.9
fur coat 86.5
family 84.8
actor 84.5

Imagga
created on 2022-01-22

man 40.3
person 39.4
portrait 34.9
male 34.2
adult 29.8
people 27.9
couple 22.6
face 20.6
hat 19.9
senior 18.7
happy 17.5
old 17.4
love 17.4
guy 17.2
attractive 16.1
black 15.7
handsome 15.2
hair 15.1
smile 15
actor 14.6
star 14.4
model 14
suit 13.9
hand 13.7
fashion 13.6
businessman 13.2
mature 13
lifestyle 13
men 12.9
expression 12.8
married 12.5
serious 12.4
sexy 12
looking 12
performer 11.6
family 11.6
elderly 11.5
human 11.2
one 11.2
clothing 11.2
happiness 11
business 10.9
smiling 10.9
pretty 10.5
kin 10.4
professional 10.4
glasses 10.2
confident 10
vintage 9.9
posing 9.8
cheerful 9.8
mother 9.7
success 9.7
look 9.6
groom 9.6
aged 9
retro 9
lady 8.9
together 8.8
sitting 8.6
holding 8.3
husband 8.2
style 8.2
entertainer 8
world 7.9
women 7.9
executive 7.7
youth 7.7
cowboy hat 7.6
studio 7.6
shirt 7.5
closeup 7.4
wedding 7.4
dress 7.2
modern 7

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

person 98.5
human face 98.1
fashion accessory 97.6
hat 95.9
text 95.8
clothing 95.7
smile 84.4
fedora 73
people 72.3
cowboy hat 66.2
portrait 65.9
man 59.9
posing 44.7

Color Analysis

Face analysis

AWS Rekognition

Age 23-33
Gender Female, 100%
Calm 99.7%
Confused 0.1%
Sad 0.1%
Angry 0%
Surprised 0%
Happy 0%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 31-41
Gender Male, 100%
Calm 99.9%
Confused 0.1%
Angry 0%
Surprised 0%
Sad 0%
Fear 0%
Happy 0%
Disgusted 0%
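
Per-face estimates like the two AWS Rekognition entries above (age range, gender, emotion percentages) come from the DetectFaces call when all attributes are requested. A minimal sketch follows, again assuming the hypothetical schweig_proof.jpg scan and configured AWS credentials.

import boto3

rekognition = boto3.client("rekognition")

with open("schweig_proof.jpg", "rb") as f:  # hypothetical local scan
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # required for age range, gender, and emotion estimates
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.0f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")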

Microsoft Cognitive Services

Age 33
Gender Male

Microsoft Cognitive Services

Age 29
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely
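
The Google Vision rows report likelihood buckets rather than percentages; each face annotation carries enum fields for surprise, anger, sorrow, joy, headwear, and blur. The sketch below shows how they can be read with the google-cloud-vision client, assuming application credentials are set and using the same hypothetical scan file.

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("schweig_proof.jpg", "rb") as f:  # hypothetical local scan
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each likelihood is an enum bucket (VERY_UNLIKELY ... VERY_LIKELY), as in the rows above
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)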

Feature analysis

Amazon

Hat 99.7%
Person 99%

Categories

Imagga

people portraits 96.7%
paintings art 3.1%

Text analysis

Amazon

Studio
Proof
Schweig Studio Proof
Schweig
down
smoorh X down
x
smoorh
X
dending
eurder
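
The Amazon word list above, including the garbled readings of the handwritten notes on the proof, is the kind of token-level output Rekognition's DetectText returns. A minimal sketch, under the same assumptions as the earlier Rekognition examples:

import boto3

rekognition = boto3.client("rekognition")

with open("schweig_proof.jpg", "rb") as f:  # hypothetical local scan
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

for detection in response["TextDetections"]:
    # WORD entries correspond to single tokens like those above; LINE entries group them
    if detection["Type"] == "WORD":
        print(detection["DetectedText"], round(detection["Confidence"], 1))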

Google

Schweig Studio Proof
Schweig
Studio
Proof