Human Generated Data

Title

Untitled (man and woman)

Date

1930s

People

Artist: Dorothea Lange, American, 1895-1965

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Sedgewick Memorial Collection, 2.2002.715

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Person 99.7
Human 99.7
Person 99.7
Clothing 89
Apparel 89
Face 88.5
People 81.7
Man 69.4
Portrait 61.1
Photography 61.1
Photo 61.1
Finger 60.8

Clarifai
created on 2023-10-26

portrait 99.9
people 99.9
two 98.8
man 98.6
adult 98.6
group 97.4
three 95.9
facial expression 95.6
woman 94.1
monochrome 92.7
group together 91.6
family 89.1
music 88.8
actor 88.4
leader 87.5
administration 87.1
documentary 84.4
four 84.1
wear 83.2
street 82

Imagga
created on 2022-01-22

man 43.7
male 34.2
people 32.9
person 31.1
portrait 30.4
adult 27.8
happy 23.8
couple 22.6
senior 22.5
family 19.6
elderly 18.2
smile 17.8
old 17.4
love 17.4
mature 16.7
suit 16.3
smiling 15.9
face 15.6
mother 15.5
men 15.5
handsome 15.1
bow tie 14.8
business 14.6
world 14.2
guy 13.6
casual 13.6
one 13.4
kin 13.3
happiness 13.3
businessman 13.2
lifestyle 13
sibling 12.3
looking 12
child 11.9
professional 11.9
hand 11.4
tie 11.4
clothing 11.3
boy 11.3
human 11.2
relationship 11.2
home 11.2
jacket 11
necktie 10.6
cheerful 10.6
together 10.5
attractive 10.5
office 10.4
black 10.2
aged 10
fashion 9.8
standing 9.6
executive 9.6
hair 9.5
women 9.5
princess 9.5
uniform 8.9
husband 8.9
older 8.7
married 8.6
aristocrat 8.6
sitting 8.6
expression 8.5
success 8
military uniform 7.9
model 7.8
retired 7.8
modern 7.7
serious 7.6
wife 7.6
meeting 7.5
joy 7.5
holding 7.4
parent 7.4
grandfather 7.3
occupation 7.3
alone 7.3
private 7.1
look 7

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

human face 99.3
person 98.8
text 97.5
clothing 96.9
smile 92.6
man 76.9
portrait 68.9
old 55.9
posing 50.8

Face analysis

AWS Rekognition

Age 51-59
Gender Female, 98.6%
Sad 77.8%
Calm 20.9%
Confused 0.4%
Surprised 0.4%
Angry 0.3%
Disgusted 0.1%
Fear 0.1%
Happy 0%

AWS Rekognition

Age 48-56
Gender Male, 99.8%
Calm 95.9%
Happy 2.4%
Confused 0.5%
Sad 0.4%
Angry 0.2%
Disgusted 0.2%
Surprised 0.2%
Fear 0.2%

Microsoft Cognitive Services

Age 57
Gender Male

Microsoft Cognitive Services

Age 49
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.7%

Categories

Imagga

people portraits 98.5%
paintings art 1.3%