Human Generated Data

Title

Untitled (studio portrait of three women with two young boys)

Date

c. 1942

People

Artist: Orrion Barger, American, active 1913 - 1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6631

Machine Generated Data

Tags

Amazon
created on 2019-03-26

Person 99.8
Human 99.8
Person 99.7
Person 99.7
Person 99.7
Person 99.3
Face 96.2
People 90.3
Apparel 87.5
Clothing 87.5
Sitting 74.9
Female 74.5
Hair 71.4
Crowd 69.3
Head 65.2
Photography 65
Portrait 65
Photo 65
Girl 62.9
Family 61
Child 59.2
Kid 59.2
Smile 55.6
Jaw 55.2
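Each machine-generated tag above is paired with a confidence score (0-100). As an illustrative sketch only (not the museum's actual pipeline), tag lists like this are typically filtered by a minimum-confidence threshold before display; the scores below are copied from the Amazon list above, and the threshold of 90 is an arbitrary assumption for the example:

```python
# Illustrative sketch: filtering machine-generated tags by confidence.
# Tag/score pairs copied from the Amazon tag list above; the threshold
# of 90 is an assumed value chosen for this example.
amazon_tags = {
    "Person": 99.8,
    "Face": 96.2,
    "People": 90.3,
    "Apparel": 87.5,
    "Sitting": 74.9,
    "Smile": 55.6,
}

def high_confidence(tags: dict, min_conf: float = 90.0) -> list:
    """Return tag names meeting the confidence threshold, highest first."""
    kept = [(name, conf) for name, conf in tags.items() if conf >= min_conf]
    return [name for name, _ in sorted(kept, key=lambda kv: -kv[1])]

print(high_confidence(amazon_tags))  # → ['Person', 'Face', 'People']
```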

Clarifai
created on 2019-03-26

people 99.6
portrait 98.8
group 98.4
man 97.6
facial expression 95.2
child 94.1
adult 93.7
three 92
woman 91.9
group together 91.7
boy 85.1
two 81.5
son 81.5
monochrome 81.1
indoors 80.7
music 79
four 79
five 75.3
wear 73.3
adolescent 70.9

Imagga
created on 2019-03-26

man 32.9
male 29.8
person 29.1
portrait 29.1
face 23.4
people 23.4
mask 23
adult 21.4
black 19.9
clothing 18.4
head 17.6
happy 16.9
human 16.5
cap 15.8
covering 15.4
bathing cap 14.7
headdress 13.9
one 12.7
attractive 12.6
smiling 12.3
smile 11.4
group 11.3
fun 10.5
looking 10.4
world 10.3
kin 10.3
goggles 10.2
expression 10.2
handsome 9.8
sexy 9.6
sunglasses 9.6
hair 9.5
men 9.4
happiness 9.4
child 9.3
dark 9.2
hat 9.1
pretty 9.1
spectator 8.9
look 8.7
couple 8.7
women 8.7
boy 8.7
disguise 8.6
close 8.5
model 8.5
senior 8.4
old 8.3
joy 8.3
alone 8.2
lady 8.1
guy 8.1
together 7.9
friends 7.5
nice 7.3
helmet 7.2
cute 7.2
cool 7.1

Google
created on 2019-03-26

Microsoft
created on 2019-03-26

person 90.4
posing 67.6
music 18.7
black and white 15.6

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 10-15
Gender Female, 54.5%
Happy 18.2%
Disgusted 0.9%
Sad 3.3%
Angry 1.1%
Confused 0.9%
Surprised 3.3%
Calm 72.3%

AWS Rekognition

Age 10-15
Gender Male, 55.6%
Disgusted 0.2%
Sad 1.9%
Surprised 1%
Calm 11.9%
Confused 0.5%
Angry 0.5%
Happy 84%

AWS Rekognition

Age 26-43
Gender Male, 85.4%
Surprised 4.1%
Sad 56.9%
Calm 23.9%
Confused 2.7%
Disgusted 2.6%
Happy 5.6%
Angry 4.3%

AWS Rekognition

Age 29-45
Gender Male, 77%
Happy 54.5%
Sad 4.4%
Calm 29.6%
Angry 2.5%
Confused 2%
Disgusted 1.6%
Surprised 5.3%

AWS Rekognition

Age 23-38
Gender Male, 60.6%
Disgusted 2.3%
Sad 9.3%
Angry 4%
Happy 14.9%
Calm 60.9%
Confused 2%
Surprised 6.6%
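Each face block above reports a confidence score per emotion; the emotion usually reported for a face is simply the highest-scoring label. A minimal sketch, using the scores copied from the first AWS Rekognition face block above:

```python
# Illustrative sketch: picking the dominant emotion from one face record.
# Scores copied from the first AWS Rekognition face block above
# (Age 10-15, Gender Female).
face = {
    "Happy": 18.2, "Disgusted": 0.9, "Sad": 3.3, "Angry": 1.1,
    "Confused": 0.9, "Surprised": 3.3, "Calm": 72.3,
}

def dominant_emotion(scores: dict) -> str:
    """Return the emotion label with the highest reported confidence."""
    return max(scores, key=scores.get)

print(dominant_emotion(face))  # → Calm
```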

Feature analysis

Amazon

Person 99.8%