Human Generated Data

Title

Untitled (studio portrait of family with six children)

Date

c. 1935, printed later

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6087

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Person 99.7
Human 99.7
Person 99.7
Interior Design 99.6
Indoors 99.6
Person 98.4
Person 98.3
Person 97.9
Person 96.8
Person 95.9
Person 94.2
Advertisement 92.6
Collage 91.7
Room 86.9
Person 77.5
Electronics 76.2
Screen 76.2
Living Room 74.3
Person 73
Home Decor 67.2
Display 62.8
Monitor 62.8
People 61.7
Canvas 57.5
Tripod 56.6
Furniture 55.8
Poster 51.8
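The label/confidence pairs above match the output format of Amazon Rekognition's DetectLabels operation. A minimal sketch of how such tags could be generated with boto3 follows; the file name and the confidence floor are assumptions, not part of this record.

```python
# Sketch: Amazon Rekognition DetectLabels, which returns name/confidence
# pairs like the Amazon tag listing above. File name and MinConfidence
# threshold are assumptions.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("durette_studio_portrait.jpg", "rb") as f:  # hypothetical file name
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=50,  # assumed cutoff; the listing above bottoms out near 51.8
)

for label in response["Labels"]:
    # Prints e.g. "Person 99.7", matching the format of the tag listing
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```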

Clarifai
created on 2019-11-16

people 100
group 99.4
adult 99
man 98.2
woman 96.6
room 96.4
furniture 95.7
group together 93.4
leader 92.4
administration 91.4
home 90.7
two 90.6
many 90.5
indoors 90.4
wear 90.3
several 89.3
music 88.3
child 88.3
outfit 86.9
three 85.8

Imagga
created on 2019-11-16

window 27.9
groom 27.1
case 18.5
architecture 15.6
man 15.4
old 15.3
people 15.1
home 14.3
house 14.2
interior 14.1
building 14.1
black 13.8
wall 13.7
glass 13.2
decoration 13.1
shop 13
room 12.9
bride 12.5
person 12.1
wedding 11.9
door 11.9
light 11.4
love 11
chair 10.9
indoors 10.5
couple 10.4
happiness 10.2
male 10
city 10
ancient 9.5
vintage 9.2
adult 9.1
dress 9
family 8.9
table 8.7
holiday 8.6
two 8.5
worker 8.5
furniture 8.4
dark 8.3
church 8.3
style 8.2
happy 8.1
working 8
entrance 7.7
bouquet 7.5
retro 7.4
mercantile establishment 7.3
detail 7.2
portrait 7.1
night 7.1

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

text 97.3
clothing 93.1
person 91.9
old 79
woman 68.1
dress 63.6
picture frame 55
posing 40
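The Microsoft tags correspond to Azure Computer Vision's image-tagging operation, which reports confidence on a 0-1 scale. A sketch using the azure-cognitiveservices-vision-computervision SDK follows; the endpoint, key, and image URL are placeholders.

```python
# Sketch: Azure Computer Vision image tagging, producing name/confidence
# pairs like the Microsoft list above. Endpoint, key, and URL are placeholders.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    endpoint="https://<your-resource>.cognitiveservices.azure.com/",
    credentials=CognitiveServicesCredentials("<subscription-key>"),
)

result = client.tag_image("https://example.org/durette_studio_portrait.jpg")

for tag in result.tags:
    # Scale 0-1 confidence to match the listing format (e.g. "text 97.3")
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```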

Color Analysis

Face analysis

AWS Rekognition

Age 17-29
Gender Female, 54.8%
Sad 45%
Fear 45%
Angry 45%
Surprised 45%
Confused 45%
Calm 54.9%
Happy 45%
Disgusted 45%

AWS Rekognition

Age 5-15
Gender Female, 54.8%
Calm 50.5%
Angry 45.3%
Disgusted 45.1%
Happy 45%
Sad 46.9%
Confused 45.6%
Fear 46.3%
Surprised 45.4%

AWS Rekognition

Age 22-34
Gender Male, 54.7%
Surprised 45%
Disgusted 45%
Happy 45%
Sad 45%
Calm 54%
Confused 45%
Fear 45%
Angry 45.8%

AWS Rekognition

Age 8-18
Gender Female, 54.6%
Angry 45%
Calm 54.9%
Surprised 45%
Confused 45%
Disgusted 45%
Happy 45%
Sad 45%
Fear 45%

AWS Rekognition

Age 17-29
Gender Female, 54.3%
Disgusted 45%
Sad 45.7%
Confused 45.1%
Happy 45.1%
Fear 45.1%
Surprised 45%
Calm 53.8%
Angry 45.1%

AWS Rekognition

Age 2-8
Gender Female, 50.3%
Disgusted 45%
Happy 45%
Fear 45.3%
Confused 45.1%
Surprised 45.7%
Calm 51.7%
Angry 47%
Sad 45.3%

AWS Rekognition

Age 2-8
Gender Male, 52.8%
Fear 45%
Calm 53.5%
Disgusted 45%
Happy 45%
Sad 45%
Confused 45.1%
Surprised 46.3%
Angry 45.1%

AWS Rekognition

Age 2-8
Gender Male, 51.6%
Happy 45%
Angry 45%
Disgusted 45%
Confused 45%
Sad 54.8%
Calm 45.1%
Surprised 45%
Fear 45%

AWS Rekognition

Age 7-17
Gender Male, 50.2%
Disgusted 45%
Fear 45%
Confused 50.6%
Calm 48.3%
Surprised 45.1%
Angry 45.5%
Sad 45.2%
Happy 45.2%

AWS Rekognition

Age 12-22
Gender Female, 52.4%
Confused 45%
Surprised 45.1%
Fear 45.3%
Happy 45%
Disgusted 45%
Angry 54.5%
Sad 45%
Calm 45%
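Each block above is one detected face from Amazon Rekognition's DetectFaces operation with all attributes requested: an estimated age range, a gender guess with confidence, and a confidence score per emotion. A minimal sketch of the call follows; the image source is a placeholder.

```python
# Sketch: Amazon Rekognition DetectFaces with Attributes=["ALL"], which yields
# the age range, gender, and per-emotion confidences shown in the blocks above.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("durette_studio_portrait.jpg", "rb") as f:  # hypothetical file name
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # the default attribute set omits age, gender, emotions
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```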

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
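Unlike Rekognition, Google Cloud Vision face detection reports bucketed likelihoods (Very unlikely through Very likely) rather than percentages, which is what the rows above reflect. A sketch with the google-cloud-vision client follows; the file name is a placeholder.

```python
# Sketch: Google Cloud Vision face detection, which assigns likelihood buckets
# (e.g. "Very unlikely") to each attribute. File name is a placeholder.
from google.cloud import vision

LIKELIHOODS = [
    "Unknown", "Very unlikely", "Unlikely", "Possible", "Likely", "Very likely",
]  # proto enum order

client = vision.ImageAnnotatorClient()

with open("durette_studio_portrait.jpg", "rb") as f:  # hypothetical file name
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    print("Surprise", LIKELIHOODS[face.surprise_likelihood])
    print("Anger", LIKELIHOODS[face.anger_likelihood])
    print("Sorrow", LIKELIHOODS[face.sorrow_likelihood])
    print("Joy", LIKELIHOODS[face.joy_likelihood])
    print("Headwear", LIKELIHOODS[face.headwear_likelihood])
    print("Blurred", LIKELIHOODS[face.blurred_likelihood])
```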

Feature analysis

Amazon

Person 99.7%
Poster 51.8%