Human Generated Data

Title

Untitled (two boys and little girl sitting in living room with birdcage, next to television)

Date

1956

People

Artist: Francis J. Sullivan, American 1916 - 1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.18661

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Screen 99.2
Electronics 99.2
Monitor 99.2
Display 99.2
Human 99
Person 99
Person 98
Person 96.9
Clothing 95.2
Helmet 95.2
Apparel 95.2
Shoe 90.5
Footwear 90.5
TV 90.3
Television 90.3
Face 88
Shoe 79.8
Interior Design 67.8
Indoors 67.8
Portrait 65.9
Photo 65.9
Photography 65.9
People 62.4
Kid 62.1
Child 62.1
Girl 57.7
Female 57.7
Flooring 55.3
Shoe 52.8
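
The Amazon values above are Rekognition label names paired with confidence scores on a 0-100 scale. A minimal sketch of how such label/confidence pairs could be produced with the AWS SDK for Python (boto3), assuming configured AWS credentials and a local copy of the image (the filename below is hypothetical):

    import boto3

    rekognition = boto3.client("rekognition")

    # Hypothetical local filename for this photograph.
    with open("4.2002.18661.jpg", "rb") as f:
        image_bytes = f.read()

    # DetectLabels returns object/scene labels with confidence scores in percent.
    response = rekognition.detect_labels(Image={"Bytes": image_bytes}, MinConfidence=50)

    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")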

Imagga
created on 2022-03-05

musical instrument 85.4
accordion 71.8
keyboard instrument 57.8
wind instrument 49.4
person 16.3
people 16.2
device 14.8
man 13.4
washboard 12.7
style 12.6
adult 12.4
bass 12.1
fashion 12.1
art 11.9
shop 11.7
musician 11.6
black 11.4
old 11.1
male 10.6
music 10.5
culture 10.2
human 9.7
sexy 9.6
city 9.1
portrait 9.1
chair 9
singer 8.9
men 8.6
business 8.5
religion 8.1
history 8
equipment 8
rock 7.8
model 7.8
play 7.7
elegant 7.7
modern 7.7
musical 7.7
brass 7.6
statue 7.6
studio 7.6
sport 7.6
commerce 7.5
one 7.5
symbol 7.4
window 7.3
lady 7.3
decoration 7.2
lifestyle 7.2
work 7.2
interior 7.1

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

text 95.1
black and white 89.4
indoor 88.6
person 77.9
clothing 77
cartoon 75.8
furniture 51.5

Face analysis

AWS Rekognition

Age 27-37
Gender Female, 71%
Happy 57.3%
Calm 33.3%
Disgusted 4.6%
Surprised 1.9%
Angry 0.9%
Sad 0.8%
Confused 0.7%
Fear 0.5%

AWS Rekognition

Age 35-43
Gender Female, 55.4%
Happy 97.6%
Calm 1.2%
Surprised 0.5%
Disgusted 0.2%
Sad 0.2%
Confused 0.1%
Angry 0.1%
Fear 0.1%

AWS Rekognition

Age 41-49
Gender Male, 100%
Surprised 48.3%
Calm 45.6%
Happy 4.3%
Disgusted 0.7%
Confused 0.6%
Sad 0.3%
Angry 0.2%
Fear 0.1%
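
Each AWS Rekognition face record above combines an estimated age range, a gender estimate with confidence, and a ranked list of emotion scores. A minimal sketch of how such per-face records could be requested with boto3, using the same hypothetical local image file as above:

    import boto3

    rekognition = boto3.client("rekognition")

    with open("4.2002.18661.jpg", "rb") as f:  # hypothetical filename
        image_bytes = f.read()

    # Attributes=["ALL"] is required to get age range, gender, and emotion estimates.
    response = rekognition.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

    for face in response["FaceDetails"]:
        age, gender = face["AgeRange"], face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")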

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
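
The Google Vision results are reported as likelihood buckets rather than numeric scores. A minimal sketch, assuming the google-cloud-vision client library and application-default credentials (the filename is again hypothetical):

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("4.2002.18661.jpg", "rb") as f:  # hypothetical filename
        content = f.read()

    response = client.face_detection(image=vision.Image(content=content))

    # Each attribute comes back as a Likelihood enum value, not a percentage.
    likelihood = ("Unknown", "Very unlikely", "Unlikely", "Possible", "Likely", "Very likely")
    for face in response.face_annotations:
        print("Surprise", likelihood[face.surprise_likelihood])
        print("Anger", likelihood[face.anger_likelihood])
        print("Sorrow", likelihood[face.sorrow_likelihood])
        print("Joy", likelihood[face.joy_likelihood])
        print("Headwear", likelihood[face.headwear_likelihood])
        print("Blurred", likelihood[face.blurred_likelihood])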

Feature analysis

Amazon

Person 99%
Helmet 95.2%
Shoe 90.5%

Captions

Microsoft

a person standing in front of a television 71.9%
a man and a woman taking a selfie in a room 56.1%
a person standing in a room 56%
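
The Microsoft captions are natural-language descriptions with confidence scores from Azure Computer Vision's Describe operation. A minimal sketch using the azure-cognitiveservices-vision-computervision package, assuming an Azure Computer Vision endpoint and key (both placeholders below):

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    # Placeholder endpoint and key for an Azure Computer Vision resource.
    client = ComputerVisionClient(
        "https://<your-resource>.cognitiveservices.azure.com/",
        CognitiveServicesCredentials("<your-key>"),
    )

    with open("4.2002.18661.jpg", "rb") as f:  # hypothetical filename
        description = client.describe_image_in_stream(f, max_candidates=3)

    for caption in description.captions:
        print(f"{caption.text} {caption.confidence:.1%}")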

Text analysis

Amazon

ae
KODVKSEEIA
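
The strings above are Rekognition's reading of text visible in the photograph, reported verbatim. A minimal sketch of the corresponding DetectText call with boto3, under the same assumptions as the label and face sketches above:

    import boto3

    rekognition = boto3.client("rekognition")

    with open("4.2002.18661.jpg", "rb") as f:  # hypothetical filename
        image_bytes = f.read()

    response = rekognition.detect_text(Image={"Bytes": image_bytes})

    # Report only line-level detections; Rekognition also returns word-level entries.
    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":
            print(detection["DetectedText"])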