Human Generated Data

Title

Untitled (man at piano with three wise men behind)

Date

1955

People

Artist: Francis J. Sullivan, American, 1916–1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.18213

Machine Generated Data

Tags

Amazon
created on 2022-03-04

Apparel 98.9
Clothing 98.9
Animal 97.7
Horse 97.7
Mammal 97.7
Person 96.5
Human 96.5
Horse 96.1
Horse 90.1
Person 88.2
Hat 82.8
Horse 81.3
Person 79.6
Furniture 71.5
Person 71.3
Overcoat 67.9
Coat 67.9
Portrait 64
Photo 64
Face 64
Photography 64
Table 62.8
Sun Hat 59.9
Suit 59.2
Shorts 58.6
Female 56.1
Person 55
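The Rekognition list above repeats labels such as "Horse" and "Person" once per detected instance, each with its own confidence. A minimal sketch (hypothetical helper, not part of any catalog pipeline) of collapsing such a list to one entry per label, keeping the highest confidence and dropping low-confidence hits:

```python
# Hypothetical post-processing of machine-generated labels like the
# Amazon list above: keep the best confidence per label name and
# discard anything below a threshold.
def dedupe_labels(labels, threshold=50.0):
    best = {}
    for name, confidence in labels:
        if confidence >= threshold and confidence > best.get(name, 0.0):
            best[name] = confidence
    # Sort by confidence, descending, to mirror the catalog's ordering.
    return sorted(best.items(), key=lambda kv: -kv[1])

# A few entries transcribed from the list above.
labels = [("Horse", 97.7), ("Horse", 96.1), ("Horse", 90.1),
          ("Person", 96.5), ("Person", 55.0), ("Hat", 82.8)]
print(dedupe_labels(labels))  # one entry each for Horse, Person, Hat
```

The threshold of 50.0 is an illustrative default; catalog records often retain everything the service returned.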

Imagga
created on 2022-03-04

percussion instrument 28.6
statue 27.6
musical instrument 27.3
man 26.9
work 19.7
marimba 19.2
mask 18.3
people 17.8
male 17.7
person 16.5
monument 15.9
architecture 15.6
worker 14.2
sky 13.4
travel 13.4
sculpture 12.8
equipment 12.7
religion 12.5
building 12.5
device 12.3
city 11.6
history 11.6
uniform 11.3
tourist 11
working 10.6
construction 10.3
engineer 10.1
vibraphone 10
tourism 9.9
old 9.7
ancient 9.5
men 9.4
landmark 9
circular saw 8.8
helmet 8.7
industry 8.5
safety 8.3
historic 8.2
occupation 8.2
professional 8.1
adult 8
job 8
soldier 7.8
portrait 7.8
military 7.7
war 7.7
god 7.6
historical 7.5
happy 7.5
religious 7.5
fun 7.5
technology 7.4
machine 7.3
protection 7.3
art 7.3
power saw 7.2
lifestyle 7.2
life 7.2
to 7.1

Google
created on 2022-03-04

Microsoft
created on 2022-03-04

outdoor 97.6
person 93
black and white 90.8
drawing 79.2
text 75.2
clothing 66.2
man 62.8

Face analysis

Amazon

Google

AWS Rekognition

Age 45-51
Gender Male, 99.8%
Calm 95.7%
Confused 1.8%
Happy 1.3%
Angry 0.4%
Surprised 0.3%
Sad 0.2%
Disgusted 0.2%
Fear 0.1%

AWS Rekognition

Age 30-40
Gender Male, 71.2%
Surprised 76.5%
Fear 10.8%
Sad 5.1%
Calm 2.9%
Confused 2.3%
Disgusted 1.1%
Happy 0.7%
Angry 0.6%
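Each AWS Rekognition face block above is an emotion-score distribution. A minimal sketch (hypothetical helper, using scores transcribed from the two faces above) of reducing such a distribution to its dominant emotion:

```python
# Hypothetical reduction of a Rekognition-style emotion table to the
# single highest-scoring emotion.
def dominant_emotion(scores):
    return max(scores, key=scores.get)

# Scores transcribed from the two face-analysis blocks above.
face_1 = {"Calm": 95.7, "Confused": 1.8, "Happy": 1.3, "Angry": 0.4,
          "Surprised": 0.3, "Sad": 0.2, "Disgusted": 0.2, "Fear": 0.1}
face_2 = {"Surprised": 76.5, "Fear": 10.8, "Sad": 5.1, "Calm": 2.9,
          "Confused": 2.3, "Disgusted": 1.1, "Happy": 0.7, "Angry": 0.6}
print(dominant_emotion(face_1))  # prints "Calm"
print(dominant_emotion(face_2))  # prints "Surprised"
```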

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Horse 97.7%
Person 96.5%
Hat 82.8%

Captions

Microsoft

a group of people standing next to a truck 65.5%
a group of people on a boat 46.1%
a group of people standing in front of a truck 46%

Text analysis

Amazon

E8
MW1CY

Google

E8
E8