Human Generated Data

Title

Untitled (man at piano outside, looking up)

Date

1956

People

Artist: Francis J. Sullivan, American, 1916-1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.18214

Machine Generated Data

Tags

Amazon
created on 2022-03-04

Person 98.7
Human 98.7
Clothing 95.8
Apparel 95.8
Accessories 94
Accessory 94
Sunglasses 94
Animal 87.5
Horse 87.5
Mammal 87.5
Horse 86.8
Furniture 86.6
Wood 83.4
Person 82
Person 75.9
Table 71.6
Person 62.9
Helmet 59.9
Horse 58.7
Astronaut 56.7
Carpenter 55.7
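
These labels are the kind of output returned by Amazon Rekognition's label-detection API. As a minimal sketch, assuming AWS credentials are already configured and using a hypothetical local file name:

    import boto3

    # Rekognition label detection; assumes AWS credentials are configured
    # (environment variables or ~/.aws/credentials).
    client = boto3.client("rekognition", region_name="us-east-1")

    with open("sullivan_untitled_1956.jpg", "rb") as f:  # hypothetical file
        image_bytes = f.read()

    response = client.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=25,       # upper bound on labels returned
        MinConfidence=55,   # drop labels scored below 55%
    )

    # Print "Label confidence" pairs like the list above.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")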

Imagga
created on 2022-03-04

man 43.6
male 36.1
musical instrument 35.4
percussion instrument 31.5
marimba 29.4
worker 26.8
person 26.1
work 23.6
people 22.8
uniform 18.4
job 17.7
industry 17.1
construction 16.2
building 16.2
professional 15.4
adult 14.7
occupation 14.7
men 14.6
mask 14.2
business 14
engineer 13.4
working 13.2
equipment 12.8
industrial 12.7
device 11.5
safety 11
helmet 10.7
builder 10.6
portrait 10.3
protection 10
contractor 9.7
businessman 9.7
indoors 9.7
site 9.4
machine 9.1
team 8.9
soldier 8.8
home 8.8
happy 8.8
manager 8.4
hand 8.3
circular saw 8.3
clothing 8.2
hat 8.2
kitchen 8
handsome 8
looking 8
boy 7.8
education 7.8
table 7.8
teacher 7.8
military 7.7
office 7.7
development 7.6
room 7.6
steel 7.5
house 7.5
outdoors 7.5
smiling 7.2
black 7.2
statue 7.1
workshop 7.1
face 7.1
architecture 7
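
Imagga's tags come from its REST auto-tagging endpoint. A sketch using the requests library, assuming an Imagga key/secret pair and a publicly reachable image URL (all three are placeholders):

    import requests

    # Imagga v2 auto-tagging; key, secret, and URL are placeholders.
    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.org/sullivan_untitled_1956.jpg"},
        auth=("YOUR_API_KEY", "YOUR_API_SECRET"),
    )
    resp.raise_for_status()

    # Each entry carries a confidence score and the tag text.
    for tag in resp.json()["result"]["tags"]:
        print(f"{tag['tag']['en']} {tag['confidence']:.1f}")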

Microsoft
created on 2022-03-04

text 95.7
clothing 91.4
outdoor 90.8
person 89.3
drawing 86.9
black and white 81.1
man 74.8
hat 64.3

Face analysis

AWS Rekognition

Age 39-47
Gender Male, 99.2%
Calm 98.9%
Confused 0.3%
Happy 0.2%
Surprised 0.2%
Disgusted 0.2%
Sad 0.1%
Angry 0.1%
Fear 0%
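
The age range, gender, and emotion scores above correspond to Rekognition's face-detection output. A sketch under the same assumptions as before (configured credentials, hypothetical file name):

    import boto3

    client = boto3.client("rekognition", region_name="us-east-1")

    with open("sullivan_untitled_1956.jpg", "rb") as f:  # hypothetical file
        image_bytes = f.read()

    # Attributes=["ALL"] requests age range, gender, emotions, etc.
    response = client.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")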

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very likely
Blurred Very unlikely
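
Google Vision reports face attributes as likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages. A sketch with the google-cloud-vision client, assuming application credentials are configured; the file name is hypothetical:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("sullivan_untitled_1956.jpg", "rb") as f:  # hypothetical file
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    # Likelihood fields are enum values, e.g. Likelihood.VERY_UNLIKELY.
    for face in response.face_annotations:
        print("Surprise", face.surprise_likelihood)
        print("Anger", face.anger_likelihood)
        print("Sorrow", face.sorrow_likelihood)
        print("Joy", face.joy_likelihood)
        print("Headwear", face.headwear_likelihood)
        print("Blurred", face.blurred_likelihood)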

Feature analysis

Amazon

Person 98.7%
Horse 87.5%

Captions

Microsoft

a person standing next to a boat 49.3%
a person standing on a boat 44%
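
Competing captions with close confidence scores are typical of the Computer Vision describe endpoint, which returns ranked caption candidates. A sketch with the Azure SDK; the endpoint, key, and image URL are placeholders:

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    # Endpoint and key are placeholders for a provisioned Azure resource.
    client = ComputerVisionClient(
        "https://<your-resource>.cognitiveservices.azure.com/",
        CognitiveServicesCredentials("YOUR_SUBSCRIPTION_KEY"),
    )

    # Ask for up to two ranked caption candidates, as shown above.
    description = client.describe_image(
        "https://example.org/sullivan_untitled_1956.jpg",  # hypothetical URL
        max_candidates=2,
    )

    # Confidence is reported on a 0-1 scale; convert to percent.
    for caption in description.captions:
        print(f"{caption.text} {caption.confidence * 100:.1f}%")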

Text analysis

Amazon

EB
NACOT
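
The detected strings ("EB", "NACOT") are fragments from Rekognition's text-detection API, which often returns partial words from signage or lettering in a photograph. A sketch, same assumptions as above:

    import boto3

    client = boto3.client("rekognition", region_name="us-east-1")

    with open("sullivan_untitled_1956.jpg", "rb") as f:  # hypothetical file
        image_bytes = f.read()

    response = client.detect_text(Image={"Bytes": image_bytes})

    # LINE detections aggregate WORD detections; print both types.
    for detection in response["TextDetections"]:
        print(f"{detection['Type']}: {detection['DetectedText']} "
              f"({detection['Confidence']:.1f}%)")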
