Human Generated Data

Title

Untitled (men by tracks)

Date

c. 1970

People

Artist: Michael Mathers, American, born 1945

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.1826

Machine Generated Data

Tags (confidence scores, %)

Amazon
created on 2022-01-22

Person 99.8
Human 99.8
Clothing 99.7
Apparel 99.7
Person 99.7
Person 99.6
Sitting 98.5
Shoe 92.3
Footwear 92.3
Wood 79
Hat 77.2
Sun Hat 71.6
Face 71.5
Furniture 68.1
Pants 65.7
Shoe 63.1
Bench 61.4
Shoe 56.5
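
These labels come from Amazon Rekognition's label-detection API, with confidences on a 0-100 scale. A minimal sketch of a comparable call with the boto3 client; the file name photo.jpg and the MinConfidence threshold are placeholder assumptions, not values from this record:

    import boto3

    # Sketch of a Rekognition label-detection call; "photo.jpg" and
    # MinConfidence=55 are assumptions, not values from this record.
    client = boto3.client("rekognition")
    with open("photo.jpg", "rb") as f:
        response = client.detect_labels(Image={"Bytes": f.read()}, MinConfidence=55)
    for label in response["Labels"]:
        print(label["Name"], round(label["Confidence"], 1))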

Imagga
created on 2022-01-22

musical instrument 35.8
barrel organ 26.4
scholar 20.3
man 18.8
person 18.6
music 18.1
old 17.4
architecture 16.4
intellectual 16.2
people 16.2
sitting 15.4
building 14.4
piano 14.1
industry 13.6
male 13.5
city 13.3
keyboard instrument 12.3
machine 12.1
industrial 11.8
portrait 11.6
history 11.6
vintage 11.6
steel 11.5
black 11.4
adult 11
work 11
statue 10.7
urban 10.5
device 10.4
home 10.4
men 10.3
stringed instrument 9.8
working 9.7
outdoors 9.7
couple 9.6
construction 9.4
house 9.2
percussion instrument 8.9
happy 8.8
sculpture 8.6
smile 8.5
street 8.3
historic 8.2
typesetting machine 8.2
upright 8.1
religion 8.1
job 8
interior 8
bench 7.9
art 7.8
ancient 7.8
travel 7.7
outside 7.7
two 7.6
site 7.5
iron 7.5
occupation 7.3
metal 7.2
landmark 7.2
equipment 7.1
love 7.1
indoors 7
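
Imagga exposes its tagger as a REST endpoint rather than a cloud SDK. A minimal sketch using the requests library, assuming placeholder credentials and a placeholder image URL:

    import requests

    # Sketch of an Imagga v2 tagging request; the credentials and the
    # image URL are placeholders, not values from this record.
    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.com/photo.jpg"},
        auth=("API_KEY", "API_SECRET"),
    )
    for tag in response.json()["result"]["tags"]:
        print(tag["tag"]["en"], tag["confidence"])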

Google
created on 2022-01-22

Photograph 94.2
Human 90.1
Black 89.9
Black-and-white 86.9
Style 84.1
Monochrome 77.2
Monochrome photography 77
Chair 74.9
Snapshot 74.3
Hat 73.7
Sitting 71.7
Window 70.2
Vintage clothing 69
Room 68.6
Plant 68.2
Conversation 67.4
Wood 67.3
Street 65.4
Stock photography 65.3
Boot 63
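
A comparable list can be produced with the Google Cloud Vision client library, which reports scores on a 0-1 scale. A minimal sketch, assuming a placeholder local file and credentials supplied via the environment:

    from google.cloud import vision

    # Sketch of a Google Cloud Vision label-detection call;
    # "photo.jpg" is a placeholder, not a file from this record.
    client = vision.ImageAnnotatorClient()
    with open("photo.jpg", "rb") as f:
        image = vision.Image(content=f.read())
    response = client.label_detection(image=image)
    for label in response.label_annotations:
        print(label.description, round(label.score * 100, 1))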

Microsoft
created on 2022-01-22

person 97.1
black and white 95
clothing 93
man 79
text 66.4
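
Microsoft's tags correspond to Azure Computer Vision's tagging operation. A minimal sketch with the Python SDK; the endpoint and subscription key are placeholders:

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    # Sketch of an Azure Computer Vision tagging call; the endpoint
    # and key are placeholders, not values from this record.
    client = ComputerVisionClient(
        "https://<resource>.cognitiveservices.azure.com/",
        CognitiveServicesCredentials("SUBSCRIPTION_KEY"),
    )
    with open("photo.jpg", "rb") as f:
        result = client.tag_image_in_stream(f)
    for tag in result.tags:
        print(tag.name, round(tag.confidence * 100, 1))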

Face analysis

AWS Rekognition

Age 57-65
Gender Male, 97.6%
Happy 98.1%
Calm 0.8%
Sad 0.4%
Surprised 0.2%
Confused 0.2%
Angry 0.1%
Fear 0.1%
Disgusted 0.1%

AWS Rekognition

Age 52-60
Gender Male, 99.9%
Calm 96.8%
Sad 1.2%
Confused 0.8%
Surprised 0.7%
Angry 0.4%
Fear 0.1%
Disgusted 0%
Happy 0%
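
The two blocks above are per-face results: Rekognition returns an age range, a gender estimate, and emotion confidences for each face it finds. A minimal sketch of the call, again assuming a placeholder file:

    import boto3

    # Sketch of a Rekognition face-analysis call; "photo.jpg" is a
    # placeholder, not a file from this record.
    client = boto3.client("rekognition")
    with open("photo.jpg", "rb") as f:
        response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])
    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}, Gender {face['Gender']['Value']}")
        for emotion in face["Emotions"]:
            print(f"  {emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")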

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Likely
Blurred Very unlikely
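
Google Vision reports face attributes as likelihood buckets (Very unlikely through Very likely) rather than percentages. A minimal sketch of the corresponding face-detection call, with the same placeholder file:

    from google.cloud import vision

    # Sketch of a Google Cloud Vision face-detection call; prints a few
    # of the likelihood fields shown above. "photo.jpg" is a placeholder.
    client = vision.ImageAnnotatorClient()
    with open("photo.jpg", "rb") as f:
        image = vision.Image(content=f.read())
    response = client.face_detection(image=image)
    for face in response.face_annotations:
        print("Joy:", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear:", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred:", vision.Likelihood(face.blurred_likelihood).name)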

Feature analysis

Amazon

Person 99.8%
Shoe 92.3%
Bench 61.4%
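
These entries are the subset of Rekognition labels that carry per-instance bounding boxes. A minimal sketch extracting them from the same detect_labels response as above:

    import boto3

    # Sketch: per-instance detections (bounding boxes) from detect_labels.
    # "photo.jpg" is a placeholder, not a file from this record.
    client = boto3.client("rekognition")
    with open("photo.jpg", "rb") as f:
        response = client.detect_labels(Image={"Bytes": f.read()})
    for label in response["Labels"]:
        for instance in label["Instances"]:
            box = instance["BoundingBox"]  # Width/Height/Left/Top, relative to image
            print(label["Name"], round(instance["Confidence"], 1), box)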

Captions

Microsoft

a man sitting on a bench 93%
a man that is sitting on a bench 91.2%
a man is sitting on a bench 90.1%
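
The caption candidates come from Azure Computer Vision's describe operation, which returns several phrasings ranked by confidence. A minimal sketch, reusing the placeholder endpoint and key from the tagging example:

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    # Sketch of an Azure Computer Vision describe call; the endpoint
    # and key are placeholders, not values from this record.
    client = ComputerVisionClient(
        "https://<resource>.cognitiveservices.azure.com/",
        CognitiveServicesCredentials("SUBSCRIPTION_KEY"),
    )
    with open("photo.jpg", "rb") as f:
        analysis = client.describe_image_in_stream(f, max_candidates=3)
    for caption in analysis.captions:
        print(caption.text, round(caption.confidence * 100, 1))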

Text analysis

Amazon

NE
RV
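
The strings above are OCR fragments that Rekognition's text-detection API picked out of the photograph. A minimal sketch of the call, with the same placeholder file:

    import boto3

    # Sketch of a Rekognition text-detection (OCR) call; "photo.jpg"
    # is a placeholder, not a file from this record.
    client = boto3.client("rekognition")
    with open("photo.jpg", "rb") as f:
        response = client.detect_text(Image={"Bytes": f.read()})
    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":
            print(detection["DetectedText"], round(detection["Confidence"], 1))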