Human Generated Data

Title

Untitled (man and boy with dead bobcat)

Date

1954

People

Artist: Francis J. Sullivan, American, 1916–1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.18564

Machine Generated Data

Tags

Amazon
created on 2022-03-12

Human 99.7
Person 99.7
Person 99.2
Person 99
Apparel 97.6
Clothing 97.6
Person 97.5
Face 74.8
Chair 74.5
Furniture 74.5
Leisure Activities 74.2
Advertisement 73
Poster 73
Collage 73
Hat 70.2
Portrait 62.1
Photography 62.1
Photo 62.1
Door 60.3
Room 57.6
Indoors 57.6
Musical Instrument 56.2
Banjo 56.2
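
The label/confidence pairs above are the kind of output AWS Rekognition's DetectLabels operation returns. A minimal sketch with boto3, assuming configured AWS credentials; the file name and region are placeholders:

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

# Placeholder file name for a local scan of the photograph.
with open("sullivan_bobcat.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=25,
    MinConfidence=50,
)

# Print label name and confidence, mirroring the "Human 99.7" style above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")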

Imagga
created on 2022-03-12

robe 65.6
garment 52.5
clothing 41.5
man 22.2
negative 20.3
people 19.5
covering 19
consumer goods 18
film 17.7
adult 17.2
person 15.8
male 15.6
business 12.7
locker 12.5
photographic paper 12.2
human 12
dress 11.7
professional 11.7
portrait 11
face 10.6
medical 10.6
technology 10.4
black 10.3
wedding 10.1
fastener 10
fashion 9.8
bride 9.6
love 9.5
work 9.4
mask 9.4
clothes 9.4
model 9.3
room 9.2
building 9.1
businessman 8.8
couple 8.7
gown 8.6
smile 8.5
worker 8.4
photographic equipment 8.1
device 8.1
medicine 7.9
eyes 7.7
modern 7.7
restraint 7.5
doctor 7.5
happy 7.5
occupation 7.3
20s 7.3
surgeon 7.2
looking 7.2
equipment 7.2
indoors 7
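
The Imagga tags above resemble the response of Imagga's v2 image-tagging endpoint. A hedged sketch using the requests library; the API key, secret, and image URL are placeholders, and the response shape should be verified against Imagga's current documentation:

import requests

# Placeholders: substitute real Imagga credentials and a reachable image URL.
API_KEY = "YOUR_IMAGGA_API_KEY"
API_SECRET = "YOUR_IMAGGA_API_SECRET"
IMAGE_URL = "https://example.org/sullivan_bobcat.jpg"

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)
response.raise_for_status()

# Each entry carries an English tag and a confidence score, e.g. "robe 65.6".
for item in response.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")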

Google
created on 2022-03-12

Joint 97.5
Hand 95.9
Photograph 94.5
White 92.2
Black 90.3
Human 89.4
Sleeve 87.2
Black-and-white 86.7
Standing 86.4
Gesture 85.3
Style 84.3
Monochrome 78.3
Monochrome photography 77.6
Snapshot 74.3
Flash photography 73.8
Room 71.2
Elbow 69.2
Font 68.8
Design 68.4
Stock photography 67.5
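
The Google entries above match Cloud Vision API label detection, which reports scores as 0-1 floats (shown here as percentages). A minimal sketch with the google-cloud-vision client library (2.x+), assuming application default credentials; the file name is a placeholder:

from google.cloud import vision

# Assumes GOOGLE_APPLICATION_CREDENTIALS is set; the file name is a placeholder.
client = vision.ImageAnnotatorClient()

with open("sullivan_bobcat.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)

# Scores are floats in [0, 1]; scale to match the listing, e.g. "Joint 97.5".
for label in response.label_annotations:
    print(f"{label.description} {label.score * 100:.1f}")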

Microsoft
created on 2022-03-12

clothing 97
text 96.2
person 90
man 82.5
black and white 78.5
footwear 50.7
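
The Microsoft tags above, and the caption candidates under Captions further down, look like output from the Azure Computer Vision Analyze Image operation with the Tags and Description features. A hedged sketch against the v3.2 REST endpoint; the endpoint, key, and image URL are placeholders:

import requests

# Placeholders: substitute a real Azure endpoint, key, and image URL.
ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
KEY = "YOUR_AZURE_KEY"
IMAGE_URL = "https://example.org/sullivan_bobcat.jpg"

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags,Description"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": IMAGE_URL},
)
response.raise_for_status()
analysis = response.json()

# Tags, e.g. "clothing 97"; confidences come back as 0-1 floats.
for tag in analysis["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")

# Caption candidates, e.g. "a person standing in front of a store 52.8%".
for caption in analysis["description"]["captions"]:
    print(f"{caption['text']} {caption['confidence'] * 100:.1f}%")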

Face analysis

AWS Rekognition

Age 39-47
Gender Male, 79.3%
Happy 88.9%
Calm 4.4%
Sad 3.6%
Surprised 1.2%
Disgusted 0.7%
Fear 0.6%
Angry 0.4%
Confused 0.3%
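
The age range, gender, and emotion percentages above correspond to AWS Rekognition's DetectFaces operation with full attributes. A minimal sketch with boto3; the file name and region are placeholders:

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("sullivan_bobcat.jpg", "rb") as f:
    image_bytes = f.read()

# Attributes=["ALL"] requests age range, gender, and emotion estimates.
response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")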

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
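
The Google Vision rows above are the likelihood buckets (Very unlikely through Very likely) that Cloud Vision face detection reports for joy, sorrow, anger, surprise, headwear, and blur. A minimal sketch, again assuming a 2.x+ client and a placeholder file name:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("sullivan_bobcat.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Likelihood enums map to buckets such as VERY_UNLIKELY or VERY_LIKELY.
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)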

Feature analysis

Amazon

Person 99.7%

Captions

Microsoft

a person standing in front of a store 52.8%
a group of people standing in a room 52.7%
a group of people standing in front of a store 45.5%

Text analysis

Amazon

-
- -
MJI3
KODAKA
*YT3
*Y
tirw
KODVKA
*YT3 1 A A
kirw
11
1 A A
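
The fragments above are raw output of the kind AWS Rekognition's DetectText operation returns for text visible in the print. A minimal sketch with boto3; the file name and region are placeholders:

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("sullivan_bobcat.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

# Each detection is a LINE or WORD with the raw string and a confidence.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])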

Google

YT33
KODVKA
Eirw
KODVKa
A°a
KODVKa KODVKA Eirw YT33 A*2 KODVKA YT33 A°a
A*2
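
The Google strings above match Cloud Vision text detection, where the first annotation is the full detected text and the rest are individual fragments. A minimal sketch with a placeholder file name:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("sullivan_bobcat.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

# The first annotation is the full block; the rest are individual tokens.
for annotation in response.text_annotations:
    print(annotation.description)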