Human Generated Data

Title

Untitled (two men displaying dead deer on trashcan)

Date

1948

People

Artist: Orrion Barger, American, active 1913 - 1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6223

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Human 99.7
Person 99.7
Clothing 99.5
Apparel 99.5
Person 99.3
Face 91
Sleeve 78.5
Shorts 68.6
Person 68
Beard 66
Portrait 65.5
Photo 65.5
Photography 65.5
Animal 63.9
Mammal 63.9
Pet 63.9
Cat 63.9
Long Sleeve 63.9
Shoe 60.9
Footwear 60.9
Shirt 57.9
Home Decor 57.6
Pants 56.4
Hat 56.1
Urban 55.4
Person 49.3
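Each service above returns label/confidence pairs like the list just shown. As a minimal sketch of how such output might be post-processed (the pairs are copied from the Amazon list; the threshold value and function name are illustrative assumptions, not part of this record):

```python
# Minimal sketch: filter machine-generated tags by confidence score.
# (label, score) pairs copied from the Amazon Rekognition list above;
# the 90.0 threshold is an illustrative assumption.
tags = [
    ("Human", 99.7), ("Person", 99.7), ("Clothing", 99.5),
    ("Cat", 63.9), ("Shoe", 60.9), ("Person", 49.3),
]

def confident_tags(pairs, min_confidence=90.0):
    """Return labels whose score meets the threshold, highest score first."""
    kept = [(label, score) for label, score in pairs if score >= min_confidence]
    return [label for label, _ in sorted(kept, key=lambda p: -p[1])]

print(confident_tags(tags))  # → ['Human', 'Person', 'Clothing']
```

Lowering the threshold would admit weaker guesses such as "Cat" (63.9), which here is plainly a mislabel of the deer.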

Clarifai
created on 2023-10-26

people 99.8
canine 99.4
dog 99.3
two 98.5
adult 98.3
man 97.9
wear 97.4
mammal 96.8
monochrome 95.8
one 95.2
actor 93.5
lid 92
bucket 90
veil 89
street 87.6
administration 86.7
three 86.2
portrait 85.3
humor 81.7
outfit 79.4

Imagga
created on 2022-01-22

ashcan 24.6
bin 24.5
container 23.7
man 20.8
device 17.8
cleaner 16
old 14.6
people 14.5
male 14.2
hand 13.7
person 13.5
adult 13.1
men 12
building 10.9
work 10.8
worker 10.6
crutch 10.5
home 10.4
industry 10.2
window 10.1
staff 10
equipment 9.8
city 9.1
business 9.1
industrial 9.1
metal 8.8
fire extinguisher 8.8
urban 8.7
house 8.3
fashion 8.3
safety 8.3
protection 8.2
job 8
interior 8
tool 7.9
indoors 7.9
black 7.8
statue 7.8
travel 7.7
washboard 7.7
holding 7.4
style 7.4
stick 7.2
suit 7.2
religion 7.2
bag 7.1
to 7.1
happiness 7

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

text 98.8
person 94.4
man 93.3
clothing 89.1
black and white 83.7

Color Analysis

Face analysis


AWS Rekognition

Age 23-33
Gender Male, 91.2%
Calm 84.6%
Happy 7.8%
Surprised 4.8%
Sad 0.9%
Disgusted 0.6%
Angry 0.5%
Fear 0.4%
Confused 0.4%

AWS Rekognition

Age 38-46
Gender Male, 99.9%
Sad 91.1%
Angry 2.2%
Confused 1.9%
Calm 1.5%
Disgusted 1.3%
Happy 0.9%
Surprised 0.6%
Fear 0.5%
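The per-face emotion scores above form a rough probability distribution over emotion labels. A minimal sketch of picking the dominant emotion from such a distribution (values copied from the second face; variable names are illustrative):

```python
# Minimal sketch: select the dominant emotion from a Rekognition-style
# score distribution. Scores copied from the second detected face above.
emotions = {
    "Sad": 91.1, "Angry": 2.2, "Confused": 1.9, "Calm": 1.5,
    "Disgusted": 1.3, "Happy": 0.9, "Surprised": 0.6, "Fear": 0.5,
}

# max over keys, ranked by their scores
dominant = max(emotions, key=emotions.get)
print(dominant, emotions[dominant])  # → Sad 91.1
```

Note the two faces disagree sharply (one mostly Calm, one mostly Sad), which is typical of per-face rather than per-image analysis.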

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.7%
Cat 63.9%
Shoe 60.9%

Categories

Captions

Microsoft
created on 2022-01-22

a man holding a dog 45.6%
a man standing next to a dog 45.5%
a man holding a gun 45.4%

Text analysis

Amazon

ISS

Google

ISI YT37A2-
YT37A2-
ISI