Human Generated Data

Title

Untitled (men standing around hanging dead deer)

Date

c. 1950

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.14382

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Person 99.7
Human 99.7
Clothing 97.8
Apparel 97.8
Person 97.6
Person 97.6
Person 96.8
Person 96.6
Wheel 90.8
Machine 90.8
Car 85
Automobile 85
Vehicle 85
Transportation 85
Tire 78.6
Outdoors 74.5
Nature 73
Person 72.3
Plant 72.1
Meal 66.2
Food 66.2
Spoke 65.9
Tree 64.6
Photography 60.6
Photo 60.6
People 57.4
Car Wheel 56.6

Clarifai
created on 2023-10-27

people 99.8
adult 97.9
man 96.5
group 94.5
woman 92.9
group together 92.8
monochrome 91.1
vehicle 89.3
wedding 85.6
administration 84.4
wear 82.9
family 80.1
several 78.8
war 78.7
child 78.5
three 78.4
many 78.4
bride 77.9
leader 73.6
military 70.7

Imagga
created on 2022-01-29

picket fence 53
fence 44.6
barrier 31.8
structure 24
gravestone 21.8
obstruction 21.3
landscape 19.3
sky 19.1
old 18.8
rural 18.5
memorial 17.3
tree 16.9
cemetery 16.8
stone 16.2
winter 16.2
field 15.9
snow 15.3
farm 14.3
negative 13.3
vintage 13.2
grass 12.6
fog 11.6
outdoor 11.5
country 11.4
forest 11.3
scene 11.2
outdoors 11.2
cold 11.2
season 10.9
dark 10.9
trees 10.7
hay 10.6
vehicle 10.3
countryside 10
road 9.9
film 9.9
outside 9.4
clouds 9.3
travel 9.2
ice 9.1
morning 9
scenery 9
car 8.8
autumn 8.8
scenic 8.8
antique 8.7
ancient 8.6
black 8.4
park 8.2
environment 8.2
machine 8.2
danger 8.2
sun 8
history 8
agriculture 7.9
adult 7.8
cloud 7.7
wood 7.5
man 7.4
industrial 7.3
dirty 7.2
sunset 7.2
transportation 7.2
religion 7.2
holiday 7.2

Google
created on 2022-01-29

Microsoft
created on 2022-01-29

text 98.5
outdoor 95.4
black and white 74.9
clothing 74.6
person 73.9
wedding dress 72.1
old 62.7
wheel 50.4
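The tag lists above pair each machine-generated label with a confidence score on a 0–100 scale. As a minimal sketch (not part of this record's pipeline), one might filter such labels by a hypothetical display threshold, here using a few of the Amazon labels shown above:

```python
# Hypothetical sketch: filter machine-generated labels by confidence.
# The labels and scores below are copied from the Amazon tags in this
# record; the 90.0 threshold is an assumption for illustration only.

AMAZON_TAGS = {
    "Person": 99.7,
    "Clothing": 97.8,
    "Wheel": 90.8,
    "Car": 85.0,
    "Tire": 78.6,
    "Meal": 66.2,
    "Car Wheel": 56.6,
}

def filter_tags(tags, threshold=90.0):
    """Keep only labels at or above the confidence threshold."""
    return {label: score for label, score in tags.items() if score >= threshold}

high_confidence = filter_tags(AMAZON_TAGS)
print(high_confidence)
```

With the 90.0 threshold, only "Person", "Clothing", and "Wheel" survive; lowering the threshold admits progressively less certain labels such as "Meal".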

Face analysis

AWS Rekognition

Age 47-53
Gender Male, 99.3%
Happy 77.4%
Calm 15.3%
Confused 2.6%
Fear 1.1%
Sad 1.1%
Angry 1%
Surprised 0.9%
Disgusted 0.7%

AWS Rekognition

Age 45-53
Gender Male, 98.8%
Happy 68.1%
Calm 14.7%
Confused 7%
Sad 5.8%
Surprised 1.3%
Angry 1.3%
Disgusted 1.2%
Fear 0.7%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Wheel
Car
Person 99.7%
Person 97.6%
Person 97.6%
Person 96.8%
Person 96.6%
Person 72.3%
Wheel 90.8%
Car 85%

Text analysis

Amazon

MJIR
MJIR YE37A2 ARDA
ARDA
YE37A2

Google

MJ17 YT37A2 A A
MJ17
YT37A2
A