Human Generated Data

Title

Untitled (two men displaying dead deer on ground)

Date

1948

People

Artist: Orrion Barger, American, active 1913 - 1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6222

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Person 99.7
Human 99.7
Person 98.2
Animal 97.5
Dog 97.5
Canine 97.5
Pet 97.5
Mammal 97.5
Deer 90.8
Wildlife 90.8
Clothing 80.6
Sleeve 80.6
Apparel 80.6
Tarmac 74
Asphalt 74
Coat 72.9
Wolf 59.2
Red Wolf 59.2
Elk 55.2

Imagga
created on 2022-01-22

man 45
person 37.6
male 32.6
people 26.8
laptop 26.4
adult 24.2
sitting 23.2
working 23
patient 22.8
happy 21.9
computer 20.9
professional 20.4
home 19.9
business 18.8
smiling 18.8
indoors 18.4
worker 17.3
casual 16.9
job 16.8
lifestyle 15.9
businessman 15.9
office 15.9
case 15.8
table 15.6
work 14.9
portrait 14.9
sick person 14.8
senior 14.1
men 13.7
smile 13.5
house 12.5
technology 11.9
communication 11.8
handsome 11.6
room 11.4
looking 11.2
mature 11.2
occupation 11
cadaver 10.4
chair 10.3
notebook 10.2
cheerful 9.7
elderly 9.6
teacher 9.4
camera 9.2
student 9.1
fun 9
one 9
day 8.6
happiness 8.6
talking 8.6
shirt 8.4
horizontal 8.4
outdoors 8.3
team 8.1
women 7.9
together 7.9
boy 7.8
clothing 7.8
desk 7.6
reading 7.6
businesspeople 7.6
meeting 7.5
20s 7.3
school 7.2
kitchen 7.2
hair 7.1
uniform 7

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

text 96.9
black and white 91.5
outdoor 85.9
street 83.9
person 51.7

Face analysis

AWS Rekognition

Age 36-44
Gender Male, 99.8%
Surprised 55.5%
Happy 13.4%
Confused 8.3%
Sad 6.6%
Calm 5.8%
Disgusted 4.3%
Fear 3.7%
Angry 2.5%

AWS Rekognition

Age 30-40
Gender Male, 100%
Sad 57.9%
Calm 26.5%
Angry 4.2%
Disgusted 3%
Surprised 2.6%
Confused 2.2%
Happy 2.1%
Fear 1.5%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.7%
Dog 97.5%

Captions

Microsoft

a man riding on the back of a sheep 51.6%
a man holding a dog 47.2%
a man sitting in front of a window 47.1%

Text analysis

Amazon

JSJ
KODOK-SVEELA