Human Generated Data

Title

Untitled (West Memphis, Arkansas)

Date

October 1935, printed later

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Museum Acquisition, P1976.40

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Adult 99.4
Male 99.4
Man 99.4
Person 99.4
Adult 99.2
Male 99.2
Man 99.2
Person 99.2
Clothing 93.7
Hat 93.7
Face 91.5
Head 91.5
Musical Instrument 91.3
Car 89.3
Transportation 89.3
Vehicle 89.3
Guitar 86.4
Machine 79.9
Wheel 79.9
Violin 64
Coat 57.7
Cello 57.2
Guitarist 57.2
Leisure Activities 57.2
Music 57.2
Musician 57.2
Performer 57.2
Photography 56.9
Portrait 56.9
Sun Hat 55.7
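
The tags above were presumably produced by Amazon Rekognition's label detection; the repeated labels with slightly different confidences (Adult, Male, Man, Person at 99.4 and 99.2) likely correspond to separate detected instances in the photograph. A minimal sketch of such a request with boto3 follows; the bucket, object key, and thresholds are placeholders, not the museum's actual configuration.

import boto3

# Rekognition client; region is an assumption.
rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "P1976_40.jpg"}},
    MaxLabels=50,
    MinConfidence=55,  # the list above bottoms out around 55-57%
)

# Print each label with its confidence on a 0-100 scale, as shown above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")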

Clarifai
created on 2018-05-10

people 99.9
group 99
group together 98.6
adult 98.5
man 97.8
administration 96.7
music 96.3
musician 95.7
two 95.4
leader 95.1
vehicle 94.9
wear 94.4
several 93.9
monochrome 93.7
three 92.8
military 91.1
four 90
outfit 89
portrait 88.4
instrument 88
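
A hedged sketch of how tags like these could be requested from Clarifai's general model over its v2 REST API. The model ID, API key, and image URL are placeholders, and the 2018 tags above were presumably generated by an earlier revision of the model.

import requests

MODEL_ID = "general-image-recognition"  # assumed public model ID
resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": "Key YOUR_API_KEY"},
    json={"inputs": [{"data": {"image": {"url": "https://example.org/P1976_40.jpg"}}}]},
    timeout=30,
)

# Concept values are returned on a 0-1 scale; scale to percentages as above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")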

Imagga
created on 2023-10-06

mask 52
man 49.1
male 37.6
person 35.2
adult 28.5
portrait 27.2
people 22.9
protective covering 20.4
danger 20
face 19.9
safety 19.3
looking 17.6
covering 17.4
black 17.4
military 17.4
protection 17.3
human 17.2
one 16.4
suit 16.4
men 16.3
professional 15.3
oxygen mask 15
security 14.7
business 14.6
war 14.4
device 14.2
goggles 14.2
businessman 14.1
guy 13
toxic 12.7
attractive 12.6
uniform 12.6
gas 12.5
pollution 12.5
holding 12.4
breathing device 12.4
soldier 11.7
protective 11.7
weapon 11.7
equipment 11.7
dangerous 11.4
smile 11.4
happy 11.3
gun 11
work 11
hand 10.8
expression 10.2
model 10.1
modern 9.8
handsome 9.8
radiation 9.8
criminal 9.8
army 9.7
driver 9.7
nuclear 9.7
aviator 9.6
sunglasses 9.2
air 9.2
ecology 9.2
sport 9.2
studio 9.1
respirator 9.1
fashion 9
photographer 9
technology 8.9
camouflage 8.8
disaster 8.8
warrior 8.8
chemical 8.7
tie 8.5
clothing 8.5
shirt 8.5
dark 8.4
cool 8
conceptual 7.9
love 7.9
forces 7.9
armed 7.9
radioactive 7.9
couple 7.8
standing 7.8
crime 7.8
car 7.6
casual 7.6
wearing 7.6
head 7.6
environment 7.4
glasses 7.4
happiness 7.1
look 7
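
A minimal sketch of an Imagga tagging call against its v2 REST API, which returns confidences on a 0-100 scale like the list above. The API key, secret, and image URL are placeholders.

import requests

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/P1976_40.jpg"},
    auth=("IMAGGA_API_KEY", "IMAGGA_API_SECRET"),  # HTTP Basic auth
    timeout=30,
)

# Each entry pairs an English tag with a confidence score.
for entry in resp.json()["result"]["tags"]:
    print(f"{entry['tag']['en']} {entry['confidence']:.1f}")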

Microsoft
created on 2018-05-10

person 97.5
man 95
old 87
bowed instrument 80.5
white 70.3
suit 65.3
vintage 36
cello 22.6
bass fiddle 13.1
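
A hedged sketch using the Azure Computer Vision Python SDK; the 2018 tags above likely came from an earlier version of Microsoft's vision service. The endpoint, key, and image URL are placeholders.

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://your-resource.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("YOUR_SUBSCRIPTION_KEY"),
)

# Tag the image by URL; confidences come back on a 0-1 scale.
result = client.tag_image("https://example.org/P1976_40.jpg")
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")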

Face analysis

AWS Rekognition

Age 38-46
Gender Male, 99.2%
Sad 99.9%
Calm 17.7%
Surprised 6.5%
Fear 6.1%
Angry 0.9%
Disgusted 0.5%
Happy 0.3%
Confused 0.3%

AWS Rekognition

Age 23-31
Gender Female, 56.4%
Calm 98.6%
Surprised 6.3%
Fear 5.9%
Sad 2.5%
Confused 0.1%
Angry 0%
Happy 0%
Disgusted 0%

AWS Rekognition

Age 18-26
Gender Male, 96.6%
Surprised 80.4%
Calm 43.2%
Fear 6.8%
Sad 2.5%
Happy 2%
Confused 1.6%
Disgusted 1.2%
Angry 0.7%
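
The three blocks above correspond to three faces detected by AWS Rekognition. A minimal sketch of how the age range, gender, and emotion estimates might be obtained with boto3 follows; the bucket and object key are placeholders.

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "P1976_40.jpg"}},
    Attributes=["ALL"],  # required for AgeRange, Gender, and Emotions
)

# One FaceDetails entry per detected face, as in the blocks above.
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")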

Microsoft Cognitive Services

Age 34
Gender Male

Microsoft Cognitive Services

Age 43
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely
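
The likelihood buckets above ("Very unlikely", "Very likely") match the enum values returned per face by Google Cloud Vision's face detection. A hedged sketch follows; the local image path is a placeholder.

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("P1976_40.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each face annotation carries likelihood enums for the attributes above.
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)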

Feature analysis

Amazon

Adult 99.4%
Male 99.4%
Man 99.4%
Person 99.4%
Car 89.3%
Wheel 79.9%

Text analysis

Amazon

CURB
1
ERVIC
but
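
A minimal sketch of Amazon Rekognition's text detection, which could yield fragments like those above from signage in the photograph. The bucket and object key are placeholders.

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_text(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "P1976_40.jpg"}}
)

# Print word-level detections; LINE-level results are also returned.
for detection in response["TextDetections"]:
    if detection["Type"] == "WORD":
        print(detection["DetectedText"])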