Human Generated Data

Title

Untitled (Maynardville, Tennessee)

Date

October 1935, printed later

People

Artist: Ben Shahn, American, 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Museum Acquisition, P1970.3367

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-05

Clothing 100
Cap 100
Baseball Cap 100
Coat 100
Adult 99.6
Male 99.6
Man 99.6
Person 99.6
Adult 99.3
Male 99.3
Man 99.3
Person 99.3
Adult 98.8
Male 98.8
Man 98.8
Person 98.8
Face 95.6
Head 95.6
Photography 95.6
Portrait 95.6
Jacket 94.5
Hat 77
Pants 57.5
Sun Hat 57.5
Body Part 56.5
Finger 56.5
Hand 56.5
Vest 55.1
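
The label-and-confidence pairs above are the kind of output returned by Amazon Rekognition's DetectLabels API. The following is a minimal sketch of such a call, assuming boto3 with AWS credentials already configured; the local file name shahn_maynardville.jpg is hypothetical.

    import boto3

    # Rekognition client; assumes AWS credentials are configured in the environment.
    rekognition = boto3.client("rekognition")

    # Hypothetical local copy of the print; any JPEG or PNG bytes work.
    with open("shahn_maynardville.jpg", "rb") as f:
        image_bytes = f.read()

    # DetectLabels returns label names with confidence scores (0-100),
    # comparable to the "Clothing 100", "Cap 100", ... values listed above.
    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=30,
        MinConfidence=50,
    )

    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')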

Clarifai
created on 2018-05-10

people 99.7
man 99
military 97.6
adult 97.5
portrait 97.4
lid 94.7
war 94.7
uniform 94.5
two 94.4
monochrome 93.5
soldier 93.2
administration 90.6
wear 89.6
army 88.9
cap 88.8
actor 86
military uniform 85.8
retro 84.2
veil 82.5
three 81.5

Imagga
created on 2023-10-05

man 47.7
male 44.7
military uniform 44.5
uniform 42.7
person 42.1
hat 37.4
senior 35.6
clothing 32.3
portrait 29.1
adult 29.1
people 27.9
elderly 26.8
old 25.8
outdoors 23.9
mature 22.3
face 21.3
men 19.7
consumer goods 19.2
covering 19.2
happy 18.8
washboard 18.4
grandfather 16
looking 16
couple 15.7
standing 15.6
casual 15.2
handsome 15.1
shirt 14.9
device 14.9
aged 14.5
guy 14
old-timer 13.7
together 13.1
smile 12.8
beard 12.7
hair 12.7
lifestyle 12.3
engineer 12.3
hand 12.1
happiness 11.7
older 11.6
retired 11.6
smiling 11.6
job 11.5
attractive 11.2
oriental 11.1
industry 11.1
two 11
retirement 10.6
scholar 10.4
model 10.1
head 10.1
human 9.7
one 9.7
construction 9.4
outside 9.4
expression 9.4
glasses 9.3
intellectual 9.2
occupation 9.2
holding 9.1
cowboy 9
active 9
activity 9
cheerful 8.9
farmer 8.8
commodity 8.6
age 8.6
leisure 8.3
park 8.2
protection 8.2
style 8.2
worker 8.1
look 7.9
70s 7.9
grandmother 7.8
western 7.7
married 7.7
serious 7.6
wearing 7.6
professional 7.6
lady 7.3
pose 7.2
gray 7.2
private 7.2
women 7.1
posing 7.1
work 7.1

Microsoft
created on 2018-05-10

outdoor 99.1
person 97.9
man 91.2
military uniform 86
people 70.3
old 57.9
older 20.9

Color Analysis

Face analysis

AWS Rekognition

Age 23-33
Gender Male, 100%
Calm 88.2%
Confused 8.5%
Surprised 6.4%
Fear 5.9%
Sad 2.3%
Angry 1.7%
Disgusted 0.5%
Happy 0.1%

AWS Rekognition

Age 20-28
Gender Male, 92.4%
Calm 98.1%
Surprised 6.4%
Fear 5.9%
Sad 2.2%
Confused 0.8%
Angry 0.4%
Happy 0.1%
Disgusted 0.1%

AWS Rekognition

Age 23-33
Gender Male, 99.8%
Calm 84.7%
Surprised 9.2%
Fear 6.1%
Confused 3.4%
Angry 3.2%
Sad 3%
Disgusted 0.5%
Happy 0.3%
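
Each AWS Rekognition block above (one per detected face) reports an estimated age range, a gender guess, and per-emotion confidences. Below is a minimal sketch of the corresponding DetectFaces call, assuming boto3 with configured credentials; the file name is hypothetical.

    import boto3

    rekognition = boto3.client("rekognition")

    with open("shahn_maynardville.jpg", "rb") as f:  # hypothetical local file
        image_bytes = f.read()

    # Attributes=["ALL"] requests age range, gender, and emotions for every face found.
    response = rekognition.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')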

Microsoft Cognitive Services

Age 16
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely
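
Google Vision reports face attributes as likelihood buckets (Very unlikely through Very likely) rather than percentages. A minimal sketch using the google-cloud-vision client follows, assuming Google Cloud credentials are configured; the file name is hypothetical.

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("shahn_maynardville.jpg", "rb") as f:  # hypothetical local file
        image = vision.Image(content=f.read())

    # face_detection returns likelihood enums (VERY_UNLIKELY ... VERY_LIKELY)
    # for surprise, anger, sorrow, joy, headwear, and blur.
    response = client.face_detection(image=image)
    for face in response.face_annotations:
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)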

Feature analysis

Amazon

Adult 99.6%
Male 99.6%
Man 99.6%
Person 99.6%
Jacket 94.5%
Hat 77%

Text analysis

Amazon

MILDEN
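
The single detected string above is typical of Amazon Rekognition's DetectText output, which returns detected lines and words with confidence scores. A minimal sketch follows, under the same assumptions as the earlier Rekognition examples (boto3 credentials configured, hypothetical file name).

    import boto3

    rekognition = boto3.client("rekognition")

    with open("shahn_maynardville.jpg", "rb") as f:  # hypothetical local file
        image_bytes = f.read()

    # DetectText returns LINE and WORD detections with confidence scores.
    response = rekognition.detect_text(Image={"Bytes": image_bytes})
    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":
            print(detection["DetectedText"], f'{detection["Confidence"]:.1f}%')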