Human Generated Data

Title

Untitled ("Hooverville," Circleville, Ohio)

Date

July 1938-August 1938, printed later

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Museum Acquisition, P1970.3609

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Face 100
Head 100
Photography 100
Portrait 100
Body Part 99.9
Finger 99.9
Hand 99.9
Person 99.6
Adult 99.6
Male 99.6
Man 99.6
Person 99.4
Adult 99.4
Male 99.4
Man 99.4
Person 98.4
Male 98.4
Boy 98.4
Child 98.4
Car 96.9
Transportation 96.9
Vehicle 96.9
Wood 94.9
Clothing 94.3
Coat 94.3
People 89
Machine 76.8
Wheel 76.8
Firearm 63
Weapon 63
Hat 57.9
Beard 57.2
Cap 56.8
License Plate 56.3
Carpenter 55.9
Gun 55.6
Rifle 55.6
Spoke 55.5
Jacket 55.5
Outdoors 55.4

Clarifai
created on 2018-05-10

people 99.9
group 99.7
group together 98.2
adult 97.8
man 97.5
four 95.8
three 95.8
portrait 95.4
woman 94.2
several 94.1
five 92.3
child 91.8
elderly 89.4
facial expression 87.8
wear 87.2
two 86.6
administration 85.1
veil 84.6
many 84.2
recreation 82.8

Imagga
created on 2023-10-06

grandfather 38.5
old 33.4
senior 31.9
man 28.2
grandma 27.1
elderly 24.9
people 24.5
statue 24.1
kin 23.9
male 23.4
retired 23.3
portrait 20.7
retirement 20.2
person 20.1
mature 19.5
couple 19.2
home 19.1
happy 18.8
pensioner 16.6
sculpture 16.5
adult 16.3
love 15.8
camera 15.7
face 13.5
married 13.4
wife 13.3
70s 12.8
park 12.5
husband 12.4
outdoors 11.9
happiness 11.8
smiling 11.6
together 11.4
mother 11.3
sitting 11.2
world 11.1
aged 10.9
family 10.7
ancient 10.4
casual 10.2
smile 10
religion 9.9
art 9.8
seniors 9.8
grandmother 9.8
lady 9.7
older 9.7
one 9.7
looking 9.6
lifestyle 9.4
architecture 9.4
parent 9.3
head 9.2
sibling 9.2
leisure 9.1
hair 8.7
marriage 8.5
stone 8.4
outdoor 8.4
glasses 8.3
building 8.2
closeup 8.1
history 8
half length 7.8
aging 7.7
age 7.6
hand 7.6
enjoying 7.6
laughing 7.6
horizontal 7.5
enjoyment 7.5
monument 7.5
vintage 7.4
historic 7.3
gray 7.2
active 7.2
women 7.1

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

person 98.5
outdoor 97.5
text 96.5
man 90.9
old 69.9

Color Analysis

Face analysis

AWS Rekognition

Age 11-19
Gender Male, 99.8%
Calm 98.5%
Surprised 6.3%
Fear 5.9%
Sad 2.3%
Happy 0.3%
Confused 0.3%
Disgusted 0.2%
Angry 0.1%

AWS Rekognition

Age 16-22
Gender Male, 94.1%
Happy 64.9%
Fear 9%
Calm 8.5%
Sad 7.6%
Surprised 6.9%
Confused 4%
Angry 3.4%
Disgusted 2%

Microsoft Cognitive Services

Age 27
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Possible
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%
Adult 99.6%
Male 99.6%
Man 99.6%
Boy 98.4%
Child 98.4%
Car 96.9%
Wheel 76.8%

Categories

Imagga

people portraits 91.7%
paintings art 4.7%

Captions

Microsoft
created on 2018-05-10

a vintage photo of a man 93.3%
an old photo of a man 93.2%
old photo of a man 91.7%

Text analysis

Amazon

N41763

Google

N476
N476