Human Generated Data

Title

Untitled ("Hooverville," Circleville, Ohio)

Date

July 1938-August 1938

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.2776

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-07

Face 100
Head 100
Photography 100
Portrait 100
Body Part 99.8
Finger 99.8
Hand 99.8
Person 99.5
Adult 99.5
Male 99.5
Man 99.5
Person 99.3
Person 97.6
Male 97.6
Boy 97.6
Child 97.6
Car 92.5
Transportation 92.5
Vehicle 92.5
People 89.1
Wood 81.8
Machine 68.8
Wheel 68.8
Firearm 68.2
Weapon 68.2
Outdoors 61.3
License Plate 56
Gun 55.8
Rifle 55.8
Clothing 55.8
Coat 55.8
Shorts 55.3

Clarifai
created on 2018-05-10

people 99.9
group 99.1
group together 97.6
adult 97.5
man 97.4
three 95.1
woman 95
portrait 94
two 92
several 91.3
child 90.9
four 90.8
administration 88.9
five 85.4
wear 85.2
war 84.9
elderly 82.5
facial expression 80
veil 79.8
recreation 79.4

Imagga
created on 2023-10-07

kin 55.1
senior 37.5
man 30.2
old 29.9
people 29
couple 27.9
elderly 27.8
male 26.3
retired 26.2
happy 24.4
person 24.1
home 23.9
grandfather 23
retirement 22.1
mature 21.4
love 20.5
family 20.5
outdoors 19.4
grandma 18.8
together 18.4
husband 18.1
wife 18
smile 17.1
smiling 16.6
lifestyle 15.9
park 15.6
portrait 15.5
married 15.3
adult 15
happiness 14.9
camera 14.8
70s 13.8
active 13.5
child 13.4
mother 13.1
sibling 13
statue 12.8
pensioner 12.7
fun 12
hair 11.9
women 11.9
aged 11.8
older 11.7
sculpture 11.6
enjoying 11.4
two 11
leisure 10.8
face 10.7
sitting 10.3
parent 10.3
seniors 9.8
grandmother 9.8
lady 9.7
affection 9.7
world 9.6
casual 9.3
girls 9.1
gray 9
blond 9
looking 8.8
ancient 8.6
men 8.6
marriage 8.5
head 8.4
glasses 8.3
playing 8.2
nursing home 7.9
indoors 7.9
gray hair 7.9
day 7.8
architecture 7.8
aging 7.7
laughing 7.6
friends 7.5
summer 7.1

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

person 98.7
outdoor 96.1
old 80.5

Face analysis

AWS Rekognition

Age 6-14
Gender Male, 99.8%
Calm 98.3%
Surprised 6.3%
Fear 5.9%
Sad 2.3%
Confused 0.6%
Happy 0.3%
Disgusted 0.1%
Angry 0.1%

AWS Rekognition

Age 14-22
Gender Male, 98%
Happy 59.7%
Sad 34.3%
Fear 7.5%
Surprised 6.5%
Confused 5.4%
Calm 4%
Angry 2.9%
Disgusted 1.4%

Microsoft Cognitive Services

Age 46
Gender Male

Microsoft Cognitive Services

Age 33
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.5%
Adult 99.5%
Male 99.5%
Man 99.5%
Boy 97.6%
Child 97.6%
Car 92.5%
Wheel 68.8%

Categories

Imagga

people portraits 72.8%
paintings art 25.3%

Captions

Microsoft
created on 2018-05-10

an old photo of a man 92.7%
old photo of a man 90.6%
an old photo of a boy 74.5%

Text analysis

Amazon

H41763