Human Generated Data

Title

Untitled ("Hooverville," Circleville, Ohio)

Date

July 1938-August 1938

People

Artist: Ben Shahn, American, 1898-1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.2708

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Face 99.9
Head 99.9
Photography 99.9
Portrait 99.9
Person 98.2
Boy 98.2
Child 98.2
Male 98.2
Person 98.2
Male 98.2
Adult 98.2
Man 98.2
Furniture 85.8
Smoke 79.9
Chair 74.3
Body Part 61
Finger 61
Hand 61
Dining Table 56.5
Table 56.5
Reading 55.6
Neck 55.5
Architecture 55.5
Building 55.5
Dining Room 55.5
Indoors 55.5
Room 55.5
Sitting 55.4
Armchair 55.2
Smoking 55
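
The label/score pairs above are the kind of output returned by AWS Rekognition's label detection; each score is a confidence value on a 0-100 scale. The following is a minimal sketch of how such tags can be retrieved with the boto3 SDK; the image file name, region, and confidence threshold are illustrative placeholders, not part of this record.

    # Minimal sketch: label/confidence pairs from AWS Rekognition via boto3.
    # The file name, region, and threshold below are illustrative placeholders.
    import boto3

    def tag_image(path, min_confidence=55.0):
        client = boto3.client("rekognition", region_name="us-east-1")
        with open(path, "rb") as f:
            image_bytes = f.read()
        response = client.detect_labels(
            Image={"Bytes": image_bytes},
            MinConfidence=min_confidence,
        )
        # Each label carries a name and a confidence score (0-100),
        # matching the tag/score pairs listed above.
        return [(label["Name"], round(label["Confidence"], 1))
                for label in response["Labels"]]

    if __name__ == "__main__":
        for name, score in tag_image("hooverville_circleville.jpg"):
            print(name, score)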

Clarifai
created on 2018-05-10

people 100
adult 98.3
two 97.6
group 97.4
man 96
one 95.7
child 94.6
three 92.7
administration 92.1
room 91.8
portrait 91.6
woman 91.2
sit 90.9
four 90.7
furniture 89.8
group together 89.4
indoors 88
actor 87.8
wear 85.1
facial expression 82.7

Imagga
created on 2023-10-06

call 28.6
punching bag 28.1
adult 26.5
man 25.5
people 23.4
person 21.6
male 20.8
portrait 20.7
black 18.3
hair 18.2
couple 17.4
game equipment 16.8
sexy 16.1
love 15.8
human 15.7
lifestyle 15.2
face 14.9
happy 14.4
pretty 14
attractive 14
hairdresser 13.5
sitting 12.9
child 12.6
equipment 12.3
fashion 12.1
expression 11.9
sensuality 11.8
posing 11.6
one 11.2
body 11.2
women 11.1
smile 10.7
hand 10.6
happiness 10.2
model 10.1
sensual 10
looking 9.6
home 9.6
smiling 9.4
two 9.3
room 9.1
fun 9
together 8.8
casual 8.5
skin 8.5
head 8.4
relaxation 8.4
dark 8.4
girls 8.2
device 8.2
style 8.2
lady 8.1
indoors 7.9
emotion 7.4
clothing 7.3
alone 7.3
glass 7.3
dress 7.2
family 7.1
kid 7.1
interior 7.1
look 7
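
The Imagga tags above likewise pair an English keyword with a confidence score. Below is a minimal sketch of a request against Imagga's v2 tagging endpoint, assuming HTTP basic auth with an API key and secret; the credentials and image URL are placeholders.

    # Minimal sketch: requesting tags from the Imagga v2 tagging endpoint.
    # API credentials and the image URL are illustrative placeholders.
    import requests

    API_KEY = "your_api_key"        # placeholder
    API_SECRET = "your_api_secret"  # placeholder

    def imagga_tags(image_url):
        response = requests.get(
            "https://api.imagga.com/v2/tags",
            params={"image_url": image_url},
            auth=(API_KEY, API_SECRET),
            timeout=30,
        )
        response.raise_for_status()
        # Each entry pairs an English tag with a confidence score (0-100).
        return [(t["tag"]["en"], round(t["confidence"], 1))
                for t in response.json()["result"]["tags"]]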

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

person 88.9

Face analysis

Amazon

AWS Rekognition

Age 6-14
Gender Male, 99.5%
Sad 99.1%
Fear 18.8%
Calm 11.4%
Surprised 6.4%
Angry 5.8%
Confused 2.1%
Happy 0.7%
Disgusted 0.7%

AWS Rekognition

Age 7-17
Gender Male, 99.5%
Sad 100%
Surprised 6.3%
Fear 6.3%
Angry 0.8%
Calm 0.2%
Happy 0.2%
Disgusted 0.2%
Confused 0.1%
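
The two blocks above report per-face estimates (age range, gender, and emotion confidences) for the faces AWS Rekognition detected in the photograph. A minimal sketch of how such estimates can be obtained with boto3's detect_faces call follows; the file name and region are placeholders.

    # Minimal sketch: per-face age, gender, and emotion estimates from
    # AWS Rekognition face detection. File name and region are placeholders.
    import boto3

    def analyze_faces(path):
        client = boto3.client("rekognition", region_name="us-east-1")
        with open(path, "rb") as f:
            image_bytes = f.read()
        response = client.detect_faces(
            Image={"Bytes": image_bytes},
            Attributes=["ALL"],  # include age range, gender, and emotions
        )
        results = []
        for face in response["FaceDetails"]:
            results.append({
                "age": (face["AgeRange"]["Low"], face["AgeRange"]["High"]),
                "gender": (face["Gender"]["Value"], face["Gender"]["Confidence"]),
                # Emotions come back with confidences, as in the lists above.
                "emotions": sorted(
                    ((e["Type"], e["Confidence"]) for e in face["Emotions"]),
                    key=lambda x: -x[1],
                ),
            })
        return results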

Feature analysis

Amazon

Person 98.2%
Boy 98.2%
Child 98.2%
Male 98.2%
Adult 98.2%
Man 98.2%
