Human Generated Data

Title

Untitled (Arkansas)

Date

October 1935

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1603

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Face 100
Head 100
Photography 100
Portrait 100
Person 99.4
Adult 99.4
Male 99.4
Man 99.4
Architecture 99.4
Building 99.4
House 99.4
Housing 99.4
Staircase 99.4
Clothing 98.4
Pants 98.4
Person 98.1
Wood 86.8
Sitting 86.7
Furniture 74.5
Door 72.7
Chair 57.9
People 57.2
Art 56.9
Painting 56.9
Blouse 56.5
Animal 56
Cat 56
Mammal 56
Pet 56
Dress 55.7
Lady 55.7
Happy 55.7
Smile 55.7
Outdoors 55.5
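
The Amazon tags above are image labels with confidence scores of the kind returned by the AWS Rekognition DetectLabels API. A minimal sketch of such a call is shown below, assuming the photograph is available as a local JPEG (the filename "untitled_arkansas.jpg" is a placeholder) and that AWS credentials are configured for boto3; this illustrates the general mechanism, not the museum's actual tagging pipeline.

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

# Placeholder filename; any local JPEG/PNG copy of the photograph would do.
with open("untitled_arkansas.jpg", "rb") as f:
    image_bytes = f.read()

# MinConfidence=55 roughly matches the lowest scores listed above.
response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=55,
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")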

Clarifai
created on 2018-05-11

people 100
adult 99.8
two 99.2
sit 99.1
woman 98.7
portrait 98.2
man 98.1
furniture 97.7
one 97.6
administration 97.2
seat 96.8
group 96.8
sitting 96.2
wear 95.8
indoors 95.5
room 94.8
three 94.3
child 91.3
medical practitioner 90.8
music 90.1

Imagga
created on 2023-10-06

adult 27.2
person 24.3
people 22.3
man 21.5
male 19.7
portrait 18.1
hair 17.4
fashion 17.3
attractive 16.8
black 16.4
mother 15.1
pretty 14.7
sexy 14.5
dress 14.5
model 14
happy 13.8
human 13.5
face 13.5
old 12.5
child 12.3
parent 12.1
lady 11.4
one 11.2
posing 10.7
couple 10.5
window 10.3
room 10.3
youth 10.2
smiling 10.1
cute 10
sensuality 10
smile 10
vintage 9.9
clothing 9.8
home 9.6
women 9.5
love 9.5
sitting 9.4
happiness 9.4
city 9.1
alone 9.1
make 9.1
brunette 8.7
ancient 8.7
relationship 8.4
elegance 8.4
house 8.4
makeup 8.2
teenager 8.2
family 8
father 8
lifestyle 8
scholar 7.9
urban 7.9
sepia 7.8
dad 7.7
world 7.6
two 7.6
teen 7.4
body 7.2
art 7.2
romantic 7.1
interior 7.1

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

person 97
outdoor 92.6

Face analysis

AWS Rekognition

Age 18-26
Gender Female, 100%
Happy 71.5%
Calm 16.1%
Surprised 9.8%
Fear 6.2%
Disgusted 2.6%
Sad 2.3%
Confused 2.2%
Angry 0.8%

AWS Rekognition

Age 2-10
Gender Female, 97.3%
Calm 92.1%
Surprised 6.4%
Fear 5.9%
Sad 5.4%
Confused 0.4%
Angry 0.2%
Disgusted 0.1%
Happy 0.1%
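
The age range, gender, and emotion scores above correspond to the per-face attributes returned by the AWS Rekognition DetectFaces API when all facial attributes are requested. A minimal sketch follows, under the same assumptions as the label-detection example (placeholder filename, AWS credentials configured for boto3).

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("untitled_arkansas.jpg", "rb") as f:
    image_bytes = f.read()

# Attributes=["ALL"] adds AgeRange, Gender, and Emotions to each FaceDetail.
response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")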

Microsoft Cognitive Services

Age 42
Gender Male

Microsoft Cognitive Services

Age 5
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
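
The Google Vision rows report per-face likelihoods on the service's enum scale (VERY_UNLIKELY through VERY_LIKELY). A minimal sketch using a recent google-cloud-vision Python client is shown below, again with a placeholder filename and assuming Google Cloud credentials are configured; it is only an illustration of how such results are obtained.

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("untitled_arkansas.jpg", "rb") as f:
    content = f.read()

response = client.face_detection(image=vision.Image(content=content))

# Each face annotation carries likelihood enums for joy, sorrow, anger,
# surprise, headwear, and blur, matching the rows listed above.
for face in response.face_annotations:
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)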

Feature analysis

Amazon

Person 99.4%
Adult 99.4%
Male 99.4%
Man 99.4%
