Human Generated Data

Title

Untitled (relief family, Lancaster, Ohio)

Date

August 1938

People

Artist: Ben Shahn, American, 1898-1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.19

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-05

Adult 98.9
Female 98.9
Person 98.9
Woman 98.9
Adult 98.8
Person 98.8
Male 98.8
Man 98.8
Person 98.6
Furniture 98.4
Face 95.6
Head 95.6
Photography 95.6
Portrait 95.6
Clothing 91.2
Footwear 91.2
Shoe 91.2
Architecture 87.4
Building 87.4
House 87.4
Housing 87.4
Porch 87.4
Brick 85.7
Chair 77.1
Shorts 75.4
Rocking Chair 69.7
Shoe 63.7
Shoe 63.1
Dress 55

Clarifai
created on 2018-05-11

people 100
group 98.4
child 98.3
adult 98.3
woman 97.9
furniture 96.9
two 96.4
seat 93.4
three 93
sit 92.5
man 92.5
offspring 92.3
chair 91.6
group together 91.1
home 89.3
recreation 89.2
family 89.2
room 88.9
wear 88.1
facial expression 87.8

Imagga
created on 2023-10-05

chair 36.2
seat 25.3
wheelchair 24.8
man 22.8
male 19.3
washboard 18.3
barbershop 18
person 16.9
people 16.7
wheeled vehicle 16.4
shop 15.8
musical instrument 15.5
device 15.4
furniture 15.4
portrait 14.2
family 14.2
black 13.8
rocking chair 13
couple 12.2
kin 12.1
child 12
room 12
shopping cart 11.9
adult 11.9
old 11.8
happy 11.3
accordion 11.3
handcart 11.1
mercantile establishment 11.1
love 11
lifestyle 10.8
tricycle 9.9
fashion 9.8
outdoors 9.7
sitting 9.4
day 9.4
mother 9.2
silhouette 9.1
keyboard instrument 9
home 8.8
classroom 8.7
boy 8.7
vehicle 8.7
happiness 8.6
building 8.6
men 8.6
dark 8.3
window 8.2
park 8.2
wind instrument 8.1
business 7.9
architecture 7.8
retired 7.8
married 7.7
youth 7.7
bench 7.6
conveyance 7.6
leisure 7.5
furnishing 7.5
holding 7.4
care 7.4
place of business 7.4
dirty 7.2
history 7.2
women 7.1
interior 7.1
businessman 7.1

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

window 96.2
black 90.4
white 86.3
old 58.9
vintage 31.2
family 15.7

Color Analysis

Face analysis

AWS Rekognition

Age 22-30
Gender Female, 71%
Sad 80.7%
Calm 59.9%
Surprised 6.4%
Fear 6.1%
Confused 1.1%
Angry 0.9%
Disgusted 0.4%
Happy 0.3%

AWS Rekognition

Age 27-37
Gender Female, 73.7%
Calm 93%
Surprised 6.6%
Fear 5.9%
Angry 5.5%
Sad 2.3%
Disgusted 0.2%
Confused 0.1%
Happy 0%

AWS Rekognition

Age 30-40
Gender Male, 99.5%
Calm 83.6%
Fear 6.8%
Surprised 6.7%
Sad 5.9%
Angry 1.9%
Confused 1.6%
Disgusted 1.3%
Happy 1%

AWS Rekognition

Age 0-3
Gender Female, 83.9%
Surprised 99.7%
Fear 5.9%
Sad 2.1%
Calm 0%
Happy 0%
Confused 0%
Angry 0%
Disgusted 0%

AWS Rekognition

Age 23-33
Gender Male, 98.7%
Sad 55.8%
Calm 51.8%
Fear 10%
Surprised 8.2%
Confused 2.2%
Disgusted 2%
Angry 1.9%
Happy 1.7%

Microsoft Cognitive Services

Age 30
Gender Male

Microsoft Cognitive Services

Age 36
Gender Male

Microsoft Cognitive Services

Age 4
Gender Female

Microsoft Cognitive Services

Age 36
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Adult 98.9%
Female 98.9%
Person 98.9%
Woman 98.9%
Male 98.8%
Man 98.8%
Shoe 91.2%
Chair 77.1%

Categories

Imagga

paintings art 97.6%
pets animals 1.3%