Human Generated Data

Title

Untitled (school, Ozarks, Arkansas)

Date

October 1935

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1116

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-07

Baby 99.3
Person 99.3
Person 98.6
Adult 98.6
Female 98.6
Woman 98.6
Person 97.8
Adult 97.8
Male 97.8
Man 97.8
Face 95.1
Head 95.1
Clothing 94.1
Dress 94.1
Coat 94.1
Dining Table 93.3
Furniture 93.3
Table 93.3
Formal Wear 92.9
Architecture 91.2
Building 91.2
Dining Room 91.2
Indoors 91.2
Room 91.2
Photography 78.5
Portrait 78.5
Shirt 77.5
Lady 70.8
Jeans 70.4
Pants 70.4
Blouse 66.7
Hospital 65.1
Home Decor 57.7
Linen 57.7
Smoke 57.3
Body Part 56.7
Finger 56.7
Hand 56.7
Restaurant 56.6
Dressing Room 56.1
Fashion 55.8
Gown 55.8
Clinic 55.7
Electronics 55.6
Phone 55.6
Chair 55.4
Classroom 55.3
School 55.3
Suit 55.3
Cafeteria 55.2

Clarifai
created on 2018-05-11

people 99.9
adult 97.3
two 97.2
group 96.5
man 95.1
three 94.4
monochrome 93.9
woman 92.8
child 92.4
wear 90.6
administration 89.9
group together 88.7
leader 88
sit 86.5
furniture 86.3
one 84
indoors 83.9
portrait 82.4
four 82.3
actor 81.7

Imagga
created on 2023-10-07

man 31.6
people 26.2
person 22.6
male 22.1
room 18.3
adult 17.2
business 16.4
black 14.5
men 13.7
home 13.6
couple 13.1
chair 12.2
office 12.2
fashion 12.1
sitting 12
portrait 11.6
human 11.2
device 11.1
inside 11
musical instrument 10.7
businessman 10.6
worker 10.1
interior 9.7
working 9.7
building 9
indoors 8.8
hair 8.7
lifestyle 8.7
old 8.4
clothing 8.2
computer 8.1
family 8
job 8
smiling 8
women 7.9
happiness 7.8
work 7.8
wind instrument 7.7
window 7.7
newspaper 7.7
serious 7.6
house 7.5
dark 7.5
senior 7.5
suit 7.5
call 7.4
back 7.3
indoor 7.3
sexy 7.2
dress 7.2
looking 7.2
love 7.1
face 7.1
modern 7

Microsoft
created on 2018-05-11

person 92.2

Face analysis

AWS Rekognition

Age 6-14
Gender Female, 99.8%
Calm 97.1%
Surprised 6.4%
Fear 6%
Sad 2.6%
Disgusted 0.3%
Confused 0.3%
Happy 0.3%
Angry 0.2%

AWS Rekognition

Age 6-14
Gender Female, 100%
Happy 79.3%
Fear 18%
Surprised 6.4%
Sad 2.3%
Calm 0.6%
Confused 0.3%
Angry 0.2%
Disgusted 0.1%

Microsoft Cognitive Services

Age 41
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Possible
Blurred Very unlikely

Feature analysis

Amazon

Baby 99.3%
Person 99.3%
Adult 98.6%
Female 98.6%
Woman 98.6%
Male 97.8%
Man 97.8%
Shirt 77.5%
Jeans 70.4%

Categories

Imagga

paintings art 99.5%