Human Generated Data

Title

Untitled (Jonathan Shahn)

Date

1940

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.4041

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Body Part 100
Finger 100
Hand 100
Face 100
Head 100
Photography 100
Portrait 100
Clothing 99.7
Dress 99.7
Person 99.2
Baby 99.2
Dancing 95.2
Leisure Activities 95.2
Formal Wear 82.2
Fashion 69.3
Gown 69.3
Fence 55.3
Handrail 55.2
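
The scores above are confidence values (0-100) reported by Amazon Rekognition's DetectLabels operation. As a rough sketch, tags in this shape could be reproduced with boto3; the filename and the MinConfidence threshold below are assumptions, not part of this record.

```python
import boto3

# Assumes AWS credentials are already configured and the photograph is saved locally.
rekognition = boto3.client("rekognition")

with open("P1970_4041.jpg", "rb") as f:  # hypothetical filename
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=55,  # roughly the lowest score shown in the list above
    )

# Print "Label Confidence" pairs, mirroring the layout of the tag list.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```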

Clarifai
created on 2018-05-10

people 99.7
adult 97.7
monochrome 97.5
one 95.5
woman 94.5
wear 93.5
man 92.8
portrait 91.9
two 91.3
street 91
child 88.6
black and white 88.4
music 86.4
facial expression 86
musician 82.3
outfit 80.9
sepia 80.3
group 79.7
retro 78.3
group together 77.8

Imagga
created on 2023-10-06

person 32.8
portrait 27.2
fashion 25.6
people 22.3
adult 22
model 21
dress 20.8
child 20.5
happy 20.1
clothing 19.7
looking 18.4
attractive 18.2
male 17.5
human 15.7
hat 15.6
man 15.5
sexy 15.3
fence 15.1
youth 14.5
hair 14.3
love 14.2
one 14.2
look 14
expression 13.7
smile 13.5
style 12.6
outdoor 12.2
brunette 12.2
women 11.9
lifestyle 11.6
black 11.5
face 11.4
lady 11.4
fun 11.2
costume 11.2
outdoors 11.2
pretty 11.2
emotion 11.1
casual 11
children 10.9
joy 10.9
posing 10.7
domestic 10.6
happiness 10.2
cute 10
hand 9.9
couple 9.6
boy 9.6
day 9.4
outside 9.4
smiling 9.4
alone 9.1
teenager 9.1
modern 9.1
holding 9.1
barrier 9
kid 8.9
body 8.8
serious 8.6
mother 8.6
winter 8.5
dark 8.4
city 8.3
park 8.2
cheerful 8.1
suit 8.1
handsome 8
holiday 7.9
urban 7.9
standing 7.8
parent 7.7
sitting 7.7
kimono 7.7
teen 7.4
summer 7.1
season 7
together 7
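
Imagga exposes this kind of tag/confidence list through its v2 tagging endpoint. A minimal sketch using the requests library follows; the credentials and image URL are placeholders, and the response layout is assumed from Imagga's public v2 API rather than taken from this record.

```python
import requests

# Placeholder credentials; Imagga's v2 tagging endpoint uses HTTP Basic auth.
API_KEY = "YOUR_IMAGGA_KEY"
API_SECRET = "YOUR_IMAGGA_SECRET"
IMAGE_URL = "https://example.org/P1970_4041.jpg"  # placeholder image URL

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)
resp.raise_for_status()

# Tags are returned with confidence scores, matching the "tag score" layout above.
for item in resp.json()["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')
```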

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

person 85.1
crowd 1.2

Face analysis

Amazon

AWS Rekognition

Age 4-12
Gender Female, 75.9%
Sad 76.7%
Calm 35%
Surprised 22%
Fear 8.8%
Angry 3.7%
Disgusted 1.6%
Happy 1.2%
Confused 0.3%
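
The age range, gender, and emotion scores above correspond to Rekognition's DetectFaces operation with all facial attributes requested. A hedged boto3 sketch (hypothetical filename):

```python
import boto3

rekognition = boto3.client("rekognition")

with open("P1970_4041.jpg", "rb") as f:  # hypothetical filename
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, and emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    # Emotions come back unsorted; sort by confidence as in the record above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```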

Feature analysis

Amazon

Person 99.2%
Baby 99.2%

Categories

Imagga

paintings art 71.4%
people portraits 17.3%
pets animals 8.8%

Captions

Microsoft
created on 2018-05-10

a black and white photo of a girl 45.8%
a girl taking a selfie 45.7%
a close up of a girl 45.6%
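
Captions of this form come from the Microsoft Computer Vision Describe feature. The sketch below targets the v3.2 REST endpoint with placeholder credentials; the 2018 captions above were produced by an earlier API version, so treat the endpoint and field names as assumptions.

```python
import requests

# Placeholder Azure resource endpoint and subscription key.
ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"
KEY = "YOUR-SUBSCRIPTION-KEY"

with open("P1970_4041.jpg", "rb") as f:  # hypothetical filename
    resp = requests.post(
        f"{ENDPOINT}/vision/v3.2/describe",
        params={"maxCandidates": 3},
        headers={
            "Ocp-Apim-Subscription-Key": KEY,
            "Content-Type": "application/octet-stream",
        },
        data=f.read(),
    )
resp.raise_for_status()

# Each caption carries a 0-1 confidence; the record above shows percentages.
for caption in resp.json()["description"]["captions"]:
    print(f'{caption["text"]} {caption["confidence"] * 100:.1f}')
```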

Text analysis

Amazon

President and Fellows of Harvard College (Harvard University Art Museums)
P1970.4041.0000
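
The detected text corresponds to Rekognition's DetectText operation, which returns both line- and word-level detections with confidence scores. A minimal boto3 sketch (hypothetical filename) that prints only the line-level results:

```python
import boto3

rekognition = boto3.client("rekognition")

with open("P1970_4041.jpg", "rb") as f:  # hypothetical filename
    response = rekognition.detect_text(Image={"Bytes": f.read()})

# Word-level detections are also returned; printing only LINE entries keeps
# the output compact, matching the consolidated lines shown above.
for det in response["TextDetections"]:
    if det["Type"] == "LINE":
        print(det["DetectedText"], f'{det["Confidence"]:.1f}')
```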

Google

O President and Fellows of Harvard College (Harvard University Art Museums) P1970.4041.0000