Human Generated Data

Title

Untitled (women gathered around table)

Date

1939

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8333

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Human 99.1
Person 99.1
Person 98.2
Person 97.6
Person 97
Person 92.9
Apparel 91
Hat 91
Clothing 91
Leisure Activities 90.9
Person 90.8
Musician 89.6
Musical Instrument 89.6
Guitar 79.2
People 72.8
Crowd 69.8
Performer 64.5
Person 62.4
Photography 61.3
Portrait 61.3
Face 61.3
Photo 61.3
Guitarist 59.3
Music Band 59.2

Imagga
created on 2022-01-09

person 29.8
people 28.4
adult 26.8
man 25.5
couple 24.4
bride 23
dress 22.6
love 21.3
happy 20.7
patient 20.5
wedding 20.2
married 19.2
male 19.1
smiling 18.8
happiness 18.8
senior 18.7
portrait 18.1
smile 17.8
groom 16.9
kin 16.3
musical instrument 16.3
women 15.8
men 15.4
two 14.4
marriage 14.2
cheerful 13.8
wind instrument 13.8
lifestyle 13.7
face 13.5
together 13.1
fashion 12.8
playing 12.8
outdoors 12.7
veil 11.8
gown 11.7
oboe 11.6
family 11.6
life 11.4
sick person 11.3
human 11.2
looking 11.2
celebration 11.2
case 11
nurse 10.9
care 10.7
hand 10.6
husband 10.6
elderly 10.5
fun 10.5
old 10.4
health 10.4
sitting 10.3
joy 10
brass 10
park 9.9
romance 9.8
attractive 9.8
lady 9.7
medical 9.7
wife 9.5
stringed instrument 9.2
outdoor 9.2
professional 9.1
worker 9.1
home 8.8
ceremony 8.7
play 8.6
cute 8.6
clothing 8.6
model 8.6
mother 8.5
holding 8.2
one 8.2
grandma 8.2
suit 8.1
mask 8
day 7.8
older 7.8
party 7.7
summer 7.7
retirement 7.7
costume 7.6
females 7.6
bouquet 7.5
traditional 7.5
help 7.4
blond 7.3
music 7.2
child 7.2
instrument 7
indoors 7
modern 7
musician 7
look 7

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

Face analysis
AWS Rekognition

Age 51-59
Gender Female, 88.6%
Calm 99.7%
Sad 0.2%
Happy 0%
Fear 0%
Disgusted 0%
Confused 0%
Surprised 0%
Angry 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.1%
Hat 91%

Captions

Microsoft

a group of people sitting in front of a window 75.9%
a group of people in front of a window 75.8%
a group of people standing in front of a window 75.7%

Text analysis

Amazon

11069
11069.
NAMTSAB
- NAMTSAB
-

Google

11069.
11069.