Human Generated Data

Title

Untitled (seated women and children, standing man)

Date

c. 1940

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4521

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 99.5
Human 99.5
Person 99.5
Person 99.2
Person 98.9
Person 98.8
Person 92
People 82.9
Person 74.6
Clothing 71.4
Apparel 71.4
Chair 69.6
Furniture 69.6
Clinic 58.9

Clarifai
created on 2023-10-26

people 99.9
group 97.7
man 97.7
adult 97.7
monochrome 96
woman 95.1
child 95
group together 93.7
many 90.5
leader 87.8
crowd 85
administration 84.7
actor 84.3
wear 82.5
boy 82
wedding 82
street 81.6
chair 80.5
family 79.7
music 77.6

Imagga
created on 2022-01-23

bride 29
people 26.8
couple 26.1
wedding 25.7
groom 25.4
dress 23.5
love 18.9
person 18
celebration 16.7
life 16.3
men 16.3
married 16.3
two 16.1
man 15.4
adult 15.2
happy 15
happiness 14.9
bouquet 14.1
women 13.4
outdoors 13.4
cheerful 13
kin 13
ceremony 12.6
marriage 12.3
smiling 12.3
male 12.1
human 12
pedestrian 11.8
clothing 11.8
wife 11.4
old 11.1
gown 10.9
negative 10.5
together 10.5
flowers 10.4
portrait 10.3
day 10.2
girls 10
brass 9.8
husband 9.8
veil 9.8
summer 9.6
clothes 9.4
film 9.2
church 9.2
suit 9
religion 9
romantic 8.9
wed 8.8
wind instrument 8.7
sunny 8.6
smile 8.5
nurse 8.5
joy 8.3
family 8
bridal 7.8
party 7.7
worker 7.7
attractive 7.7
pair 7.6
elegance 7.6
fashion 7.5
senior 7.5
traditional 7.5
fun 7.5
child 7.4
park 7.4
art 7.4
lady 7.3
business 7.3
decoration 7.2
lifestyle 7.2
holiday 7.2
romance 7.1
face 7.1
medical 7.1
businessman 7.1
work 7
musical instrument 7

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

clothing 93.7
outdoor 93.6
person 89.8
text 87.7
standing 77.5
dress 76.4
woman 71.7
footwear 70.1
posing 69.7
people 60.3
group 56.9
old 48.4

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 43-51
Gender Female, 77.4%
Calm 76.8%
Angry 10.2%
Sad 4.4%
Happy 2.7%
Confused 2.6%
Surprised 1.3%
Disgusted 1.1%
Fear 0.9%

AWS Rekognition

Age 48-56
Gender Female, 84.2%
Calm 98.3%
Happy 1%
Sad 0.2%
Disgusted 0.2%
Surprised 0.1%
Angry 0.1%
Confused 0.1%
Fear 0%

AWS Rekognition

Age 27-37
Gender Female, 99.7%
Calm 76.5%
Happy 17.4%
Sad 1.9%
Surprised 1.6%
Disgusted 1.3%
Angry 0.5%
Confused 0.5%
Fear 0.3%

AWS Rekognition

Age 43-51
Gender Female, 64.7%
Calm 59.8%
Sad 33.2%
Confused 2.1%
Angry 1.4%
Happy 1.3%
Disgusted 1.2%
Surprised 0.6%
Fear 0.4%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.5%

Categories

Imagga

paintings art 100%

Text analysis

Amazon

32AB
32AB YT3RAS
YT3RAS