Human Generated Data

Title

Untitled (wedding guests seated on folding chairs)

Date

1942

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10679

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Clothing 99.9
Apparel 99.9
Person 99.4
Human 99.4
Person 99.3
Person 99.3
Person 99.2
Person 98.5
Face 95.7
Female 94.4
Robe 94
Fashion 94
Gown 93
Wedding 91.8
Dress 89.9
Suit 89
Overcoat 89
Coat 89
Plant 88.2
Bride 83.4
Wedding Gown 83.4
Woman 82.5
Smile 78
Bridegroom 78
Flower 77.6
Blossom 77.6
Evening Dress 76.5
Blonde 72.6
Teen 72.6
Kid 72.6
Child 72.6
Girl 72.6
Photography 69.2
Photo 69.2
People 69.2
Portrait 69.1
Flower Arrangement 66.3
Hair 64.8
Potted Plant 63.8
Pottery 63.8
Jar 63.8
Vase 63.8
Head 59.8
Crowd 59.5
Painting 59.5
Art 59.5
Flower Bouquet 57.4
Costume 57.1

Clarifai
created on 2023-10-26

people 99.8
group 98.8
musician 97.2
adult 96.5
music 96
man 95.8
monochrome 95.8
woman 95.7
singer 93.4
group together 90.4
outfit 88.5
furniture 88.5
stringed instrument 86.8
wear 86.7
actress 85.9
veil 85.8
portrait 85.8
several 84.6
three 83.5
facial expression 83.1

Imagga
created on 2022-01-15

people 30.7
man 26.9
male 22.8
person 20.3
adult 20.2
love 17.3
portrait 16.8
couple 16.5
home 15.9
smiling 15.9
women 15.8
dress 15.4
room 14.6
happy 14.4
happiness 14.1
lifestyle 13.7
husband 13.5
family 13.3
senior 13.1
sitting 12.9
black 12.7
kin 12.7
bride 12.5
men 12
indoor 11.9
together 11.4
adults 11.4
wedding 11
leisure 10.8
indoors 10.5
old 10.4
wife 10.4
looking 10.4
two 10.2
hospital 9.6
face 9.2
business 9.1
holding 9.1
clothing 8.9
life 8.9
patient 8.8
smile 8.5
togetherness 8.5
office 8.5
attractive 8.4
care 8.2
girls 8.2
groom 8.2
group 8.1
mother 8
interior 8
hair 7.9
married 7.7
elderly 7.7
loving 7.6
hairdresser 7.6
father 7.5
friends 7.5
fun 7.5
mature 7.4
20s 7.3
cheerful 7.3
lady 7.3
case 7.2
nurse 7.1
shower cap 7.1
medical 7.1
businessman 7.1
day 7.1
table 7

Google
created on 2022-01-15

Microsoft
created on 2022-01-15

text 99.5
person 97.6
clothing 86.7
standing 78.3
woman 66.7
human face 57.8
posing 35.2

Face analysis

AWS Rekognition

Age 37-45
Gender Female, 99.5%
Happy 61.9%
Sad 21.8%
Calm 10.1%
Surprised 2.5%
Confused 1.6%
Disgusted 0.9%
Angry 0.7%
Fear 0.6%

AWS Rekognition

Age 48-56
Gender Male, 88.2%
Happy 84%
Fear 7%
Surprised 4.7%
Calm 1.5%
Angry 0.8%
Sad 0.7%
Disgusted 0.7%
Confused 0.6%

AWS Rekognition

Age 35-43
Gender Female, 61%
Happy 63.9%
Surprised 21%
Calm 12.9%
Confused 0.7%
Disgusted 0.5%
Sad 0.5%
Angry 0.4%
Fear 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.4%
Painting 59.5%

Text analysis

Amazon

10
21
21 331.
331.
21331.

Google

21 331. 10
21
331.
10