Human Generated Data

Title

Untitled (guests to debutante ball seated at tables under a tent)

Date

1941

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8441

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 99.4
Human 99.4
Person 95.8
Person 94.8
Person 94.1
Person 91.7
Person 91.3
Person 89.7
Person 89.4
Person 86.7
Person 85.4
Crowd 84.8
Person 79.9
Person 76.3
Clothing 75.2
Apparel 75.2
People 69.1
Sitting 67.4
Interior Design 63.1
Indoors 63.1
Building 58.1
Audience 56

Clarifai
created on 2023-10-25

people 99.9
group 98.9
woman 97.9
adult 97.2
man 96.9
many 96.8
music 93.1
group together 91.7
leader 90.9
child 87.8
ceremony 86.1
wear 85.7
audience 85.5
administration 85.1
religion 84.7
wedding 83.7
crowd 83.7
musician 80.8
singer 79.6
veil 78.9

Imagga
created on 2022-01-09

person 41.3
man 39.6
people 34.6
male 34
couple 34
groom 32.8
adult 28.1
happy 23.8
smiling 21.7
men 21.5
patient 19.8
life 19.7
office 19.3
professional 19.2
home 19.1
sitting 18.9
happiness 18.8
two 18.6
smile 18.5
women 18.2
love 17.3
business 17
bride 16.8
businessman 16.8
room 16.7
together 15.8
worker 15.4
doctor 15
group 14.5
clothing 14.2
indoors 14
wedding 13.8
work 13.3
family 13.3
talking 13.3
job 13.3
coat 13.2
medical 13.2
lab coat 13.1
mature 13
cheerful 13
indoor 12.8
husband 12.7
dress 12.6
team 12.5
desk 12.3
senior 12.2
table 12.1
computer 12
health 11.8
day 11.8
colleagues 11.7
portrait 11.6
lifestyle 11.6
kin 11.5
bouquet 11.4
hospital 11
care 10.7
nurse 10.6
working 10.6
married 10.5
businesspeople 10.4
hairdresser 10.4
wife 10.4
corporate 10.3
teamwork 10.2
occupation 10.1
associates 9.8
modern 9.8
romantic 9.8
mid adult 9.6
barbershop 9.4
meeting 9.4
communication 9.2
suit 9.1
businesswoman 9.1
holding 9.1
old 9.1
student 8.9
handsome 8.9
celebration 8.8
ceremony 8.7
partnership 8.6
loving 8.6
executive 8.6
marriage 8.5
face 8.5
adults 8.5
casual 8.5
relationship 8.4
clinic 8.2
fun 8.2
teacher 8.1
interior 8
medicine 7.9
color 7.8
middle aged 7.8
case 7.8
full length 7.8
affectionate 7.7
30s 7.7
attractive 7.7
elderly 7.7
sofa 7.6
illness 7.6
shop 7.6
practitioner 7.6
gown 7.6
sick person 7.6
fashion 7.5
friends 7.5
human 7.5
treatment 7.3
laptop 7.3
looking 7.2
garment 7.1

Microsoft
created on 2022-01-09

wedding dress 97.8
text 96
bride 94.9
person 93.5
clothing 92.3
man 73.6
dress 70.6
black and white 62.5
woman 62.4
wedding 60.8
clothes 16.1

Face analysis

AWS Rekognition

Age 48-54
Gender Male, 99.3%
Surprised 57.9%
Sad 35.8%
Fear 1.9%
Disgusted 1.4%
Angry 1.3%
Calm 0.8%
Confused 0.8%
Happy 0.1%

AWS Rekognition

Age 30-40
Gender Male, 99.5%
Calm 75.9%
Surprised 23%
Sad 0.4%
Happy 0.3%
Disgusted 0.2%
Confused 0.1%
Angry 0.1%
Fear 0.1%

AWS Rekognition

Age 27-37
Gender Male, 99.7%
Calm 99.7%
Sad 0.1%
Confused 0%
Angry 0%
Surprised 0%
Disgusted 0%
Happy 0%
Fear 0%

AWS Rekognition

Age 16-24
Gender Female, 66.4%
Calm 92%
Sad 4.1%
Confused 2.4%
Disgusted 0.3%
Surprised 0.3%
Angry 0.3%
Fear 0.2%
Happy 0.2%

AWS Rekognition

Age 42-50
Gender Male, 93.3%
Sad 31.6%
Calm 27.3%
Fear 22.8%
Angry 4.8%
Confused 4%
Disgusted 3.5%
Surprised 3.1%
Happy 2.9%

AWS Rekognition

Age 27-37
Gender Male, 86.7%
Calm 58.5%
Sad 33.2%
Confused 2.4%
Happy 1.5%
Angry 1.3%
Disgusted 1.2%
Fear 1.1%
Surprised 0.8%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.4%

Text analysis

Amazon

05021
3

Google

12050 3
12050
3