Human Generated Data

Title

Untitled (men and women dressed in formal attire seated on couch)

Date

1940

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8367

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Apparel 99.9
Clothing 99.9
Person 98.8
Human 98.8
Person 98.5
Person 98
Dress 97.6
Person 96.7
Coat 95.1
Overcoat 95.1
Suit 95.1
Female 95.1
Person 93.6
Gown 92.6
Fashion 92.6
Robe 92.4
Face 89.5
Wedding 88.7
Plant 88.1
Bridegroom 87.1
Furniture 86.9
Chair 85.3
Indoors 83.5
Woman 82.6
Person 81.2
Person 80.3
Wedding Gown 79
People 76.7
Blossom 76
Flower 76
Bride 74.5
Portrait 73.7
Photography 73.7
Photo 73.7
Flower Arrangement 72.4
Room 71.9
Girl 70.9
Kid 67.9
Child 67.9
Man 61.8
Flower Bouquet 59.4
Couch 59.1
Tuxedo 57.7
Smile 57.1
Accessories 55.6
Tie 55.6
Accessory 55.6

Imagga
created on 2022-01-09

nurse 70.3
man 38.3
person 36.2
professional 34.3
people 32.9
male 29.8
medical 27.4
adult 27.2
patient 24.6
doctor 24.4
coat 22
hospital 21.2
men 20.6
barbershop 20.4
lab coat 19.6
indoors 19.3
life 18.8
health 18.8
clinic 18.7
happy 18.2
shop 17.8
room 17.4
smiling 17.4
work 17.3
worker 16.2
medicine 15.9
couple 15.7
occupation 15.6
two 15.3
home 15.2
businessman 15
happiness 14.9
business 14.6
office 14.5
team 14.3
smile 14.3
clothing 14.1
mature 14
teacher 13.8
laboratory 13.5
women 13.4
job 13.3
interior 13.3
senior 13.1
lab 12.6
mercantile establishment 12.6
portrait 12.3
standing 12.2
family 11.6
treatment 11
uniform 10.8
care 10.7
modern 10.5
looking 10.4
lifestyle 10.1
practitioner 10
group 9.7
profession 9.6
case 9.5
corporate 9.5
desk 9.5
sitting 9.5
indoor 9.1
old 9.1
educator 9
scientist 8.8
examination 8.8
colleagues 8.7
chemistry 8.7
test 8.7
day 8.6
elderly 8.6
talking 8.6
businesspeople 8.5
adults 8.5
instrument 8.5
casual 8.5
student 8.5
clothes 8.4
place of business 8.4
color 8.3
cheerful 8.1
working 8
associates 7.9
doctors 7.9
black 7.8
physician 7.8
education 7.8
40s 7.8
assistant 7.8
chemical 7.7
research 7.6
sick person 7.5
study 7.5
holding 7.4
teamwork 7.4
businesswoman 7.3
aged 7.2
board 7.2
bright 7.2
science 7.1
together 7

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

text 98.5
clothing 95.5
person 91.4
wedding dress 89.7
bride 83
flower 72.4
woman 68.6
dress 68.6
posing 41.7
clothes 15.8

Face analysis

Amazon

Google

AWS Rekognition

Age 27-37
Gender Female, 55.9%
Surprised 33.4%
Happy 28.3%
Calm 21.7%
Fear 6%
Sad 4.5%
Angry 2.3%
Disgusted 2.3%
Confused 1.5%

AWS Rekognition

Age 39-47
Gender Female, 99.4%
Happy 73.7%
Calm 12.2%
Surprised 7%
Sad 2.9%
Disgusted 1.6%
Angry 1.3%
Confused 0.9%
Fear 0.4%

AWS Rekognition

Age 36-44
Gender Male, 84.2%
Calm 84.9%
Happy 11.1%
Confused 1.2%
Disgusted 1.1%
Surprised 0.6%
Fear 0.4%
Sad 0.4%
Angry 0.3%

AWS Rekognition

Age 30-40
Gender Female, 79.2%
Calm 98.5%
Sad 0.9%
Surprised 0.4%
Fear 0.1%
Confused 0.1%
Disgusted 0%
Happy 0%
Angry 0%

AWS Rekognition

Age 29-39
Gender Male, 98.6%
Calm 93.3%
Surprised 5.6%
Disgusted 0.3%
Confused 0.3%
Happy 0.2%
Fear 0.2%
Angry 0.1%
Sad 0.1%

AWS Rekognition

Age 25-35
Gender Male, 72.9%
Calm 89.2%
Surprised 6%
Happy 1.8%
Fear 0.7%
Confused 0.7%
Disgusted 0.6%
Sad 0.5%
Angry 0.4%

AWS Rekognition

Age 52-60
Gender Male, 95%
Calm 99.8%
Disgusted 0.1%
Confused 0%
Surprised 0%
Sad 0%
Happy 0%
Fear 0%
Angry 0%

AWS Rekognition

Age 45-51
Gender Male, 99.9%
Calm 99.9%
Sad 0%
Disgusted 0%
Surprised 0%
Confused 0%
Happy 0%
Fear 0%
Angry 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Feature analysis

Amazon

Person 98.8%
Tie 55.6%

Captions

Microsoft

a vintage photo of a group of people posing for the camera 82.5%
a vintage photo of a group of people posing for a picture 82.4%
a vintage photo of a group of people in a room 82.3%

Text analysis

Amazon

14495
BE

Google

14495
NAGON-YT3RA2-MAMTZA3
чЧ5.
14495.
emp
14495. чЧ5. emp 14495 NAGON-YT3RA2-MAMTZA3