Human Generated Data

Title

Untitled (seated couple and woman)

Date

1941

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4459

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Clothing 99.2
Apparel 99.2
Person 99.1
Human 99.1
Person 97.9
Person 97.7
Person 96.3
Person 94.5
Face 86
Clinic 79.3
Sleeve 76.2
Coat 70.8
Portrait 67.8
Photography 67.8
Photo 67.8
Female 66.9
Tie 64.4
Accessories 64.4
Accessory 64.4
People 63.6
Doctor 59.5
Fashion 57.5
Lab Coat 57.4
Robe 56.9
Head 56.8
Person 56.4
Suit 55.3
Overcoat 55.3

Clarifai
created on 2023-10-26

man 98.9
people 98.5
healthcare 98.4
adult 93.8
medicine 93.2
group 89.3
woman 88.9
health 86
doctor 85.8
scientist 83.5
coat 81.7
uniform 79.4
hospital 78.8
science 77.9
medical 70.4
medical practitioner 67.9
portrait 67.9
monochrome 65.2
isolate 64
looking 59.6

Imagga
created on 2022-01-23

specialist 45
person 41.9
man 39.7
people 34
male 29.8
professional 29.1
adult 28.6
doctor 28.2
mask 28.1
medical 27.4
surgeon 25.9
medicine 24.7
health 24.3
human 24
photographer 23.4
patient 21.4
portrait 20.7
worker 20.4
work 20.4
looking 18.4
laboratory 18.3
clinic 18.2
holding 18.2
hospital 18.1
occupation 17.4
businessman 16.8
chemical 16.2
science 16
business 15.8
care 15.6
coat 15.3
equipment 14.9
lab 14.6
uniform 14.3
face 14.2
scientist 13.7
suit 13.5
case 13.4
job 13.3
working 13.3
groom 12.8
technology 12.6
expertise 12.6
research 12.4
biology 12.3
hand 12.2
instrument 12.1
happy 11.9
team 11.7
smiling 11.6
test 11.5
nurse 11.4
one 11.2
manager 11.2
men 11.2
student 10.9
assistant 10.7
chemistry 10.6
lab coat 10.5
illness 10.5
development 10.5
tie 10.4
love 10.3
day 10.2
microscope 9.9
technician 9.8
nuclear 9.7
sick person 9.7
black 9.6
profession 9.6
hands 9.6
education 9.5
teamwork 9.3
successful 9.2
observation 8.9
researcher 8.9
scientific 8.7
couple 8.7
standing 8.7
lifestyle 8.7
happiness 8.6
model 8.6
mobile 8.5
modern 8.4
sky 8.3
confident 8.2
protection 8.2
danger 8.2
active 8.1
success 8
women 7.9
biochemistry 7.9
microbiology 7.9
chemist 7.9
biotechnology 7.9
toxic 7.8
optical 7.8
attractive 7.7
pollution 7.7
shirt 7.5
future 7.4
mature 7.4
environment 7.4
finger 7.4
phone 7.4
ecology 7.3
cheerful 7.3
sexy 7.2
bride 7.2
posing 7.1
negative 7.1

Google
created on 2022-01-23

Coat 86.8
Gesture 85.3
Black-and-white 82.6
Headgear 81.3
Font 78.1
Happy 74.3
Formal wear 73
Event 72.1
Vintage clothing 71
Art 70.8
Suit 70.4
Monochrome photography 69.4
Hat 69.3
Team 68.3
Fun 67.9
History 65.5
Monochrome 65.4
Crew 62
Photo caption 61.4
Fashion design 61.2

Microsoft
created on 2022-01-23

text 98.5
posing 94.9
wedding dress 92.5
clothing 92.3
person 92.1
bride 88
smile 82.1
human face 81.1
black and white 74
woman 72.6
man 63.3
old 40.8

Face analysis

AWS Rekognition

Age 48-54
Gender Male, 99.6%
Calm 79%
Happy 5.7%
Disgusted 4.5%
Confused 3.3%
Sad 2.3%
Angry 2%
Surprised 2%
Fear 1.3%

AWS Rekognition

Age 16-22
Gender Female, 91.6%
Happy 92.5%
Surprised 5.7%
Confused 1.1%
Disgusted 0.2%
Fear 0.2%
Angry 0.1%
Sad 0.1%
Calm 0.1%

AWS Rekognition

Age 41-49
Gender Male, 87.6%
Happy 82.5%
Calm 7.2%
Surprised 5.5%
Confused 2.6%
Disgusted 0.9%
Sad 0.7%
Fear 0.3%
Angry 0.2%

AWS Rekognition

Age 41-49
Gender Female, 89.3%
Sad 51.6%
Calm 26.5%
Happy 17.6%
Confused 1.9%
Disgusted 0.7%
Surprised 0.6%
Angry 0.6%
Fear 0.3%

AWS Rekognition

Age 34-42
Gender Female, 71.3%
Sad 57.8%
Confused 11.5%
Disgusted 9.2%
Calm 7.3%
Fear 5.3%
Happy 4.5%
Angry 2.6%
Surprised 1.8%

AWS Rekognition

Age 50-58
Gender Male, 97.7%
Happy 38.9%
Surprised 38.1%
Disgusted 6.5%
Calm 6.4%
Angry 4%
Fear 3.1%
Sad 1.7%
Confused 1.4%

AWS Rekognition

Age 33-41
Gender Female, 95.5%
Sad 61.8%
Calm 17.5%
Confused 6.4%
Fear 5.7%
Happy 3.5%
Surprised 2.4%
Angry 1.5%
Disgusted 1.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.1%
Tie 64.4%

Categories

Imagga

paintings art 99.7%

Text analysis

Amazon

17288
19288
na88.
KUDVR
RV

Google

17288 n288 AGOX
17288
n288
AGOX