Human Generated Data

Title

Untitled (Mask and Wig members posed in sitting room)

Date

1942

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10665

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Person 99.4
Human 99.4
Person 99.1
Person 98.6
Person 98
Person 97.6
Person 97.5
Person 97.2
Bed 97.2
Furniture 97.2
Person 96.3
Person 96.2
Clothing 94.3
Apparel 94.3
Person 82.5
Person 69.9
People 62.8
Suit 60.4
Coat 60.4
Overcoat 60.4
Text 57.6
Clinic 57.6
Room 56
Indoors 56
Shoe 55.8
Footwear 55.8
Shoe 54.9
Shoe 53.8

Clarifai
created on 2023-10-26

people 99.8
group 99.4
group together 98.5
adult 98.2
man 97.4
many 95.5
woman 95.4
several 95
education 92.2
leader 91.4
child 91
wear 89.9
five 88.7
medical practitioner 87
room 86.9
three 84.7
administration 83
furniture 83
four 82.6
indoors 81.2

Imagga
created on 2022-01-15

nurse 45.9
people 35.1
groom 32.9
person 32.4
man 32.2
male 28.4
adult 27.8
men 27.5
couple 21.8
businessman 20.3
patient 20.2
business 20
women 19
happy 18.8
professional 18.6
portrait 17.5
dress 17.2
bride 16.3
wedding 15.6
love 15
corporate 14.6
happiness 14.1
smiling 13.7
room 13.7
office 13.7
lifestyle 13.7
group 13.7
medical 13.2
indoors 13.2
together 13.1
two 12.7
family 12.5
case 12
home 12
day 11.8
life 11.7
team 11.6
working 11.5
clothing 11.5
talking 11.4
worker 10.9
job 10.6
world 10.6
married 10.5
hospital 10.3
senior 10.3
sick person 10.2
businesswoman 10
face 9.9
human 9.7
colleagues 9.7
mid adult 9.6
looking 9.6
building 9.5
businesspeople 9.5
career 9.5
bouquet 9.4
mature 9.3
teamwork 9.3
20s 9.2
indoor 9.1
health 9
black 9
cheerful 8.9
husband 8.8
full length 8.7
loving 8.6
smile 8.5
marriage 8.5
casual 8.5
modern 8.4
manager 8.4
old 8.4
kin 8.3
color 8.3
city 8.3
clinic 8.3
outdoors 8.2
new 8.1
urban 7.9
casual clothing 7.8
hands 7.8
sitting 7.7
elderly 7.7
adults 7.6
mother 7.6
meeting 7.5
doctor 7.5
care 7.4
negative 7.4
uniform 7.3
suit 7.3
occupation 7.3
success 7.2
romantic 7.1
travel 7

Google
created on 2022-01-15

Microsoft
created on 2022-01-15

text 96.7
person 91.2
drawing 89.9
outdoor 87
clothing 86.2
man 78.6
woman 66.5

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 47-53
Gender Female, 90.9%
Calm 92.6%
Sad 6.2%
Surprised 0.3%
Happy 0.3%
Disgusted 0.2%
Angry 0.2%
Confused 0.1%
Fear 0%

AWS Rekognition

Age 43-51
Gender Male, 99.8%
Calm 99.3%
Happy 0.4%
Confused 0.1%
Disgusted 0.1%
Sad 0.1%
Surprised 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 23-31
Gender Female, 84.9%
Sad 79.2%
Happy 8.1%
Calm 7.4%
Surprised 1.6%
Confused 1.3%
Fear 1.1%
Angry 0.7%
Disgusted 0.5%

AWS Rekognition

Age 52-60
Gender Male, 98.5%
Calm 79%
Confused 9.6%
Sad 8%
Happy 2.2%
Disgusted 0.4%
Surprised 0.4%
Fear 0.2%
Angry 0.2%

AWS Rekognition

Age 25-35
Gender Male, 69.5%
Happy 67.6%
Calm 24.8%
Confused 3.1%
Fear 1.6%
Disgusted 0.9%
Sad 0.9%
Surprised 0.7%
Angry 0.5%

AWS Rekognition

Age 31-41
Gender Male, 98.9%
Sad 63.6%
Calm 34.6%
Confused 0.9%
Happy 0.4%
Disgusted 0.2%
Angry 0.1%
Fear 0.1%
Surprised 0.1%

AWS Rekognition

Age 30-40
Gender Female, 72.4%
Calm 60.8%
Happy 23.6%
Sad 9.4%
Confused 2.8%
Angry 1.2%
Surprised 1.1%
Disgusted 0.8%
Fear 0.2%

AWS Rekognition

Age 27-37
Gender Female, 90.2%
Calm 78.9%
Happy 12.7%
Sad 5%
Confused 1.5%
Disgusted 0.9%
Angry 0.5%
Surprised 0.3%
Fear 0.2%

AWS Rekognition

Age 38-46
Gender Male, 99.7%
Calm 83.6%
Sad 7%
Happy 4.7%
Disgusted 1.4%
Confused 1.2%
Surprised 0.8%
Fear 0.7%
Angry 0.6%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.4%
Bed 97.2%
Shoe 55.8%

Text analysis

Amazon

015
21 015
21
21015

Google

21015.
21015.