Human Generated Data

Title

Untitled (wedding guests clapping at reception)

Date

1942

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10690

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Human 99.3
Person 99.3
Person 99.1
Person 97.8
Person 97.4
Person 97
Person 96.6
Person 96.4
Person 94.3
Person 92.3
Person 91.5
Person 91.1
Person 87.5
Sunglasses 86.7
Accessory 86.7
Accessories 86.7
Person 86.2
Clothing 85.2
Apparel 85.2
Person 85.1
Face 79.2
Crowd 77.6
People 72.8
Person 70.4
Person 66.2
Person 64.2
Person 63.4
Photography 62.5
Photo 62.5
Person 62.5
Clinic 58.5
Indoors 56
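
These label and confidence pairs are the kind of output produced by the AWS Rekognition DetectLabels API. A minimal sketch of such a call follows, assuming a boto3 client and a placeholder filename; the actual request behind this record, including any confidence threshold, is not documented here.

```python
# Hedged sketch: retrieving label/confidence pairs like the Amazon tags above.
# The filename and MinConfidence value are assumptions, not taken from this record.
import boto3

rekognition = boto3.client("rekognition")

with open("steinmetz_wedding_reception.jpg", "rb") as f:  # hypothetical filename
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=55,  # assumed threshold, roughly matching the lowest score shown
)

# Each label carries a name and a confidence score (0-100), matching the
# "Person 99.3", "Crowd 77.6", ... pairs listed above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```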

Clarifai
created on 2023-10-26

people 99.8
group 98.8
man 96
group together 95.3
many 95.1
woman 95.1
education 94.3
child 92.2
adult 92.1
audience 88.6
music 88.4
administration 87.2
school 87
monochrome 85.6
leader 85.3
recreation 84.6
crowd 83.8
indoors 83.7
wear 82.6
actor 79.6

Imagga
created on 2022-01-15

nurse 64.5
senior 42.1
man 40.3
person 38.2
people 34
couple 28.7
male 28.4
patient 27.5
elderly 25.8
retired 25.2
happy 25
smiling 21.7
hospital 21.4
mature 21.4
adult 21
portrait 20.7
love 20.5
retirement 20.2
old 19.5
together 18.4
husband 17.2
married 16.3
room 16.2
wife 16.1
home 15.9
men 14.6
health 14.6
lifestyle 14.4
case 14.3
hair 14.3
indoors 14
surgeon 14
smile 13.5
medical 13.2
salon 13.2
doctor 13.2
sick person 13.1
hand 12.1
sitting 12
camera 12
gray 11.7
affection 11.6
marriage 11.4
looking 11.2
happiness 11
care 10.7
illness 10.5
grandma 10.4
active 10.4
women 10.3
specialist 10.1
aged 9.9
clinic 9.8
70s 9.8
grandmother 9.8
family 9.8
surgery 9.8
60s 9.8
fun 9.7
table 9.5
professional 9.5
casual 9.3
horizontal 9.2
holding 9.1
interior 8.8
day 8.6
loving 8.6
age 8.6
expression 8.5
enjoying 8.5
dress 8.1
lady 8.1
worker 8.1
romance 8
pensioner 8
grandfather 8
to 8
medicine 7.9
citizen 7.9
face 7.8
adults 7.6
meeting 7.5
leisure 7.5
outdoors 7.5
office 7.4
occupation 7.3
indoor 7.3
holiday 7.2
handsome 7.1
romantic 7.1
working 7.1
work 7.1
businessman 7.1

Google
created on 2022-01-15

Microsoft
created on 2022-01-15

person 97
text 93.8
clothing 81.2
crowd 0.5

Face analysis

AWS Rekognition

Age 37-45
Gender Female, 59.8%
Calm 89%
Sad 5.3%
Surprised 3.4%
Fear 0.9%
Happy 0.5%
Confused 0.3%
Angry 0.3%
Disgusted 0.2%

AWS Rekognition

Age 34-42
Gender Male, 64.9%
Calm 96.4%
Happy 2%
Sad 0.5%
Confused 0.4%
Disgusted 0.2%
Angry 0.2%
Surprised 0.2%
Fear 0.1%

AWS Rekognition

Age 47-53
Gender Female, 99.6%
Calm 93.5%
Sad 4.4%
Surprised 0.8%
Happy 0.6%
Confused 0.3%
Angry 0.2%
Disgusted 0.2%
Fear 0.1%

AWS Rekognition

Age 29-39
Gender Male, 99.2%
Happy 72%
Calm 9.8%
Sad 6.9%
Confused 6.5%
Surprised 2.3%
Disgusted 1.3%
Fear 0.8%
Angry 0.4%

AWS Rekognition

Age 26-36
Gender Male, 81.8%
Calm 65.4%
Sad 21%
Surprised 5.7%
Confused 3.8%
Angry 2.2%
Happy 1%
Disgusted 0.6%
Fear 0.4%

AWS Rekognition

Age 48-56
Gender Male, 98.6%
Sad 85.7%
Happy 9.4%
Calm 2.9%
Confused 0.7%
Disgusted 0.4%
Angry 0.3%
Fear 0.3%
Surprised 0.2%

AWS Rekognition

Age 49-57
Gender Female, 83.8%
Calm 98.7%
Sad 0.5%
Happy 0.3%
Angry 0.1%
Surprised 0.1%
Disgusted 0.1%
Confused 0.1%
Fear 0%

AWS Rekognition

Age 21-29
Gender Male, 95.6%
Calm 74.1%
Sad 21.5%
Happy 2%
Disgusted 0.7%
Confused 0.6%
Fear 0.5%
Surprised 0.3%
Angry 0.2%

AWS Rekognition

Age 42-50
Gender Male, 60%
Happy 97.2%
Sad 1.2%
Calm 0.9%
Angry 0.3%
Confused 0.2%
Disgusted 0.1%
Fear 0.1%
Surprised 0.1%
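
The age ranges, gender estimates, and emotion percentages above follow the shape of AWS Rekognition DetectFaces output. A minimal sketch, assuming a boto3 client and a placeholder filename, is shown below; it prints one block per detected face in the same format.

```python
# Hedged sketch: per-face age, gender, and emotion estimates like those above.
# The filename is a placeholder; the request used for this record is not documented here.
import boto3

rekognition = boto3.client("rekognition")

with open("steinmetz_wedding_reception.jpg", "rb") as f:  # hypothetical filename
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, emotions, and other attributes
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]        # e.g. {"Low": 37, "High": 45}
    gender = face["Gender"]       # e.g. {"Value": "Female", "Confidence": 59.8}
    emotions = sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True)
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in emotions:
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')
```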

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.3%
Sunglasses 86.7%

Text analysis

Amazon

12
21358
21358.
22
358. 22
358.

Google

2)358· 21358. 21358. 12
2)358·
21358.
12
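
The strings above are raw OCR detections from the two providers. A minimal sketch of how such text might be read back with the AWS Rekognition DetectText API follows, assuming a boto3 client and a placeholder filename; the Google results would come from the Cloud Vision text detection endpoint instead.

```python
# Hedged sketch: line-level OCR of the kind shown in the Amazon text analysis above.
# The filename is a placeholder; the request used for this record is not documented here.
import boto3

rekognition = boto3.client("rekognition")

with open("steinmetz_wedding_reception.jpg", "rb") as f:  # hypothetical filename
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

# Rekognition returns both LINE and WORD detections; printing the LINE entries
# yields detected strings one per line, similar to the list above.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])
```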