Human Generated Data

Title

Untitled (crowd at bottom of stairs)

Date

c. 1940

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4534

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 99.5
Human 99.5
Person 98.8
Person 98.3
Person 98.2
Person 95.8
Person 95.7
Crowd 93.9
Audience 91
Person 90.7
Person 90.6
Clinic 87.2
Indoors 86.4
Interior Design 86.4
Person 85.4
Person 83.7
Person 83.1
Room 81.6
Person 80.7
Person 76.2
People 74.5
Person 70.8
Accessory 69.8
Accessories 69.8
Sunglasses 69.8
Person 66.8
Person 65.4
Person 62.8
Person 60.4
Speech 58.1
Lecture 58.1
Jury 55.6
Operating Theatre 55.2
Hospital 55.2
Person 42.1

Imagga
created on 2022-01-23

person 39.8
man 39
people 37.4
male 34
adult 32
senior 30
patient 26.4
nurse 22.7
medical 22.1
professional 21.4
happy 21.3
smiling 21
couple 20.9
men 20.6
indoors 20.2
elderly 19.1
home 19.1
doctor 18.8
businessman 18.5
room 18.2
mature 17.7
occupation 17.4
group 16.9
women 16.6
business 16.4
teacher 16
worker 15.8
work 15.7
sitting 15.5
old 15.3
portrait 14.9
student 14.8
indoor 14.6
health 14.6
clothing 14.4
education 13.8
lifestyle 13.7
case 13.6
smile 13.5
team 13.4
hospital 13.2
office 13.1
looking 12.8
coat 12.7
retired 12.6
job 12.4
medicine 12.3
together 12.3
executive 12
human 12
two 11.9
happiness 11.7
colleagues 11.7
mid adult 11.6
husband 11.4
businesspeople 11.4
face 11.4
desk 11.3
blackboard 11
casual 11
pensioner 10.9
clinic 10.9
family 10.7
working 10.6
sick person 10.6
cheerful 10.6
modern 10.5
adults 10.4
aged 9.9
care 9.9
hand 9.9
70s 9.8
retirement 9.6
life 9.6
illness 9.5
table 9.5
wife 9.5
love 9.5
corporate 9.4
meeting 9.4
teamwork 9.3
holding 9.1
classroom 8.9
lab coat 8.9
lab 8.7
standing 8.7
laboratory 8.7
30s 8.7
day 8.6
career 8.5
inside 8.3
businesswoman 8.2
technology 8.2
grandfather 8
handsome 8
computer 8
scientist 7.8
40s 7.8
color 7.8
talking 7.6
college 7.6
communication 7.6
shower cap 7.4
laptop 7.3
board 7.2
school 7.2
cap 7.1
science 7.1
to 7.1

Google
created on 2022-01-23

White 92.2
Font 85
Black-and-white 82.3
Line 82.3
Crowd 76.8
Event 74.4
Snapshot 74.3
Monochrome 73.5
Monochrome photography 73.4
Suit 72.6
Room 67.5
Hat 66.1
Stock photography 65.2
Team 60.1
History 59.4
Crew 53.4
Photo caption 53.3
T-shirt 52.1
Music 51.5

Microsoft
created on 2022-01-23

person 99.6
text 94.8
clothing 75.2
group 65.3
people 55.2
human face 54.2
posing 51
wedding dress 50.3

Face analysis

AWS Rekognition

Age 34-42
Gender Male, 99.8%
Calm 81.1%
Sad 12.3%
Fear 2.2%
Confused 1.4%
Happy 1.2%
Disgusted 1.1%
Surprised 0.5%
Angry 0.3%

AWS Rekognition

Age 48-56
Gender Male, 99.6%
Calm 94.2%
Sad 3.5%
Confused 1.1%
Surprised 0.4%
Angry 0.2%
Disgusted 0.2%
Happy 0.2%
Fear 0.1%

AWS Rekognition

Age 45-51
Gender Male, 92.4%
Calm 81.3%
Sad 11.7%
Happy 2%
Confused 1.5%
Fear 1.3%
Disgusted 0.9%
Surprised 0.7%
Angry 0.6%

AWS Rekognition

Age 48-54
Gender Male, 97.7%
Happy 91.3%
Calm 5.3%
Confused 1.1%
Surprised 1%
Sad 0.6%
Disgusted 0.3%
Angry 0.2%
Fear 0.1%

AWS Rekognition

Age 48-56
Gender Male, 99.2%
Happy 52.1%
Surprised 21.5%
Sad 15.1%
Calm 5.3%
Confused 1.9%
Fear 1.6%
Angry 1.4%
Disgusted 1.1%

AWS Rekognition

Age 45-53
Gender Male, 91.1%
Calm 80.7%
Fear 11.5%
Surprised 2%
Sad 1.5%
Happy 1.3%
Confused 1.2%
Disgusted 1.1%
Angry 0.8%

AWS Rekognition

Age 28-38
Gender Male, 99.9%
Sad 95.1%
Calm 3.8%
Angry 0.5%
Confused 0.2%
Happy 0.1%
Fear 0.1%
Surprised 0.1%
Disgusted 0.1%

AWS Rekognition

Age 31-41
Gender Female, 68.8%
Surprised 56%
Calm 17.3%
Confused 12.9%
Fear 5.5%
Happy 3.3%
Disgusted 2.7%
Angry 1.5%
Sad 0.7%

AWS Rekognition

Age 28-38
Gender Female, 75%
Calm 75.4%
Happy 12.9%
Fear 3.6%
Sad 3.6%
Surprised 1.7%
Angry 1.2%
Disgusted 0.9%
Confused 0.6%

AWS Rekognition

Age 40-48
Gender Female, 64.7%
Sad 41.3%
Confused 29.1%
Calm 25.9%
Angry 1.6%
Happy 0.8%
Surprised 0.5%
Disgusted 0.4%
Fear 0.3%

AWS Rekognition

Age 19-27
Gender Male, 83.4%
Sad 41.9%
Calm 30%
Happy 17%
Fear 6.5%
Confused 1.4%
Angry 1.2%
Disgusted 1.1%
Surprised 0.8%

AWS Rekognition

Age 54-62
Gender Female, 53.7%
Calm 99.6%
Sad 0.2%
Confused 0.1%
Disgusted 0%
Angry 0%
Happy 0%
Fear 0%
Surprised 0%

AWS Rekognition

Age 39-47
Gender Female, 95.7%
Sad 72.1%
Confused 18.8%
Happy 4.6%
Calm 1.8%
Disgusted 0.8%
Surprised 0.7%
Angry 0.7%
Fear 0.4%

AWS Rekognition

Age 20-28
Gender Female, 61.8%
Sad 53.5%
Confused 27.7%
Disgusted 8%
Calm 6.3%
Happy 1.3%
Angry 1.2%
Fear 1.1%
Surprised 0.9%

AWS Rekognition

Age 36-44
Gender Male, 73.2%
Sad 39%
Happy 34%
Calm 19.4%
Confused 3.9%
Angry 1.1%
Disgusted 0.9%
Surprised 0.9%
Fear 0.8%

AWS Rekognition

Age 6-16
Gender Female, 100%
Fear 73.1%
Sad 19.6%
Calm 2.5%
Disgusted 1.6%
Angry 1%
Surprised 0.8%
Happy 0.7%
Confused 0.7%

AWS Rekognition

Age 27-37
Gender Female, 65.5%
Sad 44.1%
Calm 30.4%
Happy 14.8%
Confused 2.7%
Fear 2.3%
Angry 2.1%
Surprised 2.1%
Disgusted 1.5%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.5%
Sunglasses 69.8%

Captions

Microsoft

a group of people posing for a photo 92.8%
a group of people posing for the camera 92.7%
a group of people posing for a picture 92.6%

Text analysis

Amazon

13549.
13549
EVEINS

Google

13549.
AUON-YT3RA2-
13549. 13549. 13549. AUON-YT3RA2- MAMT2A3
MAMT2A3