Human Generated Data

Title

Untitled (man juggling on stage in front of a band)

Date

c. 1940

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8537

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 99.6
Human 99.6
Person 98.6
Stage 98.2
Person 96.5
Person 93.6
Person 89
Indoors 88.8
Interior Design 88.8
Musical Instrument 86.7
Musician 86.7
Person 85.9
Crowd 85.5
Person 83.7
Room 79.2
Leisure Activities 78
Face 76.5
People 72.9
Clothing 69.8
Apparel 69.8
Person 68.9
Furniture 66.2
Performer 66.2
Portrait 65.9
Photography 65.9
Photo 65.9
Person 62.4
Curtain 62.3
Music Band 59.6
Person 59.1
Person 56.1
Concert 55.8
Living Room 55.7
Bedroom 55.2

Imagga
created on 2022-01-09

people 24
adult 23.7
person 22.4
man 20.8
gymnastic apparatus 19.2
lifestyle 17.3
sports equipment 17.3
sport 16.5
black 16.2
business 15.8
exercise 15.4
women 15
professional 14.8
pretty 14.7
attractive 14.7
male 14.2
equipment 14.1
sitting 13.7
body 13.6
fitness 13.5
urban 13.1
fashion 12.8
stage 12.8
silhouette 12.4
portrait 12.3
building 11.9
happy 11.9
modern 11.2
hair 11.1
training 11.1
city 10.8
businessman 10.6
posing 9.8
office 9.8
lady 9.7
indoors 9.7
sexy 9.6
dancer 9.6
looking 9.6
legs 9.4
meeting 9.4
slim 9.2
leisure 9.1
active 9.1
performer 9
one 9
job 8.8
brunette 8.7
teenage 8.6
work 8.6
cute 8.6
motion 8.6
outside 8.6
model 8.6
smile 8.5
casual 8.5
energy 8.4
communication 8.4
health 8.3
sky 8.3
fit 8.3
life 8.2
indoor 8.2
outdoors 8.2
healthy 8.2
group 8.1
full length 7.8
outdoor 7.6
studio 7.6
clothing 7.5
human 7.5
trampoline 7.5
fun 7.5
figure 7.3
reflection 7.2
smiling 7.2
platform 7.2
athlete 7

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

concert 97.4
person 95.6
musical instrument 92.1
text 90.3
outdoor 85.7
clothing 80.4
guitar 73.7
drum 71.3
man 51.6

Face analysis

Amazon

Google

AWS Rekognition

Age 19-27
Gender Male, 98.4%
Calm 82.1%
Sad 9.9%
Angry 3.1%
Happy 1.5%
Disgusted 1.3%
Confused 0.9%
Fear 0.7%
Surprised 0.5%

AWS Rekognition

Age 40-48
Gender Male, 72.9%
Calm 98.9%
Sad 0.6%
Confused 0.1%
Happy 0.1%
Angry 0.1%
Surprised 0.1%
Disgusted 0.1%
Fear 0%

AWS Rekognition

Age 6-14
Gender Female, 84.4%
Calm 44.9%
Sad 21.5%
Fear 15.4%
Confused 9.7%
Surprised 3.5%
Angry 2.3%
Happy 1.9%
Disgusted 0.9%

AWS Rekognition

Age 24-34
Gender Male, 99.4%
Calm 93.3%
Surprised 2.3%
Disgusted 1.4%
Angry 0.8%
Confused 0.8%
Happy 0.6%
Sad 0.5%
Fear 0.2%

AWS Rekognition

Age 48-56
Gender Male, 99.5%
Sad 50.6%
Calm 31.1%
Confused 9.5%
Disgusted 3.8%
Angry 2.8%
Happy 0.9%
Surprised 0.7%
Fear 0.5%

AWS Rekognition

Age 16-24
Gender Female, 68.6%
Sad 65.1%
Calm 26.6%
Confused 5.6%
Fear 1.2%
Angry 0.8%
Happy 0.4%
Disgusted 0.2%
Surprised 0.1%

AWS Rekognition

Age 22-30
Gender Male, 98.6%
Sad 53.9%
Disgusted 17.2%
Calm 15.9%
Angry 5.1%
Fear 4.2%
Confused 2%
Happy 0.9%
Surprised 0.8%

AWS Rekognition

Age 18-26
Gender Female, 91.6%
Sad 94%
Calm 1.3%
Surprised 1.2%
Confused 0.9%
Happy 0.7%
Fear 0.7%
Angry 0.6%
Disgusted 0.5%

AWS Rekognition

Age 23-31
Gender Male, 70.9%
Sad 45.9%
Confused 29.9%
Calm 13.6%
Fear 4.7%
Angry 2.8%
Happy 1.2%
Disgusted 1.1%
Surprised 0.8%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%

Text analysis

Amazon

S
se
28461
XXX
=
= FOR
DO FROTES S
FROTES
FOR
DO

Google

28461. YT37A2 VAGON
YT37A2
28461.
VAGON