Human Generated Data

Title

Untitled (Mask & Wig club members dancing in pairs)

Date

c. 1947

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5466

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Human 98.6
Person 98.6
Person 98.5
Leisure Activities 98.4
Dance Pose 98.4
Person 92.8
Sports 87
Sport 87
Dance 82.6
Person 82.4
Person 74.1
Martial Arts 70.7
Chair 69.6
Furniture 69.6
Person 69.4
Judo 55.3
Person 42.1

Imagga
created on 2022-01-23

man 30.2
person 29.3
adult 28.4
people 26.8
male 26.2
dancer 18.8
sport 18.4
newspaper 18.2
business 18.2
lifestyle 18.1
performer 16.1
businessman 15.9
happy 15.7
attractive 15.4
active 14.8
product 14.8
professional 14.7
men 14.6
exercise 14.5
portrait 14.2
creation 14.1
black 13.8
dance 13.7
group 13.7
fitness 13.5
women 13.4
corporate 12.9
team 11.6
smiling 11.6
standing 11.3
fun 11.2
one 11.2
action 11.1
casual 11
day 11
smile 10.7
job 10.6
lady 10.5
indoors 10.5
pretty 10.5
outdoors 10.4
motion 10.3
athlete 10.2
teamwork 10.2
planner 10.1
player 10
suit 9.9
entertainer 9.8
human 9.7
success 9.7
body 9.6
walking 9.5
wind instrument 9.3
leisure 9.1
teacher 9.1
fashion 9
health 9
brass 9
office 9
boy 8.7
light 8.7
happiness 8.6
company 8.4
holding 8.3
teenager 8.2
businesswoman 8.2
art 8.2
pose 8.1
ball 8.1
handsome 8
cute 7.9
work 7.8
couple 7.8
play 7.8
outside 7.7
move 7.7
youth 7.7
formal 7.6
dishwasher 7.6
drawing 7.6
city 7.5
manager 7.4
silhouette 7.4
guy 7.3
dress 7.2
bright 7.1
modern 7
together 7

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

text 99.8
baseball 83.5
dance 82.2
footwear 63.6
black and white 58
person 51.1

Face analysis

Amazon

Google

AWS Rekognition

Age 33-41
Gender Male, 99.1%
Sad 96.6%
Happy 1.8%
Confused 0.5%
Disgusted 0.5%
Surprised 0.2%
Calm 0.1%
Fear 0.1%
Angry 0.1%

AWS Rekognition

Age 31-41
Gender Male, 99.9%
Sad 65.5%
Calm 29.1%
Happy 1.4%
Angry 1.3%
Confused 1.3%
Disgusted 0.6%
Surprised 0.5%
Fear 0.3%

AWS Rekognition

Age 29-39
Gender Female, 72.5%
Calm 91.1%
Sad 4.8%
Surprised 1.2%
Confused 1.1%
Happy 0.8%
Disgusted 0.5%
Angry 0.4%
Fear 0.1%

AWS Rekognition

Age 39-47
Gender Female, 84.7%
Happy 72.1%
Calm 16.2%
Confused 7%
Sad 2.1%
Surprised 1%
Disgusted 1%
Angry 0.4%
Fear 0.3%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.6%
Chair 69.6%

Captions

Microsoft

graphical user interface 42.3%

Text analysis

Amazon

21
21 006
006
21006

Google

21006•
21006•