Human Generated Data

Title

Untitled (Junior League group of man and four women performing kickline in large room with wooden floor)

Date

1940-1960

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10034

Machine Generated Data

Tags

Amazon
created on 2022-01-28

Leisure Activities 99.8
Dance Pose 99.8
Person 98.8
Human 98.8
Person 98.8
Dance 98.2
Footwear 98.1
Apparel 98.1
Shoe 98.1
Clothing 98.1
Helmet 97.9
Ballet 96.4
Shoe 96.2
Person 95.6
Person 95
Guitar 91.6
Musical Instrument 91.6
Person 88.9
Ballerina 86.6
Guitar 81.2
Stage 62.3
Flooring 55.8

Imagga
created on 2022-01-28

dancer 97.4
performer 80.5
entertainer 54.5
person 45.4
adult 27.8
people 25.7
man 22.8
teacher 21.1
fashion 18.8
male 17.7
professional 17.7
portrait 17.5
black 17.4
body 16.8
attractive 16.1
studio 16
educator 15.7
brass 15.3
sport 15.2
posing 15.1
active 14.5
sensuality 14.5
sexy 14.5
dance 14.3
human 14.2
model 14
women 13.4
silhouette 13.2
wind instrument 13.2
action 13
slim 12.9
men 12.9
pretty 12.6
dancing 12.5
performance 12.4
style 11.9
music 11.9
dress 11.7
bass 11.4
group 11.3
modern 11.2
elegance 10.9
musical instrument 10.9
musician 10.9
fitness 10.8
team 10.7
happy 10.6
teenage 10.6
lifestyle 10.1
teenager 10
girls 10
life 10
exercise 10
singer 9.8
fun 9.7
urban 9.6
couple 9.6
standing 9.6
athlete 9.5
teen 9.2
city 9.1
lady 8.9
happiness 8.6
cute 8.6
motion 8.6
youth 8.5
legs 8.5
dark 8.3
art 8.3
guitar 8.3
pose 8.2
looking 8
cool 8
concert 7.8
party 7.7
leisure 7.5
business 7.3
smiling 7.2
leg 7.2
sunset 7.2
shadow 7.2
clothing 7.2
bright 7.1
hair 7.1

Google
created on 2022-01-28

Microsoft
created on 2022-01-28

text 95.7
floor 92.4
clothing 90.1
person 87.7
dance 82.5
footwear 77.2
musical instrument 76.6
sport 69.3
black and white 62.1
woman 54.9
posing 44.3

Face analysis

AWS Rekognition

Age 49-57
Gender Male, 98.4%
Happy 26.8%
Confused 23.8%
Sad 23.1%
Calm 17%
Disgusted 3.9%
Angry 2.4%
Surprised 1.9%
Fear 1%

AWS Rekognition

Age 31-41
Gender Male, 90.2%
Sad 83.8%
Confused 5.4%
Surprised 4.9%
Calm 2%
Angry 1.7%
Disgusted 1.1%
Fear 0.9%
Happy 0.3%

AWS Rekognition

Age 42-50
Gender Male, 75.6%
Happy 91.1%
Calm 3.5%
Sad 1.5%
Surprised 1.4%
Confused 1%
Angry 0.6%
Disgusted 0.5%
Fear 0.4%

AWS Rekognition

Age 49-57
Gender Male, 63.6%
Calm 54.7%
Surprised 36.3%
Sad 4.4%
Fear 1.5%
Happy 1.2%
Confused 1.2%
Disgusted 0.5%
Angry 0.2%

AWS Rekognition

Age 30-40
Gender Male, 89.9%
Calm 52.6%
Happy 33.2%
Surprised 7.7%
Sad 2.6%
Confused 1.4%
Fear 0.9%
Disgusted 0.9%
Angry 0.8%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.8%
Shoe 98.1%
Helmet 97.9%
Guitar 91.6%

Captions

Microsoft

a group of people posing for the camera 72.5%
a group of people posing for a picture 72.4%
a group of people posing for a photo 62.9%

Text analysis

Amazon

KODAK--A-1TW

Google

33A°2--
NAGON.
MJI7--Y T 33A°2-- NAGON.
MJI7--Y
T