Human Generated Data

Title

Untitled (children looking toward staircase during Christmas party)

Date

1962

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9875

Machine Generated Data

Tags

Amazon
created on 2022-01-28

Clothing 99.9
Apparel 99.9
Person 99.4
Human 99.4
Person 99.2
Person 98.9
Person 98.4
Person 97.9
Person 97.7
Skirt 96.7
Female 96.5
Shorts 94.8
Person 93.8
Dress 92.5
Footwear 88.5
Shoe 88.5
Woman 84.9
Girl 75.3
People 75.2
Person 67.5
Child 65.9
Kid 65.9
Stage 64.3
Shoe 50.1
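
The label/score pairs above match the output of Amazon Rekognition's DetectLabels API. Below is a minimal sketch of how such tags could be generated with boto3; the file name "photo.jpg" and the MaxLabels/MinConfidence values are assumptions, not details from this record.

    # Sketch: label tags like those above via Amazon Rekognition DetectLabels.
    # Assumes configured AWS credentials; "photo.jpg" is a hypothetical file name.
    import boto3

    client = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MaxLabels=25,        # cap on returned labels (assumed value)
            MinConfidence=50.0,  # minimum score to report (assumed value)
        )

    # Each label pairs a name with a confidence score, e.g. "Clothing 99.9".
    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')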

Imagga
created on 2022-01-28

musical instrument 41.4
percussion instrument 29.1
marimba 28.6
people 24.5
man 24.2
wind instrument 22.4
male 21.3
person 19.4
group 18.5
room 17
adult 16.4
men 15.4
classroom 15.4
businessman 14.1
silhouette 14.1
business 14
musician 13.7
black 13.2
women 11.9
concert 11.6
team 11.6
brass 11.5
couple 11.3
music 10.9
crowd 10.6
musical 10.5
art 10.4
stage 10.3
play 10.3
singer 10.2
device 10.2
teacher 10.2
sax 9.6
education 9.5
culture 9.4
youth 9.4
outfit 9.1
fashion 9
style 8.9
indoors 8.8
free-reed instrument 8.7
window 8.4
hand 8.3
city 8.3
clothing 8.1
sexy 8
harmonica 7.9
rock 7.8
old 7.7
microphone 7.4
performer 7.4
trombone 7.3
active 7.3
graphic 7.3
lifestyle 7.2
dress 7.2
player 7.1
love 7.1
world 7.1
job 7.1
bass 7.1
modern 7
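
The Imagga tags above follow the same label/score pattern. A sketch of the kind of request that produces them, using Imagga's /v2/tags REST endpoint, follows; the API key, secret, and file name are placeholders, and the exact parameters used for this record are not documented here.

    # Sketch: content tags via Imagga's /v2/tags endpoint.
    # IMAGGA_KEY, IMAGGA_SECRET, and "photo.jpg" are placeholder values.
    import requests

    IMAGGA_KEY = "your-api-key"        # placeholder
    IMAGGA_SECRET = "your-api-secret"  # placeholder

    with open("photo.jpg", "rb") as f:
        response = requests.post(
            "https://api.imagga.com/v2/tags",
            auth=(IMAGGA_KEY, IMAGGA_SECRET),
            files={"image": f},
        )

    # Tags arrive as {"confidence": ..., "tag": {"en": ...}} objects,
    # matching rows such as "musical instrument 41.4" above.
    for item in response.json()["result"]["tags"]:
        print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')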

Google
created on 2022-01-28

Microsoft
created on 2022-01-28

outdoor 96.4
person 94.2
clothing 93.2
standing 82.5
people 80.6
woman 80.2
footwear 75.9
text 66.7
group 61.5
dress 58.2
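
The Microsoft tags above are consistent with the Azure Computer Vision image-analysis REST API. A sketch follows; the endpoint, key, and file name are placeholders, and v3.2 is an assumed API version.

    # Sketch: image tags via the Azure Computer Vision "analyze" endpoint.
    # ENDPOINT, KEY, and "photo.jpg" are placeholders.
    import requests

    ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
    KEY = "your-subscription-key"                                     # placeholder

    with open("photo.jpg", "rb") as f:
        response = requests.post(
            f"{ENDPOINT}/vision/v3.2/analyze",
            params={"visualFeatures": "Tags"},
            headers={
                "Ocp-Apim-Subscription-Key": KEY,
                "Content-Type": "application/octet-stream",
            },
            data=f.read(),
        )

    # Confidences are reported on a 0-1 scale; multiply by 100 to match
    # rows such as "outdoor 96.4" above.
    for tag in response.json()["tags"]:
        print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')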

Face analysis

AWS Rekognition

Age 37-45
Gender Male, 99.2%
Calm 70.7%
Confused 14%
Surprised 7.3%
Angry 3.5%
Sad 1.5%
Fear 1.3%
Happy 1%
Disgusted 0.7%

AWS Rekognition

Age 26-36
Gender Male, 99.8%
Calm 99.9%
Surprised 0%
Happy 0%
Disgusted 0%
Confused 0%
Angry 0%
Sad 0%
Fear 0%

AWS Rekognition

Age 29-39
Gender Male, 98.4%
Calm 93.9%
Surprised 3.4%
Sad 1.2%
Fear 0.4%
Disgusted 0.3%
Angry 0.3%
Happy 0.2%
Confused 0.2%

AWS Rekognition

Age 30-40
Gender Male, 83.2%
Sad 40.7%
Calm 24.5%
Happy 18%
Confused 6.8%
Surprised 4.7%
Disgusted 2.1%
Angry 1.6%
Fear 1.6%
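
The four blocks above (age range, gender, and a descending list of emotion scores) are the shape of Amazon Rekognition's DetectFaces output when all attributes are requested. A minimal sketch, with the file name assumed:

    # Sketch: per-face age, gender, and emotions via Rekognition DetectFaces.
    # "photo.jpg" is a hypothetical file name.
    import boto3

    client = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:
        response = client.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # request age, gender, emotions, etc.
        )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]   # e.g. {"Low": 37, "High": 45}
        gender = face["Gender"]  # e.g. {"Value": "Male", "Confidence": 99.2}
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
        # Sort emotions by confidence to match the descending lists above.
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')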

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely
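
Unlike Rekognition's percentages, Google Cloud Vision reports face attributes as likelihood buckets (VERY_UNLIKELY through VERY_LIKELY), which is why the four blocks above read "Very unlikely" rather than a score. A sketch, with the file name assumed:

    # Sketch: face-attribute likelihoods via the Google Cloud Vision API.
    # "photo.jpg" is a hypothetical file name; enum names such as
    # VERY_UNLIKELY correspond to "Very unlikely" in the record above.
    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("photo.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    for face in response.face_annotations:
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)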

Feature analysis

Amazon

Person 99.4%
Shoe 88.5%
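
The feature-analysis scores repeat the strongest Person and Shoe detections; in Rekognition, labels like these also carry per-instance bounding boxes. A sketch of reading them from a DetectLabels response like the one used for the tags above:

    # Sketch: per-instance bounding boxes for labels such as Person and Shoe.
    # "photo.jpg" is a hypothetical file name.
    import boto3

    client = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:
        response = client.detect_labels(Image={"Bytes": f.read()})

    for label in response["Labels"]:
        for instance in label.get("Instances", []):
            box = instance["BoundingBox"]  # relative Left/Top/Width/Height
            print(f'{label["Name"]} {instance["Confidence"]:.1f}% at '
                  f'left={box["Left"]:.2f}, top={box["Top"]:.2f}')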

Captions

Microsoft

a group of people standing in front of a building 95.8%
a group of people posing for a photo 93.5%
a group of people standing next to a building 93.4%
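
Ranked caption candidates like the three above are what the Azure Computer Vision "describe" endpoint returns. A sketch follows; the endpoint, key, file name, and v3.2 version are placeholders and assumptions.

    # Sketch: ranked image captions via the Azure "describe" endpoint.
    # ENDPOINT, KEY, and "photo.jpg" are placeholders.
    import requests

    ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
    KEY = "your-subscription-key"                                     # placeholder

    with open("photo.jpg", "rb") as f:
        response = requests.post(
            f"{ENDPOINT}/vision/v3.2/describe",
            params={"maxCandidates": "3"},  # ask for several candidates
            headers={
                "Ocp-Apim-Subscription-Key": KEY,
                "Content-Type": "application/octet-stream",
            },
            data=f.read(),
        )

    for caption in response.json()["description"]["captions"]:
        print(f'{caption["text"]} {caption["confidence"] * 100:.1f}%')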

Text analysis

Amazon

a
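
The single detected string ("a") is consistent with Amazon Rekognition's DetectText API, which returns line- and word-level detections. A minimal sketch, with the file name assumed:

    # Sketch: text detection via Amazon Rekognition DetectText.
    # "photo.jpg" is a hypothetical file name.
    import boto3

    client = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:
        response = client.detect_text(Image={"Bytes": f.read()})

    # LINE detections are whole lines; WORD detections are their pieces.
    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":
            print(detection["DetectedText"], f'{detection["Confidence"]:.1f}%')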