Human Generated Data

Title

Untitled (dance class; audience watches)

Date

1948

People

Artist: Harry Annas, American 1897 - 1980

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.480

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Indoors 99.8
Interior Design 99.8
Room 97.9
Person 93.3
Human 93.3
Person 93.2
Mammal 92.7
Pet 92.7
Canine 92.7
Dog 92.7
Animal 92.7
Person 91.3
Person 88.2
Person 88.1
School 86.5
Classroom 86.5
Person 85.8
Person 85.7
Person 85.2
Person 82.4
Crowd 77.8
Person 76.8
Person 76.5
Audience 75.1
Person 75.1
Person 74.9
Person 73.3
Person 72.2
Person 69.3
Art 67.5
Painting 67.5
Person 65
Stage 58.9
Person 57.1
Person 49.3

Imagga
created on 2022-01-23

curtain 73.8
theater curtain 71.1
blind 54.7
protective covering 37
covering 22.5
design 22.5
decoration 21.7
texture 20.8
graphic 19.7
window shade 19.6
wall 18
interior 17.7
pattern 17.1
hanging 17
stage 16.8
decor 15.9
art 15.8
window blind 15.7
modern 15.4
light 15.4
room 15.3
style 14.8
theater 14.8
entertainment 14.7
color 14.4
furniture 14
old 13.9
frame 13.8
hall 13.5
performance 13.4
backdrop 13.2
decorative 12.5
wood 12.5
architecture 12.5
antique 12.3
house 11.7
show 11.4
home 11.2
celebration 11.2
luxury 11.1
elegant 11.1
grunge 11.1
cinema 11
silhouette 10.7
vintage 10.7
wallpaper 10.7
retro 10.6
classical 10.5
window 10.4
fabric 10.4
holiday 10
border 9.9
drapes 9.9
indoors 9.7
entrance 9.7
floor 9.3
event 9.2
elegance 9.2
people 8.9
orchestra 8.9
sofa 8.8
textured 8.8
movie 8.7
building 8.6
party 8.6
presentation 8.4
classic 8.3
shape 8.2
material 8
wooden 7.9
velvet 7.9
black 7.8
panel 7.8
audience 7.8
table 7.8
space 7.7
film 7.7
relaxation 7.5
brown 7.4
digital 7.3
paint 7.2
domestic 7.2
star 7.2

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

text 71.4
cartoon 63.7
person 51.8

Face analysis

AWS Rekognition

Age 2-8
Gender Female, 97.8%
Fear 59.6%
Calm 20.7%
Happy 8.7%
Sad 3.5%
Surprised 2.6%
Disgusted 1.7%
Angry 1.6%
Confused 1.5%

AWS Rekognition

Age 10-18
Gender Female, 52.5%
Sad 45.3%
Fear 18.6%
Angry 10.5%
Disgusted 8.1%
Calm 6.6%
Confused 5%
Surprised 4.6%
Happy 1.3%

AWS Rekognition

Age 2-10
Gender Female, 95.3%
Calm 61.7%
Sad 10.6%
Fear 9.2%
Surprised 5.5%
Disgusted 4.8%
Confused 4.1%
Angry 2.9%
Happy 1.3%

AWS Rekognition

Age 9-17
Gender Female, 66.2%
Calm 51.3%
Happy 41.4%
Sad 2.7%
Angry 1.6%
Disgusted 1%
Confused 0.9%
Surprised 0.6%
Fear 0.5%

AWS Rekognition

Age 6-16
Gender Female, 98%
Calm 95.4%
Sad 3.4%
Happy 0.6%
Surprised 0.2%
Disgusted 0.1%
Angry 0.1%
Fear 0.1%
Confused 0.1%

AWS Rekognition

Age 19-27
Gender Female, 93.5%
Sad 51.6%
Calm 39.8%
Angry 3.5%
Confused 1.9%
Surprised 1.8%
Happy 0.5%
Disgusted 0.5%
Fear 0.4%

AWS Rekognition

Age 22-30
Gender Male, 97.6%
Calm 83%
Surprised 5.4%
Fear 3.4%
Disgusted 2.9%
Confused 2.5%
Sad 1.7%
Angry 0.7%
Happy 0.4%

AWS Rekognition

Age 6-14
Gender Female, 90.8%
Sad 69.2%
Calm 22%
Angry 4%
Fear 2.6%
Happy 1%
Confused 0.7%
Surprised 0.3%
Disgusted 0.2%

AWS Rekognition

Age 18-24
Gender Male, 79.5%
Calm 55.5%
Angry 32.2%
Sad 5%
Confused 3.4%
Disgusted 1.8%
Fear 0.9%
Surprised 0.9%
Happy 0.3%

AWS Rekognition

Age 18-24
Gender Male, 87.4%
Disgusted 47.8%
Calm 13.7%
Sad 13%
Fear 9%
Angry 6.5%
Surprised 4.7%
Confused 3.9%
Happy 1.4%

AWS Rekognition

Age 9-17
Gender Female, 63.4%
Surprised 68.6%
Confused 18.3%
Angry 6.4%
Calm 3.6%
Fear 1.4%
Sad 0.7%
Disgusted 0.6%
Happy 0.4%

AWS Rekognition

Age 21-29
Gender Male, 60.4%
Sad 30.7%
Disgusted 25.2%
Angry 14.9%
Calm 14%
Fear 5.9%
Happy 4.2%
Surprised 3.1%
Confused 1.9%

AWS Rekognition

Age 24-34
Gender Female, 55.2%
Fear 54.4%
Sad 23.8%
Calm 6.8%
Angry 5.8%
Surprised 4.2%
Happy 1.9%
Disgusted 1.8%
Confused 1.4%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 93.3%
Dog 92.7%
Painting 67.5%

Captions

Microsoft

a group of people posing for a photo 82.4%
a group of people in a room 82.3%
a group of people pose for a photo 82.2%