Human Generated Data

Title

Untitled (four girls in tutus)

Date

1949

People

Artist: Hamblin Studio, American, active 1930s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.2041

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Person 99.6
Human 99.6
Person 99.4
Person 99.3
Person 99.2
Dance 97
Person 94.3
Shoe 89.7
Clothing 89.7
Footwear 89.7
Apparel 89.7
Ballet 88.2
Ballerina 78.7
Costume 59.1
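
Labels of this shape are what Amazon Rekognition's DetectLabels operation returns. A minimal sketch of how such tags can be produced with boto3 (the file name is a hypothetical placeholder, and AWS credentials are assumed to be configured):

    import boto3

    # Hypothetical local copy of the photograph.
    with open("untitled_four_girls_in_tutus.jpg", "rb") as f:
        image_bytes = f.read()

    rekognition = boto3.client("rekognition")
    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MinConfidence=50,  # drop labels the model is less sure about
    )
    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')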

Clarifai
created on 2023-10-25

people 99.9
dancer 98.3
adult 98.3
man 97.5
woman 95.8
dancing 95.5
group 95.4
wear 94.2
child 91.8
tutu 91.6
ballerina 91.3
ballet dancer 90.6
two 87.2
costume 85.5
portrait 83.5
print 82.5
dress 82.4
group together 81.5
three 80.1
actor 79
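
Tags in this form come from Clarifai's general image-recognition model. A hedged sketch using Clarifai's public REST API via requests (the model name in the endpoint path and the YOUR_PAT token are assumptions; consult Clarifai's documentation for the current API version):

    import base64
    import requests

    # Assumption: Clarifai's v2 REST endpoint for the general model.
    URL = "https://api.clarifai.com/v2/models/general-image-recognition/outputs"
    headers = {"Authorization": "Key YOUR_PAT"}  # hypothetical personal access token

    with open("untitled_four_girls_in_tutus.jpg", "rb") as f:
        payload = {
            "inputs": [
                {"data": {"image": {"base64": base64.b64encode(f.read()).decode()}}}
            ]
        }

    response = requests.post(URL, headers=headers, json=payload).json()
    # Concept values are 0-1; scale to match the percentages shown above.
    for concept in response["outputs"][0]["data"]["concepts"]:
        print(f'{concept["name"]} {concept["value"] * 100:.1f}')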

Imagga
created on 2021-12-14

dancer 57.6
musical instrument 52.3
accordion 46.3
performer 46.2
keyboard instrument 36.7
wind instrument 34.8
entertainer 34.6
person 32.7
people 25.6
man 22.2
adult 19.9
dance 17
athlete 17
runner 16.6
couple 16.5
male 16.3
summer 15.4
lifestyle 15.2
happy 15
happiness 14.1
silhouette 14.1
art 13.2
outdoor 13
men 12.9
sport 12.7
attractive 12.6
active 12.6
joy 12.5
portrait 12.3
sky 12.1
women 11.8
fitness 11.7
bench 11.5
fashion 11.3
holding 10.7
together 10.5
beach 10.3
exercise 10
dress 9.9
pretty 9.8
lady 9.7
fun 9.7
run 9.6
park bench 9.6
cheerful 8.9
group 8.9
body 8.8
smiling 8.7
brass 8.5
walking 8.5
travel 8.4
seat 8.3
city 8.3
outdoors 8.2
looking 8
creation 8
sand 7.9
urban 7.9
day 7.8
life 7.8
black 7.8
two 7.6
relax 7.6
human 7.5
leisure 7.5
contestant 7.4
fit 7.4
freedom 7.3
sun 7.2
sexy 7.2
copy space 7.2
sunset 7.2
grass 7.1
love 7.1
clothing 7
sea 7
modern 7
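
Imagga exposes this style of tagging through its /v2/tags REST endpoint with HTTP basic auth. A sketch under that assumption (the API key, secret, and image URL are placeholders):

    import requests

    # Placeholder credentials and image URL.
    auth = ("YOUR_API_KEY", "YOUR_API_SECRET")
    params = {"image_url": "https://example.org/untitled_four_girls_in_tutus.jpg"}

    result = requests.get(
        "https://api.imagga.com/v2/tags", auth=auth, params=params
    ).json()
    for entry in result["result"]["tags"]:
        print(f'{entry["tag"]["en"]} {entry["confidence"]:.1f}')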

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

dance 96.5
text 95.9
black and white 86.3
clothing 85.4
footwear 81.3
woman 78.8
person 75.5
posing 64.2
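
Microsoft's tags correspond to the Tag operation of the Azure Computer Vision REST API. A sketch assuming a v3.2 endpoint (the resource endpoint, subscription key, and image URL are placeholders):

    import requests

    # Placeholder Azure resource endpoint and key.
    endpoint = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
    headers = {"Ocp-Apim-Subscription-Key": "YOUR_KEY"}
    body = {"url": "https://example.org/untitled_four_girls_in_tutus.jpg"}

    response = requests.post(
        f"{endpoint}/vision/v3.2/tag", headers=headers, json=body
    ).json()
    # Confidences are 0-1; scale to match the percentages shown above.
    for tag in response["tags"]:
        print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')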

Color Analysis

Face analysis

AWS Rekognition

Age 22-34
Gender Female, 94.2%
Surprised 87.3%
Happy 3.6%
Calm 3.6%
Fear 2.5%
Confused 1.3%
Sad 0.7%
Angry 0.6%
Disgusted 0.4%

AWS Rekognition

Age 23-37
Gender Female, 91.6%
Sad 44.6%
Calm 32.2%
Surprised 12.7%
Fear 3.7%
Confused 3.1%
Happy 2.2%
Angry 1.2%
Disgusted 0.3%

AWS Rekognition

Age 26-40
Gender Male, 89.2%
Calm 61.7%
Surprised 16.7%
Happy 9.1%
Confused 5.7%
Angry 2.6%
Fear 1.9%
Sad 1.8%
Disgusted 0.5%

AWS Rekognition

Age 36-54
Gender Male, 84.4%
Angry 49.7%
Calm 35.4%
Surprised 5%
Sad 2.8%
Happy 2.5%
Fear 2.2%
Confused 1.6%
Disgusted 0.8%

AWS Rekognition

Age 20-32
Gender Male, 53%
Calm 57%
Sad 18.7%
Happy 17.8%
Surprised 2%
Angry 1.6%
Confused 1.4%
Fear 1.1%
Disgusted 0.3%
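
Per-face age ranges, gender estimates, and emotion scores like the five blocks above are what Amazon Rekognition's DetectFaces returns when all attributes are requested. A minimal boto3 sketch (the file name is a placeholder):

    import boto3

    with open("untitled_four_girls_in_tutus.jpg", "rb") as f:
        image_bytes = f.read()

    rekognition = boto3.client("rekognition")
    response = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],  # include age range, gender, and emotions
    )
    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')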

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
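
The likelihood buckets above ("Very unlikely" through "Possible") are Google Cloud Vision's face-detection output. A sketch with the google-cloud-vision client library (credentials are assumed to be configured; the file name is a placeholder):

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()
    with open("untitled_four_girls_in_tutus.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)
    for face in response.face_annotations:
        # Likelihood enums print as e.g. VERY_UNLIKELY or POSSIBLE.
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)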

Feature analysis

Amazon

Person 99.6%
Shoe 89.7%

Categories

Imagga

paintings art 97.6%
interior objects 1.8%

Text analysis

Amazon

11

Google

11
11
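
The text results ("11") are the kind of output returned by the services' OCR operations, such as Amazon Rekognition's DetectText. A minimal boto3 sketch (the file name is a placeholder):

    import boto3

    with open("untitled_four_girls_in_tutus.jpg", "rb") as f:
        image_bytes = f.read()

    rekognition = boto3.client("rekognition")
    response = rekognition.detect_text(Image={"Bytes": image_bytes})
    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":  # skip word-level duplicates
            print(detection["DetectedText"])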